Map Vibes
Thoughts, resources, and discussions on a bunch of geogeek stuff (sometimes some other topics creep in as well). This is a personal blog and is not affiliated with any organization, government office, or academic institution. Any postings on this blog represent my own thoughts, opinions, and actions.

Friday, August 17, 2012
"Old" FGDC XML Metadata Format Translator

If there is anyone left out there who is handcuffed to the following, as we are:
1. The need to author and publish FGDC metadata
2. A workflow that supports the "old" FGDC XML format in ArcCatalog
3. The need to display that metadata in a human-readable format (HTML)
This post is for you. I recently wrote a simple Add-in for ArcCatalog that translates the "old" or "classic" FGDC metadata XML format to classic HTML or FAQ HTML.
Basically, it is the same functionality that was offered in 9.3. For most folks, this tool may be unneeded given the new metadata framework in ArcGIS 10, but if you are still using the "old" workflow and FGDC metadata editor, this will be helpful, since the ArcGIS 10 export does not include translators for the old FGDC XML format.
For installation, simply follow the instructions in the ArcGIS Desktop help system for installing and using Add-ins.
Labels: .NET, ArcCatalog, ArcGIS Add-ins, C#, FGDC, Metadata, xml, xsl, xslt
Tuesday, October 18, 2011
Mobile Web Maps with OpenLayers and jQuery: A Tutorial
I recently helped host a training session, sponsored by the GITA Rocky Mountain Chapter, with Cliff Inbau, introducing various open-source geo packages including PostGIS, GeoServer, and OpenLayers. I have been doing a bit of work refactoring some web maps for mobile friendliness and thought it would be worth sharing some of my musings on OpenLayers and jQuery. I had worked with jQuery a fair amount but hadn't touched OpenLayers. My interest in playing with it came from a talk by Tory Alford entitled "GIS for the Mobile Web," which is worth looking at. Below you will find my deck and the hands-on tutorials. They haven't been vetted all that well, but I would welcome any feedback.
Labels: Javascript, jquery, mobile, OpenLayers
Friday, June 24, 2011
A Real Life Geoprocessing Service In Action (ArcGIS Server 10)
As I have mentioned before, I'm an old-school GIS guy. Because of this, I have a hard time embracing "secular" web mapping toolkits. It's kind of like the old days: yes, you can use GIS to just make maps, but it is really about the data and analytic capabilities. Translated to the geoweb, yes, there are all kinds of toolkits for making web maps, but very few that expose the richness of the data and provide real analytic capabilities. That is why a recent project I was lucky enough to be a part of was so exciting: web-enabling rich GIS capabilities through a simple user interface and user experience using ArcGIS Server's geoprocessing service.
The application is really simple, basically allowing users to dynamically calculate in-place oil shale amounts. We have kind of been calling it the oil shale calculator. However, the science and methodology for assessing the oil shale resources in the first place was far more complex and something that I can't take credit for, but I would like to acknowledge those who did (Johnson, R.C., Brownfield, M.E., and Mercier, T.J., U.S. Geological Survey Oil Shale Assessment Team). Have a look at it if you are interested.
But for the sake of this post, the result of this work (at least the part that was used for this application) was basically a raster with gallons of in-place oil shale resources per cell and a model implementing the zonal stats function for doing the dynamic calculations (nice work Tracey!). Here is the geoprocessing model.
Pretty simple, really: two input parameters (a user-defined polygon and the raster containing oil shale values), with the output being a sum based on the zonal statistics function. This was actually modeled after one of the AGS samples. When implementing this model, we ran into what I consider a bug. The literature suggests that you can store the output table in memory (ArcGIS 10 only). However, we found this was not the case. We were forced to use a physical file system location, basically the scratch disk (bad: output table=in_memory\results, good: output table=%scratchworkspace%\oilshale.dbf).
The application itself was pretty simple as well. Have a look. One gotcha we ran into here was that the REST API endpoint for this service actually expects two input parameters, not just one as suggested by its Services Directory page. The missing parameter that is actually required is the zone field parameter; we called ours id. You can view the page source of the app to see how we manually included this parameter, but a preview is below:
Friday, May 13, 2011
Sexier Posters and Poster Sessions Using Zoomify and QR Codes
This isn't a typical post for this blog, but working with geologists gives me opportunities to geek out on even classic information delivery mediums. This means trying to make even posters sexy. That's right, posters (they still do those at earth science events and conferences). I give the crew of geoscientists I work with a lot of credit. At least they are creating posters digitally and not using scissors and glue like a 3rd grade science project, as was once done.
Anyway, I have run across a couple of interesting techniques to help make posters at least a little more usable. The first isn't earth shattering. For our agency, delivering information and products to the public is a major requirement, and delivering posters that scientists have presented at various conference poster sessions is part of this. Typically, we provide a thumbnail and a link to the entire poster in PDF format for users to download and print on their own. However, a typical use case is for users to just "have a look" at the poster or preview it directly online. Doing this with full-resolution PDFs or low-resolution images is problematic, but with the use of a product called Zoomify, we are able to give users the ability to zoom in, pan, and share the posters we post online interactively. Here are some examples of how we have used Zoomify for our online poster presentations.
There is a free version of the product that does about 80 percent of what you might need, along with paid upgrades that give you full control of the component via its ActionScript API (it does need Flash to run).
The second technique to share is a little more modern I guess. Still not earth shattering but a little more techie. I'm sure you have seen these around:
If not, this is an example of a QR code. Basically, it is a matrix barcode that can be read by barcode readers and smartphones. They can be used to encode information like telephone numbers or URIs. In our case, they make an interesting addition to a standard hardcopy poster. Imagine encoding a link to a manuscript, online map, or even a database that relates to information summarized on a poster. Perhaps even encode contact information, such as your telephone number, that viewers of your poster can capture digitally and add directly to the contact list on their phone, all using QR codes. Here is an example of a QR code embedded next to a figure from a poster that links to a data download website related to the figure.
Tuesday, March 8, 2011
The 2011 ESRI Developers Summit Really Blew
I arrived on Monday with some flair. Evidently, mine was the first flight to land (12:30 pm local) after some really crazy winds. I was in seat 2A, and the lady in front of me barfed on arrival (God love her). Luckily, it was at the end of the flight.
I landed, got my car, and was engulfed in the craziest sandstorm ever. The next day, the main feeders from the highway to downtown were closed as I tried to make my way to the convention center for the plenary (I like to stay in Palm Desert), so I used the interior city streets, which sucked. This year's Developer's Summit really blew so far.
However, things changed upon my arrival at the convention center (well, actually before). First, the weather could not have been better: clear, sunny, warm, and no wind. Next, the plenary was really good (here is the 1st part of it), which has not been the case for several years of attending various Developer's Summits, user group gatherings, and UCs. What was striking this time was that this was the largest attendance ever for a Developer's Summit, with 50% of the attendees being new and largely international.
Secondly, the vibe was SEXY. I can't put my finger on it but I have been to dozens of ESRI sponsored events and this one was just cooler. I guess the seating is part of it, but I think it is something more as well. The lunch on the first day was yummy too.
Technology-wise, it's time to get mobile. I am kind of old-school and have forced myself to re-invent and re-brand as needed, and this year's Summit represents another milestone for that. I hearken back to these times in my career...
- Arc/Info command line to ArcView 3.x
- AML to Avenue
- ArcX to Arc 8
- ArcIMS to ArcGIS Server
- you get the idea
Given this need to change, I focused mostly on mobile tech sessions and chatted with a lot of the engineers from the various product teams. The tech sessions were great. I especially want to lift up the "Choosing the best mobile platform" tech session. I liked it because it addressed real business issues. Most tech sessions show a demo with code, but this one had a larger relevance that is often missed.
Anyhow, nice show this year!
Monday, February 28, 2011
Tips and Tricks With PERL for NON-PERL Types
I have been working on an integration/data management project for managing geophysical well logs in an E&P environment, and I hope to post a blog about that shortly. The basic gist is that we have all kinds of geologic data located all over our file system and need some way of finding that data and integrating the systems that use it, such as a GIS and other E&P systems (well master, seismic systems, PPDM, Petra and Geographix, etc.). The first step was to perform an inventory of the files on our network, and, to me, there isn't anything better for file-level management and text parsing than good old Perl. Here are a few gems I found that I think are worth sharing. Be aware that this is a total hack, so please don't take offense, all you Perl purists.
Example 1: Here is a handy way to iterate through a file system and do something with each file. $startpath is the starting directory, and the filter step is where you can screen out paths; in this case, I am filtering out paths that contain "snapshot", which is related to our backup solution.
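A minimal sketch of this pattern using File::Find (the start path and the "do something" step here are illustrative):

#!/usr/bin/perl
# Sketch: walk a directory tree and process each file, skipping backup snapshot paths.
use strict;
use warnings;
use File::Find;

my $startpath = 'D:/geodata';    # illustrative starting directory

find(
    sub {
        return if $File::Find::name =~ /snapshot/i;          # filter out backup snapshot paths
        return unless -f;                                     # files only
        printf "%s\t%d bytes\n", $File::Find::name, -s _;     # "do something" with each file
    },
    $startpath
);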
Example 2: Opening up zip archives and doing something with the contents.
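A minimal sketch of this one using Archive::Zip (the archive name is illustrative; the "do something" here is just listing each entry):

#!/usr/bin/perl
# Sketch: open a zip archive and iterate over its contents.
use strict;
use warnings;
use Archive::Zip qw( :ERROR_CODES );

my $zipfile = 'well_logs.zip';    # illustrative archive name
my $zip     = Archive::Zip->new();
die "Cannot read $zipfile\n" unless $zip->read($zipfile) == AZ_OK;

for my $member ( $zip->members() ) {
    next if $member->isDirectory();
    # "do something" with each entry, e.g. report it (or use extractMember() to pull it out)
    printf "%s\t%d bytes\n", $member->fileName(), $member->uncompressedSize();
}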
Example 3: Getting local time in a nicer format.
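A minimal sketch using POSIX strftime to turn localtime() into a readable timestamp:

#!/usr/bin/perl
# Sketch: format the current local time as "YYYY-MM-DD HH:MM:SS".
use strict;
use warnings;
use POSIX qw( strftime );

my $nicetime = strftime( '%Y-%m-%d %H:%M:%S', localtime() );
print "$nicetime\n";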
Labels: PERL
Wednesday, September 29, 2010
Participating with OneGeology: Experience, Lessons Learned and Other Tidbits
I recently had the opportunity to "stand up" a WMS representing a 1:5,000,000 scale geologic map of North America for integration into a platform called OneGeology.
OneGeology, whose mission statement is to:
"Make web-accessible the best available geological map data worldwide at a scale of about 1: 1 million, as a geological survey contribution to the International Year of Planet Earth." (see: http://onegeology.org/what_is/mission.html)
aims to:
- create dynamic digital geological map data for the world.
- make existing geological map data accessible in whatever digital format is available in each country. The target scale is 1:1 million but the project will be pragmatic and accept a range of scales and the best available data.
- transfer know-how to those who need it, adopting an approach that recognises that different nations have differing abilities to participate.
- the initiative is truly multilateral and multinational and will be carried out under the umbrella of several global organisations. (see: http://onegeology.org/what_is/objective.html)
The process for contributing to this worthy effort was pretty interesting and I wanted to share this experience a bit.
To begin with, the map we were submitting was actually a replacement for an existing geologic map of North America which only covered the southern portion. The previous WMS was served using MapServer, while our new one was powered by ArcGIS Server 9.3.1. This was quite the test for AGS, given the outstanding performance of MapServer with regard to WMS serving. In an effort to boost performance without doing a bunch of map authoring and data processing, we performed a couple of interesting tricks that I outlined in some detail in a previous post.
Basically, participation with the OneGeology platform can occur at several tiers, each basically correlated to functionality. Our participation was at the first tier: simply contributing a WMS. This experience was not trivial. Even though WMS is technically a "standard," there is still enough wiggle room in the spec that a specific implementation is needed to ensure interoperability. The folks at OneGeology have done an outstanding job with this and have documented, in detail, the requirements for how WMS services need to be configured for integration into their system. Our specific instances are here:
- http://certmapper.cr.usgs.gov/arcgis/rest/services/one_geology_wms/USGS_Geologic_Map_of_North_America/MapServer (raster service)
- http://certmapper.cr.usgs.gov/arcgis/rest/services/one_geology_wms/USGS_Geologic_Map_of_North_America_GFI/MapServer (vector service, for GetFeatureInfo requests only)
And finally the portal. It is a really nice application and serves as a great demonstration of "mashing up" distributed data services that originate throughout the globe. It is really pretty astonishing, given the wide range of participating organizations.
OneGeology Portal Depicting Geologic Map of North America
Labels: ArcGIS Server, Geospatial Web Technologies, Geoweb, GIS 2.0, Webmaps, WMS