Friday, June 24, 2011

A Real Life Geoprocessing Service In Action (ArcGIS Server 10)

As I have mentioned before, I'm an old-school GIS guy.  Because of this, I have a hard time embracing "secular" web mapping toolkits.  It's kind of like the old days: yes, you can use GIS to just make maps, but it is really about the data and analytic capabilities.  Translated to the geoweb, there are all kinds of toolkits for making web maps but very few that expose the richness of the data and provide real analytic capabilities.  That is why a recent project I was lucky enough to be a part of was so exciting: web-enabling rich GIS capabilities behind a simple user interface and user experience using ArcGIS Server's geoprocessing service.
The application is really simple, basically allowing users to dynamically calculate in-place oil shale resources for an area they define.  We have kind of been calling it the oil shale calculator.  However, the science and methodology for assessing the oil shale resources in the first place were far more complex and something I can't take credit for, but I would like to acknowledge those who did the work (Johnson, R.C., Brownfield, M.E., and Mercier, T.J., of the U.S. Geological Survey Oil Shale Assessment Team).  Have a look at their assessment if you are interested.

But for the sake of this post, the result of this work (at least the part used for this application) was basically a raster with gallons of in-place oil shale resources per cell, plus a model implementing the Zonal Statistics function for doing the dynamic calculations (nice work, Tracey!).  Here is the geoprocessing model.
Pretty simple, really: two input parameters, a user-defined polygon and the raster containing oil shale values, with the output being a sum produced by the Zonal Statistics function.  This was actually modeled after one of the ArcGIS Server samples.  When implementing this model, we ran into what I consider a bug.  The documentation suggests that you can store the output table in memory (ArcGIS 10 only); however, we found this was not the case.  We were forced to use a physical file system location, basically the scratch workspace (bad: output table = in_memory\results; good: output table = %scratchworkspace%\oilshale.dbf).  A scripted sketch of the same logic follows.
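For anyone who would rather read a script than a model diagram, here is a minimal ArcPy sketch of the same zonal-sum idea, including the scratch workspace workaround.  The paths and dataset names are hypothetical stand-ins; the deployed service runs the ModelBuilder model, not this script.

# Minimal ArcPy sketch of the model's core logic (hypothetical paths/names).
import os
import arcpy
from arcpy import env
from arcpy.sa import ZonalStatisticsAsTable

arcpy.CheckOutExtension("Spatial")          # Zonal Statistics needs Spatial Analyst
env.scratchWorkspace = r"C:\gp\scratch"     # hypothetical scratch location

user_polygon = arcpy.GetParameterAsText(0)  # input 1: the user-defined polygon
value_raster = arcpy.GetParameterAsText(1)  # input 2: gallons of in-place oil per cell
zone_field = "id"                           # the zone field the REST endpoint also wants

# Bad (failed for us at 10 despite the docs): out_table = r"in_memory\results"
# Good: a physical location on the scratch workspace, i.e. the script
# equivalent of %scratchworkspace%\oilshale.dbf in the model.
out_table = os.path.join(env.scratchWorkspace, "oilshale.dbf")

# Sum the raster's values within the user's polygon.
ZonalStatisticsAsTable(user_polygon, zone_field, value_raster,
                       out_table, "DATA", "SUM")
arcpy.SetParameterAsText(2, out_table)      # hand the table back as the output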

The application itself was pretty simple as well.  Have a look.  One gotcha we ran into here: the REST API endpoint for this service actually expects two input parameters, not just the one suggested by its Services Directory page.  The missing, but required, parameter is the zone field; we called ours id.  You can view the page source of the app to see how we manually included this parameter.
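As a preview, here is a minimal sketch of the kind of request involved.  The service URL, task name, and parameter names below are hypothetical stand-ins (the real ones are in the app's page source), but the shape of the call is the standard GPServer execute request.

# Rough sketch of the execute request (hypothetical URL and parameter names).
import json
import urllib.parse
import urllib.request

gp_url = ("http://example.com/ArcGIS/rest/services/"
          "OilShale/GPServer/OilShaleCalculator/execute")

# The user-drawn polygon as a REST FeatureSet.  Note the "id" attribute:
# it feeds the zone field parameter that the Services Directory page
# never mentions but the task requires.
feature_set = {
    "geometryType": "esriGeometryPolygon",
    "features": [{
        "geometry": {
            "rings": [[[-108.9, 39.5], [-108.5, 39.5], [-108.5, 39.9],
                       [-108.9, 39.9], [-108.9, 39.5]]],
            "spatialReference": {"wkid": 4326}
        },
        "attributes": {"id": 1}
    }]
}

data = urllib.parse.urlencode({
    "Input_Polygon": json.dumps(feature_set),  # parameter 1: the polygon
    "Zone_Field": "id",                        # parameter 2: the "missing" zone field
    "f": "json"
}).encode()

response = urllib.request.urlopen(gp_url, data)
print(json.loads(response.read()))  # the results array carries the summed gallons

The point to notice is that second form field: without the zone field parameter (whatever yours is actually named), the task fails even though the Services Directory page never asks for it.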