GIS Blog

SOSC301 Web App – Part 1

This is part 1 of the process of creating a web-map-based field data collection app that will run on smartphones for a SOSC301 field trip.  In this post we set the specifications for the app and think through the data needed.  Along the way we use mosaic datasets, batch processing and a quick ModelBuilder model.

SOSC301 is planning a field trip for the mid-semester break; they’ll be travelling to Glenmore Station on the banks of beautiful Lake Tekapo to do some soil sampling.   On the trip they’re aiming to collect data in the field from soil pits and augering, and then later construct a high resolution soil map from those data.  Wouldn’t it be nice if we could set up a mapping app to collect their field data?  GIS is here to help!  In this and a few subsequent posts, we’ll work through the development and deployment of this app and then how the data get used afterwards.  Here we’ll go over the specifications and start compiling the data for this app.

App Specifications

As a first step, Peter Almond and I had a chat about what the app needed to do.  Its aim is to provide the students with some spatial context about things like terrain and climate, and also to capture field data entered by the students.

We started our discussion by generating a list of the layers he wanted displayed in the app.  I’ll list these below with tentative thoughts about which layers will suit and where they will come from:

Data                  | Layer                                                                   | Data Source
Geology               | QMAP (faults and geological units)                                      | J: drive (from GNS)
Glacial Geomorphology | CSIGG (Glacial Geomorphology of the central South Island)               | J: drive (from GNS)
Climate               | NZ Climate Grids (precipitation, temperature and soil moisture deficit days) | J: drive (from NIWA)
Elevation             | 1 m LiDAR DEM                                                           | LINZ Data Service
Slope                 | Derive from DEM                                                         | Downloaded DEM
Aspect                | Derive from DEM                                                         | Downloaded DEM
Hillshade             | Derive from DEM                                                         | Downloaded DEM

Compiling the Data

With a good idea of where the data will come from, I next needed an area to work with – what are the station boundaries?  Peter sent me this map from the Crown Pastoral Land Tenure Review for Glenmore Station (released under the Official Information Act), which shows the station extent:

That gives me a good place to start – all the data for the app will be somewhere inside this boundary.  This being a PDF, it’s useful as a visual guide, but to get started I’ll need a polygon that explicitly defines this boundary for clipping.  I could use the PDF as a basemap and digitise in my own boundary, but I suspect the data are available on one of the online data portals.  The search begins online, where I find a layer of NZ Property Titles.  Zooming in to the area around Lake Tekapo, I can see the polygon that defines the station boundary.

After cropping the extent and downloading the layer as a shapefile, I’ve added it to a map in Pro:

Next I’ll just select the Glenmore polygon and export it into my project geodatabase (this eliminates the other property polygons) – I can then use this boundary to clip out all the other layers I will need for the app:
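(As an aside, the same select-and-export step could be scripted in arcpy from ArcGIS Pro’s Python environment.  A minimal sketch – the paths, the field name and the attribute value here are all hypothetical, so check the actual NZ Property Titles schema before using anything like this:)

```python
# Sketch of selecting the station title polygon out of the downloaded
# shapefile and into the project geodatabase. All paths and the
# where-clause field/value are placeholders, not the real ones.
import arcpy

titles = r"C:\Data\NZ_Property_Titles.shp"        # downloaded shapefile (assumed path)
out_gdb = r"C:\Projects\Glenmore\Glenmore.gdb"    # project geodatabase (assumed path)

# Select just the Glenmore polygon into the geodatabase, dropping
# all the other property polygons in one step
arcpy.analysis.Select(
    in_features=titles,
    out_feature_class=out_gdb + r"\Glenmore_Boundary",
    where_clause="title_name = 'GLENMORE'"        # hypothetical field and value
)
```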

Now I’m ready to start collating the data.  Most of what I need is on the J: drive, so I’ll just need to clip those layers to the station boundary.  The elevation data are a bit more problematic – there are some LiDAR data on the LINZ Data Service, as shown below.  I’m only interested in the areas within the station, so I’ve cropped the extent (reducing the download from 6.8 GB to 146 MB):

When downloaded from LINZ, the DEM comes as a collection of 266 TIFF files and their associated files.  We’ve seen this before – when downloading rasters from the online data portals, they are typically broken down into tiles which need to be stitched back together into one layer in a mosaic dataset.  Not difficult, but a bit time consuming.  After mosaicking, we go from 266 individual DEMs to one single layer:

(Okay, so not a visually stunning image…the elevations vary between ~913 and ~931 m.a.s.l in this extent so no dramatic variation, but I hope you get the idea.  When we do a hillshade layer later on we’ll see a lot more detail, methinks.  And, yes, this only covers one small portion of the station but Peter’s happy with that.)
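The mosaicking step could also be scripted – a sketch in arcpy (ArcGIS Pro’s Python environment), where the tile folder, output location and output name are all assumptions:

```python
# Sketch of stitching the 266 downloaded TIFF tiles into a single raster.
# Folder and output names are placeholders for illustration only.
import arcpy

arcpy.env.workspace = r"C:\Data\LiDAR_tiles"      # folder of downloaded TIFFs (assumed)
tiles = arcpy.ListRasters("*", "TIF")             # pick up all the tile rasters

arcpy.management.MosaicToNewRaster(
    input_rasters=tiles,
    output_location=r"C:\Projects\Glenmore\Glenmore.gdb",
    raster_dataset_name_with_extension="DEM_mosaic",
    pixel_type="32_BIT_FLOAT",                    # elevations as floating point
    number_of_bands=1
)
```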

Clipping the Data

Now we’re ready to clip all the data.

I could go through and clip each layer one by one, but there will no doubt be a lot of pointing and clicking and waiting around, so I chose to do some batch processing for the clipping.  With a mixture of vector and raster layers I’ll need to run two tools, Clip and Clip Raster.  Not many people are aware that several key tools allow you to do batch processing – both of these do.
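(The point-and-click batch run could equally be scripted as a loop that calls the matching tool for each layer.  A sketch in arcpy – the layer names and paths below are made-up examples, not the actual project layers:)

```python
# Sketch of batch clipping: vector layers go through Clip (Analysis),
# rasters through Clip (Data Management). All names/paths are hypothetical.
import arcpy

out_gdb = r"C:\Projects\Glenmore\Glenmore.gdb"
boundary = out_gdb + r"\Glenmore_Boundary"        # the station polygon

vector_layers = ["Geology", "Faults", "CSIGG"]    # example J: drive layers
raster_layers = ["Precipitation", "DEM_mosaic"]   # example rasters

for name in vector_layers:
    arcpy.analysis.Clip(name, boundary, out_gdb + "\\Glenmore_" + name)

for name in raster_layers:
    arcpy.management.Clip(
        in_raster=name,
        rectangle="#",                            # take the extent from the template
        out_raster=out_gdb + "\\Glenmore_" + name,
        in_template_dataset=boundary,
        clipping_geometry="ClippingGeometry",     # clip to the polygon, not its envelope
        maintain_clipping_extent="MAINTAIN_EXTENT"
    )
```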

To set this up, find the tool of interest either by searching for it or finding it in the right toolbox.  In the image below, I’ve searched on “Clip” in the Geoprocessing pane and then right-clicked on the tool in the results:

One of the options is “Batch”, and in the resulting pane I can specify the clip layer and all the layers to be clipped:

With Pro, any outputs are automatically saved to the default geodatabase (the one ending in .gdb) – notice the “Glenmore_%Name%” in the Output Feature Class window.  The paired “%” symbols mark an inline variable: each clipped output will be named starting with “Glenmore_” followed by the name of the input layer.
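In scripting terms, the “Glenmore_%Name%” pattern is just a template substitution.  A minimal pure-Python sketch (the layer names here are made-up examples):

```python
# Minimal sketch of how the "Glenmore_%Name%" output template expands:
# %Name% is replaced with each input layer's name. The layer names
# below are hypothetical examples, not the actual project layers.
def expand_output_name(template: str, input_name: str) -> str:
    return template.replace("%Name%", input_name)

inputs = ["Geology", "Faults", "Climate"]
outputs = [expand_output_name("Glenmore_%Name%", n) for n in inputs]
print(outputs)  # ['Glenmore_Geology', 'Glenmore_Faults', 'Glenmore_Climate']
```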

Same deal for the batch raster clip.  The callouts on the screenshot below note:

• All the input rasters go in here
• Outputs will have “Glenmore_” prefixed to the input raster name
• Important to tick the two clipping-geometry options so the output extent matches the polygon boundary!

Now that I’ve got a clipped DEM there’s one more task – deriving Slope, Aspect and a Hillshade.  I’ll set up a quick little ModelBuilder model to do this:
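(In scripting terms, the model boils down to three Spatial Analyst calls.  A sketch in arcpy – it needs ArcGIS Pro’s Python environment with a Spatial Analyst licence, and the paths are assumptions:)

```python
# Sketch of the Slope / Aspect / Hillshade derivations from the clipped DEM.
# All paths are placeholders for illustration only.
import arcpy
from arcpy.sa import Slope, Aspect, Hillshade

arcpy.CheckOutExtension("Spatial")                # Spatial Analyst licence required
gdb = r"C:\Projects\Glenmore\Glenmore.gdb"
dem = gdb + r"\Glenmore_DEM"

Slope(dem, "DEGREE").save(gdb + r"\Glenmore_Slope")
Aspect(dem).save(gdb + r"\Glenmore_Aspect")
Hillshade(dem, azimuth=315, altitude=45).save(gdb + r"\Glenmore_Hillshade")
```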

After running this model I think I’ve got all the layers I need for the app.  Interestingly, even though the elevations across the LiDAR extent don’t vary by much, the hillshade shows us the terrain really nicely:

Glacial moraine anyone?  As a next step, I’ll work through symbolising everything and get ready to set these up as web services – tune in next week for all that.


• 23/07/2020
