In this post we'll cover the ins and outs of creating home ranges for animals based on point locations. It was mainly written as a tutorial for ecology students, but by all means, feel free to have a go.

In animal ecology there's an important concept around home ranges, which can be thought of as an animal's territory, for lack of a better word. If animal ecologists are lucky enough to have locational data for their animals of interest, they are often concerned with identifying this territory and some of the environmental characteristics that go with it. Home ranges tend to look something like this when plotted on a map:

[Image: EndResult]

So here we can see isopleths (contour lines) that contain 50% and 95% of all the locations. Splitting the home range into 50% and 95% contour lines is important to animal ecologists as it lets the researcher see activity patterns: for example, within the 50% contour is where the animal was estimated to have spent 50% of its time during the period of interest. We can also see the Landcover Database underneath, and the attribute table from a spatial join that tells us how many points occurred in each of the landcovers within the 95% isopleth. Sweet, huh?

In this post I'll take you through how home ranges are derived and what you can then do with them. There are some built-in tools in ArcGIS that allow us to derive crude home ranges but, as we'll see, doing it properly means going outside ArcGIS a bit and working with some other software. I'll cover what you need and how to install the extra bits.

In many cases, the locational data have been collected from GPS (Global Positioning System) collars fitted to the animals. The animals are left to wander freely while the GPS records their positions over time, and the fixes can then be downloaded for analysis later. Below is an example of this in an Excel spreadsheet:

[Image: SpreadSheetwithGPS2]

(Click here for a copy of this spreadsheet.)

The first column is an identifier for the individual.  In this case, each record relates to the location of a Wapiti known as Edith.

[Image: wapiti]

(That isn’t Edith as far as I know – unless females are now growing antlers.  Kids these days…)

From a GIS point of view, the important columns are Latitude and Longitude: these are the coordinates that will let us map the locations (Longitude is the x-coordinate and Latitude the y-coordinate). This spreadsheet can be added to ArcMap in the usual way (though note that when you add it you need to navigate down to the individual sheet, here called 'Edith 2$'). Here it is shown in my Table of Contents (you'll need to click the List by Source button to see the table):

[Image: TOC]

In order to map the points, right-click on the table name, go to "Display XY Data…", and set it up with the field name for the x-coordinate (Longitude) and the field name for the y-coordinate (Latitude). We haven't got a z-coordinate (elevation) in these data, but if we did we could set that as well.

[Image: DisplayXY]

When the window first opens, the Coordinate System description will be blank. We could leave it that way, but it's better practice to set it. We know that these data were collected with a GPS unit, and GPS uses WGS84 (World Geodetic System 1984) as its coordinate system, so to set it, click the Edit button and wind your way down the path: Geographic Coordinate Systems > World > WGS84. (If you're going to use this one a lot, click the "Add to Favorites" (sic) button and it'll be there for you next time without having to click through all the folders.) Once I've clicked OK, the points are mapped:

[Image: MappedPoints]

She's been busy. Note that the points are in a new layer called "'Edith 2$' Events". An event layer is a temporary layer, so to do any substantial analysis we should save it as a permanent file. Even better, we should convert it (project it) from Latitude/Longitude (which is really a coordinate system on a 3D sphere) to a flat, 2D map projection, which will make area and distance measurements easier and more accurate. The standard projection for New Zealand is New Zealand Transverse Mercator (NZTM), a projection designed to play very nicely with GPS data. So we've got a two-step process up next: export the events layer to a shapefile, then project that file from WGS84 to NZTM.

  • Right-click on the Events layer and go to Data > Export Data.  Save your output as a shapefile (you could just as easily save it as a feature class in a geodatabase if you prefer).  Here I’m saving it to a HomeRanges folder in my Dropbox (I’m doing this on a laptop) and calling it EdithPointsWGS84.shp:

[Image: ExportPoints]

  • To project these points, open ArcToolbox and go to Data Management Tools > Projections and Transformations > Project (note that this tool only projects vector data; there's a separate tool for projecting rasters in the Raster subfolder).

[Image: PorjectTool]

  • The Project tool window opens:

[Image: ProjectWindow]

  • Once you've set the layer to project (EdithPointsWGS84.shp in this case), note that its coordinate system gets picked up automatically from that layer.
  • Give the new, projected layer a useful name.  Here I’ve used EdithPointsNZTM.shp.
  • The Output Coordinate System can be set by clicking the Edit button at right. The path is Projected Coordinate Systems > National Grids > New Zealand > NZGD2000 New Zealand Transverse Mercator, about six pages down (note the dizzying array of NZ systems).
  • The Geographic Transformation gets picked up automatically – in this case the default is fine.
  • Click OK and a new layer gets created and added to your map.

When added to the map, the new layer will overlay the original one even though it's in a completely different coordinate system (ArcMap reprojects layers on the fly for display). The coordinate system of the map is usually set by the first layer added, which in this case was in WGS84 (check out the coordinates at lower right of the map window). We're going to work in NZTM from here on, so remove all the layers except the NZTM layer. Then right-click on "Layers" (the name of the data frame), go to Properties > Coordinate System, set it to NZTM (path above) and click Apply. The last thing we need to do is change the display units: switch over to the General tab and change the Display units from Decimal Degrees to Meters:

[Image: GeneralTab]
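Incidentally, once you have R installed (we'll need it later anyway), that export-and-project sequence can also be scripted outside ArcGIS. Here's a minimal sketch using the sp and rgdal packages (these are extra installs, not part of the package list we'll use later, and the CSV filename is just a stand-in for wherever you saved the spreadsheet):

library(sp)
library(rgdal)

# read the points (assumes the Excel sheet was first saved as a CSV)
edith <- read.csv("EdithPoints.csv")

# promote the data frame to spatial points and declare them as WGS84
coordinates(edith) <- ~ Longitude + Latitude
proj4string(edith) <- CRS("+proj=longlat +datum=WGS84")

# EPSG 2193 is NZGD2000 / New Zealand Transverse Mercator
edithNZTM <- spTransform(edith, CRS("+init=epsg:2193"))

# write the projected points out as a shapefile
writeOGR(edithNZTM, ".", "EdithPointsNZTM", driver = "ESRI Shapefile")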

Okay, so we're just about ready to get started on our home ranges. ArcGIS has a few built-in tools to derive rough home ranges. The main one is the Minimum Bounding Geometry tool (ArcToolbox > Data Management Tools > Features). What this does is create the smallest polygon that surrounds all the data points. It's pretty crude and doesn't really allow you to do anything special:

[Image: MBG]

Setting the Geometry Type to CONVEX_HULL produces this:

[Image: MGBPolygon]
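(Incidentally, base R can do the convex hull too. A minimal sketch, assuming the projected points from the earlier sketch are in edithNZTM:)

# chull() returns the indices of the points on the convex hull, in order
xy <- coordinates(edithNZTM)
hull <- chull(xy)

# repeat the first vertex to close the ring, then draw it over the points
plot(xy, pch = 16, cex = 0.5, asp = 1)
polygon(xy[c(hull, hull[1]), ], border = "red")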

That's useful, but from an animal ecologist's point of view it's not quite robust enough. The literature demands a different approach: kernel density estimation. It sounds a bit involved, but really it's just a way to objectively measure how densely grouped a given set of points is. This is done by moving a fixed "neighbourhood" (the kernel) over each point and counting how many other points fall within that neighbourhood, which lets us estimate the density. Conceptually, you're fitting a smooth curve over each of the point locations and then summing the curves wherever they overlap. The kernel is usually a 3 x 3 neighbourhood of grid cells, so this is essentially a raster analysis: the output is a grid where each cell holds a measure of density. The next step is to create contours (or isopleths) from the grid values and use these to define the polygon home ranges. Animal ecologists are often particularly interested in the 95% and 50% isopleths, the contours that contain within them either 95% or 50% of all the points (the 95% isopleth is often used to define the home range, while the 50% isopleth defines the core range). So far, so straightforward (right?).
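If the "smooth curve over each point" idea sounds abstract, a toy one-dimensional example in R may help; the built-in density() function does exactly this along a line:

# seven made-up positions along a line
x <- c(1, 2, 2.5, 3, 7, 8, 8.2)

# fit a kernel over each observation and sum them into one smooth curve
plot(density(x), main = "1D kernel density estimate")
rug(x)  # tick marks showing the raw observations

The curve peaks where the observations cluster; our home range surface is the same idea in two dimensions.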

So how do we do it? ArcGIS does have a kernel density estimation tool, and it works fine (ArcToolbox > Spatial Analyst Tools > Density). The issue is really the isopleths: determining the contours that capture 95% (or 50%) of the points is not so straightforward. There is another option that lets us do both in a reasonably easy way: the Geospatial Modelling Environment (GME). This is an independent bit of software that does all sorts of great spatial analysis (some of it similar to built-in ArcGIS tools, but there are also more specialised tools that apply to specific disciplines). So before we do our home ranges we need to download and install GME.

Installing GME

GME is a command-line program with a huge array of tools built in. Some of those tools actually use R, the open-source statistical modelling environment, so we also need to ensure that the correct version of R is installed.

Going to the GME download page, you'll see that there are various versions that run against different versions of ArcGIS. Most campus machines are running ArcGIS 10.3, so the version you want to download is probably 0.7.4 (check which version you have on your computer before downloading GME). You'll also need the right version of R (I'm using the latest, which is 3.2.0).

[Image: GMEPage]

Further down the page are installation instructions. What you end up downloading is a zip file that needs to be extracted somewhere. Put it somewhere safe and run the Setup.exe file. (If you're trying this on an LU computer, you'll probably need to contact ITS and request that it be installed.)

Next, get the latest version of R and install it if you don't already have it. A few libraries need to be added to the basic installation, so when you start up R, copy and paste the line below at the R cursor, hit return, and pick a mirror site (scroll down to New Zealand; there's a mirror at Auckland Uni):

install.packages(pkgs=c("CircStats", "deSolve", "coda", "deldir", "igraph", "RandomFields", "ks"))

That’s all we should need to do in R once those new libraries are installed.
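If you want to be sure the installs worked, one quick check is to try loading each package; any error here means the corresponding install failed:

# the list simply mirrors the install.packages() line above
pkgs <- c("CircStats", "deSolve", "coda", "deldir", "igraph", "RandomFields", "ks")
for (p in pkgs) library(p, character.only = TRUE)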

To check and make sure that everything's set up properly for GME, open it and go to Help > Citation. If you see lines for both GME and R then you're good to go. If not, well, I think it may be time to consider a new career path…

[Image: RCitation]

(Are we ever going to get to the home ranges?  Yes.)

So now we're ready to crank out some home ranges (told you). Back in the main GME window, type "kde" into the Commands window at upper left. It will pop up in the box below it; click once on it and a set of entry windows appears at the right, as shown below:

[Image: GMEKDE]

You can now populate it with your parameters. We'll set EdithPointsNZTM.shp as the "in" file and save the "out" grid as EdithKDE. The bandwidth entry is an important one, and one of the reasons why using GME is preferable (ArcGIS doesn't have this option). There are several bandwidth options, but the one most favoured in the literature seems to be LSCV (Least Squares Cross Validation). (I don't claim to understand how this works, but then I don't understand what my dog thinks.)
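(I can't say exactly what GME runs behind the scenes, but the ks package from our install list exposes an LSCV estimator, so if you're curious you can replicate the bandwidth step in R directly. A sketch, again assuming the projected points are in edithNZTM:)

library(ks)

xy <- coordinates(edithNZTM)  # two-column matrix of NZTM x, y
H <- Hlscv(xy)                # least squares cross validation bandwidth
fhat <- kde(xy, H = H)        # kernel density estimate using that bandwidth
plot(fhat)                    # quick look at the density surface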

Here are the settings I’ve used as a first go.  I’ve opted to use the defaults for the remaining parameters:

[Image: EdithKDE1]

After clicking Run at bottom, here's the result on my map (I had to add the layer manually from my folder):

[Image: EdithKDEMap]

Just for kicks, here’s what it looks like in 3D to give you a sense of what it’s created:

[Image: EdithKDE3D]

So you can (hopefully) see that the surface is higher where the points are denser.  We’re not quite done yet.  Next we need to derive our isopleths.  Back to GME and type “isopleth” in the command window.

[Image: GMEIsoplethWindow]

Key settings here are the input, the output and the Quantiles. The Quantiles entry sets which isopleth levels I want: "c(0.5,0.95)" tells GME I want the 50% and 95% isopleths. If I leave it blank it will do the 100% isopleth. Also of note, I've specified a polygon output. Isopleth lines are good for display, but having polygons means I can use them to clip out other layers, or do some spatial analysis to see the distribution of stuff inside a given isopleth, as we'll see later.
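(The ks sketch from earlier can draw the equivalent percent contours if you want a quick cross-check; note that cont is expressed in percent rather than as a proportion:)

# 50% and 95% probability contours of the fitted density
plot(fhat, cont = c(50, 95))
points(xy, pch = 16, cex = 0.3)

Back in GME, here's the output: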

[Image: EdithIsoandPoly]

You can see here that I've got two separate polygons, one for each isopleth level, plus the isopleth lines themselves. As a next step I can add a layer like the Landcover Database (LCDB) and get a sense of which landcovers Edith spends her time in:

[Image: KDEwithLCDB]

Great, so now we can see visually which landcovers fall within the isopleths. If we want to be a bit more analytical, a good next step is to clip out the LCDB polygons within the isopleths. Here's where the polygons come in handy: I can use one of them to clip out what I need. I first went into the polygon attribute table and selected the record with 0.95 in the isopleth field. Then I used the Clip tool (ArcToolbox > Analysis Tools > Extract, or the Geoprocessing menu > Clip). Here's the output:

[Image: LCDBClip]

Next, I'd like to get a sense of how many of the points within the 95% isopleth fall in each of the three landcover classes. There are a few ways to do this; one handy way is to use the Spatial Join tool (ArcToolbox > Analysis Tools > Overlay). This tool is very similar to Identity, with one useful exception: depending on how you set it up, one of the fields you get in the output is called "Join_Count", and it neatly summarises the number of features falling within another feature. So in this case, I'll join the Edith points to the LCDB polygons as a one-to-one join, which will summarise the number of points falling within each LCDB class:

[Image: EndResult]

We can see from this that, over the time these data were collected, Edith spent most of her time (181 of 202 records) hanging out in Indigenous Forest (can’t say that I blame her for that).
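If you ever want to cross-check a count like this outside ArcGIS, sp's over() function will tabulate points against polygons. A sketch, assuming the clipped LCDB shapefile is read in with rgdal and that its class-name field is called "Name" (a guess; check your attribute table and substitute the real field name):

lcdb <- readOGR(".", "LCDBClip")  # the clipped landcover polygons (same CRS as the points)
classes <- over(edithNZTM, lcdb)  # for each point, the attributes of the polygon it falls in
table(classes$Name)               # counts of points per landcover class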

So, we've covered a lot of ground here (excuse the pun). We've seen how to map point data from a spreadsheet, and then how to project them from WGS84 to NZTM. We next got GME and R set up and used them to derive the 50% and 95% home ranges using kernel density estimation. Finally, a spatial join allowed us to summarise where Edith was spending her time.
