Sunday, June 28, 2015

Applications in GIS: MEDS Protection

Good Day GIS enthusiasts,
      Welcome to the continuation of my Homeland Security GIS topics. We are continuing last week's look at Minimum Essential Data Sets (MEDS), but this week we are looking at practical application: taking the data sets and layers generated last week, combining them with some additional data (particularly some LiDAR-derived rasters), and applying them to a real-world situation. The situation in question is the Boston Marathon bombing of 2013. Here we take the MEDS Boston data and look more toward prevention, through the establishment of surveillance, security, and observation points within view of the finish line and surrounding area. With these points I am utilizing specific analysis tools available through ArcMAP and ArcScene. The overall objectives for this week were to explore the LiDAR data by using it to generate hillshading, perform a viewshed analysis, and create a line-of-sight analysis utilizing our created observation points. Two maps were generated, predominantly using ArcMAP, with a little ArcScene on the second.
Let's look at the first map.


This is an overview map of the event area. It shows a 3-mile buffer around the marathon finish line. Identified around the finish line area are the 10 closest hospitals and medical centers, all of which have been flagged as needing increased security during the event. A 500-foot buffer has been placed around each of these critical infrastructure facilities. This is a fairly simplistic view of the area showing the various levels of road features throughout; the primary, secondary, and local roads are all symbolized appropriately for quick recognition and understanding. The lower inset, which is a close-up look at the finish line, highlights additional security locations by placing checkpoints at each road intersecting the 500-foot buffer around the finish line. Additionally, another inset highlights the 6 counties that are a part of the Boston Metropolitan Statistical Area.


There is much more deliberate analysis in this map. The first section at the top is a straightforward look at 15 identified observation points around the block within view of the finish line, highlighted in the center. These are also labeled with the elevation of the best observation height for each point. The second frame down combines a multitude of analyses. Most clearly visible is the viewshed analysis: the pink and green layer, with pink meaning an area is not visible from the closest associated observation point and green meaning the view is unobstructed. Underlying the viewshed is a hillshade layer, which provides the gray relief shadowing. Over top of this is a line-of-sight analysis from each point to the finish line; the red segments indicate some form of obstruction and the green segments are clear. This view is further broken down for the most obstructed point (#4) in the graph just under and to the right of the line-of-sight frame. Additionally, all of the lines of sight were evaluated in the 3D environment of ArcScene. The lower left inset shows an exported feature from ArcScene depicting the lines of sight in a 3D relationship with their surroundings. This layer is also oriented with the same SW-NE look as the other data frames above for continuity of reference. An inset for this area is also provided at a much closer scale than the base map above.
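For those curious how this comes together in script rather than through the toolbox dialogs, here is a minimal sketch of the hillshade and viewshed steps using ArcPy's Spatial Analyst extension. The workspace path and layer names are hypothetical, not the lab's actual data.

    import arcpy
    from arcpy.sa import Hillshade, Viewshed

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\MEDS\Boston.gdb"  # assumed workspace

    elevation = "lidar_dem"           # LiDAR-derived elevation raster (assumed name)
    observers = "observation_points"  # the 15 observation points (assumed name)

    # Gray relief shading that underlies the viewshed display
    hillshade = Hillshade(elevation)
    hillshade.save("lidar_hillshade")

    # Cells visible from at least one observer vs. cells that are hidden
    viewshed = Viewshed(elevation, observers)
    viewshed.save("finish_line_viewshed")

    arcpy.CheckInExtension("Spatial")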
I don't know the exact process that was in place on that fateful day. However, with this type of planning and technical ability we can hope to better plan for and prevent such acts in the future. This is an excellent look at some of the analysis that goes into such large events drawing tens to hundreds of thousands of people. Thank you.


v/r


Brandon

Wednesday, June 24, 2015

Applications in GIS: MEDS Prep

Hello,
    Welcome to this week's discussion revolving around the Department of Homeland Security (DHS), specifically some of the measures the department uses, and has been directed to use, to prepare for potential natural or man-made disasters across the country. This is part one of a two-part, two-week look, starting with the preparation of a Minimum Essential Data Set (MEDS). The basis for this week was accumulating and organizing 8 different data layers, described below, in ArcMAP for a Tier 1 urban area as defined by the DHS. The overall objectives for this week were to complete this preparation while exploring and developing an understanding of the different directives and concepts that shape DHS's role in national preparedness.
     Presidential Policy Directive 8 directs the Department of Homeland Security to establish a National Preparedness Goal that will be "informed by the risk of specific threats and vulnerabilities – taking into account regional variations - and include concrete, measurable, and prioritized objectives to mitigate that risk" (http://www.dhs.gov/presidential-policy-directive-8-national-preparedness). The guidance in the directive establishes a need to coordinate at all levels of government, from local to state to federal, to systematically prepare for varying types of major incidents.
    As a result of the directive, several other concepts were formalized, such as the Universal Task List (UTL) and the Target Capabilities List (TCL). These lists are a combination of tasks and capabilities that bolster communities' capabilities against specifically targeted threats and hazards. When adhering to measures from these lists, the local community and government should be ideally prepared for a potential incident, including response and recovery.
     A significant portion of DHS involvement in this preparation is the establishment of the MEDS. This is the DHS standards-based model for incorporating geographic information systems and sciences into the collection, preparation, handling, and sharing of geospatial data beneficial to the response to a major incident at any level. The MEDS is broken down into 8 essential data layers: orthoimagery, elevation, hydrography, transportation, boundaries, structures, land cover, and geographic names (from the Geographic Names Information System). These layers are applied to a tiered area, specifically a Tier 1 Urban Area as defined by the Homeland Security Grant Program (HSGP). These are areas that have undergone assessment to determine the level of risk associated with high-density urban areas. The area is described based on potential threats and the overall critical infrastructure that could be exposed, then defined by the cumulative extent of the metropolitan area, with a 10-mile buffer around it.
      This week's MEDS preparation focused on Boston, MA, and the 10-mile surrounding area. Most of the data originated in The National Map viewer from the USGS. All of the data for the various datasets has been projected to the same coordinate system and datum for uniformity. Also, background data such as the transportation lines and GNIS points have all had scale-dependent labels applied. Each dataset has been saved as a specific layer file, or layer package where needed, for easy transfer from one user to another. Interoperability and transferability are the name of the game in quick preparation of these datasets. The data will be used as necessary for the disaster at hand. The transportation layer has been divided into primary, secondary, and local roads as prescribed by the Census Feature Class Code classifications. Need evacuation routes, or just the roads impacted by some disaster? They are there. Need water mains or flow valve information? It's there in the hydrography dataset. Need to know the potential coastal swell from a large storm? The elevation data is available. What geographic places or specific land types are impacted by a terrorist plot to destroy downtown? We have the information readily available. Overall, that is the purpose of the MEDS: to be prepared for a broad range of unfortunate events. Below you can see the final computer layout of available datasets. Why? Because in a blog post you should have something to look at! Thank you.




BMSA = Boston Metropolitan Statistical Area
bcd = my initials
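For a sense of how this kind of preparation could be scripted rather than clicked through layer by layer, below is a minimal sketch of a re-project-and-save-as-layer-file routine over the vector MEDS layers. The workspace path, dataset names, and the choice of NAD 1983 UTM Zone 19N are all hypothetical.

    import arcpy

    arcpy.env.workspace = r"C:\MEDS\Boston.gdb"  # assumed workspace
    target_sr = arcpy.SpatialReference(26919)    # NAD 1983 UTM Zone 19N (assumed)

    meds_layers = ["hydrography", "transportation", "boundaries",
                   "structures", "land_cover", "gnis_names"]

    for name in meds_layers:
        projected = name + "_utm"
        # Put every dataset in the same coordinate system and datum
        arcpy.Project_management(name, projected, target_sr)
        # Layer files make it easy to hand symbolized data to another user
        lyr = arcpy.MakeFeatureLayer_management(projected, name + "_lyr")
        arcpy.SaveToLayerFile_management(lyr, name + ".lyr")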

Tuesday, June 23, 2015

GIS Programming: Geoprocessing with Python

Hello,
     This week in programming focuses on using geoprocessing tools within a Python script. Key objectives were to identify the helpful features of ArcMAP and Python for working with tool-specific Python syntax, to explore functions and classes within ArcPy and apply them to environment settings, and to work with the specific messages generated by tools as they run. These objectives were explored by creating a script which takes a point feature class (Hospitals) of an area, defines its current projection, and then adds XY data to the feature class. This allowed the points to be buffered out to 1,000 meters. Then another tool was used to dissolve the areas where the different hospitals' buffers overlapped. The Buffer tool does have an optional input to do this dissolving in one step; however, to further explore scripting with ArcPy, two separate tools were scripted. Below is the result message of the functional script.


The assignment portion of the lab was broken down into three specific functions: adding the XY data, buffering the hospitals, and dissolving the overlapping buffer lines. This order is reflected in the resulting message blocks. Each segment of script written for an individual tool had messages generated to show its overall progress and success or failure. Additional background processing is not shown in this screenshot, such as setting the workspace or defining environment specifics like the coordinate reference used for the various feature classes, but all of these were programmed into the script as part of the lab. All inputs that I used were hard-coded into the individual tool calls, meaning that I was not first assigning the tool entries to variables. Overall this was a good exploration of automating multiple geoprocessing tools within Python, rather than using the interactive window within ArcMAP. A sketch of the general workflow is below. Kudos goes to Arc Help for having detailed examples of different methods to script the tools used! And thank you for your interest and stopping by.
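Here is a minimal sketch of that workflow, with a hypothetical workspace path and an assumed coordinate system; the actual lab data and parameters differed.

    import arcpy

    arcpy.env.workspace = r"C:\GISprogramming\Module6\Data"  # assumed path
    arcpy.env.overwriteOutput = True

    hospitals = "hospitals.shp"

    # Define the current projection, then add XY coordinate fields to the points
    arcpy.DefineProjection_management(hospitals, arcpy.SpatialReference(4269))
    arcpy.AddXY_management(hospitals)
    print(arcpy.GetMessages())

    # Buffer each hospital out to 1,000 meters
    arcpy.Buffer_analysis(hospitals, "hospitals_buffer.shp", "1000 Meters")
    print(arcpy.GetMessages())

    # Dissolve the areas where the different hospitals' buffers overlap
    arcpy.Dissolve_management("hospitals_buffer.shp", "hospitals_dissolve.shp")
    print(arcpy.GetMessages())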


v/r



Brandon

Tuesday, June 16, 2015

GIS Programming: A Look Into Geoprocessing

Hello GIS enthusiasts,
     This week in programming we are finally getting into more of the meat of Python integration with GIS. Geoprocessing is a huge portion of ArcGIS, as it is what allows you to perform spatial analysis and modeling, along with automation of certain tasks. This week I looked at a number of different ways to facilitate some common geoprocessing tasks. The main objectives explored throughout this week's lab are below:

  • Locate Tools and Toolsets
  • Use batch processing to run a single tool multiple times
  • Create a new toolbox and model
  • Use the ArcMap Python Window to run a geoprocessing script
  • Create a geoprocessing script and script tool
  • Export a model as a script

The objectives above may not make much sense on their own, but in the end they boil down to three main outputs: the creation of a new ArcToolbox, a newly created model held within it, and a newly created script based off the model. The toolbox housed the model, which was exported as a Python script; this script was adjusted and imported back into the toolbox as a script tool. So what do the model, script, and script tool do?
All of these were variations on the same theme: they take a soils feature class and clip it to the extent of a basin feature. This new clipped soils feature is then examined for areas of non-suitable farmland. Once the bad farmland is identified, the selected areas are removed from the clipped soils feature, producing an all-new feature class which shows only prime farmland. The visual result of this geoprocessing is below.


It's not an island chain; it's merely what was described above, a soils feature with identified areas removed, in an incredibly simplistic map. The finalized toolbox, which contains the script that generated this feature class along with the model and the host script used to create the script tool, was packaged together in a .zip file. This allows me to share the whole package with whomever might need access. You can take this package and adjust the individual items to suit your purposes, which is one of the amazing things about geoprocessing as it relates to Python and ArcMAP. A sketch of the clip-select-erase sequence is below. Thank you for stopping by this week.
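For reference, here is a minimal sketch of what the model and script do, with hypothetical paths, field name, and class value; the lab's actual data and query may differ.

    import arcpy

    arcpy.env.workspace = r"C:\GISprogramming\Module7\Data"  # assumed path
    arcpy.env.overwriteOutput = True

    # Clip the soils to the basin extent
    arcpy.Clip_analysis("soils.shp", "basin.shp", "soils_clip.shp")

    # Select the non-suitable farmland (field name and value are assumptions)
    arcpy.Select_analysis("soils_clip.shp", "soils_notprime.shp",
                          "FARMLNDCL = 'Not prime farmland'")

    # Erase the bad areas, leaving only prime farmland
    arcpy.Erase_analysis("soils_clip.shp", "soils_notprime.shp",
                         "soils_prime.shp")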

Sunday, June 14, 2015

Application in GIS: DC Crime

         Hello, and welcome to the first week discussing homeland security type topics in GIS. We are done with natural disasters and GIS responses for now; it's time to look at how we can help out our law enforcement agencies. Today we will look at GIS as it applies to mapping crime data, both in proximity to police stations as a whole and as the percentage of crimes occurring closest to individual police stations in the Washington, DC area, using crime data as of 2011. The crime data was provided by UWF, as available through the DC Metro Police Department (https://data.dc.gov). The overall objectives for this week exercised by this lab are as follows:

  • Analyze data stored in a Microsoft Excel Database 
  • Create data using the Display XY tool
  • Create an address locator using street data
  • Geocode tabular address data to point features
  • Prepare data for processing in a geodatabase
  • Use Field Calculator to calculate attribute table values
  • Perform multiple ring buffers and create spatial joins in the attribute tables
  • Create Multiple data frame maps to show various crime distributions
  • Use Kernel Density to display crime clusters
  • Compile and present results for real world problem solving
These objectives were used to develop the two maps below:


There is a multitude of things being explored in this map simultaneously, most of which are highlighted in the central map. The data for the police stations and crimes was given as Excel spreadsheet data, which required conversion to ArcMAP points. This was done by adding the X and Y coordinates and importing the information from the Excel table. Then I moved forward with multi-ring buffers located around the DC police stations to show the areas examined for crimes occurring within 0.5, 1, and 2 miles of police stations. Percentages of all crimes occurring within these distances are displayed in the small chart next to the appropriate legend items. Also, looking at the individual police stations, it should become apparent that the police station markers are of varying sizes. This is a result of taking all of the crimes shown and determining which police station each is closest to, then portraying that as a percentage of the total crimes. The three largest police stations at 13, 12, and 11 percent show as the largest symbol, while stations closest to only 1-2% of total crimes are the smallest. Neither this nor the buffer layer would be possible without spatial join tools, that is, tools that allowed me to join the attributes from one dataset, like the police stations, with another, the crimes, to compare which stations were closest to which crimes. With this analysis comes a proposal: where should one or two new police stations ideally be placed? The two green markers represent my pitch for new police stations. The addition of these two stations would decrease the load on the two closest and highest-percentage stations while not raising either new station's overall percentage above 9%. This is explained in the narrative on the map. Additionally, you can see overall population totals by tract area, and also a graph of total crimes by offense.
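A minimal sketch of the buffer and spatial-join steps is below, with a hypothetical geodatabase and field names; the actual lab used the geocoded crime points described above.

    import arcpy

    arcpy.env.workspace = r"C:\GISapps\DCcrime.gdb"  # assumed workspace
    arcpy.env.overwriteOutput = True

    # 0.5, 1, and 2 mile rings around each station
    arcpy.MultipleRingBuffer_analysis("police_stations", "station_rings",
                                      [0.5, 1, 2], "Miles")

    # Tag each crime with its nearest station via a spatial join
    arcpy.SpatialJoin_analysis("crimes", "police_stations", "crimes_nearest",
                               "JOIN_ONE_TO_ONE", "KEEP_ALL",
                               match_option="CLOSEST")

    # Count crimes per station ("NAME" is an assumed station-name field);
    # these counts drive the graduated station symbols
    arcpy.Statistics_analysis("crimes_nearest", "crimes_per_station",
                              [["OBJECTID", "COUNT"]], "NAME")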
Next we will look at a distribution of some of these particular crimes.


Again with the multi-product theme, three different density analyses (based on the crime points generated for the previous map) are being compared on this page. These are kernel density analyses, which take specific point locations, group them together by proximity to one another using a search radius, and produce a raster whose cells are given values based on their proximity to the points. This method of analysis gives you a look at where the crimes were occurring spatially, and can also be used as a predictive tool for leveraging resources. In general, I tried to make things as simple as possible with this, under a 'less is more' approach. You're left with just map overlays for the majority of what you need to process, with simple one-size-fits-all legends. Population density is represented underneath all of the crime density layers, but for more clarification another inset of just this feature was provided. These are just a couple of the ways we can look at how GIS can affect and help the realm of law enforcement; we will continue to explore this topic in the coming weeks. A sketch of the kernel density call follows below. Thank you.
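A minimal sketch of one density surface, assuming the Spatial Analyst extension and hypothetical layer names; the cell size and search radius shown are illustrative values, not the lab's settings.

    import arcpy
    from arcpy.sa import KernelDensity

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\GISapps\DCcrime.gdb"  # assumed workspace

    # "NONE" counts each point once; the cell size and search radius (in map
    # units) would be repeated per offense type for the three maps
    density = KernelDensity("burglary_points", "NONE", 100, 1500)
    density.save("burglary_density")

    arcpy.CheckInExtension("Spatial")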

Monday, June 8, 2015

GIS Programming: Participation and Python in the Virtual/Real World

Greetings all,
       This post is dedicated to a participation-based assignment for the programming class. It essentially required me to do some research on GIS, mapping, and Python uses and applications out in the wide world and share with the class. When doing this research I really wanted to find something very forward-thinking. Also, coming off the end of last semester, I still had the last and some of the most interesting labs on my mind. They referenced 3D mapping using ArcScene, ArcGlobe, Google Earth, and the like. It's hard to think of mapping applications that are more forward-thinking than the continued evolution of 3D mapping environments. I came across a particularly interesting blog to satisfy the requirements of this first assignment. The posting is from Google's Geo Developers Blog, referencing the beginning of a relatively new facet of Python applications that came about toward the end of 2011 to the beginning of 2012. This application is called pyKML. It's a Python library that is specifically designed to add Python scripting functionality for working with the KML files utilized in the previously mentioned 3D environments. It's interesting to note that the blog post linked here really hits on everything we have just gone over in the past few weeks of this course: the basics, from the iconic "Hello World" to discussions on looping and branching. It is these simple facets of Python that the creator wanted to bring to KML processing, and it is still evolving today. The link also provides further links to many available resources, explanations, and applications involving pyKML. I don't know that we would ever get into something this in-depth in this kind of course; however, it is nice to know that as far as you can go with mapping and cartography, there is likely a Python script to make your life easier.
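To give a flavor of the library, here is a minimal pyKML "Hello World" in the spirit of that blog post; the placemark name and coordinates are illustrative only.

    from lxml import etree
    from pykml.factory import KML_ElementMaker as KML

    # Build a simple KML Placemark object in Python
    placemark = KML.Placemark(
        KML.name("Hello World"),
        KML.Point(KML.coordinates("-87.216,30.474"))  # lon,lat (assumed point)
    )

    # Serialize it to KML text ready for Google Earth
    print(etree.tostring(placemark, pretty_print=True))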
        On a side note to the actual assignment, but of interest along the same lines: what's after 3D? That's right, 4D, the completely immersive environment. Some of you may have heard of the new Oculus Rift technology, still largely in development. It's essentially a virtual reality engine. I found a YouTube video which falls somewhat in line with the assignment of sharing Python and how it's affecting new and upcoming technologies and world applications. Imagine adjusting Python code to modify your environment in your own virtual world. That's what the video below is all about. Please view if interested, and understand that it doesn't have any bearing on the portion above. And for anyone else needing a good jumping-off point about Python applications in the world, please visit here. It's Reddit's dedicated Python page. Thank you.

v/r

Brandon

Sunday, June 7, 2015

GIS Programming: Error Handling and Debugging

Hello all,
     Last week we concluded the big fundamentals of Python scripting, yet this week feels like it could be even bigger still for really understanding and going in depth with Python: error handling and debugging. These are integral facets of scripting, as errors will happen. Knowing how to find, troubleshoot, and potentially fix an error is key to successful scripting. Thankfully, that's what this week was dedicated to. The overall objectives were to examine syntax errors and exceptions, implement the debugging procedures available in PythonWin, and handle some exceptions. I was given 3 complete scripts of various functions, each with particular errors focused on one of the objectives above, and had to correct or handle the errors accordingly.

The first script's function was to print out a list of the field names in a specific shapefile. Its results can be seen below.

This script had a couple of syntax errors, such as improper capitalization, or a pair of dependent variables that were reversed with each other, etc.

The second script prints out a list of layers for a data frame in an example ArcMAP document.

This script had significantly more errors within it, beyond just syntax. There were variables missing definitions or improperly referenced, and arguments added where they shouldn't be.

The third script is likely the most interesting. This script has two parts, and the goal was to demonstrate the functionality of try/except statements within a Python script. These statements allow errors to be contained, letting a script keep running despite having a segment flagged with an error. So the first part of the script contains an error, and the goal was to isolate it and allow the script to move on and run the second part.


You can see that Part A has an error that was flagged and described while running, but the script successfully moved past it and provided the desired output for Part B.
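For illustration, here is a minimal sketch of the try/except pattern the third script demonstrates; the failing tool call is hypothetical, not the lab's actual Part A.

    import arcpy

    try:
        # Part A: this call is expected to fail (the layer does not exist)
        arcpy.GetCount_management("nonexistent_layer")
    except arcpy.ExecuteError:
        # Report the geoprocessing error instead of crashing the script
        print("Part A raised an error: " + arcpy.GetMessages(2))

    # Part B still runs because the exception above was handled
    print("Part B completed successfully.")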

Error handling and debugging can be time-consuming, but understanding them will pay dividends in the long run when working with these scripts. Reviewing syntax errors and employing the try/except statement above are only the beginning of my understanding of how best to employ Python debugging, but it was a good start!

Saturday, June 6, 2015

Applications in GIS:Hurricanes

Hello readers,
     This post continues the past few weeks' theme of GIS and natural disasters and their associated hazards. The disaster being focused on this week is Hurricane Sandy, which struck the East Coast of the United States in October 2012. GIS is being used to map the storm's path while presenting which states subsequently declared emergencies with FEMA as a result of the damage from the storm. Additionally, GIS is being used to do a street-by-street comparison of pre- and post-storm imagery to assess damage at the county parcel level. These are both prime examples of the information that GIS can portray at two vastly different scales, from the continental down to an individual neighborhood.
The data was predominantly provided by UWF, and all processing was completed using ArcMAP. The objectives explored while building the below maps were as follows:

  • Analyze data stored in a Microsoft Excel database
  • Create data using the Display XY tool
  • Create data using the Points to Line feature tool
  • Create effective labels utilizing VB scripting
  • Prepare data for processing in a geodatabase including, but not limited to, proper nomenclature
  • Perform a raster mosaic
  • Explore the Effects toolbar using the Flicker and Swipe tools
  • Prepare post-storm damage assessment data using attribute domains in a geodatabase
  • Locate and identify attributes based on storm damage
  • Generate a report/table based on damage results for the given study area
  • Create effective locator inset maps
  • Compile a damage assessment map in a format that best communicates the data


Most of these objectives focus on specific features and toolsets used in ArcMAP. The rest focus on creating coherent maps for the best information portrayal possible. My attempt at these things is below:


This map focuses on portraying Sandy's path from the Caribbean Sea through the Atlantic and into the northeastern United States. You can see that the points along the path show relative wind strength and pressure as the storm fluctuated in intensity. This map was created by taking the known intensity points, creating a path out of them, and displaying them as seen. The symbols used were custom-made for the assignment and color-coded for relative strength. Once the storm made landfall the intensity dropped; however, it was still such a massive storm that it had severe impacts on all the states you see displayed. A sketch of the point-to-path step is below.
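Here is a minimal sketch of building the storm track from tabular data; the file paths and field names (Lon, Lat, DateTime) are hypothetical.

    import arcpy

    arcpy.env.workspace = r"C:\GISapps\Sandy"  # assumed path
    arcpy.env.overwriteOutput = True

    # Turn the spreadsheet-derived table into points (Display XY equivalent)
    arcpy.MakeXYEventLayer_management("sandy_track.csv", "Lon", "Lat",
                                      "sandy_points",
                                      arcpy.SpatialReference(4326))
    arcpy.CopyFeatures_management("sandy_points", "sandy_points.shp")

    # Connect the points into the storm path, ordered by an assumed time field
    arcpy.PointsToLine_management("sandy_points.shp", "sandy_path.shp",
                                  Sort_Field="DateTime")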


In this map we've now zoomed in to a section of the New Jersey coast particularly hard hit by the storm. The two insets highlight the state, and then the local area seen in the before and after imagery. Generating this map was much more involved than the base map above. The highlight of this map, however, is the defined damage levels across the parcels along the south side of Fort Ave. There wasn't an established point layer for each of these parcels, so I had to heads-up digitize (edit a new feature class of points) a layer to show the associated damage level. Essentially I created a point in each parcel at the centerpoint of the structure seen in the before image, then referenced the after image and assigned a level of destruction. The overall flow from east to west in this case shows houses going from completely destroyed to more minor structural damage. Everything in this block also has some inundation damage, that is, damage from water rising above ground level. Overall this map serves as an example of the type of analysis that could be done block by block in a larger-scale project for this type of disaster response. Thank you for your time.

Wednesday, June 3, 2015

GIS Programming: Fundamentals part 2

Hello viewers,
     This week's post is dedicated to continuing the fundamentals explored last week. The assignment this week focused on completing a partially written script, working on conditional statements and exploring branching and looping. The specific Python constructs explored this week were the import statement and while, for, if, elif, and else. Continuing from last week, the overall objectives for this week were to import new modules, correct script errors, create statements in conjunction with the tools above, and add scripting comments. Let's look at the results from the script, and then explore briefly how it was created.


You can see in the interactive window there are several aspects of this script. The first and largest section is the dice game at the beginning. Essentially this game takes the person's name, looks at the length of their name in letters, then rolls a number between 0 and double the number of letters. If the person rolls a number higher than the number of letters in their name, they win; otherwise they lose. The block of code for the game was already written but required me to evaluate it for errors. A sketch of the logic is below.
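A minimal sketch of that game logic, written for the Python 2 environment used in PythonWin; the original assignment code differed in its details.

    import random

    name = raw_input("Enter your name: ")  # Python 2 input
    letters = len(name)
    roll = random.randint(0, letters * 2)

    print("%s, you rolled %d against %d letters." % (name, roll, letters))
    if roll > letters:
        print("You win!")
    else:
        print("You lose!")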
The next segment is the list of 20 numbers just below the dice game. This was derived from a block of code taking an empty list and populating it with numbers from 0-10 at random. It involved the random module with random integer generation, and a while loop to continue adding to the list until it had 20 numbers, then displaying the final list. The last segment was particularly tricky for me, and I found myself over-complicating my code multiple times. I began with for loops, while loops, and if/else statements all together. Ultimately, after much trial and error, I simplified it down to defining a couple of key variables up front, then adding if and else conditional statements designed to either tell you there is none of a particular number in the list, or remove that number and tell you how many of them were removed before showing the new list. This was a difficult challenge at first, and I felt befuddled during the last section for sure, then ultimately was amazed at how simple the finalized solution really was compared to my initial thoughts. A sketch of that simplified approach is below. Thanks for coming and viewing my continuing Python experience.
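Here is a minimal sketch of those two list exercises; the number being removed is an arbitrary example, not necessarily the assignment's target.

    import random

    # Build a list of 20 random integers between 0 and 10
    numbers = []
    while len(numbers) < 20:
        numbers.append(random.randint(0, 10))
    print(numbers)

    # Remove every occurrence of a chosen number, reporting the count
    unlucky = 5
    count = numbers.count(unlucky)
    if count == 0:
        print("There are no %ds in the list." % unlucky)
    else:
        while unlucky in numbers:
            numbers.remove(unlucky)
        print("Removed %d occurrence(s) of %d:" % (count, unlucky))
        print(numbers)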