Monday, September 28, 2015

Special Topics and Mountain Top Removal Analysis

     Mountaintop Removal (MTR) analysis is this week's theme, continuing the Special Topics preparation done last week. The main objective this week was to create a reclassified raster using a combination of ERDAS Imagine and ArcMap, ultimately depicting user-defined areas of suspected MTR. Recall that the overall study area was divided among groups. I am a part of group three, and you can see the overview of my working area in last week's post. The area my group has been assigned is broken down into two Landsat rasters. I took one raster as my portion of the assigned work while other members worked with the same or the other raster. The key element to all of this week's calculations was visually determining evidence of mountaintop removal through unsupervised image classification and reassigning those elements a new value.


This is a screenshot of ongoing work in ArcMap showing the areas reclassified as MTR. The red depicts areas that have been preliminarily classified as MTR. Note there are two bulk areas classified as such; those in the lower right appear darkest and are the more legitimate examples of MTR. The area to the northwest, on visual inspection (imagery not available here), more closely resembles urban buildup. The process here was to assign all pixels in the original raster to 1 of 50 different color shades, then take the shades that stand out as MTR and change them to red. This change impacts every other pixel of that shade on the image. So imagine the clearing of a field for planting agriculture, or the buildup of an urban area; it's easy to see how many of those surfaces would look similar to land being cleared for mining. This is just one example of how we can end up with other areas being reclassified alongside actual sites of MTR. The analysis and composition of results will continue next week as we take out some of the key confounding factors from the results above. If I remove areas within proximity to developed roadways in the town in the upper portion of the image, likely very few areas of MTR will remain there. That and further work will be done and presented next week. Please stay tuned!
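To make the reclassification step concrete, here is a minimal Python sketch of the idea. The real work was done on the 50-class ERDAS output in ArcMap; the class IDs flagged as MTR below are hypothetical stand-ins for the shades that visually matched bare mine land.

```python
# Sketch: reclassify an unsupervised-classification grid into a binary
# MTR / non-MTR raster. The class IDs in MTR_CLASSES are hypothetical.
MTR_CLASSES = {7, 12, 31}  # shades that visually stood out as cleared land

def reclassify(grid, mtr_classes=MTR_CLASSES):
    """Return a new grid: 1 where the pixel's class is suspected MTR, else 0."""
    return [[1 if cell in mtr_classes else 0 for cell in row] for row in grid]

classified = [
    [7, 12, 4],
    [2, 31, 7],
]
print(reclassify(classified))  # → [[1, 1, 0], [0, 1, 1]]
```

Note that this is exactly why other land with a similar spectral shade, like a cleared field, gets swept into the MTR class: the rule only knows the shade, not the cause.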

Saturday, September 26, 2015

Remote Sensing: An Intro to ERDAS

     ERDAS Imagine is an excellent software suite designed to stand alone or provide interoperability with other spatial analysis software such as ESRI's ArcGIS. This week's look into remote sensing has dealt with two things: a good look at the spectrum of Electromagnetic Radiation (EMR), and an introduction to ERDAS Imagine through some image files. The overall goals were to understand some of the key concepts about EMR, putting more substance behind the images we have looked at the past few weeks, and to look further into how we can manipulate the information in these images.

                      Let's first get a glimpse of the electromagnetic spectrum to start things off.


      This is the spectrum with long waves on the right and short waves on the left, transitioning from one side to the other. Happily we have our nice visible portion there in the center. What you need to understand here is that everything in our world is an emitter, reflector, or absorber of energy falling somewhere on this spectrum. There are also things like the sun, which emits energy continuously across the entirety of the spectrum. All of the EMR sent out by the sun then interacts with our atmosphere or whatever it comes in contact with, which it either reflects off of, is absorbed by, or is refracted from. Remote sensing instruments then capture some of this EMR for us to view. From there we get to manipulate the resulting imagery into usable products like the one below.


     This is a very simplistic map which was built as an example of some basic processing in ERDAS Imagine. The image was first used in ERDAS to calculate the area of each category represented here. There are different classes of vegetation and land cover shown. You might also notice that when the remote sensing platform originally took the image, there was something else we couldn't classify as ground cover... clouds. Clouds occupy 507 acres of the image. From ERDAS Imagine the image was essentially exported for refining in ArcMap. This was a good and simple intro to a program that will likely be used for more and more complex tasks throughout the remainder of the class.
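The per-category acreage figures come straight from pixel counts. A quick sketch of that arithmetic, assuming 30 m pixels (Landsat-style, 900 m² each), which is an assumption on my part rather than a stated property of this image:

```python
# Sketch: convert a per-class pixel count into acres.
# Assumes 30 m pixels (a Landsat-style resolution); one acre = 4046.8564224 m^2.
PIXEL_AREA_M2 = 30 * 30
M2_PER_ACRE = 4046.8564224

def pixels_to_acres(pixel_count, pixel_area_m2=PIXEL_AREA_M2):
    """Area in acres covered by pixel_count cells of the given size."""
    return pixel_count * pixel_area_m2 / M2_PER_ACRE

# Roughly how many 30 m pixels would make up the 507 cloud acres on the map:
print(round(507 * M2_PER_ACRE / PIXEL_AREA_M2))  # ≈ 2280 pixels
```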

Monday, September 21, 2015

Special Applications and Mountaintop Removal Preparation

       Have you ever lost a mountain? How could you lose a mountain if you wanted to? Well, the next few weeks will be spent looking at Special Applications in GIS that can be used to analyze mountaintop removal. Mountaintop removal and valley filling are a common practice, most particularly in coal mining. The Appalachian Mountain chain in the mid-eastern United States is an area particularly afflicted with this form of mining. The mining essentially involves peeling away the surface of the earth, including trees, brush, and soil, to get at the rocky layer beneath and harvest away the precious, precious coal. This is the premise of the project encompassing the next few weeks' posts. This week's objective is to skim the surface of what mountaintop removal is, means, and does, then create a basemap for the study area that I will be exploring for the next few weeks. This project has both an individual and a group component. Deliverables like the map below still have to be done independently, but much of the upcoming analysis will be broken down into manageable chunks to be done in groups, with a final group presentation project a couple posts from now.


      The basemap above gives an overview of the study area that is being broken down by group, and a look at the chunk that belongs to my group. Many things have been done to what was originally a DEM layer to show the Elevation, Streams, and Basins depicted in the group's sliver of the study area. Essentially, a mosaic raster was made out of 4 DEM sections, which were then pared down to the extent that falls within the study area. From there multiple tools were applied to the mosaic to generate the streams and basins. First, holes in the pixel database had to be filled with a fill tool; this ensures that when running a subsequent flow analysis there aren't sinks for the "flowing water" to disappear into. Flow direction is then applied to see how and where water would or should move given the overall contours of the elevation slopes. From there an accumulation calculation is run to determine what actually correlates to a running stream. This calculation funnels into a conditional statement tool identifying areas that should be streams. Last, a feature class is created from that entire process and then displayed appropriately.
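The flow direction step in that chain can be sketched in plain Python. This is the textbook D8 idea (each cell drains toward its steepest downhill neighbor), not the actual ArcMap tool, and the tiny DEM is made up for illustration:

```python
# Sketch of the D8 flow-direction step: each cell drains toward its
# steepest-descent neighbor, using ArcGIS-style direction codes
# (E=1, SE=2, S=4, SW=8, W=16, NW=32, N=64, NE=128).
NEIGHBORS = [(0, 1, 1), (1, 1, 2), (1, 0, 4), (1, -1, 8),
             (0, -1, 16), (-1, -1, 32), (-1, 0, 64), (-1, 1, 128)]

def d8_direction(dem, r, c):
    """Direction code of the steepest downhill neighbor of dem[r][c] (0 if a pit)."""
    best_drop, best_code = 0.0, 0
    for dr, dc, code in NEIGHBORS:
        rr, cc = r + dr, c + dc
        if 0 <= rr < len(dem) and 0 <= cc < len(dem[0]):
            # Diagonal neighbors are farther away, so scale the drop by distance.
            dist = 2 ** 0.5 if dr and dc else 1.0
            drop = (dem[r][c] - dem[rr][cc]) / dist
            if drop > best_drop:
                best_drop, best_code = drop, code
    return best_code

dem = [[9, 8, 7],
       [8, 6, 5],
       [7, 5, 3]]
print(d8_direction(dem, 1, 1))  # center cell drains SE toward the 3 → 2
```

A cell that returns 0 is exactly the kind of pit the fill tool removes first, so the "flowing water" always has somewhere to go.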
      What does this have to do with coal? Well, one of the big problem sets associated with this subject matter is determining how much land we are losing once an area is subjected to MTR. If this is a good before picture, we can then take an after or during picture and calculate the difference. Overall, you should keep checking back to see where we go with the data at hand.
      In the meantime you might find these resources interesting. The first is a Story Map which displays the 6 stages of Mountaintop Removal for your perusal. The next, a Journal Map, is the building blocks in progress toward a final compilation for the project to be finalized in the next couple weeks. Thank you.

Saturday, September 19, 2015

Remote Sensing: Classification Accuracy

Welcome to a continuation of last week's look at Land Use / Land Cover.
This week I am specifically looking at the classification accuracy of last week's Land Use / Land Cover example map of Pascagoula, Mississippi. The overall objectives were to explore the different types of accuracy and how to compute them from the data presented. This builds on last week's map and is based on taking samples at locations of each classification type and determining whether the sample is accurately portrayed by the assigned classification. These samples are typically generated in situ, meaning at the physical location being examined, or ex situ, using some other external means such as higher resolution imagery. All of the points on the map below were checked in an ex situ manner using Google Maps satellite view. There are three different types of accuracy that can be calculated from the sample points. Overall accuracy is the share of all sample points that were correctly classified. User's accuracy is the probability that a point mapped to a class actually is that class on the ground; when it is not, that is an error of commission, where a point is committed to an incorrect class. Producer's accuracy is the probability that a reference site of a given class was actually mapped to that class; its complement is the error of omission. Let's look at the example map.


      There are 35 points on the map with a total accuracy of 66%. 12 of the points have been misclassified; these points are in red, while correctly classified points are in green. A stratified random approach was taken to selecting sample point locations. Each category has a minimum of one point, with categories that cover much larger areas having more sample points allocated, though none more than 5 points total, for proportionality. You can start to see themes in which points are inappropriately classified. For example, the light purple class 61 was determined to be entirely false throughout. Upon closer imagery inspection this region should all be classified as 62, essentially going from forested wetland to emergent wetland.
       Switching over to user's accuracy, we have a couple of examples where categories 11 and 12 each had a 4/5 (80%) user's accuracy rating. However, 12 also had 3 reference points that should have been classified to it but were not. Counting the 4 correctly mapped sites plus those 3 omissions gives 7 true class-12 sites, for a producer's accuracy of 4/7, or about 57%. This lab was a good look at the kinds of errors that can be made when classifying areas for land use and land cover, and how to start avoiding them. Thank you.
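The three accuracy measures fall straight out of an error (confusion) matrix. Here is a small sketch with illustrative counts, not the actual matrix from this map:

```python
# Sketch of the three accuracy measures from an error matrix.
# Rows = mapped class, columns = reference (ground truth) class.
def overall_accuracy(matrix):
    """Total correctly classified points over all points."""
    correct = sum(matrix[c].get(c, 0) for c in matrix)
    total = sum(sum(row.values()) for row in matrix.values())
    return correct / total

def users_accuracy(matrix, c):
    """P(a point mapped as class c really is c); 1 minus the commission error."""
    return matrix[c].get(c, 0) / sum(matrix[c].values())

def producers_accuracy(matrix, c):
    """P(a reference site of class c was mapped as c); 1 minus the omission error."""
    reference_total = sum(row.get(c, 0) for row in matrix.values())
    return matrix[c].get(c, 0) / reference_total

# Illustrative counts echoing the class-12 situation described above:
matrix = {
    "12": {"12": 4, "62": 1},   # 5 points mapped as 12, 4 truly 12
    "62": {"62": 6, "12": 3},   # 3 true class-12 sites were mapped as 62
}
print(users_accuracy(matrix, "12"))      # → 0.8
print(producers_accuracy(matrix, "12"))  # 4/7 ≈ 0.571
print(overall_accuracy(matrix))          # 10/14 ≈ 0.714
```

The split matters: the map can look good from the user's side (few commissions) while still missing many true sites from the producer's side (omissions), as class 12 shows.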

Tuesday, September 15, 2015

Special Topics Network Analyst Presentation

     Imagine there is a tropical storm or hurricane bearing down on your quiet cul-de-sac near the beach. You live near downtown, which is on the water, and unfortunately you haven't had a chance to leave town before the storm arrives. What do you need to know? Where can you go? You were waiting to make sure a loved one got out of a nearby hospital safely before the storm. What's in place for them during this time? What happens once the storm is gone? Answering these questions is the culmination of the past few weeks' assignment involving a hurricane or other disastrous storm bringing flooding to the Tampa Bay, Florida area. Examining and putting into practice the analyses that a local-area or government GIS practitioner, in conjunction with other services, can use to prepare for this type of disaster was the overarching goal for the products below. I will look at two of the maps that answered some of those questions, which have built on the posts from the past few weeks.
      All of the questions posed above were answered with one base utility. That utility, or suite of analyses, is the Network Analyst within ArcMap. Network Analyst takes a network dataset built of paths and junctions, such as those found in a road network, and uses specific attributes for impedance, e.g. distance or time, to calculate a preferred route, a closest service center, an overall service area, etc., from one or more points to another. With it, over these past weeks, I was able to build cartographic answers to the above questions. Let's examine a couple of the answers.
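The idea underneath every route solve here can be sketched with a classic shortest-path search. This is not the actual Network Analyst implementation, just the textbook algorithm (Dijkstra's) with seconds as the impedance, run over a made-up micro-network:

```python
import heapq

# Sketch of impedance-based routing: find the quickest path through a
# network whose edge weights are travel times in seconds. Graph is hypothetical.
def quickest_route(graph, start, goal):
    """Dijkstra's algorithm; returns (total_seconds, path)."""
    queue, seen = [(0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, secs in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(queue, (cost + secs, nxt, path + [nxt]))
    return float("inf"), []

roads = {  # junction → {neighbor: seconds}
    "home": {"I-275": 60, "Nebraska Ave": 45},
    "I-275": {"shelter": 120},
    "Nebraska Ave": {"shelter": 150},
}
print(quickest_route(roads, "home", "shelter"))
# → (180, ['home', 'I-275', 'shelter'])
```

Notice the solver prefers the I-275 leg even though Nebraska Ave is reached sooner: only total time to the goal matters, which is exactly the behavior you want mid-evacuation.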


      This is my cartographic favorite of all the maps I built. I enjoy this one because it is incredibly simplistic but provides a lot of information, with some enhancements from Corel Draw X7. This map answers the question of what to do if you have an entire area that needs evacuation and not everyone could or should take the same route. The red route has multiple starting points at the bottom that eventually converge on one shelter in the upper right. Less is more with this map, as only your key features are labeled: key roads, routes, and focus items depicted with arrows. I have three focus areas, two larger arrows and one smaller. Looking at these first, it's clearly important to look at I-275 and Nebraska Ave. Still important, but slightly less so, is Kennedy Ave, though that isn't obvious from the picture alone. To back up the visuals, there are some goals and routing information. Big key concepts are presented, as well as a breakdown of routing from different areas of downtown Tampa. I can see from the 4th bullet down that if I'm south of Kennedy Ave I should make my way one way over another. At this point it's clear-cut which way I should go, without bogging me down in peripheral information I might not need in the moment of torrential downpour. My theme here: less is more. Let's look at another.



     This is my next favored map of those built for this assignment. It is another clean map, but built with an entirely different form and function than the previous one. This is a look at one of the answers to what happens after the storm. It is likely that people staying in a shelter for multiple days, while access to their homes is regained, will need to be resupplied. Those doing the resupplying will need a depiction of where to go to do just that. This is a route depiction with directions for getting from a start point to an end point, without the colored enhancements seen previously. I still maintain all of the essential cartographic elements but have made it strictly functional. It is still pleasant for being all grayscale.
     Other questions were answered through the use of Network Analyst's service area analysis to show which shelter should be your primary evacuation location based on the area of town you live in. All work with Network Analyst in these cases used time as the impedance to generate the quickest possible routes, which are usually the most direct. Being the most direct isn't always the shortest possible distance, but when seconds count you'll hope the analysis was done similarly! The Network Analyst utilities can have massive impacts at the local, city, state level and beyond. These were some particularly good examples of its use, especially as hurricane season here on the Gulf Coast comes to a close. Thank you.

Friday, September 11, 2015

Remote Sensing: Land Use and Land Cover

      Hello and welcome to this week in Remote Sensing. This week introduces Land Use and Land Cover classification. These topics build on last week's introduction to classification of elements in an image. Last week we looked at elements such as size, shape, pattern, texture, and association. This week we take these elements a step further in how they apply to the classification, breakdown, and identification of different land use and land cover through visual interpretation of an image. The image itself was provided by UWF, with the overall objective of exploring and applying Land Use and Land Cover concepts in conjunction with the different elements of identification. Let's briefly define Land Use vs. Land Cover before exploring the map below.
     Land Cover is the actual biophysical description of the Earth's surface. Examples include whether the ground is forested, desert, water, built-up urban area, etc. Land Use specifically documents how we humans have interacted with the landscape and changed or developed it. The two of these elements combined become a Land Use / Land Cover layer when an image is classified into appropriate categories. An example of this is seen in the map below.
     One other thing to note before we get to the map: Land Use / Land Cover classification is divided into 4 levels, and the key factor in deciding how effectively you can categorize elements of an image into these different levels is the spatial resolution of the image. Poor resolution only lets you identify elements to a certain degree, whereas high resolution can give you much more certainty of an element's purpose, especially when taking into account the size, shape, pattern, texture, and association differences you'll see at higher or lower resolution.


     Examples of level two land use and land cover are seen in the map above. An image of Pascagoula, Mississippi has been heads-up digitized with a polygon feature class showing similar compositions of land features and development. The code descriptions are all two digits to signify second-level classification. Many items could easily be broken down into 3rd-level classification; however, that was not necessary for the intent of the assignment. What you can start to see is that there are different patterned areas around different features. Note there is a much higher likelihood of having commercial or industrial areas along the main road feature, and forested areas interspersed with the residential areas. Looking at the different items at this resolution and scale, you can start to recognize like features by size, shape, pattern, texture, and association, and see why they logically are grouped together and represented as they are by the polygons. Overall this was a good intro to Land Use and Land Cover that we get to expound upon in the coming week, so stay tuned! Thank you.
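The two-digit codes nest neatly: the first digit is the level-1 class. A small sketch of that rollup, using the standard USGS (Anderson) level-1 names, which I am assuming is the scheme behind the map's codes:

```python
# Sketch: USGS (Anderson) land use / land cover codes nest by digit, so a
# two-digit level-2 code rolls up to its one-digit level-1 parent.
LEVEL_1 = {
    1: "Urban or Built-up Land",
    2: "Agricultural Land",
    3: "Rangeland",
    4: "Forest Land",
    5: "Water",
    6: "Wetland",
    7: "Barren Land",
}

def level1_of(level2_code):
    """Roll a level-2 code (e.g. 61, Forested Wetland) up to its level-1 class."""
    return LEVEL_1[level2_code // 10]

print(level1_of(61))  # → Wetland
print(level1_of(12))  # → Urban or Built-up Land
```

This nesting is also why the misclassification mentioned in the accuracy post (61 vs 62) is less severe than it sounds: both codes still agree at level 1.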

Monday, September 7, 2015

Special Topics: Network Analysis

     Hello and welcome to week 2 of the first project for Special Topics in GIS, focusing on the Network Analyst utilities in ArcGIS. The last post was all about the preparation for the project, including building the base map and organizing the necessary background information for this week's analysis. All of the original data was again provided by UWF, and the overall goal was to explore, utilize, and gain understanding of the Network Analyst utilities in ArcMap. As you may recall from the last post, our study area is Tampa Bay, focusing on an impending hurricane or major storm which could result in severe flooding. This week's analysis works through 3 different individual Network Analyst applications and displays them concurrently on the map below. The key applications the Network Analyst is being used for are evacuation route planning, to include transferring patients from one hospital to another and exiting a heavily flooded area; supply distribution; and identifying the closest shelter servicing a particular area. Let's look at my final map and then discuss the different aspects of the Network Analyst utility that were used in the analysis.


     First, notice that the same layout has been employed as the base map for continuity; however, there is a much different theme with the items that stand out: the routes. There were 3 different Network Analyst tools used to create the routes presented. In the prepare phase, everything was made ready for the creation of a Network Dataset. This was the first step toward enabling the Network Analyst and the creation of these routes.
     The Network Dataset was created from the transportation layer you see underlying the map. This required a couple of base additions to the transportation layer, first ensuring that each segment's distance could be reflected as a measure of time, in this case seconds. This is used as the impedance for the Network Analyst computations later. Also, identifying areas as flooded or not flooded was particularly important for later routing. This was done by identifying areas at 5 feet of elevation and under, and designating each record in the transportation attribute table as flooded or not flooded.
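The distance-to-seconds conversion is simple arithmetic. A sketch of how such a time field could be populated; the units here (meters of length, mph speed limits) are my assumptions for illustration, not the actual field definitions:

```python
# Sketch of how a segment's distance becomes a time impedance in seconds.
# Units (meters, mph) are assumptions for illustration.
def travel_seconds(length_m, speed_mph):
    """Seconds to traverse a road segment of length_m at speed_mph."""
    meters_per_second = speed_mph * 1609.344 / 3600
    return length_m / meters_per_second

# A 500 m segment at a 30 mph limit:
print(round(travel_seconds(500, 30), 1))  # ≈ 37.3 seconds
```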
    With the Network Dataset established, I was ready to begin the analysis. As previously mentioned, three types of analysis were conducted with the Network Analyst: the creation of a new route (multiple of them), a new service area, and a new closest facility. All of these are tools which were used to complete the objective above. Let's look at each individually.
     A new route is simple enough; it creates a route from a starting point to one or more stopping points based on either shortest overall distance or shortest time, whichever is set as the impedance. All routes here use time as the impedance. This is because the shortest-duration route is usually the most expeditious from one point to another, rather than the shortest distance, in such a densely compacted urban area. A simple start-to-end-point route was used for both the hospital evacuations and the supply lines from the National Guard center to the three shelters.
     A slightly more complex routing was used for the specific downtown incident evacuation seen with red points and the red highlighted route. This took multiple points in the middle of a highly likely flood zone downtown and required individual routing from multiple start points to a known end point, Middleton High School. One difference about this calculation is that it revolved around a scaled cost attribute rather than a particular restriction. A restriction, when talking about Network Analyst, applies a specific rule to a particular road segment based on user input. For example, we could prohibit travel on a flooded segment entirely. However, with a scaled cost attribute, we can tell the analyst that we want flooded segments to triple the impedance cost rather than say "no, we don't want to use it at all." This still gives the segment a chance to be used if necessary, but it is more heavily weighted against when other, better alternatives are present. You can see the result of the scaled cost in the form of multiple branches of start points finally combining to the end point.
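The restriction-versus-scaled-cost distinction boils down to how a flooded segment's weight enters the solve. A minimal sketch of the two behaviors, with the tripling factor taken from the analysis above and everything else hypothetical:

```python
# Sketch: restriction vs scaled cost on a flooded segment. A restriction
# removes the edge from consideration outright, while a scaled cost
# multiplies its time impedance (tripled here, as in the analysis above).
FLOOD_SCALE = 3.0

def effective_cost(base_seconds, flooded, use_restriction=False):
    """Impedance the solver actually sees for one road segment."""
    if flooded and use_restriction:
        return float("inf")               # segment prohibited entirely
    if flooded:
        return base_seconds * FLOOD_SCALE  # discouraged but still usable
    return base_seconds

print(effective_cost(60, flooded=True))                       # → 180.0
print(effective_cost(60, flooded=True, use_restriction=True))  # → inf
```

The scaled-cost version is what lets a flooded branch still appear in the final route when every dry alternative is even slower.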
     Finally, I defined a new service area, depicted as the red, green, and yellow underlying transparency. The service area defines the shelter that the shaded area around it should report to in the event of an evacuation. This is done once again through impedance, but then telling the resulting display that I don't want any of the coverage areas to overlap. This gives a clear picture of the shelter you should go to. As stated before, this is based on time, not closest distance, and specifies which shelter should be reachable quickest given the impedance and likelihood of flooding.
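The non-overlapping service areas can be sketched as a multi-source version of the same time-impedance search: expand outward from every shelter at once, and each junction belongs to whichever shelter reaches it first. This is only an illustration of the idea, not the actual tool, and the graph and names are hypothetical:

```python
import heapq

# Sketch of non-overlapping service areas: a multi-source shortest-path
# search where each junction is claimed by the quickest-arriving shelter.
def service_areas(graph, shelters):
    """Map each reachable junction to the shelter that reaches it fastest."""
    queue = [(0, s, s) for s in shelters]   # (seconds, junction, shelter)
    heapq.heapify(queue)
    owner = {}
    while queue:
        cost, node, shelter = heapq.heappop(queue)
        if node in owner:
            continue                        # already claimed by a quicker shelter
        owner[node] = shelter
        for nxt, secs in graph.get(node, {}).items():
            if nxt not in owner:
                heapq.heappush(queue, (cost + secs, nxt, shelter))
    return owner

roads = {  # junction → {neighbor: seconds}; hypothetical micro-network
    "A": {"B": 30}, "B": {"A": 30, "C": 90},
    "C": {"B": 90, "D": 20}, "D": {"C": 20},
}
print(service_areas(roads, ["A", "D"]))
# → {'A': 'A', 'D': 'D', 'C': 'D', 'B': 'A'}
```

Because each junction is claimed exactly once, the resulting areas can never overlap, which is the same guarantee the "no overlap" display setting provides.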
    Overall, Network Analyst can be used with much more in-depth settings, particularly with inputting barriers to travel and other restriction types. Each particular task seems relatively easy, but when put all together you have an excellent example of some monumental processing being done. Just think about the big picture of what this simple map represents: supplies moving to hurricane shelters, evacuation of one hospital to another due to disaster, where you should go in the event the worst happens in your neighborhood. Presenting refined products for all of these is what's in store in the next portion of this project. Thank you.

Friday, September 4, 2015

Remote Sensing and Visual Interpretation

     Welcome to my first week posting about Remote Sensing. Remote Sensing is defined in several different ways, but in the context of this course it's really about observing some feature from a distance, usually from an airborne or orbital sensor platform. These sensors are able to obtain photos or other imagery that can range across multiple parts of the electromagnetic spectrum, from the visual, like the human eye, to infrared and beyond.
     This first week of the course provides an introduction to visual interpretation elements. As such, it involved looking at a couple of different images and then selecting specific elements that correspond to particular reference criteria. All images were provided by UWF, with the overall objective being to identify particular criteria in visually interpreting the images. Let's look at the first image map.


     This map is centered around the image, which is examined by two particular criteria: texture and tone. These are displayed as different polygonal areas, each representative of a type of texture or tone. The tones are displayed in the blue-scaled shapes and range from very light to very dark. The textures are shown in green-hued shapes and range from very fine or smooth to very coarse. Basic definitions for these features are provided on the map. Two examples of each tone and texture have been provided in different spots across the image to show you how these criteria really intermingle, merge, and interact across the entirety of the image. Let's look at our next image.



     The above looks at a different aspect of the image, particularly that of identifying features within it. This is done by analyzing 4 specific criteria: Shape and Size, Shadow, Pattern, and Association. A brief description of each is located on the map. Each feature is also color coded by its criteria type and labeled according to what the feature is. It's important to note that in most cases features can be identified, or are aided in identification, by multiple criteria. It's the combination of many different clues that allows us to pull from our experience and identify an item. With the above references, however, features have been labeled by the criterion most specifically used in their identification. Things like the race track are really a combination of both pattern (the circular track) and association (no roads leading into it, a small building next to it, the water tower outside of it so it's not a boundary, etc.). The neighborhood is both a pattern and an association of multiple houses with a U-shaped road pattern. The criteria really do go hand in hand.
     Overall this was an interesting look at some of the base elements of photo interpretation, encompassing basic features and methods for identifying elements in an image. It will be interesting to delve deeper into the aspects of remote sensing and its applications.