16 Sep 2014

Top 3 Applications for Data Fusion

Author: Rebecca Lasica

I've been doing quite a bit of work lately in the realm of data fusion, and with every unique project I learn a new and unique way to interpret what “data fusion” can mean. Here are the top 3 recent applications for data fusion that I have had the opportunity to explore.

Imagery + LiDAR

Fusion of imagery with LiDAR is probably the most popular application for data fusion. Combining a 3-D point cloud with 2-D imagery provides a long list of analysis options that are not possible with either modality alone. One example of this richness, illustrated in a recent webinar, is using image information to perform DEM hydro-flattening on a LiDAR-derived surface. While LiDAR is one of the best data sources for deriving topographic information, returns over shallow water, algae, or even animals can create surface features that suggest the water is not flat. This is a problem specifically for hydrology modeling and drainage planning. Water reflects very little energy in the NIR (near infrared) wavelengths, so accurate shoreline delineation from imagery is straightforward, and actually quite easy. Constrain the DEM, or even the point cloud, with the shoreline vectors, set the values inside each water body to its elevation, and voila, you have just performed some powerful data fusion.
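To make that last step concrete, here is a minimal IDL/ENVI sketch of the flattening idea. It assumes a co-registered multispectral image and DEM; the file names, NIR band index, and reflectance threshold are placeholders, and a production workflow would use properly delineated shoreline vectors rather than a simple threshold.

    ; Hydro-flattening sketch: mask water with a low-NIR threshold, then
    ; set the DEM to a single water-surface elevation inside the mask.
    e = ENVI(/HEADLESS)
    dem = e.OpenRaster('lidar_dem.dat')             ; LiDAR-derived DEM (placeholder path)
    img = e.OpenRaster('ortho_image.dat')           ; co-registered multispectral image
    nir = img.GetData(BANDS=[3])                    ; NIR band (index is an assumption)
    elev = dem.GetData()
    idx = WHERE(nir LT 0.05, nWater)                ; low NIR -> water (assumes reflectance data)
    IF nWater GT 0 THEN elev[idx] = MIN(elev[idx])  ; crude stand-in for the water-surface elevation
    flat = ENVIRaster(elev, INHERITS_FROM=dem, URI='dem_hydroflattened.dat')
    flat.Save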

Figure 1: Left: LiDAR-derived DEM (LiDAR from OpenTopo) with color table applied showing variable elevation inside water bodies. Right: LiDAR-derived DEM with color table applied after performing hydro-flattening showing constant elevation inside water bodies.

LiDAR + Maps

Another popular data fusion exercise is the automatic extraction of building footprints from LiDAR. Many workflows require manual building extraction, delineating each footprint by hand, one by one. This can be an incredibly labor-intensive process for any GIS department, but it is necessary for several applications. Some municipalities use the derived information to correlate new structures with building permits. Organizations in energy markets perform population-density analyses to plan infrastructure development and monitor existing easements. A powerful data fusion solution, therefore, is to perform automated building extraction and overlay the footprints on an existing map to evaluate current “as-built” information. Once the vectors are derived, location, size, and proximity analyses can easily be performed to identify any necessary updates or to structure plans to meet mandates.
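As a rough illustration of where such footprints can come from (a simplified stand-in, not the ENVI LiDAR extraction workflow), a LiDAR-derived height-above-ground raster can be thresholded to flag building candidates. The file names and height threshold below are assumptions.

    ; Building-candidate sketch from LiDAR-derived surface and terrain models.
    e = ENVI(/HEADLESS)
    dsm = e.OpenRaster('lidar_dsm.dat')             ; first-return surface model (placeholder)
    dtm = e.OpenRaster('lidar_dtm.dat')             ; bare-earth terrain model (placeholder)
    height = dsm.GetData() - dtm.GetData()          ; height above ground (normalized DSM)
    buildings = height GT 2.5                       ; taller than ~2.5 m -> candidate structure
    ; vegetation would still need to be removed (e.g., with an NDVI or roughness
    ; test) before converting the mask to footprint polygons for the map overlay.
    mask = ENVIRaster(BYTE(buildings), INHERITS_FROM=dsm, URI='building_mask.dat')
    mask.Save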

Figure 2: LiDAR-derived building footprints (LiDAR from OpenTopo) overlaid onto an Esri OpenStreetMap base map imported into ENVI.

Image Time 1 + Image Time 2

Imagery plus imagery can absolutely be a data fusion problem to solve, and one that is well worth the extra effort. Temporal analysis is at the forefront of almost every industry that uses remote sensing, and it is growing due to higher satellite revisit rates, decreasing costs of airborne data acquisition, and the introduction of UAS data to the commercial market. Common challenges in comparing time-enabled data sets, especially for spectral comparison, include the need for pixel-to-pixel co-registration and, sometimes, for re-sampling and re-projecting disparate layers. Additionally, images taken at different times of day or on different days can vary enough in data range, due to differences in sun illumination intensity and collection angles, that radiometric calibration and atmospheric correction must be performed before a valid comparison is possible. Easy-to-use tools for time-enabled data are the topic of an upcoming webinar. Let me know if you're interested and I will make sure you're on the invite list.
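For readers who like to see the pieces, here is a hedged sketch of that pre-processing chain using the ENVITask API. The task and parameter names follow the ENVI 5.x documentation as I recall them and may differ by version (QUAC also requires the atmospheric correction module); the file name is a placeholder.

    ; Calibrate and atmospherically correct one scene before comparison.
    e = ENVI(/HEADLESS)
    time1 = e.OpenRaster('scene_time1.dat')
    cal = ENVITask('RadiometricCalibration')        ; DN -> radiance/reflectance
    cal.INPUT_RASTER = time1
    cal.Execute
    atm = ENVITask('QUAC')                          ; quick atmospheric correction
    atm.INPUT_RASTER = cal.OUTPUT_RASTER
    atm.Execute
    corrected1 = atm.OUTPUT_RASTER
    ; repeat for the Time 2 scene, then co-register/resample the two results
    ; before attempting any band math or change detection.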

Figure 3: Three images from an analysis of MODIS 8-day surface reflectance data used to evaluate changes in drought indices over time.


Categories: Imagery Speaks


11 Sep 2014

Remembering the Colorado Floods: One Year Later

Author: Joe Peters



This week marks one year since a major flood event wreaked havoc on communities up and down the Front Range of Colorado. Our home county of Boulder, Colorado, was one of several counties in the region left devastated after a week of heavy storm activity brought more than 17 inches of rainfall. Localized flash flooding caused major damage to roadways, bridges, and buildings, and caused many rivers throughout the county to create new channels.

Many events have been taking place throughout the week to commemorate the flood event and to evaluate how far the communities that were most affected have come in rebuilding. Many of the communities that were hit the hardest were located in the western portion of our county, where steep topography, narrow canyons and a landscape that had been stressed by years of successive forest fire activity left the area ripe for this type of disaster. This morning on my way to work there was a great story on the local radio station about just how far these communities have come. Most of the infrastructure that was damaged has been rebuilt. In the small town of Jamestown, which was left a virtual ghost town after this event, about 90% of the residents have returned.

What has been most interesting about this event, from my humble perspective, is that even though we can rebuild, the landscape of our region will be forever changed. One example is right outside our office, where water coming out of the mountains washed across the floodplain, depositing several feet of sediment on the landscape. To give you an idea of what this was like, the image below was taken during the flood on September 10, 2013. The image shows the bike path right outside of our office with an open space area just beyond the path (on the far side of the fence posts). When the flood waters receded, we were left with so much sediment and debris that heavy equipment was needed to clear the bike path.

One year later, the sediment has been cleared from the bike path, but the open space area is now several feet higher than it had previously been. The open space area that was once lush with vegetation now looks more like a beach, though vegetation has slowly been finding its way back to the area throughout the summer. The image below was captured on September 11, 2014.

Scenes like this are very common throughout the county. Many of the local hiking and biking trails have been rerouted to accommodate the changes to our landscape. In some areas, small drainages have grown to resemble small canyons. All of this change has had me thinking a lot about how we can use remote sensing techniques to quantify the effects of this flood event. In the video at the top of this blog post I talk about a tiered assessment strategy for managing impact analysis, a framework that helps GIS analysts home in on the areas most affected by an event. Remotely sensed data is often the first source of accurate and verifiable geographic information to become available after an event has occurred. Check out the image below for a breakdown of how a tiered assessment strategy would be implemented to make use of the valuable information provided by remotely sensed data.

At tier 1, regional assessment using moderate resolution imagery can be used to provide a "quick look" at the impact a disaster has had on a region. Check out the map below to see how Landsat 8 OLI imagery was used to extract surface water from pre-flood and post-flood images to get an understanding of how the flood impacted our region.
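One simple way to extract surface water from Landsat 8 OLI imagery (not necessarily the exact method behind this map) is a normalized difference water index thresholded at zero. The band indices and file name below are assumptions.

    ; NDWI-style water mask from a Landsat 8 OLI scene.
    e = ENVI(/HEADLESS)
    scene = e.OpenRaster('LC8_post_flood.dat')
    green = FLOAT(scene.GetData(BANDS=[2]))         ; OLI band 3 (green), zero-based index assumed
    nir   = FLOAT(scene.GetData(BANDS=[4]))         ; OLI band 5 (NIR)
    ndwi  = (green - nir) / ((green + nir) > 1)     ; the > operator guards against divide-by-zero
    water = ndwi GT 0.0                             ; positive NDWI -> water
    ; differencing pre- and post-flood water masks highlights inundated areas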

At tier 2, neighborhood assessment using high-resolution imagery obtained from platforms such as WorldView-2 from DigitalGlobe can give us the ability to identify sediment deposits, debris, landslides, damaged roadways, and damage to structures. Check out the image below to see how thematic change detection techniques applied to before and after WorldView-2 imagery were used to identify landslides, mud and sediment that was washed into a neighborhood in Boulder, Colorado.
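Conceptually, thematic change detection boils down to comparing per-pixel class labels from the before and after classifications. The toy sketch below shows that comparison; the file names are placeholders, and ENVI's Thematic Change workflow does considerably more than this.

    ; Compare two classification rasters and tabulate the from-to transitions.
    e = ENVI(/HEADLESS)
    before = e.OpenRaster('class_before.dat')
    after  = e.OpenRaster('class_after.dat')
    c1 = before.GetData()
    c2 = after.GetData()
    changed = c1 NE c2                              ; 1 where the class label changed
    idx = WHERE(changed, nChanged)
    IF nChanged GT 0 THEN fromTo = HIST_2D(c1[idx], c2[idx])   ; which transitions dominate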

This all leads me to tier 3 assessment, which one year later I feel is still under way. Tier 3 assessment, which relies on very high resolution remote sensing platforms, is something that I see expanding exponentially in the coming years. With sources such as LiDAR and synthetic aperture radar (SAR), we can truly begin to look at damage to individual structures and at minute changes in elevation. With high resolution elevation data we can measure the changes I mentioned observing earlier, such as the elevation changes brought about by erosion of stream channels and deposition of sediment along the floodplain. We are also beginning to see the emergence of unmanned aerial vehicles (UAVs) for gathering geographic information. All of these unique sources of information can provide a very detailed picture of the changes to our environment brought on by a natural disaster such as the flood event that rocked our community here in Colorado one year ago. While we cannot prevent these types of events from occurring, using the latest technology to help respond to them and learn from what we see is an important aspect of why many of us are in this field to begin with.
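Measuring those elevation changes is, at its core, a DEM difference. Here is a minimal sketch, assuming two co-registered LiDAR-derived DEMs on the same vertical datum (file names are placeholders).

    ; Elevation change: positive values indicate deposition, negative erosion.
    e = ENVI(/HEADLESS)
    demPre  = e.OpenRaster('dem_2012.dat')
    demPost = e.OpenRaster('dem_2014.dat')
    dz = demPost.GetData() - demPre.GetData()
    PRINT, 'Mean elevation change (m): ', MEAN(dz)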

If you have any questions about the techniques used to extract the information from remotely sensed data that I mentioned in this blog post, please feel free to email me at joe.peters@exelisinc.com.


Categories: Imagery Speaks


9 Sep 2014

Fracking the Right Way

Author: Patrick Collins

The hydraulic fracturing of US shale formations to extract natural gas has yielded both great profits and great controversy over the past 20 years. Proponents of the technology claim it stimulates local economies and is a low-impact way of reducing our country's reliance on foreign oil. Opponents say it is a hazardous practice that can contaminate air and water resources and increase seismic activity due to destabilized bedrock. The federal government provides some of the regulation of hydraulic fracturing; however, the majority of policy regulating this industry is decided at the state level. This means that regulations differ from state to state, and it is up to the drilling party to understand the regulations it is operating under.

Site setback regulations for the industry generally center on water resources and populated areas. A whitepaper put out by the West Virginia University College of Law nicely summarizes the regulatory approaches of New York, Ohio, Pennsylvania, and West Virginia. According to this study, setbacks for natural gas drilling operations state that a new vertical well bore shall not be within 500 feet of buildings or water wells, within 1,000 feet of water resources used by purveyors, or within 300 feet of streams, springs, wetlands, and other water bodies. Remember those numbers, as they'll come into play in a minute.

Aside from the regulatory restrictions placed on drilling operations, businesses may also have best practices that they apply when choosing a site, such as the original slope or area of the proposed parcel. There are numerous geospatial data modalities that can assist in selecting a suitable drill site based on a combination of the regulatory requirements and business best practices. Satellite or aerial imagery, LiDAR, and vector files can all be fused to give operators a complete picture of a potential site before sending valuable resources out into the field.

For this case study I found a great LiDAR dataset on OpenTopography located over the town of Petersburg, Pennsylvania. Petersburg sits on the Marcellus formation in central PA, the largest source of natural gas in the United States. From this LiDAR I was able to extract a high-resolution digital elevation model (DEM), as well as vectors for all of the buildings within the scene. I also downloaded vector files from the US Census Bureau that include the water bodies in the area, which let me verify that I was meeting state setback regulations.

The first step in my process was to identify areas within the scene where slope was less than or equal to five degrees. This was based on the assumption that a business best practice would be to develop a well site on an area of generally low slope, rather than having to level a significant amount of earth to create a flat surface large enough for an operation. The next step was to buffer the waterways and buildings to adhere to the Pennsylvania state regulations. Remember those numbers I asked you to keep in mind? No drilling within 500 feet of buildings or water wells, within 1,000 feet of water resources used by purveyors, or within 300 feet of streams, springs, wetlands, and other water bodies. Once these buffers were applied to the appropriate vector layers, I was presented with a map showing, in green, areas of slope of 5 degrees or less that do not violate any of the state setback restrictions, which are shown in purple.
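For anyone curious how this might look in code, below is a simplified, raster-only sketch of the slope and setback screening (not the exact ENVI tools used for the map above). The file names, pixel size, and pre-rasterized building and water masks are assumptions, and the 1,000-foot purveyor setback would be handled the same way.

    ; Slope from finite differences, setbacks as raster distance buffers.
    e = ENVI(/HEADLESS)
    dem = e.OpenRaster('petersburg_dem.dat')
    elev = FLOAT(dem.GetData())
    pixelSize = 1.0                                 ; metres per pixel (assumed)
    dzdx = (SHIFT(elev, -1, 0) - SHIFT(elev, 1, 0)) / (2 * pixelSize)
    dzdy = (SHIFT(elev, 0, -1) - SHIFT(elev, 0, 1)) / (2 * pixelSize)
    slope = ATAN(SQRT(dzdx^2 + dzdy^2)) * !RADEG    ; slope in degrees (edges wrap; ignore borders)
    lowSlope = slope LE 5.0
    bld = e.OpenRaster('building_mask.dat')         ; rasterized building footprints (assumed prepared)
    wtr = e.OpenRaster('water_mask.dat')            ; rasterized water bodies (assumed prepared)
    ; distance from every pixel to the nearest feature, in metres
    bldDist = MORPH_DISTANCE(bld.GetData() EQ 0, NEIGHBOR_SAMPLING=3) * pixelSize
    wtrDist = MORPH_DISTANCE(wtr.GetData() EQ 0, NEIGHBOR_SAMPLING=3) * pixelSize
    suitable = lowSlope AND (bldDist GT 152.4) AND (wtrDist GT 91.44)   ; 500 ft and 300 ft in metres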

As a final consideration, I looked into the average size of a drill site to ensure I had enough room to place an operation there. According to the Shale Gas Information Platform, the average size of a multi-well pad for the drilling and fracturing phase of operations is 3.5 acres. I used a clump and sieve function in ENVI to remove areas of slope that were not at least 3.5 acres in size. From the final image below, you can see the waterway and building buffers, as well as suitable drilling areas shown in yellow.
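The sieve step can be approximated with connected-component labeling: drop any contiguous low-slope region smaller than 3.5 acres. The sketch below is a stand-in for ENVI's clump/sieve tools and continues from the previous one (it assumes the suitable mask and the 1 m pixel size defined there).

    ; Remove connected regions below the 3.5-acre minimum pad size.
    minPixels = 3.5 * 4046.86 / (pixelSize^2)       ; 3.5 acres in pixels (1 acre = 4046.86 m^2)
    labels = LABEL_REGION(suitable, /ALL_NEIGHBORS)
    counts = HISTOGRAM(labels, MIN=1, REVERSE_INDICES=ri)
    FOR i = 0, N_ELEMENTS(counts) - 1 DO BEGIN
      IF counts[i] GT 0 && counts[i] LT minPixels THEN $
        suitable[ri[ri[i]:ri[i+1]-1]] = 0B          ; too small to hold a 3.5-acre pad
    ENDFOR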

This simple case study highlights how geospatial data fusion can help oil and gas drilling operations determine drill site suitability. Regardless of your personal stance on hydraulic fracturing, safe operations that adhere to federal and state regulations are in everyone's best interest, and geospatial data analysis provides valuable information that can decrease the potential risks of these activities. What do you think? Have you used geospatial data to conduct site suitability assessments for oil and gas or other industries? What insights or potential warnings do you have for conducting these types of analyses?


4 Sep 2014

3 Ways ENVI & IDL Contribute to Successful Agile Product Development

Author: Mark Bowersox

Our engineering team has been using an Agile Software Development philosophy for a number of years now. So, we are excited to see our customers demand Agile methods to get better results faster. Building next-generation architectures for GEOINT analytics requires greater productivity and responsiveness than ever. One recent example is the eXploit Program at NGA.

As system integrators respond to programs like eXploit, they'll need to demonstrate how they will use an Agile process to rapidly deliver web-based software applications that satisfy a variety of complex GEOINT exploitation problems. They will also need to show how their development methodology is flexible enough to rapidly adapt to changing missions so that new GEOINT solutions can be inserted as fast as warfighters need them.


When it comes to rapid capability development and quick reaction to changing needs, ENVI and IDL represent real value. There are three ways I see them contributing to successful product development with Agile.


First, ENVI contains a robust library of image processing routines that form the building blocks for more complex algorithms and workflows. Each of these routines would take days or weeks to develop with traditional languages like C or Java. With ENVI, you're not starting from scratch to build foundational components like file format readers or tools for basic raster management such as subsetting, reprojection, and calibration. The ENVI library also contains robust methods for analyzing multi-dimensional data sets to extract unique signatures and locate features of interest. These building blocks allow developers to focus on solving intelligence problems instead of spending their time in the weeds recreating what already exists. ENVI can get you out of the starting blocks quickly and have you showing real progress in your first few development sprints.
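As a small example of that head start, reading a scene and spatially and spectrally subsetting it takes a couple of API calls rather than a custom reader. The file name and subset rectangle below are placeholders.

    ; Open a raster (format detection handled by ENVI) and subset it.
    e = ENVI(/HEADLESS)
    raster = e.OpenRaster('multispectral_scene.dat')
    subset = ENVISubsetRaster(raster, SUB_RECT=[100, 100, 1123, 1123], BANDS=[0, 1, 2])
    PRINT, subset.NCOLUMNS, subset.NROWS, subset.NBANDS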


Second, ENVI exposes these building blocks through a user-friendly application programming interface (API) we call ENVITask. ENVITask provides the programmer with a uniform and consistent method for executing analytical processes. ENVITask is also self-documenting, which minimizes the need for a developer to consult the API documentation to discover what parameters a routine expects as inputs and produces as outputs. Using ENVITask, developers will spend less time iterating on code to resolve bugs or validation errors. They will also produce clean, easy-to-read code that can quickly be modified and extended by the next person. This is important to Agile development, as multiple programmers may touch the code as the program evolves.
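A minimal ENVITask sketch looks like the following. The task name, parameters, and file path are placeholders that follow ENVI 5.x conventions as I recall them, so check the API documentation for the exact names in your version.

    ; Look a task up by name, set inputs as properties, execute, read outputs.
    e = ENVI(/HEADLESS)
    raster = e.OpenRaster('scene.dat')
    task = ENVITask('ISODATAClassification')
    task.INPUT_RASTER = raster                      ; inputs are exposed as task properties
    task.Execute                                    ; remaining parameters run at their defaults
    classified = task.OUTPUT_RASTER                 ; outputs appear as properties after Execute
    PRINT, classified.URI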


Third, complex algorithms developed, validated, and tested in the ENVI desktop environment can easily be published as web services. Since ENVITasks share a common and consistent set of parameters and data types, it is straightforward for our ENVI Services Engine software to recognize a task and automatically generate a REST endpoint accessible by web clients. This provides great flexibility: tradecraft can be designed and perfected by small teams in the desktop environment, then quickly deployed as web applications to benefit enterprise users.
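On the client side, calling a published task is just an HTTP request. The sketch below uses IDLnetURL with a hypothetical host, port, path, and query string, not the actual ENVI Services Engine URL scheme, so consult the ESE documentation for the real endpoint format.

    ; Hypothetical REST call to a published task; the URL pieces are made up.
    url = OBJ_NEW('IDLnetURL')
    url.SetProperty, URL_SCHEME='http', URL_HOSTNAME='gis-server.example.com', $
      URL_PORT='8080', URL_PATH='services/SpectralIndexTask', $
      URL_QUERY='INPUT_RASTER_URI=/data/scene.dat&INDEX=NDVI'
    response = url.Get(/STRING_ARRAY)               ; status/results returned as text (e.g., JSON)
    PRINT, response
    OBJ_DESTROY, url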


Have you developed a software application using ENVI or IDL? Please chime in here to share any advantages you found with regard to your software development process.


Categories: Imagery Speaks


3 Sep 2014

I am done with File->Open

Author: Beau Legeer

The initial step in almost any geospatial workflow is getting data into your exploitation tool of choice. For years this step was accomplished by using the “File” menu in an application and choosing the “Open” option. Over the years, open toolbar icons appeared and drag and drop was supported. All of these “Open” options are almost always preceded by an additional workflow of finding the data needed over the area of interest. This is often a time-consuming workflow that involves searching online catalogs, downloading large datasets, and often correcting and calibrating data. In the age of the cloud, these time-consuming workflows can and must improve.

I want File->Open and these time-consuming workflows to simply fall into the lexicon of the great storytellers who talk about times gone by. Today the workflow must evolve into one that starts with a location, is powered by search, and simply presents the most relevant and available data. In this world, File->Open is replaced by a map, and that map provides the location for search queries to online data sources. Data providers support this effort through catalog interfaces that allow their holdings to be searched with nothing more than a geographic bounding box, or with more detailed metadata criteria. Multiple providers can be searched and their results combined to form the basis of a solution that uses multiple data sources to derive a more informed answer to the problem at hand.

This is an ambitious idea, but one where the pieces are already in place. Data catalogs are available from commercial and government data sources today. Desktop and online exploitation services are ready to consume these data sources today. It all just needs to come together. What can we do to take the next step?


Categories: Imagery Speaks



© 2014 Exelis Visual Information Solutions