Author: Rebecca Lasica
This is the time of year when I deliberately spend some time to reflect and look forward, both professionally and personally. One professional reflection that fills me with gratitude is the opportunity I have had this year to work on some innovative projects in remote sensing. Looking back I realize how much I appreciate the beauty in nature and how fortunate I am to work in an industry that continues to thrive and grow. In the spirit of beauty in nature, I share with you this image of the month.
Phytoplankton in Patagonia Shelf Break currents off the coast of South America in the Atlantic. Image Credit: Norman Kuring, NASA's Ocean Color Group, using VIIRS data from the Suomi National Polar-orbiting Partnership
Categories: ENVI Blog | Imagery Speaks
Author: Patrick Collins
In a previous blog post I spoke about the interoperability between ENVI and ArcGIS® for Desktop. This included things like data format support, drag and drop of data between interfaces, single-click push of ENVI derived data to ArcGIS, and the ability to run ENVI analytics from directly within the ArcGIS desktop software. Now I'd like to go over some of the ways that ENVI plays well with ArcGIS for Server, and how users can utilize this interoperability to streamline workflows.
Accessing ArcGIS for Server from ENVI
The first scenario I'd like to look at is using ENVI to directly access and analyze an Image Service being served from ArcGIS for Server. ENVI has full support for the image service specification, and can easily consume and analyze services from ArcGIS for Server.
Access is made using the ENVI Remote Connection Manager, which allows you to connect to your ArcGIS for Server instance and access the data contained there. Once you've connected to your dataset(s), you can use any of the analysis tools available in ENVI to analyze the data behind the service, bringing results local for further analysis. Once the data has been accessed and analyzed, you can push your derived results over to the ArcGIS desktop using the aforementioned desktop interoperability, and publish those results back to your ArcGIS for Server for dissemination. This brings the full suite of ENVI analysis tools to bear on any image service, and provides a quick method for getting those results back into ArcGIS for Server and shared with the rest of your organization.
Running ENVI Geoprocessing Services through ArcGIS for Server
The second scenario I'd like to explore is running ENVI analytic functionality from the ArcGIS desktop via an ArcGIS for Server instance. In this case ENVI for ArcGIS - Services Edition has been installed alongside ArcGIS for Server, exposing the ENVI functionality as an ArcGIS Geoprocessing Service. ArcGIS can then access not only the data from the server, but can also run analysis using server-based ENVI analytic functions on server-based or local data.
This makes it much easier for developers to centralize the location of their custom analysis while easily exposing that utility across the organization. Analysts can now run ENVI and ArcGIS functionality within the familiar desktop interface and even chain together the different processes into repeatable custom functions using the ArcGIS Model Builder.
Running ENVI Functionality on an ArcGIS Image Service from a Thin Client
In this third scenario, ENVI analytics are run against an ArcGIS Image Service directly from a lightweight web client. This allows users with low bandwidth to consume advanced analytics on large datasets without having to move data around or have massive computing power on their device. Web pages can be built to consume only the data and analysis that a specific user, or community, is interested in, creating very friendly user-interface experiences for the non-traditional consumer of GIS. It also showcases how standards-based cloud architecture can be designed to consume remote data and analytics from any web-enabled device.
These are just three ways that ENVI and ArcGIS for Server can be used to streamline data and analysis workflows within an organization. With new development going into the interoperability between IDL and Python, soon building custom code that leverages the best functionality from both the ArcGIS and ENVI environments will be easier than ever. What do you think? How do you see cloud-based analysis evolving in the future?
*All images in this blog post are used courtesy of DigitalGlobe™ Inc.
Tags: ENVI, Esri, Cloud, ArcGIS, ENVI Services Engine, Interoperability, ArcGIS for Server
Author: James Slater
In Greece, the forest fire season runs from May 1 to October 31 each year. During this period, which coincides with the tourist season, the civil protection mechanism and fire services are on high alert and operate under a well-planned scheme. However, local administration authorities and municipalities do not yet provide access to mapped assessments of burned areas immediately after or during a wildfire event; instead they rely on central government agencies and institutions through what can be a time-consuming bureaucratic and fiscal process.
For the last three years, as part of its CSR initiative and bundled with an environmental remote sensing outreach initiative for local government, Inforest Research has worked to extend its knowledge and experience in applying remote sensing technologies to deliver rapid forest fire damage assessment maps. By relying on free remote sensing data sources (MODIS and, more recently, Landsat 8 OLI datasets) combined with semi-automatic processing in ENVI, Inforest produces map products for burned areas and major fire occurrences that are then made freely available for institutions to use.
Chios & Rhodes Island Analysis
From these assessment maps, Inforest Research additionally produces newsletters distributed directly to local authorities, and makes the methodology available so that institutions and users can repeat and implement the process. The newsletters, written in Greek, are available to download, while a video explaining the concept and methodology can be viewed on Inforest's YouTube Channel. Beneficiaries of this work include not only local government but also conservation NGOs such as WWF Greece.
Examples of the Inforest Research Forest Fire Damage Assessment newsletters for events in 2013 and 2014.
Inforest Research hopes these efforts will help improve the understanding of and response to the impact of forest fire events, while improving local agencies' capability to conduct this analysis in house through sharing of methodologies, implementation, and outcomes.
Author: Matt Hallas
Before we begin, we must reaffirm the end goal of this process: to find the most spectrally pure, or spectrally unique, pixels (endmembers) within the dataset and to map their locations and sub-pixel abundances. Following the atmospheric correction step, we move on to Spectral Data Reduction. Determining the inherent dimensionality of a dataset allows the analyst to ignore "noisy" data and, by separating information from noise, identify the number of spectrally distinguishable endmembers.
A hyperspectral image contains a truly overwhelming amount of information, and efficiently and accurately identifying the spectral bands containing the most information is a necessary first step. Later on we will use our expertise and other ENVI tools to identify the endmembers. What we accomplish in this step can be thought of as removing some of the clutter to see our data more clearly.
To focus solely on the pure endmembers found within our dataset, we perform a Minimum Noise Fraction (MNF) Transform. The MNF Transform alters the data so that the analyst can subset out a large number of the spectral bands, because the vast majority of unique spectral information will be contained within the first few MNF Transform bands (Boardman, J. W., 1993, "Automating Spectral Unmixing of AVIRIS Data Using Geometry Concepts," in Summaries of the Fourth Annual JPL Airborne Geoscience Workshop, JPL Publ. 93-26, Vol. 1, Jet Propulsion Laboratory, Pasadena, CA, pp. 11-14). Rather than dwell on the theory, I will leave it to our fine documentation center to deliver a much more detailed explanation of the MNF Transform, and focus here on how it is employed within ENVI.
You can accomplish this spectral data reduction through the ENVI interface. The first portion of the transform consists of estimating the noise statistics from the data using a shift-difference technique. In ENVI, you do this by selecting a spatial subset of spectrally uniform data from which to calculate a noise-statistic baseline. A homogeneous group of pixels should exhibit similar spectral signatures; any variance within the spectra is assumed to be the result of noise, and from that variance the noise-statistic baseline can be calculated.
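The mechanics of the transform can be illustrated outside of ENVI. The sketch below is plain NumPy, not ENVI's actual implementation (the function name and details are my own): it estimates a noise covariance with a shift difference between adjacent pixels, whitens the data against that noise, and then runs a PCA on the whitened cube.

```python
import numpy as np

def mnf_transform(cube):
    """Illustrative Minimum Noise Fraction transform of a
    (rows, cols, bands) image cube. A sketch only; ENVI's
    implementation differs in detail."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    X -= X.mean(axis=0)  # center the data

    # 1. Shift-difference noise estimate: adjacent pixels are assumed
    #    nearly identical, so their difference is dominated by noise.
    diff = (cube[:, :-1, :] - cube[:, 1:, :]).reshape(-1, bands)
    noise_cov = np.cov(diff, rowvar=False) / 2.0

    # 2. Whiten the data with respect to the noise
    #    (assumes the noise covariance is positive definite).
    evals, evecs = np.linalg.eigh(noise_cov)
    whiten = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Xw = X @ whiten

    # 3. PCA of the noise-whitened data; the eigenvalues rank
    #    the output bands by signal-to-noise.
    mnf_evals, mnf_evecs = np.linalg.eigh(np.cov(Xw, rowvar=False))
    order = np.argsort(mnf_evals)[::-1]  # largest eigenvalue first
    mnf_evals, mnf_evecs = mnf_evals[order], mnf_evecs[:, order]

    mnf_bands = (Xw @ mnf_evecs).reshape(rows, cols, bands)
    return mnf_bands, mnf_evals
```

On real data such as cup95eff.int you would of course read the equivalent values straight off ENVI's MNF Eigenvalues plot rather than computing them by hand.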
After some moments of processing, ENVI will display the MNF-transformed image as well as a profile containing the MNF eigenvalues. Bands with large eigenvalues (greater than 1) contain data, while bands with eigenvalues near or less than one contain mostly noise. Recall that the file we are using, cup95eff.int, is a hyperspectral image containing only fifty spectral bands. This matters because the transform produces one eigenvalue per band, so the highest eigenvalue number will always equal the number of spectral bands in your data.
The MNF Eigenvalues plot shows that we will be able to subset out a large number of spectral bands. After eigenvalue number 30 the eigenvalues fall below one, so we know we can essentially discard nearly half of the data, and even more if what you seek in your imagery is relatively abundant within the scene. A quick note: many people will find a high eigenvalue number to be of importance, so ENVI does not discard any of the information; it is up to the user to decide what is and is not important to their pursuit.
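The eigenvalue-greater-than-one rule of thumb is easy to apply programmatically. A minimal sketch (the eigenvalues below are made-up illustration values, not the actual cup95eff.int output):

```python
import numpy as np

# Hypothetical eigenvalues from an MNF transform; in practice these
# would come from ENVI's MNF Eigenvalues plot for your own data.
eigenvalues = np.array([28.4, 15.2, 9.7, 4.1, 2.3, 1.6, 1.1,
                        0.97, 0.91, 0.88])

# Keep only the coherent-information bands: eigenvalues above 1.
keep = np.flatnonzero(eigenvalues > 1)
n_keep = len(keep)  # number of MNF bands worth retaining
```

With these made-up values, seven of the ten bands survive the cut and the rest can be set aside as noise.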
MNF Band 1 contains the largest eigenvalue and, as a result, the largest amount of variance. It is necessary to visually inspect the transform bands, and the Band Animation Tool makes this task extremely easy. Within the Band Animation Tool you can cycle through all of the transform bands to determine which contain strictly noise and will not aid in the identification of endmembers. The first few bands will appear similar to grayscale images, and by the time you get to MNF Bands 15 and 20 you can see the image more closely resembles a TV with bad signal.
MNF Band 1
Through viewing the bands created by the MNF Transform tool we can start to weed out the superfluous data and focus only on the most interesting information.
MNF Band 20
Although the vast majority of the pixels in MNF Band 20 appear very noisy, the dark region could highlight a homogeneous grouping of pixels with unique spectra. Further investigation of the spectral profile of those pixels, and comparison against a spectral library, will allow us to identify the presence of alunite; that comes further along in the spectral hourglass workflow.
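Comparison against a spectral library can be done several ways; one common measure is the spectral angle, the metric behind ENVI's Spectral Angle Mapper. A minimal sketch of the idea (my own helper function, not ENVI code):

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle in radians between a pixel spectrum and a library
    spectrum. Smaller angles mean a closer spectral match, and the
    measure is insensitive to overall brightness differences."""
    cos_theta = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))
```

Because only the shape of the spectrum matters, a shaded pixel and a brightly lit pixel of the same material score essentially the same angle against the library entry.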
We are well on our way to extracting the most pure endmembers from our dataset with the help of some high-powered ENVI tools. The next few steps will begin to rely more heavily on the expertise of the user as well as the n-D Visualizer tool found within ENVI. Be sure to check back in about six weeks for Part 4 and as always please feel free to comment below or send me an email directly at email@example.com.
You can also go the programmatic route and write scripts to perform an MNF transform using the ENVIForwardMNFTransformTask and ENVIInverseMNFTransformTask routines.
Tags: ENVI, VIS, Exelis, Remote Sensing, Image Analysis, Image Processing, Spectral Analysis, geospatial, data processing, Visualization, hyperspectral data reduction
Author: Peter DeCurtins
Expectations that the United States would once again become the world's largest oil producer appear to have proved true this year. We are in the midst of the shale boom, something which I could have only dreamed of back in the oil bust days of the 1990s when I worked in the geophysical industry processing seismic data. The business was pretty lean back then, but that economic situation was driving a wonderful revolution in the technology that is used in the exploration for and production of oil and natural gas. The focus of that time seemed to be centered on interactivity and integration. It was all about bringing the various data together in one place and providing powerful, interactive tools running on scientific workstations to the geoscientists and engineers working to find and exploit hydrocarbon resources.
Geologists and geophysicists rely on data from existing wells for their ground truth. It is only where the surface has been drilled into that we have direct knowledge of the constituent rocks and structure of the subsurface. But the information concerning the well is accurate only for a very small, well-defined area. The Earth is big, and wells are costly to drill. It's necessary to use interpretation techniques to extrapolate the data between wells, introducing considerable subjectivity into the analysis. The desire is to visualize the subsurface with greater degrees of clarity and resolution. The most cost-effective technology for doing that has long involved the imaging of seismic data.
An interpreted seismic section. From Earth's physical resources: Petroleum. An OpenLearn chunk used by permission of The Open University copyright © 2010. CC-by-NC-sa
Seismic data is produced by introducing acoustic energy into the subsurface and recording the resulting sound waves as they reflect off of different layers and travel back to receivers on the surface. Digital processing of the data can be used to produce an image representing the subsurface. 3D seismic surveys are the most cost-effective way to date of providing insight into the interpretation process.
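As a rough sketch of the geometry involved (the velocity and travel time below are made-up example numbers, not values from any survey): the recorded travel time is two-way, down to the reflector and back, so the one-way time is half of it.

```python
def reflector_depth(two_way_time_s, avg_velocity_m_s):
    """Approximate depth to a reflector from two-way travel time,
    assuming a single average acoustic velocity for the overburden.
    The pulse travels down and back, so one-way time is half the
    recorded two-way time."""
    return avg_velocity_m_s * two_way_time_s / 2.0

# Example: a reflection recorded at 2.0 s two-way time through rock
# with an average velocity of 3000 m/s.
depth = reflector_depth(2.0, 3000.0)
```

Real processing replaces the single average velocity with a velocity model that varies with depth and position, which is a large part of what makes seismic imaging computationally demanding.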
A typical marine seismic survey acquisition layout. From Earth's physical resources: Petroleum. An OpenLearn chunk used by permission of The Open University copyright © 2010. CC-by-NC-sa
The production coming out of the rich shales of Texas and North Dakota is an indication of how successfully technology in the industry has been brought to bear. Not only are we able to effectively locate such hard-to-find resources, but extraction and production techniques such as horizontal drilling and hydraulic fracturing enable efficient production from these hard-to-exploit reservoirs. But the industry sees the need to push on to the next level, and it believes that the ticket to get there is prescriptive analytics.
This will require the ability to analyze vast quantities of structured and unstructured data in an integrated fashion. The industry hopes to simultaneously analyze images from well logs, seismic reports, video and image feeds collected downhole and during drilling and fracking operations, audio recordings collected by various field sensors, and text notes collected from operations, along with a wealth of numeric information collected in place. This is a multidisciplinary domain, combining different approaches to interpreting disparate data sets as an integrated whole. Image interpretation will rely on machine learning and computer vision, leading to pattern recognition of processed data.
Ultimately, prescriptive analytics means being able to derive from analysis what is going to happen when, and how to best prepare for and optimize the future. In terms of petroleum exploration and production, one would hope to be able to better predict the future performance of various wells in an oil field. Companies will better know where to drill as well as, maybe most importantly, where not to. That's really what the whole endeavor has always been about, making prescriptive analytics a seemingly perfect fit.
3D visualization of integrated seismic and well data. From Earth's physical resources: Petroleum. An OpenLearn chunk used by permission of The Open University copyright © 2010. CC-by-NC-sa
Prescriptive analytics holds promise beyond that. By creating a holistic, integrated view of data collected from in-field production equipment, failures of pumps and other machinery can be better anticipated and mitigating actions prescribed to minimize production losses. Prescriptive analytics may be used to predict corrosion and fatigue failures in pipelines and prescribe preventive maintenance by analyzing video and other data collected by robotic devices that travel inside the pipeline network.
Better decisions mean better exploration and production results with fewer resources expended and less impact on the environment. The winners of the race to the future will be those best able to utilize advances in technology to most efficiently and safely extract, produce, and deliver oil and gas to market. Just as it has always been.