Author: Stuart Blundell
The small satellite (Smallsat) revolution is fostering big ideas about the future of geospatial intelligence from a persistence point of view. First, let's define what we mean by "small" so that we can fully appreciate the scale of this new reality in earth imaging. The 2014 SpaceWorks Nano/Microsatellite market assessment defines five Smallsat classes on the basis of weight: at the small end of the spectrum are the Femtosatellites (10-100 grams), and at the heavy end are the Small Satellites (100-500 kilograms). Think of professional boxers' weight classes, with the "Femtos" being the Pinweights and the "Smalls" being the Featherweights of the group. For earth imaging purposes, the majority of the satellites will be Nanosatellites (1-10 kg) – think of Planet Labs as the Light Bantamweights of the global geospatial Smallsat arena (Figure 1).
Figure 1. In this corner, in the striped trunks, are the challengers: Planet Labs co-founders with a Flock 1 Nanosatellite (Planet Labs image)
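For quick reference, the weight classes can be captured in a small lookup table. The helper below is a hypothetical illustration, not part of the SpaceWorks assessment; the Femto, Nano, and Small bounds come from the text above, while the Picosatellite and Microsatellite bounds are the commonly used ones that fill the gaps between them:

```python
# Hypothetical helper mapping a satellite's mass (kg) to its Smallsat class.
def smallsat_class(mass_kg):
    classes = [
        (0.01, 0.1, "Femtosatellite"),    # 10-100 grams
        (0.1, 1.0, "Picosatellite"),
        (1.0, 10.0, "Nanosatellite"),
        (10.0, 100.0, "Microsatellite"),
        (100.0, 500.0, "Small Satellite"),
    ]
    for low, high, name in classes:
        if low <= mass_kg < high:
            return name
    return "outside Smallsat range"

# A Planet Labs Dove (roughly 5 kg) lands in the Nanosatellite class.
print(smallsat_class(5))
```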
Categories: ENVI Blog | Imagery Speaks
Tags: Smallsat, small satellites, microsatellites
Author: Matt Hallas
Often when I'm teaching an ENVI class or in communications with a customer, I am asked the question, "Where can I get data?!" My typical response in the past was to list sites such as GLOVIS and OpenTopo, where a wide array of free data, usually at low spectral resolution, is available for download. Next, I would rattle off the various commercial satellite organizations. While free data is truly a wonderful thing, you often need your data faster and at a higher spatial resolution than what is available for free.
Companies like DigitalGlobe and Airbus Defense and Space are at the forefront of the commercial imagery sector, with the well-known WorldView and Pleiades constellations, respectively. These constellations are just the tip of the iceberg when it comes to high resolution imagery with extremely short revisit times. WorldView-3 boasts a revisit time of less than one day with roughly 1.24 meter pixel resolution and Pleiades-1 has a similar revisit time with 2-meter pixel resolution for the multispectral bands. Wouldn’t it be nice to have a one-stop-shop for all of your data needs, where you can search what's available based on the area of interest, data type or sensor, and then acquire your data quickly so you can start your work ASAP? Enter the IntelliEarth Marketplace.
Harris boasts a truly massive library of data, as well as on-demand access to LiDAR, SAR, raster, and vector data through the IntelliEarth Marketplace. Through this online marketplace, you can purchase data in its raw form or have it processed as much or as little as you would like. Many people want a product that has just been orthorectified. We've got that covered. Other folks want even more sophisticated visual/simulation dataset models. No problem, just look to our Services page! There is a large amount of data available immediately for download, while other products need to be ordered. After placing your order, you'll receive it in anywhere from a couple of hours to a few days. The IntelliEarth Marketplace gives you access to:
The variety, and most notably the quality, of the data products provided through this online marketplace really help to differentiate the store from other options. What's more, only the IntelliEarth Marketplace provides immediate download of worldwide 30cm+ resolution imagery from DigitalGlobe and Airbus Defense and Space.
The best place to start when you are looking for data is the Data Store webpage, where you will see the variety of data types and products available. Once you know whether you want a DEM, raster data, or perhaps some LiDAR data, then view the available products found here. After selecting 'More Information' for your desired product, you will see the sensors available for that data type, as well as various options for downloading or ordering what you need. This screenshot shows the various sensors available for 'Satellite Imagery'.
Once you have perused the large library of datasets available through the marketplace and acquired the data, you need a tool to work with your information. ENVI and IDL have the ability to ingest and analyze all of the data available through the store (and much more, of course). So not only do you have access to the most recent imagery possible, you have the analytical tools required to get the job done.
This harmony between data and analytics is at the forefront of the recent acquisition of Exelis Inc. by the Harris Corporation. With the new Harris, your problem is fully supported from beginning to end. You'll get access to the best possible data products through the IntelliEarth Marketplace, as well as best-in-class analytic and feature extraction tools contained within ENVI and IDL. We're excited to work more closely with customers on the problem of data acquisition, and to continue our great track record of helping our customers get the most out of their data. Take some time to explore this marketplace and see what problems can be solved with our wide assortment of products. As always, don't hesitate to reach out to me with questions - firstname.lastname@example.org.
Tags: IDL, ENVI, GIS, Academic, Feature Extraction, Image Analysis, Image Processing, Spectral Analysis, multispectral imagery, hyperspectral imagery, DEM, NDVI, geospatial, GEOINT, LiDAR, Visualization, data analysis, geospatial data, data, Data Store, IntelliEarth Marketplace
Author: Joey Griebel
As 2016 is off to the races, one of the many exciting happenings at Harris Geospatial is the recent partnership with Icaros Geospatial Solutions and the addition of an Icaros OneButton™ Extension in ENVI. With over 1 million hobby drones sold during the holiday season, drone pilots and UAS/UAV-acquired data will only become more prevalent in the coming years. For those of us providing analytic tools that turn data into actionable information, that means we need a way to work with this data and get accurate results. This is where Icaros comes into play.
Icaros allows a user to do a variety of processing on their data sets, including photogrammetric geocorrection, aerial triangulation, digital terrain modeling, and the key piece -- orthomosaic production. The options range from "one button," where you simply click and run the processing, to "one button pro," where you have more manual options to ensure accuracy and can do additional QC, like replacing a bad scene in the mosaic. These capabilities bridge the gap between acquiring UAS/UAV data and actually being able to run analytics and provide answers in a timely fashion. On the other end of the spectrum, ENVI+IDL not only brings post-processing analytics, but also tackles one of the huge problems facing UAS/UAV-acquired data: data integrity from sensors, and the ability to preprocess the data and ensure bands are aligned before it is ingested into Icaros.
With Icaros OneButton and ENVI+IDL together, you have an end-to-end solution for working with UAS/UAV-acquired data. You can verify the data lines up and preprocess it if needed, easily ingest and stitch large swaths of data using Icaros OneButton, and then bring the data into ENVI and work with it as you would satellite imagery.
Tags: ENVI, UAV, UAS, ENVI+IDL, Icaros OneButton, photogrammetric geocorrection, aerial triangulation, digital terrain modeling, orthomosaic production
Author: Guss Wright
What does a soldier do when he or she runs into difficulty while using ENVI software? To answer this question you have to consider and understand the culture of most military organizations. Army geospatial professionals are oftentimes under the gun, so to speak, given limited time to provide GEOINT support and advice to decision makers, with the added pressure of knowing the answers we provide could be the key factor in affecting success or failure on the ground.
Considering this, we see time as one of our most precious resources, so we're always looking for the most efficient way to solve geospatial problems. Unfortunately, this often relegates soldiers to using the tools they are most familiar with or have had past success in applying, even if those tools aren't optimal for the problem at hand. Then we move on to the next task without revisiting the shortfall we just experienced or trying the more optimal tool, without requesting help or providing user feedback. I call this "suffering in silence".
As many of you may know from my last blog, I'm Chief Warrant Officer 3 Augustus Wright, and I've been training at Harris with the ENVI software since August as part of the US Army's Training with Industry program. I want to expound on the topic of "suffering in silence" because, after having spent the last six months integrating with the staff at Harris, I realize the old way of solving GEOINT problems isn't the only option when facing the aforementioned shortfalls. WE CAN ASK FOR HELP! (And I'm smiling when I say this.)
Case in point: one of the awesome Harris software engineers, Scott Paswaters, and I have recently been testing and refining ENVI's CADRG Save As capability. This is very important functionality for defense users because it enables us to provide tailored graphics in Raster Product Format (RPF) to defense end users whose systems can't, and don't need to, read vector data. Ultimately, this facilitates the provision of a Common Operational Picture across military platforms. During the process, we discovered a shortfall which prevented software such as Falconview from being able to read ENVI's output. We immediately queried the field to see who else was experiencing this problem and discovered there were many.
Within two days Scott found the problem, produced a patch, and he and I are already exploring methods to develop a tool that will use the IDL-python bridge to automate the entire specialized CADRG creation workflow for defense users. Keep an eye out for this tool.
In addition I’ve written and compiled what we call the “ENVI Pocket Guide”. The ENVI Pocket Guide is a quick reference booklet NOT intended to be read from cover to cover although it can be. The intent is to provide users succinct steps on how to accomplish common tasks in ENVI. The RPF export workflow and other pertinent information such as how to contact ENVI technical support and online help can also be found in this guide. We anticipate its release will be in tandem with Esri's upcoming Federal User Conference (FED GIS) February 24-25. Check back here and I'll link to it as soon as it is available.
If you're a military user of ENVI who is having specific issues or workarounds, comment below and we'll see what we can do. Let’s not “suffer in silence”!
Tags: ENVI, Esri, GEOINT, military, Army, IDL Python Bridge, ENVI in theatre, ENVI Pocket Guide, FED GIS, Falconview, Raster Product Format, CADRG Save As
Author: Zachary Norman
For a good portion of my life I have been a space nerd and I have always enjoyed looking at images of other planets in our solar system, nebulae, and other galaxies. Recently there was a new addition to the group of photographed celestial objects: Pluto. On July 14th, 2015 the New Horizons probe flew within 12,600 km of the surface of Pluto and took some awesome images. With these new images, I thought it would be a neat idea to try and mosaic a few of the images together using ENVI to see if it could be done.
In order to get some images that would be suitable for mosaicking, I needed to make sure that they were taken in quick succession so that they would have a good amount of overlap with each other. This allows ENVI to register the images to one another without too much trouble. To find these images, I got data from the New Horizons LORRI instrument at the following link:
When selecting images to mosaic, it was really important that the images had similar acquisition times. After looking through the available images from LORRI, I found 13 images with acquisition times ranging from 11:36:00 UTC to 11:36:36 UTC. It is great that the images of Pluto are available, but unfortunately there was no metadata associated with the data to accurately geolocate the images. To work around this missing metadata, I decided to create a dummy spatial reference centered on a reference image. For my base image, I chose the picture taken at 11:36:00 UTC and gave it a basic WGS-84 coordinate system with fake pixel sizes of 1 meter by 1 meter. It is important to note that my fake coordinate system and pixel sizes likely do not correspond at all to the actual coordinate system or pixel sizes; I just needed a spatial reference that I could use to register and mosaic the images in ENVI.
With my reference image and fake spatial reference, the next step was to figure out how to get the images registered properly to one another. After doing some investigation with the Image Registration Workflow, I realized that in order to get the best results I needed to have the images placed as close to their actual relative positions as possible. To do this, I looked at my reference image and first image to be registered in the series (reference image had the acquisition time of 2015-07-14 11:36:00 UTC and the first image to be registered was acquired at 2015-07-14 11:36:03 UTC).
After looking closely at the two images and tweaking things by hand a bit, I found that the approximate pixel offset of the second image with respect to the first was about [-580, -18] pixels in the x and y directions. This meant that the first image in the series after the base image had an offset of [-580, -18] pixels and the second had an offset of [-1160, -36] pixels. At this point I decided to use the ENVI API to programmatically apply this offset to each image in the series. A big reason for choosing the ENVI API is that I wouldn't have to step through the Image Registration Workflow for every image in my series, and if I changed the offset at all, it would be quick to apply to a different image.
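The offset arithmetic above is simple to sketch: assuming the frames are evenly spaced in time, image i in the series sits i per-frame offsets away from the base image. This small Python sketch (my own illustration, not the ENVI API code) shows the idea:

```python
# Cumulative pixel offset of the i-th image relative to the base image,
# given the per-frame offset measured by hand ([-580, -18] pixels).
def cumulative_offset(i, per_frame=(-580, -18)):
    return (i * per_frame[0], i * per_frame[1])

print(cumulative_offset(1))  # first image after the base
print(cumulative_offset(2))  # second image after the base
```

cumulative_offset(1) gives (-580, -18) and cumulative_offset(2) gives (-1160, -36), matching the offsets quoted above.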
After I reprojected each image based on this pixel offset, the images were relatively close to their correct relative positions. All that was left was to apply color balancing (through a custom histogram matching function), register the images to one another, and mosaic them together. Here is a quick summary of the steps I performed using the ENVI API:
Apply color matching to reference image -> Use task GenerateTiePointsByCrossCorrelation -> Use task FilterTiePointsByGlobalTransform -> Use task ImageToImageRegistration -> Use task BuildMosaicRaster.
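The color-matching step relies on classic CDF-based histogram matching: each pixel value is remapped so that the image's cumulative distribution follows that of a reference image. As a language-neutral illustration of the idea (this NumPy sketch is my own, not the IDL function from the post):

```python
import numpy as np

def match_histogram(source, reference):
    """Remap source pixel values so their CDF follows the reference CDF."""
    # unique gray levels and their counts in each image
    s_vals, s_counts = np.unique(source, return_counts=True)
    r_vals, r_counts = np.unique(reference, return_counts=True)
    # cumulative distribution functions, normalized to [0, 1]
    s_cdf = np.cumsum(s_counts) / source.size
    r_cdf = np.cumsum(r_counts) / reference.size
    # for each source level, find the reference level with the closest CDF
    mapped = np.interp(s_cdf, r_cdf, r_vals)
    # look up the mapped value for every pixel in the source image
    return mapped[np.searchsorted(s_vals, source)].reshape(source.shape)
```

With a dark source image and a bright reference, the output takes on the brightness distribution of the reference, which is exactly why the post applies this before mosaicking.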
Below is the output mosaic and the IDL code used to create the image.
; :Author: Zach Norman
;
; Small example program to register and mosaic some Pluto images from the
; New Horizons LORRI instrument. Save this code as pluto_mosaicking.pro and
; press 'Run' in the IDL Workbench to run it. See the output section below
; for where to find the results of the program and details on the results.
;
; The code is written to automatically download the images of Pluto to a
; directory called "images" that will be created in the same location as
; this .pro file. If there are any issues downloading the files, then you
; may have to download them from the web by hand. This could be necessary
; if the URLs for the images from NASA change. You may need to make some
; additional changes as well if you download the images by hand.
; Specifically, you may have to change some of the parameters in the
; procedure pluto_mosaicking so that they are correct for your machine.
; If you download the Pluto images from
; http://pluto.jhuapl.edu/soc/Pluto-Encounter/index.php then you will need
; to specify the value of "imagedir" in the pluto_mosaicking procedure.
;
; If you want to or have to download the images by hand, there are 13
; images with acquisition times ranging from 2015-07-14 11:36:00 UTC to
; 2015-07-14 11:36:36 UTC. As is, the code is written so that you place
; this .pro file next to the directory which contains the images.
;
; Output: three files will be generated in the same directory as this .pro
; file, named pluto_mosaic, pluto_mosaic.hdr, and pluto_mosaic.tif. The
; first is an ENVI format image that can be read into ENVI with the header
; file (.hdr), and the third is a TIFF image so that you can view the
; mosaic with a general photo viewer. Note that the ENVI header file
; contains fake projection information: I created a dummy spatial reference
; for each image since there was no metadata provided with the LORRI images.
; small function for the histogram matching between images; this is useful
; because some of the images have bad automatic scaling from NASA and they
; appear very dark
function histogram_matching, match, base
  compile_opt idl2
  maxval = (max(match)) > (max(base))

  ; calculate the histograms and cumulative densities
  match_histogram = histogram(match, MIN = 0, MAX = maxval, BINSIZE = 1)
  match_cdf = total(match_histogram, /CUMULATIVE) / n_elements(match)
  base_histogram = histogram(base, MIN = 0, MAX = maxval, BINSIZE = 1)
  base_cdf = total(base_histogram, /CUMULATIVE) / n_elements(base)

  ; create the transform function z to switch from the base data space
  ; to the match data space
  z = bytscl(base_histogram)
  for j = 0, n_elements(z) - 1 do begin
    i = where(match_cdf lt base_cdf[j], count)
    if (count gt 0) then z[j] = i[-1] else z[j] = 0
  endfor

  ; do the histogram matching
  matched = z[base]
  return, matched
end
; main routine: download, register, and mosaic the Pluto images
pro pluto_mosaicking
  compile_opt idl2

  ; if ENVI is already running, close all data sources and reset the views
  e = envi(/CURRENT)
  if (e ne !NULL) && (e.widget_id ne 0) then begin
    ; reset our views
    views = e.getView(/ALL)
    foreach view, views do view.close
    ; close all open rasters and vectors
    opendata = (e.data).get(/RASTER, /VECTOR)
    foreach raster, opendata do raster.close
  endif

  ; start ENVI and make it a new state!
  if (e eq !NULL) then e = envi(/HEADLESS) else e.reset
  ; approximate pixel offset between successive images
  px_off = [-580, -18]

  ; directories for running the code
  thisdir = file_dirname(((scope_traceback(/STRUCTURE))[-1]).filename)
  ; imagedir is the location of the original images
  imagedir = thisdir + path_sep() + 'images'
  ; copydir is the location of copies of the images from imagedir that
  ; have a dummy spatial reference
  copydir = thisdir + path_sep() + 'image_copies'
  ; warpeddir is where the registered images go
  warpeddir = thisdir + path_sep() + 'images_warped'

  ; if imagedir does not exist, then make it and download the files to
  ; that directory; if the URLs change then these will need to be modified
  ; by hand, or the images downloaded by hand from the following URL:
  ; http://pluto.jhuapl.edu/soc/Pluto-Encounter/index.php
  if ~file_test(imagedir) then begin
    print, 'Did not find the directory ' + imagedir
    print, string(9b) + 'Creating directory and downloading images...'
    file_mkdir, imagedir
    ; the original post listed all 13 LORRI image URLs here; only the
    ; first survived this copy, so append the remaining URLs to the array
    urls = ['http://pluto.jhuapl.edu/soc/Pluto-Encounter/data/pluto/level2/lor/jpeg/029917/lor_0299179679_0x636_sci_3.jpg']

    ; use the IDLnetURL object to download each image
    urlobj = IDLnetURL()
    for i = 0, n_elements(urls) - 1 do begin
      outfile = imagedir + path_sep() + strmid(urls[i], strpos(urls[i], '/', /REVERSE_SEARCH) + 1)
      void = urlobj.get(FILENAME = outfile, URL = urls[i])
      print, string(9b) + string(9b) + 'Downloaded image ' + strtrim(i + 1, 2) + ' of ' + strtrim(n_elements(urls), 2)
    endfor
    print, string(9b) + 'Downloaded all images!'
  endif

  ; make our output directories if they don't exist
  if ~file_test(copydir) then file_mkdir, copydir
  if ~file_test(warpeddir) then file_mkdir, warpeddir

  ; go to imagedir and look for the Pluto images
  cd, imagedir, CURRENT = first_dir
  files = file_search('*.jpg')
  ; create a fake spatial reference for the base image
  psize = 1d * (180 / !CONST.R_EARTH) ; pixel size of roughly 1 meter, in degrees
  spatialRef = ENVIStandardRasterSpatialRef( $
    COORD_SYS_CODE = 4326, /GEOGCS, $
    PIXEL_SIZE = [psize, psize], TIE_POINT_PIXEL = [0.0d, 0.0d], $
    TIE_POINT_MAP = [0.0d, 0.0d])

  ; make sure to sort the files so that we are arranged by time
  files = files[sort(files)]

  ; open the base image and copy it to an ENVI raster that carries the
  ; dummy spatial reference
  reference = e.openraster(files[0])
  reference_data = reference.getData()

  ; export the original raster to a copy raster
  outfile = copydir + path_sep() + file_basename(files[0], '.jpg')
  if file_test(outfile) then file_delete, outfile
  ref_copy = ENVIRaster(reference_data, SPATIALREF = spatialRef, URI = outfile)
  ref_copy.save
  ; perform histogram matching on all images and export them to the copy
  ; directory, shifting each dummy spatial reference by the approximate
  ; pixel offset (the exact tie-point arithmetic was truncated in the
  ; original listing, so the TIE_POINT_MAP below is an assumption)
  for i = 1, n_elements(files) - 1 do begin
    spatialRef = ENVIStandardRasterSpatialRef( $
      COORD_SYS_CODE = 4326, /GEOGCS, $
      PIXEL_SIZE = [psize, psize], TIE_POINT_PIXEL = [0.0d, 0.0d], $
      TIE_POINT_MAP = [i * px_off[0] * psize, -i * px_off[1] * psize])
    warp = e.openraster(files[i])
    warp_data = warp.getData()
    warp_match = histogram_matching(reference_data, warp_data)
    outfile = copydir + path_sep() + file_basename(files[i], '.jpg')
    if file_test(outfile) then file_delete, outfile
    warp_copy = ENVIRaster(warp_match, SPATIALREF = spatialRef, $
      INHERITS_FROM = warp, URI = outfile)
    warp_copy.save
  endfor
  ; go to the copy directory and register our images to one another
  files = file_basename(files, '.jpg')
  cd, copydir

  ; copy the reference image to the warped directory and open it in its
  ; place (the base image needs no registration)
  reference = e.openraster(files[0])
  outfile = warpeddir + path_sep() + files[0]
  if file_test(outfile) then file_delete, outfile
  reference.export, outfile, 'ENVI'
  reference.close
  reference = e.openraster(outfile)

  ; perform image registration on all images
  print, 'Found ' + strtrim(files.length, 2) + ' images to register!'
  for i = 1, n_elements(files) - 1 do begin
    warp_this = e.openraster(files[i])

    ; get the auto tie point generation task from the catalog of ENVITasks
    TieTask = ENVITask('GenerateTiePointsByCrossCorrelation')
    TieTask.INPUT_RASTER1 = reference
    TieTask.INPUT_RASTER2 = warp_this
    TieTask.execute
    TieTask_TiePoints = TieTask.OUTPUT_TIEPOINTS

    ; get the tie point filter task from the catalog of ENVITasks
    FilterTask = ENVITask('FilterTiePointsByGlobalTransform')
    FilterTask.INPUT_TIEPOINTS = TieTask_TiePoints
    FilterTask.execute
    FilterTask_TiePoints = FilterTask.OUTPUT_TIEPOINTS

    ; get the image-to-image registration task from the catalog of ENVITasks
    RegistrationTask = ENVITask('ImageToImageRegistration')
    RegistrationTask.WARPING = 'RST'
    RegistrationTask.INPUT_TIEPOINTS = FilterTask_TiePoints
    RegistrationTask.OUTPUT_RASTER_URI = warpeddir + path_sep() + files[i]
    RegistrationTask.execute

    ; the newly registered image becomes the reference for the next one
    reference = RegistrationTask.OUTPUT_RASTER

    ; close the raster that was warped
    warp_this.close
    ; clean up the tie point files
    TieTask_TiePoints.close
    FilterTask_TiePoints.close

    print, string(9b) + 'Completed ' + strtrim(i, 2) + ' of ' + strtrim(files.length - 1, 2) + ' image registrations'
  endfor
  print, 'Registered all images!'

  ; close the output raster
  reference.close
  ; go to the warped directory and open each warped raster for the mosaic
  cd, warpeddir
  scenes = objarr(n_elements(files))
  for i = 0, n_elements(files) - 1 do scenes[i] = e.openraster(files[i])
  print, 'Generating mosaic...'

  ; get the task from the catalog of ENVITasks
  Task = ENVITask('BuildMosaicRaster')

  ; define inputs
  Task.INPUT_RASTERS = scenes
  Task.COLOR_MATCHING_METHOD = 'Histogram Matching'
  Task.COLOR_MATCHING_STATISTICS = 'Entire Scene'
  Task.FEATHERING_METHOD = 'Edge'
  Task.FEATHERING_DISTANCE = make_array(n_elements(files), /INTEGER, VALUE = 40)

  ; define outputs
  outfile = thisdir + path_sep() + 'pluto_mosaic'
  if file_test(outfile) then file_delete, outfile
  Task.OUTPUT_RASTER_URI = outfile

  ; run the task
  Task.execute

  ; export the raster as a TIFF image
  outfile = thisdir + path_sep() + 'pluto_mosaic.tif'
  (Task.OUTPUT_RASTER).export, outfile, 'TIFF'

  ; return to the first directory
  cd, first_dir
  print, 'Mosaic complete!'
end
To run the code, simply create a text file called "pluto_mosaicking.pro" and copy/paste the code into the file. When you run the program, IDL will automatically download the Pluto images, create directories, and produce the mosaicked output in the same directory as the pluto_mosaicking.pro file. If there is an issue while downloading the files, you may have to manually download them from NASA at http://pluto.jhuapl.edu/soc/Pluto-Encounter/. Additional instructions are included at the top of the code for how to run the program along with comments throughout the example to explain the steps taken.
Tags: IDL, ENVI, NASA, ENVI API, New Horizons, Pluto imagery, mosaicking