Sunday, March 11, 2012

Removing Extreme Values and Outliers with NEST

Values outside of the processed SAR image should be NaN, i.e. "no value", and usually are, but at some point in my processing chain -- I think when converting the calibration to dB -- the NaN values are converted to a value of -1E-30. For further processing, I want to set these values back to NaN. Especially when mosaicking, the values in overlapping areas of several SAR images should not be influenced by these invalid outliers.



To change band values I choose Utilities>Band Math and get the following window:


The Band Math uses a C-style if-expression which in this case says: if the image value is between -2E-30 and -1E-30, then replace it with NaN, otherwise keep the original value.
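The same replacement can be sketched outside NEST. Here is a minimal NumPy equivalent of that Band Math expression (the function name and the example pixel values are my own, and the sentinel range is the one from above):

```python
import numpy as np

def replace_outliers_with_nan(band, lo=-2e-30, hi=-1e-30):
    """Set values inside the sentinel range [lo, hi] to NaN, keep the rest."""
    band = band.astype(float)            # NaN requires a floating-point dtype
    mask = (band >= lo) & (band <= hi)   # the invalid "no data" sentinel values
    band[mask] = np.nan
    return band

pixels = np.array([3.5, -1e-30, 7.2, -1.5e-30])
cleaned = replace_outliers_with_nan(pixels)
# the two sentinel values become NaN; the real values are untouched
```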

The following is not very clear to me: you have to deselect "Virtual", and the result is written into a new band (save your file afterwards).

It would seem that you could then simply delete the original band and retain the new one, but that does not seem to work; somehow the new band still references the original one. You have to choose Utilities>Spatial Subset from View to do the job. Here I go straight to the "Band Subset" tab and deselect the original band:


If you now choose "OK", you get a new product with the one corrected band, which you can then save to disk.

This is of course too tedious if you have dozens of images, so to do this in batch mode, I create the processing chain in "Graph Builder".



Then I save this graph as an XML file. Edit this XML file so that the input and output file names are replaced by the placeholders "$file" and "$target": check under 1-Read that it reads "<file>$file</file>" and under 2-Write that it reads "<file>$target</file>".


Now you create a RemoveOutliers.bat file containing


for /r C:\Users\max\location_of_files\ %%X in (*.dim) do (gpt C:\Users\max\location_of_XMLfile\RemoveOutliers.xml -Pfile="%%X" -Tfile="C:\Users\max\location_of_new_files\%%~nX.dim")

What happens here?
  1. The for-command goes recursively through the directory containing your files, finds files named "*.dim" and passes each file name to "%%X".
  2. For each of these input files (-Pfile="%%X"), the NEST command "gpt" applies the Graph Builder processing chain saved in "RemoveOutliers.xml".
  3. The output path is given in the parameter -Tfile, here written "%%~nX.dim": the same filename but in a new directory (you can also give it a new name such as "%%~nX_NaN.dim" if you wish).
In the DOS window (type "cmd" at Windows Start > "Search programs and files" to open it), navigate to the directory containing RemoveOutliers.bat, then type "RemoveOutliers.bat" and all scenes in the specified folder will be processed.

Curiously, and conveniently, if you run Band Math using "gpt" from the command line, the new file contains only the new band, so there is no need to manually delete the original band as in the NEST GUI version above.
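For reference, the batch loop above can also be written in Python, which runs on non-Windows machines as well. This is my own sketch, not part of NEST: it only builds the gpt command strings from the graph file and the scene names; whether you print them or actually execute them is up to you.

```python
from pathlib import Path

def build_gpt_commands(in_dir, out_dir, graph_xml):
    """Mimic the DOS for-loop: one gpt call per *.dim scene found under in_dir."""
    commands = []
    for scene in sorted(Path(in_dir).rglob("*.dim")):   # recursive, like "for /r"
        target = Path(out_dir) / scene.name              # same filename, new directory
        commands.append(f'gpt {graph_xml} -Pfile="{scene}" -Tfile="{target}"')
    return commands

# Example (with the hypothetical directories from the .bat file above):
# for cmd in build_gpt_commands(r"C:\Users\max\location_of_files",
#                               r"C:\Users\max\location_of_new_files",
#                               r"C:\Users\max\location_of_XMLfile\RemoveOutliers.xml"):
#     print(cmd)   # or run it with subprocess.run(cmd, shell=True)
```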

The result (the dark area outside the scene is now NaN):





Saturday, March 10, 2012

Terrain Correction of SAR Images -- Part 4

A short comment only on the "Range Doppler Terrain Correction". As described in Part 1, the SARSIM algorithm takes the DEM and, using the orbit parameters of the satellite, creates a simulated SAR image from this DEM. The simulated and the real SAR image, which look very similar, are coregistered. Through this simulation, the displacement of each location in the original landscape (the DEM) is known, so if the simulated SAR image is transformed back onto the original DEM -- and the coregistered SAR image along with it -- the pixels of the SAR image receive their real, geographical location.

The Range Doppler algorithm does not simulate a SAR image to coregister with the original one, but calculates the displacement directly from orbit parameters and a DEM. It is much faster in processing scenes. When comparing scenes processed with both the SARSIM and Range Doppler methods, I find no difference in the final product. However, the Range Doppler method does not work for quite a few of my scenes. If I understood ESA correctly, this is due to insufficiently accurate SAR metadata, so that the calculation of the displacement is incorrect. This appears to be specific to data from the Arctic regions.

I therefore haven't used this one that much, but in other areas of the world the Range Doppler Terrain Correction may be worth using.

Choose Geometry>Terrain Correction>Range Doppler Terrain Correction (in Graph Builder choose "Terrain Correction"). The settings are as follows:


Terrain Correction of SAR Images -- Part 3

The most convenient way to process large quantities of SAR data is to use these methods from the command line. With the "gpt" command, as described in the NEST help pages or in the SNAP help pages, you can process single scenes from the command line, but here is a way to process large quantities of scenes. With the DOS "for" command you recursively search through your directories for scenes to be processed and hand each scene to the gpt command.

Here is how to do it:

First you create the processing chain with the Graph Builder as described in Part 2 and save it as an XML file. Especially in the beginning, you may want to keep to simpler processing chains that do not contain all tasks at once. In our case, let's take only the SARSIM Terrain Correction:


You set the values in Graph Builder and save it, let's say as "SARSIM_TC.xml". You should still check and edit the XML file for the parameters you need (map projection, resolution, etc.), and you will have to modify the saved XML file in two places for batch command line use, as follows:

Make sure that the filenames in the XML file have $file as a placeholder, as in the following example:


 <node id="1-Read">
    <operator>Read</operator>
    <sources/>
    <parameters class="com.bc.ceres.binding.dom.Xpp3DomElement">
      <file>$file</file>
    </parameters>
  </node>
Now you create a SARSIM_TC.bat file containing


for /r C:\Users\max\location_of_files\ %%X in (*.dim) do (gpt C:\Users\max\location_of_XMLfile\SARSIM_TC.xml -Pfile="%%X" -t "C:\Users\max\location_of_files\%%~nX_SarSimTC.dim")

What happens here?


  1. The for-command goes recursively through the directory containing your files, finds files named "*.dim" and passes each file name to "%%X".
  2. For each of these input files (-Pfile="%%X"), the NEST command "gpt" applies the Graph Builder processing chain saved in "SARSIM_TC.xml".
  3. The output path is given in the parameter -t, here written "%%~nX_SarSimTC.dim": the original filename with "_SarSimTC" inserted before the file type to indicate that the scene has been processed with SARSIM. You may choose a different naming, but I find this convenient.
In the DOS window (type "cmd" at Windows Start > "Search programs and files" to open it), navigate to the directory containing SARSIM_TC.bat, then type "SARSIM_TC.bat" and all scenes in the specified folder will be processed.

The results will be the same as shown in Part 1.



Terrain Correction of SAR Images -- Part 2

Rather than clicking through each applied method in the menu, a processing chain can be implemented with the "Graph Builder". Choose "Graphs>Graph Builder", and by right-clicking in the graph space you can add methods and connect them with arrows in the order they are run.

You can save the whole graph as an XML file for later use; this is also needed for batch command line processing. The individual tabs in the Graph Builder are just the same as described in Part 1, only be aware that for the SARSIM Terrain Correction three individual parts have to be chosen (numbers 6 through 8 below). The "SARSIM-Terrain-Correction" node you can choose is only one part of the similarly named SARSIM Terrain Correction from Part 1!


Terrain Correction of SAR Images -- Part 1

A characteristic of side-looking SAR images is the so-called foreshortening and layover: a signal reflected from a mountain top reaches the sensor earlier than, or at the same time as, the signal from the foot of the mountain. This results in the typical look of mountains that seem to have "fallen over" towards the sensor:


In the original image on the left, a pixel is basically displaced depending on its elevation above sea level, so it is important to remove this layover, as seen in the image on the right above. The freely available SNAP SAR Toolbox (in its previous version known as the NEST SAR Toolbox) is in many ways a great tool for satellite image processing and makes it very easy to terrain-correct SAR images in a fully automatic process.
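To get a feel for the size of this displacement, here is a much-simplified flat-terrain approximation (my own back-of-the-envelope sketch, not the toolbox's algorithm): a target at height h above the reference plane appears shifted towards the sensor by roughly h/tan(θ) in ground range, where θ is the local incidence angle.

```python
import math

def ground_range_shift(height_m, incidence_deg):
    """Approximate ground-range displacement towards the sensor for a target
    at height_m above the reference plane (flat-earth approximation)."""
    return height_m / math.tan(math.radians(incidence_deg))

# a 1000 m mountain top seen at a 23 degree incidence angle
shift = ground_range_shift(1000, 23)   # roughly 2.4 km towards the sensor
```

The steep incidence angles of spaceborne SAR are why the displacement is so much larger than the relief displacement in optical imagery.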

The algorithm takes the DEM and, using the orbit parameters of the satellite, creates a simulated SAR image from this DEM. The simulated and the real SAR image, which look very similar, are coregistered. Through this simulation, the displacement of each location in the original landscape (the DEM) is known, so if the simulated SAR image is transformed back onto the original DEM -- and the coregistered SAR image along with it -- the pixels of the SAR image receive their real, geographical location. (It's actually quite simple in principle, but I'm not sure this description is clear...)

Below is the original ESA SAR image as loaded into SNAP (or NEST) displaying the typical layover: 


Before the terrain correction, I apply the newest orbit file (Utilities>Apply Orbit), calibrate the values (SAR Tools>Radiometric Correction>Calibrate; but not in dB, since the terrain correction needs linear values!) and then run a speckle filter, median 3x3 (SAR Tools>Speckle Filtering>Single Image).
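Why the calibration must stay linear at this point: the terrain correction resamples the image, and resampling means averaging pixel values. Averaging dB values is not the same as averaging linear values and converting afterwards. A small sketch of my own (the two pixel values are invented for illustration):

```python
import math

def to_db(x):
    """Convert a linear backscatter value to decibels."""
    return 10 * math.log10(x)

a, b = 1.0, 100.0                         # two neighbouring linear pixel values
mean_linear_db = to_db((a + b) / 2)       # average first, then convert: ~17.0 dB
mean_of_dbs = (to_db(a) + to_db(b)) / 2   # convert first, then average: 10.0 dB
# interpolating in dB would systematically darken the resampled image
```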

Now to the actual terrain correction. Choose Geometry>Terrain Correction>SAR Simulation Terrain Correction. In the first tab 1-Read you choose the product to be corrected: 


The second tab defines the output. Unfortunately, the default output filename in this case is only "SarSimTC.dim". I follow the SNAP (NEST) naming convention where all applied methods are contained in the filename, such as "ORIGINALNAME_AppOrb_Calib_Spk_SarSimTC.dim", but this has to be typed manually:


In the "3-SAR simulation" tab, one can choose various DEMs such as GETASSE and SRTM, but in my case I choose an "External DEM" and specify the file path. I set the "no data value" to 9999.0, otherwise the entire ocean surface will be NaN. There is something unusual in this tab: if you do not highlight any of the source bands, only the first one will be processed, in this case "Amplitude_VV". If other bands should also be processed, both source bands (in this case Amplitude_VV and Amplitude_VH) must be highlighted by clicking them!


In the "4-GCP Selection" tab I leave the given values:


Finally, the "5-SARSIM-Terrain-Correction" tab. For my purpose, I choose 20 m resolution for the output image and the map projection for Svalbard/Spitsbergen, WGS 1984 / UTM 33N, and prefer nearest-neighbour interpolation:

Now I can choose "Process". This particular run takes 6.5 minutes on a Windows 7 64-bit computer for one source band. For two source bands it takes much, much longer (80 minutes in this run!) and may not work if the computer has too little RAM, so the best and fastest way is to process individual source bands separately. (The Range Doppler algorithm for removing layover, which I will discuss in a later post, is faster but does not work for some scenes at high latitudes.)

I choose "Utilities>Dataset Conversion>Linear to dB" to get decibel values and obtain this final result. The data fits perfectly to cartographic shapefiles of the coastlines and to the other geolocations.


Compared again to the original SAR image, the difference is easily visible!


A problem in the current version: if you -- after having processed a particular scene -- choose a different scene with differently named source bands as input under "1-Read", the source band list under "3-SAR Simulation" does not update; you have to close the whole window and start all over. Part 2 and the following parts describe how to process a large number of scenes.

The next postings will discuss how to run all this from command line and do batch processing.

Welcome

This blog is meant as a place for my own notes on Remote Sensing and GIS tasks. Maybe others will find them helpful in their own work, or can offer advice or point out errors in case they find any.

You may find here a link to my current projects.