Chapter 4
ASAR Frequently Asked Questions

This chapter answers some frequently asked questions about SAR imaging and ASAR data.


Question 4.1 : What does synthetic aperture mean?

In general, the larger the antenna, the more information you can obtain about a particular viewed object. With more information, you can create a better image of that object (improved resolution). It's prohibitively expensive to place very large radar antennas in space, however, so researchers found another way to obtain fine resolution: they use the spacecraft's motion and advanced signal processing techniques to simulate a larger antenna.

A SAR antenna transmits radar pulses very rapidly. In fact, the SAR is generally able to transmit several hundred pulses while its parent spacecraft passes over a particular object. Many backscattered radar responses are therefore obtained for that object. After intensive signal processing, all of those responses can be manipulated such that the resulting image looks as if the data were obtained from a big, stationary antenna. The synthetic aperture, in this case, is therefore the distance travelled by the spacecraft while the radar antenna collected information about the object. Please see the associated graphic below (Figure 4.1).

The ERS-1 satellite's SAR sends out around 1700 pulses a second, collects about a thousand backscattered responses from a single object while passing overhead, and the resulting processed image has a resolution near 30 metres. The spacecraft travels around 4 kilometres while an object is "within sight" of the radar, implying that ERS-1's 10 metre x 1 metre radar antenna synthesizes a 4 kilometre-long stationary antenna!

Figure 4.1 Synthetic Aperture
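
As a back-of-the-envelope check of the ERS-1 figures above, here is a minimal Python sketch; the ~7.5 km/s orbital speed is an assumed typical value for a satellite at ERS-1's altitude, not a number quoted in this handbook.

    # Back-of-the-envelope check of the ERS-1 numbers quoted above.
    prf = 1700.0        # pulse repetition frequency [pulses/s]
    aperture = 4000.0   # synthetic aperture length [m] (distance travelled)
    velocity = 7500.0   # assumed orbital speed [m/s]; not from this handbook

    time_in_view = aperture / velocity   # how long an object is "within sight"
    echoes = prf * time_in_view          # backscattered responses per object

    print(f"Time in view: {time_in_view:.2f} s")   # ~0.53 s
    print(f"Echoes collected: {echoes:.0f}")       # ~900, "about a thousand"
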
Question 4.2 : Why is radar often used in remote sensing?

(Why are radio waves, visible light, and infrared radiation the most common forms of electromagnetic radiation sensed by Earth observing satellites?)

The wavelengths of electromagnetic radiation most commonly used for remotely sensing Earth are: the spectrum of visible light, a wide spectrum of radio wavelengths, and several infrared wavelengths. A partial explanation of why these wavelengths are preferable is outlined below. When radar (which employs radio waves) is selected from these possible choices, the decision is usually based upon radar's independence of solar illumination and weather conditions. See the questions: How does radar "see" at night? and How does radar "see" through clouds? for more details.

There is a good reason why our eyes sense the electromagnetic radiation (light) they do: visible light represents a significant portion of the electromagnetic radiation which can pass through Earth's atmosphere and ionosphere. A wide spectrum of radio waves, which radars employ, and some infrared radiation can also pass through to the Earth's surface.

There are many reasons why other wavelengths/frequencies of electromagnetic radiation don't make it through Earth's atmosphere. For example, many people know that ozone in Earth's upper atmosphere helps protect us from ultraviolet radiation. This occurs because the structure of the ozone molecule is particularly sensitive to ultraviolet frequencies; it has a natural resonance near those frequencies. Think of a person on a swing: he swings back and forth with a particular frequency. If you push him with the same frequency (every time he comes back) or at a related periodic frequency (say, every other time he comes back), he will keep going higher and higher. Incoming ultraviolet radiation likewise keeps "pushing" the ozone molecule at the structure's resonant frequency, in a sense like pushing the swinging person higher. Soon the ozone molecule breaks into an oxygen atom and an O2 molecule, which go on to other adventures. The incoming ultraviolet radiation's energy was used to break apart the ozone molecule, a process by which we say the radiation was "absorbed."

Many molecules in Earth's atmosphere have various kinds of resonances which absorb other frequencies of electromagnetic radiation. The visible spectrum, a wide spectrum of radio frequencies, and some infrared frequencies don't match well with those resonances, however, and thus are not much affected by absorption. These frequencies of electromagnetic radiation are therefore most commonly used for remote sensing purposes.

As a side note, some radar wavelengths reflect off the ionosphere and therefore cannot be used for remote sensing purposes. You may know a ham radio operator who utilizes this phenomenon to talk to a companion halfway around the world. In this case radio waves are transmitted up to the ionosphere and reflected back down to Earth elsewhere, rather than passing through the ionosphere. This happens because the radio waves cause electrons in the ionosphere to oscillate, and the oscillating electrons in turn radiate electromagnetic waves.
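
To make the side note concrete: the ionosphere reflects radio waves below its plasma frequency, which rises with electron density. A minimal sketch, assuming a typical daytime electron density (not a figure from this handbook):

    import math

    # The ionosphere reflects radio below its plasma frequency,
    # f_p [Hz] ~= 8.98 * sqrt(n_e), with n_e in electrons per m^3.
    # n_e = 1e12 is an assumed typical daytime value.
    n_e = 1e12
    f_plasma = 8.98 * math.sqrt(n_e)

    print(f"Plasma frequency: {f_plasma / 1e6:.1f} MHz")   # ~9 MHz
    # Ham HF bands (3-30 MHz) sit near/below this and can be reflected;
    # C-band SAR (~5.3 GHz) is far above it and passes straight through.
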
Question 4.3 : How does radar "see" at night?

SAR instruments transmit radar signals and then measure how strongly those signals are scattered back. An analogy with photography can be made: when it's dark, a camera's flash sends out light and then the film records objects that the flash illuminates. In both cases the SAR and the camera are not dependent upon the sun because they provide their own illumination.

Question 4.4 : How does radar "see" through clouds?

Light does often make it through clouds, but by then it has been scattered in every direction, making it nearly impossible to tell how the light was oriented before it entered the cloud. This is why we can't see objects through clouds. The difference with radar is that radar signals are far less distorted while passing through a cloud.

The reason why clouds scatter visible light while leaving radar undistorted is a matter of relative scale. Radar's longer wavelengths in effect average the properties of air with the properties and shapes of many individual water droplets, making the cloud look homogeneous - i.e. like moist air. Visible light has short enough wavelengths to respond to all the individual boundaries between air and water droplets. At each boundary the light is reflected to a new direction, and by the time it escapes the cloud, information on the light's original direction is hopelessly lost. The radar signals, on the other hand, are only affected while entering and exiting the cloud. Because they don't suffer multiple bounces, the radar waves are relatively undistorted by clouds.
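
The scale argument can be made quantitative with the scattering size parameter x = 2*pi*r / wavelength: droplets with x much less than 1 barely scatter, while x near or above 1 scatters strongly. A minimal sketch, assuming a typical 10-micron droplet radius (not a value from this handbook):

    import math

    # Scattering size parameter x = 2*pi*r / wavelength.
    # x >= ~1 -> strong scattering; x << 1 -> droplet is nearly invisible.
    # The 10-micron droplet radius is an assumed typical cloud value.
    droplet_radius = 10e-6   # [m]

    for name, wavelength in (("visible light", 0.5e-6),    # green light
                             ("C-band radar", 5.6e-2)):    # ~5.3 GHz (ASAR)
        x = 2 * math.pi * droplet_radius / wavelength
        print(f"{name}: x = {x:.1e}")
    # visible light: x ~ 1e+2 -> every droplet boundary scatters strongly
    # C-band radar:  x ~ 1e-3 -> the cloud looks like homogeneous moist air
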

Question 4.5 : How is radar data different from what I would see? Why isn't there any colour?

(See the previous question for an explanation of why you don't see clouds in a typical radar image.)

Our eyes perceive what is called visible electromagnetic radiation, or electromagnetic radiation with wavelengths between 0.4 and 0.7 microns. Even though we can't see other wavelengths of electromagnetic radiation, they certainly affect us. Ultraviolet radiation, for example, can burn our skin or hurt our eyes, and X-rays can tell us if we have broken a bone or developed cavities in our teeth!

When we think about all the information light provides us about our world, and about how other electromagnetic waves impact our lives, it seems only natural that people would want to detect and "visualize" many other kinds of electromagnetic radiation. The Earth Observing System does just that; many satellites' instruments "see" certain electromagnetic waves and relay that data to a "brain" (computer), where the information is then converted into an image for humans to interpret. Each wavelength indicates something different about the imaged object, just as you might associate the wavelength corresponding to bright green light with young plants.

Visible light contains a range of wavelengths, but with radar we often measure one very specific wavelength. Just think of how differently things would look if you could only see yellow. Your eyes would only detect how brightly an object scattered yellow light, so the reflection's intensity, not its colour, is what would give you new and useful information. Similarly, radar antennas are often made to detect how brightly objects reflect one particular wavelength. Since there are no other "colours" (wavelengths) to mix in, we really only care about the backscatter's intensity and therefore often use greyscale in our visualizations of this data.
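
A minimal sketch of that single-wavelength idea, mapping backscatter intensities onto 8-bit grey levels; the random data and the log stretch are illustrative only, not the ASAR product definition:

    import numpy as np

    # Map single-wavelength backscatter intensities onto 8-bit grey levels.
    # Random data and log stretch are illustrative, not the ASAR product spec.
    rng = np.random.default_rng(0)
    intensity = rng.exponential(scale=1.0, size=(4, 4))  # fake backscatter power

    db = 10 * np.log10(intensity)   # power in decibels
    grey = np.uint8(255 * (db - db.min()) / (db.max() - db.min()))

    print(grey)   # one grey level per pixel: brightness only, no colour
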

Question 4.6 : What's the smallest object you can see in a SAR image?

In ASF's full-resolution SAR images, you can distinguish objects as small as about 30 metres wide. Some of the smaller items that we've spotted have been ships and their wakes. When the SAR happens to be aligned at a certain angle, long thin objects such as roads or even the Alaskan oil pipeline can also be seen.

Question 4.7 : What's the difference between resolution and pixel spacing?

Pixel spacing represents how much area each pixel covers, while resolution indicates the smallest object you could pick out in an image. Each pixel represents one solid colour, so of course you can't see anything within it. When you place other pixels around it, though, you might notice a few pixels are rather different in colour than surrounding pixels and conclude that you have identified a distinct object. ASF's full-resolution ERS-1 SAR images have 12.5 m pixel spacing and about 30 m resolution. This means that each pixel represents a 12.5 x 12.5 m area on the ground, and you can discern individual objects which are around 30 m wide or larger.
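
A minimal sketch of the arithmetic, using the ERS-1 figures just quoted:

    # Pixel spacing vs resolution, using ASF's full-resolution ERS-1 figures.
    pixel_spacing = 12.5   # metres of ground per pixel edge
    resolution = 30.0      # smallest distinguishable object [m]

    per_side = resolution / pixel_spacing
    print(f"A resolution cell spans ~{per_side:.1f} pixels per side,")
    print(f"i.e. ~{per_side**2:.1f} pixels per resolution cell.")
    # Pixel spacing finer than the resolution does NOT mean 12.5 m objects
    # are visible; the pixels merely oversample each resolution cell.
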

Question 4.8 : How is this SAR data used?

SAR's ability to pass relatively unaffected through clouds, illuminate the Earth's surface with its own signals, and precisely measure distances makes it especially useful for the following applications:

  • Sea ice monitoring
  • Cartography
  • Surface deformation detection
  • Glacier monitoring
  • Crop production forecasting
  • Forest cover mapping
  • Ocean wave spectra
  • Urban planning
  • Coastal surveillance (erosion)
  • Monitoring disasters such as forest fires, floods, volcanic eruptions, and oil spills
Some of the larger current research projects include: mapping the Antarctic continent; mapping the Amazon rainforest; using interferometric analysis for predicting or analyzing earthquakes and volcanic activity; and generating "Arctic Snapshots" of the Arctic ice extent.
Question 4.9 : What's the difference between slant range and ground range?

The time it takes for a transmitted signal to travel to an object and back tells you how far away the object is. If you transmit a signal and receive two separate "echoes," you can use the time difference between the first and second responses to determine the distance between the two sensed objects, as seen from where you stand. In this way the spaceborne SAR measures both how far objects are from the spacecraft and the distance between objects, along the direction the spacecraft is looking. These distances are said to be recorded in slant range, since they are measured in a direction which is at an angle/slant to the ground.

Often researchers don't really care about distances from the spacecraft; they want to know about distances on the ground. Perhaps they need "real" (ground) distances to determine how much land was used for farming or what percentage of the sea was covered with ice. The spacecraft, however, samples the returning radar signals at specific time intervals, which correspond to discrete distances from the spacecraft, so the data are originally in slant range. Given various parameters, the data can be processed such that each data value covers the same ground distance. We then say that the data are in ground range.
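
A minimal flat-Earth sketch of the two quantities; the echo delay and the 23-degree incidence angle are assumed illustrative values (real processors use the actual orbit and Earth-model geometry):

    import math

    # Slant range from echo timing, and the flat-Earth slant-to-ground factor.
    # The echo delay and 23-degree incidence angle are illustrative values.
    c = 3.0e8              # speed of light [m/s]
    echo_delay = 5.6e-3    # round-trip travel time of one echo [s]

    slant_range = c * echo_delay / 2                 # one-way distance [m]
    incidence = math.radians(23.0)
    metres_on_ground = 1.0 / math.sin(incidence)     # per metre of slant range

    print(f"Slant range: {slant_range / 1000:.0f} km")          # ~840 km
    print(f"1 m in slant range ~ {metres_on_ground:.2f} m on the ground")
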

Figure 4.2 Slant Range vs Ground Range

Question 4.10 : What does geocoded mean?

A standard ASF SAR image has gone through a lot of processing to look "normal." One step in this process involves manipulating the data such that each pixel represents a specific distance on the ground. The latitude and longitude coordinates of each image's centre and corner pixels are also known. Sometimes, however, it's convenient to map the data onto a standard grid, such as a Mercator projection. Then the pixels in each row would be evenly spaced in terms of longitude, and the entire row would be located at a specific latitude. The data would then be termed geocoded. It is often much easier to compare/overlay geocoded SAR data with non-SAR data sets.
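
As a minimal sketch of that idea, the spherical-Mercator formulas map a pixel's latitude/longitude onto a regular grid; this is a simplified illustration (real geocoding uses full map-projection libraries and Earth datums), and the coordinates in the example are hypothetical:

    import math

    # Spherical Mercator: place a pixel's latitude/longitude on a regular
    # projected grid. Simplified illustration; real geocoding uses full
    # map-projection libraries and Earth datums.
    R = 6371000.0   # mean Earth radius [m]

    def mercator(lat_deg, lon_deg):
        x = R * math.radians(lon_deg)                 # even spacing in longitude
        phi = math.radians(lat_deg)
        y = R * math.log(math.tan(math.pi / 4 + phi / 2))
        return x, y

    # Hypothetical scene-centre coordinate:
    x, y = mercator(64.0, -147.0)
    print(f"x = {x / 1000:.1f} km, y = {y / 1000:.1f} km on the Mercator grid")
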

Question 4.11 : What does terrain correction mean?

If you have information about a region's topography, such as a digital elevation model (DEM), you can make the slant-to-ground-range conversion more sophisticated. In effect this terrain correction can compensate for foreshortening by spreading the data representing a mountain's facing slope into more pixels and compacting returns from the back slope into fewer pixels. It's nearly impossible, though, to reliably extract the separate returns from data values representing the facing slope. Sometimes people try to compensate for shadowing as well: knowing the terrain's slopes, they can approximate how the strength of the backscattered signals was affected by the changed incidence angle and adjust the results accordingly. These procedures, though inexact, can greatly improve SAR image analysis.
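
A minimal sketch of the foreshortening geometry; the incidence angle and slopes are assumed illustrative values, and real terrain correction works pixel by pixel from a DEM and the orbit geometry:

    import math

    # Foreshortening: a slope of angle `alpha` facing the radar is compressed
    # in slant range by sin(theta - alpha), versus sin(theta) for flat ground.
    # Illustrative values only; real terrain correction uses a DEM and the
    # actual orbit geometry.
    theta = math.radians(23.0)   # assumed incidence angle

    for slope_deg in (0, 10, 20):
        alpha = math.radians(slope_deg)
        factor = math.sin(theta - alpha) / math.sin(theta)
        print(f"Slope {slope_deg:2d} deg facing radar: "
              f"slant extent = {factor:.2f} x its flat-ground extent")
    # As the slope approaches the incidence angle the factor tends to 0:
    # the whole facing slope collapses into a few bright pixels.
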

Question 4.12 : What is a look (e.g. 4-look data)? What is speckle?

As the spacecraft moves along in its orbit, the radar antenna transmits pulses very rapidly. It can therefore obtain many backscattered radar responses from a particular object while passing overhead. In fact the ERS-1 SAR records about 1,000 responses for a single object. The SAR processor could use all of these responses to obtain the object's radar cross-section (i.e. how brightly the object backscattered the incoming radar), but the result often contains quite a bit of speckle.

Speckle, generally considered to be noise, is due in part to the SAR's fine resolution and its signals' coherency. Speckle can be caused by an object that behaves as a very strong reflector at a particular alignment between itself and the spacecraft, or by a coherent sum of all the various responses within a grid cell which happen to randomly sum (as vectors with magnitude and phase) to a large resultant magnitude at a given phase.

To reduce speckle, the data are sometimes processed in sections which are later combined. With ERS-1's 1,000 samples per object, we might wish to use an object's first 250 responses to determine its radar cross-section. If we then processed the next 250 responses to get another estimate, and so on, we would end up with four estimates of the object's radar cross-section. Combining these four estimates, or looks, together would reduce the amount of speckle.

When an image has been processed as "4-looks": the first 250 (or so) samples of each viewed object were processed to make one image; the next 250 samples for each object were processed to make a second image; the third and fourth images were created with the next chunks of data; and the four images (looks) were combined to create the final result.

The more looks that are used to process an image, the less speckle there is. (The Complex-Format SAR Data Example [given below] demonstrates this.) Keep in mind, however, that some useful information is also lost in this process, and that resolution is reduced. Several research groups are developing and improving algorithms that reduce speckle while preserving as much accurate information as possible.
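
A minimal numerical sketch of why combining looks reduces speckle, using the standard exponential model for fully developed speckle (an illustration, not the ASAR processor's algorithm):

    import numpy as np

    # Speckle reduction by multi-looking: average N independent intensity
    # "looks" and the relative fluctuation drops by about 1/sqrt(N).
    # Uses the standard exponential model for fully developed speckle.
    rng = np.random.default_rng(0)
    looks = rng.exponential(scale=1.0, size=(4, 100_000))  # 4 single looks

    four_look = looks.mean(axis=0)   # combine the four looks

    for name, data in (("1-look", looks[0]), ("4-look", four_look)):
        print(f"{name}: relative std = {data.std() / data.mean():.2f}")
    # 1-look: ~1.00   4-look: ~0.50  (= 1/sqrt(4))
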

Figure 4.3 Coherent Summing of Radar Backscatter

Question 4.13 : What do you mean by "Complex SAR Data"?

Used here, the term "complex" refers to complex numbers, or complex-format data. You might be used to hearing of complex numbers with their "real" and "imaginary" components, also known as cosine and sine components. For example, a wave might be described in complex format by A*(cos(wt) + i*sin(wt)), where 'w' represents the wave's angular frequency and 'A' its amplitude (see Figure 4.4 below). The cosine value describes the wave's real component, the sine its imaginary component, and the two combine as vectors to give the wave's overall phase (the inverse tangent of sine/cosine) and amplitude.

Figure 4.4 Representing Waves in Complex Format

Both the cosine and sine components of backscattered SAR signals are measured and digitized on-board the satellite.

Figure 4.5 Obtaining the Input Signal's In-Phase and Quadrature Components

The two resulting data streams are then transmitted to a ground station for further processing. People sometimes call these the 'I' (In-Phase, i.e. the cosine or real component) and 'Q' (Quadrature, the 90-degree-shifted, sine or imaginary component) data streams. For standard processing these two data values are combined to obtain the composite signal intensity (sqrt[I^2 + Q^2]). Sometimes, though, it is desirable to process the two data streams separately, usually to preserve the signals' phase information. ASF then distributes the individually processed I and Q data values for each pixel location, calling this product "Complex SAR Data." A more detailed example/tutorial of complex SAR data is also available.
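
A minimal sketch of how the I and Q streams combine; the sample values are made up for illustration, not real ASAR measurements:

    import numpy as np

    # Combine I (in-phase) and Q (quadrature) samples into amplitude and
    # phase: the composite intensity used in standard processing, and the
    # phase that "Complex SAR Data" preserves. Sample values are made up.
    I = np.array([0.8, -0.3, 0.1])    # cosine / real component
    Q = np.array([0.6, 0.4, -0.9])    # sine / imaginary component

    signal = I + 1j * Q
    amplitude = np.abs(signal)        # sqrt(I^2 + Q^2), as in the text
    phase = np.angle(signal)          # arctan2(Q, I)

    print("amplitude:", amplitude)
    print("phase [rad]:", phase)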