Characterizing the quality of river water level time series derived from satellite radar altimetry: efforts towards a standardized methodology

Nicolas Bercher(1), Pascal Kosuth(1), and Jérôme Bruniquel(2)

(1) UMR TETIS (Cemagref-CIRAD-ENGREF), Maison de la Télédétection, 500 rue JF Breton, 34093 Montpellier Cedex 5, France
(2) Alcatel Alenia Space, 26, avenue Jean-François Champollion, BP 1187, 31037 Toulouse Cedex, France

Abstract

Numerous studies over the last fifteen years have demonstrated the potential contribution of satellite radar altimetry to the monitoring of water levels of inland water bodies (inner seas, lakes, floodplains and large rivers). A significant number of satellites currently provide radar altimetry data (Topex/Poseidon, ERS, Envisat, Jason) and could ensure the continuity of operational monitoring of continental water levels. Recently, various groups have devoted considerable effort to building databases of river and lake water levels derived from satellite radar altimeters (the "Global Reservoir and Lake Monitor" project, the "River and Lake" project, the "CASH" project).

However, hydrologists still generally do not use these data for operational applications such as water resource quantification, flood forecasting or water resource management. Among the reasons for this reluctance is the lack of a standardized method to characterize the quality of these data (Birkett 1998). The presentation will focus on this subject and develop two complementary topics: (1) the quantification of the accuracy of individual satellite measurements; (2) the characterization of the quality of water level time series reconstructed from sampled satellite measurements.

Quantification of the accuracy of satellite measurements

To quantify the accuracy of satellite measurements, for a given satellite and waveform processing algorithm, a few distinct parameters must be taken into account:

* the density and dispersion (uncertainty) of satellite measurements at a given location and time (during one satellite overflight), which depend on the one hand on the morphology and environment of the water body and on the other hand on the size of the targeted measurement window;

* the error distribution between satellite measurements and in situ measurements at a given location of a water body. The quantification of this error is generally biased by two factors: (1) satellite ground tracks rarely coincide with hydrometric gauging stations, so in situ values must be estimated from information at distant gauging stations; (2) satellite measurements are expressed in a geodetic reference frame (ellipsoid) while in situ measurements are generally expressed in an orthometric reference frame (geoid), which requires the use of a geoid model to compare the two values, introducing an additional source of error.
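The ellipsoid/geoid issue mentioned above reduces to a simple height conversion once a geoid undulation N is available: the orthometric height is H = h - N, where h is the ellipsoidal height. A minimal sketch, with invented placeholder values (a real comparison would query a geoid model such as EGM96 at the ground-track crossing):

```python
# Sketch of the ellipsoidal-to-orthometric conversion needed before
# comparing altimetric and in situ water levels. The numeric values
# below are illustrative placeholders, not outputs of a real geoid model.

def to_orthometric(h_ellipsoidal: float, geoid_undulation: float) -> float:
    """Orthometric height H = ellipsoidal height h - geoid undulation N."""
    return h_ellipsoidal - geoid_undulation

# Illustrative values for one virtual station (metres).
h_sat = 15.72    # altimetric water level above the ellipsoid
N = -12.40       # geoid undulation N at the crossing (placeholder)
h_gauge = 28.05  # in situ water level, orthometric datum

error = to_orthometric(h_sat, N) - h_gauge
print(f"satellite - in situ error: {error:+.2f} m")
```

Any error in N propagates directly into the satellite-minus-gauge difference, which is why the abstract flags the geoid model as an additional error source.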

Measurement density, dispersion and error distribution have been quantified for 200 observation areas over a large variety of continental water bodies (Amazon, Danube, Niger, Senegal, Rhône, …). Results will be presented and discussed, with a particular focus on their correlation with parameters and characteristics such as water body size and morphology, neighbouring topography, and the size of the targeted observation window. The possibility of a priori estimation of the accuracy (without comparison with in situ measurements) will also be discussed.
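The per-overflight density and dispersion statistics described above can be sketched as follows, using robust estimators (median and median absolute deviation), since radar returns over land/water transitions are typically outlier-prone; the sample heights are invented for illustration and do not come from the study:

```python
# Sketch of per-overflight statistics: the number of valid radar returns
# over the water body (density) and their spread (dispersion), summarised
# with the median and the median absolute deviation.
import statistics

def overflight_stats(heights_m):
    """Return (density, median level, dispersion) for one satellite pass."""
    med = statistics.median(heights_m)
    mad = statistics.median(abs(h - med) for h in heights_m)
    return len(heights_m), med, mad

one_pass = [27.91, 28.02, 27.98, 28.40, 27.95, 28.01]  # metres, illustrative
n, level, disp = overflight_stats(one_pass)
print(f"{n} measurements, level {level:.2f} m, dispersion {disp:.2f} m")
```

The median/MAD pair is one reasonable choice here; a mean/standard-deviation summary would be pulled upward by the 28.40 m outlier in this example.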

Characterization of the quality of water level time series reconstructed from sampled satellite measurements

While hydrologists generally require daily sampled time series of water levels for main rivers, satellite sampling periods range from 10 days (Topex/Poseidon, Jason) to 35 days (ERS, Envisat). Reconstructing a daily time series from sampled satellite measurements, for instance through temporal interpolation, induces specific errors that must be estimated. The satellite sampling rate has an important coupling effect with the natural temporal signal: the Shannon sampling theorem states that the sampling frequency must be greater than twice the maximum frequency of the signal in order to avoid any loss of information. The spectral signature of a river water level signal (and hence its maximum frequency) can be determined using Fourier spectral analysis; it depends on parameters such as the precipitation regime and hydrology of the upstream catchment, and on local hydraulic conditions. Comparing the spectral signature of a signal with the sampling frequency enables the information loss to be determined.

Analyses of the spectral signatures of in situ water level time series will be presented for 200 hydrometric stations over a large variety of continental water bodies. The spectral signature will be parameterized and its parameters related to hydrological variables (precipitation regime, catchment size, …). The impact of satellite sampling on information loss will be presented, over-sampling techniques for daily signal reconstruction will be discussed, and the correlation between information loss and the accuracy of the reconstructed water level time series will be developed.
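The spectral comparison described above can be sketched on a synthetic signal: an annual flood wave plus a shorter 25-day component, sampled daily. The share of spectral energy at frequencies above the satellite Nyquist frequency f_N = 1/(2 x repeat period) estimates the unavoidable information loss. The signal and thresholds are illustrative assumptions, not data from the study:

```python
# Sketch: fraction of a water level signal's spectral energy that a given
# satellite repeat period cannot resolve (Shannon criterion), using a
# naive DFT on a synthetic daily stage record.
import math

DAYS = 365
signal = [10.0 + 3.0 * math.sin(2 * math.pi * t / 365)   # annual flood wave
               + 0.5 * math.sin(2 * math.pi * t / 25)    # 25-day component
          for t in range(DAYS)]

def energy_above_nyquist(x, repeat_days):
    """Fraction of non-DC spectral energy above f_N = 1 / (2 * repeat_days),
    with frequencies in cycles per day (daily samples assumed)."""
    n = len(x)
    mean = sum(x) / n
    total, lost = 0.0, 0.0
    for k in range(1, n // 2 + 1):  # positive frequencies only
        re = sum((x[t] - mean) * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum((x[t] - mean) * math.sin(2 * math.pi * k * t / n) for t in range(n))
        e = re * re + im * im
        total += e
        if k / n > 1.0 / (2 * repeat_days):
            lost += e
    return lost / total

for repeat in (10, 35):  # Topex/Poseidon-like vs ERS/Envisat-like repeats
    frac = energy_above_nyquist(signal, repeat)
    print(f"{repeat}-day repeat: {100 * frac:.1f}% of signal energy unresolvable")
```

With these assumed amplitudes, the 10-day repeat resolves both components (its Nyquist period is 20 days), while the 35-day repeat loses essentially all of the 25-day component's energy, a small but non-zero fraction of the total.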

Finally, a synthetic method quantifying both the accuracy of individual satellite measurements and the impact of the satellite sampling rate on the accuracy of reconstructed time series will be presented, providing a complete characterization of the quality of water level time series derived from satellite radar altimetry. This method could both (1) allow hydrologists to use qualified water level time series and (2) allow the quantification of improvements brought by new waveform processing algorithms.

 


 

Last modified: 07.10.03