SMOS Local Oscillators impact on Sea Surface Salinity quality


The local oscillators (LOs) of the Soil Moisture and Ocean Salinity (SMOS) mission payload shift the operating frequency of its 72 receivers down to an intermediate frequency suitable for signal processing. LO temperature variations produce phase errors in the visibilities, which blur the reconstructed brightness temperature (Tb) image.

At the end of the commissioning phase, it was decided to calibrate the LOs every 10 min while awaiting a more in-depth analysis. During short periods, the LO calibration has been performed every 2 min to assess the impact of a higher calibration rate on the quality of the data.

By means of a decimation experiment, the relative errors of the 6- and 10-min calibration-interval data sets are estimated using the 2-min data set as a reference. A noticeable systematic across- and along-track pattern of amplitude ±0.3 K is observed in the Tb differences between the 10- and 2-min data sets, whereas it is clearly reduced in the differences between the 6- and 2-min data sets.
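The idea behind such a decimation experiment can be illustrated with a toy model (a sketch with made-up drift amplitudes and time scales, not actual MIRAS parameters): a slowly drifting LO phase is "calibrated" at different intervals by sampling it at calibration epochs and interpolating in between, and the residuals of the 6- and 10-min schemes are compared against the 2-min reference.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic LO phase drift, one sample per snapshot. The 1.2 s spacing and
# the random-walk amplitude are illustrative, not mission parameters.
t = np.arange(0.0, 3600.0, 1.2)                    # one hour of snapshots [s]
phase = np.cumsum(rng.normal(0.0, 2e-4, t.size))   # slow random-walk drift [rad]

def calibrated_phase(t, phase, interval_s):
    """Emulate LO calibration every `interval_s` seconds: the phase is
    measured at the calibration epochs and linearly interpolated between them."""
    cal_t = np.arange(t[0], t[-1] + interval_s, interval_s)
    cal_phase = np.interp(cal_t, t, phase)
    return np.interp(t, cal_t, cal_phase)

# Residual phase error left by each calibration rate
res = {m: phase - calibrated_phase(t, phase, 60.0 * m) for m in (2, 6, 10)}

# Relative error of the 6- and 10-min schemes, taking 2 min as the reference,
# mimicking the decimation experiment performed on the real data
for m in (6, 10):
    rel = res[m] - res[2]
    print(f"{m:2d} min vs 2 min: RMS residual phase = {rel.std():.2e} rad")
```

In this toy model, as in the real data, the slower the calibration rate, the larger the uncorrected phase drift between calibration events.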


Figure 1. Differences between Tb calibrated at (left) 10 and 2 min and (right) 6 and 2 min, in (top) H-pol and (bottom) V-pol, for incidence angles between 38° and 40° and the EAF-FOV.

Scatter plots of the SMOS Tb differences between 10 and 2 min and between 6 and 2 min, at H-pol and V-pol, are shown in Figure 2 as a function of latitude (which is proportional to time) for the portion of the orbit shown in Figure 1.


Figure 2. Scatter plots of Tb differences between (left) 10 and 2 min and (right) 6 and 2 min, for (top) H-pol and (bottom) V-pol. Only incidence angles between 38° and 40° are considered. The points of lower dispersion correspond to LO calibration events.

A simulation experiment confirms that such systematic patterns are caused by the visibility phase errors induced by the LO calibration rate. The analysis was performed with the MIRAS Testing Software (MTS), an independent package that fully processes SMOS raw data up to brightness temperature images. Thanks to the decimation scheme, the thermal noise (i.e., the radiometric sensitivity) in the measurements is correlated across the three cases (2, 6, and 10 min): the measurements are exactly the same, and only the LO phase calibration processing differs.
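A minimal sketch can show why the decimation scheme makes the thermal noise cancel in the differences (all numbers here are synthetic and illustrative, not MTS output): the very same noisy snapshots are processed three times, applying a different hypothetical residual LO phase for each calibration rate, so the noise is identical in all three cases and drops out of the Tb differences, leaving only the phase-error signature.

```python
import numpy as np

rng = np.random.default_rng(1)

# Identical noisy "visibility" snapshots for all three calibration rates
# (toy model: 500 snapshots of 128 complex samples, arbitrary noise level)
n_snap, n_vis = 500, 128
measured = (np.ones((n_snap, n_vis), dtype=complex)
            + rng.normal(0, 0.05, (n_snap, n_vis))
            + 1j * rng.normal(0, 0.05, (n_snap, n_vis)))

def apply_cal(measured, residual_phase):
    """Apply the residual LO phase error left after calibration at a given rate."""
    return measured * np.exp(1j * residual_phase)

# Hypothetical residual phase errors [rad], larger for the slower rates
res_phase = {2: 0.002, 6: 0.006, 10: 0.012}
tb = {m: apply_cal(measured, p).real.mean(axis=0) for m, p in res_phase.items()}

# The noise is common to all cases, so it cancels in the differences:
# what remains reflects only the different residual phase errors.
diff_10 = tb[10] - tb[2]
diff_6 = tb[6] - tb[2]
print(f"mean |Tb(10) - Tb(2)| = {np.abs(diff_10).mean():.2e}")
print(f"mean |Tb(6)  - Tb(2)| = {np.abs(diff_6).mean():.2e}")
```

Because the noise realization is shared, the 10-minus-2 differences are dominated by the larger residual phase, while the noise scatter that dominates any single Tb map is strongly suppressed in the differences.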


Figure 3. Differences of Tb between (left) 10 and 2 min and (right) 6 and 2 min for an incidence angle range of 0°–40° and the AF-FOV. Both thermal noise and phase errors have been taken into account in the simulation. Spatial structures across and along track can be appreciated, as in the real data.

Moreover, the impact of the LO phase calibration rate on the brightness temperatures has been analyzed in the antenna frame using real data (a half orbit over the Pacific), as shown in Figure 4. The figure clearly shows that LO phase errors at visibility level translate into systematic brightness temperature errors, appearing as spatial structures in the antenna frame.


Figure 4. Differences of Tb between (left) 10 and 2 min and (right) 6 and 2 min using real data, averaged over 125 snapshots over the Pacific. Spatial structures in the brightness temperatures in the antenna frame are clearly visible, due to the imperfect cancellation of LO phase errors.

In summary, the visibility phase errors induce noticeable systematic patterns in the spatial Tb distribution, which propagate into the sea surface salinity (SSS) retrievals. Overall, the SSS error increase (relative to the 2-min SSS data) is about 0.39 psu for the 10-min data set and 0.14 psu for the 6-min one.

Work published in: Gabarró, C., González-Gambau, V., Corbella, I., Torres, F., Martínez, J., Portabella, M., and Font, J., "Impact of the Local Oscillator Calibration Rate on the SMOS Measurements and Retrieved Salinities", IEEE Transactions on Geoscience and Remote Sensing, doi: 10.1109/TGRS.2012.2233744, in press.