Calibrating an imaging riometer: determination of the true beam azimuth angle position

An imaging riometer uses the absorption of cosmic radio noise as a means of measuring electron density enhancements in the D region of the ionosphere. It employs a ground-based phased antenna array to produce narrow beams which typically observe a region 200 by 200 km at 90 km altitude. Imaging riometers have traditionally been used for studying energetic electron precipitation into the D region and have recently also been used to characterize mesospheric gravity waves. Determining the location and shape of the riometer beam pattern projected onto the ionosphere is therefore important: it allows the spatial distribution of space weather effects to be accurately visualized, and it allows mesospheric gravity wave characteristics to be accurately quantified. Currently, the beam patterns of imaging riometers are determined theoretically, and attempts have been made to validate them by comparison with cosmic radio maps of low spatial resolution; alternatively, they are determined by calibration overflights using aircraft, a complex and expensive procedure. Here we demonstrate a novel wavelet-based approach to accurately determine the azimuth angle of each imaging riometer beam from the riometer data itself, using quiet-day observations alone, and by way of example show how the actual beam azimuth angles (pointing directions) of the Halley (76°S, 27°W) 49-beam imaging riometer differ from those expected from theoretical calculations.
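The idea behind such a quiet-day calibration can be illustrated with a sketch: on a quiet day a strong cosmic radio source drifts through a beam at a sidereal time set by the beam's pointing direction, so localizing the transit in the beam's power record constrains the azimuth. Below is a minimal, hypothetical example (synthetic data, not the authors' actual algorithm) that uses a Ricker (Mexican-hat) wavelet filter to localize a transit peak in a noisy quiet-day curve; all numbers (cadence, transit time, beam width) are illustrative assumptions.

```python
import numpy as np

def ricker(points, a):
    # Ricker (Mexican-hat) wavelet: a simple band-pass kernel
    # often used to localize isolated peaks in noisy series.
    t = np.arange(points) - (points - 1) / 2.0
    return (1.0 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

# Synthetic quiet-day power curve for one beam: a radio source
# transits the beam at sidereal time t0 (hypothetical values).
t = np.linspace(0.0, 24.0, 1440)   # sidereal hours, 1-min cadence
t0 = 13.7                          # "true" transit time (hours)
rng = np.random.default_rng(0)
power = (np.exp(-0.5 * ((t - t0) / 0.4) ** 2)
         + 0.05 * rng.standard_normal(t.size))

# Convolve with a wavelet whose width matches the expected
# transit duration, then take the response maximum as the
# estimated transit time.
w = ricker(301, 24.0)              # ~24 samples ~= 0.4 h here
response = np.convolve(power, w, mode="same")
t_est = t[np.argmax(response)]

# The offset between the estimated transit time and the time
# predicted from the nominal beam azimuth would then be mapped
# to an azimuth correction for that beam.
print(round(t_est, 2))
```

This is only a schematic of the transit-localization step; the published method additionally handles the mapping from timing offsets to azimuth angles across all 49 beams.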


Authors: Moffat-Griffin, Tracy; Hibbins, Robert E.; Jarvis, Martin

Published: 1 January 2010
Journal: Radio Science, 45