[mmaimcal] pointing error effect on interferometric phase

Bryan Butler bbutler at aoc.nrao.edu
Fri Jan 29 18:52:19 EST 1999




folks,

i recently ran across the following article:

Gorham, P.W., and D.J. Rochblatt, Effect of Antenna-Pointing Errors on 
   Phase Stability and Interferometric Delay, JPL TDA Progress Report 
   42-132, Feb. 15, 1998.

and thought it might be interesting to scale their numbers to our
12-m apertures and mm wavelengths.  basically, this article does a
full treatment of the aperture phase across the DSN antennas for 
given pointing errors, then simulates interferometric phase error.
of course, it is only exact for the DSN antennas, but the rough
scaling should be right, i suspect, for any well-constructed antenna
and feed assembly.

figure 17 of that article shows the delay error as a function of 
pointing error at X-band, for 34-m and 70-m antennas.  at very 
small pointing errors, it looks like the delay error scales roughly
as diameter, and theoretically it should also scale roughly linearly
with frequency (their eqn. 12 has it going as nu^[alpha-1], and their
figure 15 measurements put alpha slightly > 2, so in fact slightly
steeper than linear).
so, fig. 17 gives the following rough delay errors for a pointing
error of 0.7 arcsec (0.2 millidegrees - the odd unit they choose
to use):

  70-m => 0.07 picosec
  34-m => 0.035 picosec
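
in case it is useful, here is a quick python sketch of that scaling;
the function and the choice of alpha = 2 (i.e. exactly linear in
frequency) are my own shorthand for their eqn. 12, not anything from
the article, so take it as a rough check only:

  # rough scaling of the fig. 17 delay errors to other diameters and
  # wavelengths; assumes delay error linear in diameter and going as
  # nu^(alpha-1), i.e. roughly as D / lambda for alpha ~ 2
  def scale_delay_error(dtau_ref_ps, d_ref_m, lam_ref_mm,
                        d_m, lam_mm, alpha=2.0):
      nu_ratio = lam_ref_mm / lam_mm           # nu / nu_ref
      return dtau_ref_ps * (d_m / d_ref_m) * nu_ratio ** (alpha - 1.0)

  # sanity check against fig. 17: scaling the 70-m value to 34 m at
  # the same wavelength should recover ~0.035 picosec
  print(scale_delay_error(0.07, 70.0, 35.0, 34.0, 35.0))   # ~0.034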

so, scaling the 70-m number by diameter and by frequency (the X-band
wavelength is about 35mm, so the frequency ratio to 1mm is ~35), the
delay error at 1mm on a 12-m aperture should be roughly:

  dtau = 0.07 * (12m / 70m) * (35mm / 1mm) ~ 0.4 picosec

multiplying by c, this converts to a path error of about 0.12 mm
(roughly 1/8 mm), which seems pretty large to me (at 1mm that is
45 deg of phase...)!
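
for what it is worth, the same arithmetic in python (the c value is
just the speed of light in mm per picosecond; nothing here beyond
c*tau and 360 * path / lambda):

  # convert the scaled delay error to a path error and then to phase
  # at the observing wavelength of 1mm
  c_mm_per_ps = 0.2998                             # speed of light
  dtau_ps = 0.07 * (12.0 / 70.0) * (35.0 / 1.0)    # ~0.42 picosec
  dpath_mm = dtau_ps * c_mm_per_ps                 # ~0.13 mm of path
  dphase_deg = 360.0 * dpath_mm / 1.0              # ~45 deg at 1mm
  print(dtau_ps, dpath_mm, dphase_deg)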

can somebody tell me if this is totally whacked, or what?

	-bryan




