[evlatests] A strange problem at 12.5 MHz BW -- in continuum

Rick Perley rperley at nrao.edu
Sun Jul 27 19:03:11 EDT 2008


    In an effort to better understand sensitivity issues, I've taken 
data at X-band, in continuum, of a known calibrator and an adjoining 
blank field, at each BW from 50 MHz through 0.78 MHz.  All of the 
expected effects are seen -- I'll show these at the Monday test meeting.
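
    For reference, here is a minimal sketch of the expected scaling, 
assuming only the usual radiometer behaviour (noise going as 
1/sqrt(bandwidth)) and the standard halving ladder of continuum 
bandwidths -- the numbers below are illustrative, not taken from this 
data set:

import numpy as np

# Assumed standard continuum bandwidth ladder: 50 MHz halving down to
# 0.78125 MHz (seven settings).
bw_mhz = 50.0 / 2.0 ** np.arange(7)

# Radiometer expectation: image noise scales as 1/sqrt(bandwidth),
# quoted relative to the 50 MHz setting.
rel_noise = np.sqrt(bw_mhz[0] / bw_mhz)

for bw, r in zip(bw_mhz, rel_noise):
    print(f"{bw:8.3f} MHz   expected noise = {r:5.2f} x (50 MHz value)")

Each halving of the bandwidth should cost about a factor of sqrt(2) 
(~1.41) in noise, and nothing in this picture singles out 12.5 MHz.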

    But one thing quite unexpected was also seen -- a notable 
degradation in sensitivity, affecting BW = 12.5 MHz only. 

    The loss of sensitivity -- which is seen *both* in the noise 
histograms *and* in the correlation coefficients -- affects all 
correlations equally, hence both IFs (which were tuned to different 
frequencies).  And it affects EVLA and VLA antennas equally. 
    More curiously, the loss in sensitivity is greatest for antennas 
at the end of the east arm (about 20%), and least for antennas at the 
ends of the other arms (less than 5%).  The gains determined for the 
antennas at this bandwidth are also variable.  (All other gains are 
rock-solid.) 

    This smells like external RFI originating near the end of the east 
arm -- except that this doesn't explain why both IFs are equally 
affected, or why no other bandwidth is affected. 
    So is it possible that this is some internally-generated signal, 
which shows up with the spatial pattern noted above because of 
differential fringe rates?  The 'u' baseline coordinate was near zero 
for baselines involving east-arm antennas. 
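
    To put a rough number on that hunch: after fringe stopping, an 
internally-generated tone is rotated at the fringe-stopping rate, 
roughly nu_f = omega_E * u * cos(dec) with u in wavelengths, and 
averaging over a time tau attenuates it by about |sinc(nu_f * tau)|. 
A hedged sketch of that attenuation -- the frequency, declination, 
averaging time, and u values below are illustrative assumptions, not 
the actual setup of this run:

import numpy as np

OMEGA_E = 7.2921e-5        # Earth rotation rate, rad/s
freq_hz = 8.4e9            # assumed X-band sky frequency
dec_deg = 30.0             # assumed source declination
tau_s = 10.0               # assumed averaging time

lam = 3.0e8 / freq_hz      # observing wavelength, m

# Illustrative projected baseline u-lengths in metres; 0 m mimics an
# east-arm-like baseline with u ~ 0.
u_m = np.array([0.0, 100.0, 1000.0, 10000.0])
u_wl = u_m / lam

# Fringe-stopping rate (Hz) and the resulting attenuation of a
# non-celestial signal averaged over tau_s seconds.
nu_f = OMEGA_E * u_wl * np.cos(np.radians(dec_deg))
attenuation = np.abs(np.sinc(nu_f * tau_s))   # np.sinc(x) = sin(pi*x)/(pi*x)

for um, nf, a in zip(u_m, nu_f, attenuation):
    print(f"u = {um:8.0f} m   fringe rate = {nf:8.4f} Hz   "
          f"surviving fraction = {a:5.3f}")

A baseline with u near zero keeps such a signal at essentially full 
strength, while a large-u baseline washes it out almost completely -- 
at least consistent with the east-arm antennas being hit hardest.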




