[evlatests] Sensitivity Losses at edge of 1 GHz bandpass

Rick Perley rperley at nrao.edu
Mon Sep 20 17:07:20 EDT 2010


    The 1 GHz-wide path ('8-bit') has long been known to display a 
mysterious roll-off on one side of the bandpass.  The roll-off is 
gradual -- starting about 150 MHz from the band edge, the attenuation 
(in power units) is about 10 to 15 dB down (relative to the average of 
the middle portion of the bandpass) by the point where the anti-aliasing 
band-limiting filter sharply cuts off the spectrum.

    The recent flurry of 3- and 8-bit sensitivity testing permits a test 
of the relative noise as a function of frequency through this rolloff 
band.  Noise was measured in each channel, using the histogram plotting 
program, after standard calibration (which includes a bandpass 
calibration to remove the bandpass attenuation).  The key question is 
whether the noise is increased in the region of the anomalous attenuation.
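
    For concreteness, the per-channel noise estimate amounts to 
something like the following sketch (the array layout and names are 
illustrative assumptions; the actual numbers came from the histogram 
plotting program):

    import numpy as np

    def channel_noise(vis, flags):
        # vis:   calibrated visibilities on a blank field, complex,
        #        shape (n_samples, n_channels) -- an assumed layout.
        # flags: boolean array of the same shape; True marks bad data.
        masked = np.ma.masked_array(vis, mask=flags)
        # On a blank field the visibilities are noise-like, so the rms
        # of the real part in each channel estimates the
        # post-calibration noise in that channel.
        return masked.real.std(axis=0).filled(np.nan)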

    The short answer is 'yes'.  A tougher question is whether the 
affected region is unacceptably wide, and the noise increase 
unacceptably large.

    The observations have 8 subbands, spanning 1.024 GHz, from 4.5 to 
5.5 GHz -- a nice clean band.  I have determined the post-calibration 
noise, using a blank field, for each channel as a function of its offset 
from the band edge.  Shown are statistics from three subbands -- #1 
(which has the rolloff), #2 (which should be unaffected to any 
significant degree), and #8, which provides a calibration of the noise 
we should see due to the anti-aliasing filter alone, without the 
anomalous rolloff.  Each entry has been normalized by the best 
sensitivity seen in that subband (a slowly varying number across the 
various subbands) -- it thus represents a noise multiplier.  To the 
right of each 'Mult' column is the approximate attenuation noted for 
that channel, using the mid-band levels as the reference.

              subband 1       subband 2       subband 8
Channel
Offset      Mult   Att(dB)   Mult   Att(dB)   Mult   Att(dB)
-------------------------------------------------------------
0 (edge)    805     <-30     1.7     -6       3.8    -14
1           178     <-30     1.3     -4       3.1    -12
2            48      -30     1.2     -2       2.4    -11
3            15      -26     1.1     -1       1.9    -10
4             6.7    -22     1.0              1.7     -8
5             3.8    -19     1.0              1.4     -6
6             2.6    -16     1.0              1.2     -4
7             2.1            1.0              1.1     -2
8             1.7            1.0              1.0     -1
9             1.5    -12
10            1.4
11            1.3
12            1.2
13            1.1     -8

    From here on, the sensitivity in subband 1 slowly improves, reaching 
a multiplier of 1.0 at channel #20.
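
    In code, the two columns of the table could be produced along these 
lines (a sketch assuming the bandpass solution and per-channel rms are 
already in hand as arrays; the names are illustrative):

    import numpy as np

    def noise_multiplier(rms):
        # Normalize each channel's rms by the best (lowest) rms in the
        # subband -- the 'Mult' column above.
        return rms / np.nanmin(rms)

    def attenuation_db(bandpass_power):
        # Per-channel attenuation relative to the mid-band average --
        # the 'Att' column above, in dB (power units).
        n = len(bandpass_power)
        mid = np.mean(bandpass_power[n // 4 : 3 * n // 4])
        return 10.0 * np.log10(bandpass_power / mid)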

Some conclusions to note:

    1) For subbands 2 and 8 (defining 'normal' behavior), the noise 
increases by ~50% when the power is down by about 6 dB.  For the central 
subbands (defined by the station-board digital filters), only the edge 
channel on each side reaches this level.  For subband 8, where the 
right-hand edge is defined by the sharp analog anti-aliasing filter, the 
five channels at the edge are affected to this level.
    2) For subband 1, the behavior is quite different -- the noise is 
increased by ~50% for the nine channels at the edge, where the 
attenuation is more than 10 dB below mid-band levels.  It thus seems 
that most of the gradual rolloff is not accompanied by the increase in 
noise seen at the edges of the other subbands (see the sketch after 
this list).
    3) The bottom few channels of subband 1 are quite anomalous -- huge 
attenuations and increases in noise -- which I believe are due to some 
problem in the definition of the digital filter for the subband.  
(Somebody please correct me if I'm wrong). 
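
    As a rough sanity check on points 1 and 2, consider a toy model (the 
signal chain and the floor ratio r below are assumptions for 
illustration, not measurements): if the system noise passes through the 
rolloff with power gain g, and a fixed noise floor of r times the 
mid-band system noise is added downstream of it, then bandpass 
calibration inflates the noise rms by sqrt(1 + r/g).

    import numpy as np

    def predicted_mult(att_db, r=0.5):
        # r is an assumed ratio of the fixed noise floor to the mid-band
        # system noise; 0.5 is picked only to roughly match channel 3.
        g = 10.0 ** (-att_db / 10.0)  # power gain implied by the attenuation
        return np.sqrt(1.0 + r / g)

    # Subband 1, channel offsets 3 through 6 from the table above:
    for att, measured in [(26, 15.0), (22, 6.7), (19, 3.8), (16, 2.6)]:
        print(f"-{att} dB: predicted {predicted_mult(att):5.1f}, "
              f"measured {measured}")

With that (assumed) floor, channels 4 through 6 come out well below the 
prediction -- in line with point 2, most of the gradual rolloff does not 
behave like simple attenuation in front of a fixed noise floor.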

    So, discounting the bottom half-dozen channels of subband 1, the 
gradual rolloff seen in the 1 GHz-wide path in the bottom ~100 MHz is 
accompanied by a modest but significant increase in noise -- factors of 
up to 2.

    Is this acceptable?  For some -- perhaps most -- experiments, if the 
full sensitivity in a particular part of the band is required, one can 
always adjust the LO to shift that part of the spectrum into the 2nd 
subband.  HI observations, for example, could do this to optimize 
sensitivity below 1.1 GHz -- the cost would be the loss of the upper 128 
MHz (1.9 to 2.1 GHz), which may be of rather little interest to HI 
observers ...
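
    The bookkeeping for such a shift is simple: the subbands are 128 MHz 
wide, so lowering the LO by one subband width moves a given sky 
frequency up by one subband.  A throwaway sketch (the tuning numbers 
here are illustrative, not actual EVLA tuning rules):

    # Which 128 MHz subband (numbered 1..8) a sky frequency lands in,
    # for a 1.024 GHz baseband whose bottom edge is f_lo (all in MHz).
    def subband(f_sky, f_lo, width=128.0):
        return int((f_sky - f_lo) // width) + 1

    print(subband(1050.0, 1000.0))          # -> 1: in the rolloff region
    print(subband(1050.0, 1000.0 - 128.0))  # -> 2: clear of the rolloff
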
    Within L-band, the bottom 128 MHz of continuum is likely to be of 
little interest from a sensitivity point of view (it is full of 
interference, and system performance there is not very good).
    Within S-band, the situation is not as simple: the 2.0 -- 2.1 and 
3.0 -- 3.1 GHz regions (which are affected by this roll-off) are 
important for sensitivity, and potentially for other spectral 
applications.

    I think it worth spending some more effort to find the origin of 
this rolloff in the 1 GHz path.




