[daip] passband cal

Eric Greisen egreisen at nrao.edu
Fri Jan 11 15:45:09 EST 2008


Harvey Liszt writes:
 > 
 > I've been reducing another of my absorption line runs, this one at 21.3 
 > GHz (but I don't think that's terribly important).  This is the first 
 > time I've not found fairly strong lines and so have looked more 
 > carefully at the noise.
 > 
 > After I had looked at the 4 or 5 resultant spectra (output from UVLSD 
 > and vector average in POSSM) a few times, I realized that they all 
 > looked pretty much the same, had many of the same spectral features and 
 > about the same rms in line/continuum ratio, despite the varying 
 > continuum strengths and identical integration time.  So this is what you 
 > expect when you use the same bandpass cal on all, just one scan on the 
 > bandpass cal at the start of the run, and the noise is dominated by that 
 > in the passband cal.  To test this idea, I turned on a boxcar smooth 
 > during BPASS.  As I expected, the noise level in the final source 
 > spectra all went down by the right amount (sqrt 3) but they continued to 
 > resemble each other, so the noise was still (presumably) dominated by 
 > that in the bandpass.
 > 
 > At this point, I created a set of line/continuum spectra without 
 > applying any bandpass cal.  I find that if I divide these spectra of the 
 > sources by that of the bandpass calibrator, I get a nice flat spectrum 
 > but with ~3x lower noise level than originally (as described in the 
 > prior paragraph), and the spectra look much more independent of each 
 > other, as if the noise introduced by the passband cal has receded, not 
 > quite but much more nearly, into insignificance.
 > 
 > Do you have any insight into why this should be the case?  Was I just 
 > doing something wrong in the original reduction that caused the noise of 
 > the bandpass to dominate?
 > 

Not really - if I had a better idea it might already be in AIPS.  It
is clear that to avoid a contribution from the BP calibration one has
to integrate a long time on the BP calibrator - most people do way
too little, making the BP noise a serious contribution.  That noise
will be coherent between sources.  Furthermore, the BP function for
the VLA almost certainly varies with time, so a single cal scan misses
that entirely.
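
As a toy illustration of that coherence (a minimal numpy sketch, not
AIPS code; the channel count and noise levels below are invented),
dividing several independent source spectra by the same noisy
single-scan bandpass measurement leaves them all dominated by one
common noise pattern, and boxcar-smoothing that measurement over three
channels recovers roughly the sqrt(3) improvement you saw with
smoothing in BPASS:

  import numpy as np

  rng = np.random.default_rng(0)
  nchan = 256
  sigma_bp = 0.05    # per-channel noise of one short BP-cal scan (assumed)
  sigma_src = 0.01   # per-channel noise of a long source integration (assumed)

  # One noisy bandpass measurement, shared by every source in the run.
  bp_meas = 1.0 + sigma_bp * rng.standard_normal(nchan)

  # Several independent, intrinsically flat source spectra.
  sources = [1.0 + sigma_src * rng.standard_normal(nchan) for _ in range(4)]
  calibrated = [s / bp_meas for s in sources]

  # The calibrated spectra are strongly correlated with one another because
  # the common BP-cal noise dominates: their rms is ~sigma_bp, not ~sigma_src.
  c = np.corrcoef(calibrated)
  print("rms of calibrated spectra:", [f"{s.std():.3f}" for s in calibrated])
  print("correlation of sources 0 and 1:", f"{c[0, 1]:.2f}")

  # Boxcar-smoothing the BP measurement over 3 channels cuts its noise by
  # about sqrt(3), but the spectra still all share the same residual pattern.
  bp_smooth = np.convolve(bp_meas, np.ones(3) / 3, mode="same")
  print("rms after 3-channel boxcar:", f"{(sources[0] / bp_smooth).std():.3f}")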

I wrote UVLSD where I thought of the D as "desperate" (looking for HI
absorption from the Magellanic Stream).  Averaging a UVLSD output over
all baselines, and only then scaling it by a similar average over all
baselines of the BP cal, should give less noise, in that the BP is then
determined over all baselines rather than one baseline at a time.
BPASS depends on there being a closure relationship in amplitude and
phase between the antennas - this seems to break down in the outer
parts of the spectrum (not just a couple of channels but rather more).
This may cause specific problems in the BP solutions which would appear
later.
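
For reference, the closure relationship meant here can be written out
in a few lines (a small numpy sketch with made-up antenna bandpasses,
not AIPS code): if each baseline's bandpass factors into antenna-based
terms, B_ij = B_i * conj(B_j), then the closure phase around any
antenna triangle is zero in every channel, and it is that factorization
which seems to fail toward the band edges:

  import numpy as np

  rng = np.random.default_rng(2)
  nchan = 64

  # Three arbitrary complex antenna bandpasses (channel-dependent).
  amp = 1.0 + 0.2 * rng.standard_normal((3, nchan))
  phs = 0.3 * rng.standard_normal((3, nchan))
  B = amp * np.exp(1j * phs)

  # Baseline bandpasses built from the antenna terms (the closure assumption).
  B01 = B[0] * np.conj(B[1])
  B12 = B[1] * np.conj(B[2])
  B20 = B[2] * np.conj(B[0])

  # The triple product is purely real and positive, so its phase vanishes.
  closure_phase = np.angle(B01 * B12 * B20)
  print("max |closure phase| (rad):", np.abs(closure_phase).max())

When the measured baseline bandpasses near the band edges do not factor
this way, an antenna-based solver has to split the discrepancy among the
antennas, which could show up later as baseline-dependent residuals.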

I don't think any of the raving above has answered your question.  If
you have any thoughts about how the current algorithms might be
enhanced, please let me know - or tell me if you come up with a
clearer reason for your finding.

Thanks,

Eric



