[daip] One more AIPS question...
Charles Figura
charles.figura at wartburg.edu
Thu Feb 11 16:10:31 EST 2010
Hey, Pat -
Thanks for the reply. I figured you were probably snowed under out there -
the weather looks almost as bad as we get out here in Iowa, only I suspect
we're a little more used to it!
I tried fitab with the same results - the 'error 66'. I saw that you copied
your response to daip at nrao.edu; I'll follow up there.
Thanks for your help!
-Charlie
On Thursday 11 February 2010 01:24:59 pm you wrote:
> Charlie, sorry for not responding sooner, but we here in the
> mid-Atlantic have been getting snowfall amounts normally found in
> places like Buffalo, NY!
>
> On Tue, 9 Feb 2010 17:25:02 -0600, Charles Figura
>
> <charles.figura at wartburg.edu> said:
> > On a similar but separate question, I ran into a problem writing out a
> > data cube with fittp today. I had used uvlod, sdgrd, then fittp.
> >
> > I have successfully done this (including earlier today), but went to
> > try a different file, as it were, and fittp failed. I've looked at
> > the data and don't see anything that jumps out at me as a problem.
> >
> > But clearly fittp isn't happy. Here's the error I got:
> >> go fittp
> >
> > FITTP1: Task FITTP (release of 31DEC10) begins
> > FITTP1: Writing to disk file:
> > PWD:NH3_s01b01_p3-30_206_main_cube.fits
> > FITTP1: Writing image: User 2369 Disk 1 Name SFO_14_C.SDGRD.41
> > forrtl: severe (66): output statement overflows record, unit -5, file
> > Internal Formatted Write
>
> I know what that error is, but only in the sense of the mechanics of
> Fortran, and that's not very helpful.
>
> > I don't suppose you have any ideas what might be going on, do you?
> > I don't even have a clue what it's complaining about, so I'm not
> > sure what to look for in my data!
>
> It would probably take an imhead on the image, a list of all the INPUTs
> to FITTP and some debugging to figure out.
>
> Did you try FITAB? If it has a similar problem, the issue might be with
> the data (or not...).
>
> On Wed, 10 Feb 2010 11:04:33 -0600, Charles Figura
>
> <charles.figura at wartburg.edu> said:
> > I was trying to investigate the error message I got (that I sent a
> > message about to you yesterday), and I *did* find a listing on a DEC
> > Fortran page - a description of 'severe error #66'. The explanation
> > is "an output statement attempted to transfer more data than would
> > fit in the maximum record size".
>
> Indeed. But this is an internal write of data into a variable,
> probably related to the metadata of your data cube, not the data itself.
>
> > Since this is (likely) the largest cube I've tried to build (662
> > points, and 200 channels), I wonder if this means that my cube is
> > too big? I've tried it with a partially-smoothed dataset this
> > morning (50 channels), and get the same result.
> >
> > So to summarize...
> > 27 channels * 662 positions = 17874 points in the cube - worked
> > 50 channels * 662 positions = 33100 points - ERROR
>
> AHH. Exceeding 32768 or 2^15. Is this a UV file or a map (image)? If
> the latter, FITAB may actually work for you.
>
> > 100 channels * 662 positions = 66200 points - ERROR
> > 200 channels * 662 positions = 132400 points - ERROR
> >
> > So... if my interpretation of this error *is* correct, is there a way
> > to (temporarily) change the maximum record size?
>
> This is getting beyond my area of expertise; as I said earlier, I'm a
> vestigial lurker on the daip at nrao.edu list for largely historic
> reasons, and I don't routinely use the package anymore.
>
> > Jay Lockman tells me he's put together cubes that are
> > 512x512x1024, which is WAY bigger than mine, so I'm hoping that
> > there's a record length default setting.
>
> I'm going to punt this to the daip at nrao.edu list as Eric or one of the
> others can probably help far more than I can here.
>
> - Pat
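[For reference, Pat's 2^15 guess can be sanity-checked against the cube sizes
quoted in the thread. A small Python sketch, purely illustrative and not AIPS
code; the 32767 signed-16-bit limit is an assumption based on his remark:]

```python
# Compare each cube size from the thread against the largest value
# a signed 16-bit integer can hold (2**15 - 1 = 32767).
LIMIT = 2**15 - 1  # assumed limit, per Pat's "exceeding 32768 or 2^15" remark

positions = 662
cases = [(27, "worked"), (50, "ERROR"), (100, "ERROR"), (200, "ERROR")]

for channels, outcome in cases:
    points = channels * positions
    status = "overflows" if points > LIMIT else "fits"
    print(f"{channels:4d} ch x {positions} pos = {points:6d} points "
          f"-> {status} ({outcome})")
```

[Every size reported as ERROR exceeds the limit, and the one that worked does
not, which is consistent with some 16-bit record-length field overflowing.]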
--
Charles Figura charles.figura at wartburg.edu
Associate Professor of Physics http://mcsp.wartburg.edu/figura
Director, Wartburg Platte Observatory (319) 352-8373
(319) 352-8606 (fax)