[daip] One more AIPS question...

Eric Greisen egreisen at nrao.edu
Thu Feb 11 18:41:27 EST 2010


Patrick P Murphy wrote:
> Charlie, sorry for not responding sooner, but we here in the
> mid-Atlantic have been getting snowfall amounts normally found in places
> like Buffalo, NY!
> 
> On Tue, 9 Feb 2010 17:25:02 -0600, Charles Figura
> <charles.figura at wartburg.edu> said:
> 
>> On a similar but separate question, I ran into a problem writing out a
>> data cube with fittp today.  I had used uvlod, sdgrd, then fittp.
> 
>> I have successfully done this (including earlier today), but went to
>> try a different file, as it were, and fittp failed.  I've looked at
>> the data and don't see anything that jumps out at me as a problem.
>> But clearly fittp isn't happy.  Here's the error I got:
> 
>>> go fittp                                                               
>> FITTP1: Task FITTP  (release of 31DEC10) begins                         
>> FITTP1: Writing to disk file: PWD:NH3_s01b01_p3-30_206_main_cube.fits   
>> FITTP1: Writing image: User 2369  Disk 1  Name SFO_14_C.SDGRD.41        
>> forrtl: severe (66): output statement overflows record, unit -5,
>> file Internal Formatted Write
> 
> I know what that error is, but only in the sense of the Fortran
> mechanics, and that by itself is not very helpful.
> 
>> I don't suppose you have any ideas what might be going on, do you?
>> I don't even have a clue what it's complaining about, so I'm not sure
>> what to look for in my data!
> 
> It would probably take an imhead on the image, a list of all the INPUTs
> to FITTP and some debugging to figure out.
> 
> Did you try FITAB?  If it has a similar problem, the issue might be with
> the data (or not...).
> 
> On Wed, 10 Feb 2010 11:04:33 -0600, Charles Figura
> <charles.figura at wartburg.edu> said:
> 
>> I was trying to investigate the error message I got (that I sent a
>> message about to you yesterday), and I *did* find a listing on a DEC
>> Fortran page - a description of 'severe error #66'.  The explanation
>> is "an output statement attempted to transfer more data than would
>> fit in the maximum record size".
> 
> Indeed.  But this is an internal write of data into a variable,
> probably related to the metadata of your data cube, not the data
> itself.
> 
>> Since this is (likely) the largest cube I've tried to build (662
>> points, and 200 channels), I wonder if this means that my cube is too
>> big?  I tried it with a partially-smoothed dataset this morning (50
>> channels), and got the same result.
> 
>> So to summarize...
>> 27 channels * 662 positions = 17874 points in the cube - worked
>> 50 channels * 662 positions = 33100 points - ERROR
> 
> AHH.  Exceeding 32768 or 2^15.  Is this a UV file or a map (image)?  If
> the latter, FITAB may actually work for you.
> 
>> 100 channels * 662 positions = 66200 points - ERROR
>> 200 channels * 662 positions = 132400 points - ERROR
> 
>> So... if my interpretation of this error *is* correct, is there a way
>> to (temporarily) change the maximum record size?
> 
> This is getting beyond my area of expertise; as I said earlier, I'm a
> vestigial lurker on the daip at nrao.edu list for largely historic
> reasons, and I don't routinely use the package anymore.
> 
>> Jay Lockman tells me he's put together cubes that are 512x512x1024,
>> which is WAY bigger than mine, so I'm hoping that there's a record
>> length default setting.
> 
> I'm going to punt this to the daip at nrao.edu list as Eric or one of the
> others can probably help far more than I can here.
> 
>  - Pat
> 

Thanks, Pat, for giving this a try.

The error that is aborting the task is a formatted WRITE to an internal 
character string.  Either the output really is longer than the string, 
or, more likely, the format has fewer format codes than the item list 
requires.  Your image is small by current standards - I am working with 
an image that has more pixels than a 4-byte integer can count 
(512x512x24200 is about 6.3 billion, well past 2^31 - 1).  That is not 
the issue.
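
To make the failure concrete, here is a minimal sketch (invented code, 
not anything from AIPS) of the two ways such an internal WRITE can die 
with this abort; either WRITE alone is enough to trigger it:

      PROGRAM OVRFLW
      CHARACTER*20 LINE
C                                       Case 1: the formatted output
C                                       (13 characters of text plus
C                                       I12 = 25 characters) cannot fit
C                                       in the 20-character internal
C                                       file, so the runtime aborts
C                                       with severe error 66.
      WRITE (LINE, '(A,I12)') 'Total points:', 33100
C                                       Case 2: two list items but only
C                                       one format code.  Format
C                                       reversion asks for a second
C                                       record, and a scalar internal
C                                       file has only one, so this
C                                       aborts as well.
      WRITE (LINE, '(I5)') 27, 662
      END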

Unfortunately, I have looked through all the formatted writes that I 
can find, and most are exercised so heavily that they could not harbor 
either of the above errors.  I checked them anyway and they seem 
blameless.  It is more likely that some rare error condition has arisen 
down in a low-level routine and the accompanying error message has the 
problem.

Did the file you requested get created?

One thing that looks odd to me is the very long name you are using for 
the output file.  Have you tried a short name?  Is there enough disk 
space on $PWD to hold the file?
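
Purely as a hypothetical illustration (the buffer size, message text, 
and names below are invented; this is not the AIPS source), an error 
path that formats a long file name into a fixed-size message string 
would overflow in exactly this way:

      PROGRAM LNGNAM
C                                       Hypothetical fixed-size message
C                                       buffer; AIPS's real buffers may
C                                       be sized quite differently.
      CHARACTER*48 MSG
      CHARACTER*64 FNAME
      FNAME = 'PWD:NH3_s01b01_p3-30_206_main_cube.fits'
C                                       An A descriptor with no width
C                                       writes all 64 characters of
C                                       FNAME, trailing blanks included,
C                                       so 22 + 64 = 86 characters must
C                                       go into a 48-character buffer -
C                                       the same severe (66) overflow.
      WRITE (MSG, '(A,A)') 'Could not extend file ', FNAME
      END

If something like that happens while reporting a real failure - a full 
disk, say - the overflow abort would mask the underlying error, which 
would fit what you are seeing.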

Eric Greisen