[daip] One more AIPS question...

Patrick P Murphy pmurphy at nrao.edu
Thu Feb 11 14:24:59 EST 2010


Charlie, sorry for not responding sooner, but we here in the
mid-Atlantic have been getting snowfall amounts normally found in places
like Buffalo, NY!

On Tue, 9 Feb 2010 17:25:02 -0600, Charles Figura
<charles.figura at wartburg.edu> said:

> On a similar but separate question, I ran into a problem writing out a
> data cube with fittp today.  I had used uvlod, sdgrd, then fittp.

> I have successfully done this (including earlier today), but went to
> try a different file, as it were, and fittp failed.  I've looked at
> the data and don't see anything that jumps out at me as a problem.
> But clearly fittp isn't happy.  Here's the error I got:

>> go fittp                                                               
> FITTP1: Task FITTP  (release of 31DEC10) begins                         
> FITTP1: Writing to disk file: PWD:NH3_s01b01_p3-30_206_main_cube.fits   
> FITTP1: Writing image: User 2369  Disk 1  Name SFO_14_C.SDGRD.41        
> forrtl: severe (66): output statement overflows record, unit -5, file Internal 
> Formatted Write

I know what that error is, but only in the sense of the mechanics of
Fortran, so by itself that isn't very helpful.
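
For what it's worth, here is a minimal sketch of how that class of error
arises (an illustration only, not actual AIPS source): an "internal
formatted write" targets a CHARACTER variable rather than a disk file,
and if the formatted output is wider than that variable, the Intel
runtime (forrtl) aborts with severe error 66 on unit -5, just as in your
log:

    ! Illustration only -- the names here are made up, not FITTP's.
    program overflow66
      implicit none
      character(len=8) :: card   ! the internal "file"; record length is 8
      integer          :: npts
      npts = 33100
      ! I10 asks for 10 characters but CARD holds only 8, so the run
      ! stops with "output statement overflows record, unit -5".
      write (card, '(I10)') npts
    end program overflow66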

> I don't suppose you have any ideas what might be going on, do you?  I
> don't even have a clue what it's complaining about, so I'm not sure what to
> look for in my data!

It would probably take an IMHEAD on the image, a list of all the INPUTs
to FITTP, and some debugging to figure this out.

Did you try FITAB?  If it has a similar problem, the issue might be with
the data (or not...).

On Wed, 10 Feb 2010 11:04:33 -0600, Charles Figura
<charles.figura at wartburg.edu> said:

> I was trying to investigate the error message I got (that I sent a
> message about to you yesterday), and I *did* find a listing on a DEC
> Fortran page - a description of 'severe error #66'.  The explanation is
> "an output statement attempted to transfer more data than would fit in
> the maximum record size".

Indeed.  But this is an internal write of data into a variable, probably
related to the metadata of your data cube rather than the data itself.

> Since this is (likely) the largest cube I've tried to build (662
> positions and 200 channels), I wonder if this means that my cube is too
> big?  I've tried it with a partially-smoothed dataset this morning (50
> channels), and got the same result.

> So to summarize...
> 27 channels * 662 positions = 17874 points in the cube - worked
> 50 channels * 662 positions = 33100 points - ERROR

AHH.  Exceeding 32768, i.e. 2^15.  Is this a UV file or a map (image)?  If
the latter, FITAB may actually work for you.
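
To put numbers on the 2^15 guess: 17874 fits in a 16-bit signed integer
(whose largest value is 32767) while 33100 does not, so if the point
count passes through an INTEGER*2 somewhere on its way into a header or
history card, it would wrap negative.  A toy illustration (again not
AIPS code, and the 16-bit kind is an assumption about the compiler):

    program wrap15
      implicit none
      integer         :: npts
      integer(kind=2) :: npts2   ! 16 bits with ifort/gfortran
      npts  = 17874
      npts2 = int(npts, kind=2)  ! still 17874; it fits under 32767
      print *, npts2
      npts  = 33100
      npts2 = int(npts, kind=2)  ! typically wraps to 33100 - 65536 = -32436
      print *, npts2
    end program wrap15

If something like that is going on, the mangled (or simply
wider-than-expected) value could then blow out the narrow internal write
I sketched above.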

> 100 channels * 662 positions = 66200 points - ERROR
> 200 channels * 662 positions = 132400 points - ERROR

> So... if my interpretation of this error *is* correct, is there a way
> to (temporarily) change the maximum record size?

This is getting beyond my area of expertise; as I said earlier, I'm a
vestigial lurker on the daip at nrao.edu list for largely historical
reasons, and I don't routinely use the package anymore.
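
One Fortran detail that may still be useful: for an internal write the
"maximum record size" is simply the declared length of the CHARACTER
variable being written into, so there is no environment variable or
runtime switch to enlarge it; the buffer (or the format) would have to
be widened in the source.  Hypothetical names again, purely to show the
shape of such a fix:

    program widebuf
      implicit none
      character(len=80) :: card   ! a FITS-card-sized buffer; room to spare
      integer           :: npts, ios
      npts = 33100
      ! With a long enough buffer the write succeeds, and IOSTAT turns
      ! any future overflow into a status code instead of an abort.
      write (card, '(A8,I10)', iostat=ios) 'NPOINTS=', npts
      print *, trim(card), ios
    end program widebuf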

> Jay Lockman tells me he's put together cubes that are 512 x 512 x 1024,
> which is WAY bigger than mine, so I'm hoping that there's a record
> length default setting.

I'm going to punt this to the daip at nrao.edu list as Eric or one of the
others can probably help far more than I can here.

 - Pat

-- 
 Patrick P. Murphy, Ph.D.   Webmaster (East), Computing Security Manager
 http://www.nrao.edu/~pmurphy/          http://chien-noir.com/maze.shtml
 "Inventions then cannot, in nature, be a subject of property."
                                    -- Thomas Jefferson, August 13, 1813



