Hi,

There are a few big EIS FITS files which are not easy to process with "eis_prep", for example:

20/eis_l0_20070820_170353.fits.gz 122M
21/eis_l0_20070821_000022.fits.gz 118M
21/eis_l0_20070821_133226.fits.gz 117M

22/eis_l0_20070822_023009.fits.gz 116M
23/eis_l0_20070823_191420.fits.gz 94M
24/eis_l0_20070824_001531.fits.gz 92M
24/eis_l0_20070824_105128.fits.gz 120M
25/eis_l0_20070825_015152.fits.gz 99M
25/eis_l0_20070825_050529.fits.gz 95M
25/eis_l0_20070825_093750.fits.gz 89M
26/eis_l0_20070826_005803.fits.gz 97M
26/eis_l0_20070826_055213.fits.gz 92M
27/eis_l0_20070827_045520.fits.gz 97M
27/eis_l0_20070827_170835.fits.gz 81M
27/eis_l0_20070827_210723.fits.gz 79M
28/eis_l0_20070828_005937.fits.gz 72M

These FITS files are mostly generated by studies such as "HPW001_FULLCCD_v2", which perform a full-CCD scan, often with more than 80 raster positions.

The typical error message is: "unable to allocate memory to make array". However, the machine running "eis_prep" has a fairly large amount of memory (4GB). One way to keep "eis_prep" running is probably to do only the dark-current (DC) removal and absolute calibration, and to skip the cosmic-ray (CR) and hot-pixel (HP) removal, as in the sketch below. :-(
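For reference, that workaround might look something like the call below. This is only a sketch: /no_cr and /no_hp are hypothetical keyword names I have made up, so please check the eis_prep routine header for the actual switches (if any) that disable individual calibration steps.

  ; Sketch only: process a big file with dark-current removal and
  ; absolute calibration, skipping the cosmic-ray and hot-pixel steps.
  ; NOTE: /no_cr and /no_hp are HYPOTHETICAL keyword names, not
  ; confirmed eis_prep switches -- see the routine header for the
  ; real options.
  eis_prep, 'eis_l0_20070820_170353.fits.gz', /default, /save, $
            /no_cr, /no_hp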


Questions:
  1. Is there any way to run 'eis_prep' on these big FITS files? Or,
  2. Should we suggest that planners normally try to avoid full-CCD scans with too many raster positions?

Regards,

--JianSun, 29-Mar-2024 02:21


Hi, Jian

This is starting to become a big problem, and I think you're asking the right questions.

However, I don't think we should stop taking large data because the current software doesn't allow us to analyse it. This just means we need to think about how we process and analyse such large files. We want to take the best data we can and worry about analysing it later. You never know when an instrument or satellite might fail.

This sounds like a major issue for the team meetings in October.

In the meantime, if anyone has comments, please start making them! This is something that really needs to be sorted out.

--David R Williams, 19-Sep-2007


Hi David,

I agree with you and Harry that we should get the science data first and then work out a solution if there is a problem analysing it. So we probably need some modifications to "eis_prep" to improve its memory performance when processing these big FITS files.

--JianSun, 29-Mar-2024 02:21


The Hierarchical Data Format (HDF) was made for large files. There are a lot of good tools in IDL for working with HDF files.
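For what it's worth, IDL has a built-in HDF5 API, so writing a calibrated data window to an HDF5 file can be as simple as the sketch below. The file name, dataset name, and array here are made up for illustration:

  ; Minimal sketch: write a data array to an HDF5 file using IDL's
  ; built-in H5 routines.  Names and dimensions are placeholders.
  data = fltarr(2148, 512)                       ; placeholder array
  fid  = h5f_create('eis_20070824.h5')           ; new HDF5 file
  sid  = h5s_create_simple(size(data, /dimensions))
  tid  = h5t_idl_create(data)                    ; datatype from the variable
  did  = h5d_create(fid, 'intensity', tid, sid)
  h5d_write, did, data                           ; write the whole array
  h5d_close, did & h5t_close, tid & h5s_close, sid & h5f_close, fid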

--KenDere, 28-Sep-2007


Hi, Jian,

This problem arises from the limited amount of memory that IDL can allocate. IDL running on a 32-bit computer can theoretically allocate about 2.3GB (according to the IDL help files). However, in my experience it can only allocate a little more than 1GB, which is not enough to process such a large EIS file with EIS_PREP.

Nevertheless, this problem can be easily solved on a 64-bit computer running the corresponding 64-bit version of IDL. Of course, the computer should have enough physical memory (maybe 4GB or more). In that case, IDL can allocate enough memory for EIS_PREP to process large EIS files. I have tested this using the file eis_l0_20070824_001531.fits.gz and processed it with EIS_PREP successfully.
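As a quick check that you really are running a 64-bit IDL, and to see how much dynamic memory a session is using, the built-in !version system variable and MEMORY() function can be queried at the prompt:

  IDL> print, !version.memory_bits   ; 32 or 64
  IDL> print, memory(/current)       ; bytes of dynamic memory in use now
  IDL> print, memory(/highwater)     ; peak allocation since the last reset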

Another way to solve this problem, I guess, is to optimise the IDL code EIS_PREP and related routines to reduce the memory requirement. This might be done by freeing allocated memory immediately after it is used and by reducing temporary variables, if any (see the sketch below). I am thinking about this because I met the same problem when I processed several smaller EIS files one after another. I therefore suspect that memory allocated during a call to the code is not freed after the call.
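As an illustration of the kind of change I mean, IDL's TEMPORARY() function lets the code reuse an array's memory instead of copying it, and reassigning a big variable to a scalar releases its old allocation (SSW's delvarx does the same). A sketch with made-up variable names:

  ; Apply a calibration factor without duplicating the big array:
  ; TEMPORARY() hands over the memory of 'data', so no copy is made.
  data = temporary(data) * calib_factor

  ; Release a large intermediate as soon as it is no longer needed.
  dark_frame = 0B                    ; or: delvarx, dark_frame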

--Hui Li 02-Oct-2007