The effects of the spectrum rotation on line ratios

The EIS spectral images are not exactly aligned relative to the CCD: for the SW band, images at shorter wavelengths sit slightly higher on the CCD than those at longer wavelengths. The effect is small; for the Fe XII 186.88 and 195.12 lines, for example, it is less than 1 pixel. It is caused by the dispersion axis of the grating not being exactly aligned with the CCD axis (i.e., the grating is tilted), and we refer to it here as the "spectrum rotation". This is distinct from the "slit tilt", which arises from a misalignment of the EIS slits relative to the CCD.

Both the spectrum rotation and the slit tilt were features of SOHO/CDS, and the attached image (CCD.gif), produced for CDS, illustrates what the two effects do to the spectral images. Note that the spectrum rotation makes no contribution to the slit tilt.

When deriving densities from different Fe XII and Fe XIII emission line ratios, the peaks in the density profiles often occurred at different pixel positions for the different ratios, so I decided to investigate the effect of this spatial offset.

This was done by performing sub-pixel shifts as follows, using Fe XII 186.88/195.12 as an example. I take 195.12 as my reference line and shift the 186.88 image. I estimate the offset to be 0.65 pixels (see below for details) and I re-compute the 186.88 intensity by doing

I_new(ypixel=i) = 0.35 * I(ypixel=i) + 0.65 * I(ypixel=i+1)

Similarly, for the Fe XII 196.64/195.12 ratio I make a correction for a 0.1 pixel offset. (Note that the errors on the intensities can be re-computed in a straightforward manner when doing things this way.)
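
For concreteness, here is a minimal Python/NumPy sketch of the sub-pixel shift and the error re-computation described above. The function name and the example numbers are mine, not part of any EIS software, and I assume the eis_prep errors are independent 1-sigma values so they can be combined in quadrature.

    import numpy as np

    def subpixel_shift(intensity, error, offset):
        """Shift a 1D intensity profile along the slit by a sub-pixel offset.

        Linear interpolation between neighbouring pixels, e.g. for a 0.65
        pixel offset I_new[i] = 0.35*I[i] + 0.65*I[i+1].  The 1-sigma errors
        are propagated by adding the two contributions in quadrature.  The
        output arrays are one element shorter than the inputs.
        """
        frac = offset % 1.0
        i_new = (1.0 - frac) * intensity[:-1] + frac * intensity[1:]
        e_new = np.sqrt(((1.0 - frac) * error[:-1])**2 + (frac * error[1:])**2)
        return i_new, e_new

    # Example: correct an Fe XII 186.88 intensity profile for a 0.65 pixel
    # offset relative to the 195.12 reference line (numbers are illustrative).
    i_186 = np.array([120., 150., 400., 900., 850., 300., 140.])
    e_186 = np.sqrt(i_186)          # stand-in for the eis_prep errors
    i_new, e_new = subpixel_shift(i_186, e_186, 0.65)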

The attached plot (fe12_offset_check.pdf) compares the densities derived from the two Fe XII ratios before applying the offset (top panel) and after applying the offset (bottom panel). The vertical lines show the errors on the density based on the 1-sigma intensity errors output by eis_prep. The change is quite striking: the peaks around pixels 96-100 were about 3-4 pixels apart before the correction, but with it they coincide! At first this seems surprising, since the image offsets are sub-pixel, but my interpretation is that the offsets are strongly amplified when taking ratios, because the line intensities vary steeply across compact features. Note that the densities derived from the two ratios still don't agree with each other, but this is a separate issue (!).
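
To see how a sub-pixel offset can be amplified in a ratio, here is a small synthetic test. The numbers are entirely my own, assuming a Gaussian-like compact feature a few pixels wide sitting on a uniform background.

    import numpy as np

    # Two lines with the same spatial profile across a compact feature,
    # but one displaced along the slit by 0.65 pixels.
    y = np.arange(50, dtype=float)

    def profile(centre, width=2.0):
        return 100.0 + 900.0 * np.exp(-((y - centre) / width)**2)

    i_ref = profile(25.0)        # reference line, e.g. 195.12
    i_off = profile(25.65)       # second line, offset by 0.65 pixels

    ratio = i_off / i_ref        # would be 1.0 everywhere if aligned
    print(np.max(np.abs(ratio - 1.0)))   # errors of several tens of percent

Although the shift is well under a pixel, the intensity changes by a large fraction from one pixel to the next on the flanks of the feature, so the ratio there is badly distorted.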

I estimate the slope of the spectrum on the detector to be -0.0792 pixels/angstrom. To do this I compared images in the following line pairs: Fe VIII 185.21/194.66, Fe XII 186.88/196.64, Fe XII 192.39/195.12 and Fe XII 195.12/203.72. It's not easy to find good data to measure this, particularly for Fe XII, as you need compact features to do it well. Note that the tilt of the LW band will need to be measured separately.
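
As a quick consistency check, the slope can be converted to the pixel offset between any two lines in the SW band (the sign convention relative to the reference line is my assumption):

    # Offset (in pixels along the slit) of a line relative to a reference
    # line, given the slope of the spectrum on the SW detector.
    slope = -0.0792                        # pixels per angstrom

    def offset_pixels(wvl, wvl_ref):
        return slope * (wvl - wvl_ref)

    print(offset_pixels(186.88, 195.12))   # ~ +0.65 pixels, as used above
    print(offset_pixels(196.64, 195.12))   # ~ -0.12 pixels, i.e. the ~0.1 pixel correction above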

I think it will be important to correct for this effect if you're measuring the density or filling factor of a compact feature such as a loop.

Peter Young