Bin x 2 - problem with ISIS
Posted: Fri May 15, 2020 6:17 am
Hello
I'm back to spectroscopy after a break of 7 years and working my way up the learning curve with ISIS. My rig is at Coonabarabran NSW, Australia, adjacent to Siding Spring Observatory, and I operate it remotely. It's been a challenging exercise but the first results are proving very positive.
My main issue at the moment is processing spectra taken with the imaging camera binned x2. I have a LISA attached to a Planewave CDK 12.5, stepped down to f/5 with a reducer. My imaging camera is an Atik 460EX. It's a great combination, and unbinned images are easily reduced and processed. However, to reach fainter targets, I'd like to bin my images, and this is where the problems start.
I have taken images and Ar/Ne calibration frames binned x2 and have prepared master darks and bias frames from frames that are also binned x2. I have adjusted the pixel size from 4.54 to 9.08 µm. However, when I try to process the spectra, the spectral calibration either fails or the RMS is surprisingly high and the results dubious.
The routine I am following is otherwise the same. I prepared a .lst file based on the one I use for unbinned images, with the dispersion adjusted to 3.652 to reflect the doubled pixel size. Processing this, I generate a reasonable spectrum but with an RMS of about 19; I have managed an RMS as low as 0.2 with unbinned images. My reference star is a bright A0V star quite near the target. I then use the spectral calibration assistant to improve the calibration using the Ha line in the reference star. It returns an RMS of 5.18 and then reports that the calibration error is too high.
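For reference, the adjustment I made is just a straight doubling of both quantities - here's a quick sketch of the arithmetic behind the values I typed into ISIS (the unbinned dispersion is back-computed from my binned figure, so treat it as an assumption about my own setup, not something from the ISIS docs):

```python
# Bin 2x2 should simply double the effective pixel size and the
# dispersion per (binned) pixel; nothing else in the .lst changes.
unbinned_pixel_um = 4.54      # Atik 460EX native pixel size, microns
unbinned_disp = 3.652 / 2     # Angstrom/pixel, inferred from my binned value

binned_pixel_um = unbinned_pixel_um * 2   # -> 9.08, the value I entered
binned_disp = unbinned_disp * 2           # -> 3.652, the value in my .lst

print(binned_pixel_um, binned_disp)
```

If those two numbers are the only parameters that need to change, then my inputs look right, which is why I suspect the problem is elsewhere in my procedure.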
The .lst file method generates a reasonable-looking spectrum, but it's clearly not correct - comparing it to the MILES A0V spectrum, mine is shifted to the red at the blue end of the spectrum and to the blue at the red end.
As an alternative, I have used the Predefined mode for the LISA with argon IR, argon visible and neon, and selected the appropriate lines in the calibration frame. During processing, the program seems to flicker between two spectral processes - checking the log, I can see multiple attempts to determine the dispersion, primary and inverse. The primary RMS is a healthy 0.98 but the inverse RMS is an unhealthy 26.1! The resulting spectrum is worse than the one from the .lst-based method.
I am sure this is pilot error and not a bug in the software. But I can't see where I am falling down.
I'd be delighted if anyone is able to tell me where my method is wrong.
Thanks
Pete
PS I am sure I have some lovely data on 2020hvf - if only I could process it correctly!