
pydis's Introduction

NOTE: PyDIS is now deprecated

This project was an important learning tool for me back in 2015, and I believe helped move Python spectroscopic data reduction forward in general (e.g. several pieces of code and the workflow have made their way into astropy modules). It is no longer being maintained or used, and has been replaced by PyKOSMOS.

The repo is saved here for posterity

PyDIS DOI

An easy to use reduction package for one dimensional longslit spectroscopy using Python.

The goal of pyDIS is to provide a turn-key solution for reducing and understanding longslit spectroscopy, which could ideally be done in real time. Currently we are using many simple assumptions to get a quick-and-dirty solution, and modeling the workflow after the robust industry standards set by IRAF. Additionally, we have only used data from the low/medium-resolution APO 3.5-m "Dual Imaging Spectrograph" (DIS). Therefore, many instrument-specific assumptions are being made. So far PyDIS has also been successfully used (with hacking/modification) on data from MMT and DCT. If you use PyDIS, please send me feedback!

Some background motivation on why I made this package is given here.

Examples

See the examples page on the Wiki for a few worked examples of reducing DIS data, or the step-by-step manual reduction guide for a detailed tutorial on reducing 1-d spectroscopy data with pyDIS.

Motivation

Really slick tools exist for on-the-fly photometry analysis. However, no turn-key, easy-to-use spectroscopy toolkit for Python (without IRAF or PyRAF) was available (that we were aware of). Here are some mission statements:

  • Being able to extract and see data in real time at the telescope would be extremely helpful!
  • This pipeline doesn't have to give perfect results to be very useful
  • Don't try to build a One Size Fits All solution for every possible instrument or science case. We cannot beat IRAF at its own game; IRAF is the industry standard
  • The pipeline does need to handle:
    • Flats
    • Biases
    • Spectrum Tracing
    • Wavelength Calibration using HeNeAr arc lamp spectra
    • Sky Subtraction
    • Extraction
    • basic Flux Calibration
  • The more hands-free the better; a full reduction script needs to be available
  • A fully interactive mode (a la IRAF) should be available for each task

So far pyDIS can do a rough job of all the reduction tasks for single point-source objects! We are seeking more data to test it against, to help refine the solution and find bugs. Here is one example of a totally hands-free reduced M dwarf spectrum versus the manual IRAF reduction:

[Figure: hands-free pyDIS reduction of an M dwarf spectrum compared with the manual IRAF reduction]

This spectrum took a few seconds to reduce, and is good enough for a quick look! There are definitely errors in the wavelength solution, and small offsets in the flux calibration. A (terrible) brute-force wavelength solution and a sometimes fickle flux calibration are being used here. With some minimal parameter tweaking and manual lamp-line identifications, the results are even better!

How to Help

  • Check out the Issues page if you think you can help code, or want to request a feature!
  • If you have some data already reduced in IRAF that you trust and would be willing to share, let us know!

pydis's People

Contributors

bmorris3, jradavenport, tdwilkinson, tzdwi


pydis's Issues

Autoreduce: Dis-high data in blue

First, thanks to your last changes, this is a sample of great DIS-high data reduction (in red):
[Figure: reduced red spectrum of KIC 5597604]

It was so quick! So, I decided to try it with the blue images. The results were... interesting. I ran pydis.autoreduce the first time with the red HeNeAr by accident:
[Figure: blue frame of KIC 2285598 reduced with the red HeNeAr]

The results were off, and I found the error in my code. This is what I got with the blue HeNeAr:
[Figure: distorted blue-channel reduction]
It seems we have discovered plasma dragons.

I'm not sure why the blue images would reduce differently than the red. I looked in pydis.py and could not find anything. On my side, I double checked my lists, and all seem to be the same. The next DIS-high night I ran with pydis.autoreduce also looked good in the red, and the same distortions appeared in the blue.

Add support for 2D spectra in `ApplyFluxCal`

Sometimes it is helpful to apply the sensitivity function from a standard star to a 2D spectrum, with one axis being wavelength and the other the spatial direction, in order to study extended sources.

Deal with Overscan region

Charli Sakari's comment via email:

If I want to use an overscan region for the bias instead of having a bias frame, is there a way to deal with that?
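This is not currently supported in pydis, but a minimal sketch of overscan-based bias removal might look like the following. The column range here is hypothetical; the real overscan region would come from the FITS header (e.g. a BIASSEC-style keyword):

```python
import numpy as np

def overscan_subtract(image, oscan_cols=(1024, 1050)):
    """Subtract a per-row bias level estimated from the overscan columns,
    instead of using a separate bias frame.

    oscan_cols is a hypothetical (start, stop) column range; the real
    range is instrument-specific and should be read from the header.
    """
    # row-wise median of the overscan strip
    oscan = np.median(image[:, oscan_cols[0]:oscan_cols[1]], axis=1)
    # subtract the per-row level and trim the overscan region off
    return image[:, :oscan_cols[0]] - oscan[:, np.newaxis]
```

A refinement would be to fit a low-order polynomial to the row-wise levels before subtracting, to suppress read noise in the overscan estimate.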

Test: UT140701 ; using stdstar=''

>>> spectra.autoreduce('objlist.txt','flatlist.txt','biaslist.txt','henear.0018r.fits',stdstar='')
Running HeNeAr_fit function.
Doing automatic wavelength calibration on HeNeAr.
Note, this is not very robust. Suggest you re-run with interac=True
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/astro/users/twilki/research/mdwarf/spectra/spectra.py", line 1190, in autoreduce
    display=displayHeNeAr, tol=HeNeAr_tol, fit_order=HeNeAr_order)
  File "/astro/users/twilki/research/mdwarf/spectra/spectra.py", line 833, in HeNeAr_fit
    popt,pcov = curve_fit(_gaus, xline, yline, p0=pguess)
  File "/astro/apps6/anaconda/lib/python2.7/site-packages/scipy/optimize/minpack.py", line 586, in curve_fit
    raise RuntimeError(msg)
RuntimeError: Optimal parameters not found: Number of calls to function has reached maxfev = 1000.

Not sure what the issue is here. Looks like it doesn't like the curve fit function?

HeNeAr fit needs 2nd pass

After the user-defined HeNeAr lines are identified, use a higher-resolution line list and do a second pass, picking up more lines to use for the final solution.
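The matching step of such a second pass could be sketched as follows (a simplified stand-in, not pydis's actual implementation; the tolerance default is a hypothetical value in Angstroms):

```python
import numpy as np

def second_pass_match(peak_wavelengths, linelist, tol=2.0):
    """Match first-pass peak wavelengths against a higher-resolution
    catalog of HeNeAr lines.

    Keeps every detected peak whose first-pass wavelength lies within
    tol of a catalog line, returning (measured, catalog) pairs that can
    be fed back into a refit of the dispersion solution.
    """
    pairs = []
    for w in peak_wavelengths:
        d = np.abs(linelist - w)
        if d.min() < tol:
            pairs.append((w, linelist[d.argmin()]))
    return pairs
```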

Test: UT140701 ; HeNeAr_fit documentation

Here's what the documentation states to use:

HeNeAr_file = 'HeNeAr.0030r.fits'
wfit = spectra.HeNeAr_fit(HeNeAr_file, trim=True, fmask=fmask_out,
                          interac=True, mode='poly', fit_order=5)

My Error: NameError: name 'fmask_out' is not defined

I did not see this as a variable in the code, but it is referenced as fmask=fmask_out throughout the documentation.

Spectra code:
def HeNeAr_fit(calimage, linelist='apohenear.dat', interac=True,
               trim=True, fmask=(1,), display=True,
               tol=10, fit_order=2, previous='', mode='poly',
               second_pass=True):

So, should this be an array or are we labeling this as a name?

Stellar parameters, including v sin i

It would be nice to have a secondary package to measure basic stellar parameters, including v sin i

Parameters could include:

  • T_eff
  • Spectral type (like the Hammer)
  • RV
  • v sin i
  • equivalent widths of a few key lines

Once PyDIS is ready for its next release (is tested, ships w/ sample data, etc), this would make a nice student project to build useful features.

These functions could (and should?) be more general than for PyDIS, and could be included in astropy

henear lines in interactive mode

The plot that pops up in interactive mode shows spectral lines increasing in wavelength to the right. However, the arcs that are linked for comparison show spectral lines increasing in wavelength to the left. Usually, I think it would be flipped after the identify step in IRAF.

handle a bad trace

what to do if trace does a bad job automatically?

for example, if the spline extrapolates off the chip

add manual aperture selection

  • add manual aperture selection, user can click on which they want?
    • simple click interface? show the summed 1d version (easier, higher S/N)?
  • what if aperture moves slightly between exposures, drifts? should it recenter?
    • a trace_recenter keyword?

Make flats, biases, lamps optional for autoreduce

Charli Sakari's comments via email:

If one goal is to be able to run this quickly while observing in order to check data, it might be useful to be able to run the reduction without biases, flats, or standards. That way if someone has the second half of the night and is saving the cals for the end of the night they could still briefly check their data.

Similarly, it would be great to have the option to skip the flux calibration stage.

Autoreduce output object.fits.spec

These files include commas, which make it a pain to pull out the data for plotting, since the commas have to be stripped first.
In your spectra code (line 100):

fout.write(str(wfinal[k]) + ', ' + str(ffinal[k]) + ', ' + str(efinal[k]) + '\n')

suggestion: (?)

fout.write(str(wfinal[k]) + ' ' + str(ffinal[k]) + ' ' + str(efinal[k]) + '\n')

(With spaces - possibly not needed)
Would this be okay to implement?
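Another option (a suggestion, not what pydis currently does) is to hand the whole job to np.savetxt, which writes whitespace-delimited columns that np.loadtxt reads back with no delimiter handling:

```python
import numpy as np

def write_spec(fname, wfinal, ffinal, efinal):
    # whitespace-delimited columns: wavelength, flux, flux error
    np.savetxt(fname, np.column_stack((wfinal, ffinal, efinal)))

# reading it back is then just:
# w, f, e = np.loadtxt('object.fits.spec', unpack=True)
```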

ReduceTwo not breaking gracefully

Please add more grace when trying to handle a bias/flat/image list with uneven image sizes. NumPy refuses to concatenate differently sized images and fails on line 189 in biascombine:

ERROR: ValueError: all the input array dimensions except for the concatenation axis must match exactly [numpy.lib.shape_base]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "//anaconda/lib/python2.7/site-packages/pydis/wrappers.py", line 402, in ReduceTwo
    bias = pydis.biascombine(biaslist, trim=trim)
  File "//anaconda/lib/python2.7/site-packages/pydis/pydis.py", line 189, in biascombine
    all_data = np.dstack( (all_data, im_i) )
  File "//anaconda/lib/python2.7/site-packages/numpy/lib/shape_base.py", line 367, in dstack
    return _nx.concatenate([atleast_3d(_m) for _m in tup], 2)
ValueError: all the input array dimensions except for the concatenation axis must match exactly
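One way to fail gracefully (a sketch, not the current pydis code) is to check all frame shapes before stacking and raise an error that names the offending sizes:

```python
import numpy as np

def safe_stack(frames):
    """Depth-stack 2D frames, but fail with a readable message when any
    frame's dimensions differ, instead of NumPy's generic dstack error."""
    shapes = [f.shape for f in frames]
    if len(set(shapes)) > 1:
        raise ValueError('Cannot combine images with mismatched sizes: %s'
                         % shapes)
    return np.dstack(frames)
```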

flux calibration

Reproduce "sensefunc". Need to grab library of known flux standards from IRAF, compare to observed standard to get flux calibration.

Important to only get stellar continuum, mask out lines
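The basic computation could be sketched like this (a simplified stand-in for IRAF's sensfunc; the line_mask windows and the cubic smoothing are illustrative choices, not the final design):

```python
import numpy as np

def sensfunc_sketch(obs_wave, obs_counts, std_wave, std_flux, line_mask=()):
    """Rough sensitivity function: catalog flux of the standard divided by
    the observed counts, smoothed over the continuum.

    line_mask is a hypothetical list of (wmin, wmax) windows excluded from
    the fit, so only the stellar continuum constrains the solution.
    """
    # put the catalog standard fluxes on the observed wavelength grid
    std_on_obs = np.interp(obs_wave, std_wave, std_flux)
    sens = std_on_obs / obs_counts
    keep = np.ones(len(obs_wave), dtype=bool)
    for wmin, wmax in line_mask:
        keep &= ~((obs_wave > wmin) & (obs_wave < wmax))
    # smooth with a low-order polynomial in a normalized wavelength
    # coordinate, to keep the fit well-conditioned
    wz = (obs_wave - obs_wave.mean()) / obs_wave.std()
    coef = np.polyfit(wz[keep], sens[keep], 3)
    return np.polyval(coef, wz)
```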

Reduction script for multiple HeNeAr frames

Assuming the wavelength solution doesn't change much over the night, but a careful observer has taken multiple lamp images to ensure the best possible velocities can be measured:

  • Create whole new reduction script in wrappers.py
  • Use a full solution for first HeNeAr, as normally done
  • From the first solution, cheat and solve the n-th HeNeAr image, using the first solution's line positions and small tolerances (?)
  • instead of list of images and 1 HeNeAr, hand either a 2-column list or 2 lists, which indicates which HeNeAr should be solved for each image.
  • write up short explanation on the repo wiki

Close objects on slit caused trace bug

This is an interesting case using DIS-low data. I picked the peak on the right (centered near ~500, similar to the other images, and corresponding to the top star on the slit). The trace switched to the other star, probably because they are so close and have similar counts.

[Figure: trace jumping to the neighboring star]

creating this fun dip:
[Figure: reduced spectrum of KIC 5531953 showing the dip]

write reduction wrapper

once all the simple functions are in place, write reduction wrapper that uses each function in proper order:

  1. combine flats and biases
  2. trace HeNeAr, define wavelength solution
  3. remove flats and biases from all science frames
  4. extract object spectrum and subtract sky spectrum
  5. map wavelength calibration to object spectrum, extract 1d wave. cal'd spec!

be sure it works on 1 image or a list of images

sky subtraction routine

simple mode:

  • assume no tilt of wavelength solution within the trace
  • using windows above/below trace
  • fit 2d polynomial and subtract
  • do for each pixel in trace

complex mode:

  • all the same as above, but using wavelength HeNeAr mapping
  • do subtraction (and later extraction) using steps in wavelength space...
  • probably going to want to copy/mimic original IRAF code for this to do it "right"
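The simple mode above, applied to a single CCD column, might look like this (a minimal sketch; the window-size defaults are hypothetical pixel values, not pydis's):

```python
import numpy as np

def subtract_sky_column(col, trace_center, apwidth=5, skysep=3,
                        skywidth=5, order=1):
    """Fit a low-order polynomial to sky windows above and below the trace
    in one column, then subtract the fit from the whole column.

    Assumes no tilt of the wavelength solution within the trace, as in the
    simple mode described above.
    """
    y = np.arange(len(col))
    lo = ((y >= trace_center - apwidth - skysep - skywidth) &
          (y < trace_center - apwidth - skysep))
    hi = ((y > trace_center + apwidth + skysep) &
          (y <= trace_center + apwidth + skysep + skywidth))
    coef = np.polyfit(y[lo | hi], col[lo | hi], order)
    return col - np.polyval(coef, y)
```

Looping this over every column in the trace gives the simple-mode subtraction; the complex mode would instead bin the sky pixels in wavelength space first.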

real-time reduction

write a version of autoreduce that does the calibrations, then waits and reduces data in real time as it comes down from the telescope

  • for first pass, copy autoreduce functionality. assume user takes calibration files first
  • decide if want to do both RED and BLUE? Wouldn't be too hard...
  • instead of looping over images in list, do while loop or similar
  • when new image arrives, read in, reduce it, output results
  • if new HeNeAr image comes in, update wavelength solution
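The polling part of that loop could be sketched as a small helper (illustrative only; in real use this sits inside a while loop and the handler would dispatch each new frame to the reduction, or to the wavelength-solution update for lamp images):

```python
import glob

def watch_for_frames(pattern, seen, handler):
    """One polling pass: hand any files matching `pattern` that have not
    been seen before to `handler`, and remember them in `seen`."""
    new = sorted(f for f in glob.glob(pattern) if f not in seen)
    for f in new:
        handler(f)   # e.g. reduce the frame, or refit the HeNeAr solution
        seen.add(f)
    return new
```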

HeNeAr fit

take wide slice of image, find peaks

use rough wavelength info in header to guess wavelengths of peaks. example HeNeAr data

fit peaks against HeNeAr models from IRAF

trace peaks in spatial direction

generate 2d wavelength solution
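The first two steps (find peaks in a slice, guess wavelengths from the header) could be sketched as follows; the linear dispersion guess and the peak-height threshold are assumptions for illustration:

```python
import numpy as np
from scipy.signal import find_peaks

def guess_line_wavelengths(flux, w0, dw, min_height=None):
    """Find emission peaks in a 1-d arc-lamp slice and assign each a rough
    wavelength from a linear guess, w0 + dw * pixel, where w0 and dw come
    from the image header. These guesses seed the fit against the HeNeAr
    line list."""
    if min_height is None:
        # hypothetical threshold: well above the typical background level
        min_height = np.median(flux) + 3.0 * np.std(flux)
    peaks, _ = find_peaks(flux, height=min_height)
    return peaks, w0 + dw * peaks
```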

Improve Sky Subtraction

Right now sky subtraction works very locally, pixels right above/below the aperture trace.

Since wavelength space is bent/tilted, not perfectly vertical along y-axis, if you increase sky window sizes you get worse fits. This is very bad, as the sky S/N should get better as the windows get larger, not the other way around!

Fix this by doing a 2-D (surface) interpolation between the sky windows?
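One possible shape for that fix (a sketch only, using a plain least-squares polynomial surface rather than a true spline interpolation) is to fit all the sky-window pixels at once as a 2-D surface and evaluate it across the aperture:

```python
import numpy as np

def fit_sky_surface(image, sky_mask, order=2):
    """Least-squares fit of a 2-D polynomial surface to the pixels flagged
    True in sky_mask, evaluated over the whole frame. Fitting a surface
    between the sky windows lets the windows grow without the per-column
    fits degrading from the wavelength tilt."""
    ny, nx = image.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    # design matrix of monomials x^i * y^j with i + j <= order
    terms = [(xx ** i) * (yy ** j)
             for i in range(order + 1) for j in range(order + 1 - i)]
    A = np.column_stack([t[sky_mask] for t in terms])
    coef, _, _, _ = np.linalg.lstsq(A, image[sky_mask], rcond=None)
    return sum(c * t for c, t in zip(coef, terms))
```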

Clarification in documentation

I've gotten the autoreduce wrapper to work with my low-resolution asteroid data; it works really well. There needs to be a clarification on the wiki/instructions as written, though: once you've added the pydis directory to the sys.path, you should call autoreduce with

import wrappers

and then

wrappers.autoreduce(obj...

since autoreduce is defined in wrappers.py and not pydis.py

Test: UT140701

I have already reduced this data from Fall quarter, and decided to run spectra on it. However, I ran into this error:

IOError: [Errno 2] No such file or directory: 'apohenear.dat'

I checked the resources directory and did not see this file. Should I download it somewhere?

setup.py script

It is useful to have a setup.py script to describe metadata about the code, dependencies and
to more easily install the package on the system.

flux calibration unstable at the edges

I have tried to run flux calibration in pydis using the standard star G191b2b. These data cover the optical and near-UV region starting around 3000 Å. The sensitivity function using the IRAF libraries blows up at the blue end as shown in the figure below.

[Figures: sensitivity function diverging at the blue end]

If the standard star spectrum is clipped to start around 3200 Å the result is better, but when the sensitivity function is applied to the standard star data the result is still as shown in the figure below.
Is this a problem with the interpolation in DefFluxCal?

[Figure: flux-calibrated standard star with edge artifacts]

streamline interactive mode for HeNeAr fit

interactive mode should be re-done, showing

  1. panel of HeNeAr spectrum with lines to pick by clicking on them (already have)
  2. able to delete lines by clicking on them again (already have)
  3. sub-panel with wavelength residual as a function of wavelength (have later on)
  4. able to click in sub panel to adjust fit order and update everything (need!)

version checking of dependencies

a few functions pydis uses have version-specific stuff.

e.g., scipy requires v0.15 for UnivariateSpline

add a check for version of things
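A minimal version check could compare dotted version strings numerically (a sketch; note that pre-release suffixes like '1.0rc1' would need extra handling):

```python
def version_ok(version_string, minimum):
    """True if a dotted version string meets a minimum, compared
    numerically so that '0.15' correctly beats '0.9'."""
    def parse(v):
        return tuple(int(p) for p in v.split('.'))
    return parse(version_string) >= parse(minimum)

# e.g. guarding an import:
# import scipy
# if not version_ok(scipy.__version__, '0.15'):
#     raise ImportError('pydis needs scipy >= 0.15')
```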

help with pydis autorun script issue

I do not think this is an error with your pydis code.

In my autorun script, run from the data directory that holds each date directory:
I can't seem to point at each date directory and have it recognize the fits files listed in each .lis file.
The python script uses:

b= os.path.dirname(os.path.abspath('data'))+ '/UT130520/'
pydis.autoreduce( b + 'objlist.r.txt', flatlist= b + 'rflat.lis', biaslist= b + 'rbias.lis', HeNeAr_file=b + 'obj.0001r.fits', stdstar='hr7001', display=False, display_final=True)

and
print b shows /astro/store/scratch/tmp/twilki/mdwarf/data/UT130520/

Here is my Error:
File "../dislow_autorun.py", line 60, in <module>
  pydis.autoreduce( b + 'objlist.r.txt', flatlist= b + 'rflat.lis', biaslist= b + 'rbias.lis', HeNeAr_file=b + 'obj.0001r.fits', stdstar='hr7001', display=False, display_final=True)
File "/astro/store/scratch/tmp/twilki/mdwarf/pydis/wrappers.py", line 108, in autoreduce
  bias = pydis.biascombine(biaslist, trim=trim)
File "/astro/store/scratch/tmp/twilki/mdwarf/pydis/pydis.py", line 150, in biascombine
  hdu_i = fits.open(files[i])
File "/astro/apps6/anaconda/lib/python2.7/site-packages/astropy/io/fits/hdu/hdulist.py", line 129, in fitsopen
  return HDUList.fromfile(name, mode, memmap, save_backup, cache, **kwargs)
File "/astro/apps6/anaconda/lib/python2.7/site-packages/astropy/io/fits/hdu/hdulist.py", line 271, in fromfile
  save_backup=save_backup, cache=cache, **kwargs)
File "/astro/apps6/anaconda/lib/python2.7/site-packages/astropy/io/fits/hdu/hdulist.py", line 792, in _readfrom
  ffo = _File(fileobj, mode=mode, memmap=memmap, cache=cache)
File "/astro/apps6/anaconda/lib/python2.7/site-packages/astropy/io/fits/file.py", line 137, in __init__
  self._open_filename(fileobj, mode, clobber)
File "/astro/apps6/anaconda/lib/python2.7/site-packages/astropy/io/fits/file.py", line 440, in _open_filename
  self._file = fileobj_open(self.name, PYFITS_MODES[mode])
File "/astro/apps6/anaconda/lib/python2.7/site-packages/astropy/io/fits/util.py", line 419, in fileobj_open
  return open(filename, mode)
IOError: [Errno 2] No such file or directory: 'obj.0027r.fits'

wrappers.py at line 108 is just calling the pydis.biascombine
pydis.py at line 150 is:

files = np.loadtxt(biaslist,dtype='string')
for i in range(0,len(files)):
    hdu_i = fits.open(files[i])

So, this script reads rbias.lis and sees the first entry as obj.0027r.fits, but can't find the file, because the entry is relative to the date directory rather than the current working directory. I'm not sure how to fix this error, since os.path seems to be the best method to get files in Python, and os.chdir didn't work when I tried. Any thoughts?
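One workaround (a sketch, not current pydis behavior) would be for biascombine to resolve each entry relative to the list file's own directory, so relative filenames work from any working directory:

```python
import os
import numpy as np

def load_file_list(listname):
    """Read a list of FITS filenames and prepend the list file's own
    directory, so entries like 'obj.0027r.fits' resolve regardless of
    the current working directory."""
    base = os.path.dirname(os.path.abspath(listname))
    files = np.loadtxt(listname, dtype=str, ndmin=1)
    return [os.path.join(base, f) for f in files]
```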

STFU

Add comments throughout reduction to tell me things like image/bias/flat sizes, what step pydis is at, etc. then add a /silent switch (similar to beloved IDL) to STFU when needed.

Test against benchmark data

need to find some benchmark data to test this against! So far it runs for both DIS R1200 and B1200 gratings with good S/N.

How does it do with low-res? Medium S/N?

Can I find some example DIS data I trust to test it on? Maybe binary star stuff from SLH's class?

IPython notebook

The IPython notebook may throw errors when requiring interaction with graphics.

Flatcombine and Biascombine

Write flat and bias combine scripts, work over lists of images, write results to files

decide if just using median combine, or maybe allow median/mean choice
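The combine step itself is small either way; a sketch supporting both choices:

```python
import numpy as np

def combine_frames(frames, method='median'):
    """Stack equally sized 2-D frames and collapse with either a median
    (robust against cosmic rays) or a mean (slightly lower noise)."""
    stack = np.dstack(frames)
    if method == 'median':
        return np.median(stack, axis=2)
    return np.mean(stack, axis=2)
```

The median is the safer default for bias and flat stacks, since a single cosmic-ray hit in one frame cannot pull the combined pixel value.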

Wavelength solution too blue

early tests against an IRAF reduction show our resulting wavelengths are too blue... wtf?

maybe the first-slice spline is not doing a good job between identified peaks?

it's not an air-to-vac thing (though we should include that too!)

[Figure: GL 669 wavelength comparison]
