pahfit's Introduction

PAHFIT

PAHFIT is a decomposition model and tool for astronomical infrared spectra, focusing on dust and gas emission features from the interstellar medium (see Smith, J.D.T., Draine B.T., et al., 2007, ApJ, 656, 770).

This package provides an updated python implementation of PAHFIT. While the original versions of PAHFIT (v1.x) were written in IDL and focused mainly on Spitzer/IRS spectroscopic observations, the newer python-based versions (>=v2.0) will expand instrument coverage to other existing (e.g., AKARI) and planned (e.g., JWST) facilities, and will offer a more flexible modeling framework suitable for modeling a wider range of astrophysical sources.

Based on and inspired by the IDL code PAHFIT by JD Smith and Bruce Draine.

Build checks/status

Documentation Status

Test Status

Test Coverage Status

LGTM Status

Codacy Status

Packaging

Powered by Astropy Badge

Documentation

Note that as of March 2023, a significant change to the PAHFIT API and model specification was merged; the documentation does not yet reflect these changes and will be improved soon.

Hosted by readthedocs: <http://pahfit.readthedocs.io/en/latest/>

In Development!

This code is currently in active development. Contributions welcome (see below).

Contributors

  • Dries van De Putte
  • J.D. Smith
  • Karl Gordon
  • Thomas Lai
  • Alexandros Maragkoudakis
  • Els Peeters
  • Jan Cami
  • Ameek Sidhu

License

This project is Copyright (c) PAHFIT Developers and licensed under the terms of the GNU GPL v3+ license. This package is based upon the Astropy package template, which is licensed under the BSD 3-clause license. See the licenses folder for more information.

Contributing

Please open a new issue or new pull request for bugs, feedback, or new features you would like to see. If there is an issue you would like to work on, please leave a comment and we will be happy to assist. New contributions and contributors are very welcome!

New to github or open source projects? If you are unsure about where to start or haven't used github before, please feel free to contact @karllark. Want more information about how to make a contribution? Take a look at the astropy contributing and developer documentation.

Feedback and feature requests? Is there something missing you would like to see? Please open an issue or send an email to @karllark. PAHFIT follows the Astropy Code of Conduct and strives to provide a welcoming community to all of our users and contributors.

We love contributions! pahfit is open source, built on open source, and we'd love to have you hang out in our community.

Imposter syndrome disclaimer: We want your help. No, really.

There may be a little voice inside your head that is telling you that you're not ready to be an open source contributor; that your skills aren't nearly good enough to contribute. What could you possibly offer a project like this one?

We assure you - the little voice in your head is wrong. If you can write code at all, you can contribute code to open source. Contributing to open source projects is a fantastic way to advance one's coding skills. Writing perfect code isn't the measure of a good developer (that would disqualify all of us!); it's trying to create something, making mistakes, and learning from those mistakes. That's how we all improve, and we are happy to help others learn.

Being an open source contributor doesn't just mean writing code, either. You can help out by writing documentation, tests, or even giving feedback about the project (and yes - that includes giving feedback about the contribution process). Some of these contributions may be the most valuable to the project as a whole, because you're coming to the project with fresh eyes, so you can see the errors and assumptions that seasoned contributors have glossed over.

This disclaimer was originally written by `Adrienne Lowe <https://github.com/adriennefriend>`_ for a `PyCon talk <https://www.youtube.com/watch?v=6Uj746j9Heo>`_, and was adapted by pahfit based on its use in the README file for the `MetPy project <https://github.com/Unidata/MetPy>`_.

pahfit's Issues

Attenuation curve flexibility

The assumed attenuation curve is currently fixed. A more flexible model for handling attenuation should be added; for example, a 3.1 um ice feature should be added on top of the silicate extinction curve. Any ideas?
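A minimal sketch of how such an ice feature could be stacked on the silicate curve, assuming the ice feature takes the same Drude shape PAHFIT uses for dust features; the function names, default width, and mixed-geometry attenuation form here are illustrative, not the PAHFIT API:

```python
import math

def drude_profile(lam, x0, fwhm):
    """Drude profile (the shape PAHFIT uses for dust features), peak-normalized to 1 at lam = x0."""
    g = fwhm / x0
    r = lam / x0
    return g * g / ((r - 1.0 / r) ** 2 + g * g)

def total_tau(lam, tau_si, silicate_tau, tau_ice=0.0, ice_x0=3.1, ice_fwhm=0.3):
    """Total optical depth: scaled silicate curve plus an optional 3.1 um ice feature.

    silicate_tau is a callable returning the peak-normalized silicate profile."""
    return tau_si * silicate_tau(lam) + tau_ice * drude_profile(lam, ice_x0, ice_fwhm)

def attenuation(lam, **kw):
    """Mixed-geometry attenuation factor (1 - exp(-tau)) / tau."""
    tau = total_tau(lam, **kw)
    return 1.0 if tau == 0 else (1.0 - math.exp(-tau)) / tau
```

With `tau_ice` defaulting to zero, the current fixed behavior is recovered, so the extra flexibility could be opt-in per science pack.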

error: [Errno 13] Permission denied on installation

I'm trying to re-install pahfit from within a new branch containing updated files regarding issue #65, but when running python setup.py install from my astroconda environment I get the following error:

running install
running bdist_egg
running egg_info
error: [Errno 13] Permission denied

PAHFIT run time test

The latest version of PAHFIT in python runs slower than the original PAHFIT in IDL.
Need to investigate where the bottleneck is in the code. Likely to be the compound model in astropy.modeling.
It is also worth investigating whether the refactor of astropy v4.0+ makes the code slow.

Instrument Packs: What Information (and Format)

What features of the model framework should live in separate instrument packs? I can imagine:

  • Wavelength cutoffs
  • Resolution vs. wavelength (which may be a model, e.g. a polynomial fit), and may be disjoint.
  • Weighting parameters to de-weight cruddy ends of segments.

But what if you want to blend spectra together from multiple instruments?

We could obviously go much further to correct response function issues, etc., but PAHFIT is not a calibration or reduction tool.
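To make the discussion concrete, here is one possible pure-Python sketch of the information listed above, with a polynomial resolution model and a per-segment end weight; the class and field names are hypothetical, not a proposed format:

```python
from dataclasses import dataclass, field

@dataclass
class InstrumentPack:
    """Sketch of per-segment information an instrument pack might carry."""
    name: str
    wave_min: float          # wavelength cutoff (um), low end
    wave_max: float          # wavelength cutoff (um), high end
    res_poly: list = field(default_factory=lambda: [0.0])  # R(lam) polynomial coeffs, low -> high order
    end_weight: float = 1.0  # de-weighting factor for the cruddy segment ends

    def resolution(self, lam):
        """Evaluate the resolving power model R(lam) = sum c_k * lam**k."""
        return sum(c * lam ** k for k, c in enumerate(self.res_poly))

    def covers(self, lam):
        return self.wave_min <= lam <= self.wave_max

def covering_packs(packs, lam):
    """For blended multi-instrument spectra: all segments covering a wavelength."""
    return [p for p in packs if p.covers(lam)]
```

The `covering_packs` helper hints at one answer to the blending question: overlapping segments simply both claim a wavelength, and the weighting parameters arbitrate between them.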

Extra model components usage

As I was working on the Table output function I realized that 2 identical model components are used when building the final model.

In both cases it's the first element in the H2 features and ionic features lists (H2 S(7) and [ArII] respectively).

Currently, the final model has 55 components. However, there are 9 blackbodies, 25 dust features, 8 H2 features, 10 ionic features, and 1 component for the tau of the attenuation curve, adding up to 53 components in total.

Units for model pack and output format

Should add the units to the model pack and output files. This would require the model-pack ingest to convert the units to the internally used ones, and the output should probably convert back to the input units for consistency.

Should be fairly straightforward to implement, but including good robustness checks will add some complexity.
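The round trip could be as simple as remembering the input unit at ingest and dividing the conversion factor back out on output. A toy sketch; the registry contents and the choice of Jy as the internal unit are assumptions, and a real implementation would likely lean on astropy.units instead:

```python
# Hypothetical conversion-factor registry; internal flux-density unit assumed to be Jy.
_TO_INTERNAL = {
    "Jy": 1.0,
    "mJy": 1.0e-3,
    "W/m2/Hz": 1.0e26,
}

def to_internal(values, unit):
    """Convert input values to internal units; a KeyError doubles as the robustness check."""
    f = _TO_INTERNAL[unit]
    return [v * f for v in values], unit  # remember the input unit for output

def from_internal(values, original_unit):
    """Convert internal values back to the unit the input arrived in."""
    f = _TO_INTERNAL[original_unit]
    return [v / f for v in values]
```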

YAML format reader

The proposed YAML format needs a reader and appropriate integration into the class init function. Probably would be good to have the reader picked based on the file extension (e.g., switching between the YAML and straight table formats).
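A sketch of extension-based dispatch for the class init function; the two reader functions are stand-ins for the real YAML and table readers:

```python
import os

def read_yaml_pack(path):
    """Placeholder for the real YAML science-pack reader."""
    return {"format": "yaml", "path": path}

def read_table_pack(path):
    """Placeholder for the real straight-table (e.g. astropy Table) reader."""
    return {"format": "table", "path": path}

_READERS = {
    ".yaml": read_yaml_pack,
    ".yml": read_yaml_pack,
    ".ipac": read_table_pack,
    ".ecsv": read_table_pack,
}

def read_pack(path):
    """Pick the pack reader based on the file extension."""
    ext = os.path.splitext(path)[1].lower()
    try:
        return _READERS[ext](path)
    except KeyError:
        raise ValueError(f"no pack reader for extension {ext!r}") from None
```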

Script to get model packs

With the scripts installed system-wide (once PR #51 is merged), it is now possible to run pahfit from any location. But the user needs to know the full path to the model pack files, which is not ideal. One solution would be a script that downloads the model pack files into the current directory.

Expressing Constraints in Instrument and Science Model Packs

We need a simple and flexible means of expressing constraints on and between parameters, for the input specification discussed in #9.

Examples that can be attached to one parameter:

  1. Parameter is fixed: fixed (or omit). Example: temperature of starlight continuum.
  2. Parameter is bounded (below, above, or both): bounded: [p1,] or bounded: [,p2], or bounded: [p1,p2]. Examples: central intensities, wavelengths, etc.
  3. Parameter is tied to another parameter (by name?): tied: 1.15 pah_77_feature1_central_lam. Example: fixing ratios between (probably new) sub-features of PAH complexes.

Examples that might need to be expressed separately, since they govern multiple parameters:

  1. Parameter is tied to a simple function of another parameter: tied: integrated_strength(pah_11*) = 0.4 integrated_strength(pah_77*) or some such.

Instrument packs can use this notation as well.
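As a proof of concept, the single-parameter notation above is straightforward to parse. This sketch (function name and the returned dict keys are hypothetical) turns each spec string into a constraint dict; the multi-parameter functional ties would need separate handling:

```python
import re

def parse_constraint(spec):
    """Parse the per-parameter constraint notation into a dict.

    Returns e.g. {'fixed': True}, {'bounds': (lo, hi)} with None for open ends,
    or {'tied': (factor, other_parameter_name)}."""
    spec = spec.strip()
    if spec == "fixed" or spec == "":       # 'fixed' (or omitted)
        return {"fixed": True}
    m = re.fullmatch(r"bounded:\s*\[([^,\]]*),([^,\]]*)\]", spec)
    if m:                                   # bounded: [p1,], [,p2], or [p1,p2]
        lo, hi = (s.strip() for s in m.groups())
        return {"bounds": (float(lo) if lo else None, float(hi) if hi else None)}
    m = re.fullmatch(r"tied:\s*([\d.eE+-]+)\s+(\w+)", spec)
    if m:                                   # tied: 1.15 some_parameter_name
        return {"tied": (float(m.group(1)), m.group(2))}
    raise ValueError(f"unrecognized constraint: {spec!r}")
```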

Use astropy.units?

We could use astropy.units to specify the units of the spectra. Then we could ensure they are in the units we expect. It may even be transparent through the use of astropy.models as long as we set things up correctly. Worth thinking about.

Update for astropy 4.0

astropy.modeling got a major update in v4.0, so updates are needed here. As part of that update, the Drude1D and BlackBody1D functions were added to astropy (by me, actually), so they can be removed from pahfit.

Threading/parallelism in fitting spectra?

It's very common now to run PAHFIT on hundreds or more spectra over and over, from large samples to spectral cubes. This can get slow, and optimizers usually run on one processor. I wonder if there is a package or convenient way to farm fits out to multiple threads and then collect the results back together?

So the user could say "here are 1000 different spectra I need to (re)-fit", and it would thread them out appropriately and fit them simultaneously. I suspect JWST will have this issue in spades.
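The stdlib concurrent.futures interface covers exactly this pattern. A sketch, with `fit_one` standing in for a real single-spectrum PAHFIT fit; for CPU-bound optimizers, swapping ThreadPoolExecutor for ProcessPoolExecutor (same API) sidesteps the GIL:

```python
from concurrent.futures import ThreadPoolExecutor

def fit_one(spectrum):
    """Stand-in for a real single-spectrum fit; returns a result record."""
    return {"id": spectrum["id"], "chi2": sum(f * f for f in spectrum["flux"])}

def fit_many(spectra, max_workers=4):
    """Farm independent fits out to a pool; results come back in input order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fit_one, spectra))
```

Because `pool.map` preserves input order, re-fitting 1000 spectra and matching results back to their cubes or sample tables stays trivial.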

DOC: how to add a new component model

Would be good to document how to add a new component model. E.g., add it to component_models.py and add the needed options to creating a combined model and read/write/plot functions.

Redshift as free parameter

Later versions of IDL PAHFIT (never widely released) toyed with allowing redshift z to be a free parameter. It wasn't widely tested and worked poorly. Free redshift obviously introduces some real challenges for an optimizer. With a fitting pack using mostly fixed PAH band ratios, this is conceivable, however.

Create an easy access to individual components data

Currently, accessing the models of the individual components requires a bit of work, so I think it would be very useful to create a class that has the components as attributes. From what I've seen of plot_pahfit.py it would only be a matter of adapting some things, but #19 makes me question whether it is that simple.
This would make it easier to work directly with the fitted data in operations such as plotting specific components or saving individual components for another use -- e.g., the dust features can be used as input for the NASA Ames PAH IR Spectroscopic Database. In addition to the components as attributes, there could be a method that saves everything in a neat table.
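A minimal sketch of the suggested class, with components exposed as attributes plus a tabulating method; the class name is hypothetical, and the component callables are placeholders for the real fitted sub-models:

```python
class FitComponents:
    """Expose fitted components as attributes, plus a table() convenience."""

    def __init__(self, components):
        # components: {name: callable mapping wavelength -> model value}
        self._names = list(components)
        for name, func in components.items():
            setattr(self, name, func)

    def table(self, wavelengths):
        """Evaluate every component on a wavelength grid.

        Returns a header list and rows of (wavelength, *component values)."""
        header = ["wavelength"] + self._names
        rows = [[lam] + [getattr(self, n)(lam) for n in self._names]
                for lam in wavelengths]
        return header, rows
```

Usage would then read naturally, e.g. `fit.dust_features(10.0)` for plotting, or `fit.table(grid)` for exporting everything at once.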

Fixed-flavor PAH spectra

We should define fixed-ratio PAH model spectra based on noise-free templates (Smith+07) to be used in the PAHFIT model. Especially useful for absorption-dominated spectra.

pahfit class setup

We should use a class (object) for the pypahfit model and fitting. This should be an astropy.modeling class to take advantage of various built-in functions. See http://docs.astropy.org/en/stable/modeling/.
We can use the code in pyISMFit as starter code (this starter code should be removed once it has served its purpose).

Export Spectra to other (non-native) instrument packs or different resolutions

A less common but very powerful use of PAHFIT is to take a fitted spectrum or spectral average, then up-res the lines to target a different (possibly hypothetical) instrument. Right now this is a hard-coding affair, but if we had the ability to take a given PAHFIT fit result and recreate the spectrum according to arbitrary instrument packs, or at an arbitrary resolution, this would be valuable for mission planning, preparing science cases, etc.
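For unresolved lines, the re-rendering step mostly amounts to evaluating each fitted line as a Gaussian whose FWHM is set by the target resolving power, FWHM = lam / R. A sketch under that assumption (names illustrative, not the PAHFIT API):

```python
import math

def render_lines(wavelengths, lines, R):
    """Re-render fitted unresolved lines at resolving power R.

    lines: iterable of (center, integrated_strength) pairs; each line becomes
    a unit-area Gaussian of FWHM = center / R, scaled by its strength."""
    fwhm_factor = 2.0 * math.sqrt(2.0 * math.log(2.0))  # FWHM / sigma
    out = []
    for lam in wavelengths:
        total = 0.0
        for center, strength in lines:
            sigma = (center / R) / fwhm_factor
            total += (strength / (sigma * math.sqrt(2.0 * math.pi))
                      * math.exp(-0.5 * ((lam - center) / sigma) ** 2))
        out.append(total)
    return out
```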

Data coupled to instrument pack?

Would be good to couple the data to an instrument pack. Not quite sure how to do this, but it would make the association between the instrument pack and the right data explicit.

Parameter Initialization

One thing we haven't thought of yet is the prescription for selecting the starting values of parameters. We need to figure out where that belongs. Obviously starting values will depend on the instrument pack(s) selected (think line widths, etc.), but they may also depend on the science model.

Right now, in the original PAHFIT, parameter initialization is just hard-coded in, doing some simple averages to set taus and line center intensities, etc., as appropriate for Spitzer/IRS. Also, there is no direct mapping between the "PARINFO" structure, which initializes, ties, and constrains parameters for the LM optimization, and the "PAHFIT results" structure, which is the end/science-user consumable. So there is no way to chain the output of one fit to the input of another (as suggested in #13).
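In the spirit of that hard-coded IDL initialization, the data-driven pieces could look something like this sketch; the function names, the window, and the scale factor are purely illustrative:

```python
def initialize_amplitude(waves, fluxes, center, window=0.5):
    """Starting amplitude for a feature: average observed flux within
    +/- window (um) of the feature center; fall back to the peak flux
    when the center falls outside the observed range."""
    near = [f for w, f in zip(waves, fluxes) if abs(w - center) <= window]
    return sum(near) / len(near) if near else max(fluxes)

def initialize_continuum_tau(fluxes, scale=0.1):
    """Crude continuum starting point: a fixed fraction of the mean flux."""
    return scale * sum(fluxes) / len(fluxes)
```

Keeping such helpers separate from the packs would let the instrument pack supply line widths while the science model supplies the amplitude heuristics.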

License

I'm a strong proponent of GPL licenses. IDL PAHFIT is licensed under GPLv2, and I think that or GPLv3 is ideal for what we're trying to achieve. Are there license requirements for AstroPy affiliation?

Extinction

Determine extinction based on HI lines, H2 lines to help with the degeneracy of the silicate absorption band and the surrounding PAH bands.

setup automated training/check set for v2.0

Need to have a robust training set, maybe one for each model pack. It is useful to be able to quickly check whether changes to a pack or the main code "screw" up the fitting. Need data and expected fit parameter values.
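The core of such a check set is just a tolerance comparison of fitted parameters against stored expected values; a sketch (names hypothetical), where any non-empty return value signals a regression:

```python
import math

def check_fit(fitted, expected, rtol=1e-3):
    """Return the names of parameters whose fitted values drifted beyond
    rtol from the stored expected values (missing parameters count too)."""
    return [name for name, exp in expected.items()
            if not math.isclose(fitted.get(name, float("nan")), exp, rel_tol=rtol)]
```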

Bounds as optional

Update the base class to allow bounds to be optional.
If a parameter is not in the input feature dictionary, set its bounds to (None, None).
Probably should add helper functions in this code area to make this easier to do (less copying).
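A sketch of the kind of helper this suggests, defaulting to unbounded when the feature dictionary has no entry; the dictionary layout mirrors the bounds dicts seen in the traceback issue below, but the helper name is hypothetical:

```python
def get_bounds(feature, param):
    """Bounds for one parameter of a feature dict, or (None, None) if absent."""
    bounds = feature.get("bounds", {})
    return tuple(bounds.get(param, (None, None)))
```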

'Drude1D' object has no attribute 'stddev'

Hi, sorry, I ran into a problem when I tried to run pahfit. My astropy version is 4.0 and pahfit is 2.0.dev96.

here is my command:
run_pahfit M101_Nucleus_irs.ipac scipack_ExGal_SpitzerIRSSLLL.ipac

And I got these messages:
WARNING: AstropyDeprecationWarning: Class BlackBody1D defines class attributes inputs.
This has been deprecated in v4.0 and support will be removed in v4.1.
Starting with v4.0 classes must define a class attribute n_inputs.
Please consult the documentation for details.
[astropy.modeling.core]
WARNING: AstropyDeprecationWarning: Class BlackBody1D defines class attributes inputs.
This has been deprecated in v4.0 and support will be removed in v4.1.
Starting with v4.0 classes must define a class attribute n_inputs.
Please consult the documentation for details.
[astropy.modeling.core]
Traceback (most recent call last):
File "/Users/k/anaconda3/bin/run_pahfit", line 90, in <module>
pmodel = PAHFITBase(filename=packfile)
│ └ '/Users/k/anaconda3/lib/python3.7/site-packages/pahfit/packs//scipack_ExGal_SpitzerIRSSLLL.ipac'
└ <class 'pahfit.base.PAHFITBase'>
File "/Users/k/anaconda3/lib/python3.7/site-packages/pahfit/base.py", line 166, in __init__
'stddev': dust_features['fwhms_fixed'][k]})
File "/Users/k/anaconda3/lib/python3.7/site-packages/astropy/modeling/core.py", line 441, in __init__
__init__, args, kwargs, varkwargs='kwargs')
File "/Users/k/anaconda3/lib/python3.7/site-packages/astropy/modeling/core.py", line 438, in __init__
return super(cls, self).__init__(*params, **kwargs)
│ │ │ └ {'amplitude': 100.0, 'x_0': 5.27, 'fwhm': 0.17918, 'name': 'DF0', 'bounds': {'amplitude': (0.0, None), 'x_0': (5.17, 5.369999999...
│ │ └ ()
│ └ <Drude1D(amplitude=1., x_0=10., fwhm=1., name='DF0')>
└ <class 'pahfit.component_models.Drude1D'>
Name: Drude1D
N_inputs: 1
N_outputs: 1
Fittable parameters: ('amplitude', 'x_0', 'fwhm...
File "/Users/k/anaconda3/lib/python3.7/site-packages/astropy/modeling/core.py", line 715, in __init__
self._initialize_constraints(kwargs)
│ └ {'amplitude': 100.0, 'x_0': 5.27, 'fwhm': 0.17918, 'bounds': {'amplitude': (0.0, None), 'x_0': (5.17, 5.369999999999999), 'fwhm'...
└ <Drude1D(amplitude=1., x_0=10., fwhm=1., name='DF0')>
File "/Users/k/anaconda3/lib/python3.7/site-packages/astropy/modeling/core.py", line 1801, in _initialize_constraints
param = getattr(self, ckey)
│ │ └ 'stddev'
│ └ <Drude1D(amplitude=1., x_0=10., fwhm=1., name='DF0')>
└ Parameter('x_0', value=10.0)
AttributeError: 'Drude1D' object has no attribute 'stddev'

Is there a solution to this problem?

Remove old directory

Removing it keeps the code base clean; we can always get it back via the magic of git version control.
Especially as old, unmaintained code muddies the lgtm results.

Docs: Fitting problems with IDL version of PAHFIT

PAHFIT was developed to work on SINGS galaxies. Which observations were not well fitted by PAHFIT-IDL, and what was the issue?

Continuum issues:

  1. ultra-compact HII region: mismatch in continuum beyond 13um?
    [screenshot attached]

  2. LMC HII Region: PAHFIT uses the 33um dust component to correct for the mismatch in continuum emission at longer wavelengths
    [screenshot attached]

DOC: code setup philosophy

Add in the code philosophy. What is the core of the python version (astropy.modeling)? What capabilities does it enable (e.g., new components, more of the same component, constraining parameters, etc.)?

Spitzer based test code

Should start with getting the code running on Spitzer IRS data.
Check the results against the IDL code results.

Interface to MCMC codes

Would be great if we could use the same setup with an MCMC code. One obvious choice is the emcee package.
There is some work in astropy to enable this by adding the ability to attach a prior to the parameter class used by all models; it is currently suspended until the refactoring of the astropy.modeling classes is done (soon).
When that lands in astropy, add it to PAHFIT.
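Whatever package is used, the glue needed is a log-posterior that combines per-parameter priors with the existing least-squares likelihood. A sketch with flat priors inside the fit bounds; the model callable and the prior shape are placeholders, not the emcee or astropy API:

```python
import math

def log_prior(params, bounds):
    """Flat prior: 0 inside the per-parameter bounds, -inf outside.

    bounds entries are (lo, hi), with None meaning unbounded on that side."""
    for p, (lo, hi) in zip(params, bounds):
        if (lo is not None and p < lo) or (hi is not None and p > hi):
            return -math.inf
    return 0.0

def log_posterior(params, bounds, model, waves, fluxes, sigma=1.0):
    """Log prior plus Gaussian log-likelihood (up to a constant): -chi2/2."""
    lp = log_prior(params, bounds)
    if not math.isfinite(lp):
        return -math.inf
    chi2 = sum(((f - model(w, params)) / sigma) ** 2
               for w, f in zip(waves, fluxes))
    return lp - 0.5 * chi2
```

A function of this shape is exactly what ensemble samplers expect as their probability callable, so it could sit alongside the existing optimizer path without disturbing it.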

PAHFIT Model Component Extraction

One of the most useful (and under-used) features of the original PAHFIT is the pahfit_components function, which looks like this:

function pahfit_components,lambda,decoded_params, $
                           DUST_CONTINUUM=dust_continuum, $
                           TOTAL_DUST_CONTINUUM=dc_tot,STARLIGHT=stars, $
                           TOTAL_CONTINUUM=tot_cont, $
                           DUST_FEATURES=dust_features, $
                           TOTAL_DUST_FEATURES=df_tot, $
                           LINES=lines,TOTAL_LINES=lines_tot, $
                           EXTINCTION_FAC=ext, $
                           EXTINCTION_CURVE=ext_curve, $
                           DISABLE_EXTINCTION=de,LAMBDA_REST=lam_rest

This code allows you to take a decoded_params structure, which is output from a PAHFIT run, and selectively pull pieces from it: the total continuum, all dust features, each feature separately, etc. This is used to produce the canonical multi-colored plots we all know and love, but it's also essential for using model results more deeply than just examining the output parameters. We need something similar.
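A Python analogue could evaluate named groups of components on demand rather than threading everything through keyword arguments; in this sketch the function name, grouping scheme, and component names are illustrative:

```python
def pahfit_components(lam, components, groups):
    """Selectively evaluate fitted components, in the spirit of the IDL
    pahfit_components function.

    components: {name: callable mapping wavelength -> value}
    groups:     {group_name: [component names]}
    Returns per-group totals plus the grand total at wavelength lam."""
    out = {}
    for group, names in groups.items():
        out[group] = sum(components[n](lam) for n in names)
    out["total"] = sum(func(lam) for func in components.values())
    return out
```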

Error when reading input file and unit conversion

In the current version of the code the input file format is decided from the input file extension.
If I run pahfit on the example spectra in the data directory, it raises a "No reader defined for format ..." error, followed by a table of the available formats.

This is strange because .ipac is included in the available formats, but "M101_Nucleus_irs.ipac" gives the above error.

The same happens with "orion_bar_SWS_with7023shape.txt", although if I rename the file to .ascii it will read the file.

But it will then raise an error in the following line, where the unit conversion happens, because no units were assigned on read.

So currently we can't run pahfit as run_pahfit spectrumfile scipack_ExGal_SpitzerIRSSLLL.ipac without modifying the code, which makes testing harder.

Infrastructure updates notice from Astropy Project

Hello from Astropy Project!

The following updates to your package might be necessary to ensure compatibility with the latest stable version of Astropy:

  • MPLBACKEND is now set to Agg in ci-helpers, packages expecting interactive plotting should override it in .travis.yml
  • Astropy 3.1 is not compatible with Numpy <1.13 versions. If you want to keep testing these older Numpy versions, please use ASTROPY_VERSION=3.0 or ASTROPY_VERSION=LTS in your Travis CI matrix.
  • Add sphinx-astropy as a package dependency if you are using astropy-helpers 3.1 or later. Otherwise, your documentation build (e.g., on ReadTheDocs) might fail.
  • If you are using six that is bundled with Astropy, please consider using the standalone six package instead.

If these are no longer applicable to you, please close the issue.

This is an automated issue for packages that opted in for automated astropy-helpers update. If this is opened in error, please let @pllim know.

xref astropy/astropy-tools#108
