
piff's Introduction

PIFF: PSFs In the Full FOV

Piff is a Python software package for modeling the point-spread function (PSF) across multiple detectors in the full field of view (FOV).

Features: (Some are aspirational. We're still working on the code!)

  • Has multiple basis sets for the underlying PSF model, including pixel-based, shapelets, Gaussian mixture, maybe also Moffat and/or Kolmogorov.
  • Can build the models in either chip or sky coordinates, properly accounting for the WCS of the image.
  • Can interpolate across the full field-of-view, or across each chip separately, or a combination of both.
  • Can do the fitting in either real or Fourier space.
  • Has multiple interpolation functions including polynomials, gaussian processes, and others.
  • Can take knowledge of the optical aberrations as input to convolve with the model of the atmospheric PSF.
  • Performs outlier rejection to detect and remove stars that are not good exemplars of the PSF. Outputs the list of stars that were actually used to build the final model.
  • Allows the centroid to be fixed or floating.
  • In general, allows any value to be fixed rather than fit for.
  • Uses highly readable YAML configuration files to set the various options.
  • Includes Python code to read in the PSF files and use them to draw an image of the PSF at an arbitrary location.
  • Currently, the lead developers are:
    • Mike Jarvis (mikejarvis17 at gmail)
    • Josh Meyers (jmeyers314 at gmail)
    • Pierre-Francois Leget (pierrefrancois.leget at gmail)
    • Chris Davis (chris.pa.davis at gmail)

If you'd like to join the development effort, or if you have any other questions or comments about the code, feel free to contact us at the above email addresses.
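As an illustration of the YAML-driven workflow, here is a minimal sketch of a configuration file for the piffify script. The overall input/psf/output layout follows what is referenced elsewhere in this document, but the specific option names and values are illustrative and may not match your Piff version; consult the full documentation for the real set of options.

```yaml
# Illustrative sketch of a Piff config file -- option names are examples,
# not a guaranteed match for any particular Piff version.
input:
    image_file_name: some_exposure.fits
    cat_file_name: some_catalog.fits
psf:
    model:
        type: PixelGrid        # or a Gaussian, Kolmogorov, shapelet model, ...
    interp:
        type: Polynomial       # or a Gaussian process interpolator, ...
output:
    file_name: psf_model.piff
```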

Installation

The easiest way to install Piff is with pip:

pip install piff

If you have previously installed Piff and want to upgrade to a new released version, you should do:

pip install piff --upgrade

Depending on the write permissions of the python distribution for your specific system, you might need to use one of the following variants:

sudo pip install piff
pip install piff --user

The latter installs the Python module into ~/.local/lib/python3.7/site-packages, which is normally already in your PYTHONPATH, but it puts the executables piffify and meanify into ~/.local/bin, which is probably not in your PATH. To use these scripts, you should add this directory to your PATH. If you would rather install into a different prefix than ~/.local, you can use:

pip install piff --install-option="--prefix=PREFIX"

This would install the executables into PREFIX/bin and the Python module into PREFIX/lib/python3.7/site-packages.
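For instance, assuming the default --user script location mentioned above (the exact path may differ on your system), one way to make piffify and meanify visible is to add the directory to PATH in your shell startup file:

```shell
# Add pip's --user script directory to PATH so that piffify and meanify
# are found. ~/.local/bin is the common default; adjust for your system.
export PATH="$HOME/.local/bin:$PATH"
```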

If you need the bleeding edge version on the main branch, you can download or clone the repo and install with:

pip install -r requirements.txt
pip install .

Depending on your system, you might prefer/need one of these variants:

sudo pip install .
pip install . --user
pip install . --install-option="--prefix=PREFIX"

Running Tests

After installing Piff, you can run the unit tests by doing:

cd tests
nosetests

Using Piff

A tutorial notebook giving an overview of how to use Piff is available in the examples directory, called Tutorial.ipynb.

This is not a comprehensive tour of Piff's capabilities of course, but it should provide a rough guide to the basic structure.

Full documentation is available at:

http://rmjarvis.github.io/Piff/

Reporting bugs

If you have any trouble installing or using the code, or if you find a bug, an error in the documentation, or have any other problem, please report it at:

https://github.com/rmjarvis/Piff/issues

Click "New Issue", which will open up a form for you to fill in with the details of the problem you are having.

Requesting features

If you would like to request a new feature, do the same thing. Open a new issue and fill in the details of the feature you would like added to Piff. Or if there is already an issue for your desired feature, please add to the discussion, describing your use case. The more people who say they want a feature, the more likely we are to get around to it sooner rather than later.

piff's People

Contributors

aaronroodman, beckermr, cpadavis, eacharles, esheldon, gbernstein, gogrean, jmeyers314, joezuntz, pfleget, rmjarvis, theoschutt


piff's Issues

draw does not use input image

In the docs for `draw` it says: "image: An existing image on which to draw, if desired. [default: None]"

However when I send it an existing image, it still seems to create and return a new image instead. It does use the shape of the input image.

`nosetests test_poly` isn't thread safe

A few times every ten attempts or so, I'm getting errors when running nosetests test_poly.py from the test directory. Looks like test_poly_load_save and test_poly_load_err are both calling poly_load_save_sub on different threads, leading to problems.

piff_des/tests/run_data_fits.py fails to read psf from config.yaml file

run_data_fits.py runs into some trouble when attempting to read the config.yaml file. Specifically, I copied this file from "/nfs/slac/g/ki/ki18/cpd/Projects/piff_des/configs/des_optatmo.yaml" and only changed its name from "des_optatmo.yaml" to "config.yaml." It seems to run into trouble when trying to read the use_gradient boolean. Is "use_gradient" a new feature in config.yaml? If so, is there something in psf.py that needs to be changed to work properly with it?

Traceback (most recent call last):
  File "tests/run_data_fits.py", line 243, in <module>
    run(**kwargs)
  File "tests/run_data_fits.py", line 108, in run
    psf = piff.PSF.process(config['psf'], logger=logger)
  File "/u/ec/aresh/Piff-wavefront/piff/psf.py", line 82, in process
    psf = cls(**kwargs)
TypeError: __init__() got an unexpected keyword argument 'use_gradient'

SequencePSF

I believe the OptAtmoPSF can be divided into two separate pieces:

  1. Fitting optics parameters and simultaneously fitting tied atmospheric parameters for every star.
  2. Independently fitting atmospheric parameters for every star while holding the optics parameters fixed.

I can imagine adding additional steps too: For example, iterating between fitting optics and fitting atmospheric parameters.

The HSC PSF model presented by Claire Saunders on the PSF TF telecon also has a sequence structure:

  1. Fit parameters of Moffat profile.
  2. Fit pixel grid model to residuals.

If we're being really ambitious, then I could also imagine following the OptAtmoPSF model with a PixelGrid over the residuals.

In all cases, I think the intermediate steps form valid PSF objects. This makes me wonder if we ought to develop a SequencePSF abstraction that would take an ordered list of other PSF objects as input, plus some kind of mapping to tell which parameters of PSF2 come from PSF1, and apply those PSFs in order. This may also help break OptAtmoPSF down into smaller pieces.
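As a sketch of what such an abstraction might look like (all names here are hypothetical, not existing Piff API): each stage is fit in order, and an optional per-stage mapping hands parameters from the previous stage to the next.

```python
class SequencePSF:
    """Hypothetical sketch of the proposed SequencePSF abstraction.

    Applies a list of PSF-like fitting stages in order; each stage may
    receive (via a user-supplied mapping) parameters fixed from the
    previous stage's fit, as in the OptAtmoPSF decomposition above.
    """

    def __init__(self, stages, param_maps=None):
        self.stages = stages                          # ordered PSF-like objects
        self.param_maps = param_maps or [None] * len(stages)

    def fit(self, stars):
        params = None
        for stage, pmap in zip(self.stages, self.param_maps):
            if pmap is not None and params is not None:
                # Hand off (a mapping of) the previous stage's parameters
                # to be held fixed in this stage.  set_fixed is hypothetical.
                stage.set_fixed(pmap(params))
            params = stage.fit(stars)
        return params
```

A usage sketch: `SequencePSF([optics_psf, atmo_psf], [None, extract_atmo_params])` would fit the optics stage first, then fix the mapped parameters while fitting the atmosphere.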

Update example des.yaml

Two problems I've run into while trying to run the example des.yaml.

  1. In the des.yaml, it complained about not getting "image_file_name" and "cat_file_name". I commented out the "images" and "cats" lines and fed it a specific image and catalog... it also didn't like the 'chipnums' line.

  2. in psf.py it seems like the argument "copy_image" is obsolete?
    File "/Users/chihwaychang/anaconda3/envs/galsim/lib/python3.6/site-packages/Piff-0.3-py3.6.egg/piff/psf.py", line 196, in
    TypeError: drawStar() got an unexpected keyword argument 'copy_image'

Running Tests

I installed piff on my MacBook Pro (OS X Yosemite Version 10.10.3) with the "Piff-master" directory placed inside my home directory. However, I was not successful when I tried to run some tests. I'm not sure if this is because something went wrong with my installation or because I did not run the tests in the right way.

First, I tried out what was suggested: I moved from the "Piff-master" directory into the "tests" directory and then used the command "nosetests."

However, I got the error "-bash: nosetests: command not found."

Second, I tried to see if I could run some of the test programs directly. For example, I moved from the "Piff-master" directory into the "tests" directory and then used the command "python test_gp_interp.py." I did the same for the other test programs.

For all of the test programs I got the same error:

" import piff
File "/Library/Python/2.7/site-packages/Piff-0.1-py2.7.egg/piff/init.py", line 86, in
from .gp_interp import GPInterp, ExplicitKernel, AnisotropicRBF
File "/Library/Python/2.7/site-packages/Piff-0.1-py2.7.egg/piff/gp_interp.py", line 75
exec(execstr, globals(), locals())
SyntaxError: unqualified exec is not allowed in function '_eval_kernel' it contains a nested function with free variables"

I'm not sure why I am unable to run tests. Is it possible to run tests only by using the "nosetests" command? If so, is there a "nosetests" file I am likely missing? If not, is it instead likely that something went wrong with my installation of piff?

Gaussian process interpolation

Add an interpolation scheme that implements some kind of Gaussian process interpolation.

If we can successfully model the optical portion of the PSF separately, and we are left with something that is close to pure atmosphere, then I think this part should be well modeled by a Gaussian process, which means that GP interpolation (e.g. kriging) should do a better job than polynomials.

This issue is to write an interpolator that implements this.

Use open symbols for plotting negative rho statistic values

Using solid and dashed lines to connect positive and negative correlation points helps somewhat, but I think it would be much easier to quickly interpret the rho plots if the symbols used for the negative points were open (and filled for positive points).

Unit Tests

I wanted to clarify if I have a couple of points correct with regard to my understanding of the "test_optatmo.py" unit test.

  1. It seems the main incomplete testing function is "def test_optical_fit()." It looks like the reason it is incomplete is because the misalignment errors from the minuit fitting are not trustworthy and a way should be devised to check the minuit errors in a more direct way to see if they match up with the errors we currently get when running this unit test.

  2. The only other incomplete testing function is "def create_synthetic_optatmo()." It seems this is incomplete because stars are drawn with it only with the optical PSF and not the atmospheric PSF. At this point, I'm assuming nothing needs to be added to this function until after some new features are added to the atmospheric PSF to make it ready for testing.

Write pixel-based model class

@suchyta1 and @gbernstein have started work on the design of this class already.

This will be the first of our LinearModel classes, so we should put some thought into what aspects of this are general enough to be extracted to a middle-tier base class for other linear models.

During the hack day, we thought the following would be useful features for arbitrary linear models:

  1. Be able to compute the matrix that transforms from model parameters into pixel values
  2. Also compute the derivatives of this wrt a centroid shift

Then the LinearModel class could use these matrices to handle the fitting and centroiding (if we are doing that -- we might want to hold nominal centers fixed) in a completely general way that will also apply to e.g. shapelet models.

Chip-based interpolation

We currently have a polynomial interpolator that can take either field positions or chip positions as the interpolation variables. So to do chip-based interpolation now, you can just define the field of view to be a single chip, and you're fine.

However, if we do multi-layer interpolation, then we'll need to have an interpolator that works separately for each chip and properly keeps track of the chip numbers and such. This issue is to make that interpolator.

piff failing tests

PIFF master
latest anaconda on ubuntu 16.04
galsim master

[esheldon@forest tests] nosetests
/home/esheldon/miniconda3/lib/python3.5/site-packages/matplotlib-2.0.0rc1-py3.5-linux-x86_64.egg/matplotlib/font_manager.py:279: UserWarning: Matplotlib is building the font cache using fc-list. This may take a moment.
  warnings.warn('Matplotlib is building the font cache using fc-list. This may take a moment.')
/home/esheldon/miniconda3/lib/python3.5/site-packages/matplotlib-2.0.0rc1-py3.5-linux-x86_64.egg/matplotlib/font_manager.py:279: UserWarning: Matplotlib is building the font cache using fc-list. This may take a moment.
  warnings.warn('Matplotlib is building the font cache using fc-list. This may take a moment.')
................./home/esheldon/miniconda3/lib/python3.5/site-packages/galsim/phase_psf.py:404: UserWarning: Input pupil plane image may not be sampled well enough!
Consider increasing sampling by a factor 28.444900, and/or check PhaseScreenPSF outputs for signs of folding in real space.
  "PhaseScreenPSF outputs for signs of folding in real space."%ratio)
..........................E.....
======================================================================
ERROR: Test the simple case of one image and one catalog.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/esheldon/miniconda3/lib/python3.5/site-packages/nose/case.py", line 198, in runTest
    self.test(*self.arg)
  File "/home/esheldon/git/Piff/tests/test_simple.py", line 271, in test_single_image
    psf = piff.read(psf_file)
  File "/home/esheldon/miniconda3/lib/python3.5/site-packages/Piff-0.1-py3.5.egg/piff/psf.py", line 381, in read
    return PSF.read(file_name, logger=logger)
  File "/home/esheldon/miniconda3/lib/python3.5/site-packages/Piff-0.1-py3.5.egg/piff/psf.py", line 223, in read
    with fitsio.FITS(file_name,'r') as f:
  File "/home/esheldon/miniconda3/lib/python3.5/site-packages/fitsio/fitslib.py", line 351, in __init__
    raise IOError("File not found: '%s'" % filename)
OSError: File not found: 'output/simple_psf.fits'
-------------------- >> begin captured stdout << ---------------------
corrupting star at  567.25 1094.32
corrupting star at  1532.74 1743.83
mean =  [ 1.29997811  0.22998814 -0.17001087]

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
Ran 49 tests in 82.006s

FAILED (errors=1)

Model image plots

It would be nice to see the images for our model PSFs along side star (data) images and residuals.

Variable stellar positions

In some use cases we will want to enforce a centering criterion on the PSF model (e.g. that its centroid is zero) and leave the positions of the stars as free parameters. We can begin with the other case (stellar positions are fixed a priori and PSF model is allowed to have non-zero centroid), but this issue is to collect work that would need to be done to implement the former case. Some aspects of this:

  • stardata class will need to support redefinition of the (x,y) of the star.
  • Models will need to provide some derivatives of the PSF with respect to (x,y), or perhaps just a routine to re-optimize (x,y) with the current values of PSF parameters
  • Models will need to define and implement the centering constraints on the PSF model and enforce it while fitting for PSF parameters.

Kolmogorov model

Add a Kolmogorov model, which I think will look a lot like a Gaussian model in implementation. Combined with a good interpolator, this could be useful for modeling the atmospheric part of a PSF.

Before I start, any recommendations for the .fit method? import lmfit? Rescale HSM result? ...?

Rejected stars in individual PSFs are not propagated to SingleChipPSF

In SingleChipPSF, we provide a PSF which is deep-copied for each chip. Then we feed it stars based on each chip, which it then fits. Here's the problem:

If your psf has outlier rejection, and stars are actually rejected, this doesn't get propagated back up to SingleChipPSF.stars. That is, len(SingleChipPSF.psf_by_chip[blah].stars) <= len([star for star in SingleChipPSF.stars if star['chipnum'] == blah]). This is bad because our Statistics objects will assess the PSF based on SingleChipPSF, so you're going to get a bunch of outliers that look terrible -- because they were rejected during the fit!

I'm not sure of a clean way to fix this. Maybe some sort of function that just takes all the psf_by_chip's stars and replaces the SingleChipPSF stars attribute? so:

    # in SingleChipPSF.fit(), having run the psf fit
    # update the stars
    self.stars = [star for key in self.psf_by_chip for star in self.psf_by_chip[key].stars]

if that's cool with folks, I'll just go ahead and do it, but I wanted to check here in case the current behavior was the expected one

Jobs Being Deleted Without Email Message

I wrote a program to run notest_optatmo.py several times with different numbers of stars as jobs. However, something went wrong with one set of jobs that I sent, which caused them to continue to run without stopping for a while. To try to find out what was going on, I rewrote my program and ran it again with changes. However, it looks like any new jobs I send to the farm are being deleted soon after they are sent. I'm noticing that the "EXEC_HOST" is now always "fell_some_number" instead of "hequ_some_number."

The jobs that mysteriously disappear don't appear in my email. I also manually ran notest_optatmo.py and sent it as a job, but that job also disappeared without a message in my email. Is there a large backlog of jobs currently preventing new ones from being finished?

python setup.py build fails

Trying to run python setup.py build fails with:

Traceback (most recent call last):
  File "setup.py", line 339, in <module>
    if (dist.script_install_dir not in os.environ['PATH'].split(':') and
AttributeError: Distribution instance has no attribute 'script_install_dir'

Multi-layer interpolation

I think it will be useful to have the option of having multiple interpolator apply to a single data set. e.g. 5th order polynomial across full field plus separate 1st order corrections on each chip. Or GP in the full field with a polynomial tweak on each chip. That kind of thing.

This issue is for getting the overall structure to allow multiple interpolators. I think since we are generally going to have an iteration on the interpolation anyway (for outliers, handling missing data, etc.) it would be easiest to have a primary interpolator go first, then use a second one on the residuals, and iterate from there.
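The primary-then-residuals idea might be sketched as follows (hypothetical interpolator interface with fit/predict methods; not Piff code):

```python
def multilayer_interp(positions, values, primary, secondary):
    """Hypothetical sketch of multi-layer interpolation.

    Fit the primary interpolator (e.g. a full-field polynomial) to the
    data, then fit the secondary (e.g. a per-chip tweak) to the residuals
    the primary leaves behind; the combined prediction is their sum.
    In practice this would be iterated along with outlier rejection.
    """
    primary.fit(positions, values)
    residuals = [v - primary.predict(p) for p, v in zip(positions, values)]
    secondary.fit(positions, residuals)
    return lambda p: primary.predict(p) + secondary.predict(p)
```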

Make it easier to specify how the PSF profile should be centered on an image

Currently psf.draw(x,y) centers the PSF profile at the sub-pixel values of the provided x,y positions.

This is sometimes a desirable feature, but not always. We should make it easier for the user to specify how they want the PSF image centered. I'm proposing a new center keyword argument to the draw command with the following options:

center=None   # (default) the current behavior using (x,y) for the sub-pixel position.
center=True     # center the profile at the actual center of the image.
center=pos      # center the profile at some other arbitrary position on the image.
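The proposed keyword could resolve to a drawing position roughly like this sketch (a hypothetical helper for illustration, not the actual draw implementation):

```python
def resolve_center(x, y, image_bounds, center=None):
    """Hypothetical sketch of the proposed `center` logic for draw.

    Maps the keyword onto the position where the PSF profile would be
    centered on the image; image_bounds is (xmin, xmax, ymin, ymax).
    """
    if center is None:
        return (x, y)                      # current behavior: sub-pixel (x, y)
    if center is True:
        x0, x1, y0, y1 = image_bounds      # actual center of the image
        return ((x0 + x1) / 2.0, (y0 + y1) / 2.0)
    return center                          # arbitrary user-supplied position
```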

Optics PSF Subclass

In discussions with @cpadavis, we have decided that the Optics PSF should have its own PSF class, which has the same structure as the existing PSF class, but with rewritten functions as appropriate for the way the optics model works.

It should also be able to return a GSObject in order to facilitate convolutions.

How to pick a stamp size

The default stamp size is 48. However in the examples I've seen from DES most of this area is identically zero. Is there a way to determine what the appropriate stamp size should be for a given PSF?

No "config.yaml" example file with latest new features

To test the new run_data_fits.py script, I sought to try it out on a config.yaml example file. I tried looking for one in the directory /nfs/slac/g/ki/ki19/des/cpd/piff_data/test but there was none. Is there a config.yaml example file saved somewhere where I can test the new features .yaml files now have (such as modifying the nstars parameter)?

Shapelet model

We should have a shapelet model.

This will be another LinearModel class, so I'll hold off doing much work on this until @suchyta1 gets farther along on #5 and that design starts taking shape.

One additional design wrinkle to this one is that we'll want some way to decide on the sigma parameter to use for the shapelets. This sets the overall scale of the shapelet model, and it cannot be varied along with the other coefficients. At least not while keeping it a linear model.

Also, I think we will want to use a single sigma value for the whole field, not let each star determine a sigma on its own. Mostly because sigma is degenerate with other coefficients (at first order), so the interpolation will be messed up if these are allowed to change from star to star.

So I think we will want some kind of MetaFit operation that will take all the stars and do something simple first to get sigma, then take a mean value of that across all the stars in the field, then go back and do the full fits on each star using a fixed value of sigma. So there is a bit of interesting API design here that I'm happy to take input on if anyone has ideas.
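The two-pass MetaFit idea might look roughly like this sketch, where quick_sigma and full_fit are hypothetical stand-ins for the cheap per-star sigma estimate and the full shapelet fit:

```python
def fit_with_common_sigma(stars, quick_sigma, full_fit):
    """Hypothetical sketch of the MetaFit operation described above.

    First pass: estimate sigma cheaply for each star and average over
    the field.  Second pass: redo the full shapelet fit on each star
    with sigma held fixed at that field-wide mean, avoiding the
    sigma/coefficient degeneracy during interpolation.
    """
    sigmas = [quick_sigma(s) for s in stars]
    sigma = sum(sigmas) / len(sigmas)      # one sigma for the whole field
    return [full_fit(s, sigma=sigma) for s in stars]
```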

Unit tests failing

test_chipnum() is failing for me with

E           galsim.errors.GalSimConfigError: No input image_file_name available for type = Current
E           Error generating Current value with key = image_file_name

This is master piff and master GalSim.

Make Star class the fundamental class for interacting with both data and fits

In #5, we added a StarFit class to keep track of the fitting information connected to each star. We also added a Star class that doesn't do much more than hold a StarData and StarFit instance as star.data and star.fit. This is convenient to keep track of some of the bookkeeping involved so the fit information stays connected to the relevant data even when we do things like remove outliers.

All the relevant functions are in either StarData or StarFit, and most operations (interpolation, drawing, refitting, etc.) return a new Star object with one or the other component replaced with an updated value.

The problem with this is that some functions still expect a StarData object as an input parameter. So I'd like to clean those up to take a Star instead to be more uniform in the calling parameters.

Similarly, it would be convenient if the input process would create Stars rather than StarData instances, probably with star.fit = None. This would save some boiler plate in the user procedures of calling makeStar to turn a StarData into a Star.

Finally, I think we can move the methods from StarData and StarFit into Star, so the Star would become the fundamental class through which users would interact with the data. Internally, we would still keep the data and fit attributes, since typically only one of these need to be changed by the various functions we have. But that seems like an implementation detail that users shouldn't need to care about.

Model including convolution by another model

One of our Models should be a convolution of an atmospheric PSF model with a (presumably known much better) optical model from wavefront/out-of-focus data.

The main design work here for now is to write a Model class that is a convolution of two other models, one of which is (or at least can be) held fixed and the other has the parameters that are actually being fit.

I'm happy to work on the overall API of this class and make sure data can be passed around the right ways. Then once that's working, I'll ask @cpadavis, @mclaughlin6464, and @aaronroodman to fold in their optical PSF model as the thing we are convolving. But I'll start with something simpler that is easy to test.

Plotify

It would be nice to be able to take an existing piff yaml + output file, modify the plots section of the yaml file, and then remake the plots but without rerunning the actual fit. I suggest "plotify" to be a script that reads a normal looking piff yaml file, but reads the .piff file in instead of writing it out, and then also remakes the plots.

LSSTDESC.Coord easy install sandbox violation

Running the setup command ran into issues with installing LSSTDESC.Coord automatically with easy_install, complaining about a sandbox violation error; the first few and last dozen or so lines are copied below.

$ sudo python setup.py install

Using setuptools version 28.8.0
Python version = 2.7.12 (default, Dec 4 2017, 14:50:18)
[GCC 5.4.0 20160609]
packages = ['piff', 'piff.des']
Checking that GalSim can be imported
GalSim is version 1.4.2
Piff version is 0.3

and last dozen lines:

Installed /usr/local/lib/python2.7/dist-packages/Piff-0.3-py2.7.egg
Processing dependencies for Piff==0.3
Searching for LSSTDESC.Coord
Reading https://pypi.python.org/simple/LSSTDESC.Coord/
Downloading https://pypi.python.org/packages/98/24/51b5ec9c4480508f786fe3fe09c7b441d79160ce4a879d1ed9977a585946/LSSTDESC.Coord-1.0.5.tar.gz#md5=317f0a789100aaa461d1ea1167ce2a8f
Best match: LSSTDESC.Coord 1.0.5
Processing LSSTDESC.Coord-1.0.5.tar.gz
Writing /tmp/easy_install-sEKLtA/LSSTDESC.Coord-1.0.5/setup.cfg
Running LSSTDESC.Coord-1.0.5/setup.py -q bdist_egg --dist-dir /tmp/easy_install-sEKLtA/LSSTDESC.Coord-1.0.5/egg-dist-tmp-chpl_K
Python version = 2.7.12 (default, Dec 4 2017, 14:50:18)
[GCC 5.4.0 20160609]
sources = ['src/Angle.cpp']
headers = ['include/Angle_C.h']
error: SandboxViolation: symlink('../include/', 'coord/include') {}

The package setup script has attempted to modify files on your system
that are not within the EasyInstall build area, and has been aborted.

This package cannot be safely installed by EasyInstall, and may not
support alternate installation locations even if you run its setup
script by hand. Please inform the package's author and the EasyInstall
maintainers to find out if a fix or workaround is available.

However, I got around this and successfully installed Piff after:
$ sudo pip install LSSTDESC.Coord

WCS too complicated for FITS

We will need to figure out how to incorporate information for WCS that are beyond the ability of the FITS convention to represent (for example tree rings, piecewise models for edge distortions, compound maps). My C++ astrometry code saves these maps as YAML files, but we will have to think about whether there is a way to do this short of writing Python code to interpret these YAML files.

Implement outlier rejection

We need some machinery to handle finding outliers in the interpolation and removing them (and redoing the interpolation).

In the config structure, I think this can be a parameter within the interp field, but in the code, the functionality will be separate from any particular Interpolation class, working with the residuals between the measurements and the interpolated values.

I volunteer to do this, and I plan to implement a few different options. This article has a good discussion of the typical kinds of outlier rejection schemes that are usually employed. I think we should implement their first three options (based on sigma, median absolute deviation, and quartiles), and probably follow their lead in making the MAD scheme the default.
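As a concrete illustration of the MAD option mentioned above, here is a minimal stdlib-only sketch (not Piff's actual implementation). The 1.4826 factor rescales the MAD so it estimates sigma for Gaussian residuals.

```python
import statistics

def mad_outliers(residuals, nmad=4.0):
    """Flag outliers via the median absolute deviation (MAD).

    Sketch of the MAD scheme: points more than nmad rescaled MADs from
    the median are flagged.  The 1.4826 factor makes the MAD a consistent
    estimator of sigma for Gaussian residuals.
    """
    med = statistics.median(residuals)
    mad = statistics.median(abs(r - med) for r in residuals)
    sigma = 1.4826 * mad
    return [abs(r - med) > nmad * sigma for r in residuals]
```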

Singular matrix in PixelGrid.reflux

Sometimes the PixelGrid matrix solution will involve a singular matrix according to numpy.linalg:

Building SimplePSF
Iteration 1: Fitting 367 stars
             Total chisq = 341418.36 / 89248 dof
Iteration 2: Fitting 367 stars
             Removed 3 outliers
             Total chisq = 42972.41 / 88515 dof
Iteration 3: Fitting 364 stars
             Removed 3 outliers
             Total chisq = 27166.78 / 87793 dof
Iteration 4: Fitting 361 stars
             Removed 3 outliers
             Total chisq = 23088.40 / 87059 dof
Iteration 5: Fitting 358 stars
             Removed 3 outliers
             Total chisq = 21235.73 / 86318 dof
Iteration 6: Fitting 355 stars
             Removed 2 outliers
             Total chisq = 20214.33 / 85828 dof
Iteration 7: Fitting 353 stars
ERROR: LinAlgError: Singular matrix [numpy.linalg.linalg]
Traceback (most recent call last):
  File "/astro/u/mjarvis/bin/piffify", line 5, in <module>
    pkg_resources.run_script('Piff==0.1', 'piffify')
  File "/opt/astro/SL64/anaconda/lib/python2.7/site-packages/pkg_resources.py", line 505, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/opt/astro/SL64/anaconda/lib/python2.7/site-packages/pkg_resources.py", line 1245, in run_script
    execfile(script_filename, namespace, namespace)
  File "/direct/astro+u/mjarvis/lib/python2.7/site-packages/Piff-0.1-py2.7.egg/EGG-INFO/scripts/piffify", line 93, in <module>
    main()
  File "/direct/astro+u/mjarvis/lib/python2.7/site-packages/Piff-0.1-py2.7.egg/EGG-INFO/scripts/piffify", line 90, in main
    piff.piffify(config, logger)
  File "/astro/u/mjarvis/lib/python2.7/site-packages/Piff-0.1-py2.7.egg/piff/config.py", line 130, in piffify
    psf.fit(stars, wcs, pointing, logger=logger)
  File "/astro/u/mjarvis/lib/python2.7/site-packages/Piff-0.1-py2.7.egg/piff/simplepsf.py", line 151, in fit
    for s in self.stars]
  File "/astro/u/mjarvis/lib/python2.7/site-packages/Piff-0.1-py2.7.egg/piff/pixelgrid.py", line 633, in reflux
    df = np.linalg.solve(alpha, beta)
  File "/astro/u/mjarvis/.local/lib/python2.7/site-packages/numpy/linalg/linalg.py", line 384, in solve
    r = gufunc(a, b, signature=signature, extobj=extobj)
  File "/astro/u/mjarvis/.local/lib/python2.7/site-packages/numpy/linalg/linalg.py", line 90, in _raise_linalgerror_singular
    raise LinAlgError("Singular matrix")
numpy.linalg.linalg.LinAlgError: Singular matrix

We should at least guard against this kind of thing and exclude the star that had the problem. (I'll go ahead and do that directly on master.) But probably the better solution would be to figure out why this happens and find a real workaround. Possibly using SVD for the solution rather than the normal LU decomposition.
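To illustrate the SVD idea, here is a hedged sketch (an assumed interface, not Piff's actual reflux code) that falls back to an SVD-based least-squares solve when the LU-based solve hits a singular matrix:

```python
import numpy as np

def safe_solve(alpha, beta):
    """Sketch of a guard for a solve like the one in reflux.

    Try the usual LU-based solve first; if alpha is singular, fall back
    to np.linalg.lstsq, which uses an SVD internally and returns the
    minimum-norm least-squares solution.
    """
    try:
        return np.linalg.solve(alpha, beta)
    except np.linalg.LinAlgError:
        df, *_ = np.linalg.lstsq(alpha, beta, rcond=None)
        return df
```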

Develop generic code to do statistics on the reconstructions

For example rho statistics. But we also probably need a more generic framework. We probably want to develop new statistics for the simulations where we know the truth.

Mike has a rho statistics code. Perhaps we can start a new repository with that code as the basis, and generalize it.

DECam pupil plane not sampled well enough.

The unit tests currently emit the warning:

UserWarning: Input pupil plane image may not be sampled well enough!
Consider increasing sampling by a factor 28.444900, and/or check PhaseScreenPSF outputs for signs of folding in real space.

@jmeyers314 @cpadavis I thought we (read Josh) solved this problem through updates to GalSim, but I guess the changes haven't been propagated into Piff yet. Unfortunately, I don't remember exactly what change needs to be made in the Piff code. Could one of you take a look and see if we can fix this behavior?

Write polynomial interpolation class

@joezuntz has already started working on this on branch joe_polynomial. (See also branch polynomial_interp that @danielgruen started work on, but he said he thinks all of that work is superseded by what Joe has done.)

Not much to say here. It should take an arbitrary order for each coordinate. Maybe options for the kind of polynomial? Chebyshev, Lagrange, etc. That might not be important though.
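
The basic shape of such a class might look like the following least-squares sketch (a simple monomial basis with an arbitrary order per coordinate; Chebyshev etc. would just swap the basis functions; this is illustrative, not the branch's actual API):

```python
import numpy as np

class Polynomial2D:
    """Least-squares 2-d polynomial interpolation of a scalar
    PSF parameter over the focal plane."""
    def __init__(self, order_x, order_y):
        self.order_x = order_x
        self.order_y = order_y

    def _basis(self, x, y):
        # design matrix of monomials x^i y^j
        return np.column_stack([x**i * y**j
                                for i in range(self.order_x + 1)
                                for j in range(self.order_y + 1)])

    def fit(self, x, y, values):
        A = self._basis(np.asarray(x), np.asarray(y))
        self.coeffs, *_ = np.linalg.lstsq(A, np.asarray(values), rcond=None)
        return self

    def __call__(self, x, y):
        return self._basis(np.atleast_1d(x), np.atleast_1d(y)) @ self.coeffs
```

One instance per PSF parameter (or a vectorized variant) would then slot into the interpolation step of the fit.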

Simulations to test PSF models and shear recovery

Generate galsim simulations of star fields and associated galaxies.

The motivation is

  • Create star images that can be fed into a PSF fitting and interpolation algorithm. This is useful in itself, since rho statistics can be computed.
  • Create associated galaxy images to be used in shear recovery tests.

These are both important: it may be that two algorithms perform equivalently in rho stats but not in shear recovery.

The sims will ideally have the following features

  • Realistic optical and atmospheric models
  • Pure star fields, with stars placed with some specified density in a focal plane.
  • Separate galaxy images, sampling the same underlying PSF, but at different locations.

If we produce MEDS files for the galaxies a number of people can run on the simulation immediately without any learning curve.

The star fields should represent a single CCD image, with proper WCS etc., which can be fed into standard packages such as PSFEx, and eventually PIFF.

I (Erin) am happy to do the leg work of running these sims, but I need help generating the galsim configuration files.
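
In case it helps get started, here is a sketch of what a single-CCD star-field config might look like in GalSim's YAML config layer. The field names follow the GalSim config documentation, but every numeric value (density, sizes, flux range) is a placeholder to be replaced by the DES-based distributions described above:

```yaml
# Hypothetical starting point -- all numbers are placeholders.
psf:
    type: Convolve
    items:
        - type: Kolmogorov
          fwhm: 0.9              # atmosphere
        - type: Airy
          lam_over_diam: 0.036   # optics (placeholder)

gal:
    type: DeltaFunction          # stars only
    flux:
        type: RandomDistribution # power-law S/N with cutoffs
        function: x**-1.5
        x_min: 1000
        x_max: 100000

image:
    type: Scattered              # randomly placed stars, so blending occurs
    size: 2048
    nobjects: 500                # sets the stellar density
    noise:
        type: CCD
        sky_level: 800
    random_seed: 12345

output:
    dir: output
    file_name: starfield.fits
```

The realistic DES WCS and spatially varying optics would replace the simple placeholders here.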

Average of PSF's parameters !=0 across different exposures & GP interp

Hi,

I am working right now on the atmospheric fit from DES Y1 images. Looking at the averages of the PSF parameters across 23 exposures, I notice (like @cpadavis) that there is a systematic average, different from zero, that varies across the FoV. Some of these average fluctuations are correlated with CCD position. Others look like ring patterns, which I guess is something known from the SV and Y1 data and solved for Y3 with a better astrometric solution.

Even if these problems disappear in Y3, I expect to still see the same kind of issue (average across exposures != 0), and this is a problem for both Gaussian Process algorithms implemented within Piff. Indeed, the assumption of the GP is that the Gaussian fluctuations are around a mean function, which can be a function of the coordinates (which is what I observed in the Y1 data). So not taking this average into account will affect the shape of the spatial correlation, the hyperparameters found, and the quality of the interpolation.

It is not a big deal to treat this correctly: on a single exposure, just subtract this average function (determined across the different exposures) before starting the interpolation, and add it back when you get your GP interpolation.
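
That subtract/add-back flow can be sketched in a few lines of numpy (a minimal GP with an RBF kernel; mean_fn and all names here are stand-ins, not the Piff GP API):

```python
import numpy as np

def rbf(a, b, ell=1.0):
    # squared-exponential kernel between two sets of 2-d points
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def gp_interp_with_mean(X, y, Xnew, mean_fn, ell=1.0, noise=1e-6):
    # 1. subtract the average function determined across exposures
    resid = y - mean_fn(X[:, 0], X[:, 1])
    # 2. standard GP prediction on the residuals
    K = rbf(X, X, ell) + noise * np.eye(len(X))
    Ks = rbf(Xnew, X, ell)
    pred_resid = Ks @ np.linalg.solve(K, resid)
    # 3. add the average back at the new locations
    return pred_resid + mean_fn(Xnew[:, 0], Xnew[:, 1])
```

With resid centered on zero, the stationarity assumption of the GP holds much better, which is the point of the proposal.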

I would like to implement this new feature in both GP algorithms implemented within Piff. Basically, I want to feed the GP class a new argument that is a function of the coordinates and returns the average of the PSF parameters. However, I know that Piff should also work from a config file, so my idea is to do something like this:

Using the dill package (which is pip installable and equivalent to pickle), you can wrap a function/class in a pickle file and later reuse the function by opening the pickle file. An example:

import dill

class average:
    def __init__(self, coeff_x, coeff_y):
        self.coeff_x = coeff_x
        self.coeff_y = coeff_y

    def __call__(self, x, y):
        return self.coeff_x * x + self.coeff_y * y

mean = average(2, 3)
print(mean(5, 2))
dill.dump(mean, open("test.pickle", mode="wb"))

and now, from anywhere, it is possible to reuse this function with coefficients 2 and 3:

import dill

mean = dill.load(open("test.pickle", mode="rb"))
print(mean(5, 2))

I guess with this implementation you just need to point the config file at the pickle file, and within the GP class use the function from the pickle directly.
Do you think this is a good API?

I am writing a new branch to implement this spatial average defined in a pickle file. If you think it should be done another way, comments are welcome. But I am convinced that, at the least, the GP implementation should account for the average across exposures, because this is needed to make GP interpolation work on real data.

Cheers,
PF

run on new PSF simulations

We have created some PSF simulations to test PIFF and PSFEx. I can give more details as needed, but in brief these are using galsim with

  • realistic star flux and S/N distributions based on DES, with a cutoff at low flux and high flux
  • DES optical parameters, with spatial variation
  • wcs based on DES
  • Kolmogorov atmosphere
  • randomly located stars, and thus blending

Still TODO

  • add tree rings and edge distortions
  • galaxies (for testing s/g separation and blending further, and ultimately shear recovery)
  • full field of view simulation, with common optical and atmospheric models for all CCD images

Logical steps in processing

  • run SExtractor or other reduction - volunteer needed
  • run PSFEx - volunteer needed
  • run PIFF - volunteer needed

The first step in analysis is running SExtractor or another reduction such as LSST DM stack. This is important since the stars are intentionally placed randomly and thus blended.

I think the next logical step is running PSFEx. This is an interesting test in itself since PSFEx is widely used, including in DES. We will learn something about PSFEx from this, and probably iron out some bugs in the sims.

Last is running PIFF. I don't know the current status, whether this is feasible at a large scale.

Gaussian mixtures model

@esheldon suggested adding a Gaussian mixtures model. Of course, ngmix already does this, so it should be pretty straightforward to write a Model that mostly just calls out to ngmix functionality to do the work.
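
As a rough illustration of the underlying idea (not the ngmix API, which handles full covariances, priors, PSF convolution, etc.), here is a bare-bones EM fit of an isotropic Gaussian mixture to a star image, treating pixel fluxes as weights; all names are invented for this sketch:

```python
import numpy as np

def fit_gmm_image(img, ncomp=2, niter=50, seed=None):
    """EM fit of an isotropic 2-d Gaussian mixture to an image."""
    ny, nx = img.shape
    yy, xx = np.mgrid[:ny, :nx]
    pts = np.column_stack([xx.ravel(), yy.ravel()]).astype(float)
    w = np.clip(img.ravel(), 0, None)
    w = w / w.sum()
    rng = np.random.default_rng(seed)
    # initialize centers at flux-weighted random pixels
    mu = pts[rng.choice(len(pts), size=ncomp, p=w)]
    sig2 = np.full(ncomp, ((pts - w @ pts) ** 2 * w[:, None]).sum())
    amp = np.full(ncomp, 1.0 / ncomp)
    for _ in range(niter):
        # E step: responsibility of each component for each pixel
        d2 = ((pts[:, None, :] - mu[None]) ** 2).sum(-1)
        g = amp / (2 * np.pi * sig2) * np.exp(-0.5 * d2 / sig2)
        r = g / np.maximum(g.sum(axis=1, keepdims=True), 1e-300)
        # M step: flux-weighted updates of amplitude, center, width
        wr = r * w[:, None]
        nk = wr.sum(axis=0)
        mu = (wr.T @ pts) / nk[:, None]
        d2 = ((pts[:, None, :] - mu[None]) ** 2).sum(-1)
        sig2 = (wr * d2).sum(axis=0) / (2 * nk)
        amp = nk
    return amp, mu, sig2
```

A Piff Model wrapping ngmix would replace all of this with a call into ngmix's fitters, which is why the wrapper should be thin.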

Test failure from numpy bitwise_and

You may know about this already, but with my version of numpy (1.11.0) I get this test failure.
(I got many other test failures with numpy 1.9 as well; upgrading solved those.)

======================================================================
ERROR: Test the whole process with a DES CCD.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
    self.test(*self.arg)
  File "/Users/jaz/src/piff/tests/test_pixel.py", line 799, in test_des_image
    logger=logger)
  File "/Users/jaz/src/piff/tests/fit_des.py", line 160, in fit_des
    original = stardata_from_fits(ff, xysky, logger=logger)
  File "/Users/jaz/src/piff/tests/fit_des.py", line 77, in stardata_from_fits
    good = np.bitwise_and(msk.array, badmask)==0
TypeError: ufunc 'bitwise_and' not supported for the input types, and the inputs could not be safely coerced to any supported types according to the casting rule ''safe''
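
For reference, the usual workaround for this numpy error, assuming the mask plane is being read in as floats, is to cast to an integer dtype before the bitwise operation (values below are made up for illustration):

```python
import numpy as np

msk_array = np.array([[0., 1., 2.]])  # mask plane read as floats (hypothetical)
badmask = 0b0110

# np.bitwise_and is not defined for float inputs; cast first
good = np.bitwise_and(msk_array.astype(np.int32), badmask) == 0
```

Whether the mask image comes back as floats depends on the FITS reader, which may explain why only some numpy/reader combinations hit this.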

corrupted psf reconstruction

Using this file at BNL /astro/u/mjarvis/work/y3_piff/y3a1-v23/362921/D00362921_r_c60_r2331p02_piff.fits

And the following code

p = piff.read("/astro/u/mjarvis/work/y3_piff/y3a1-v23/362921/D00362921_r_c60_r2331p02_piff.fits")
gsim = p.draw(y=3383.0, x=260.0)

I get the following reconstruction
[image: corrupted PSF reconstruction]

Start work on analysis framework

@cpadavis has started work on branch psf_stats building an analysis framework for testing how good a model is on a given set of data.

This will simplify a lot of the analysis that has previously required several different pieces of software. It would be nice to be able to

  • build the PSF model
  • measure some aspects of the model and stars (e.g. size, e1, e2, maybe others)
  • compute some statistics of these (e.g. rho statistics)
  • plot the results

all from within Piff. For now, I would focus on the path of information flow. Having a single choice for each step (e.g. hsm for measurements, rho1 and rho2 for stats) is fine. Then once we settle on the design of this, we can work on adding more statistics and measurements that we might want.
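
For the measurement step, a bare-bones stand-in for hsm (which computes adaptive moments via GalSim) is an unweighted second-moment size and ellipticity; this sketch only illustrates the information flow from star image to (size, e1, e2):

```python
import numpy as np

def measure_shape(img):
    """Unweighted second-moment size and ellipticity of a star image."""
    ny, nx = img.shape
    yy, xx = np.mgrid[:ny, :nx]
    f = img.sum()
    xbar = (img * xx).sum() / f
    ybar = (img * yy).sum() / f
    # central second moments
    mxx = (img * (xx - xbar) ** 2).sum() / f
    myy = (img * (yy - ybar) ** 2).sum() / f
    mxy = (img * (xx - xbar) * (yy - ybar)).sum() / f
    size = mxx + myy
    e1 = (mxx - myy) / (mxx + myy)
    e2 = 2 * mxy / (mxx + myy)
    return size, e1, e2
```

The stats step would then correlate (or histogram, or whisker-plot) these numbers for the stars versus the drawn model at the same locations.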

Will Piff select stars for you?

I notice that even in the DES example it uses a catalog already processed by findstars. I bet we'll want Piff to one day select stars, too.

Where would this go? It seems like it should be an Input class, but I can also see it being some kind of Outlier class, if your assessment of whether a star is really a star changes as you fit your PSF.

fit_des.py is broken and extraneous and probably should be deleted

Here's a fun thing I learned:

just running nosetests in the tests directory currently doesn't run the tests in fit_des. It turns out the file name needs to start with test, and the function name needs to be test_blah(); either change by itself won't work.

If you specifically run fit_des, you quickly find that the code is outdated and broken.

But then I looked more into it, and I think test_pixel does everything in fit_des anyways; fit_des.py has just been sitting in the tests directory silently broken for probably the past couple months.

So, is there any reason I shouldn't just delete fit_des.py?
