
Comments (21)

cshanahan1 commented on June 26, 2024

Just getting caught up on this discussion. @tddesjardins If you construct a WCS from an HDU whose header contains a path to an NPOL file, the transformations that do more than just the core WCS (like all_pix2world, etc.) will use those terms along with the SIP coefficients. You have to remove the link to the NPOL file in the header and create a WCS object from that if you want to turn that off (correct me if there is a better way). In my experience, it has been a little tricky to verify after the fact whether a WCS object references an NPOL file for transformations or uses only SIP; there doesn't seem to be an attribute I can access. Again, correct me if I am missing something in the documentation. Also, I don't know what the case is for ACS, but the documentation for WFC3 PAMs makes it clear that only the polynomial coefficients were used to create the maps, so I would think that users are operating under that assumption already.
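For what it's worth, astropy.wcs does expose the loaded distortion components as attributes (sip, plus cpdis1/cpdis2 for NPOL tables and det2im1/det2im2 for the detector-to-image table), so one way to verify after the fact is a small helper along these lines (distortion_summary is a made-up name; w is assumed to be an astropy.wcs.WCS built from the FITS file):

```python
def distortion_summary(w):
    """Report which distortion components a WCS object actually carries.

    Relies on the attributes astropy.wcs.WCS uses to store the pieces of
    the distortion model; each attribute is None when the FITS file had
    no such term.
    """
    return {
        "sip": w.sip is not None,                                  # SIP polynomials
        "npol": w.cpdis1 is not None or w.cpdis2 is not None,      # NPOL lookup tables
        "det2im": w.det2im1 is not None or w.det2im2 is not None,  # det-to-image table
    }
```

If distortion_summary(w)["npol"] comes back False, all_pix2world will use only the core WCS plus SIP.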

from acs-notebook.

tddesjardins commented on June 26, 2024

Certainly more discussion could be added @mackjenn. I think a short discussion would be useful in the notebook. Anything more than a short discussion might be better served on the ACS/WFC3 websites just for keeping this particular notebook simple.

tddesjardins commented on June 26, 2024

Also tagging @mcara as he will be interested in this.

mcara commented on June 26, 2024

@mackjenn @tddesjardins I have tried to use the data available at http://www.stsci.edu/hst/acs/analysis/PAMS. I do not know how to put distortion coefficients "back" into the IDCTAB genie bottle, so I just took the IDCTAB file nar11046j_idc.fits and removed all other distortions (tabular, TDD, etc.). I got close to 0.1%, but still not 0. However, when I looked at the IDCTAB file I did not see the coefficients published here: http://www.stsci.edu/hst/acs/analysis/PAMS/WFC_coeffs1. So it is quite possible that I did not manage to completely reproduce the 2004 conditions. It would be very simple if I had the "original" 2004 data file from which to create a WCS.

mcara commented on June 26, 2024

Given that we may never see the original 2004 file, I would suggest a different test: Take a modern file and:

  1. strip all other distortions except the standard geometric distortion (which gets converted to SIP);
  2. use the 2004 script to create a PAM, if possible; if not, use modern tools (drizzle, blot, etc.), which should be able to work directly with SIP;
  3. run my tool on the same "modern" file (with SIP only);
  4. compare the results.
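For step 1, stripping the non-polynomial terms amounts to dropping the reference-file keywords from the science header before building the WCS; a minimal sketch (strip_nonpoly_distortions is a hypothetical helper; NPOLFILE and D2IMFILE are the standard HST keywords, and the file names below are dummies):

```python
def strip_nonpoly_distortions(header):
    """Remove references to tabular distortion files so that only the
    polynomial (SIP) part of the distortion model remains."""
    for key in ("NPOLFILE", "D2IMFILE"):
        header.pop(key, None)  # dict-like; astropy.io.fits Header supports this too
    return header

hdr = {"NPOLFILE": "dummy_npl.fits", "D2IMFILE": "dummy_d2i.fits", "CRPIX1": 2048.0}
strip_nonpoly_distortions(hdr)
```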

mcara commented on June 26, 2024

@tddesjardins @mackjenn I did a test that, in my opinion, verifies that stsci.skypac.pam_utils functions work correctly. I was not able to re-create the file used by R. Hook in 2004. Instead I used a more "modern" ACS/WFC file and removed all non-polynomial distortions - only SIP and velocity aberration were left (if applied in updatewcs).

In addition, I have modified the pam_utils code to allow users to "ignore" the VA correction, that is, to generate a PAM as if the WCS were not VA-corrected. In the original version this was done by default because it agreed better with Hook's 2004 PAM. However, I believe this is not the correct thing to do if one wants a more accurate PAM, so the new version will have this option turned off by default. (The effect of the VA correction is small, however.)

In this comparison, the maximum relative error between the drizzle- and skypac-generated PAM images is 2e-4% and the mean relative error is 2e-5%! This is close to float32 precision, and it is also limited by the default accuracy of the inverse (world->detector) coordinate transformations used in AstroDrizzle (1e-4). Improperly ignoring the VA correction leads to larger errors (~1e-3%).

The new method should be more accurate than AstroDrizzle because it does not use inverse coordinate transformations. In addition, it is about 30x faster than the drizzle/blot approach. However, stsci.skypac.pam_utils does not take non-polynomial (tabular) distortions into account.

mcara commented on June 26, 2024

@mackjenn A note of caution about the WFC3 PAMs: those are normalized to 1 at CRPIX (unlike the ACS PAMs). I am adding an option to my script to normalize the PAM at CRPIX if desired. The old code did not normalize the PAM.
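A sketch of that normalization option (normalize_at_crpix is a made-up name, and the 2x2 array is just for illustration). CRPIX is 1-based in FITS, hence the -1 offsets when indexing the array:

```python
import numpy as np

def normalize_at_crpix(pam, crpix1, crpix2):
    """Rescale a pixel-area map so its value at the reference pixel is 1."""
    return pam / pam[int(round(crpix2)) - 1, int(round(crpix1)) - 1]

pam = np.array([[2.0, 4.0],
                [6.0, 8.0]])
norm = normalize_at_crpix(pam, crpix1=1, crpix2=1)  # reference pixel at array [0, 0]
```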

mcara commented on June 26, 2024

Another test. This time I kept all distortions when drizzling. Since stsci.skypac.pam_utils uses only polynomial distortions and ignores all non-polynomial ones, the error (compared to the drizzle/blot approach) is larger:

WFC2: max error=0.07%; mean error=0.009%
WFC1: max error=0.08%; mean error=0.015%

These measurements exclude 5 pixels around the border of FLT images in order to avoid edge effects when drizzling.
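A comparison of this kind can be sketched as follows (compare_pams is a hypothetical helper; the 5-pixel border trim follows the comment above, and the toy arrays are made up):

```python
import numpy as np

def compare_pams(pam_a, pam_b, border=5):
    """Max and mean relative error (%) between two pixel-area maps,
    excluding a border to avoid drizzle edge effects."""
    a = pam_a[border:-border, border:-border]
    b = pam_b[border:-border, border:-border]
    rel = np.abs(a - b) / np.abs(b) * 100.0  # percent
    return rel.max(), rel.mean()

# Two flat toy "PAMs" differing by 0.1%:
a = np.full((20, 20), 1.000)
b = np.full((20, 20), 1.001)
max_err, mean_err = compare_pams(a, b)
```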

mcara commented on June 26, 2024

In my analysis, it seems that the biggest contribution to the errors (the differences between a PAM generated with the SIP-only approach of stsci.skypac.pam_utils and a drizzle/blot-generated PAM) comes from the DET2IM distortion. If I turn this distortion off, then the error (discrepancy) is mostly due to the NPOL distortions, with a max error of about 0.02% and a mean error of 0.002%. Therefore, for all non-ACS/WFC instruments we should expect similar, or likely even smaller, errors than the ones obtained with NPOL- and SIP-only distortions for ACS/WFC (no DET2IM).

mackjenn commented on June 26, 2024

These tests are very interesting, Mihai!
Thanks for reporting the results. It's helpful to know the impact of the various distortion corrections on the PAM.
Regarding the velocity aberration, this report suggests that the maximum shift at the edges of the detector due to scale changes as a result of HST's orbit is ~0.1 pixels, and ~0.5 pixels due to Earth's orbit. Just thought it would be useful to add a note to this thread for completeness.
http://www.stsci.edu/hst/HST_overview/documents/calworkshop/workshop2002/CW2002_Papers/cox.pdf

mcara commented on June 26, 2024

@mackjenn Just to clarify, my code does take into account velocity aberration (get the latest update to the software) AND SIP, but NOT NPOL or DET2IM.

tddesjardins commented on June 26, 2024

I'm surprised it doesn't do NPOL and DET2IM. Can astropy.wcs not handle those, @eteq?

mcara commented on June 26, 2024

It has nothing to do with astropy's WCS. It has to do with the method used here: an (essentially) symbolic computation of the pixel area given the polynomial coefficients from SIP (and the CD matrix, of course).
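What that means concretely, as a toy illustration of the symbolic-Jacobian idea (this is not the actual pam_utils code; the distortion and its coefficient are made up): for a polynomial mapping, the Jacobian determinant, and hence the relative pixel area, can be written down analytically, with no inverse transform needed:

```python
def pixel_area(u, v, cd, a=1e-7):
    """Relative pixel area |det(CD)| * |det(J)| for a toy SIP-like distortion
    x = u + a*u**2, y = v (a is a made-up coefficient, not a real SIP value).
    The Jacobian of that mapping is [[1 + 2*a*u, 0], [0, 1]], so its
    determinant is available in closed form."""
    det_cd = cd[0][0] * cd[1][1] - cd[0][1] * cd[1][0]
    det_j = 1.0 + 2.0 * a * u
    return abs(det_cd) * abs(det_j)

# Identity CD matrix for illustration; area deviates from 1 only via the polynomial:
area = pixel_area(2048.0, 1024.0, [[1.0, 0.0], [0.0, 1.0]])
```

Tabular (NPOL/DET2IM) corrections have no such closed form, which is why they are outside the scope of this method.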

tddesjardins commented on June 26, 2024

Okay, I'll have to think about that a little more to see if we can use the astropy.wcs module. From what I understand based on this sentence:

"WCS objects perform standard WCS transformations, and correct for SIP and distortion paper table-lookup transformations, based on the WCS keywords and supplementary data read from a FITS file."

The astropy.wcs.WCS() object should be able to understand NPOL and DET2IM. If that's true, then we could use it to compute the distortion-corrected CD matrix. @shannnnnyyy might also be able to give some input on that.

mcara commented on June 26, 2024

@tddesjardins There is no way to compute a "distortion corrected" CD matrix from NPOL or DET2IM. Yes, you could use astropy.wcs to perform the full distortion correction when computing coordinates, but that is what drizzlepac already does. So, if this accuracy is not good enough, either use drizzlepac or design a method that uses the full WCS transformations.
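For reference, a method that "uses the full WCS transformations" could estimate the local pixel area numerically; a generic sketch in which transform stands in for the full pixel -> undistorted-plane mapping (e.g. one built from astropy.wcs; local_area is a made-up helper):

```python
def local_area(transform, x, y, h=0.5):
    """|det J| of a 2-D forward mapping at (x, y), by central differences.

    With transform = full pixel -> undistorted-plane mapping, this is the
    relative pixel area, including any tabular (NPOL/DET2IM) terms that
    the transform itself applies.
    """
    xp, yp = transform(x + h, y)
    xm, ym = transform(x - h, y)
    xq, yq = transform(x, y + h)
    xr, yr = transform(x, y - h)
    dxdx, dydx = (xp - xm) / (2 * h), (yp - ym) / (2 * h)
    dxdy, dydy = (xq - xr) / (2 * h), (yq - yr) / (2 * h)
    return abs(dxdx * dydy - dxdy * dydx)

# Toy mapping with a known 1% stretch in x:
area = local_area(lambda x, y: (1.01 * x, y), 100.0, 100.0)
```

Evaluating this at every pixel is essentially what makes the full-transform approach slow compared to the symbolic SIP-only method.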

tddesjardins commented on June 26, 2024

Sorry @mcara, I didn't mean to imply you only needed those two tables. I just meant we could use astropy.wcs to get the full WCS transformation, including those two tables. That is what @shannnnnyyy and I were working on originally, but we got stuck on how to make it speedy because of how we were trying to use it. I do agree that most people should probably just use drizzlepac, though. We just need to be clear about the lack of NPOL and DET2IM when using these PAMs, if that's what users want to do, so this is good to know about.

mcara commented on June 26, 2024

@shannnnnyyy

You have to remove the link to the NPOL file in the header and create a WCS object from that if you want to turn that off (correct me if there is a better way).

Yes, there is a better way. First create a WCS object from the HDUList and Header:

from astropy.wcs import WCS
w = WCS(header, hdulist)

Then set the non-polynomial corrections to None, like so:

w.cpdis1 = None
w.cpdis2 = None
w.det2im1 = None
w.det2im2 = None

You cannot turn off velocity aberration correction this way but it is easy to undo it by re-scaling the CD (w.wcs.cd) matrix by w.vafactor.
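A sketch of that rescaling (assuming, per the comment, that the correction is just the scalar w.vafactor applied uniformly to the CD matrix; undo_va is a made-up helper, and whether you multiply or divide depends on which state you are converting to):

```python
def undo_va(cd, vafactor):
    """Rescale a 2x2 CD matrix by the scalar velocity-aberration factor.
    vafactor is very close to 1, so this is a tiny adjustment."""
    return [[c / vafactor for c in row] for row in cd]

cd = [[1.0e-5, 0.0], [0.0, 1.0e-5]]  # dummy CD matrix, deg/pixel scale
cd_unscaled = undo_va(cd, 1.00002)   # dummy VAFACTOR close to 1
```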

... in the documentation for WFC3 PAMs it is clear that only the polynomial coefficients were used to create the maps, so I would think that users are operating under that assumption already.

That is true, and that was my assumption when I wrote my script: I wanted to be able to do what the calibration team did when they generated the PAMs for the instruments - just to do it faster, without all the astrodrizzle-and-blot dancing.

tddesjardins commented on June 26, 2024

I'll have to check with Norman when he gets back, because I think he wanted to include the full distortion model in the new PAMs, for ACS at least. The goal of the PAM was to make it match the photometry of the distortion-corrected drizzle products. I don't know about WFC3's PAMs, but ACS's were so out of date because we didn't have NPOLFILE, velocity aberration corrections, etc. when we made our static files and put them up on the website.

mackjenn commented on June 26, 2024

@mcara Yes, I understand that your code accounts for velocity aberration, but the static PAM did not.
So I just wanted to quantify the size of that effect... though I give it in pixels.
To quantify the impact on the PAM, one would have to run it with two different extreme values and then take the ratio.

mcara commented on June 26, 2024

@mackjenn I think the effect of velocity aberration on the PAM must be negligible. The 0.1 pix figure from the cited 2002 paper refers to a change in pixel position. The area change is a differential effect and will therefore be much, much smaller than 0.1**2. Even 0.1**2 would only be a 1% effect, but given the differential nature of the area change, I would bet that for all practical purposes we can neglect VA in relation to the PAM.
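A back-of-envelope version of that argument (my numbers, not from the thread): a 0.1-pixel shift at the edge of a ~4096-pixel-wide detector, i.e. ~2048 pixels from the center, is a fractional scale change of ~5e-5, and area goes as the square of the linear scale:

```python
shift = 0.1          # pixel shift at the detector edge (from the 2002 VA paper)
half_width = 2048.0  # approximate center-to-edge distance in pixels (ACS/WFC)

scale_change = shift / half_width          # fractional change in linear scale
area_change = (1 + scale_change) ** 2 - 1  # fractional change in pixel area
```

That comes out to roughly 1e-4, i.e. ~0.01% - small compared to the percent-level structure in the PAM itself.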

mcara commented on June 26, 2024

@tddesjardins Two points:

... ACS's were so out of date because we didn't have NPOLFILE ...

Well, most likely the biggest factor in being "out of date" is the improved linear and polynomial corrections (if any) that the calibration team may have found in recent years; NPOLFILE is just a ~0.02% correction. Not only that, but tweakreg also adjusts the scale and skew of the CD matrix, and other packages may be used for higher-order corrections. Therefore, it is useful to be able to re-compute the PAM for individual images. But these are polynomial-type corrections - not tabular.

... he wanted to include the full distortion model in the new PAMs for ACS at least. The goal of the PAM was to make it match the photometry of the distortion-corrected drizzle products ...

I am not sure this would provide the correct PAM (as opposed to one that matches drizzle). That is, I am not sure that matching drizzle is the right thing to do in relation to the PAM, but it may depend on the usage, and I do not know how these PAMs are going to be used.
