caustics's People

Contributors

fbartolic

Forkers

kmzzhang

caustics's Issues

Integration with starry

Currently caustics supports only the linear limb-darkening law as a model for the intensity profile of the source star disc. This is implemented in `_brightness_profile`. It would be useful to allow for an arbitrary intensity profile by instead evaluating the intensity of a starry map, which would enable computing microlensing light curves for sources with spots and other non-trivial surface features. The relevant reference is the paper "The microlensing signatures of photospheric starspots".
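For context, the linear limb-darkening law mentioned above can be sketched as follows (an illustrative stand-in, not the exact body of `_brightness_profile`):

```python
import jax.numpy as jnp

def linear_limb_darkening(r, u):
    # Standard linear limb-darkening law: I(mu)/I(1) = 1 - u * (1 - mu),
    # with mu = sqrt(1 - r^2) for disc radius r in [0, 1] and linear
    # limb-darkening coefficient u.
    mu = jnp.sqrt(jnp.clip(1.0 - r**2, 0.0, 1.0))
    return 1.0 - u * (1.0 - mu)
```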

@rodluger, would you be interested in this? A basic implementation would involve replacing `_brightness_profile` with a JAX function that takes a vector of spherical harmonic coefficients as input and returns the flux at arbitrary (r, theta). This seems like a fairly straightforward task, no?
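A minimal sketch of the proposed interface, assuming a low-order real spherical-harmonic basis evaluated directly on the visible disc (the basis functions and the disc-to-sphere mapping here are illustrative assumptions, not starry's actual API):

```python
import jax.numpy as jnp

def sh_brightness_profile(r, theta, coeffs):
    # Hypothetical drop-in for _brightness_profile: evaluates a real
    # spherical-harmonic expansion up to l = 1 at polar disc
    # coordinates (r, theta), with r in [0, 1].
    # Map disc radius to the sphere: mu = cos(viewing angle) = sqrt(1 - r^2).
    mu = jnp.sqrt(jnp.clip(1.0 - r**2, 0.0, 1.0))
    # Unnormalised real spherical harmonics (Y_00, Y_1-1, Y_10, Y_11).
    basis = jnp.stack(
        [jnp.ones_like(r), r * jnp.sin(theta), mu, r * jnp.cos(theta)],
        axis=-1,
    )
    return basis @ coeffs
```

Because this is pure `jax.numpy`, it is differentiable with respect to `coeffs`, which is what makes the starry-style approach attractive here.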

Rethink tests which trigger the extended full computation

The tests from Bozza et al. (2018) are quite wasteful in the sense that they produce tons of false positives. Here's an example:

[figure: map of test false positives]

The red region is where the error is > $10^{-4}$ and the grey region is where the test says that the full calculation is necessary. I feel like there has to be a way to improve the accuracy of these tests.

Perhaps we should treat this as a machine-learning prediction problem and construct physically meaningful features such as the magnitudes of the higher-order terms in the multipole expansion and the value of the Jacobian.

Related to #19.

Compute the magnification to a specified tolerance

This will require major changes to how the code is organised, as well as the use of bounded while loops. An adaptive Gauss-Kronrod algorithm for estimating the limb-darkening (LD) integrals to a specified precision (see #13) has to be implemented first.

Being able to specify a target relative precision will substantially improve performance with large sources.
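To illustrate the tolerance-driven refinement idea, here is a sketch using an adaptive Simpson pair as a stand-in for the Gauss-Kronrod 7/15 pair (the coarse/fine difference supplies the local error estimate, and intervals are bisected until it falls below tolerance). Note that a JAX version would need to recast this recursion as a bounded `lax.while_loop` with fixed-size state, which is exactly the restructuring the issue describes:

```python
def adaptive_quad(f, a, b, tol=1e-8, depth=0, max_depth=20):
    # Adaptive quadrature sketch: compare a coarse Simpson estimate on
    # [a, b] with a fine one built from the two half-intervals; the
    # difference estimates the local error.
    m = 0.5 * (a + b)
    coarse = (b - a) / 6.0 * (f(a) + 4.0 * f(m) + f(b))
    lm, rm = 0.5 * (a + m), 0.5 * (m + b)
    fine = (b - a) / 12.0 * (
        f(a) + 4.0 * f(lm) + 2.0 * f(m) + 4.0 * f(rm) + f(b)
    )
    if depth >= max_depth or abs(fine - coarse) < 15.0 * tol:
        # Accept, with Richardson extrapolation of the Simpson pair.
        return fine + (fine - coarse) / 15.0
    # Otherwise bisect, splitting the tolerance budget between halves.
    return (adaptive_quad(f, a, m, tol / 2, depth + 1, max_depth)
            + adaptive_quad(f, m, b, tol / 2, depth + 1, max_depth))
```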

Fix finite source test for triple lenses

Neither the cusp test nor the "ghost images" test from Bozza et al. (2018) seems to be working for triple lenses. I must be doing something wrong when computing the derivatives of $f(z)$.

A related issue is that, as currently implemented, the ghost images test checks that
$$c_g(|I(z_1)| + |I(z_2)|)<\delta$$
where $I$ is an indicator function that evaluates to true if the condition is satisfied, rather than
$$c_g(|I(z_1)|)<\delta\quad \text{and} \quad c_g(|I(z_2)|)<\delta$$
The prohibited regions in the source plane don't look right with the latter condition.
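The two forms of the check can be written side by side (a hypothetical helper; the names `I_z1`, `I_z2`, `c_g`, `delta` follow the notation above):

```python
import jax.numpy as jnp

def ghost_image_test(I_z1, I_z2, c_g, delta):
    # Summed form, as currently implemented: one combined threshold.
    summed = c_g * (jnp.abs(I_z1) + jnp.abs(I_z2)) < delta
    # Per-image form described above: each ghost image must
    # individually satisfy the threshold.
    per_image = (c_g * jnp.abs(I_z1) < delta) & (c_g * jnp.abs(I_z2) < delta)
    return summed, per_image
```

The two conditions genuinely disagree whenever each term is below the threshold but their sum is not, which is why the prohibited regions differ between the two variants.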

The only way to fix these issues is to re-derive all the math from scratch and check that it all makes sense.

Modify the calculation of the intersection point in `_connection_condition` to make it more numerically stable

In very rare cases, the two lines formed by the endpoints of two segments are almost exactly parallel, and due to finite-precision issues the intersection point of the two lines can end up far away from the endpoints of either segment. I need to think about how to distinguish this situation from one in which the two lines are almost exactly parallel but do not belong to the same part of the contour.
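One possible shape for a more robust check, under the assumption that both failure modes can be caught geometrically (a hypothetical sketch, not the current `_connection_condition` code): flag near-parallel direction vectors via a scale-normalised cross product, and separately reject intersections that land far outside either segment.

```python
import numpy as np

def segment_intersection(p1, p2, q1, q2, eps=1e-12, slack=0.5):
    # Solve p1 + t*d1 = q1 + s*d2 for the two line parameters t, s.
    d1, d2 = p2 - p1, q2 - q1
    denom = d1[0] * d2[1] - d1[1] * d2[0]          # cross(d1, d2)
    scale = np.linalg.norm(d1) * np.linalg.norm(d2)
    if abs(denom) < eps * scale:
        return None  # effectively parallel: no reliable intersection
    r = q1 - p1
    t = (r[0] * d2[1] - r[1] * d2[0]) / denom      # cross(r, d2) / denom
    s = (r[0] * d1[1] - r[1] * d1[0]) / denom      # cross(r, d1) / denom
    # Reject intersections far outside either segment (slack allows
    # slight overshoot past the endpoints due to rounding).
    if not (-slack <= t <= 1.0 + slack and -slack <= s <= 1.0 + slack):
        return None
    return p1 + t * d1
```

Normalising the parallelism threshold by the segment lengths keeps the test meaningful regardless of the absolute scale of the contour.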

GPU backend not working on Colab

Installing caustics on Colab with a GPU runtime and running the root solver results in an immediate crash of the kernel. Not sure what's going on here; it could be a memory issue.
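If it is a memory issue, one quick thing to try before digging deeper (these are real JAX environment flags, but the OOM diagnosis itself is only a guess):

```python
import os

# Must be set before `import jax`. Disables XLA's default behaviour of
# preallocating most of the GPU memory up front, and makes allocations
# happen on demand instead, so an OOM surfaces as a Python error rather
# than a hard kernel crash.
os.environ["XLA_PYTHON_CLIENT_PREALLOCATE"] = "false"
os.environ["XLA_PYTHON_CLIENT_ALLOCATOR"] = "platform"
```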

Implement "ghost image" test from Bozza et.al. 2018

The quadrupole and hexadecapole corrections to the point-source magnification are insignificant when approaching a fold from the outside of a caustic. Bozza et al. (2018) devised a test (Eq. 47 in the paper) to catch when the source is close to a fold and trigger the full calculation at those points. I've implemented this test, but the output looks like this:

[figure: ghost image test output]

No idea what these weird curves emerging out of the fold near the top are. Could be a mistake in my implementation.

Improve compilation times

Compilation times are OK when using caustics on its own, but they're completely unacceptable when caustics is incorporated within a numpyro model.
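When profiling this, it helps to separate compilation cost from execution cost. JAX's ahead-of-time API makes that explicit (the model below is a toy stand-in, not caustics itself):

```python
import time
import jax
import jax.numpy as jnp

@jax.jit
def model(x):
    # Toy computation standing in for a caustics/numpyro model.
    return jnp.sum(jnp.sin(x) ** 2)

x = jnp.ones(1000)

# `lower` traces the function for these argument shapes and `compile`
# runs XLA compilation, without executing the computation, so the
# elapsed time here is pure compilation cost.
t0 = time.perf_counter()
compiled = model.lower(x).compile()
compile_time = time.perf_counter() - t0
```

Timing the compiled object's calls separately then shows whether the numpyro overhead is in tracing/compilation (fixable with caching or smaller traced graphs) or in the runtime itself.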

Migration to Julia

Using JAX for this problem was a mistake. I'm building a Julia version of the caustics code focused exclusively on binary and triple-lens events. This repository will mostly consist of Python bindings to the Julia code.
