
Comments (9)

torfjelde avatar torfjelde commented on August 15, 2024

Softmax isn't bijective. The one we have now is (it maps from d to d-1 dimensions).
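To see the non-bijectivity concretely: softmax is invariant under adding a constant to every logit, so infinitely many points in R^d map to the same simplex point. A minimal NumPy sketch (an illustration of the math, not Bijectors.jl code):

```python
import numpy as np

def softmax(y):
    z = np.exp(y - np.max(y))  # shift by the max for numerical stability
    return z / z.sum()

y = np.array([0.5, -1.2, 2.0])
# Adding any constant to all logits leaves the output unchanged,
# so softmax cannot be inverted on all of R^d:
assert np.allclose(softmax(y), softmax(y + 10.0))
```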

from bijectors.jl.

devmotion avatar devmotion commented on August 15, 2024

See also #51.


Red-Portal avatar Red-Portal commented on August 15, 2024

Betanalpha does discuss a bijective softmax obtained by arbitrarily fixing the endpoint logit. Any experience with this?


devmotion avatar devmotion commented on August 15, 2024

That's supported in GPLikelihoods, for example (see also the discussion in JuliaGaussianProcesses/GPLikelihoods.jl#55).


Red-Portal avatar Red-Portal commented on August 15, 2024

Good to know, thanks. Back to my original intention, though: I really wish our simplex bijector could play nicely with GPUs out of the box. Among non-NF bijectors, the simplex bijector seems like the big challenge in that direction. Do we have any plans for how to pursue this? The softmax approach looks like it would be much easier to get working there.


Red-Portal avatar Red-Portal commented on August 15, 2024

Actually, never mind. I just wrote a stick-breaking bijector using array operations, based on the implementations in numpyro and tensorflow. If this were added to Bijectors.jl, we would probably have to add a CUDA array specialization. Let me know how to proceed on this.
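For reference, the vectorized stick-breaking map can be sketched with cumulative products instead of a loop over sticks, in the style of the numpyro/tensorflow-probability transforms. This is a NumPy illustration of the math, not the actual implementation discussed here:

```python
import numpy as np
from scipy.special import expit  # logistic sigmoid

def stick_breaking(y):
    # Map unconstrained y in R^{K-1} to a point on the K-simplex using
    # only array operations (cumprod), no explicit loop over sticks.
    K = y.shape[-1] + 1
    # offset so that y = 0 maps to the uniform point (1/K, ..., 1/K)
    offsets = np.log(np.arange(K - 1, 0, -1))
    z = expit(y - offsets)                       # stick fractions in (0, 1)
    # remaining stick length before each break: [1, (1-z1), (1-z1)(1-z2), ...]
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - z)])
    x = remaining[:-1] * z                       # mass broken off at each step
    return np.concatenate([x, remaining[-1:]])   # last entry takes what's left
```

With the log offsets included, `stick_breaking(np.zeros(K - 1))` returns the uniform point on the K-simplex.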


devmotion avatar devmotion commented on August 15, 2024

On Julia >= 1.9, a CUDA specialization could be put in a package extension (it could possibly even be an extension on GPUArrays).
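For reference, a package extension is declared in the package's Project.toml via `[weakdeps]` and `[extensions]`. A hypothetical sketch for a CUDA extension (the extension name here is illustrative, and the UUID must match CUDA.jl's entry in the General registry):

```toml
[weakdeps]
CUDA = "052768ef-5323-5732-b1bb-66c8b64840ba"  # CUDA.jl's registered UUID

[extensions]
# ext/BijectorsCUDAExt.jl is loaded automatically once CUDA is in the environment
BijectorsCUDAExt = "CUDA"
```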


Red-Portal avatar Red-Portal commented on August 15, 2024

I do have the feeling this will have to wait until the batch-operation interface is finalized. @torfjelde Do we have an expectation of when that will be?


sethaxen avatar sethaxen commented on August 15, 2024

There are three main ways to use softmax for simplex transforms. One uses parameter expansion to retain bijectivity: f(y) = [softmax(y); logsumexp(y)]. The other two come from the compositional data analysis literature and are called the additive log-ratio, f(y) = softmax(vcat(y, 0)), and the isometric log-ratio, f(y) = softmax(V * y) for a particular choice of semi-orthogonal matrix V. I'm currently testing the performance of each of these versus stick-breaking.
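The three maps can be sketched in a few lines of NumPy (an illustration of the math only, not a Bijectors.jl implementation; the QR-based construction of V is one possible choice of semi-orthogonal matrix, not the only one):

```python
import numpy as np
from scipy.special import softmax, logsumexp

# 1. Expanded softmax: parameter expansion retains bijectivity by carrying
#    the redundant degree of freedom along as an extra output coordinate.
def expanded_softmax(y):
    return np.concatenate([softmax(y), [logsumexp(y)]])

# 2. Additive log-ratio (ALR): pin one logit to zero,
#    mapping R^{K-1} bijectively onto the simplex interior.
def alr(y):
    return softmax(np.concatenate([y, [0.0]]))

# 3. Isometric log-ratio (ILR): V is K x (K-1) with orthonormal columns,
#    each orthogonal to the all-ones vector.
def ilr(y, V):
    return softmax(V @ y)

def make_V(K):
    # One standard construction: orthonormal basis of the subspace
    # orthogonal to ones(K), via QR of the centering matrix.
    Q, _ = np.linalg.qr(np.eye(K) - 1.0 / K)
    return Q[:, : K - 1]

assert np.allclose(alr(np.zeros(3)), 0.25)  # y = 0 gives the uniform point
```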

