Comments (17)

twiecki commented:

My biased opinion is that this is a great idea :). CC @jsalvatier @fonnesbeck.

@dustinvtran Are you in touch with Kevin Murphy by any chance?

We have discussed either allowing for an additional backend like TensorFlow (which would probably result in a pymc3 fork), or using something like https://github.com/dementrock/tensorfuse to support switching out backends.

For the first option, it shouldn't be too difficult to swap out the backend. Theano is used to compute the model logp and its gradient. The step methods and variational inference are pure Python (only some parts are sped up with Theano). The distributions should be easiest to port (see e.g. https://github.com/pymc-devs/pymc3/blob/master/pymc3/distributions/continuous.py). model.py then combines those into the full model graph (see e.g. https://github.com/pymc-devs/pymc3/blob/master/pymc3/model.py#L164).
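
To sketch what that porting could look like, here's a rough TensorFlow version of one distribution's logp (illustrative only, not actual PyMC3 code):

# Hedged sketch: a TensorFlow (1.x) port of one distribution's log
# density, mirroring what the Theano version in continuous.py computes.
import tensorflow as tf

def beta_logp(value, alpha, beta):
    # log density of Beta(alpha, beta):
    # (alpha-1)*log(v) + (beta-1)*log(1-v) - log B(alpha, beta)
    return ((alpha - 1.0) * tf.log(value)
            + (beta - 1.0) * tf.log(1.0 - value)
            + tf.lgamma(alpha + beta)
            - tf.lgamma(alpha) - tf.lgamma(beta))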

Or were you imagining a different way of combining them?

fonnesbeck commented:

Keep in mind that PyMC isn't really another language -- one is essentially coding models in pure Python, by instantiating high-level PyMC objects. We have studiously avoided exposing most of Theano to users. The DSL (if you want to call it that) just uses standard Python idioms, for the most part (e.g. context managers for specifying models).

datnamer commented:

@twiecki

Pymc3 has the most appealing DSL in my opinion.

dustinvtran commented:

Hi @twiecki @fonnesbeck. Thanks for the description! Yes, it would be great to have PyMC as a modeling language if it's not much effort to get it in. Generally, the library just needs a way to calculate the model's log joint density log p(x, z) and its gradient.

The easiest way to get PyMC into the library is to treat the model's log density function as a single node in TensorFlow's graph. How it's computed internally is a black box; it need only take the right inputs and produce the right output, along with a way to differentiate this density. If there's a way to get log density evaluations and their gradients out of PyMC programs, this can be achieved easily.
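
As a minimal sketch of that black-box idea (TF 1.x, with a stand-in density instead of a PyMC model):

# Wrap an arbitrary Python log density as one node in TensorFlow's
# graph via tf.py_func. log_joint here is a placeholder for whatever
# computes log p(x, z), not PyMC's.
import numpy as np
import tensorflow as tf

def log_joint(z):
    # stand-in density, just to show the mechanics
    return np.float32(-0.5 * np.sum(z ** 2))

z = tf.placeholder(tf.float32, [None])
log_p = tf.py_func(log_joint, [z], tf.float32)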

Further integration would be as you said: treat the PyMC program not as one node but as its own corresponding graph, ported over from Theano to TensorFlow.

twiecki commented:

OK, that should be pretty simple actually, as we expose model.logp() and model.dlogp() directly.
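
For example (PyMC3 3.x, with a toy model just to show the calls):

# Evaluate the model's joint log density and its gradient at the
# model's test point.
import pymc3 as pm

with pm.Model() as model:
    theta = pm.Beta('theta', 1, 1, transform=None)
    pm.Bernoulli('x', theta, observed=[0, 1, 0, 0])

point = model.test_point
print(model.logp(point))            # log p(x, z) at the point
print(model.dlogp([theta])(point))  # its gradient w.r.t. theta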

dustinvtran commented:

Perfect. Then the first step would be to write a wrapper in the same way the one for Stan is written. It will take a vector of parameters from TensorFlow and convert it to the right dictionary format to plug into model.logp().

akucukelbir commented:

nice!

twiecki commented:

This seems to work:

import edward as ed
import numpy as np
import tensorflow as tf
from edward import Variational, Beta
import pymc3 as pm
import theano

# shared variable holding the observed data; its value is swapped in
# at every log density evaluation
data_shared = theano.shared(np.zeros(1))

with pm.Model() as model:
    beta = pm.Beta('beta', 1, 1, transform=None)
    out = pm.Bernoulli('data', 
                       beta,
                       observed=data_shared)

class PyMC3Model:
    """
    Model wrapper for models written in PyMC3.

    Parameters
    ----------
    model : pymc3.Model object
    observed : The shared theano tensor passed to the model likelihood
    """
    def __init__(self, model, observed):
        self.model = model
        self.observed = observed

        # restrict to the model's continuous input variables
        vars = pm.inputvars(model.cont_vars)

        # bijection mapping a flat parameter array to the dict format
        # that the compiled logp/dlogp functions expect
        bij = pm.DictToArrayBijection(pm.ArrayOrdering(vars), model.test_point)
        self.logp = bij.mapf(model.fastlogp)
        self.dlogp = bij.mapf(model.fastdlogp(vars))

        self.num_vars = len(vars)

    def log_prob(self, xs, zs):
        # expose the PyMC3 log density as one black-box node in the TF graph
        return tf.py_func(self._py_log_prob, [xs, zs], [tf.float32])[0]

    def _py_log_prob(self, xs, zs):
        n_minibatch = zs.shape[0]
        lp = np.zeros(n_minibatch, dtype=np.float32)
        # push the observed data into the shared theano variable, then
        # evaluate the log density once per latent sample
        self.observed.set_value(xs)
        for s in range(n_minibatch):
            lp[s] = self.logp(zs[s, :])

        return lp

data = ed.Data(np.array([0, 1, 0, 0, 0, 0, 0, 0, 0, 1]))
m = PyMC3Model(model, data_shared)
variational = Variational()
variational.add(Beta(m.num_vars))
inference = ed.MFVI(m, variational, data)
inference.run(n_iter=10000)

twiecki commented:

Does it not require the gradient somewhere?

akucukelbir commented:

this is cool. i'll try to play around with this sometime soon.

@twiecki we implicitly end up calling tensorflow's autodiff in ed.MFVI

twiecki commented:

But how can it compute the gradient over the pymc3 blackbox node?

akucukelbir commented:

in this particular model, we use the score function gradient estimator. thus, we only need to evaluate _py_log_prob and the autodiff'ing happens on the Beta in variational.
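
concretely, a rough numpy sketch of the estimator (illustrative, not edward's actual implementation; finite differences just to keep it short):

# score function estimator for a Beta(a, b) variational approximation:
# grad_lambda E_q[ log p(x, z) - log q(z; lambda) ]
#   = E_q[ grad_lambda log q(z; lambda) * (log p(x, z) - log q(z; lambda)) ]
import numpy as np
from scipy.stats import beta

def score_grad(log_p, a, b, n_samples=1000, eps=1e-6):
    z = beta.rvs(a, b, size=n_samples)
    w = log_p(z) - beta.logpdf(z, a, b)
    # finite-difference gradients of log q w.r.t. (a, b), for brevity
    da = (beta.logpdf(z, a + eps, b) - beta.logpdf(z, a, b)) / eps
    db = (beta.logpdf(z, a, b + eps) - beta.logpdf(z, a, b)) / eps
    return np.mean(da * w), np.mean(db * w)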

dustinvtran commented:

Awesome! This is excellent. Yes, for this Beta-Bernoulli model we just use the score function gradient, which does not require model gradients.

When I run the code, it seems to converge to roughly -9.6, whereas the existing examples converge to around -6.2. Is this a bug or something different in how this model is written compared to the other Beta-Bernoulli examples?

twiecki commented:

Ah, this is due to the autotransformation we apply to beta. Setting it to None fixes it; the run now converges to -6.2. I updated the code above.

Regarding the autodiff -- is there an API for the model gradient?

Also, is mini-batching supported right now? Seems like xs would also need to be a list of sub-samples?

dustinvtran commented:

That's really cool. So I'm not too familiar with PyMC3 syntax. Is there a reason that data has to appear both when defining the model and when calculating the joint log density? Also, do you mind submitting a pull request so we can officially merge this in? :)

> Regarding the autodiff -- is there an API for the model gradient?

It's mentioned here in TensorFlow's API but I haven't spent much time figuring out how to register gradients. It would be excellent to do this though.
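
One recipe I've seen for TF 1.x registers a named gradient and overrides PyFunc's gradient (a hedged sketch, not something wired up here):

# Attach a custom gradient to a py_func node. 'grad' is a function
# grad(op, g); for PyMC3 it could call model.dlogp through a second
# py_func. Names here are illustrative.
import tensorflow as tf

def py_func_with_grad(func, inp, Tout, grad):
    grad_name = 'PyFuncGrad_' + str(id(func))   # unique registration name
    tf.RegisterGradient(grad_name)(grad)
    g = tf.get_default_graph()
    with g.gradient_override_map({'PyFunc': grad_name}):
        return tf.py_func(func, inp, Tout)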

> Also, is mini-batching supported right now? Seems like xs would also need to be a list of sub-samples?

Yes, the default is that we subsample from the first slice of the data structure, so random elements from a vector in this case, or more generally rows in a matrix or matrices from a tensor. You can see this in edward/data.py.
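
Roughly, the subsampling amounts to something like this (illustrative, not the actual edward/data.py code):

# subsample along the first axis of the data structure
import numpy as np

x = np.array([0, 1, 0, 0, 0, 0, 0, 0, 0, 1])
idx = np.random.choice(x.shape[0], size=5, replace=False)
x_batch = x[idx]  # random elements of a vector; rows of a matrix, etc.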

twiecki commented:

> That's really cool. So I'm not too familiar with PyMC3 syntax. Is there a reason that data has to appear both when defining the model and when calculating the joint log density?

It's not required; you only have to set something for the shape. I changed the above example to do so.

> Also, do you mind submitting a pull request so we can officially merge this in? :)

Yeah, I'll do that.

twiecki commented:

Here we go: #75
