Comments (4)
Hi @brockf,
I just realised I had never merged the self-tuning HMC sampler into the current version of greta, it was languishing in a fork somewhere.
I've just merged it into the dev branch, so you can install it with:
devtools::install_github("goldingn/greta@dev")
Running your example with this version, it has no problem sampling, and converges quite nicely.
One thing to bear in mind is that when there's lots of data, it becomes quite important to provide reasonable initial values. That's because the parameters are very strongly identified (little uncertainty), so the initial conditions can have very low density, leading to rejected samples.
If you still have convergence problems with the new sampler, providing initial values might be worth a try.
Happy to help if that doesn't work though!
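For reference, here's a minimal sketch of passing initial values. Note this uses the initials() helper and the initial_values argument from more recent greta releases, which may differ from the dev-branch version discussed in this thread, and the model itself is just an invented toy regression:

    library(greta)

    # toy data: y depends linearly on x
    x <- rnorm(100)
    y <- 2 * x + rnorm(100)

    # priors and likelihood
    beta <- normal(0, 10)
    sd <- lognormal(0, 1)
    distribution(y) <- normal(beta * x, sd)
    m <- model(beta, sd)

    # start the chain near plausible values rather than a random draw
    draws <- mcmc(m, initial_values = initials(beta = 2, sd = 1))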
from greta.
Hi Brock, thanks for trying out greta and for the feedback!
So far I've only implemented 'vanilla' Hamiltonian Monte Carlo, which requires manual* tuning of hyperparameters. The leapfrog step size 'epsilon' is the key thing to tune, and can be set via the control argument to mcmc(), like this:
mcmc(..., control = list(epsilon = 0.005))
In general the more data, the more extreme the density surface to explore, so a smaller epsilon will lead to higher acceptance rates. For the model above, this works well for me (no rejections):
draws <- mcmc(greta_model, n_samples = 1000, warmup = 100,
              control = list(epsilon = 0.0001))
In the longer term, I plan to hook Stan's samplers into greta, and their implementation of NUTS seems to do a great job of avoiding this issue of manually tuning the sampler. I'll keep vanilla HMC as an option, since there are circumstances when it's more efficient.
In the short term, I now realise there's no documentation of these control parameters in the mcmc() help file, so I'll fix that right away. I'll also mention the need to think about tuning epsilon.
I hope this helps for your model. I'd be interested to hear what you're using greta for!
*actually the sampler tries to optimise epsilon starting from the value specified in control, but it relies on a good starting value
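To make the role of epsilon concrete, here is a toy vanilla-HMC step in base R. This is an illustration only, not greta's implementation, and the target (a standard normal) is invented for the example. It shows why a smaller epsilon raises the acceptance rate: the leapfrog trajectory tracks the density surface more closely, so the Hamiltonian drifts less and fewer proposals are rejected:

    # log density and gradient of a standard normal target
    log_density <- function(x) -0.5 * sum(x^2)
    grad_log_density <- function(x) -x

    hmc_step <- function(x, epsilon, n_leapfrog = 10) {
      p <- rnorm(length(x))                             # sample momentum
      current_H <- -log_density(x) + 0.5 * sum(p^2)     # initial Hamiltonian
      x_new <- x
      # leapfrog integration: half momentum step, alternating full steps,
      # final half momentum step
      p <- p + 0.5 * epsilon * grad_log_density(x_new)
      for (i in seq_len(n_leapfrog)) {
        x_new <- x_new + epsilon * p
        if (i < n_leapfrog)
          p <- p + epsilon * grad_log_density(x_new)
      }
      p <- p + 0.5 * epsilon * grad_log_density(x_new)
      proposed_H <- -log_density(x_new) + 0.5 * sum(p^2)
      # Metropolis accept/reject on the change in Hamiltonian
      if (log(runif(1)) < current_H - proposed_H) x_new else x
    }

With a large epsilon the integration error (and hence proposed_H - current_H) blows up and almost everything is rejected, which is exactly the symptom described above when there is lots of data.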
I fixed the docs on master in #61, so closing this now. Please reopen if the issue isn't fixed for you.
Thanks, @goldingn ! Appreciate the follow-up :)