
TuringModels.jl's People

Contributors

bobinmathew, github-actions[bot], goedman, karajan9, rikhuijzer, torkar, tpapp, trappmartin, yiyuezhuo


TuringModels.jl's Issues

Errors when running julia --project -i scripts/basic-example.jl

When running

wew-mac:TuringModels.jl witoldwolski$ julia --project -i scripts/basic-example.jl

I am getting errors (see below).

However, starting the Julia REPL with julia and no --project option
and copy-pasting the code from basic-example.jl works nicely.

When starting Julia with the --project option,

wew-mac:TuringModels.jl witoldwolski$ julia --project

and copy-pasting the same code from basic-example.jl, I also get the error.

wew-mac:TuringModels.jl witoldwolski$ julia --project -i scripts/basic-example.jl
Sampling 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| Time: 0:00:01
WARNING: both Bijectors and Base export "stack"; uses of it in module Turing must be qualified
ERROR: LoadError: MethodError: no method matching ADgradient(::Val{:ForwardDiff}, ::Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:s, :m), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:s, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:s, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:m, Setfield.IdentityLens}, Int64}, Vector{Normal{Float64}}, Vector{AbstractPPL.VarName{:m, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(gdemo), (:x, :y), (), (), Tuple{Float64, Int64}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext}; gradientconfig::ForwardDiff.GradientConfig{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 2}}})

Closest candidates are:
  ADgradient(::Val{:ForwardDiff}, ::Any; chunk, tag, x) got unsupported keyword argument "gradientconfig"
   @ LogDensityProblemsADForwardDiffExt ~/.julia/packages/LogDensityProblemsAD/pwc6T/ext/LogDensityProblemsADForwardDiffExt.jl:98
  ADgradient(::Val{kind}, ::Any; kwargs...) where kind
   @ LogDensityProblemsAD ~/.julia/packages/LogDensityProblemsAD/pwc6T/src/LogDensityProblemsAD.jl:68
  ADgradient(::Turing.Essential.ReverseDiffAD{false}, ::Turing.LogDensityFunction) got unsupported keyword argument "gradientconfig"
   @ Turing ~/.julia/packages/Turing/UsWJl/src/essential/ad.jl:115
  ...

Stacktrace:
  [1] kwerr(::NamedTuple{(:gradientconfig,), Tuple{ForwardDiff.GradientConfig{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 2}}}}}, ::Function, ::Val{:ForwardDiff}, ::Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:s, :m), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:s, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:s, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:m, Setfield.IdentityLens}, Int64}, Vector{Normal{Float64}}, Vector{AbstractPPL.VarName{:m, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(gdemo), (:x, :y), (), (), Tuple{Float64, Int64}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext})
    @ Base ./error.jl:165
  [2] ADgradient(ad::Turing.Essential.ForwardDiffAD{0, true}, ℓ::Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:s, :m), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:s, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:s, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:m, Setfield.IdentityLens}, Int64}, Vector{Normal{Float64}}, Vector{AbstractPPL.VarName{:m, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(gdemo), (:x, :y), (), (), Tuple{Float64, Int64}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext})
    @ Turing.Essential ~/.julia/packages/Turing/UsWJl/src/essential/ad.jl:102
  [3] ADgradient(ℓ::Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:s, :m), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:s, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:s, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:m, Setfield.IdentityLens}, Int64}, Vector{Normal{Float64}}, Vector{AbstractPPL.VarName{:m, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(gdemo), (:x, :y), (), (), Tuple{Float64, Int64}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{NUTS{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.DiagEuclideanMetric}}, DynamicPPL.DefaultContext})
    @ Turing.Essential ~/.julia/packages/Turing/UsWJl/src/essential/ad.jl:82
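For what it's worth, the unsupported gradientconfig keyword suggests that the project environment resolves a Turing version that is incompatible with the installed LogDensityProblemsAD, which would explain why the error only appears with --project. A guess at a first step, not a confirmed fix, is to re-instantiate the project's pinned versions before running the script:

julia --project -e 'using Pkg; Pkg.instantiate(); Pkg.status()'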

Link Edition 1 and 2 models

The file index.md contains links from a model number in Edition 1 or 2 to a model web page like models/africa. However, some of the models in Edition 2 are (nearly) identical to those in Edition 1, so index.md can link to the same file for both.

To resolve this issue, someone has to read through Statistical Rethinking Edition 2 and check which models are the same as the models defined in this repository. If a model is the same, a link should be added to index.md.

Varying Slopes

Hi,

Thanks for putting this work out in the open. I just recently finished McElreath's excellent book and was attempting to translate some of the models over to Julia. I am having a ton of trouble figuring out how to specify a multilevel model in Turing with the correct Multivariate Normal prior for the varying intercepts and slopes like in the cafes model m13.1 in the book.

Do you happen to have any examples of a varying intercept/slope model that uses the MvNormal prior?

Thanks again!
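For anyone landing on this issue, below is a minimal sketch of such a model. It is not from this repository: the function name varying_slopes, the n_cafes keyword, and the data columns cafe, afternoon, and wait are made up, and the priors only loosely follow the book's m13.1.

using Turing, LinearAlgebra

@model function varying_slopes(cafe, afternoon, wait; n_cafes = 20)
    # Population-level intercept and slope.
    a ~ Normal(0, 10)
    b ~ Normal(0, 10)
    # Scales of the varying effects and of the observation noise.
    sigma_cafe ~ filldist(truncated(Cauchy(0, 2), 0, Inf), 2)
    sigma ~ truncated(Cauchy(0, 2), 0, Inf)
    # Correlation between varying intercepts and slopes.
    Rho ~ LKJ(2, 2.0)
    Sigma = Symmetric(Diagonal(sigma_cafe) * Rho * Diagonal(sigma_cafe))
    # One (intercept, slope) column per cafe, jointly multivariate normal.
    ab ~ filldist(MvNormal([a, b], Sigma), n_cafes)
    mu = ab[1, cafe] .+ ab[2, cafe] .* afternoon
    wait ~ MvNormal(mu, sigma^2 * I)
end

chn = sample(varying_slopes(cafe, afternoon, wait), NUTS(0.65), 1000)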

Add Cholesky optimization to LKJ priors when available.

Hi Rob,

It turns out that there is an ongoing effort to add Cholesky factorization to the LKJ prior distribution. This will improve the efficiency and numerical stability of the multilevel models. I am willing to add those improvements when they become available.
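To make the intent concrete, here is a sketch of what the varying effects could look like once Cholesky support lands, sampling the Cholesky factor of the correlation matrix directly and using a non-centered parameterization. All names here are hypothetical and the likelihood is elided.

using Turing, LinearAlgebra

@model function varying_effects_cholesky(n_cafes = 20)
    sigma_cafe ~ filldist(truncated(Cauchy(0, 2), 0, Inf), 2)
    # Cholesky factor of the correlation matrix, instead of the dense LKJ prior.
    L_Rho ~ LKJCholesky(2, 2.0)
    # Non-centered parameterization: iid standard normals, transformed
    # into correlated varying effects.
    z ~ filldist(Normal(), 2, n_cafes)
    ab = Diagonal(sigma_cafe) * L_Rho.L * z
    # ... likelihood as in the dense-LKJ model ...
end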

NUTS(0.65) versus NUTS(0.95)

In dc67846, Martin Trapp has replaced many occurrences of NUTS(0.95) with NUTS(0.65). Still, I see both in this repository. Is there a reason why sometimes 0.65 should be used and at other times 0.95?
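For context, my understanding (not from this repository) is that the positional argument is the target acceptance ratio of NUTS's step-size adaptation: 0.65 is Turing's default, while a higher target such as 0.95 forces smaller steps, which costs time but reduces divergent transitions on harder posteriors. For example, with model being any Turing model:

chn = sample(model, NUTS(0.65), 1000)  # Turing's default target acceptance ratio
chn = sample(model, NUTS(0.95), 1000)  # smaller steps: fewer divergences, slower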

Unclarity about the corresponding book version

It seems that model 13.2 does not correspond to the model in the 2020 edition of the Statistical Rethinking book. In the book, model 13.2 is about tadpoles and tanks, whereas the model in this repository is about gender and department.

Which version of the book is this repository corresponding to?

Rename chains to chn

In the scripts, the variable chains is used all over the place. Unfortunately, chains is also exported by MCMCChains, which can cause issues like

julia> chains = sample(model(a, b), NUTS(0.65), 1000)
ERROR: invalid redefinition of constant chains

Thanks to David Widmann for figuring this out (TuringLang/Turing.jl#1529).

To solve this issue, and to avoid problems in the future for us and for users, we would be better off renaming the variable to chn or chns or something similar.
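For example (the model call is just illustrative):

julia> chn = sample(model(a, b), NUTS(0.65), 1000)  # no clash with the chains exported by MCMCChains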

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

Replace `reverse_diff` with `tracker` or remove this line

CompatHelper has suggested adding support for Turing 0.14 in #29.

But as noted in this comment, the AD backend name reverse_diff is deprecated and slated for removal, and it has in fact already been removed from bijectors#master, so it now raises the following error:

julia> Turing.setadbackend(:reverse_diff)
ERROR: MethodError: no method matching setadbackend(::Val{:reverse_diff})
Closest candidates are:
  setadbackend(::Val{:zygote}) at /home/yiyuezhuo/.julia/dev/Bijectors/src/interface.jl:21
  setadbackend(::Val{:tracker}) at /home/yiyuezhuo/.julia/dev/Bijectors/src/interface.jl:20
  setadbackend(::Val{:reversediff}) at /home/yiyuezhuo/.julia/dev/Bijectors/src/interface.jl:19
  ...
Stacktrace:
 [1] setadbackend(::Symbol) at /home/yiyuezhuo/.julia/packages/Turing/tf1R6/src/core/ad.jl:8
 [2] top-level scope at REPL[39]:1

So to properly support Turing 0.14 and upcoming releases, this line should be replaced with the equivalent :tracker. I think the point of commit 6b0d4b4, which replaced reversediff with reverse_diff, was that a `using ReverseDiff` statement was missing, so the line simply did not work; the point of the older commit dc67846, which replaced reverse_diff with reversediff, was efficiency and the fact that reverse_diff is outdated. Some experiments show that ReverseDiff (the :reversediff backend) is the fastest backend in many cases.

But in my opinion, most models specified by TuringModels are small enough that the default ForwardDiff backend is fastest anyway. It may be better to just remove these lines (for the same reason ForwardDiff, rather than Tracker, was chosen as the default AD). Writing tests that compare the efficiency of reversediff and forwarddiff would also be useful.
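For reference, a sketch of the backend names involved (which backend is actually best is the open question above):

using Turing
# Turing.setadbackend(:reverse_diff)  # old name; removed upstream
Turing.setadbackend(:tracker)         # drop-in replacement for :reverse_diff
# Turing.setadbackend(:reversediff)   # often fastest; requires `using ReverseDiff`
# Turing.setadbackend(:forwarddiff)   # the default; good for small models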
