turingtutorials's Introduction

Turing Tutorials


This repository contains tutorials and docs on the probabilistic programming language Turing.

The tutorials are defined in the tutorials folder; all outputs are generated automatically from those source files.

Additional educational materials can be found at StatisticalRethinkingJulia/SR2TuringPluto.jl, which contains Turing adaptations of models from Richard McElreath's Statistical Rethinking. It is a highly recommended resource if you are looking for a greater breadth of examples.

Interactive Notebooks

To run the tutorials interactively via Jupyter notebooks, install the package and open the tutorials as follows:

# Install TuringTutorials
using Pkg
pkg"add https://github.com/TuringLang/TuringTutorials"

# Generate notebooks in subdirectory "notebook"
using TuringTutorials
TuringTutorials.weave(; build=(:notebook,))

# Start Jupyter in "notebook" subdirectory
using IJulia
IJulia.notebook(; dir="notebook")

You can weave the notebooks to a different folder with

TuringTutorials.weave(; build=(:notebook,), out_path_root="my/custom/directory")

Then the notebooks will be generated in the folder my/custom/directory/notebook and you can start Jupyter with

IJulia.notebook(; dir="my/custom/directory/notebook")

Contributing

First of all, make sure that your current directory is TuringTutorials. All of the files are generated from the .jmd files in the tutorials folder, so to change a tutorial, edit one of the .jmd files in the tutorials folder.

To run the generation process for a single tutorial, run, for example:

using TuringTutorials
TuringTutorials.weave("00-introduction", "00_introduction.jmd")

To generate all files, run:

TuringTutorials.weave()

If you add a new tutorial that requires new packages, simply updating your local environment will change the Project.toml and Manifest.toml files. When this occurs, include the updated environment files in the PR.
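
For concreteness, a minimal sketch of adding a dependency to a tutorial's own environment (the folder and package names below are purely illustrative):

using Pkg
Pkg.activate("tutorials/my-new-tutorial")  # hypothetical tutorial folder with its own Project.toml
Pkg.add("SomeNewPackage")                  # hypothetical package; this updates Project.toml and Manifest.toml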

Credits

The structure of this repository is mainly based on SciMLTutorials.jl.


turingtutorials's Issues

BNN Variational Inference Error

@torfjelde the BNN tutorial is failing because

update(q, (μ, exp.(ω)))

no longer works, since update does not seem to be exported anymore. I tried calling

Variational.update(q, (μ, exp.(ω)))

but there's a new error:

MethodError: no method matching update(::DistributionsAD.TuringDiagMvNormal{Array{Float64,1},Array{Float64,1}}, ::Tuple{Array{Float64,1},Array{Float64,1}})
Closest candidates are:
  update(::DistributionsAD.TuringDiagMvNormal, ::Any, !Matched::Any) at /home/cameron/.julia/packages/Turing/cReBm/src/variational/advi.jl:8
  update(!Matched::TransformedDistribution, ::Any...) at /home/cameron/.julia/packages/Turing/cReBm/src/variational/advi.jl:9

Stacktrace:
 [1] update(::TransformedDistribution{DistributionsAD.TuringDiagMvNormal{Array{Float64,1},Array{Float64,1}},Stacked{Tuple{Identity{1}},1},Multivariate}, ::Tuple{Array{Float64,1},Array{Float64,1}}) at /home/cameron/.julia/packages/Turing/cReBm/src/variational/advi.jl:9
 [2] top-level scope at In[28]:10

What's the fix here?
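
Judging only from the candidate signatures in the error, update now appears to take the mean and the scale as two separate positional arguments rather than as a tuple, so an untested sketch of a possible workaround is:

Variational.update(q, μ, exp.(ω))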

Ornstein-Uhlenbeck / GP

Do a comparison of a vanilla Ornstein-Uhlenbeck model implemented in Turing and a Stheno GP-based implementation.
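
As a starting point, a minimal sketch of a discretised OU model in Turing (the parameterisation and priors are illustrative, not the planned tutorial's):

using Turing

@model function ou_model(y, Δ)
    θ ~ truncated(Normal(0, 1), 0, Inf)   # mean-reversion rate
    σ ~ truncated(Normal(0, 1), 0, Inf)   # diffusion scale
    for t in 2:length(y)
        m = y[t - 1] * exp(-θ * Δ)              # conditional mean of the OU transition
        v = σ^2 * (1 - exp(-2θ * Δ)) / (2θ)     # conditional variance of the OU transition
        y[t] ~ Normal(m, sqrt(v))
    end
end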

Gaussian Mixture Models

Here's the original example. This one is fairly well commented and just needs a bit more background info thrown in. I may or may not be able to handle this one.

On-line thinning

Is there some way to store thinned MCMC chains without doing post-sampling thinning? I have a huge number of parameters that I run with AdvancedMH, and I only want to save, say, every 10th sample so as not to run into storage limitations.
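
If the installed AbstractMCMC version supports it, the thinning keyword of sample discards draws during sampling, so the full chain never has to be stored. A minimal sketch with a purely illustrative target density:

using AdvancedMH, Distributions, LinearAlgebra, MCMCChains

# Stand-in target: an isotropic normal log density (replace with the real model).
logdensity(x) = logpdf(MvNormal(zeros(2), I), x)
model = DensityModel(logdensity)
sampler = RWMH(MvNormal(zeros(2), I))

# thinning = 10 (if available in your AbstractMCMC version) keeps only every 10th draw.
chain = sample(model, sampler, 100_000; thinning = 10, chain_type = Chains)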

Stochastic Tunneling

@mohamed82008 wrote a nifty thing to do stochastic tunneling, and it might be nice to do a little write-up on it.

using Distributions, LinearAlgebra

# A "distribution" whose log density is the objective function itself.
# It also records the best objective value and solution seen so far.
mutable struct ObjDist{F, Tobj, Tsol, Tstep, Tbound} <: Distribution{Multivariate, Continuous}
    f::F            # objective function to maximise
    best_obj::Tobj  # best objective value found so far
    best_sol::Tsol  # solution achieving best_obj
    step::Tstep     # proposal scale
    lb::Tbound      # lower bound for each coordinate
    ub::Tbound      # upper bound for each coordinate
end

# Initialise with a random point drawn coordinate-wise from a truncated normal.
function ObjDist(f, N=1; step = 1.0, lb=-Inf, ub=Inf)
    x0 = rand.(TruncatedNormal.(zeros(N), step, lb, ub))
    obj = f(x0)
    return ObjDist(f, obj, x0, step, lb, ub)
end

# Propose a new point from truncated normals centred at the best solution so far.
function Base.rand(dist::ObjDist)
    N = length(dist.best_sol)
    r = rand.(TruncatedNormal.(dist.best_sol, dist.step, dist.lb, dist.ub))
    return r
end

# Treat the objective value as the log density and record any improvement.
function Distributions.logpdf(dist::ObjDist, x::AbstractVector)
    obj = dist.f(x)
    if obj > dist.best_obj || isnan(dist.best_obj)
        dist.best_obj = obj
        dist.best_sol .= x
    end
    return obj
end

using Turing

# Maximise f over R^N by sampling the objective "distribution" with Metropolis-Hastings.
function STUN(f, N, alg = MH(10000))
    dist = ObjDist(f, N)
    @model obj_model() = begin
        obj ~ dist
    end
    sample(obj_model(), alg)
    return dist.best_sol, dist.best_obj
end

# Example: maximise -norm(x .- 20), i.e. find x ≈ (20, 20, 20).
STUN(x->-norm(x .- 20), 3)

Issues in tutorials that need to be resolved

Issues to resolve after jmd-staging reaches master:

  1. 10-bayesian-differential-equations
    • Very last cell which runs inference in parallel fails.
    • MonteCarloSummary apparently not there anymore?
  2. 09-variational-inference
    • Printing of docs for certain functions will be put into the document as Markdown, which is not what we want.

Also in general, we should go through all the tutorials and ensure that the packages are up to date.

New tutorial request: Bayesian Structural Time Series model

Hello. I was just putting in a request for a tutorial on Bayesian Structural Time Series models using Turing. Structural time series models seem to be getting a lot more attention, especially in forecasting applications. There is a tutorial on structural time series using TensorFlow Probability that could probably be adapted to use Turing without much effort. The link is below.

https://blog.tensorflow.org/2019/03/structural-time-series-modeling-in.html

I will work on creating my own implementation of this model in Turing. If I can get it to work, I will contribute it to the tutorials repo.

Remove use of setprogress

Soon we'll be using Weave.jl, in which case the progress won't show in the output file anyway. Therefore, we might as well remove setprogress!(false) from all the tutorials so that users actually get a progress bar when they follow along.

Project.toml contains MCMCChain, tutorials are using MCMCChains

Running, for instance, the logistic regression tutorial doesn't work out of the box with the current environment, since MCMCChain (singular) is in Project.toml while the notebooks use MCMCChains (plural). Running Pkg.add("MCMCChains") makes the using statements work and shows MCMCChains appearing next to MCMCChain in Project.toml. I assume this is the correct dependency, but would love some clarification on whether it should be MCMCChain or MCMCChains.
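
Assuming MCMCChains (plural) is indeed the intended dependency, a minimal sketch of the fix in the repository environment would be:

using Pkg
Pkg.rm("MCMCChain")     # drop the outdated singular package
Pkg.add("MCMCChains")   # add the package the notebooks actually load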

Add Manifest.toml and CompatHelper

Currently, only the Project.toml files are checked in, it seems. This helps to avoid conflicts between the dependencies of different tutorials, but it does not ensure reproducibility. It would be easier to rerun and debug the examples if the Manifest.toml were checked in as well.

Additionally, one could then use CompatHelper to be informed about updated packages (it would be cool if the tutorials were rebuilt automatically, but it is helpful even if this has to be done manually, I think).
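
As a sketch, the manifest can be generated locally from the existing Project.toml and then committed:

using Pkg
Pkg.activate(".")    # the repository (or per-tutorial) environment
Pkg.instantiate()    # resolves dependencies and writes a Manifest.toml that can then be checked in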

Providing easily executable examples

Hi all,

Thanks for Turing and these great tutorials.
One issue I encountered while playing around with them was actually getting them to run.

While this might be an "it works on my machine" thing, it might help if there were something that let people run Turing models with one click and without any headaches.

I've cloned your repository and added the requirements in the ./binder dir:
https://github.com/arnim/TuringTutorials
https://notebooks.gesis.org/binder/v2/gh/arnim/TuringTutorials/master

However, I still get many errors. Am I using the wrong versions of the dependencies? Or is there anything else I'm missing?

The requirements are in https://github.com/arnim/TuringTutorials/blob/master/binder/REQUIRE
Maybe you have an idea.

Thanks in advance.

Tutorial mistakes

Hi. Thank you for the great tutorials. I'm just in the process of reading them, and they are indispensable for learning to use Turing.

I'll try to thank you by listing any issues I find here. As we say: leave a place as you would want to find it. (I'm referring to tutorial version 0.6.23.)

https://turing.ml/v0.6.23/tutorials/0-introduction/

Coin Flip without Turing

  1. Please mention that GR is needed for the graphics. (The animated graphic is extremely cool, and it is a joy to recreate and understand it.)

  2. In the animation, the following line has one $ too many:

    title = "Updated belief after $$N observations",

Coin Flip with Turing
The line

 chain = sample(coinflip(data), HMC(iterations, ϵ, τ));

should be

chain = sample(coinflip(data), HMC(ϵ, τ), iterations);

https://turing.ml/v0.6.23/tutorials/1-gaussianmixturemodel/
The following line doesn't parse:

 gmm_sampler = Gibbs(100, PG(100, 1, :k), HMC(1, 0.05, 10, :μ1, :μ2))

this one does, but since I'm at the limits of my current understanding, I'm not sure it does the right thing:

gmm_sampler = Gibbs(PG(1, :k), HMC(0.05, 10, :μ1, :μ2))

now this one is wrong (ignore this if the above was wrong and this is implied)

tchain = mapreduce(c -> sample(gmm_model, gmm_sampler,100), chainscat, 1:3);

but this works

tchain = mapreduce(c -> sample(gmm_model, gmm_sampler,100), chainscat, 1:3);

(same disclaimer)

The picture I get is different from the one in the tutorial. (I used the same random seed.)
This makes me doubt my corrections for the Gaussian mixtures. Help welcome.

That's it for today. Kind greetings!
z.


Binder link doesn't work

The binder link in the README (https://mybinder.org/v2/gh/TuringLang/TuringTutorials/master) currently doesn't work. The error is:

ERROR: Unsatisfiable requirements detected for package OpenBLAS_jll [4536629a]:
 OpenBLAS_jll [4536629a] log:
 ├─possible versions are: [0.3.5, 0.3.7, 0.3.9-0.3.10, 0.3.12-0.3.13] or uninstalled
 ├─restricted to versions 0.3.10 by an explicit requirement, leaving only versions 0.3.10
 └─restricted by julia compatibility requirements to versions: [0.3.5, 0.3.7, 0.3.9] or uninstalled — no versions left
Complete Binder build logs:
Waiting for build to start...
Picked Git content provider.
Cloning into '/tmp/repo2dockersf8lc91d'...
HEAD is now at 23f06ba Restore 10_diffeq temporarily
Building conda environment for python=3.7
Using JuliaProjectTomlBuildPack builder
Step 1/55 : FROM buildpack-deps:bionic
 ---> 17c4791bcf21
Step 2/55 : ENV DEBIAN_FRONTEND=noninteractive
 ---> Using cache
 ---> bbfddbe339b9
Step 3/55 : RUN apt-get -qq update &&     apt-get -qq install --yes --no-install-recommends locales > /dev/null &&     apt-get -qq purge &&     apt-get -qq clean &&     rm -rf /var/lib/apt/lists/*
 ---> Using cache
 ---> ecd90e648d87
Step 4/55 : RUN echo "en_US.UTF-8 UTF-8" > /etc/locale.gen &&     locale-gen
 ---> Using cache
 ---> 84e4cf037057
Step 5/55 : ENV LC_ALL en_US.UTF-8
 ---> Using cache
 ---> b31fcfb3f821
Step 6/55 : ENV LANG en_US.UTF-8
 ---> Using cache
 ---> cc4070f93d4e
Step 7/55 : ENV LANGUAGE en_US.UTF-8
 ---> Using cache
 ---> eb63af9d24c4
Step 8/55 : ENV SHELL /bin/bash
 ---> Using cache
 ---> ce528fbddc2f
Step 9/55 : ARG NB_USER
 ---> Using cache
 ---> e48d6003ea26
Step 10/55 : ARG NB_UID
 ---> Using cache
 ---> b2a28e2768fc
Step 11/55 : ENV USER ${NB_USER}
 ---> Using cache
 ---> 3f55b05ab45f
Step 12/55 : ENV HOME /home/${NB_USER}
 ---> Using cache
 ---> 283c3d74d5c1
Step 13/55 : RUN groupadd         --gid ${NB_UID}         ${NB_USER} &&     useradd         --comment "Default user"         --create-home         --gid ${NB_UID}         --no-log-init         --shell /bin/bash         --uid ${NB_UID}         ${NB_USER}
 ---> Using cache
 ---> ed06d9197f2d
Step 14/55 : RUN wget --quiet -O - https://deb.nodesource.com/gpgkey/nodesource.gpg.key |  apt-key add - &&     DISTRO="bionic" &&echo "deb https://deb.nodesource.com/node_14.x $DISTRO main" >> /etc/apt/sources.list.d/nodesource.list &&     echo "deb-src https://deb.nodesource.com/node_14.x $DISTRO main" >> /etc/apt/sources.list.d/nodesource.list
 ---> Using cache
 ---> 50c0f576e05d
Step 15/55 : RUN apt-get -qq update &&     apt-get -qq install --yes --no-install-recommends        less        nodejs        unzip    > /dev/null &&     apt-get -qq purge &&     apt-get -qq clean &&     rm -rf /var/lib/apt/lists/*
 ---> Using cache
 ---> f7acc769b7ca
Step 16/55 : EXPOSE 8888
 ---> Using cache
 ---> 690c9ccb8649
Step 17/55 : ENV APP_BASE /srv
 ---> Using cache
 ---> 735c2ea65e2f
Step 18/55 : ENV NPM_DIR ${APP_BASE}/npm
 ---> Using cache
 ---> 222fb1d7ab20
Step 19/55 : ENV NPM_CONFIG_GLOBALCONFIG ${NPM_DIR}/npmrc
 ---> Using cache
 ---> 47416cedadcd
Step 20/55 : ENV CONDA_DIR ${APP_BASE}/conda
 ---> Using cache
 ---> 47875cc55273
Step 21/55 : ENV NB_PYTHON_PREFIX ${CONDA_DIR}/envs/notebook
 ---> Using cache
 ---> 596a4fbf7883
Step 22/55 : ENV KERNEL_PYTHON_PREFIX ${NB_PYTHON_PREFIX}
 ---> Using cache
 ---> 015beca108fa
Step 23/55 : ENV JULIA_PATH ${APP_BASE}/julia
 ---> Using cache
 ---> 65728701f35b
Step 24/55 : ENV JULIA_DEPOT_PATH ${JULIA_PATH}/pkg
 ---> Using cache
 ---> 31074edb9bd5
Step 25/55 : ENV JULIA_VERSION 1.5.3
 ---> Using cache
 ---> d54078bc65a0
Step 26/55 : ENV JUPYTER ${NB_PYTHON_PREFIX}/bin/jupyter
 ---> Using cache
 ---> feb1d9890655
Step 27/55 : ENV JUPYTER_DATA_DIR ${NB_PYTHON_PREFIX}/share/jupyter
 ---> Using cache
 ---> 839f14dfaa84
Step 28/55 : ENV PATH ${NB_PYTHON_PREFIX}/bin:${CONDA_DIR}/bin:${NPM_DIR}/bin:${JULIA_PATH}/bin:${PATH}
 ---> Using cache
 ---> eca41cb2fb1e
Step 29/55 : COPY --chown=1000:1000 build_script_files/-2fusr-2flib-2fpython3-2e8-2fsite-2dpackages-2frepo2docker-2fbuildpacks-2fconda-2factivate-2dconda-2esh-391af5 /etc/profile.d/activate-conda.sh
 ---> Using cache
 ---> eedbb12978dd
Step 30/55 : COPY --chown=1000:1000 build_script_files/-2fusr-2flib-2fpython3-2e8-2fsite-2dpackages-2frepo2docker-2fbuildpacks-2fconda-2fenvironment-2epy-2d3-2e7-2efrozen-2eyml-037262 /tmp/environment.yml
 ---> Using cache
 ---> f6f2b68bef1e
Step 31/55 : COPY --chown=1000:1000 build_script_files/-2fusr-2flib-2fpython3-2e8-2fsite-2dpackages-2frepo2docker-2fbuildpacks-2fconda-2finstall-2dminiforge-2ebash-514214 /tmp/install-miniforge.bash
 ---> Using cache
 ---> 46091d8b28c0
Step 32/55 : RUN mkdir -p ${NPM_DIR} && chown -R ${NB_USER}:${NB_USER} ${NPM_DIR}
 ---> Using cache
 ---> 02a63f31b527
Step 33/55 : USER ${NB_USER}
 ---> Using cache
 ---> bd1b9e7626a7
Step 34/55 : RUN npm config --global set prefix ${NPM_DIR}
 ---> Using cache
 ---> 3b3dc9adaa7f
Step 35/55 : USER root
 ---> Using cache
 ---> d4de065d7368
Step 36/55 : RUN TIMEFORMAT='time: %3R' bash -c 'time /tmp/install-miniforge.bash' && rm /tmp/install-miniforge.bash /tmp/environment.yml
 ---> Using cache
 ---> 0b907318f44a
Step 37/55 : RUN mkdir -p ${JULIA_PATH} && curl -sSL "https://julialang-s3.julialang.org/bin/linux/x64/${JULIA_VERSION%[.-]*}/julia-${JULIA_VERSION}-linux-x86_64.tar.gz" | tar -xz -C ${JULIA_PATH} --strip-components 1
 ---> Using cache
 ---> 9b1be9161892
Step 38/55 : RUN mkdir -p ${JULIA_DEPOT_PATH} && chown ${NB_USER}:${NB_USER} ${JULIA_DEPOT_PATH}
 ---> Using cache
 ---> 89f99214c548
Step 39/55 : ARG REPO_DIR=${HOME}
 ---> Using cache
 ---> 1ffc812e3a4f
Step 40/55 : ENV REPO_DIR ${REPO_DIR}
 ---> Using cache
 ---> 78ca8bf607ec
Step 41/55 : WORKDIR ${REPO_DIR}
 ---> Using cache
 ---> b3f6d8effdad
Step 42/55 : RUN chown ${NB_USER}:${NB_USER} ${REPO_DIR}
 ---> Using cache
 ---> ba1a3a948d4c
Step 43/55 : ENV PATH ${HOME}/.local/bin:${REPO_DIR}/.local/bin:${PATH}
 ---> Using cache
 ---> a920acd6a059
Step 44/55 : ENV CONDA_DEFAULT_ENV ${KERNEL_PYTHON_PREFIX}
 ---> Using cache
 ---> 267f803ebd0b
Step 45/55 : ENV JULIA_PROJECT ${REPO_DIR}
 ---> Using cache
 ---> 72788f062f20
Step 46/55 : COPY --chown=1000:1000 src/ ${REPO_DIR}
 ---> 3145de5ac0cf
Step 47/55 : USER ${NB_USER}
 ---> Running in aaa2cddb96d4
Removing intermediate container aaa2cddb96d4
 ---> 982c6ec1b64a
Step 48/55 : RUN JULIA_PROJECT="" julia -e "using Pkg; Pkg.add(\"IJulia\"); using IJulia; installkernel(\"Julia\", \"--project=${REPO_DIR}\");" && julia --project=${REPO_DIR} -e 'using Pkg; Pkg.instantiate(); Pkg.resolve(); pkg"precompile"'
 ---> Running in 4ca3e9a45b4e
 Installing known registries into `/srv/julia/pkg`
######################################################################## 100.0%
      Added registry `General` to `/srv/julia/pkg/registries/General`
  Resolving package versions...
  Installed Parsers ───────── v1.0.15
  Installed MbedTLS ───────── v1.0.3
  Installed VersionParsing ── v1.2.0
  Installed ZeroMQ_jll ────── v4.3.2+5
  Installed Conda ─────────── v1.5.0
  Installed SoftGlobalScope ─ v1.1.0
  Installed JSON ──────────── v0.21.1
  Installed IJulia ────────── v1.23.1
  Installed MbedTLS_jll ───── v2.16.8+1
  Installed ZMQ ───────────── v1.2.1
  Installed Artifacts ─────── v1.3.0
  Installed JLLWrappers ───── v1.2.0
Updating `/srv/julia/pkg/environments/v1.5/Project.toml`
  [7073ff75] + IJulia v1.23.1
Updating `/srv/julia/pkg/environments/v1.5/Manifest.toml`
  [56f22d72] + Artifacts v1.3.0
  [8f4d0f93] + Conda v1.5.0
  [7073ff75] + IJulia v1.23.1
  [692b3bcd] + JLLWrappers v1.2.0
  [682c06a0] + JSON v0.21.1
  [739be429] + MbedTLS v1.0.3
  [c8ffd9c3] + MbedTLS_jll v2.16.8+1
  [69de0a69] + Parsers v1.0.15
  [b85f4697] + SoftGlobalScope v1.1.0
  [81def892] + VersionParsing v1.2.0
  [c2297ded] + ZMQ v1.2.1
  [8f1865be] + ZeroMQ_jll v4.3.2+5
  [2a0f44e3] + Base64
  [ade2ca70] + Dates
  [8ba89e20] + Distributed
  [7b1f6079] + FileWatching
  [b77e0a4c] + InteractiveUtils
  [76f85450] + LibGit2
  [8f399da3] + Libdl
  [56ddb016] + Logging
  [d6f4376e] + Markdown
  [a63ad114] + Mmap
  [44cfe95a] + Pkg
  [de0858da] + Printf
  [3fa0cd96] + REPL
  [9a3f8284] + Random
  [ea8e919c] + SHA
  [9e88b42a] + Serialization
  [6462fe0b] + Sockets
  [8dfed614] + Test
  [cf7118a7] + UUIDs
  [4ec0a83e] + Unicode
   Building Conda ─→ `/srv/julia/pkg/packages/Conda/x5ml4/deps/build.log`
   Building IJulia → `/srv/julia/pkg/packages/IJulia/IDNmS/deps/build.log`
[ Info: Installing Julia kernelspec in /srv/conda/envs/notebook/share/jupyter/kernels/julia-1.5
  Installed Libiconv_jll ───────────────── v1.16.0+6
  Installed IteratorInterfaceExtensions ── v1.0.0
  Installed CommonSubexpressions ───────── v0.3.0
  Installed JuliaVariables ─────────────── v0.2.3
  Installed ArrayLayouts ───────────────── v0.4.8
  Installed GeometryBasics ─────────────── v0.3.1
  Installed Requires ───────────────────── v1.0.3
  Installed GPUArrays ──────────────────── v5.2.0
  Installed NLSolversBase ──────────────── v7.7.0
  Installed PDMats ─────────────────────── v0.10.0
  Installed DistributionsAD ────────────── v0.6.9
  Installed GLM ────────────────────────── v1.3.10
  Installed PyPlot ─────────────────────── v2.9.0
  Installed FileIO ─────────────────────── v1.4.3
  Installed PyCall ─────────────────────── v1.91.4
  Installed VectorizationBase ──────────── v0.12.33
  Installed IfElse ─────────────────────── v0.1.0
  Installed OrderedCollections ─────────── v1.3.1
  Installed Colors ─────────────────────── v0.12.4
  Installed LibVPX_jll ─────────────────── v1.9.0+0
  Installed TimerOutputs ───────────────── v0.5.6
  Installed OpenSSL_jll ────────────────── v1.1.1+5
  Installed Tables ─────────────────────── v1.0.5
  Installed Ogg_jll ────────────────────── v1.3.4+1
  Installed LeftChildRightSiblingTrees ─── v0.1.2
  Installed Juno ───────────────────────── v0.8.3
  Installed Tracker ────────────────────── v0.2.11
  Installed RecursiveFactorization ─────── v0.1.4
  Installed Combinatorics ──────────────── v1.0.2
  Installed GPUCompiler ────────────────── v0.6.1
  Installed libvorbis_jll ──────────────── v1.3.6+5
  Installed VertexSafeGraphs ───────────── v0.1.2
  Installed Zlib_jll ───────────────────── v1.2.11+16
  Installed DiffEqSensitivity ──────────── v6.31.5
  Installed Zygote ─────────────────────── v0.5.7
  Installed NNlib ──────────────────────── v0.7.4
  Installed Opus_jll ───────────────────── v1.3.1+2
  Installed Adapt ──────────────────────── v2.1.0
  Installed Formatting ─────────────────── v0.4.1
  Installed ProgressMeter ──────────────── v1.3.3
  Installed BinaryProvider ─────────────── v0.5.10
  Installed Plots ──────────────────────── v1.6.4
  Installed LearnBase ──────────────────── v0.3.0
  Installed Missings ───────────────────── v0.4.4
  Installed StructArrays ───────────────── v0.4.4
  Installed SpecialFunctions ───────────── v0.10.3
  Installed Observables ────────────────── v0.3.1
  Installed RangeArrays ────────────────── v0.3.2
  Installed ModelingToolkit ────────────── v3.20.0
  Installed FunctionWrappers ───────────── v1.1.1
  Installed AdvancedMH ─────────────────── v0.5.1
  Installed QuasiMonteCarlo ────────────── v0.2.0
  Installed GeneralizedGenerated ───────── v0.2.7
  Installed Flux ───────────────────────── v0.11.1
  Installed Turing ─────────────────────── v0.14.3
  Installed MacroTools ─────────────────── v0.5.5
  Installed EzXML ──────────────────────── v1.1.0
  Installed NameResolution ─────────────── v0.1.5
  Installed CodecZlib ──────────────────── v0.7.0
  Installed DiffEqCallbacks ────────────── v2.14.1
  Installed CpuId ──────────────────────── v0.2.2
  Installed WoodburyMatrices ───────────── v0.5.2
  Installed DynamicHMC ─────────────────── v2.2.0
  Installed DataFrames ─────────────────── v0.21.7
  Installed FFMPEG_jll ─────────────────── v4.3.1+2
  Installed FixedPointNumbers ──────────── v0.8.4
  Installed ReverseDiff ────────────────── v1.4.3
  Installed NearestNeighbors ───────────── v0.4.6
  Installed ExprTools ──────────────────── v0.1.2
  Installed ConjugatePriors ────────────── v0.4.0
  Installed OrdinaryDiffEq ─────────────── v5.42.8
  Installed RecursiveArrayTools ────────── v2.7.0
  Installed ZygoteRules ────────────────── v0.2.0
  Installed Parsers ────────────────────── v1.0.10
  Installed AxisArrays ─────────────────── v0.4.3
  Installed SentinelArrays ─────────────── v1.2.15
  Installed EarCut_jll ─────────────────── v2.1.5+0
  Installed PositiveFactorizations ─────── v0.2.3
  Installed CSV ────────────────────────── v0.7.7
  Installed NamedArrays ────────────────── v0.9.4
  Installed ForwardDiff ────────────────── v0.10.12
  Installed MLStyle ────────────────────── v0.4.6
  Installed LoopVectorization ──────────── v0.8.26
  Installed FillArrays ─────────────────── v0.9.6
  Installed FFMPEG ─────────────────────── v0.4.0
  Installed ParameterizedFunctions ─────── v5.6.0
  Installed QuadGK ─────────────────────── v2.4.1
  Installed ShiftedArrays ──────────────── v1.0.0
  Installed AbstractFFTs ───────────────── v0.5.0
  Installed ExponentialUtilities ───────── v1.8.0
  Installed MbedTLS ────────────────────── v1.0.2
  Installed SparseDiffTools ────────────── v1.10.0
  Installed CEnum ──────────────────────── v0.4.1
  Installed YAML ───────────────────────── v0.4.2
  Installed DiffRules ──────────────────── v1.0.1
  Installed DataAPI ────────────────────── v1.3.0
  Installed CompilerSupportLibraries_jll ─ v0.3.3+0
  Installed NLsolve ────────────────────── v4.4.1
  Installed METIS_jll ──────────────────── v5.1.0+4
  Installed DataValues ─────────────────── v0.4.13
  Installed AxisAlgorithms ─────────────── v1.0.0
  Installed DiffEqFinancial ────────────── v2.4.0
  Installed MLJModelInterface ──────────── v0.3.5
  Installed CategoricalArrays ──────────── v0.8.2
  Installed AbstractTrees ──────────────── v0.3.3
  Installed GenericSVD ─────────────────── v0.3.0
  Installed Sundials_jll ───────────────── v5.2.0+1
  Installed PrettyPrint ────────────────── v0.2.0
  Installed ApproxBayes ────────────────── v0.3.2
  Installed ChainRules ─────────────────── v0.7.20
  Installed LoggingExtras ──────────────── v0.4.2
  Installed Contour ────────────────────── v0.5.5
  Installed MultivariateStats ──────────── v0.7.0
  Installed MultiScaleArrays ───────────── v1.8.1
  Installed ArnoldiMethod ──────────────── v0.0.4
  Installed InplaceOps ─────────────────── v0.3.0
  Installed ZipFile ────────────────────── v0.9.2
  Installed SteadyStateDiffEq ──────────── v1.5.1
  Installed CanonicalTraits ────────────── v0.2.2
  Installed libfdk_aac_jll ─────────────── v0.1.6+3
  Installed ChainRulesCore ─────────────── v0.9.10
  Installed FFTW_jll ───────────────────── v3.3.9+5
  Installed LAME_jll ───────────────────── v3.100.0+2
  Installed Showoff ────────────────────── v0.3.1
  Installed SLEEFPirates ───────────────── v0.5.5
  Installed HTTP ───────────────────────── v0.8.19
  Installed Optim ──────────────────────── v1.2.0
  Installed Rmath_jll ──────────────────── v0.2.2+1
  Installed DataStructures ─────────────── v0.18.6
  Installed LightGraphs ────────────────── v1.3.0
  Installed StatsFuns ──────────────────── v0.9.5
  Installed FiniteDiff ─────────────────── v2.7.0
  Installed PrettyTables ───────────────── v0.9.1
  Installed Compat ─────────────────────── v3.16.0
  Installed Conda ──────────────────────── v1.4.1
  Installed Functors ───────────────────── v0.1.0
  Installed MKL_jll ────────────────────── v2020.2.254+0
  Installed Documenter ─────────────────── v0.25.2
  Installed Interpolations ─────────────── v0.12.10
  Installed TranscodingStreams ─────────── v0.9.5
  Installed DimensionalPlotRecipes ─────── v1.2.0
  Installed StatsPlots ─────────────────── v0.14.13
  Installed DocStringExtensions ────────── v0.8.3
  Installed OpenSpecFun_jll ────────────── v0.5.3+3
  Installed NaturalSort ────────────────── v1.0.0
  Installed MCMCChains ─────────────────── v4.2.1
  Installed OpenBLAS_jll ───────────────── v0.3.10+0
  Installed BandedMatrices ─────────────── v0.15.20
  Installed PooledArrays ───────────────── v0.5.3
  Installed InvertedIndices ────────────── v1.0.0
  Installed TableOperations ────────────── v0.2.1
  Installed ResettableStacks ───────────── v1.0.0
  Installed Ratios ─────────────────────── v0.4.0
  Installed DiffEqPhysics ──────────────── v3.6.0
  Installed Distributions ──────────────── v0.23.11
  Installed UnPack ─────────────────────── v1.0.2
  Installed Roots ──────────────────────── v1.0.5
  Installed x264_jll ───────────────────── v2020.7.14+1
  Installed LaTeXStrings ───────────────── v1.1.0
  Installed Measures ───────────────────── v0.3.1
  Installed ScientificTypes ────────────── v1.0.0
  Installed LogDensityProblems ─────────── v0.10.3
  Installed SimpleTraits ───────────────── v0.9.3
  Installed DifferentialEquations ──────── v6.15.0
  Installed DynamicPPL ─────────────────── v0.9.1
  Installed PlotUtils ──────────────────── v1.0.6
  Installed ArgCheck ───────────────────── v2.1.0
  Installed Weave ──────────────────────── v0.10.3
  Installed SIMDPirates ────────────────── v0.8.25
  Installed AbstractMCMC ───────────────── v1.0.1
  Installed RecipesBase ────────────────── v1.1.0
  Installed DiffEqBayes ────────────────── v2.17.0
  Installed DelayDiffEq ────────────────── v5.24.2
  Installed LabelledArrays ─────────────── v1.3.0
  Installed SafeTestsets ───────────────── v0.0.1
  Installed Mustache ───────────────────── v1.0.5
  Installed XML2_jll ───────────────────── v2.9.10+2
  Installed ConsoleProgressMonitor ─────── v0.1.2
  Installed DataValueInterfaces ────────── v1.0.0
  Installed Arpack ─────────────────────── v0.4.0
  Installed Crayons ────────────────────── v4.0.4
  Installed EllipticalSliceSampling ────── v0.2.2
  Installed IniFile ────────────────────── v0.5.0
  Installed x265_jll ───────────────────── v3.0.0+2
  Installed DiffResults ────────────────── v1.0.2
  Installed TableTraits ────────────────── v1.0.0
  Installed Unitful ────────────────────── v1.4.1
  Installed AbstractAlgebra ────────────── v0.10.0
  Installed SymbolicUtils ──────────────── v0.5.1
  Installed Highlights ─────────────────── v0.4.5
  Installed libass_jll ─────────────────── v0.14.0+3
  Installed FFTW ───────────────────────── v1.2.4
  Installed MLDataPattern ──────────────── v0.5.3
  Installed IterativeSolvers ───────────── v0.8.4
  Installed Sundials ───────────────────── v4.3.0
  Installed TimeZones ──────────────────── v1.3.2
  Installed StatsModels ────────────────── v0.6.14
  Installed RecipesPipeline ────────────── v0.1.13
  Installed TransformVariables ─────────── v0.3.10
  Installed KernelDensity ──────────────── v0.6.0
  Installed SortingAlgorithms ──────────── v0.3.1
  Installed ColorSchemes ───────────────── v3.9.0
  Installed MLDataUtils ────────────────── v0.5.2
  Installed DiffEqNoiseProcess ─────────── v5.3.0
  Installed OffsetArrays ───────────────── v1.2.0
  Installed TerminalLoggers ────────────── v0.1.2
  Installed LatinHypercubeSampling ─────── v1.6.4
  Installed GR ─────────────────────────── v0.52.0
  Installed Reexport ───────────────────── v0.2.0
  Installed ColorTypes ─────────────────── v0.10.9
  Installed Media ──────────────────────── v0.5.0
  Installed EllipsisNotation ───────────── v0.4.0
  Installed Inflate ────────────────────── v0.1.2
  Installed StaticArrays ───────────────── v0.12.4
  Installed FastClosures ───────────────── v0.3.2
  Installed ConstructionBase ───────────── v1.0.0
  Installed StatsBase ──────────────────── v0.33.1
  Installed ArrayInterface ─────────────── v2.12.0
  Installed LineSearches ───────────────── v7.1.0
  Installed MuladdMacro ────────────────── v0.2.2
  Installed BoundaryValueDiffEq ────────── v2.5.0
  Installed Rmath ──────────────────────── v0.6.1
  Installed BenchmarkTools ─────────────── v0.5.0
  Installed StructTypes ────────────────── v1.1.0
  Installed IRTools ────────────────────── v0.4.1
  Installed Clustering ─────────────────── v0.14.1
  Installed RDatasets ──────────────────── v0.6.10
  Installed AdvancedHMC ────────────────── v0.2.25
  Installed FreeType2_jll ──────────────── v2.10.1+4
  Installed MappedArrays ───────────────── v0.2.2
  Installed Distances ──────────────────── v0.9.0
  Installed Bijectors ──────────────────── v0.8.5
  Installed StochasticDiffEq ───────────── v6.26.0
  Installed CUDA ───────────────────────── v1.3.3
  Installed MbedTLS_jll ────────────────── v2.16.8+0
  Installed FriBidi_jll ────────────────── v1.0.5+5
  Installed SuiteSparse_jll ────────────── v5.4.0+9
  Installed MLLabelUtils ───────────────── v0.5.2
  Installed AdvancedVI ─────────────────── v0.1.0
  Installed Libtask ────────────────────── v0.4.1
  Installed Sobol ──────────────────────── v1.4.0
  Installed LLVM ───────────────────────── v2.0.0
  Installed TreeViews ──────────────────── v0.3.0
  Installed IntervalSets ───────────────── v0.5.1
  Installed Mocking ────────────────────── v0.7.1
  Installed Widgets ────────────────────── v0.6.2
  Installed RData ──────────────────────── v0.7.2
  Installed DiffEqBase ─────────────────── v6.47.1
  Installed IntelOpenMP_jll ────────────── v2018.0.3+0
  Installed PoissonRandom ──────────────── v0.4.0
  Installed NaNMath ────────────────────── v0.3.4
  Installed GeometryTypes ──────────────── v0.8.3
  Installed Parameters ─────────────────── v0.12.1
  Installed Arpack_jll ─────────────────── v3.5.0+3
  Installed IterTools ──────────────────── v1.3.0
  Installed Latexify ───────────────────── v0.14.0
  Installed ProgressLogging ────────────── v0.1.3
  Installed PlotThemes ─────────────────── v2.0.0
  Installed DiffEqJump ─────────────────── v6.10.1
  Installed RandomNumbers ──────────────── v1.4.0
  Installed Bzip2_jll ──────────────────── v1.0.6+4
   Building Conda ───────→ `/srv/julia/pkg/packages/Conda/3rPhK/deps/build.log`
   Building PyCall ──────→ `/srv/julia/pkg/packages/PyCall/zqDXB/deps/build.log`
   Building GR ──────────→ `/srv/julia/pkg/packages/GR/BwGt2/deps/build.log`
   Building Plots ───────→ `/srv/julia/pkg/packages/Plots/4EfKl/deps/build.log`
   Building SLEEFPirates → `/srv/julia/pkg/packages/SLEEFPirates/jGsib/deps/build.log`
   Building FFTW ────────→ `/srv/julia/pkg/packages/FFTW/DMUbN/deps/build.log`
   Building TimeZones ───→ `/srv/julia/pkg/packages/TimeZones/v0mfN/deps/build.log`
   Building Libtask ─────→ `/srv/julia/pkg/packages/Libtask/Zo6uM/deps/build.log`
ERROR: Unsatisfiable requirements detected for package OpenBLAS_jll [4536629a]:
 OpenBLAS_jll [4536629a] log:
 ├─possible versions are: [0.3.5, 0.3.7, 0.3.9-0.3.10, 0.3.12-0.3.13] or uninstalled
 ├─restricted to versions 0.3.10 by an explicit requirement, leaving only versions 0.3.10
 └─restricted by julia compatibility requirements to versions: [0.3.5, 0.3.7, 0.3.9] or uninstalled — no versions left
Stacktrace:
 [1] propagate_constraints!(::Pkg.Resolve.Graph, ::Set{Int64}; log_events::Bool) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/Resolve/graphtype.jl:1005
 [2] propagate_constraints! at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/Resolve/graphtype.jl:946 [inlined] (repeats 2 times)
 [3] simplify_graph!(::Pkg.Resolve.Graph, ::Set{Int64}; clean_graph::Bool) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/Resolve/graphtype.jl:1460
 [4] simplify_graph! at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/Resolve/graphtype.jl:1460 [inlined] (repeats 2 times)
 [5] resolve_versions!(::Pkg.Types.Context, ::Array{Pkg.Types.PackageSpec,1}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/Operations.jl:375
 [6] up(::Pkg.Types.Context, ::Array{Pkg.Types.PackageSpec,1}, ::Pkg.Types.UpgradeLevel) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/Operations.jl:1222
 [7] up(::Pkg.Types.Context, ::Array{Pkg.Types.PackageSpec,1}; level::Pkg.Types.UpgradeLevel, mode::Pkg.Types.PackageMode, update_registry::Bool, kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:245
 [8] #up#38 at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:68 [inlined]
 [9] #resolve#110 at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:251 [inlined]
 [10] resolve at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:251 [inlined]
 [11] #resolve#109 at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:249 [inlined]
 [12] resolve() at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:249
 [13] top-level scope at none:1
Removing intermediate container 4ca3e9a45b4e
The command '/bin/sh -c JULIA_PROJECT="" julia -e "using Pkg; Pkg.add(\"IJulia\"); using IJulia; installkernel(\"Julia\", \"--project=${REPO_DIR}\");" && julia --project=${REPO_DIR} -e 'using Pkg; Pkg.instantiate(); Pkg.resolve(); pkg"precompile"'' returned a non-zero code: 1

Make tutorials available on `colab`

Colab provides a cloud-based environment for running Jupyter notebooks. It also supports GPUs and TPUs for free. Maybe we can make the Turing tutorials available on Colab to encourage users to play with them. See e.g.

Swift example in Colab

How to install Julia (and packages) on Colab

Resetting Turing globals

Currently, the TuringTutorials.parallel_build starts a new Julia instance for every tutorial. Given that it takes about 1 minute for the Turing and plotting compilation to be done, we could save 10 minutes of build time (10-16%) by running all the tutorials in the same instance.

A problem that this would introduce is that some tutorials change global variables.

Would it be possible to work around this by being a bit more careful in the tutorials? For example, always setting

Turing.setadbackend(:forwarddiff)

in a hidden block just to be sure?

This might seem a bit verbose, but could actually help developers and users who switch between tutorials without restarting Julia.
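
For concreteness, a hidden Weave chunk along these lines at the top of each tutorial could do it. This is only a sketch; echo=false is a standard Weave chunk option that hides the code in the rendered output, and the ``` lines are the .jmd chunk delimiters:

```julia; echo=false
# Reset globals that another tutorial may have changed (hidden in the rendered output).
using Turing
Turing.setadbackend(:forwarddiff)
```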

Integration with CUDA.jl

Is there an example demonstrating integration with CUDA.jl? I'd be particularly interested in an example using reverse-mode AD and the Bayesian HMM, since its implementation involves a loop. It doesn't seem that Tracker or Zygote support loops yet.

Tutorials Roadmap

My understanding of this repo is that it is to be the home for all of Turing's educational materials. I've reviewed a lot of the existing stuff we have in TuringDemos and TuringExamples, but so far there's not a lot outside of brief code blurbs and a couple of Jupyter notebooks without a lot of commentary.

@trappmartin noted that Edward's tutorials are something of a target for us. I can see why: they are exceptionally thorough and very pleasant to read, with a conversational approach. They also tend to assume limited familiarity with probabilistic methods, which I think of as a plus.

So, here are my guidelines for what I think would make a good set of tutorials.

  1. Highlight the ease of use of Turing. Easy to read, easy to write, easy to understand.
  2. Usable by a reasonably informed layperson. Someone with a basic to intermediate understanding of statistics should be able to understand most of the tutorial and extend it to their own problem. This doesn't necessarily mean that we want to dumb everything down -- I still think there's room to showcase some really complex and nifty stuff, but we should try to make everything as accessible as is reasonable.
  3. Solve an example problem with real data. This might be any of the data sets in RDatasets or MLDatasets. In cases where a real-world data set isn't available or applicable (are there any of those in this field? I have no idea), we should make sure that the synthetic data is described (table headers, a chart or two, etc.) so people can easily see what it looks like.
  4. Consistent in tone, information content, and pedagogical approach.

I'll be adding issues for each tutorial type we should eventually have, along with pointers to what we have in place elsewhere, to see if we can adapt some of the models already written. More to come on that, but I figured I should set out what kinds of things I think are valuable in a tutorial.

The "Bayesian Differential Equation" tutorial needs small improvements

From: https://github.com/TuringLang/Turing.jl/edit/master/docs/_tutorials/10_BayesianDiffEq.md

The tutorial results seem to have changed since its initial writing. For example, the plot in the Data retrodiction section does not reproduce the true ODE solution "quite accurately", since there is a big mismatch between the parameters in the three chains. I suspect that to resolve this issue and fix the example, some minor parameter tweaking could already be sufficient.

FHMM

Here's the existing tutorial. This one is outside of my current understanding for the moment, and I'll have to read up on it to write something satisfactory, unless someone would like to jot down a couple of thoughts I can expand upon.

Bayes Neural Net

We already have a really good example for this floating around. It just needs a bit of tidying up. I can probably take this one for the moment, but comments on it are always welcome.

Original example

Fix unstandardize function.

The mean and std functions need to be switched in the unstandardize function.

This function is found in both the linear regression and the VI tutorials.

I'd be glad to submit a PR but I'll need a bit more guidance as to best practices for this repo.
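
For reference, a minimal sketch of the intended behaviour (the names are illustrative, not the tutorials' exact code):

using Statistics

standardize(x) = (x .- mean(x)) ./ std(x)

# Inverting means multiplying by the original std and then adding back the original mean;
# having mean and std the other way around is the bug described above.
unstandardize(z, orig) = z .* std(orig) .+ mean(orig)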

Linear Regression

Should we make a tutorial on linear regression? I am slightly on the fence about this one. I think it would be useful to have a brief example of how to do Bayesian linear regression, because pretty much anyone with any stats training knows how to do this sort of regression from a frequentist perspective -- it might help lay the groundwork for how people think about specifying their models.

Thoughts?
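
For a sense of scope, a minimal sketch of the kind of model such a tutorial could start from (the priors and names are illustrative):

using Turing

@model function linear_regression(x, y)
    α ~ Normal(0, 10)                      # intercept
    β ~ Normal(0, 10)                      # slope
    σ ~ truncated(Normal(0, 5), 0, Inf)    # noise scale
    for i in eachindex(y)
        y[i] ~ Normal(α + β * x[i], σ)
    end
end

# e.g., with synthetic data:
chain = sample(linear_regression(randn(100), randn(100)), NUTS(), 1_000)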

Usability of Bayesian HMM Tutorial

Hi! First of all, thank you very much for this package and the detailed tutorial! I just started working with Turing.jl and noticed a few issues when going through the Bayesian HMM Tutorial. Some of them are addressed in #86 and will likely be solved anyway, but there is one thing not mentioned there.

When I copy & paste the code from the tutorial and run it on my machine, the simulated trajectory doesn't converge to the "experimental" one as it does in the animation in the tutorial:

[Animation: HMM-Tutorial]

I'm aware of the stochasticity of the process and ran the code a couple of times, but never got close to the result given in the tutorial. Is there any additional code that I need to run to get a similarly performing fit?
As a side note: I also noticed that in the tutorial the model is run with only 100 samples, while 500 are shown in the animation... not sure if this might be related.

(Edit: just fixed a typo...)

Specifying target density up to proportionality constant

In Stan, one can specify a model up to a proportionality constant. For example, the following (uninteresting) Stan code (taken from here) will generate code that samples from a standard normal. Note that the target is specified incrementally, in terms of the log density that will be sampled from.

parameters {
  real y;
}
model {
  target += -0.5 * y * y;
  // Same as this, which is the full Normal density:
  // target += -0.5 * y * y - log(2*pi) * 0.5 
}

Is there something similar in Turing?
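
Assuming a recent Turing/DynamicPPL version that provides the @addlogprob! macro and the improper Flat prior, a minimal sketch of the same standard-normal example would be:

using Turing

@model function proportional_normal()
    y ~ Flat()                          # improper flat prior, so y is just a free parameter
    Turing.@addlogprob! -0.5 * y^2      # add the unnormalized log density, like Stan's `target +=`
end

chain = sample(proportional_normal(), NUTS(), 1_000)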

UndefVarError: bijector not defined

Tutorials are key for users to learn.
The Variational Inference notebook has had the following error for a long time.

_, sym2range = Variational.bijector(m; sym_to_ranges = Val(true)); #ERROR: UndefVarError: bijector not defined

Also, the following line needs to exclude ".value":
density!(collect(skipmissing(samples_nuts[:s].value)), label = "s (NUTS)", color = :green, linewidth = 2)

Regards,

Package conflict on LogisticRegression tutorial

Question on: https://turing.ml/dev/tutorials/2-logisticregression/

When I try to follow along with this guide in my own notebook, there are some package management issues. When loading the RDataset, there is an "identifier not found" error, which was resolved by using the RData#master package as recommended here.

I can download the dataset now, but the MLDataUtils package is incompatible. When I try to add it, I get this error due to a conflict with the required versions of the DataFrames package:

 MLDataUtils [cc2ba9b6] log:
 ├─possible versions are: [0.4.0, 0.5.0-0.5.1] or uninstalled
 ├─restricted to versions * by an explicit requirement, leaving only versions [0.4.0, 0.5.0-0.5.1]
 └─restricted by compatibility requirements with DataFrames [a93c6f00] to versions: uninstalled — no versions left
   └─DataFrames [a93c6f00] log:
     ├─possible versions are: [0.11.7, 0.12.0, 0.13.0-0.13.1, 0.14.0-0.14.1, 0.15.0-0.15.2, 0.16.0, 0.17.0-0.17.1, 0.18.0-0.18.4, 0.19.0-0.19.4, 0.20.0-0.20.2, 0.21.0-0.21.1] or uninstalled
     └─restricted to versions 0.21 by RData [df47a6cb], leaving only versions 0.21.0-0.21.1
       └─RData [df47a6cb] log:
         ├─possible versions are: 0.7.2 or uninstalled
         └─RData [df47a6cb] is fixed to version 0.7.2

Bayes HMM

We already have an example for this one elsewhere, but it needs a lot more commentary and definition. This one is at the moment slightly outside of my comfort zone to expand on, so it might need to fall to someone else for now or wait until I can read up on the subject a bit more.

Original: BayesHMM.ipynb

Tutorials for Economists

One thing I have always wanted to do is write more tutorials that empirical and theoretical economists can use to inform how they do their research. @trappmartin recently reminded me how much I have wanted to do this.

This issue is meant to be a super-issue that brainstorms possible tutorials that people might like to see on how to use Turing in the social sciences (economics in particular, but I am open to other fields if someone with experience can help).

I naturally tilt towards finance, and thus most of my suggestions are likely to be finance-related. I'm hoping someone with more general econ experience can jump in here. In particular, I would like to see some macroeconomics tutorials, because that has historically been where the most sophisticated applications of Bayesian tools have been.

I also want to hear from some of the QuantEcon people to see what they think. @jlperla do you know of some people who might have some good ideas for shortish tutorials they would like to see on Bayesian methods in economics? I'm not trying to get anyone to write any of these, just to solicit some ideas.

Anyways, here are some of my ideas:

  1. Trend/cycle decomposition. This is a fairly simple time series analysis that lets you talk about latent variables and analytical tricks to simplify models, and it ties into a moderately large literature.
  2. Structural estimation of a simple macro model, like endogenous growth or something. Probably best to use a very simple model that the literature has moved on from just for computation and expository purposes.
  3. An industrial organization model, also structural estimation. I don't know this literature very well but I would like to see something on market growth and consolidation.
  4. A dynamic corporate finance model -- this chapter by Strebulaev and Whited is excellent and has any number of things that could be viewed from a Bayesian perspective.
  5. Models that attempt to recover latent variables, perhaps like a well-structured labor econ paper that attempts to infer skill from observables. This might also be good because a tutorial could cover DAGs and how they can be expressed in Turing.
  6. A conditional beta model from finance. Lots of evidence suggests that market betas are time varying, and it would be cool to try to estimate the posteriors for conditional betas.
  7. A factor model comparison paper. Barillas and Shanken (2018) do something like this where they compare a bunch of different factor models, and I think this could be done with Bayesian model combination (where expected returns are weighted by a Dirichlet distribution) to see if the results hold.
  8. Structural break tests in time series (via @rlouf).
  9. Conjoint MNL/mixed logit models for marketing (via this tweet).
  10. Item response theory, via this tweet. There's a good Stata blog post and an ArXiV paper on this.

Any other ideas from economist types are welcome; please add them below.

Typo in the VI tutorial

[Screenshot from the VI tutorial]

In the second to last paragraph at the bottom, "overestimating" should be "underestimating".

Redirects

Right now there is no handling of redirects. It seems like the way to go is to use https://github.com/jekyll/jekyll-redirect-from, which requires us to change

---
title: "Introduction to Turing"
permalink: "/:collection/:name/"
---

to

---
title: "Introduction to Turing"
permalink: "/:collection/:name/"
redirect_from: "tutorials/0-introduction/"
---

The question is: should this be done in the tutorial itself, i.e. in TuringTutorials, or is this something that we should add programmatically in turing.ml?

Personally I'm leaning towards doing this in TuringTutorials, as this is where the renaming happens, making it easier to keep track of changes. At the same time, this means there is a discrepancy: the permalink field is somewhat agnostic to the name of the file's folder, while redirect_from is hard-coded. Even so, IMO it's best to keep track of this in TuringTutorials.

Thoughts?

Error in running the Bayesian HMM tutorial

I was going through the Turing tutorial on Bayesian HMMs.

I defined the Turing model as described, but when I ran the sampler:

g = Gibbs(1000, HMC(2, 0.001, 7, :m, :T), PG(20, 1, :s))
c = sample(BayesHmm(y, 3), g);

I got the following error:

ArgumentError: Sampler for this object is not defined

Stacktrace:
 [1] Random.Sampler(::Type{MersenneTwister}, ::Random.SamplerType{Real}, ::Val{1}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Random/src/Random.jl:132
 [2] Random.Sampler(::MersenneTwister, ::Random.SamplerType{Real}, ::Val{1}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Random/src/Random.jl:129
 [3] rand(::MersenneTwister, ::Random.SamplerType{Real}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Random/src/Random.jl:219
 [4] rand(::MersenneTwister, ::Type{Real}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Random/src/Random.jl:222
 [5] rand(::MersenneTwister, ::DiscreteNonParametric{Int64,Real,Base.OneTo{Int64},Array{Real,1}}) at /home/sheshank/.julia/packages/Distributions/fMt8c/src/univariate/discrete/discretenonparametric.jl:82
 [6] rand(::DiscreteNonParametric{Int64,Real,Base.OneTo{Int64},Array{Real,1}}) at /home/sheshank/.julia/packages/Distributions/fMt8c/src/univariate/discrete/discretenonparametric.jl:91
 [7] init(::DiscreteNonParametric{Int64,Real,Base.OneTo{Int64},Array{Real,1}}) at /home/sheshank/.julia/packages/Turing/izlov/src/utilities/robustinit.jl:13
 [8] assume(::Turing.SampleFromUniform, ::DiscreteNonParametric{Int64,Real,Base.OneTo{Int64},Array{Real,1}}, ::Turing.Core.VarReplay.VarName, ::Turing.Core.VarReplay.VarInfo) at /home/sheshank/.julia/packages/Turing/izlov/src/inference/Inference.jl:117
 [9] macro expansion at /home/sheshank/.julia/packages/Turing/izlov/src/core/compiler.jl:102 [inlined]
 [10] macro expansion at ./In[3]:28 [inlined]
 [11] (::getfield(Main, Symbol("###inner_function#370#12")){Int64})(::Turing.Core.VarReplay.VarInfo, ::Turing.SampleFromUniform, ::Turing.Model{Tuple{:T,:m,:s},Tuple{:y},getfield(Main, Symbol("###inner_function#370#12")){Int64},NamedTuple{(:y,),Tuple{Array{Float64,1}}},NamedTuple{(:y,),Tuple{Symbol}}}) at /home/sheshank/.julia/packages/Turing/izlov/src/core/compiler.jl:388
 [12] #call#3 at /home/sheshank/.julia/packages/Turing/izlov/src/Turing.jl:62 [inlined]
 [13] Model at /home/sheshank/.julia/packages/Turing/izlov/src/Turing.jl:62 [inlined]
 [14] #sample#46(::Bool, ::Nothing, ::Int64, ::Function, ::Turing.Model{Tuple{:T,:m,:s},Tuple{:y},getfield(Main, Symbol("###inner_function#370#12")){Int64},NamedTuple{(:y,),Tuple{Array{Float64,1}}},NamedTuple{(:y,),Tuple{Symbol}}}, ::Gibbs{Tuple{HMC{Turing.Core.ForwardDiffAD{40},Symbol},PG{Symbol,typeof(Turing.Utilities.resample_systematic)}}}) at /home/sheshank/.julia/packages/Turing/izlov/src/inference/gibbs.jl:105
 [15] sample(::Turing.Model{Tuple{:T,:m,:s},Tuple{:y},getfield(Main, Symbol("###inner_function#370#12")){Int64},NamedTuple{(:y,),Tuple{Array{Float64,1}}},NamedTuple{(:y,),Tuple{Symbol}}}, ::Gibbs{Tuple{HMC{Turing.Core.ForwardDiffAD{40},Symbol},PG{Symbol,typeof(Turing.Utilities.resample_systematic)}}}) at /home/sheshank/.julia/packages/Turing/izlov/src/inference/gibbs.jl:76
 [16] top-level scope at In[4]:2

Could someone please help me understand what's happening here? Thanks!
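
For reference, in more recent Turing releases the number of iterations is passed to sample rather than to the sampler constructors, so on a current Turing version the equivalent call would look roughly like the sketch below (whether that resolves this particular ArgumentError is a separate question):

g = Gibbs(HMC(0.001, 7, :m, :T), PG(20, :s))
c = sample(BayesHmm(y, 3), g, 1000);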

Introduction

There is already an introductory piece, which I'm currently editing and polishing up. This is an issue for me; no need for anyone else to do anything on this bit.
