arviz-devs / preliz
A tool-box for prior elicitation.
Home Page: https://preliz.readthedocs.io
License: Apache License 2.0
Currently, the only available option is to plot or not to plot.
We should explore two approaches. Create sliders from
Currently, we need to
There are other alternatives we could implement: https://ipywidgets.readthedocs.io/en/8.0.2/embedding.html
Using nbsphinx is not a solution, as stated in #177 by @OriolAbril.
Maybe we can make it work by doing

import preliz as pz
from xvfbwrapper import Xvfb

with Xvfb() as xvfb:
    pz.roulette()
or
import pyvirtualdisplay

_display = pyvirtualdisplay.Display(visible=False,  # use False with Xvfb
                                    size=(1400, 900))
_ = _display.start()
Both approaches require xvfb, and maybe also x11-utils, which are not installed by default. I tried to install them by adding an apt.txt file to the root of the repo, but it seems Binder is not installing it.
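For reference, repo2docker installs one Debian package per line from an apt.txt at the repository root, so (assuming Binder picks the file up) it would simply be:

```
xvfb
x11-utils
```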
It is often the case that the quartile function returns a distribution with a mass slightly different from 0.5 (inside the interquartile range). This could be partially caused by a fundamental limitation, but also by a suboptimal optimization routine.
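A minimal sketch of the diagnostic, using SciPy rather than PreliZ internals: given elicited quartiles, measure how much mass the fitted distribution actually places inside the interquartile range. The Gamma parameters below are hypothetical stand-ins for an optimizer's output.

```python
from scipy import stats

def mass_in_iqr(dist, q1, q3):
    """Mass the distribution assigns to the elicited interquartile range."""
    return dist.cdf(q3) - dist.cdf(q1)

# Elicited quartiles
q1, q2, q3 = 1.0, 2.0, 4.0
# Suppose the optimizer returned this Gamma (hypothetical values)
fitted = stats.gamma(a=1.8, scale=1.5)
mass = mass_in_iqr(fitted, q1, q3)  # close to, but generally not exactly, 0.5
```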
Maybe some of our functions could benefit from a Probabilistic Numerics perspective. https://www.probabilistic-numerics.org/
Functions like maxent
or roulette
could either return a single distribution (current behavior) or a few distributions representing the optimization uncertainty and the input uncertainty; for the roulette method, for example, the discreteness of the bins/chips.
The main goal is to prevent users from being overconfident in their decisions.
Other options could be the empirical CDF, quantile dot plots, or histograms. Maybe point-interval only?
It would be nice to be able to explore distributions interactively. Ipywidgets already provides functions like interactive to do this, so it may not be that difficult to wrap that into an interactive method for the PreliZ distributions.
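A rough sketch of the idea (not the PreliZ API): wrap a function that summarizes a distribution in ipywidgets.interactive so its parameters can be explored with sliders. The summary function and parameter ranges here are made up for illustration.

```python
import ipywidgets as widgets
from scipy import stats

def summarize(mu=0.0, sigma=1.0):
    # Recomputed every time a slider moves
    dist = stats.norm(mu, sigma)
    print(f"mean={dist.mean():.2f}  sd={dist.std():.2f}  "
          f"central 94% interval={dist.ppf(0.03):.2f}..{dist.ppf(0.97):.2f}")

explorer = widgets.interactive(summarize, mu=(-5.0, 5.0), sigma=(0.1, 3.0))
# In a notebook, display(explorer) renders the sliders plus the output area.
```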
The logic for back_fitting the prior draws to the prior was done quickly and needs improvement. Additionally, we need more tests.
This is the error when trying to run pz.roulette()
from https://preliz.readthedocs.io/en/latest/examples/param_space_1d_examples.html
[Open Browser Console for more detailed log - Double click to close this message]
Failed to load model class 'MPLCanvasModel' from module 'jupyter-matplotlib'
makeError@https://hub.gke2.mybinder.org/user/arviz-devs-arviz_sandbox-pyp8w7j3/static/nbclassic/components/requirejs/require.js?v=d37b48bb2137faa0ab98157e240c084dd5b1b5e74911723aa1d1f04c928c2a03dedf922d049e4815f7e5a369faa2e6b6a1000aae958b7953b5cc60411154f593:168:17
checkLoaded@https://hub.gke2.mybinder.org/user/arviz-devs-arviz_sandbox-pyp8w7j3/static/nbclassic/components/requirejs/require.js?v=d37b48bb2137faa0ab98157e240c084dd5b1b5e74911723aa1d1f04c928c2a03dedf922d049e4815f7e5a369faa2e6b6a1000aae958b7953b5cc60411154f593:696:23
newContext/checkLoaded/checkLoadedTimeoutId<@https://hub.gke2.mybinder.org/user/arviz-devs-arviz_sandbox-pyp8w7j3/static/nbclassic/components/requirejs/require.js?v=d37b48bb2137faa0ab98157e240c084dd5b1b5e74911723aa1d1f04c928c2a03dedf922d049e4815f7e5a369faa2e6b6a1000aae958b7953b5cc60411154f593:717:25
scipy does not provide a .fit
method for discrete distributions. We should add a function for internal use.
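An internal helper could simply maximize the log-likelihood numerically; here is a minimal sketch for a one-parameter distribution (the Poisson), where the helper name `fit_discrete` is hypothetical, not part of PreliZ or SciPy.

```python
import numpy as np
from scipy import stats, optimize

def fit_discrete(logpmf, data, bounds):
    """Fit a one-parameter discrete distribution by maximum likelihood."""
    neg_ll = lambda theta: -np.sum(logpmf(data, theta))
    res = optimize.minimize_scalar(neg_ll, bounds=bounds, method="bounded")
    return res.x

rng = np.random.default_rng(42)
data = rng.poisson(3.5, size=500)
mu_hat = fit_discrete(stats.poisson.logpmf, data, bounds=(0.01, 50))
# For the Poisson, the MLE is the sample mean, so the two should agree closely.
```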
We will implement the method on a per-distribution basis.
https://preliz.readthedocs.io/en/latest/api_reference.html#module-preliz
@OriolAbril do you know what I am missing?
For example the plot()
method
This paper https://osf.io/paby6/ and this library https://github.com/dmi3kno/qpd are very interesting.
While the function tries to provide some useful starting values, it would be useful to allow users to define the values themselves.
This could be done by passing a dictionary mapping variable names to (min, max) tuples and/or by linking text boxes to the sliders.
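One possible shape for that interface, sketched with plain ipywidgets (the function name and dictionary format are assumptions, not existing PreliZ API): build a slider per parameter from the (min, max) tuples and link each slider to a text box so values can also be typed directly.

```python
import ipywidgets as widgets

def make_controls(ranges):
    controls = {}
    for name, (lo, hi) in ranges.items():
        slider = widgets.FloatSlider(min=lo, max=hi, value=(lo + hi) / 2,
                                     description=name)
        box = widgets.FloatText(value=slider.value)
        widgets.jslink((slider, "value"), (box, "value"))  # keep both in sync
        controls[name] = widgets.HBox([slider, box])
    return controls

controls = make_controls({"mu": (-5.0, 5.0), "sigma": (0.1, 3.0)})
```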
We may want to keep the Tkinter interface and offer more than one option.
Waiting for build to start...
Picked Git content provider.
Cloning into '/tmp/repo2dockerrm4_86m6'...
HEAD is now at 6e39d29 Merge pull request #1 from arviz-devs/display
Building conda environment for python=3.7
Using PythonBuildPack builder
Step 1/54 : FROM buildpack-deps:bionic
---> 38c3a7e0952a
Step 2/54 : ENV DEBIAN_FRONTEND=noninteractive
---> Using cache
---> 107534e99c42
Step 3/54 : RUN apt-get -qq update && apt-get -qq install --yes --no-install-recommends locales > /dev/null && apt-get -qq purge && apt-get -qq clean && rm -rf /var/lib/apt/lists/*
---> Using cache
---> aa90368103ab
Step 4/54 : RUN echo "en_US.UTF-8 UTF-8" > /etc/locale.gen && locale-gen
---> Using cache
---> 891957450365
Step 5/54 : ENV LC_ALL en_US.UTF-8
---> Using cache
---> fc51f9229e37
Step 6/54 : ENV LANG en_US.UTF-8
---> Using cache
---> aab53d369b48
Step 7/54 : ENV LANGUAGE en_US.UTF-8
---> Using cache
---> 8ae6cbeac1b0
Step 8/54 : ENV SHELL /bin/bash
---> Using cache
---> 30e8f62ce4d0
Step 9/54 : ARG NB_USER
---> Using cache
---> a09cc57ebe4e
Step 10/54 : ARG NB_UID
---> Using cache
---> 81921c230f5a
Step 11/54 : ENV USER ${NB_USER}
---> Using cache
---> bbddf314ed72
Step 12/54 : ENV HOME /home/${NB_USER}
---> Using cache
---> 835d094c6f39
Step 13/54 : RUN groupadd --gid ${NB_UID} ${NB_USER} && useradd --comment "Default user" --create-home --gid ${NB_UID} --no-log-init --shell /bin/bash --uid ${NB_UID} ${NB_USER}
---> Using cache
---> 0653cf63f3e3
Step 14/54 : RUN apt-get -qq update && apt-get -qq install --yes --no-install-recommends less unzip > /dev/null && apt-get -qq purge && apt-get -qq clean && rm -rf /var/lib/apt/lists/*
---> Using cache
---> 7808c66cd89b
Step 15/54 : EXPOSE 8888
---> Using cache
---> 1b4a643f0375
Step 16/54 : ENV APP_BASE /srv
---> Using cache
---> db10df214b44
Step 17/54 : ENV CONDA_DIR ${APP_BASE}/conda
---> Using cache
---> 95ed60f307ac
Step 18/54 : ENV NB_PYTHON_PREFIX ${CONDA_DIR}/envs/notebook
---> Using cache
---> 554833aa351d
Step 19/54 : ENV NPM_DIR ${APP_BASE}/npm
---> Using cache
---> 2caa15484789
Step 20/54 : ENV NPM_CONFIG_GLOBALCONFIG ${NPM_DIR}/npmrc
---> Using cache
---> 3c266a026bb4
Step 21/54 : ENV NB_ENVIRONMENT_FILE /tmp/env/environment.lock
---> Using cache
---> 711cc3b384ce
Step 22/54 : ENV MAMBA_ROOT_PREFIX ${CONDA_DIR}
---> Using cache
---> e2fd2586d4d5
Step 23/54 : ENV MAMBA_EXE ${CONDA_DIR}/bin/mamba
---> Using cache
---> a8b1f732adb1
Step 24/54 : ENV KERNEL_PYTHON_PREFIX ${NB_PYTHON_PREFIX}
---> Using cache
---> 41b3c469b791
Step 25/54 : ENV PATH ${NB_PYTHON_PREFIX}/bin:${CONDA_DIR}/bin:${NPM_DIR}/bin:${PATH}
---> Using cache
---> d60104446ce4
Step 26/54 : COPY --chown=1000:1000 build_script_files/-2fusr-2flib-2fpython3-2e9-2fsite-2dpackages-2frepo2docker-2fbuildpacks-2fconda-2factivate-2dconda-2esh-e9bee0 /etc/profile.d/activate-conda.sh
---> Using cache
---> fda4fa6e91e9
Step 27/54 : COPY --chown=1000:1000 build_script_files/-2fusr-2flib-2fpython3-2e9-2fsite-2dpackages-2frepo2docker-2fbuildpacks-2fconda-2fenvironment-2epy-2d3-2e7-2elock-d12193 /tmp/env/environment.lock
---> Using cache
---> a9654070d37a
Step 28/54 : COPY --chown=1000:1000 build_script_files/-2fusr-2flib-2fpython3-2e9-2fsite-2dpackages-2frepo2docker-2fbuildpacks-2fconda-2finstall-2dbase-2denv-2ebash-41d468 /tmp/install-base-env.bash
---> Using cache
---> 65bd65962e4b
Step 29/54 : RUN TIMEFORMAT='time: %3R' bash -c 'time /tmp/install-base-env.bash' && rm -rf /tmp/install-base-env.bash /tmp/env
---> Using cache
---> 8304472795aa
Step 30/54 : RUN mkdir -p ${NPM_DIR} && chown -R ${NB_USER}:${NB_USER} ${NPM_DIR}
---> Using cache
---> 7a1b4348b810
Step 31/54 : USER root
---> Using cache
---> 5d60aea9f45a
Step 32/54 : ARG REPO_DIR=${HOME}
---> Using cache
---> cc31b856c17d
Step 33/54 : ENV REPO_DIR ${REPO_DIR}
---> Using cache
---> 6b26acd6d870
Step 34/54 : WORKDIR ${REPO_DIR}
---> Using cache
---> eae88cc05961
Step 35/54 : RUN chown ${NB_USER}:${NB_USER} ${REPO_DIR}
---> Using cache
---> b44497fad57a
Step 36/54 : ENV PATH ${HOME}/.local/bin:${REPO_DIR}/.local/bin:${PATH}
---> Using cache
---> 7da3e318c390
Step 37/54 : ENV CONDA_DEFAULT_ENV ${KERNEL_PYTHON_PREFIX}
---> Using cache
---> 0b893f8c0680
Step 38/54 : COPY --chown=1000:1000 src/requirements.txt ${REPO_DIR}/requirements.txt
---> Using cache
---> 3c4b992dd281
Step 39/54 : RUN apt-get -qq update && apt-get install --yes --no-install-recommends x11-utils xvfb && apt-get -qq purge && apt-get -qq clean && rm -rf /var/lib/apt/lists/*
---> Using cache
---> 20654a22bef8
Step 40/54 : USER ${NB_USER}
---> Using cache
---> 6fa8d1d05ac8
Step 41/54 : RUN ${KERNEL_PYTHON_PREFIX}/bin/pip install --no-cache-dir -r "requirements.txt"
---> Running in b6fa2354922f
Collecting preliz@ git+https://github.com/arviz-devs/preliz.git
Cloning https://github.com/arviz-devs/preliz.git to /tmp/pip-install-k3vraino/preliz_42788d36f102427896796c25f163cb13
Running command git clone --filter=blob:none --quiet https://github.com/arviz-devs/preliz.git /tmp/pip-install-k3vraino/preliz_42788d36f102427896796c25f163cb13
Resolved https://github.com/arviz-devs/preliz.git to commit bbde3f8f6bca53492a881264e72398eadc991691
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'done'
Preparing metadata (pyproject.toml): started
Preparing metadata (pyproject.toml): finished with status 'done'
Collecting nbgitpuller
Downloading nbgitpuller-1.1.0-py2.py3-none-any.whl (456 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 456.9/456.9 KB 21.7 MB/s eta 0:00:00
Collecting arviz
Downloading arviz-0.12.1-py3-none-any.whl (1.6 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.6/1.6 MB 11.3 MB/s eta 0:00:00
Collecting xarray-einstats[einops]
Downloading xarray_einstats-0.2.2-py3-none-any.whl (33 kB)
Collecting pyvirtualdisplay
Downloading PyVirtualDisplay-3.0-py3-none-any.whl (15 kB)
Collecting xvfbwrapper
Downloading xvfbwrapper-0.2.9.tar.gz (5.6 kB)
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Requirement already satisfied: jupyter-server>=1.10.1 in /srv/conda/envs/notebook/lib/python3.7/site-packages (from nbgitpuller->-r requirements.txt (line 1)) (1.16.0)
Requirement already satisfied: notebook>=5.5.0 in /srv/conda/envs/notebook/lib/python3.7/site-packages (from nbgitpuller->-r requirements.txt (line 1)) (6.4.10)
Requirement already satisfied: tornado in /srv/conda/envs/notebook/lib/python3.7/site-packages (from nbgitpuller->-r requirements.txt (line 1)) (6.1)
Collecting pandas>=0.23
Downloading pandas-1.3.5-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (11.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 11.3/11.3 MB 30.4 MB/s eta 0:00:00
Collecting scipy>=0.19
Downloading scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 38.1/38.1 MB 30.5 MB/s eta 0:00:00
Collecting matplotlib>=3.0
Downloading matplotlib-3.5.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl (11.2 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 11.2/11.2 MB 30.4 MB/s eta 0:00:00
Requirement already satisfied: typing-extensions>=3.7.4.3 in /srv/conda/envs/notebook/lib/python3.7/site-packages (from arviz->-r requirements.txt (line 4)) (4.1.1)
Requirement already satisfied: setuptools>=38.4 in /srv/conda/envs/notebook/lib/python3.7/site-packages (from arviz->-r requirements.txt (line 4)) (62.0.0)
Collecting xarray>=0.16.1
Downloading xarray-0.20.2-py3-none-any.whl (845 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 845.2/845.2 KB 48.4 MB/s eta 0:00:00
Collecting numpy>=1.12
Downloading numpy-1.21.6-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (15.7 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 15.7/15.7 MB 30.2 MB/s eta 0:00:00
Collecting netcdf4
Downloading netCDF4-1.6.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.1 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.1/5.1 MB 33.6 MB/s eta 0:00:00
Requirement already satisfied: packaging in /srv/conda/envs/notebook/lib/python3.7/site-packages (from arviz->-r requirements.txt (line 4)) (21.3)
Requirement already satisfied: nbclient<0.6,>=0.2 in /srv/conda/envs/notebook/lib/python3.7/site-packages (from preliz@ git+https://github.com/arviz-devs/preliz.git->-r requirements.txt (line 5)) (0.5.13)
INFO: pip is looking at multiple versions of <Python from Requires-Python> to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of arviz to determine which version is compatible with other requirements. This could take a while.
Collecting arviz
Downloading arviz-0.12.0-py3-none-any.whl (1.6 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.6/1.6 MB 42.1 MB/s eta 0:00:00
Downloading arviz-0.11.4-py3-none-any.whl (1.6 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.6/1.6 MB 39.1 MB/s eta 0:00:00
Collecting typing-extensions<4,>=3.7.4.3
Downloading typing_extensions-3.10.0.2-py3-none-any.whl (26 kB)
Collecting arviz
Downloading arviz-0.11.3-py3-none-any.whl (1.6 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.6/1.6 MB 41.9 MB/s eta 0:00:00
Downloading arviz-0.11.2-py3-none-any.whl (1.6 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.6/1.6 MB 40.1 MB/s eta 0:00:00
Downloading arviz-0.11.1-py3-none-any.whl (1.5 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 40.5 MB/s eta 0:00:00
Downloading arviz-0.11.0-py3-none-any.whl (1.5 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 38.0 MB/s eta 0:00:00
Downloading arviz-0.10.0-py3-none-any.whl (1.5 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 40.3 MB/s eta 0:00:00
INFO: pip is looking at multiple versions of arviz to determine which version is compatible with other requirements. This could take a while.
Downloading arviz-0.9.0-py3-none-any.whl (1.5 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 37.9 MB/s eta 0:00:00
Downloading arviz-0.8.3-py3-none-any.whl (1.5 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 39.5 MB/s eta 0:00:00
Downloading arviz-0.8.2-py3-none-any.whl (1.5 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 39.8 MB/s eta 0:00:00
Downloading arviz-0.8.1-py3-none-any.whl (1.5 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 38.3 MB/s eta 0:00:00
Downloading arviz-0.8.0-py3-none-any.whl (1.5 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 39.4 MB/s eta 0:00:00
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort this run, press Ctrl + C.
Downloading arviz-0.7.0-py3-none-any.whl (1.5 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 40.2 MB/s eta 0:00:00
Downloading arviz-0.6.1-py3-none-any.whl (1.4 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.4/1.4 MB 39.7 MB/s eta 0:00:00
Downloading arviz-0.6.0-py3-none-any.whl (1.4 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.4/1.4 MB 41.5 MB/s eta 0:00:00
Downloading arviz-0.5.1-py3-none-any.whl (1.4 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.4/1.4 MB 42.2 MB/s eta 0:00:00
Downloading arviz-0.4.1-py3-none-any.whl (1.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 41.1 MB/s eta 0:00:00
Downloading arviz-0.4.0-py3-none-any.whl (1.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 43.2 MB/s eta 0:00:00
Downloading arviz-0.3.3-py3-none-any.whl (1.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 40.4 MB/s eta 0:00:00
Downloading arviz-0.3.2-py3-none-any.whl (1.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 41.6 MB/s eta 0:00:00
Collecting numpy==1.15
Downloading numpy-1.15.0-cp37-cp37m-manylinux1_x86_64.whl (13.8 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 13.8/13.8 MB 30.5 MB/s eta 0:00:00
INFO: pip is looking at multiple versions of numpy to determine which version is compatible with other requirements. This could take a while.
Collecting arviz
Downloading arviz-0.3.1-py3-none-any.whl (1.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 41.7 MB/s eta 0:00:00
Downloading arviz-0.3.0-py3-none-any.whl (1.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 41.3 MB/s eta 0:00:00
Downloading arviz-0.2.1-py3-none-any.whl (1.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 41.3 MB/s eta 0:00:00
Downloading arviz-0.2.0-py3-none-any.whl (1.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 42.1 MB/s eta 0:00:00
Downloading arviz-0.1.0-py3-none-any.whl (1.0 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.0/1.0 MB 48.0 MB/s eta 0:00:00
Downloading arviz-0.0.1-py3-none-any.whl (1.2 kB)
INFO: pip is looking at multiple versions of nbgitpuller to determine which version is compatible with other requirements. This could take a while.
Collecting nbgitpuller
Downloading nbgitpuller-1.0.2-py2.py3-none-any.whl (430 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 430.7/430.7 KB 17.6 MB/s eta 0:00:00
ERROR: Could not find a version that satisfies the requirement scipy>=1.9.0rc1 (from preliz) (from versions: 0.8.0, 0.9.0, 0.10.0, 0.10.1, 0.11.0, 0.12.0, 0.12.1, 0.13.0, 0.13.1, 0.13.2, 0.13.3, 0.14.0, 0.14.1, 0.15.0, 0.15.1, 0.16.0, 0.16.1, 0.17.0, 0.17.1, 0.18.0, 0.18.1, 0.19.0, 0.19.1, 1.0.0, 1.0.1, 1.1.0, 1.2.0, 1.2.1, 1.2.2, 1.2.3, 1.3.0rc1, 1.3.0rc2, 1.3.0, 1.3.1, 1.3.2, 1.3.3, 1.4.0rc1, 1.4.0rc2, 1.4.0, 1.4.1, 1.5.0rc1, 1.5.0rc2, 1.5.0, 1.5.1, 1.5.2, 1.5.3, 1.5.4, 1.6.0rc1, 1.6.0rc2, 1.6.0, 1.6.1, 1.6.2, 1.6.3, 1.7.0rc1, 1.7.0rc2, 1.7.0, 1.7.1, 1.7.2, 1.7.3)
ERROR: No matching distribution found for scipy>=1.9.0rc1
Removing intermediate container b6fa2354922f
The command '/bin/sh -c ${KERNEL_PYTHON_PREFIX}/bin/pip install --no-cache-dir -r "requirements.txt"' returned a non-zero code: 1
Built image, launching...
Failed to connect to event stream
I think there isn't much to gain from static docs or notebooks in PreliZ, but we can write some "ready to run" notebooks that can be executed with Binder or Thebe (which would be even better, I think).
With Thebe, for example, the docs themselves have a button to run the code without even leaving the website (it runs Binder behind the scenes and interfaces with the docs website). We used that, for example, in the PyMC sprint; however, Binder doesn't offer a lot of computing power, so its use for PyMC is limited due to slowness and lack of available RAM. Hopefully with PreliZ things will work fast enough to show the interactivity.
Most of the distributions need docstrings
Distributions returned by the constraint
function are not uniquely defined. One solution could be to return the maximum entropy distribution with the requested mass inside the requested interval.
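To illustrate why maximum entropy makes the answer unique, here is a sketch for the Normal family (the helper name and the interval are made up): among all (mu, sigma) putting 90% of the mass in (-1, 1), a Normal's entropy grows with sigma, so the maximum-entropy solution is the one centered on the interval.

```python
import numpy as np
from scipy import stats, optimize

def sigma_for_mass(mu, a=-1.0, b=1.0, mass=0.9):
    """Solve for sigma so that P(a < X < b) == mass with X ~ Normal(mu, sigma)."""
    f = lambda s: stats.norm(mu, s).cdf(b) - stats.norm(mu, s).cdf(a) - mass
    return optimize.brentq(f, 1e-6, 10)

# Many (mu, sigma) pairs satisfy the constraint; compare their entropies
mus = np.linspace(-0.5, 0.5, 21)
entropies = [stats.norm(mu, sigma_for_mass(mu)).entropy() for mu in mus]
best = mus[int(np.argmax(entropies))]  # entropy is maximized at the midpoint
```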
Refactor the plotting code in constraints
and plot
and make some general reusable functions.
Similar to maxent
but the user provides the median and the lower and upper quartiles of X; we can then use least squares to fit the distribution:
[F(Xq1, theta) - 0.25]^2 + [F(Xq2, theta) - 0.5]^2 + [F(Xq3, theta) - 0.75]^2
where F(X, theta) is the CDF of X
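The objective above can be sketched with scipy.optimize.least_squares (this is an illustration with the Normal family, not PreliZ's internal routine):

```python
import numpy as np
from scipy import stats, optimize

def quartile_residuals(theta, xq):
    # F(xq, theta) - (0.25, 0.5, 0.75); least_squares minimizes the sum of squares
    mu, sigma = theta
    return stats.norm(mu, sigma).cdf(xq) - np.array([0.25, 0.5, 0.75])

# Elicited quartiles of a standard Normal
xq = np.array([-0.6745, 0.0, 0.6745])
res = optimize.least_squares(quartile_residuals, x0=[1.0, 2.0], args=(xq,),
                             bounds=([-np.inf, 1e-6], [np.inf, np.inf]))
mu_hat, sigma_hat = res.x  # should recover mu ~ 0, sigma ~ 1
```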
Currently, the default reference value is 0, and it is fixed.
quartile
example.roulette()
It seems the result can be (ZeroInflatedBinomial(), 2, 3, 4, (0.993, 9.934, 0.394)) passing, or [(ZeroInflatedBinomial(), 2, 3, 4, (0.993, 9.033, 0.394))] failing.
Related to #92. We should be able to use the same syntax used for predictive_sliders
. This has at least two advantages: it keeps PreliZ agnostic of any PPL, and it simplifies iterative fitting during a PPA session. One disadvantage of not using any PPL is that users will need to write their model both in their PPL and in PreliZ. One possible solution would be to provide an option to return or export the result to a given PPL (and maybe also import from one).
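The export option could be as simple as generating source code for the target PPL; here is a toy sketch (the function name `to_pymc` and the spec format are hypothetical, not an existing PreliZ API):

```python
def to_pymc(name, dist, **params):
    """Render an elicited distribution as a line of PyMC model code."""
    args = ", ".join(f"{k}={v}" for k, v in params.items())
    return f'{name} = pm.{dist}("{name}", {args})'

code = to_pymc("theta", "Normal", mu=0, sigma=2.5)
# -> the string: theta = pm.Normal("theta", mu=0, sigma=2.5)
```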
All new distributions should include an alternative parametrization if useful/common. We should at least cover the parametrization in PyMC, but we can consider having others too (for example, the beta has a kappa parameter that is useful but not present in PyMC).
In #206 we introduced the logitnormal; its entropy is only approximated. We need to use the correct expression.
Given the differences between discrete and continuous distributions, it makes sense to have a plot function for each of these instead of a single method. This should lead to simpler docstrings and argument lists.
In #206 we introduced the logitnormal; we approximate its first two moments by sampling. We may want to approximate them in a different way. We may also want to add skewness and kurtosis.
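One sampling-free alternative, sketched here with SciPy (not PreliZ's implementation): compute the moments by numerical quadrature over the underlying Normal, since the logit-normal is expit(Y) with Y ~ Normal(mu, sigma).

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad
from scipy.special import expit

def logitnormal_moments(mu, sigma):
    """Mean and variance of expit(Y), Y ~ Normal(mu, sigma), by quadrature."""
    pdf = stats.norm(mu, sigma).pdf
    mean = quad(lambda y: expit(y) * pdf(y), -np.inf, np.inf)[0]
    second = quad(lambda y: expit(y) ** 2 * pdf(y), -np.inf, np.inf)[0]
    return mean, second - mean**2

mean, var = logitnormal_moments(0.0, 1.0)  # by symmetry the mean is 0.5
```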
ppa uses ipywidgets and matplotlib interactively, so ideally tests should include simulated interactions.
A few of the alternative parametrizations involve converting from tau to sigma and vice versa. We could define these functions only once in the library.
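The shared utilities would be tiny, since tau is just the precision, 1 / sigma^2 (function names here are suggestions):

```python
import numpy as np

def sigma_to_tau(sigma):
    """Precision from standard deviation: tau = 1 / sigma**2."""
    return 1 / np.asarray(sigma) ** 2

def tau_to_sigma(tau):
    """Standard deviation from precision: sigma = 1 / sqrt(tau)."""
    return 1 / np.sqrt(np.asarray(tau))
```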
None of the functions from the unidimensional and predictive modules are shown in the API section, and each of the distributions is listed individually instead of just the two main groups, continuous and discrete. The changes in the last PR #164 are at least partially responsible for this error.
Related to #60
Diagnostic and recommendation by @OriolAbril:
Pip is messing up big time. RTD first installs requirements-docs, then installs PreliZ.
On the requirements docs step:
Collecting nbclient<0.6,>=0.2
Downloading nbclient-0.5.13-py3-none-any.whl (70 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 70.6/70.6 kB 214.0 MB/s eta 0:00:00
But then, when installing PreliZ, now nbclient is also an indirect dependency via ipywidgets, and somehow this happens:
Requirement already satisfied: nbclient>=0.5.0 in /home/docs/checkouts/readthedocs.org/user_builds/preliz/envs/60/lib/python3.9/site-packages (from nbconvert>=5->notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets->preliz==0.0.1.dev0) (0.5.13)
Collecting nbclient>=0.5.0
Downloading nbclient-0.6.6-py3-none-any.whl (71 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 71.8/71.8 kB 211.0 MB/s eta 0:00:00
And at the end of the PreliZ install it even prints this:
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
jupyter-cache 0.5.0 requires nbclient<0.6,>=0.2, but you have nbclient 0.6.6 which is incompatible.
I think adding nbclient as an explicit dependency of PreliZ would fix this. But I think the preferred solution would be making ipywidgets and scikit-learn optional dependencies, then installing as follows:
Install requirements-optional
Install requirements-docs
Install PreliZ
Extra requirements can be added in the readthedocs.yml. It would look like
install:
  - requirements: requirements-optional.txt
  - requirements: requirements-docs.txt
A few options: the first 2 to 4 moments, a fitted PreliZ distribution (from maxent, roulette, or whatever), or a hand-drawn distribution. Limits (upper, lower, or both), or x% mass between limits, etc.
It would be nice to run pylint and black
Roulette uses ipywidgets and matplotlib interactively, so ideally tests should include simulated interactions.
To let the user specify the collection of distributions to use and set the current collection as the default value.
In #173, sliders (now text boxes) became aware of a few functions that can be applied to the parameters. Providing a truly general solution will probably be too hard, and it may require a different approach to parse the function. Something easier may be to detect three categories: no function has been applied; a function/operation is applied and we can solve it; or a function/operation is applied but we don't have a clue what is going on. For this last case we could return something like "(unk, unk)", signaling that the function is not able to guess the boundaries of the parameter.