aloctavodia / bap3
Figures and code examples from Bayesian Analysis with Python (third edition)
Home Page: http://bap.com.ar/
Page 254 states, "We call
I believe the first
On pg 133 it states, "We have been using the linear motif to model the mean of a distribution and, in the previous section, we used it to model interactions."
Interactions have not been introduced up to this point. I'm not sure whether this is a typo and it's supposed to say something else, or whether it's a holdover from a previous version where interactions were discussed earlier.
Thanks,
Carsten
Figure 3.6 indicates a half-normal prior for
Why is the distribution for the likelihood of the coin-flip example in Chapter 2 a Bernoulli, when it was a Binomial distribution in the same coin-flip example in Chapter 1? To me a Binomial seems more intuitive, since the coin flip is performed multiple times (4 trials in the example). Clarity on this becomes even more relevant when working through Exercise 1 of Chapter 2, which requires comparing the coin-flip models between Chapters 1 and 2.
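A quick numerical sketch of why the two parameterizations are interchangeable for inference (the data below are hypothetical, not the book's): the Binomial likelihood of k heads in n flips equals the product of the per-flip Bernoulli likelihoods times C(n, k), a constant that does not depend on the bias, so both models yield the same posterior.

```python
# Hypothetical data: 4 flips, 3 heads. The Binomial likelihood differs from
# the product of Bernoulli likelihoods only by the constant factor C(n, k),
# which cancels when normalizing the posterior over theta.
import numpy as np
from scipy.stats import bernoulli, binom
from scipy.special import comb

flips = np.array([1, 1, 0, 1])   # per-trial outcomes (hypothetical)
n, k = len(flips), flips.sum()

for theta in (0.3, 0.5, 0.7):
    lik_bernoulli = np.prod(bernoulli.pmf(flips, theta))  # per-trial model
    lik_binomial = binom.pmf(k, n, theta)                 # aggregated model
    assert np.isclose(lik_binomial, comb(n, k) * lik_bernoulli)
```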
I’m in need of some clarification on Chapter 2, Exercise #1: are the beta parameters in question from page 34 (Code 1.6), or are they referenced elsewhere?
It wasn’t entirely clear to me.
Thanks!
The code in 6.14 uses the temperature variable, but it should use hour instead. The caption for Figure 6.5 says it's for the "bikes model with temperature and humidity," but it's for the polynomial bikes model with hour.
Executing Code 2.7 I get the following error:
AttributeError: module 'nutpie' has no attribute 'random'
When I run the cell content as
import numpy as np
import matplotlib.pyplot as plt
import preliz as pz

x, y = np.mgrid[-4:4:.01, -4:4:.01]
pos = np.empty(x.shape + (2,))
pos[:, :, 0] = x
pos[:, :, 1] = y
rv = pz.MvNormal([0, 0], [[1, 0.8],
                          [0.8, 1]])
x_value = pos[:, :, 0][:, 0]
x_density = rv.pdf(pos)

left, width = 0.1, 0.65
bottom, height = 0.1, 0.65
bottom_h = left_h = left + width + 0.02

rect_scatter = [left, bottom, width, height]
rect_histx = [left, bottom_h, width, 0.2]
rect_histy = [left_h, bottom, 0.2, height]

plt.figure(1, figsize=(8, 8))
ax_joint = plt.axes(rect_scatter)
ax_x = plt.axes(rect_histx)
ax_y = plt.axes(rect_histy)

ax_joint.imshow(x_density, cmap='cet_gray_r', origin='lower', extent=[-3, 3, -3, 3])
ax_joint.plot(x_value, x_density[400]*2, 'k:', lw=2)
ax_joint.plot(x_value, x_density[500]*2+1, 'k:', lw=2)
ax_joint.plot(x_value, x_density[300]*2-1, 'k:', lw=2)

ax_x.fill_between(x_value, x_density.sum(1), color='C2')
ax_y.fill_betweenx(x_value, x_density.sum(1), color='C2')

for ax in [ax_joint, ax_x, ax_y]:
    ax.grid(False)
    ax.set_facecolor('w')
    ax.set_xticks([])
    ax.set_yticks([])

ax_joint.set_xlim(-3, 3)
ax_joint.set_ylim(-3, 3)
ax_x.set_xlim(-3, 3)
ax_y.set_ylim(-3, 3)

ax_joint.set_ylabel('$B$', rotation=0, labelpad=20, fontsize=18)
ax_joint.set_xlabel('$A$', fontsize=18)
ax_joint.text(-2.5, 2.5, '$p(A, B)$', fontsize=18, color='k', weight='medium')
ax_y.text(10, 0, '$p(B)$', fontsize=18, color='k', weight='medium')
ax_x.text(-0.2, 15, '$p(A)$', fontsize=18, color='k', weight='medium')
ax_joint.text(1, -2, r' ... $p(A \mid B)$', fontsize=18, color='k', weight='medium')
plt.savefig('../fig/joint_marginal_cond.png')
Jupyter issues 2 warnings:
/tmp/ipykernel_5463/4085251966.py:52: UserWarning: There are no gridspecs with layoutgrids. Possibly did not call parent GridSpec with the "figure" keyword
plt.savefig('../fig/joint_marginal_cond.png')
/home/osvaldo/anaconda3/envs/bap3/lib/python3.11/site-packages/IPython/core/pylabtools.py:152: UserWarning: There are no gridspecs with layoutgrids. Possibly did not call parent GridSpec with the "figure" keyword
fig.canvas.print_figure(bytes_io, **kw)
I looked up the Matplotlib API and found that the warnings can be fixed by changing

plt.figure(1, figsize=(8, 8))
ax_joint = plt.axes(rect_scatter)
ax_x = plt.axes(rect_histx)
ax_y = plt.axes(rect_histy)

to

_, axes_arr = plt.subplots(1, 3, figsize=(8, 8))
ax_joint, ax_x, ax_y = axes_arr
ax_joint.set_position(rect_scatter)
ax_x.set_position(rect_histx)
ax_y.set_position(rect_histy)
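An alternative that may also avoid the warnings, without creating subplots only to reposition them: build the Figure object explicitly and attach each Axes with `fig.add_axes`, so no Axes goes through the implicit pyplot state machine. This is a sketch, not tested against the book's exact Matplotlib version.

```python
# Sketch: attach each Axes directly to an explicit Figure via add_axes,
# reusing the same rectangle layout as the original cell.
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for this sketch
import matplotlib.pyplot as plt

left, width = 0.1, 0.65
bottom, height = 0.1, 0.65
bottom_h = left_h = left + width + 0.02

fig = plt.figure(figsize=(8, 8))
ax_joint = fig.add_axes([left, bottom, width, height])   # joint density panel
ax_x = fig.add_axes([left, bottom_h, width, 0.2])        # marginal p(A) panel
ax_y = fig.add_axes([left_h, bottom, 0.2, height])       # marginal p(B) panel
```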
Sorry if this is the wrong place for this, as it's probably a misunderstanding rather than an error. I have followed all the installation instructions and activated the bap3 environment in conda, but when I try to run commands that use "pz", such as "pz.BetaBinomial()", I get an error. I have scoured Google and ChatGPT for answers but just can't get this code to run correctly. Am I missing something?
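For anyone debugging a broken PreliZ setup, one way to sanity-check the distribution itself independently of `pz` is SciPy's `betabinom`, which implements the same Beta-Binomial distribution. The parameters below are illustrative, not taken from the book.

```python
# Independent Beta-Binomial sanity check using SciPy while preliz is
# misbehaving; scipy.stats.betabinom is the same distribution family
# as pz.BetaBinomial.
import numpy as np
from scipy.stats import betabinom

n, a, b = 5, 2.0, 2.0            # illustrative parameters
dist = betabinom(n, a, b)
support = np.arange(n + 1)
pmf = dist.pmf(support)

assert np.isclose(pmf.sum(), 1.0)   # a proper pmf sums to one
```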
Hi @aloctavodia,
I am writing because I ran into a problem with Code 1.2. I installed the package with pip install preliz==0.0.2. However, when I run pz.BetaBinomial(alpha=10, beta=10, n=6).plot_interactive() I get: AttributeError: module 'preliz' has no attribute 'BetaBinomial'.
Can you suggest a way to solve it? I look forward to your reply.
Best regards from Chile.
Thanks in advance for sharing your knowledge.
Bryan
PS: Let me know if I can write you in Spanish.
Page 105 states "... will need to properly index the vectors $\mu_p$ and $\nu_p$ to match the total number of players", but I believe "players" should be changed to "positions".
I understand the code behind Figure 4.4. Looking around in ArviZ, I discovered the plotting function plot_lm, which should be able to do something similar to the plots in Figure 4.4. I would like to replace the Python code for Figure 4.4 with a single method call :-)
In the attached notebook I have created a slimmed-down version of the example and added a call to plot_lm. However, I have problems getting it to work. It looks like a naming-convention issue for coords in the InferenceData object.
In the notebook I describe both the problem I experience and how I fix it. Can you please advise whether there is a better way of doing this? Thanks.
Thank you for this very interesting book that opened my mind to Bayesian statistics!
One small piece of feedback I want to give concerns the Bernoulli distribution.
This distribution is introduced in Code 2.2, p. 52, but its relationship with the Binomial distribution is not clearly explained.
Of course, it is possible to understand what the Bernoulli distribution is with a quick search on Wikipedia, but for completeness I thought it would be good to state the relationship in a sentence in the book.
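The relationship can indeed be stated in one sentence: a Bernoulli(p) variable is exactly a Binomial(n=1, p) variable, and a Binomial(n, p) variable is the sum of n independent Bernoulli(p) trials. A quick numerical check with SciPy (parameter value chosen arbitrarily):

```python
# Checking the identity Bernoulli(p) == Binomial(n=1, p) on its full
# support {0, 1}, using SciPy's implementations of both pmfs.
import numpy as np
from scipy.stats import bernoulli, binom

p = 0.35  # arbitrary bias for the check
for k in (0, 1):
    assert np.isclose(bernoulli.pmf(k, p), binom.pmf(k, 1, p))
```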
I think that there is a typo in line 2 of Code block 2.26 on page 81:
𝜇 = pm.Normal("𝜇", mu=0, sigma=10, shape=4)
I'm not sure if a normal distribution centered around zero is a logical prior for this example.
The text right above the code block says that the code should be "almost the same as model_g", whose mean was defined in Code 2.12 of page 66 as:
𝜇 = pm.Uniform('𝜇', lower=40, upper=70)
But Code 2.27 of page 82 is described as an alternative syntax to Code 2.26 and the mean was defined there as:
𝜇 = pm.HalfNormal("𝜇", sigma=5, dims="days")
Is a correction needed to avoid confusion?
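The concern can be quantified with a quick calculation (a hypothetical check, not from the book): a Normal(mu=0, sigma=10) prior puts almost no mass on the 40-70 range that the Uniform prior of model_g treats as plausible.

```python
# How much prior mass does Normal(0, 10) assign to the interval [40, 70]
# that model_g's Uniform(40, 70) prior considers plausible?
from scipy.stats import norm

prior = norm(loc=0, scale=10)
mass_40_70 = prior.cdf(70) - prior.cdf(40)
print(f"{mass_40_70:.2e}")  # a tiny fraction of the total prior mass
assert mass_40_70 < 1e-4
```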
Thanks again for this excellent book!
At the bottom of p. 115, it is written "The noise term \epsilon".
Am I mistaken, or should it be "The noise term \sigma"?
The short URL listed there doesn't lead to the PyMC documentation; instead it leads to the LinkedIn page of "Melanya Kirakosyan".
The exercise says to see the "accompanying code model_2", but this model doesn't appear in the code for Chapter 4 (or anywhere else, as far as I can determine).
Thanks again for the good reading!
I have not yet reached the end of the book, but something that surprised me is the absence of the log-normal distribution from the examples given.
I thought the log-normal was a very natural distribution for always-positive variables (some people call them Jeffreys variables, if I am not mistaken).
And indeed, the log-normal distribution appears all around us: the distribution of temperatures (in K) in the universe, the distribution of resistance values in electronic goods, the distribution of earnings in society, etc.
Is there a reason for omitting this important distribution?
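The defining property behind that intuition can be shown in a few lines (parameters below are illustrative only): if Z is Normal(mu, sigma), then exp(Z) is log-normal and therefore strictly positive, and its median is exp(mu).

```python
# The log-normal as exp of a normal: positivity is automatic, and the
# median is exp(mu). Note SciPy's parameterization: s=sigma, scale=exp(mu).
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(42)
mu, sigma = 0.5, 0.75                              # illustrative parameters
samples = np.exp(rng.normal(mu, sigma, size=10_000))

assert (samples > 0).all()                         # always positive
median = lognorm(s=sigma, scale=np.exp(mu)).median()
assert np.isclose(median, np.exp(mu))              # median of a log-normal
```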