
Ex-No.4 Implementation of Approximate Inference in Bayesian Networks

Aim:

To construct a Python program to implement approximate inference using Gibbs Sampling.

Algorithm:

Step 1: Bayesian Network Definition and CPDs:
(i) Define the Bayesian network structure using the BayesianNetwork class from pgmpy.models.
(ii) Define Conditional Probability Distributions (CPDs) for each variable using the TabularCPD class.
(iii) Add the CPDs to the network.

Step 2: Printing Bayesian Network Structure:
(i) Print the structure of the Bayesian network using the print(network) statement.

Step 3: Graph Visualization:
(i) Import the necessary libraries (networkx and matplotlib).
(ii) Create a directed graph using networkx.DiGraph().
(iii) Define the nodes and edges of the graph.
(iv) Add nodes and edges to the graph.
(v) Optionally, define positions for the nodes.
(vi) Use nx.draw() to visualize the graph using matplotlib.

Step 4: Gibbs Sampling and MCMC:
(i) Initialize Gibbs Sampling for MCMC using the GibbsSampling class and provide the Bayesian network.
(ii) Set the number of samples to be generated using num_samples.

Step 5: Perform MCMC Sampling:
(i) Use the sample() method of the GibbsSampling instance to perform MCMC sampling.
(ii) Store the generated samples in the samples variable.

Step 6: Approximate Probability Calculation:
(i) Specify the variable for which you want to calculate the approximate probabilities (query_variable).
(ii) Use .value_counts(normalize=True) on the samples of the query_variable to calculate approximate probabilities.

Step 7: Print Approximate Probabilities:
(i) Print the calculated approximate probabilities for the specified query_variable.
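Step 6 relies on pandas' value_counts(normalize=True), which turns raw sample counts into relative frequencies (counts divided by the total number of samples). A minimal sketch with a hypothetical column of binary samples:

```python
import pandas as pd

# Hypothetical samples of a binary variable (0 = False, 1 = True)
samples = pd.Series([0, 0, 0, 0, 1, 0, 0, 1, 0, 0], name='Burglary')

# normalize=True divides each count by the total, giving relative frequencies
probs = samples.value_counts(normalize=True)
print(probs[0], probs[1])  # 0.8 0.2
```

With the real sampler output, the same call on samples['Burglary'] yields the approximate marginal probabilities.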

Program :

NAME:- Gunaseelan G
REG NO:- 212221230031

Import the necessary libraries

from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.sampling import GibbsSampling
import networkx as nx
import matplotlib.pyplot as plt

Define the Bayesian network structure

network=BayesianNetwork([('Burglary','Alarm'),
                         ('Earthquake','Alarm'),
                         ('Alarm','JohnCalls'),
                         ('Alarm','MaryCalls')])

Define the Conditional Probability Distributions (CPDs)

cpd_burglary=TabularCPD(variable='Burglary',variable_card=2,values=[[0.999],[0.001]])
cpd_earthquake=TabularCPD(variable='Earthquake',variable_card=2,values=[[0.998],[0.002]])
cpd_alarm=TabularCPD(variable='Alarm',variable_card=2,
                     values=[[0.999,0.71,0.06,0.05],[0.001,0.29,0.94,0.95]],
                     evidence=['Burglary','Earthquake'],evidence_card=[2,2])
cpd_john_calls=TabularCPD(variable='JohnCalls',variable_card=2,values=[[0.95,0.1],[0.05,0.9]],
                          evidence=['Alarm'],evidence_card=[2])
cpd_mary_calls=TabularCPD(variable='MaryCalls',variable_card=2,values=[[0.99,0.3],[0.01,0.7]],
                          evidence=['Alarm'],evidence_card=[2])

Add CPDs to the network

network.add_cpds(cpd_burglary,cpd_earthquake,cpd_alarm,cpd_john_calls,cpd_mary_calls)

Print the Bayesian network structure

print("Bayesian Network Structure:")
print(network)

Create a Directed Graph

G=nx.DiGraph()

Define nodes and Edges

nodes=['Burglary','Earthquake','Alarm','JohnCalls','MaryCalls']
edges=[('Burglary','Alarm'),('Earthquake','Alarm'),('Alarm','JohnCalls'),('Alarm','MaryCalls')]

Add nodes and Edges to the Graph

G.add_nodes_from(nodes)
G.add_edges_from(edges)

Set the positions for the nodes

pos={'Burglary':(0,0),'Earthquake':(2,0),'Alarm':(1,-2),'JohnCalls':(0,-4),'MaryCalls':(2,-4)}

Draw the network

nx.draw(G,pos,with_labels=True,node_size=1500,node_color='skyblue',font_size=10,font_weight='bold',arrowsize=20)
plt.title("Bayesian Network: Alarm Problem")
plt.show()

Initialize Gibbs Sampling for MCMC

gibbs_sampler=GibbsSampling(network)

Set the number of samples

num_samples=10000

Perform MCMC sampling

samples=gibbs_sampler.sample(size=num_samples)

Calculate approximate probabilities based on the samples

query_variable='Burglary'
query_result=samples[query_variable].value_counts(normalize=True)

Print the approximate probabilities

print('\nApproximate probabilities of {}:'.format(query_variable))
print(query_result)

Output:

(Screenshots: the printed Bayesian network structure, the network graph, and the approximate probabilities of Burglary.)

Result:

Thus, an approximate method of inference computation was implemented using Gibbs Sampling in Python.

