summary-likelihood

Incorporating functional summary information in Bayesian neural networks using a Dirichlet process likelihood approach

Bayesian neural networks (BNNs) can account for both aleatoric and epistemic uncertainty. However, in BNNs the priors are often specified over the weights, which rarely reflects true prior knowledge in large and complex neural network architectures. We present a simple approach to incorporate prior knowledge in BNNs based on external summary information about the predicted classification probabilities for a given dataset. The available summary information is incorporated as augmented data and modeled with a Dirichlet process, and we derive the corresponding *Summary Evidence Lower BOund*. The approach is founded on Bayesian principles, and all hyperparameters have a proper probabilistic interpretation. We show how the method can inform the model about task difficulty and class imbalance. Extensive experiments show that, with negligible computational overhead, our method parallels and in many cases outperforms popular alternatives in accuracy, uncertainty calibration, and robustness against corruptions with both balanced and imbalanced data.
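To give a feel for the core idea, here is a minimal, hedged sketch of a Dirichlet likelihood term that scores the model's average predicted class probabilities against an external summary (e.g., known class imbalance). This is an illustration only, not the paper's exact Summary Evidence Lower BOund; the function name and the use of the batch mean are assumptions for the example.

```python
import numpy as np
from scipy.stats import dirichlet

def summary_log_likelihood(pred_probs, alpha):
    """Illustrative summary term (not the paper's exact SELBO).

    pred_probs: (N, K) array of predicted class probabilities for a batch.
    alpha: (K,) Dirichlet concentration encoding the external summary;
           larger values express stronger prior information.
    """
    # Score the batch-average predicted distribution under the
    # summary-informed Dirichlet.
    mean_probs = pred_probs.mean(axis=0)
    return dirichlet.logpdf(mean_probs, alpha)

# Example: the summary says classes occur roughly 60/40.
probs = np.array([[0.7, 0.3], [0.6, 0.4]])
alpha = np.array([6.0, 4.0])
ll = summary_log_likelihood(probs, alpha)
```

A summary that matches the model's predictions yields a higher log-likelihood than a strongly mismatched one, which is how the term can inform training about class imbalance.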

To start

```sh
git submodule update --init
ln -s ./bayesian-torch-repo/bayesian_torch .
```
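If you prefer to script the setup, the symlink step above can be replicated in Python. This is a hedged sketch mirroring the shell command; the paths follow the commands above and assume you run it from the repository root.

```python
import os

def link_bayesian_torch(repo_root):
    # Mirrors `ln -s ./bayesian-torch-repo/bayesian_torch .` from the
    # instructions above; adjust paths if your checkout differs.
    src = os.path.join(repo_root, "bayesian-torch-repo", "bayesian_torch")
    dst = os.path.join(repo_root, "bayesian_torch")
    if not os.path.lexists(dst):
        os.symlink(src, dst)
    return dst
```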

See Requirements section for setting up dependencies.

Incorporating summary information

Figure 1

Preparing data

The data for the experiments can be downloaded by running download.sh in the data/ directory. This will download the following datasets:

  1. MNIST-C
  2. CIFAR10-C
  3. SST

From the root directory, run:

```sh
./data/download.sh
```
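Once downloaded, the corruption benchmarks are plain NumPy arrays. For instance, MNIST-C ships one directory per corruption type; a minimal loader sketch is below (the exact directory layout under data/ is an assumption about this repository, not verified).

```python
import numpy as np

def load_mnist_c(corruption_dir):
    # MNIST-C stores test images and labels as .npy files per corruption.
    images = np.load(f"{corruption_dir}/test_images.npy")  # (N, 28, 28, 1)
    labels = np.load(f"{corruption_dir}/test_labels.npy")  # (N,)
    return images, labels
```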

Additionally, for the NLP task, you need to run create_sst_emb.py. This will download a pretrained Sentence-BERT model and extract the embeddings for the SST dataset. To run this:

```sh
cd data/
python create_sst_emb.py
cd ..
```

Running experiments

Under construction. Refer to the slurm-scripts for now.

Requirements

  1. PyTorch
  2. PyTorch Lightning
  3. TensorBoard
  4. SciPy
  5. tbparse

Alternatively, you can use the provided environment.yaml file to reproduce the experiment environment exactly, e.g. with conda env create -f environment.yaml.

Contributors

v-i-s-h
