Proper ResNet Implementation for CIFAR10/CIFAR100 in Pytorch

The torchvision model zoo provides a number of implementations of state-of-the-art architectures; however, most of them are defined and implemented for ImageNet. It is usually straightforward to use the provided models on other datasets, but some cases require manual setup.

For instance, very few PyTorch repositories with ResNets on CIFAR10 provide the implementation as described in the original paper. If you simply use torchvision's models on CIFAR10, you get a model that differs in the number of layers and parameters, which makes a direct comparison with the original paper impossible. The purpose of this repo is to provide a valid PyTorch implementation of the CIFAR10 ResNets as described in the original paper. The following models are provided:

| Name       | # layers | # params | Test err (paper) | Test err (this impl.) |
|------------|----------|----------|------------------|-----------------------|
| ResNet20   | 20       | 0.27M    | 8.75%            | 8.27%                 |
| ResNet32   | 32       | 0.46M    | 7.51%            | 7.37%                 |
| ResNet44   | 44       | 0.66M    | 7.17%            | 6.90%                 |
| ResNet56   | 56       | 0.85M    | 6.97%            | 6.61%                 |
| ResNet110  | 110      | 1.7M     | 6.43%            | 6.32%                 |
| ResNet1202 | 1202     | 19.4M    | 7.93%            | 6.18%                 |

This implementation matches the description in the original paper, with comparable or better test error.
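
As a quick illustration, here is a minimal sketch of instantiating and running one of the models, assuming the repository's resnet.py exposes a constructor per architecture (e.g. resnet20):

```python
# Minimal usage sketch; assumes resnet.py from this repository is importable
# and provides constructors named after each architecture (e.g. resnet20).
import torch
import resnet

model = resnet.resnet20()        # 20-layer CIFAR10 ResNet, ~0.27M parameters
x = torch.randn(1, 3, 32, 32)    # CIFAR10 images are 3x32x32
logits = model(x)                # shape: (1, 10), one logit per CIFAR10 class
print(logits.shape)
```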

How to run?

git clone https://github.com/akamaster/pytorch_resnet_cifar10
cd pytorch_resnet_cifar10
chmod +x run.sh && ./run.sh

Details of training

Our implementation follows the paper in a straightforward manner, with a few caveats. First, the training in the paper uses a 45k/5k train/validation split of the training data and selects the best-performing model based on validation performance. We do not perform this validation step; keep this in mind if you need a head-to-head comparison with the original paper. Second, training ResNet1202 requires about 16GB of GPU memory.
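
For reference, the sketch below shows what the paper's 45k/5k split would look like in PyTorch. This is not what the training script here does; the seed and split strategy are illustrative assumptions:

```python
# Sketch of the paper's 45k/5k train/validation split (NOT used by this repo's
# training, which uses the full 50k training set).
import torch
from torchvision import datasets, transforms

train_set = datasets.CIFAR10(root="./data", train=True, download=True,
                             transform=transforms.ToTensor())
# 45,000 images for training, 5,000 held out for model selection
train_subset, val_subset = torch.utils.data.random_split(
    train_set, [45000, 5000], generator=torch.Generator().manual_seed(0))
```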

Pretrained models for download

  1. ResNet20, 8.27% err
  2. ResNet32, 7.37% err
  3. ResNet44, 6.90% err
  4. ResNet56, 6.61% err
  5. ResNet110, 6.32% err
  6. ResNet1202, 6.18% err
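
Below is a hedged sketch of loading one of these checkpoints into the corresponding model. The file name and checkpoint layout are assumptions (a dict with a 'state_dict' entry whose keys carry an nn.DataParallel 'module.' prefix), not something this page guarantees:

```python
# Loading a pretrained checkpoint (assumed layout: {'state_dict': ...} with
# keys prefixed by 'module.' from nn.DataParallel; strip the prefix if present).
import torch
import resnet

model = resnet.resnet20()
checkpoint = torch.load("pretrained_models/resnet20.th", map_location="cpu")  # hypothetical path
state_dict = checkpoint.get("state_dict", checkpoint)
state_dict = {k.replace("module.", "", 1): v for k, v in state_dict.items()}
model.load_state_dict(state_dict)
model.eval()
```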

If you find this implementation useful and want to cite or mention this page, here is a BibTeX entry:

@misc{Idelbayev18a,
  author       = "Yerlan Idelbayev",
  title        = "Proper {ResNet} Implementation for {CIFAR10/CIFAR100} in {PyTorch}",
  howpublished = "\url{https://github.com/akamaster/pytorch_resnet_cifar10}",
  note         = "Accessed: 20xx-xx-xx"
}

