Bidirectional Learning for Domain Adaptation of Semantic Segmentation (CVPR 2019)

A PyTorch implementation of BDL. If you use this code in your research, please consider citing:

@article{li2019bidirectional,
  title={Bidirectional Learning for Domain Adaptation of Semantic Segmentation},
  author={Li, Yunsheng and Yuan, Lu and Vasconcelos, Nuno},
  journal={arXiv preprint arXiv:1904.10620},
  year={2019}
}

Requirements

  • Hardware: PC with an NVIDIA Titan GPU.
  • Software: Ubuntu 16.04, CUDA 9.2, Anaconda2, PyTorch 0.4.0
  • Python package
    • conda install pytorch=0.4.0 torchvision cuda91 -y -c pytorch
    • pip install tensorboard tensorboardX
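
A quick way to confirm the environment matches these versions before training (plain PyTorch calls, nothing repo-specific):

# Verify the PyTorch install and GPU visibility.
import torch
import torchvision

print("PyTorch:", torch.__version__)            # expect 0.4.0
print("torchvision:", torchvision.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))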

Datasets

Train the adaptive segmentation network in BDL

  • Transferred images for the CityScapes dataset can be found:
  • The initial model can be downloaded from DeepLab-V2.
  • Training example (without self-supervised learning); a conceptual sketch of the training step follows the command:
python BDL.py --snapshot-dir ./snapshots/gta2city \
              --init-weights /path/to/inital_weights \
              --num-steps-stop 80000 \
              --model DeepLab
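
BDL.py adapts the segmenter adversarially: a discriminator learns to tell source predictions from target predictions, and the segmentation network is trained to fool it (the output-space alignment inherited from AdaptSegNet). The following is a hypothetical, simplified sketch of one such step; seg_net, disc, and lambda_adv are illustrative names, not the script's actual API:

import torch
import torch.nn.functional as F

# Hypothetical single step of output-space adversarial adaptation.
# seg_net: segmentation network; disc: binary discriminator on softmax maps.
def adaptation_step(seg_net, disc, seg_opt, disc_opt,
                    src_img, src_label, tgt_img, lambda_adv=0.001):
    seg_opt.zero_grad()

    # 1) Supervised segmentation loss on labeled source images.
    src_pred = seg_net(src_img)
    loss_seg = F.cross_entropy(src_pred, src_label, ignore_index=255)

    # 2) Fool the discriminator: make target predictions look like source (label 0).
    tgt_pred = seg_net(tgt_img)
    d_tgt = disc(F.softmax(tgt_pred, dim=1))
    loss_adv = F.binary_cross_entropy_with_logits(d_tgt, torch.zeros_like(d_tgt))

    (loss_seg + lambda_adv * loss_adv).backward()
    seg_opt.step()

    # 3) Train the discriminator to separate the two domains (source=0, target=1).
    disc_opt.zero_grad()
    d_src = disc(F.softmax(src_pred.detach(), dim=1))
    d_tgt = disc(F.softmax(tgt_pred.detach(), dim=1))
    loss_d = F.binary_cross_entropy_with_logits(d_src, torch.zeros_like(d_src)) \
           + F.binary_cross_entropy_with_logits(d_tgt, torch.ones_like(d_tgt))
    loss_d.backward()
    disc_opt.step()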
  • Training example (with self-supervised learning):
    • Download the model SSL_step1 or SSL_step2 to generate pseudo labels for the CityScapes dataset, then run:
python SSL.py --data-list-target /path/to/dataset/cityscapes_list/train.txt \
              --restore-from /path/to/SSL_step1_or_SSL_step2 \
              --model DeepLab \
              --save /path/to/cityscapes/cityscapes_ssl \
              --set train
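
Conceptually, SSL.py turns the model's predictions on the target training set into pseudo labels by confidence thresholding: keep the argmax class where the prediction is confident, and mark everything else with the ignore index. A minimal sketch of that idea, where the threshold value and function name are assumptions rather than the script's exact behavior:

import numpy as np

# Hypothetical pseudo-label generation by confidence thresholding.
# probs: (C, H, W) softmax output for one target image.
def make_pseudo_label(probs, threshold=0.9, ignore_label=255):
    label = probs.argmax(axis=0).astype(np.uint8)  # predicted class per pixel
    confidence = probs.max(axis=0)                 # confidence of that class
    label[confidence < threshold] = ignore_label   # drop uncertain pixels
    return label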

With the pseudo labels, the adaptive segmentation model can be trained as:

python BDL.py --data-label-folder-target pseudo_label_folder_name \
              --snapshot-dir ./snapshots/gta2city_ssl \
              --init-weights /path/to/inital_weights \
              --num-steps-stop 120000 \
              --model DeepLab
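
On the target branch, pixels that the pseudo-labeling step marked 255 should contribute no gradient, which an ordinary cross-entropy loss with an ignore index handles. A minimal, self-contained illustration (shapes and names are made up for the example):

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss(ignore_index=255)

# Illustrative shapes: batch of 2, 19 CityScapes classes, 256x512 crops.
tgt_pred = torch.randn(2, 19, 256, 512)            # network logits
tgt_pseudo = torch.randint(0, 19, (2, 256, 512))   # pseudo labels
tgt_pseudo[:, :64, :] = 255                        # region dropped as uncertain
loss_target = criterion(tgt_pred, tgt_pseudo.long())
print(loss_target.item())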

Evaluation

The pre-trained model can be downloaded here: GTA5_deeplab. You can evaluate either the pre-trained model or your own model as follows:

python evaluation.py --restore-from ./snapshots/gta2city \
                     --save /path/to/cityscapes/results
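
CityScapes results are conventionally summarized as mean IoU over the 19 evaluation classes. The sketch below shows the standard confusion-matrix computation of that metric; it is an assumption about the bookkeeping, not evaluation.py's exact code:

import numpy as np

# Mean IoU from a confusion matrix, skipping void pixels (label 255).
def mean_iou(preds, labels, num_classes=19, ignore_label=255):
    hist = np.zeros((num_classes, num_classes), dtype=np.int64)
    for p, l in zip(preds, labels):        # one (pred, label) map per image
        mask = l != ignore_label
        hist += np.bincount(
            num_classes * l[mask].astype(int) + p[mask].astype(int),
            minlength=num_classes ** 2).reshape(num_classes, num_classes)
    iou = np.diag(hist) / (hist.sum(0) + hist.sum(1) - np.diag(hist) + 1e-10)
    return iou.mean()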

Others

The different initial models can be downloaded here:

If you want to use BDL with the SYNTHIA dataset or with the VGG-FCN model, you can pass '--source synthia' or '--model VGG'. The pre-trained models for SYNTHIA with DeepLab or VGG can be downloaded here:

The pre-trained model for GTA5 with VGG can be downloaded here:

Acknowledgment

This code is heavily borrowed from AdaptSegNet.
