
gan-slimming's Introduction

GAN-Slimming

License: MIT

GAN Slimming: All-in-One GAN Compression by A Unified Optimization Framework

Haotao Wang, Shupeng Gui, Haichuan Yang, Ji Liu, Zhangyang Wang

In ECCV 2020 (Spotlight)

Overview

An all-in-one GAN compression method integrating model distillation, channel pruning and quantization under a unified GAN minimax optimization framework.
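To make the "all-in-one" objective concrete, here is a minimal sketch of the kind of generator loss GS optimizes: a distillation term plus an adversarial term plus an L1 sparsity penalty on the channel-wise scale factors. The function name and the scalar inputs are illustrative; the actual implementation lives in gs.py.

```python
# Illustrative sketch of the GS generator objective (hypothetical names).
# The L1 penalty on the channel-wise scale factors gamma drives channels
# toward zero, which is what enables pruning later.

def gs_generator_loss(distill_loss, adv_loss, gammas, rho=0.01):
    """Total loss: distillation + adversarial + rho * ||gamma||_1."""
    l1_sparsity = sum(abs(g) for g in gammas)
    return distill_loss + adv_loss + rho * l1_sparsity

# Example: gammas at exactly zero mark prunable channels.
loss = gs_generator_loss(distill_loss=1.5, adv_loss=0.7,
                         gammas=[0.9, 0.0, 0.3, 0.0], rho=0.01)
print(loss)  # roughly 2.212
```

The `rho` default mirrors the `--rho 0.01` used in the training commands below; larger values prune more aggressively.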

Visualization Results

Image-to-image translation by (compressed) CycleGAN:

Training

1. Download dataset:

./download_dataset <dataset_name>

This will download the dataset to folder datasets/<dataset_name> (e.g., datasets/summer2winter_yosemite).

2. Get the original dense CycleGAN:

summer2winter_yosemite dataset

Use the official CycleGAN codes to train original dense CycleGAN.

horse2zebra dataset

On the horse2zebra dataset, initializing G and D from pretrained dense models is necessary for GAN-Slimming. Download the dense models for GS32 and GS8 from here and here respectively, and put them under the project root path.

3. Generate style transfer results on training set

Use the pretrained dense generator to generate style transfer results on the training set and put them in folder train_set_result/<dataset_name>. For example, train_set_result/summer2winter_yosemite/B/2009-12-06 06:58:39_fake.png is the fake winter image transferred from the real summer image datasets/summer2winter_yosemite/A/2009-12-06 06:58:39.png using the original dense CycleGAN.
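The naming convention above can be sketched as a small path helper. The directory layout is taken from the README's example; the helper function itself is hypothetical, not part of the repo.

```python
# Sketch of the expected teacher-output naming convention (hypothetical helper;
# the paths follow the README's example).
import os

def teacher_output_path(dataset_name, domain, filename):
    """Map a real image name to its fake counterpart under train_set_result/.

    E.g. a real summer image datasets/<name>/A/x.png has its fake winter
    translation saved as train_set_result/<name>/B/x_fake.png.
    """
    stem, ext = os.path.splitext(filename)
    target_domain = 'B' if domain == 'A' else 'A'
    return os.path.join('train_set_result', dataset_name,
                        target_domain, stem + '_fake' + ext)

print(teacher_output_path('summer2winter_yosemite', 'A',
                          '2009-12-06 06:58:39.png'))
# train_set_result/summer2winter_yosemite/B/2009-12-06 06:58:39_fake.png
```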

4. Compress

GS-32:

python gs.py --rho 0.01 --dataset <dataset_name> --task <task_name>

GS-8:

python gs.py --rho 0.01 --quant --dataset <dataset_name> --task <task_name>
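The --quant flag enables 8-bit quantization. A minimal sketch of uniform quantize-dequantize ("fake quantization"), the kind of operation applied during quantization-aware training, assuming a fixed clipping range; the repo's actual Conv2dQuant / ConvTrans2dQuant operators may differ in detail.

```python
# Hedged sketch of uniform 8-bit fake quantization (quantize-dequantize).
# Values are clipped to [lo, hi], snapped to one of 256 levels, and mapped
# back to float, so training still runs in floating point.

def fake_quant_uint8(x, lo, hi):
    """Quantize x in [lo, hi] to 256 levels, then dequantize back to float."""
    n_levels = 255
    scale = (hi - lo) / n_levels
    q = round((min(max(x, lo), hi) - lo) / scale)  # integer code in [0, 255]
    return lo + q * scale

vals = [-1.0, -0.37, 0.0, 0.5, 1.0]
print([round(fake_quant_uint8(v, -1.0, 1.0), 4) for v in vals])
```

Endpoints of the range are represented exactly; interior values incur at most half a quantization step of error.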

The training results (checkpoints, loss curves, etc.) will be saved in results/<dataset_name>/<task_name>. Valid <dataset_name>s are: horse2zebra, summer2winter_yosemite. Valid <task_name>s are: A2B, B2A. (For example, horse2zebra/A2B means transferring horse to zebra and horse2zebra/B2A means transferring zebra to horse.)

5. Extract compact subnetwork obtained by GS

GAN slimming has pruned some channels in the network by setting the channel-wise mask to zero. Now we need to extract the actual compressed subnetwork.

python extract_subnet.py --dataset <dataset_name> --task <task_name> --model_str <model_str> 

The extracted subnetworks will be saved in subnet_structures/<dataset_name>/<task_name>
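The extraction step amounts to keeping only the channels whose learned scale factor (mask) is nonzero. A minimal sketch with illustrative names; the real logic is in extract_subnet.py.

```python
# Sketch of subnetwork extraction: GS zeroes the channel-wise scale factors
# of pruned channels, so the compact model keeps only the surviving channels.
# Names are illustrative, not the repo's actual API.

def surviving_channels(gamma, eps=1e-8):
    """Indices of output channels whose scale factor was not pruned to zero."""
    return [i for i, g in enumerate(gamma) if abs(g) > eps]

def extract_layer(weight_per_channel, gamma):
    """Keep only the weight slices of surviving channels."""
    keep = surviving_channels(gamma)
    return [weight_per_channel[i] for i in keep], keep

gamma = [0.8, 0.0, 0.0, 0.2]             # two of four channels pruned
weights = ['w0', 'w1', 'w2', 'w3']       # stand-ins for per-channel tensors
sub_w, keep = extract_layer(weights, gamma)
print(sub_w, keep)  # ['w0', 'w3'] [0, 3]
```

The kept indices for one layer also determine the input-channel slicing of the next layer, which is why extraction is done over the whole network at once.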

6. Finetune subnetwork

python finetune.py --dataset <dataset_name> --task <task_name> --base_model_str <base_model_str>

Finetune results will be saved in finetune_results/<dataset_name>/<task_name>

Pretrained Models

Pretrained models are available through Google Drive.

Citation

If you use this code for your research, please cite our paper.

@inproceedings{wang2020ganslimming,
  title={GAN Slimming: All-in-One GAN Compression by A Unified Optimization Framework},
  author={Wang, Haotao and Gui, Shupeng and Yang, Haichuan and Liu, Ji and Wang, Zhangyang},
  booktitle={European Conference on Computer Vision},
  year={2020}
}

Our Related Work

Please also check our concurrent work on combining neural architecture search (NAS) and model distillation for GAN compression:

Yonggan Fu, Wuyang Chen, Haotao Wang, Haoran Li, Yingyan Lin, and Zhangyang Wang. "AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks." ICML, 2020. [pdf] [code]

gan-slimming's People

Contributors

htwang14


gan-slimming's Issues

RuntimeError: Error(s) in loading state_dict for Generator:

Thanks for your great work! I have trained my CycleGAN model with the official CycleGAN codes, but when I run gs.py this error is shown:
RuntimeError: Error(s) in loading state_dict for Generator:
Missing key(s) in state_dict: "model.2.weight", "model.2.bias", "model.5.weight", "model.5.bias", "model.10.conv_block.2.weight", "model.10.conv_block.2.bias", "model.11.conv_block.2.weight", "model.11.conv_block.2.bias", "model.12.conv_block.2.weight", "model.12.conv_block.2.bias", "model.13.conv_block.2.weight", "model.13.conv_block.2.bias", "model.14.conv_block.2.weight", "model.14.conv_block.2.bias", "model.15.conv_block.2.weight", "model.15.conv_block.2.bias", "model.16.conv_block.2.weight", "model.16.conv_block.2.bias", "model.17.conv_block.2.weight", "model.17.conv_block.2.bias", "model.18.conv_block.2.weight", "model.18.conv_block.2.bias", "model.20.weight", "model.20.bias", "model.23.weight", "model.23.bias".

I didn't change the official CycleGAN codes, and the options I used to train my model are "python train.py --dataroot ./datasets/Task12_BBR2color --name BBR2color --model cycle_gan --pool_size 50 --no_dropout --gpu_ids 1 --preprocess scale_width_and_crop --load_size 1920 --crop_size 360 --batch_size 1 --display_port 2020".
The PyTorch version is 1.7.

Ask for some details

What exciting work! I noticed two details when running the code.
(1) In both GS and fine-tuning, G and D use a smaller learning rate, while the normalization scale factors gamma use a larger one. Why?
(2) When running gs.py, only the horse2zebra dataset (line 104) needs to load a pretrained model and retrain. Why?

Confusion about the dense model

Thank you for the great work; I have some confusion about the dense model. Why is a dense model needed?
The dataset is already one-to-one.
I think we could skip directly to the fourth step without a dense model,
but I don't know if my idea is correct. Hope you can help me.

compressed horse2zebra model

Hello, thanks for sharing the code.
I trained the CycleGAN model on the horse2zebra dataset several times without --quant, but the best FID is about 97. I wonder if I should change some hyperparameters when training the model without quantization?
And could you please upload the horse2zebra model as a reference, thank you very much!

Steps for point no. 3. Generate style transfer results on training set

Hi,
Could you please explain steps for how to Generate style transfer results on the training set? Which file do we need to run?
Do I need to train your pre-trained models again? I used the GS-32 pre-trained dense model, but I am not getting good results. Could you please help me?
With Regards,
Jatin Kumar

int8 quantized model

Hi! @htwang14
Thanks for your nice work! I have seen the Conv2dQuant and ConvTrans2dQuant operators in the models/models.py file, but the feature maps before the quantized convolutions and the weights are still floating-point values.
How can I get an int8 quantized model that can actually be saved in 8-bit format?

Why is MSE worse than the perceptual loss?

Having read the paper and gs.py, my impression is that this essentially retrains a pix2pix-style G on images generated by CycleGAN, with an added pruning loss. Is my understanding correct?

Since the goal is to measure the gap between student_output_img and teacher_output_img, why is the perceptual loss stronger than MSE?
My understanding is that the training objective is simply to make the output of G_p2p (student) fit the output of G_cyclegan (teacher). Why is a direct constraint like MSE not as effective as a more semantics-oriented constraint like the perceptual loss?
