The proposed generator learns both foreground and background attentions. It uses the foreground attention to select from the generated output for the foreground regions, while using the background attention to preserve the background information from the input image. Please refer to our paper for more details.
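The composition step described above can be sketched as follows. This is a minimal illustration with hypothetical names (`compose`, `fg_attention`, `bg_attention`), not the repository's actual API: the output takes foreground pixels from the generated content and background pixels from the input image, weighted by the two attention masks.

```python
import numpy as np

def compose(input_img, generated, fg_attention, bg_attention):
    """Blend generated foreground with input background via attention masks.

    input_img, generated: arrays of shape (H, W, 3) in [0, 1]
    fg_attention, bg_attention: arrays of shape (H, W, 1), summing to 1 per pixel
    """
    return fg_attention * generated + bg_attention * input_img

# Toy example: a 2x2 image where the left column is "foreground".
inp = np.zeros((2, 2, 3))           # black input image
gen = np.ones((2, 2, 3))            # white generated content
fg = np.array([[[1.0], [0.0]],
               [[1.0], [0.0]]])     # attend to the left column
bg = 1.0 - fg                       # masks sum to 1 at every pixel

out = compose(inp, gen, fg, bg)
print(out[0, 0, 0], out[0, 1, 0])   # left pixel from generated, right from input
```

In the actual model the masks are predicted by the generator and learned end-to-end, so the network decides which regions to translate and which to copy from the input.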
AttentionGAN: Unpaired Image-to-Image Translation using Attention-Guided Generative Adversarial Networks.
Hao Tang1, Hong Liu2, Dan Xu3, Philip H.S. Torr3 and Nicu Sebe1.
1University of Trento, Italy, 2Peking University, China, 3University of Oxford, UK.
This repository offers the official PyTorch implementation of our paper.
Copyright (C) 2019 University of Trento, Italy.
All rights reserved. Licensed under the CC BY-NC-SA 4.0 (Attribution-NonCommercial-ShareAlike 4.0 International)
The code is released for academic research use only. For commercial use, please contact [email protected].
Clone this repo.
git clone https://github.com/Ha0Tang/AttentionGAN
cd AttentionGAN/
This code requires PyTorch 0.4.1+ and Python 3.6.9+. Please install dependencies by
pip install -r requirements.txt (for pip users)
or
./scripts/conda_deps.sh (for Conda users)
To reproduce the results reported in the paper, you would need an NVIDIA TITAN Xp GPU.
Download the datasets using the following script. Please cite the original paper if you use the data.
bash ./datasets/download_cyclegan_dataset.sh dataset_name
- Download a dataset using the previous script (e.g., horse2zebra).
- To view training results and loss plots, run
python -m visdom.server
and click the URL http://localhost:8097.
- Train a model:
bash ./scripts/train_attentiongan.sh
- To see more intermediate results, check out ./checkpoints/horse2zebra_attentiongan/web/index.html.
- Test the model:
bash ./scripts/test_attentiongan.sh
- The test results will be saved to an HTML file: ./results/horse2zebra_attentiongan/latest_test/index.html.
- You need to download a pretrained model (e.g., horse2zebra) with the following script:
bash ./scripts/download_attentiongan_model.sh horse2zebra
- The pretrained model is saved at ./checkpoints/{name}_pretrained/latest_net_G.pth.
- Then generate the results using
python test.py --dataroot ./datasets/horse2zebra --name horse2zebra_pretrained --model attention_gan --dataset_mode unaligned --norm instance --phase test --no_dropout --load_size 256 --crop_size 256 --batch_size 1 --gpu_ids 0 --num_test 500 --epoch 60
The results will be saved at ./results/. Use --results_dir {directory_path_to_save_result} to specify the results directory.
- For your own experiments, you might want to specify --netG, --norm, --no_dropout to match the generator architecture of the trained model.
- FID: Official Implementation
- KID: Suggested by UGATIT.
Install steps:
conda create -n python36 python=3.6 anaconda
pip install --ignore-installed --upgrade tensorflow==1.13.1
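For intuition, the Fréchet Inception Distance compares the Gaussian statistics (mean and covariance) of Inception activations for real and generated images. The sketch below is a minimal illustration of that distance formula only, not the official implementation, which additionally extracts the activations with a pretrained Inception network:

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(mu1, cov1, mu2, cov2):
    """FID between two Gaussians: ||mu1 - mu2||^2 + Tr(C1 + C2 - 2*sqrt(C1 C2))."""
    covmean = sqrtm(cov1 @ cov2)
    if np.iscomplexobj(covmean):        # discard tiny imaginary parts from sqrtm
        covmean = covmean.real
    return float(np.sum((mu1 - mu2) ** 2)
                 + np.trace(cov1 + cov2 - 2.0 * covmean))

# Identical activation statistics give a distance of zero.
mu, cov = np.zeros(3), np.eye(3)
print(frechet_distance(mu, cov, mu, cov))  # 0.0
```

Lower FID indicates that the generated distribution is closer to the real one; KID replaces the Gaussian assumption with a polynomial-kernel MMD estimate.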
This source code is inspired by CycleGAN and SelectionGAN.
If you have any questions, comments, or bug reports, feel free to open a GitHub issue, submit a pull request, or e-mail the author Hao Tang ([email protected]).