ALFs' Introduction

Asymmetric Loss Functions for Learning with Noisy Labels

This repository is the official implementation of Asymmetric Loss Functions for Learning with Noisy Labels [ICML 2021] and Asymmetric Loss Functions for Noise-Tolerant Learning: Theory and Applications [T-PAMI 2023].

Requirements

Python >= 3.6, PyTorch >= 1.3.1, torchvision >= 0.4.1, numpy >= 1.11.2, tqdm >= 4.50.2, seaborn >= 0.11.0, tensorboardX >= 2.5

Learning with Noisy Labels (LNL)

The main entry point is main.py, with the following arguments:

  • noise_type: symmetric | asymmetric
  • noise_rate: the rate of label corruption (e.g. 0.4)
  • loss: AGCE | AUL | AEL | CE (Cross Entropy) | FL (Focal Loss) | MAE | GCE | SCE | NFL | NCE | ...

The detailed implementations of the proposed asymmetric losses for classification can be found in ./lnl/losses.py.
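For reference, a minimal PyTorch sketch of one of the proposed losses, the asymmetric exponential loss (AEL), could look like the following. The class name and the default scale a = 2.5 are illustrative assumptions; consult ./lnl/losses.py for the exact implementation used here.

import torch
import torch.nn.functional as F

class AELoss(torch.nn.Module):
    """Sketch of the asymmetric exponential loss: l(p_y) = exp(-p_y / a)."""
    def __init__(self, num_classes=10, a=2.5):  # a = 2.5 is an assumed default
        super().__init__()
        self.num_classes = num_classes
        self.a = a

    def forward(self, logits, labels):
        probs = F.softmax(logits, dim=1)                       # (N, C) class probabilities
        one_hot = F.one_hot(labels, self.num_classes).float()  # (N, C) target indicators
        p_y = (one_hot * probs).sum(dim=1)                     # probability of the true class
        return torch.exp(-p_y / self.a).mean()                 # bounded penalty even when p_y -> 0

A loss of this shape can stand in for nn.CrossEntropyLoss in a standard training loop without further changes.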

Example for 0.4 symmetric noise rate with the AUL loss

# CIFAR-10
$ python3 main.py --noise_type symmetric \
                  --noise_rate 0.4 \
                  --loss AUL

Self-supervised Image Denoising

The main entry point is main.py, with the following arguments:

  • exp: n2c | n2n | n2s
  • style: gauss | bernoulli | saltpepper | impulse
  • loss: heat | poisson | lp | mse | ...

The detailed implementations of the proposed asymmetric losses for regression can be found in ./denoising/losses.py.
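As a rough sketch only: assuming the negative heat kernel loss follows the standard Gaussian heat kernel applied per pixel, with the bandwidth t parsed from the flag value (e.g. heat0.1 gives t = 0.1), it could be written as follows. The function name and normalization are assumptions; the authoritative definition is in ./denoising/losses.py.

import torch

def negative_heat_kernel_loss(pred, target, t=0.1):
    # Assumed form: -exp(-(pred - target)^2 / (4t)), averaged over pixels.
    # The loss is minimized when pred == target and saturates for large
    # residuals, which dampens the influence of heavily corrupted pixels.
    sq_err = (pred - target) ** 2
    return -torch.exp(-sq_err / (4.0 * t)).mean()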

Example for using the negative heat kernel loss for Gaussian denoising with noise2self

$ python3 main.py --exp n2s \
                  --style gauss15 \
                  --loss heat0.1

References

For technical details and full experimental results, please see the papers. If you use our work in your own, please consider citing:

@InProceedings{zhou2021asymmetric,
  title     = {Asymmetric Loss Functions for Learning with Noisy Labels},
  author    = {Zhou, Xiong and Liu, Xianming and Jiang, Junjun and Gao, Xin and Ji, Xiangyang},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {12846--12856},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR}
}

@Article{10039708,
  author  = {Zhou, Xiong and Liu, Xianming and Zhai, Deming and Jiang, Junjun and Ji, Xiangyang},
  journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence},
  title   = {Asymmetric Loss Functions for Noise-Tolerant Learning: Theory and Applications},
  year    = {2023},
  pages   = {1-16},
  doi     = {10.1109/TPAMI.2023.3236459}
}

We also thank Ma et al. (classification) and Zhang et al. (DnCNN) for their code implementations.

ALFs' Issues

Request for kind citation of our two papers, as they are highly relevant.

Dear authors,

How are you?

I read your great work on asymmetric loss functions: Asymmetric Loss Functions for Learning with Noisy Labels, ICML 2021.

However, we have two pieces of work on this aspect as well, which I believe are highly relevant.

(1) In Derivative Manipulation for General Example Weighting (https://arxiv.org/pdf/1905.11233.pdf), we mentioned:

[screenshot of the cited passage]

(2) In IMAE for Noise-Robust Learning: Mean Absolute Error Does Not Treat Examples Equally and Gradient Magnitude's Variance Matters, we mentioned:

[screenshot of the cited passage]

Both papers are included in my PhD thesis: Example Weighting for Deep Representation Learning, Xinshao Wang, 2020.

Therefore, could I kindly ask you to cite these two papers? If you could update your arXiv version to cite them, I would appreciate it a lot.

Thanks very much.
Kind regards,

Implementation problem with the NLNL loss

Hi,

Thanks for this great implementation. When I ran the code, I found that the implementation of the NLNL loss may have a problem that causes the loss to become NaN. It appears to be caused by this line (ALFs/losses.py, line 233 at commit f9c5bf3):

labels = labels * 0 - 100

This line sets all labels to -100. Would you mind checking this issue?
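(For context, a minimal reproduction, assuming the labels are subsequently passed to a loss that uses PyTorch's default ignore_index of -100, such as F.nll_loss: when every target is ignored, the mean reduces over zero elements and yields NaN.)

import torch
import torch.nn.functional as F

labels = torch.tensor([3, 1, 7])
labels = labels * 0 - 100                             # every label becomes -100
log_probs = F.log_softmax(torch.randn(3, 10), dim=1)
loss = F.nll_loss(log_probs, labels)                  # -100 is the default ignore_index,
print(loss)                                           # so all targets are skipped -> tensor(nan)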

Best Regards,
Hongxin

Adjustment of hyper-parameters

Hi,
Thanks for this great implementation.
The proposed ALFs have several hyper-parameters, and the paper reports the test accuracy obtained with different values of each. However, there is no information on how the hyper-parameters are adjusted during the training phase. How did you choose their values? Is there a validation set with noisy labels? If so, how do you evaluate performance on such a noisy validation set? Looking forward to your reply.

Best Regards,
