
rebias's People

Contributors

chaht01, coallaoh, sanghyukchun


rebias's Issues

Could HSIC be negative?

Dear authors,
When I ran my experiments using your HSIC implementation, I found that the HSIC value could be negative. Is that expected? Did you observe this phenomenon in your experiments?
Thanks in advance!
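For context, here is a minimal numpy sketch of the unbiased HSIC estimator of Song et al. (not your exact implementation; the data and kernel bandwidth are illustrative), which can indeed dip slightly below zero on finite samples of independent variables:

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Pairwise squared distances, then a Gaussian kernel.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * sigma ** 2))

def hsic_unbiased(K, L):
    # Unbiased HSIC estimator (Song et al.); zero expectation under
    # independence, so finite-sample values can be slightly negative.
    n = K.shape[0]
    K = K.copy()
    L = L.copy()
    np.fill_diagonal(K, 0.0)
    np.fill_diagonal(L, 0.0)
    term1 = np.trace(K @ L)
    term2 = K.sum() * L.sum() / ((n - 1) * (n - 2))
    term3 = 2 * (K @ L).sum() / (n - 2)
    return (term1 + term2 - term3) / (n * (n - 3))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Y = rng.normal(size=(50, 3))  # independent of X
val = hsic_unbiased(rbf_kernel(X), rbf_kernel(Y))
print(val)  # near zero; may be slightly below zero for independent inputs
```

The population HSIC is non-negative, but the unbiased estimator only has non-negative expectation, so small negative estimates on finite samples are normal.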

Why is the HSIC not minimized but maximized?

Thank you for such great work! While reading the paper and code, I had the following questions.

According to the definition of HSIC, it measures the level of independence and HSIC(U, V) = 0 indicates that U and V are independent. A larger HSIC value indicates that U and V are dependent to some extent.

So, to debias the representation of network f by using a biased network g, shouldn't we minimize HSIC(f, g)?

Besides, the for loop at line 62 seems redundant, because g_dim will be overwritten by the last iteration, right?

Looking forward to your reply. Thanks!
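To make my reading of the objective concrete: as I understand it, the min-max in the paper means the f-update does minimize HSIC(f, g), while it is the inner g-update that maximizes it (keeping g maximally biased). A toy numpy sketch with the standard biased linear-kernel HSIC estimator (the data and dimensions are illustrative, not the paper's models), showing HSIC is large exactly when f and g rely on the same cue:

```python
import numpy as np

def hsic_biased(F, G):
    # Biased HSIC estimator with linear kernels: tr(K H L H) / (n-1)^2,
    # where H = I - 11^T / n is the centering matrix.
    n = F.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(F @ F.T @ H @ G @ G.T @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 5))

F_shared = X[:, :2]                                    # stand-in for f's features
G_biased = X[:, :2] + 0.1 * rng.normal(size=(64, 2))   # g captures the same (biased) cue
G_other = X[:, 2:4]                                    # g once f has moved off that cue

# HSIC is large when f and g share a cue, near zero otherwise,
# so minimizing it over f (against a maximizing g) de-biases f.
print(hsic_biased(F_shared, G_biased), hsic_biased(F_shared, G_other))
```

So the maximization the code performs is over g only; f still descends on HSIC.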

Discrepancies in reproducing results

Description:
I encountered discrepancies while attempting to reproduce the results presented in your paper titled "Learning De-biased Representations with Biased Representations." Despite following the provided methodology and using the suggested code and parameters on GitHub, the obtained results differ from what was reported in the paper.

Expected Results:

  • Rubi 0.99:

    • Run 1: ~0.936
    • Run 2: ~0.936
    • Run 3: ~0.936
  • ReBias 0.999:

    • Run 1: ~0.227
    • Run 2: ~0.227
    • Run 3: ~0.227

Actual Results:

  • Rubi 0.99:

    • Run 1: 0.8578
    • Run 2: 0.8666
    • Run 3: 0.8578
  • ReBias 0.999:

    • Run 1: 0.2419
    • Run 2: 0.244
    • Run 3: 0.253

Steps to Reproduce:

  1. Clone the repository and install requirements:
     git clone https://github.com/clovaai/rebias.git
     cd rebias
     pip install -r requirements.txt
  2. For Rubi 0.99:
     python main_biased_mnist.py --root data/ --train_correlation 0.99 --outer_criterion RUBi --g_lambda_inner 0
  3. For ReBias 0.999:
     python main_biased_mnist.py --root data/ --train_correlation 0.999

I would greatly appreciate your assistance in understanding the reasons behind these inconsistencies.

Thank you for your time, and I look forward to your response.

Sincerely,
Alexandre DEVILLERS

RuntimeError for LearnedMixin on MNIST

Hello, I'm trying to run your LearnedMixin implementation on MNIST but I'm getting the following error:

[2021-07-03 06:58:30] start training
Traceback (most recent call last):
  File "main_biased_mnist.py", line 138, in <module>
    fire.Fire(main)
  File "/home/barbano/.pyenv/versions/rebias/lib/python3.7/site-packages/fire/core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/home/barbano/.pyenv/versions/rebias/lib/python3.7/site-packages/fire/core.py", line 471, in _Fire
    target=component.__name__)
  File "/home/barbano/.pyenv/versions/rebias/lib/python3.7/site-packages/fire/core.py", line 681, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "main_biased_mnist.py", line 134, in main
    save_dir=save_dir)
  File "/home/barbano/rebias/trainer.py", line 390, in train
    self._train_epoch(tr_loader, cur_epoch)
  File "/home/barbano/rebias/trainer.py", line 362, in _train_epoch
    self._update_f(x, labels, loss_dict=loss_dict, prefix='train__')
  File "/home/barbano/rebias/trainer.py", line 340, in _update_f
    _f_loss_indep = self.outer_criterion(f_feats, _g_feats, labels=labels, f_pred=preds)
  File "/home/barbano/.pyenv/versions/rebias/lib/python3.7/site-packages/torch/nn/modules/module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/barbano/rebias/criterions/comparison_methods.py", line 92, in forward
    loss = F.cross_entropy(f_pred+g_pred, labels)
RuntimeError: The size of tensor a (10) must match the size of tensor b (128) at non-singleton dimension 1
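For what it's worth, the traceback looks like LearnedMixin's forward receives g's 128-dimensional penultimate features where 10-dimensional class logits are expected. A minimal numpy reproduction of the shape mismatch (numpy raises the analogous broadcasting error to PyTorch; the batch size of 256 is illustrative):

```python
import numpy as np

f_pred = np.zeros((256, 10))    # class logits from f: (batch, num_classes)
g_feats = np.zeros((256, 128))  # penultimate features from g, not logits

try:
    _ = f_pred + g_feats        # mirrors f_pred + g_pred in comparison_methods.py
except ValueError as err:
    print(err)                  # numpy reports the incompatible trailing dimensions

g_pred = np.zeros((256, 10))    # passing g's class logits instead makes the shapes agree
out = f_pred + g_pred
```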

Appeal for the Imagenet texture cluster code?

Dear authors,

Hi, thanks for your great work; this ICML paper has given me many insights.
I am now trying to follow up on your work, and I would like to obtain the texture-feature clusters for the ImageNet training set.

Could you please provide the texture feature cluster code for reference?
Thanks a lot for your kind help.

Best,
Tan

"CosineAnnealingLR" in color-mnist experiment

Thanks to the authors for releasing the code.
I changed "StepLR" to "CosineAnnealingLR" in the color-MNIST experiments (vanilla approach). The accuracy is much better than that reported in the paper. Could the authors explain how the hyperparameters were selected (e.g., StepLR vs. CosineAnnealingLR)?
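For reference, the two schedules have quite different shapes; a pure-Python sketch of both update rules (the epoch counts, decay factor, and base LR here are illustrative, not the paper's settings):

```python
import math

def cosine_annealing_lr(base_lr, t, t_max, eta_min=0.0):
    # Cosine annealing as in PyTorch's CosineAnnealingLR:
    # smoothly decays from base_lr at t=0 to eta_min at t=t_max.
    return eta_min + (base_lr - eta_min) * (1 + math.cos(math.pi * t / t_max)) / 2

def step_lr(base_lr, t, step_size, gamma=0.1):
    # StepLR: multiply by gamma every step_size epochs (piecewise constant).
    return base_lr * gamma ** (t // step_size)

# Compare the two schedules across training (illustrative values).
for t in (0, 20, 40, 79):
    print(t, step_lr(1e-3, t, step_size=20), cosine_annealing_lr(1e-3, t, t_max=80))
```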

9 class imagenet cluster labels

Dear Authors,
Thanks for sharing your code. I would like to reproduce the 9-class ImageNet clustering: I saw your .pth checkpoints, but there is no reference to the images, so I am not sure whether I am using the same split as yours.

Could you please share a .csv file with filenames and associated cluster labels?

Thanks in advance

Code doubts

Hello, I wonder whether the line 'KH = K - K.mean(0, keepdim=True)' is correct. I think it should be 'KH = K - K.mean(1, keepdim=True)'.
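For what it's worth, a quick numpy check suggests the two centerings are equivalent inside the HSIC trace: with H = I - 11^T/n, K - K.mean(0) equals HK and K - K.mean(1) equals KH, and tr((HK)(HL)) = tr((KH)(LH)) = tr(KHLH) by cyclicity of the trace (numpy's keepdims mirrors torch's keepdim here):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(16, 16))
B = rng.normal(size=(16, 16))
K = A @ A.T  # symmetric PSD kernel matrix
L = B @ B.T

KH_cols = K - K.mean(0, keepdims=True)  # subtract column means: H @ K
KH_rows = K - K.mean(1, keepdims=True)  # subtract row means:    K @ H
LH_cols = L - L.mean(0, keepdims=True)
LH_rows = L - L.mean(1, keepdims=True)

# Inside the HSIC trace, both choices give the same value.
t1 = np.trace(KH_cols @ LH_cols)
t2 = np.trace(KH_rows @ LH_rows)
print(np.allclose(t1, t2))
```

So the resulting HSIC value should be unchanged either way.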
