Comments (2)
@lileicv
Good to hear that our method still has room for improvement!
We could not put much effort into the optimization hyperparameter search.
For the optimization parameters, we chose values that stably reduce the training loss. There was no specific quantitative criterion, but we did not fit the parameters on the test set (in particular, we hid the unbiased test set during the hyperparameter search).
It would have been better to test all possible combinations of learning rate, weight decay, scheduler (e.g., warm-up, cosine, linear, exponential, step, and multi-step decays, ...), and optimizer (we now strongly recommend our new optimizer AdamP, ICLR'21), but we chose very basic and popular parameters instead. Overly fine optimization hyperparameter tuning often makes analysis difficult -- what if one chooses a very complex hyperparameter setting that only works for a specific method? That is why we stuck to very basic optimization parameters in this paper.
We chose different hyperparameters for the ImageNet and Kinetics experiments because they are large-scale compared to MNIST. We chose CosineAnnealingLR there because we empirically know it is effective for ImageNet training (e.g., many state-of-the-art ImageNet classifiers use cosine / exponential learning rate decay). We did not have time to re-test cosine learning rate decay on MNIST when we wrote the paper, but I would not be surprised if the ImageNet settings performed better than the current MNIST settings.
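For readers unfamiliar with cosine annealing: it decays the learning rate smoothly from the base value to a minimum over the training horizon, following the same formula PyTorch's CosineAnnealingLR uses. A minimal stdlib-only sketch (the function name and constants here are illustrative, not from the ReBias code):

```python
import math

def cosine_annealing_lr(step, total_steps, base_lr, min_lr=0.0):
    """Cosine-annealed LR at `step`, following the CosineAnnealingLR formula:
    eta_t = eta_min + 0.5 * (eta_max - eta_min) * (1 + cos(pi * t / T))."""
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * step / total_steps))

# The schedule starts at base_lr, passes through the midpoint halfway, and ends at min_lr.
print(cosine_annealing_lr(0, 100, 0.1))    # base LR at the start
print(cosine_annealing_lr(50, 100, 0.1))   # midpoint of the decay
print(cosine_annealing_lr(100, 100, 0.1))  # ~min_lr at the end
```

The smooth, slowly flattening tail is often credited for the small accuracy gains over step decay on large-scale training runs.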
from rebias.
@SanghyukChun
Thanks for your reply and for sharing the code. It is great work.
from rebias.