tricks-used-in-deep-learning
Tricks used in deep learning, including papers read recently.
Improving softmax
Gumbel-Softmax: Categorical Reparameterization with Gumbel-Softmax (see the first sketch after this list)
Confidence penalty: Regularizing Neural Networks by Penalizing Confident Output Distributions (see the second sketch after this list)
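A minimal sketch of Gumbel-Softmax sampling; the PyTorch framework and the function name are our choices, not the paper's. Gumbel(0, 1) noise is added to the logits and passed through a temperature-scaled softmax, giving a differentiable relaxation of a categorical sample:

```python
import torch
import torch.nn.functional as F

def gumbel_softmax_sample(logits, tau=1.0):
    # Gumbel(0, 1) noise: g = -log(-log(u)), u ~ Uniform(0, 1)
    u = torch.rand_like(logits)
    g = -torch.log(-torch.log(u + 1e-20) + 1e-20)
    # Temperature-scaled softmax over the perturbed logits; as tau -> 0
    # the output approaches a one-hot sample from Categorical(softmax(logits))
    return F.softmax((logits + g) / tau, dim=-1)

# Usage: a soft, differentiable "sample" over 3 categories
sample = gumbel_softmax_sample(torch.tensor([[0.5, 1.0, -0.5]]), tau=0.5)
```

And a sketch of the confidence penalty as a loss term, under the same assumptions (the name `beta` for the penalty weight is ours): the entropy of the output distribution is subtracted from the cross-entropy loss, so confident, low-entropy predictions are penalized:

```python
import torch
import torch.nn.functional as F

def confidence_penalized_loss(logits, targets, beta=0.1):
    # Standard cross-entropy term
    ce = F.cross_entropy(logits, targets)
    # Entropy of the predicted distribution, H(p) = -sum_i p_i log p_i
    log_p = F.log_softmax(logits, dim=-1)
    entropy = -(log_p.exp() * log_p).sum(dim=-1).mean()
    # Subtracting beta * H(p) penalizes confident (low-entropy) outputs
    return ce - beta * entropy

logits = torch.randn(8, 10)            # batch of 8, 10 classes
targets = torch.randint(0, 10, (8,))
loss = confidence_penalized_loss(logits, targets)
```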
Normalization
Weight normalization: Weight Normalization: A Simple Reparameterization to Accelerate Training of Deep Neural Networks (see the first sketch after this list)
Batch Renormalization: Batch Renormalization: Towards Reducing Minibatch Dependence in Batch-Normalized Models (see the second sketch after this list)
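A sketch of the weight-normalization reparameterization w = g * v / ||v|| as a linear layer, assuming PyTorch; the magnitude g and the direction v / ||v|| of each weight vector are learned separately, which is the paper's core idea:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightNormLinear(nn.Module):
    # Reparameterize the weight as w = g * v / ||v||, decoupling the
    # norm g of each row from its direction v / ||v||
    def __init__(self, in_features, out_features):
        super().__init__()
        self.v = nn.Parameter(torch.randn(out_features, in_features) * 0.05)
        self.g = nn.Parameter(torch.ones(out_features))
        self.b = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        w = self.g.unsqueeze(1) * self.v / self.v.norm(dim=1, keepdim=True)
        return F.linear(x, w, self.b)
```

PyTorch also ships a hook-based version, `torch.nn.utils.weight_norm`, that wraps an existing layer; the class above only spells out the arithmetic.

A simplified sketch of batch renormalization for a 2-D `[batch, features]` input, with the usual affine gamma/beta scale and shift omitted; the correction terms r and d pull the minibatch statistics toward the moving averages and are treated as constants (no gradient), as in the paper:

```python
import torch

def batch_renorm(x, running_mean, running_std, momentum=0.01,
                 r_max=3.0, d_max=5.0, eps=1e-5):
    # Per-feature statistics of the current minibatch
    mu_b = x.mean(dim=0)
    sigma_b = x.std(dim=0, unbiased=False) + eps
    with torch.no_grad():
        # r and d correct toward the moving averages; no gradient flows
        r = (sigma_b / running_std).clamp(1.0 / r_max, r_max)
        d = ((mu_b - running_mean) / running_std).clamp(-d_max, d_max)
        # Update the moving averages used at inference time
        running_mean += momentum * (mu_b - running_mean)
        running_std += momentum * (sigma_b - running_std)
    # With r = 1 and d = 0 this reduces to plain batch normalization
    return (x - mu_b) / sigma_b * r + d
```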
Weight compression
Soft weight-sharing: Soft Weight-Sharing for Neural Network Compression (see the sketch below)
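A rough sketch of the soft weight-sharing penalty, assuming PyTorch; all names here are ours. The paper retrains the network with the negative log-likelihood of its weights under a learned Gaussian mixture added to the task loss, so the weights cluster around a few shared values that can then be quantized:

```python
import math
import torch

def mixture_prior_nll(weights, means, log_stds, logit_pis):
    # Negative log-likelihood of all weights under a Gaussian mixture
    # whose means, scales, and mixing proportions are themselves learned
    pis = torch.softmax(logit_pis, dim=0)
    stds = log_stds.exp()
    w = weights.reshape(-1, 1)                     # (n_weights, 1)
    # log N(w | mu_j, sigma_j) for every weight / component pair
    log_norm = (-0.5 * ((w - means) / stds) ** 2
                - log_stds - 0.5 * math.log(2 * math.pi))
    # log sum_j pi_j N(w | mu_j, sigma_j), summed over all weights
    return -torch.logsumexp(log_norm + pis.log(), dim=1).sum()
```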
GAN
Wasserstein GAN: Example on MNIST (see the first sketch after this list)
Loss-sensitive GAN: Loss-Sensitive Generative Adversarial Networks on Lipschitz Densities (see the second sketch after this list)
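A sketch of the WGAN critic and generator objectives together with the paper's weight clipping, assuming PyTorch; `critic` stands for any network that outputs one scalar score per sample:

```python
import torch

def critic_loss(critic, real, fake):
    # The critic maximizes f(real) - f(fake), an estimate of the
    # Wasserstein distance, so we minimize its negative
    return -(critic(real).mean() - critic(fake).mean())

def generator_loss(critic, fake):
    # The generator tries to raise the critic's score on its samples
    return -critic(fake).mean()

def clip_critic_weights(critic, c=0.01):
    # Weight clipping is the paper's blunt way of keeping the critic
    # (approximately) Lipschitz; apply after every critic update
    with torch.no_grad():
        for p in critic.parameters():
            p.clamp_(-c, c)
```

LS-GAN instead learns a loss function L_theta constrained so that real samples receive a lower loss than fakes by a data-dependent margin Delta(x, G(z)). A rough sketch under the same assumptions, with Delta taken here as an L1 distance for illustration:

```python
import torch
import torch.nn.functional as F

def loss_fn_objective(loss_fn, real, fake, lam=1.0):
    # Margin: pairs that are far apart must be separated by more
    delta = (real - fake).abs().flatten(1).sum(dim=1)
    l_real = loss_fn(real).squeeze(-1)
    l_fake = loss_fn(fake).squeeze(-1)
    # Hinge enforcing L(x) + Delta <= L(G(z)), plus the loss on real data
    return l_real.mean() + lam * F.relu(delta + l_real - l_fake).mean()

def lsgan_generator_objective(loss_fn, fake):
    # The generator lowers the learned loss on its own samples
    return loss_fn(fake).mean()
```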