For the commonly used activation functions, include the following:
- initialization:
- regions of saturation:
- how to avoid saturation:
- activation progression with epochs:
- gradient progression with epochs:
- useful in:
Activation functions to include:
- sigmoid
- tanh
- relu
- leaky relu
- softsign
- maxout units
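As a starting point, the listed activations can be sketched as plain NumPy functions (the formulas are the standard definitions; the maxout shapes here are an illustrative choice, with one unit taking the max over k affine pieces):

```python
import numpy as np

def sigmoid(x):
    # squashes input to (0, 1); saturates for large |x|
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # squashes input to (-1, 1); zero-centered, saturates for large |x|
    return np.tanh(x)

def relu(x):
    # identity for positive inputs, zero otherwise; no saturation for x > 0
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # small slope alpha for negative inputs avoids "dead" units
    return np.where(x > 0, x, alpha * x)

def softsign(x):
    # like tanh but with polynomial (slower) saturation: x / (1 + |x|)
    return x / (1.0 + np.abs(x))

def maxout(x, W, b):
    # one maxout unit: max over k affine pieces
    # W has shape (k, d), b has shape (k,), x has shape (d,)
    return np.max(W @ x + b)
```

These are forward passes only; in practice the saturation behavior above is what drives the initialization and gradient-progression questions in the checklist.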