The purpose of activation functions is to introduce non-linearity into the network: without them, stacked layers collapse into a single linear map and cannot model the non-linear patterns found in most real-world data. Three common choices, with their derivatives, are listed below; a short sketch verifying the formulas follows the list.
- tf.compat.v1.math.sigmoid(34.0)
Sigmoid: f(x) = 1 / (1 + e^(-x))
f'(x) = f(x) (1 - f(x))
- tf.compat.v1.math.tanh(34.0)
Hyperbolic tangent: f(x) = (e^x - e^(-x)) / (e^x + e^(-x))
f'(x) = 1 - f(x)^2
- tf.compat.v1.nn.relu(34.0)
Rectified Linear Unit: f(x) = max(0, x)
f'(x) = 1 if x > 0 else 0
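
A minimal sketch of the three activations, assuming TensorFlow 2.x (where the `tf.compat.v1` calls above run eagerly). It evaluates each function on a small tensor and uses `tf.GradientTape` to confirm that the autodiff gradients match the closed-form derivatives listed above:

```python
import tensorflow as tf

x = tf.constant([-2.0, 0.0, 3.0])

with tf.GradientTape(persistent=True) as tape:
    tape.watch(x)
    sig = tf.compat.v1.math.sigmoid(x)
    tanh = tf.compat.v1.math.tanh(x)
    relu = tf.compat.v1.nn.relu(x)

# Autodiff gradients vs. the closed-form derivatives:
print(tape.gradient(sig, x), sig * (1 - sig))               # f(x) (1 - f(x))
print(tape.gradient(tanh, x), 1 - tanh ** 2)                # 1 - f(x)^2
print(tape.gradient(relu, x), tf.cast(x > 0, tf.float32))   # 1 if x > 0 else 0
```

Note the behavior at the boundaries: sigmoid saturates for large inputs (sigmoid(34.0) is numerically 1.0, so its gradient is essentially 0), and TensorFlow defines the ReLU gradient at exactly x = 0 as 0, consistent with the formula above.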