Find a good learning rate
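A common way to do this is the learning-rate range test (Smith, 2017): increase the learning rate exponentially over a few hundred mini-batches, record the loss, and pick a rate roughly an order of magnitude below the point where the loss starts to diverge. Below is a minimal PyTorch sketch of that idea; `model`, `loss_fn`, and `loader` are hypothetical placeholders, not names from this article.

```python
import torch

def lr_range_test(model, loss_fn, loader, lr_start=1e-7, lr_end=10.0, steps=100):
    """Sweep the learning rate exponentially and record (lr, loss) pairs."""
    optimizer = torch.optim.SGD(model.parameters(), lr=lr_start)
    gamma = (lr_end / lr_start) ** (1.0 / steps)  # per-step LR multiplier
    history = []
    data_iter = iter(loader)
    for _ in range(steps):
        try:
            inputs, targets = next(data_iter)
        except StopIteration:  # restart the loader if it runs out
            data_iter = iter(loader)
            inputs, targets = next(data_iter)
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        optimizer.step()
        history.append((optimizer.param_groups[0]["lr"], loss.item()))
        for group in optimizer.param_groups:  # exponentially increase the LR
            group["lr"] *= gamma
    return history  # plot loss vs. lr and look for the divergence point
```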
Initializing a neural network properly is important for both training and final performance. Some initializations simply don't work; others degrade the model's performance. We should choose wisely.
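Two widely used schemes illustrate the point. What follows is a minimal NumPy sketch, not a method prescribed by this article: Xavier/Glorot initialization keeps activation variance roughly constant for tanh/sigmoid layers, while He initialization compensates for ReLU zeroing out half of its inputs.

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, rng=None):
    """Glorot/Xavier uniform init: U(-limit, limit), limit = sqrt(6 / (fan_in + fan_out))."""
    rng = rng or np.random.default_rng()
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def he_normal(fan_in, fan_out, rng=None):
    """He normal init: N(0, sqrt(2 / fan_in)), suited to ReLU-family layers."""
    rng = rng or np.random.default_rng()
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

# Example: weight matrix for a 784 -> 256 layer feeding a ReLU
W = he_normal(784, 256)
```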
Bipolar sigmoid function and its properties
Conic Section Function and its properties
ELU and its properties
Tanh function and its properties
Leaky ReLU and its properties
Radial Basis Function (RBF) and its properties
Rectified Linear Unit, aka ReLU, and its properties
Swish and its properties
Unipolar sigmoid function and its properties
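To make these properties concrete, here is a minimal NumPy sketch of the elementwise functions listed above, using their standard formulas; the default alpha, beta, and sigma values are common conventions, not values from this article. The conic section function is left out, since (in Dorffner's conic section function networks) it combines hyperplane and hypersphere units rather than being a simple elementwise activation.

```python
import numpy as np

def unipolar_sigmoid(x):
    """Logistic sigmoid 1 / (1 + e^-x); output in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def bipolar_sigmoid(x):
    """(1 - e^-x) / (1 + e^-x); output in (-1, 1). Equals tanh(x / 2)."""
    return np.tanh(x / 2.0)  # tanh form avoids overflow for large negative x

def tanh(x):
    """Hyperbolic tangent; output in (-1, 1), zero-centered."""
    return np.tanh(x)

def relu(x):
    """Rectified Linear Unit: max(0, x)."""
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    """Like ReLU but with a small slope alpha for x < 0 (avoids dead units)."""
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    """Exponential Linear Unit: x for x > 0, alpha * (e^x - 1) otherwise."""
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def swish(x, beta=1.0):
    """Swish: x * sigmoid(beta * x); beta = 1 is also known as SiLU."""
    return x * unipolar_sigmoid(beta * x)

def gaussian_rbf(x, center=0.0, sigma=1.0):
    """Gaussian radial basis: exp(-(x - c)^2 / (2 sigma^2)); peaks at the center."""
    return np.exp(-((x - center) ** 2) / (2.0 * sigma ** 2))
```

Plotting each function over, say, `np.linspace(-5, 5, 201)` makes the differences in output range, saturation, and dead regions easy to compare.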