Initializing a neural network properly is important for both training and final performance. Some initializations simply don't work, while others degrade the model's performance, so we should choose wisely.
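As a minimal sketch of what "choosing wisely" can mean in practice, the following NumPy snippet shows two standard schemes: Xavier/Glorot initialization (suited to sigmoid/tanh units) and He initialization (suited to ReLU-family units). The layer sizes are illustrative, not from this document.

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier_init(fan_in, fan_out):
    # Xavier/Glorot uniform initialization: keeps the variance of
    # activations roughly constant across layers for sigmoid/tanh units.
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def he_init(fan_in, fan_out):
    # He initialization: variance scaled for ReLU-family activations,
    # which zero out roughly half of their inputs.
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

# Example: weights for a hypothetical 256 -> 128 dense layer.
W = xavier_init(256, 128)
```

A degenerate choice such as all-zero weights makes every neuron in a layer compute the same function, so gradient descent cannot break the symmetry; schemes like the two above avoid this while also controlling the scale of activations.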
Artificial neuron that separates the state space
Connected perceptrons
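The idea that a single artificial neuron separates the state space can be sketched as follows: a neuron with weights w and bias b assigns inputs to one of the two half-spaces defined by the hyperplane w · x + b = 0. The weights below are a hand-picked illustration (they realize a logical AND), not values from this document.

```python
import numpy as np

def neuron(x, w, b):
    # A single artificial neuron: the hyperplane w . x + b = 0
    # separates the input (state) space into two half-spaces.
    return 1 if np.dot(w, x) + b > 0 else 0

w = np.array([1.0, 1.0])  # hypothetical weights
b = -1.5                  # hypothetical bias

# Only (1, 1) lies on the positive side of the hyperplane,
# so this neuron computes a logical AND of its two inputs.
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, neuron(np.array(x), w, b))
```

A single neuron can only realize linearly separable functions; connecting perceptrons into layers, as in the next topic, lifts that restriction.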
Bipolar sigmoid function and its properties
Conic Section Function and its properties
ELU and its properties
Tanh function and its properties
Leaky ReLU and its properties
Radial Basis Function (RBF) and its properties
Rectified Linear Unit (ReLU) and its properties
Swish and its properties
Unipolar sigmoid function and its properties
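The activation functions listed above can be sketched in NumPy as follows. This is a minimal reference implementation; the parameter defaults (alpha, beta, gamma) are common conventions, not values taken from this document, and the Conic Section Function is omitted since it depends on network-specific center and angle parameters.

```python
import numpy as np

def unipolar_sigmoid(x):
    # Range (0, 1); saturates for large |x|.
    return 1.0 / (1.0 + np.exp(-x))

def bipolar_sigmoid(x):
    # Range (-1, 1); algebraically equal to tanh(x / 2).
    return (1.0 - np.exp(-x)) / (1.0 + np.exp(-x))

def relu(x):
    # Zero for negative inputs, identity for positive inputs.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Small negative slope alpha avoids "dead" units.
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # Smooth exponential saturation toward -alpha for negative inputs.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def swish(x, beta=1.0):
    # x * sigmoid(beta * x); non-monotonic, smooth everywhere.
    return x * unipolar_sigmoid(beta * x)

def rbf(x, c=0.0, gamma=1.0):
    # Gaussian radial basis: peaks at the center c, decays with distance.
    return np.exp(-gamma * (x - c) ** 2)
```

Tanh is available directly as `np.tanh`. Comparing these on the same inputs (e.g. plotting them over [-5, 5]) makes their differing ranges and saturation behavior easy to see.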