Neural Networks

Initializing a neural network properly is important for both training and final performance. Some initializations simply don't work (for example, all-zero weights give every neuron identical gradients), while others slow convergence or degrade the model's performance. We should choose wisely.
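A minimal sketch of why the choice matters, using NumPy (the layer width, depth, and the specific schemes compared are illustrative assumptions, not prescribed by the text): we push a random input through a deep stack of linear + ReLU layers and watch the scale of the activations. Naive unit-variance weights blow up with depth, while He initialization (variance 2/n, designed for ReLU) keeps activations roughly stable.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def final_activation_std(init_fn, n_layers=20, width=256):
    # Propagate a random input through n_layers of (linear -> ReLU)
    # and return the standard deviation of the final activations.
    x = rng.standard_normal((width, 1))
    for _ in range(n_layers):
        W = init_fn(width)
        x = relu(W @ x)
    return float(x.std())

# Naive: weights ~ N(0, 1). Each ReLU layer multiplies the activation
# variance by roughly width/2, so activations explode with depth.
naive = lambda n: rng.standard_normal((n, n))

# He initialization: weights ~ N(0, 2/n). The factor 2/n cancels the
# width/2 growth, keeping the activation scale roughly constant.
he = lambda n: rng.standard_normal((n, n)) * np.sqrt(2.0 / n)

print("naive init:", final_activation_std(naive))  # astronomically large
print("He init:   ", final_activation_std(he))     # order of 1
```

Running this shows the naive scheme producing activations many orders of magnitude too large after only 20 layers, while the He-initialized stack stays well-behaved.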

Rectified Linear Unit (ReLU) and its properties
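As a concrete starting point, here is the ReLU function and its derivative in NumPy (a minimal sketch; the function names are our own). ReLU is defined as max(0, x): it passes positive inputs through unchanged and zeroes out negative ones, so its derivative is 1 for positive inputs and 0 for negative ones (we pick 0 as the subgradient at x = 0).

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(0, x): identity for positive inputs, zero otherwise.
    return np.maximum(0.0, x)

def relu_grad(x):
    # Derivative: 1 where x > 0, 0 where x <= 0
    # (0 is a conventional choice of subgradient at x = 0).
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))       # [0.  0.  0.  0.5 2. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```

The piecewise-linear shape is what makes ReLU cheap to compute and its gradient trivial, but the zero region is also why initialization matters: a neuron whose inputs are always negative gets zero gradient and stops learning.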