wiki/machine-learning/neural-networks/artificial-neural-networks.md

Initializing a neural network properly is important for both training and final performance. Some initializations simply don't work (e.g., all-zero weights, where symmetry makes every unit in a layer learn the same function), while others degrade performance. We should choose wisely.
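A minimal sketch of two standard schemes, assuming NumPy: Xavier/Glorot initialization (scaled for tanh/sigmoid layers) and He initialization (scaled for ReLU layers). The function names and the fan-in/fan-out parameterization here are illustrative, not from any particular library.

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier_init(fan_in, fan_out):
    # Xavier/Glorot: uniform in [-limit, limit] with variance 2/(fan_in+fan_out),
    # keeps activation variance roughly constant for tanh/sigmoid units.
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def he_init(fan_in, fan_out):
    # He: normal with variance 2/fan_in, compensates for ReLU zeroing
    # half of the activations on average.
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))
```

Both schemes scale the weight variance by the layer's fan so that signal magnitude neither explodes nor vanishes as depth grows, which is exactly what an all-zero or badly scaled initialization fails to do.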

For numerical stability we can use the log-sum-exp trick when computing losses such as cross-entropy: subtracting the maximum before exponentiating avoids overflow without changing the result.
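A minimal sketch of the trick in NumPy. The identity is $\log \sum_i e^{x_i} = m + \log \sum_i e^{x_i - m}$ for any constant $m$; choosing $m = \max_i x_i$ keeps every exponent non-positive, so nothing overflows.

```python
import numpy as np

def log_sum_exp(x):
    # Stable log(sum(exp(x))): shift by the max so the largest
    # exponent is exp(0) = 1 and overflow cannot occur.
    m = np.max(x)
    return m + np.log(np.sum(np.exp(x - m)))
```

For logits like `[1000, 1000]`, the naive `np.log(np.sum(np.exp(x)))` overflows to `inf`, while the shifted version returns the exact answer `1000 + log(2)`.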

An artificial neuron separates its input space with a hyperplane: it fires on one side of $w \cdot x + b = 0$ and stays silent on the other.
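A minimal sketch of this geometric view, assuming a single threshold neuron: with weights $w = (1, 1)$ and bias $b = -1.5$, the hyperplane $x_1 + x_2 = 1.5$ separates the input $(1, 1)$ from the other three corners of the unit square, implementing logical AND. The weights are a standard illustrative choice, not from the source.

```python
import numpy as np

def neuron(x, w, b):
    # Threshold unit: output 1 on the positive side of the
    # hyperplane w.x + b = 0, output 0 on the other side.
    return 1 if np.dot(w, x) + b > 0 else 0

# This hyperplane realizes AND on binary inputs.
w = np.array([1.0, 1.0])
b = -1.5
```

Not every function is separable this way: XOR has no single separating hyperplane, which is the classic motivation for stacking neurons into layers.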

The Rectified Linear Unit (ReLU), $\max(0, x)$, and its properties: cheap to compute, non-saturating for positive inputs, and sparse (it zeroes out negative activations).
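A minimal sketch of ReLU and its (sub)gradient in NumPy. The gradient is 1 for positive inputs and 0 for negative ones; the value at exactly 0, taken as 0 here, is a convention, since ReLU is not differentiable there.

```python
import numpy as np

def relu(x):
    # Elementwise max(0, x): passes positive values, zeroes negatives.
    return np.maximum(0.0, x)

def relu_grad(x):
    # Subgradient: 1 where x > 0, else 0 (choice at x == 0 is arbitrary).
    return (x > 0).astype(float)
```

The constant unit gradient on the positive side is what avoids the vanishing-gradient problem of sigmoid/tanh; the flip side is that units stuck in the negative region receive zero gradient ("dying ReLU"), one reason the He initialization above pairs well with ReLU layers.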