Layer Norm

Layer norm is a normalization method that enables better training (Xu et al. 2019).

> Layer normalization (LayerNorm) is a technique to normalize the distributions of intermediate layers. It enables smoother gradients, faster training, and better generalization accuracy.

(Xu et al. 2019)

The key idea of layer norm is to normalize the inputs to a layer using their mean and standard deviation.
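Concretely, for each feature vector the statistics are computed over the feature dimension, the vector is shifted and rescaled to `(x - mean) / std`, and the result is passed through a learnable gain and bias. A minimal NumPy sketch of this computation (the `gamma`, `beta`, and `eps` names follow the common formulation and are illustrative, not taken from the quoted paper):

```python
import numpy as np

def layer_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Statistics are computed over the last (feature) axis,
    # independently for each example in the batch.
    mean = x.mean(axis=-1, keepdims=True)
    std = x.std(axis=-1, keepdims=True)
    # eps keeps the division stable when std is close to zero.
    return gamma * (x - mean) / (std + eps) + beta

x = np.array([[1.0, 2.0, 3.0],
              [10.0, 20.0, 30.0]])
print(layer_norm(x))  # both rows normalize to the same values
```

Note that the two rows, although on very different scales, map to identical normalized vectors: layer norm removes per-example shift and scale.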

Layer norm plays two roles in neural networks (both are checked numerically in the sketch below):

  1. Projects the key vectors onto a hyperplane.
  2. Scales the key vectors to have the same length.
(Xu et al. 2019)
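A quick numerical check of both roles, assuming "hyperplane" means the one orthogonal to the all-ones vector (the normalized components sum to zero) and "length" means the Euclidean norm; `keys` here is a hypothetical batch of key vectors, not data from the paper:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    mean = x.mean(axis=-1, keepdims=True)
    std = x.std(axis=-1, keepdims=True)
    return (x - mean) / (std + eps)

rng = np.random.default_rng(0)
keys = rng.normal(size=(4, 8))   # hypothetical batch of key vectors
normed = layer_norm(keys)

# Role 1: each row sums to ~0, so it lies on the hyperplane
# orthogonal to the all-ones vector.
print(normed.sum(axis=-1))

# Role 2: every row has (nearly) the same Euclidean length.
print(np.linalg.norm(normed, axis=-1))  # ~sqrt(8) ≈ 2.83 for each row
```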

