Fraser Information

#Information Theory

The Fraser information is

$$ I_F(\theta) = \int g(X) \ln f(X;\theta) \,\mathrm{d} X. $$

When comparing two models, $\theta_0$ and $\theta_1$, the information gain is

$$ \propto I_F(\theta_1) - I_F(\theta_0). $$
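As a sanity check, the integral above can be evaluated numerically. The setup below is an illustrative assumption, not from the source: the true density $g$ is taken to be $N(0,1)$ and the model family $f(\cdot;\theta)$ is $N(\theta,1)$, for which $I_F(\theta) = -\tfrac{1}{2}\ln(2\pi) - \tfrac{1}{2}(1+\theta^2)$ in closed form.

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoidal rule, written out to avoid NumPy version differences."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def fraser_information(g, f, x):
    """Numerically evaluate I_F = ∫ g(x) ln f(x) dx on the grid x."""
    return trapezoid(g(x) * np.log(f(x)), x)

# Assumed toy setup (not from the source): true density g = N(0, 1),
# model family f(.; theta) = N(theta, 1).
def normal_pdf(t, mu):
    return np.exp(-0.5 * (t - mu) ** 2) / np.sqrt(2.0 * np.pi)

x = np.linspace(-12.0, 12.0, 200001)
g = lambda t: normal_pdf(t, 0.0)

I0 = fraser_information(g, lambda t: normal_pdf(t, 0.0), x)  # theta_0 = 0
I1 = fraser_information(g, lambda t: normal_pdf(t, 1.0), x)  # theta_1 = 1

# Closed form for this setup: I_F(theta) = -ln(2*pi)/2 - (1 + theta**2)/2,
# so the information gain I1 - I0 should be -1/2.
print(I0, I1, I1 - I0)
```

Here the gain $I_F(\theta_1) - I_F(\theta_0)$ is negative, as expected: $\theta_1 = 1$ is a worse description of data drawn from $g = N(0,1)$ than $\theta_0 = 0$.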

The Fraser information is closely related to the Fisher information (which measures the second moment of the model's sensitivity with respect to the parameters), Shannon information, and the Kullback–Leibler divergence (which measures the difference between two distributions).[^1]
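The link to the KL divergence can be made explicit. Writing out $D_{\mathrm{KL}}(g \,\|\, f(\cdot;\theta))$ and noting that the entropy term $\int g(X) \ln g(X) \,\mathrm{d}X$ does not depend on $\theta$,

$$ D_{\mathrm{KL}}\big(g \,\|\, f(\cdot;\theta)\big) = \int g(X) \ln g(X) \,\mathrm{d}X - I_F(\theta), $$

so the information gain satisfies $I_F(\theta_1) - I_F(\theta_0) = D_{\mathrm{KL}}(g \,\|\, f(\cdot;\theta_0)) - D_{\mathrm{KL}}(g \,\|\, f(\cdot;\theta_1))$: the model with the larger Fraser information is the one closer to $g$ in KL divergence.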

  1. Fraser DAS. On Information in Statistics. The Annals of Mathematical Statistics. 1965;36: 890–896. doi:10.1214/aoms/1177700061 ↩︎


Lei Ma (2021). 'Fraser Information', Datumorphism, 05 April. Available at:
