
Shannon Entropy

GROWING This note is being developed. It is readable, but changes are still being applied.
#Information Theory #Entropy

Shannon entropy $H$ is the expectation of the information content $I(x) = -\log p(x)$1,

\begin{equation} H(p) = \mathbb E_{p}\left[ -\log \left(p\right) \right]. \end{equation}
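This expectation can be sketched in a few lines of Python. The function name and interface below are illustrative, not from the original note; the sum runs over the support of the distribution, since $\lim_{p \to 0} -p \log p = 0$ means zero-probability outcomes contribute nothing.

```python
import math

def shannon_entropy(p, base=2):
    """Shannon entropy H(p) = E_p[-log p] of a discrete distribution.

    `p` is a sequence of probabilities summing to 1; outcomes with
    zero probability are skipped (they contribute no information).
    """
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# A fair coin carries exactly 1 bit of entropy:
shannon_entropy([0.5, 0.5])    # → 1.0
# A biased coin is less surprising on average:
shannon_entropy([0.9, 0.1])    # ≈ 0.469
```

With base 2 the entropy is measured in bits; using the natural log (base $e$) gives nats instead.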


  1. Contributors to Wikimedia projects. Entropy (information theory). In: Wikipedia [Internet]. 29 Aug 2021 [cited 4 Sep 2021]. Available: https://en.wikipedia.org/wiki/Entropy_(information_theory)  ↩︎

Planted: 2021-09-04 by L Ma;



