Entropy

In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Given a discrete random variable $X$, which takes values from the alphabet $\mathcal{X}$ and is distributed according to $p : \mathcal{X} \to [0, 1]$, the entropy is defined as:

$$\mathrm{H}(X) := -\sum_{x \in \mathcal{X}} p(x) \log p(x) = \mathbb{E}[-\log p(X)],$$
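As a concrete illustration, the following is a minimal Python sketch (not from the source) that evaluates this sum for a distribution given as a list of probabilities; the function name `entropy` and the choice of the base-2 logarithm (entropy measured in bits) are assumptions made here for illustration.

```python
import math

def entropy(p, base=2):
    # Shannon entropy of a discrete distribution given as a list of
    # probabilities p(x); terms with p(x) = 0 contribute nothing,
    # following the convention 0 * log 0 = 0.
    return -sum(px * math.log(px, base) for px in p if px > 0)

# A fair coin is maximally uncertain (1 bit); a biased coin carries less entropy.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ~0.469
```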
