A Few Things About Entropy

Entropy

The amount of uncertainty in an observable quantity (a random variable).

Formula
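
$$ \begin{aligned} H(X) &= \sum_{x \in \mathcal{X}} p_{X}(x) \log \frac{1}{p_{X}(x)} \end{aligned} $$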

Conditional Entropy

The amount of uncertainty remaining about an observable quantity given some prior knowledge (e.g. the observed value of another random variable).

Formula
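
$$ \begin{aligned} H(X | Y) &= \sum_{x \in \mathcal{X},\, y \in \mathcal{Y}} p_{X, Y}(x, y) \log \frac{1}{p_{X | Y}(x | y)} \end{aligned} $$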

Mutual Information

Mutual information is a fundamental quantity for measuring the relationship between random variables: how much observing one of them reduces the uncertainty about the other.

Formula

$$ \begin{aligned} I(X; Y) &= \sum_{x \in \mathcal{X},\, y \in \mathcal{Y}} p_{X, Y}(x, y) \left[ \log \frac{1}{p_{X}(x) \cdot p_{Y}(y)} - \log \frac{1}{p_{X, Y}(x, y)} \right] \end{aligned} $$

Essence

$$ \begin{aligned} I(X; Y) &= H(X) - H(X | Y) \\ &= H(Y) - H(Y | X) \end{aligned} $$

Set Theory:

$$ \begin{aligned} X \cap Y &= X - (X \backslash Y) \\ &= Y - (Y \backslash X) \\ \end{aligned} $$

(both "$-$" and "$\backslash$" are the set-difference operator)
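
A quick numerical sanity check of the two expressions above (a sketch, not from the original notes; the 2×2 joint table, the choice of numpy, and log base 2 are made-up assumptions). It computes $I(X; Y)$ once from the definition and once as $H(X) - H(X | Y)$:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a pmf given as an array of probabilities."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]                        # treat 0 * log(1/0) as 0
    return float(np.sum(p * np.log2(1.0 / p)))

# hypothetical joint pmf p_{X,Y}(x, y); rows index x, columns index y
p_xy = np.array([[0.30, 0.10],
                 [0.20, 0.40]])
p_x = p_xy.sum(axis=1)                  # marginal p_X
p_y = p_xy.sum(axis=0)                  # marginal p_Y

# direct definition: sum_{x,y} p(x,y) * [log 1/(p(x)p(y)) - log 1/p(x,y)]
indep = np.outer(p_x, p_y)
mask = p_xy > 0
direct = float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / indep[mask])))

# via entropies: I(X;Y) = H(X) - H(X|Y), with H(X|Y) = H(X,Y) - H(Y)
via_entropies = entropy(p_x) - (entropy(p_xy) - entropy(p_y))

print(direct, via_entropies)            # both ≈ 0.1245 bits for this table
assert np.isclose(direct, via_entropies)
```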

Chain Rule

For Entropy

$$ \begin{aligned} H(X_1, \ldots, X_n) &= \sum_{i=1}^{n} H(X_i | X_{i - 1}, \ldots, X_{1}) \end{aligned} $$

Set Theory:

$$ \begin{aligned} X_1 \cup \cdots \cup X_n &= X_1 \\ &\cup X_2 \backslash X_1 \\ &\cup \cdots \\ &\cup X_n \backslash (X_1 \cup \cdots \cup X_{n - 1}) \end{aligned} $$
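
A numerical sketch of this chain rule, assuming a randomly generated joint pmf over three binary variables (the distribution, and numpy itself, are illustrative choices): each term $H(X_i | X_{i-1}, \ldots, X_1)$ is computed from its definition, and the sum is compared with the joint entropy.

```python
import numpy as np

rng = np.random.default_rng(0)
p = rng.random((2, 2, 2))
p /= p.sum()                            # joint pmf p(x1, x2, x3) over three binary variables

def entropy(q):
    q = np.asarray(q, dtype=float).ravel()
    q = q[q > 0]
    return float(np.sum(q * np.log2(1.0 / q)))

lhs = entropy(p)                        # H(X1, X2, X3)

# rhs: sum of H(X_i | X_{i-1}, ..., X_1), each computed from its definition
rhs = 0.0
for i in range(1, 4):
    p_i = p.sum(axis=tuple(range(i, 3))) if i < 3 else p    # joint pmf of X_1, ..., X_i
    p_prev = p_i.sum(axis=-1, keepdims=True)                 # pmf of X_1, ..., X_{i-1}
    ratio = np.broadcast_to(p_prev, p_i.shape) / p_i         # p(prefix) / p(prefix, x_i)
    rhs += float(np.sum(p_i * np.log2(ratio)))               # H(X_i | X_1, ..., X_{i-1})

print(lhs, rhs)                         # the two sides agree
assert np.isclose(lhs, rhs)
```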

For Mutual Information

$$ \begin{aligned} I(X; Y_1, \ldots, Y_n) &= \sum_{i=1}^{n} I(X; Y_i | Y_1, \ldots, Y_{i - 1}) \end{aligned} $$

Set Theory: $$ \begin{aligned} X \cap (Y_1 \cup \cdots \cup Y_n) &= (X \cap Y_1) \\ &\cup \left[ (X \cap Y_2) \backslash Y_1 \right] \\ &\cup \ \cdots \\ &\cup \left[ (X \cap Y_n) \backslash (Y_1 \cup \cdots \cup Y_{n - 1}) \right] \end{aligned} $$
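
The same kind of check for the mutual-information chain rule, here with $n = 2$ and again a made-up joint pmf; every quantity is expanded into entropies of marginals, e.g. $I(X; Y_2 | Y_1) = H(X, Y_1) + H(Y_1, Y_2) - H(Y_1) - H(X, Y_1, Y_2)$.

```python
import numpy as np

rng = np.random.default_rng(1)
p = rng.random((2, 2, 2))               # axes: X, Y1, Y2 (a made-up joint pmf)
p /= p.sum()

def entropy(q):
    q = np.asarray(q, dtype=float).ravel()
    q = q[q > 0]
    return float(np.sum(q * np.log2(1.0 / q)))

def H(*keep):
    """Entropy of the marginal over the listed axes (0 = X, 1 = Y1, 2 = Y2)."""
    drop = tuple(a for a in range(3) if a not in keep)
    return entropy(p.sum(axis=drop)) if drop else entropy(p)

lhs = H(0) + H(1, 2) - H(0, 1, 2)                        # I(X; Y1, Y2)
i_x_y1 = H(0) + H(1) - H(0, 1)                           # I(X; Y1)
i_x_y2_given_y1 = H(0, 1) + H(1, 2) - H(1) - H(0, 1, 2)  # I(X; Y2 | Y1)

print(lhs, i_x_y1 + i_x_y2_given_y1)                     # the two sides agree
assert np.isclose(lhs, i_x_y1 + i_x_y2_given_y1)
```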

Entropy Rate

How does the entropy of the sequence $X_1, \ldots, X_n$ grow with $n$?

Formula

For a stochastic process $\{X_i\}$:

$$ \begin{aligned} H(\mathcal{X}) &= \lim_{n \to \infty} \frac{H(X_1, \ldots, X_n)}{n} \end{aligned} $$

If the $\{X_i\}$ are i.i.d., each with entropy $H(X)$: $$ \begin{aligned} H(\mathcal{X}) &= \lim_{n \to \infty} \frac{n \cdot H(X)}{n} = H(X) \\ &= \lim_{n \to \infty} H(X_n | X_1, \ldots, X_{n - 1}) = H'(\mathcal{X}) \end{aligned} $$ where $H'(\mathcal{X})$ is the entropy rate defined through the conditional entropy of the newest symbol given the past; for an i.i.d. process both limits equal $H(X)$.
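
A small illustration of the i.i.d. case (the biased-coin distribution below is made up): the joint pmf of $n$ independent copies is built explicitly, and $H(X_1, \ldots, X_n) / n$ stays equal to $H(X)$ for every $n$.

```python
import numpy as np
from itertools import product

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(np.sum(p * np.log2(1.0 / p)))

p = np.array([0.2, 0.8])                # made-up single-symbol distribution p_X
print("H(X) =", entropy(p))             # ≈ 0.7219 bits

for n in range(1, 6):
    # joint pmf of n i.i.d. copies: p(x_1, ..., x_n) = p(x_1) * ... * p(x_n)
    joint = [np.prod([p[x] for x in xs]) for xs in product(range(2), repeat=n)]
    print(n, entropy(joint) / n)        # stays ≈ H(X) for every n
```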

by Jon