AM ≥ GM
An interesting relationship between two averages.
When learning about random variables, their probability distributions, and estimators for statistics, one comes across the "arithmetic average", commonly called the "average" or the "mean". Little is said, however, about its little brother - the "geometric average" - and there's a reason: it is less informative than its arithmetic counterpart. That said - there's an interesting relationship between the two that forms the basis of this post. It also shows how a probabilistic result surfaces as a useful tool for problems in combinatorics and geometry. With my interest captured, let's begin.
Definitions
Let's recall the definitions. Let $\{X_{i}\}_{i=1}^{N}$ be $N$ i.i.d. nonnegative random variables (nonnegativity keeps the geometric average well defined). Then - the arithmetic average ("AA") is given by \begin{align*} \widehat{\mathbb{E}}_{\text{AA}}[X_{i}] = \frac{1}{N}\sum_{i=1}^{N}X_{i}. \end{align*} Its geometric average counterpart ("GA") is given by \begin{align*} \widehat{\mathbb{E}}_{\text{GA}}[X_{i}] = \left(\prod_{i=1}^{N}X_{i}\right)^{\frac{1}{N}}. \end{align*}
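Before moving on, here's a minimal sketch of the two estimators in code (NumPy assumed; the function names are my own, not standard library calls). Computing the geometric average through logarithms avoids overflow from the raw product:

```python
import numpy as np

def arithmetic_average(x: np.ndarray) -> float:
    """(1/N) * sum_i x_i"""
    return x.sum() / len(x)

def geometric_average(x: np.ndarray) -> float:
    """(prod_i x_i)^(1/N), computed via logs for numerical stability."""
    return float(np.exp(np.log(x).mean()))

sample = np.array([1.0, 2.0, 4.0])
print(arithmetic_average(sample))  # 2.333...
print(geometric_average(sample))   # (1*2*4)^(1/3) = 2.0
```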
Why is $\widehat{\mathbb{E}}_{\text{AA}}[X_{i}]$ more useful?
These formulae are estimators of the expected value of the random variable. They only "estimate" - ideally, their expected value equals the true expected value. This can be checked. For the arithmetic average, the claim is \begin{align*} \mathbb{E}\left[\frac{1}{N}\sum_{i=1}^{N}X_{i}\right]=\mathbb{E}[X_{i}]. \end{align*} Starting with the LHS, \begin{align*} \mathbb{E}\left[\frac{1}{N}\sum_{i=1}^{N}X_{i}\right]&=\frac{1}{N}\mathbb{E}\left[\sum_{i=1}^{N}X_{i}\right]\\\\ &=\frac{1}{N}\sum_{i=1}^{N}\mathbb{E}[X_{i}]\\\\ &=\frac{1}{N}N\mathbb{E}[X_{i}]\\\\ &=\mathbb{E}[X_{i}]. \end{align*} Therefore - the arithmetic average is known as an unbiased estimator of the expected value of $X_{i}$. In layman's terms - the AA is a great estimator of $\mathbb{E}[X_{i}]$! The geometric average, on the other hand, does not share this result. We repeat the same calculation: \begin{align*} \mathbb{E}\left[\left(\prod_{i=1}^{N}X_{i}\right)^{\frac{1}{N}}\right] &= \mathbb{E}\left[\prod_{i=1}^{N}X_{i}^{\frac{1}{N}}\right]\\\\ &=\prod_{i=1}^{N}\mathbb{E}\left[X_{i}^{\frac{1}{N}}\right]\\\\ &=\left(\mathbb{E}\left[X^{\frac{1}{N}}\right]\right)^{N}\\\\ &\leq\mathbb{E}\left[\left(X^{\frac{1}{N}}\right)^{N}\right]\\\\ &=\mathbb{E}\left[X\right]. \end{align*} The second line uses independence of the $X_{i}$, the third uses the fact that they are identically distributed, and the fourth follows from Jensen's Inequality applied to the convex function $x\mapsto x^{N}$. The conclusion is that the expected value of the geometric average is not equal to the expected value of $X$; in fact it is at most $\mathbb{E}[X]$, so the geometric average underestimates the true value - it carries a negative bias. One interesting question is when the geometric average is a useful summary of a random variable's behavior, though that is a conversation for another day.
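To see this bias concretely, here's a short simulation sketch (the Exp(1) distribution and all names are my choices, not from the derivation above). The true mean is 1, so the arithmetic averages should center on 1 while the geometric averages come in systematically low:

```python
import numpy as np

rng = np.random.default_rng(0)
N, trials = 10, 100_000

# Each row is one sample of N draws from Exp(1), whose true mean is 1.
samples = rng.exponential(scale=1.0, size=(trials, N))

aa = samples.mean(axis=1)                  # arithmetic average per trial
ga = np.exp(np.log(samples).mean(axis=1))  # geometric average per trial

print(f"mean of AA estimates: {aa.mean():.4f}")  # ~1.00 (unbiased)
print(f"mean of GA estimates: {ga.mean():.4f}")  # well below 1 (negative bias)
```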
Conclusion
From Jensen's Inequality it has been shown that, in expectation, the arithmetic average is greater than or equal to the geometric average. In fact the inequality holds pointwise for any fixed positive numbers: applying Jensen's Inequality once more, this time to the concave function $\log$, gives $\log\left(\frac{1}{N}\sum_{i=1}^{N}X_{i}\right)\geq\frac{1}{N}\sum_{i=1}^{N}\log X_{i}$, and exponentiating yields \begin{align*} \widehat{\mathbb{E}}_{\text{AA}}[X_{i}] &\geq \widehat{\mathbb{E}}_{\text{GA}}[X_{i}]\\\\ \frac{1}{N}\sum_{i=1}^{N}X_{i} &\geq \left(\prod_{i=1}^{N}X_{i}\right)^{\frac{1}{N}}. \end{align*} That said - this relationship becomes a useful tool when exploring inequalities in combinatorics and geometry! That's why it is so interesting to me. For the case $N=2$, using $a$ and $b$ as variables for simplicity: \begin{align*} \frac{a+b}{2} &\geq \sqrt{ab} \\\\ a+b &\geq 2\sqrt{ab} \end{align*} Or when $N=3$, with variables $a$, $b$, and $c$: \begin{align*} \frac{a+b+c}{3} &\geq \sqrt[3]{abc} \\\\ a+b+c &\geq 3\sqrt[3]{abc} \end{align*} Both of which are incredibly useful and can be further generalized! Check out this interesting inequality post that was solved by using the AM-GM relationship a couple of times: https://math.stackexchange.com/q/5098717/66565
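As a side note, the $N=2$ case also admits a one-line elementary proof that needs no probability at all: since squares are nonnegative, \begin{align*} 0 \leq \left(\sqrt{a}-\sqrt{b}\right)^{2} = a - 2\sqrt{ab} + b \quad\implies\quad a+b \geq 2\sqrt{ab}, \end{align*} with equality exactly when $a=b$. Thank you for your attention!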