Wednesday, March 2, 2016

sequences and series - Does the Shannon Entropy always exist (even for infinite distributions)?

Let $p : \mathbb{N} \to [0, 1]$ be a probability distribution over the naturals.

The Shannon Entropy is:

$$H = -\sum_{n=0}^\infty p(n)\log_2 p(n)$$

Does this series always converge?
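For a rapidly decaying distribution the sum is clearly finite; for instance, the geometric distribution $p(n) = 2^{-(n+1)}$ has $-\log_2 p(n) = n + 1$, so

$$H = \sum_{n=0}^\infty (n+1)\,2^{-(n+1)} = 2.$$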

I tried a little to attack this problem, but it's been some time since I've worked with series. My attempt:

We know that $\sum_{n=0}^\infty p(n) = 1$, so it must be the case that $\lim\limits_{n\to\infty} p(n) = 0$, and since $\lim\limits_{x\to 0^+} x\log_2 x = 0$, for any $\varepsilon > 0$ only finitely many terms of the series are greater than $\varepsilon$. The problem is that I can't conclude anything from this, because there are plenty of series whose terms go to zero and which nevertheless diverge.
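As a worked example of that phenomenon (the standard heavy-tailed candidate, sketched here with the normalizing constant $C$ left implicit), take

$$p(n) = \frac{C}{n(\ln n)^2} \quad (n \ge 2), \qquad \frac{1}{C} = \sum_{n=2}^\infty \frac{1}{n(\ln n)^2} < \infty,$$

which is a genuine distribution since the normalizing sum converges by the integral test. Yet

$$-p(n)\log_2 p(n) = \frac{C}{n(\ln n)^2}\,\log_2\frac{n(\ln n)^2}{C} \sim \frac{C}{\ln 2}\cdot\frac{1}{n\ln n},$$

and $\sum_n \frac{1}{n\ln n}$ diverges by the integral test, so here $H = +\infty$.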
