
Sums of independent random variables with unbounded variance

Conjecture (Feige)   If $X_1, \dotsc, X_n \geq 0$ are independent random variables with $\mathbb{E}[X_i] \leq \mu$ for all $i$, then for every $\delta > 0$
$$\mathrm{Pr} \left( \sum X_i - \mathbb{E} \left[ \sum X_i \right] < \delta \mu \right) \geq \min \left( (1 + \delta)^{-1} \delta, e^{-1} \right).$$
In comparison to most probabilistic inequalities (like Hoeffding's), Feige's inequality does not deteriorate as $n$ goes to infinity, something that is useful for computer scientists.
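As a quick illustration, here is a minimal Monte Carlo sketch of the conjectured inequality. It is not taken from [F04]/[F05]; the two-point distribution family, the helper name `estimate_lhs`, and the parameter choices are assumptions made for this example.

```python
import math
import random

def estimate_lhs(dists, delta, trials=200_000):
    """Monte Carlo estimate of Pr(sum X_i - E[sum X_i] < delta * mu).

    Each entry of `dists` is a pair (value, prob): X_i takes `value` with
    probability `prob` and 0 otherwise.  We normalize mu = 1, so we assume
    value * prob <= 1 for every variable.
    """
    mean_sum = sum(v * p for v, p in dists)
    hits = 0
    for _ in range(trials):
        total = sum(v for v, p in dists if random.random() < p)
        if total - mean_sum < delta:  # with mu = 1, delta * mu = delta
            hits += 1
    return hits / trials

# Illustrative family: n i.i.d. variables equal to n + delta with
# probability 1/(n + delta), and 0 otherwise (so E[X_i] = 1).
n, delta = 10, 0.2
dists = [(n + delta, 1.0 / (n + delta))] * n
bound = min(delta / (1 + delta), math.exp(-1))
print(estimate_lhs(dists, delta), ">=", bound)  # roughly 0.36 >= 0.1667
```

Note that the bound on the right does not depend on $n$, which is exactly the feature discussed above.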
Let $\mu = 1$. Feige argued that to prove the conjecture, one only needs to prove it for the case when $\delta < 1$ and each variable $X_i$ has the entire probability mass distributed on $0$ and $1 + a_i$ for some $a_i > 0$. He proved that
$$\mathrm{Pr} \left( \sum X_i - \mathbb{E} \left[ \sum X_i \right] < \delta \right) \geq \min \left( \frac{\delta}{1 + \delta}, \frac{1}{13} \right)$$
and conjectured that the constant $1/13$ may be replaced with $e^{-1}$. It was further conjectured that "the worst case" would be one of the following (both cases are checked in the computation after the list):
- one variable, say $X_1$, has its probability mass distributed on $0$ and $1 + \delta$, taking the value $1 + \delta$ with probability $\frac{1}{1 + \delta}$, while all other variables are identically $0$;
- each variable has its probability mass distributed on $0$ and $n + \delta$, taking the value $n + \delta$ with probability $\frac{1}{n + \delta}$.
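A short computation verifies that these two cases match the two terms of the conjectured bound. In the first case $\mathbb{E} \left[ \sum X_i \right] = 1$ and the event $\sum X_i - 1 < \delta$ occurs exactly when $X_1 = 0$, so
$$\mathrm{Pr} \left( \sum X_i - \mathbb{E} \left[ \sum X_i \right] < \delta \right) = 1 - \frac{1}{1 + \delta} = \frac{\delta}{1 + \delta}.$$
In the second case $\mathbb{E} \left[ \sum X_i \right] = n$ and the event occurs exactly when every $X_i = 0$ (a single nonzero variable already contributes $n + \delta$), so
$$\mathrm{Pr} \left( \sum X_i - \mathbb{E} \left[ \sum X_i \right] < \delta \right) = \left( 1 - \frac{1}{n + \delta} \right)^n \to e^{-1} \quad \text{as } n \to \infty.$$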
One way to initiate an attack on this problem is to assume $\delta < 1$ and argue that the case when each variable assumes the value $n + \delta$ with probability $\frac{1}{n + \delta}$ and otherwise $0$ is indeed the worst.
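For instance, the probability in this candidate worst case can be compared with the conjectured bound exactly. The small script below was written for this note; the function name `candidate_vs_bound` and the parameter choices are arbitrary.

```python
import math

def candidate_vs_bound(n, delta):
    """Exact Pr(sum X_i - E[sum X_i] < delta) when each of the n variables
    equals n + delta with probability 1/(n + delta) and is 0 otherwise,
    together with the conjectured lower bound min(delta/(1+delta), 1/e)."""
    p_event = (1 - 1 / (n + delta)) ** n  # the event = all n variables are 0
    bound = min(delta / (1 + delta), math.exp(-1))
    return p_event, bound

for n in (1, 10, 100, 10**6):
    print(n, candidate_vs_bound(n, delta=0.9))
# For delta = 0.9 the probability decreases toward 1/e as n grows,
# which is why no constant larger than e^{-1} can replace it.
```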
Bibliography
*[F04] Uriel Feige: On sums of independent random variables with unbounded variance, and estimating the average degree in a graph. STOC '04: Proceedings of the Thirty-Sixth Annual ACM Symposium on Theory of Computing (2004), pp. 594–603. ACM.
*[F05] Uriel Feige: On sums of independent random variables with unbounded variance, and estimating the average degree in a graph. Manuscript, 2005. [pdf]
The problem was also referenced at population algorithms, the blog.
* indicates original appearance(s) of problem.