# Sums of independent random variables with unbounded variance

\begin{conjecture} If $X_1, \dotsc, X_n \geq 0$ are independent random variables with $\mathbb{E}[X_i] \leq \mu$, then $$\mathrm{Pr} \left( \sum X_i - \mathbb{E} \left[ \sum X_i \right ] < \delta \mu \right) \geq \min \left ( (1 + \delta)^{-1} \delta, e^{-1} \right).$$ \end{conjecture}
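As a sanity check on the conjectured bound, one can estimate the left-hand side by Monte Carlo for a concrete family of distributions. The sketch below is illustrative only (the choice of i.i.d. exponential variables with mean $\mu = 1$ and $\delta = 1$ is mine, not from the source); it compares the empirical probability with the conjectured lower bound $\min\left((1+\delta)^{-1}\delta, e^{-1}\right)$.

```python
import math
import random

def feige_lower_bound(delta):
    # conjectured lower bound: min(delta/(1+delta), 1/e)
    return min(delta / (1 + delta), math.exp(-1))

def empirical_prob(n, delta, trials=20000, seed=1):
    # estimate Pr(sum X_i - E[sum X_i] < delta*mu) for i.i.d. Exp(1) variables;
    # here mu = E[X_i] = 1, so E[sum X_i] = n and the threshold is n + delta
    rng = random.Random(seed)
    hits = sum(
        sum(rng.expovariate(1.0) for _ in range(n)) < n + delta
        for _ in range(trials)
    )
    return hits / trials
```

For $n = 20$, $\delta = 1$ the empirical probability is roughly $0.57$, comfortably above the conjectured bound $e^{-1} \approx 0.368$; of course this checks only one instance, not the conjecture.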


In contrast to most probabilistic inequalities (such as Hoeffding's), Feige's inequality does not deteriorate as $n$ goes to infinity, a property that makes it useful in computer science.

Let $T = \mathbb{E}\left[ \sum X_i \right] + \delta$. Feige argued that to prove the conjecture it suffices to consider the case when $\mu = 1$ and each variable $X_i$ has all of its probability mass on $0$ and some $t_i$ with $\mathbb{E}[X_i] \leq t_i \leq T$. He proved that $$\mathrm{Pr} \left( \sum X_i - \mathbb{E} \left[ \sum X_i \right ] < \delta \right) \geq \min \left ( (1 + \delta)^{-1} \delta, 1/13 \right),$$ and conjectured that the constant $1/13$ can be replaced with $e^{-1}$. It was further conjectured that the worst case is one of the following:
\begin{enumerate}
\item one variable takes the maximum value $1 + \delta$ (with probability $(1+\delta)^{-1}$, so that its mean is $1$) and the remaining $n-1$ variables are identically $1$; the sum is then less than $T$ exactly when that variable is $0$, which has probability $(1 + \delta)^{-1} \delta$;
\item each variable takes the maximum value $T = n + \delta$ (with probability $T^{-1}$); the sum is then less than $T$ exactly when every variable is $0$, which has probability $\left(1 - \frac{1}{T}\right)^n \stackrel{n \rightarrow \infty}{\longrightarrow} e^{-1}$.
\end{enumerate}
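Both candidate worst cases have closed-form probabilities, so the claims above can be checked directly; a small sketch (function names are mine):

```python
import math

def case_one(delta):
    # one variable is 1+delta w.p. 1/(1+delta) (mean 1), the rest are always 1;
    # the sum is below T = n + delta exactly when that variable is 0
    return delta / (1 + delta)

def case_two(n, delta):
    # each variable is T = n + delta w.p. 1/T (mean 1), else 0;
    # the sum is below T exactly when every variable is 0
    T = n + delta
    return (1 - 1 / T) ** n
```

For fixed $\delta$, `case_two(n, delta)` decreases toward $e^{-1}$ as $n$ grows, matching the limit stated above.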

One way to initiate an attack on this problem is to assume $\delta = \mathbb{E}[X_i] = 1$ and argue that the worst case is indeed the one in which each variable takes the value $n + 1$ with probability $(n+1)^{-1}$ and $0$ otherwise.
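A quick simulation of this candidate distribution (a sketch; the parameter choices are illustrative) confirms that the shortfall probability matches the closed form $\left(1 - \frac{1}{n+1}\right)^n$:

```python
import random

def simulate_worst_case(n, trials=100000, seed=0):
    # each X_i equals n+1 with probability 1/(n+1) (so E[X_i] = 1), else 0;
    # estimate Pr(sum X_i < n + 1); the exact value is (1 - 1/(n+1))**n,
    # since the sum falls below n+1 only when every variable is 0
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(n + 1 if rng.random() < 1 / (n + 1) else 0 for _ in range(n))
        if s < n + 1:
            hits += 1
    return hits / trials
```

For $n = 10$ the estimate lands within Monte Carlo error of $(10/11)^{10} \approx 0.386$.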

## Bibliography


*[F04] Uriel Feige: On sums of independent random variables with unbounded variance, and estimating the average degree in a graph, STOC '04: Proceedings of the thirty-sixth annual ACM symposium on Theory of computing (2004), pp. 594--603. \href[ACM]{http://doi.acm.org/10.1145/1007352.1007443}

*[F05] Uriel Feige: On sums of independent random variables with unbounded variance, and estimating the average degree in a graph, Manuscript, 2005, \href[[pdf]]{http://www.wisdom.weizmann.ac.il/~feige/Others/newmarkov.pdf}

The problem was also referenced at \href[population algorithms, the blog]{http://petar.blog.lcs.mit.edu/?p=66}.

* indicates original appearance(s) of problem.