Conjectures, Etc.

This page lists a few fun conjectures/problems and is updated from time to time. If you solve one, let me know!

Which Boolean Functions are Most Informative?

Conjecture: Let $X^n$ be i.i.d. Bernoulli(1/2), and let $Y^n$ be the result of passing $X^n$ through a memoryless binary symmetric channel with crossover probability $\alpha$. For any Boolean function $b\colon \{0,1\}^n \rightarrow \{0,1\}$, we have

 $I(b(X^n);Y^n) \leq 1 - H(\alpha),$

where $H(\cdot)$ denotes the binary entropy function.

I am offering $100 for a correct proof of (or counterexample to) this conjecture. For more details on the conjecture itself, see my paper with Gowtham R. Kumar.
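
For concreteness, here is a minimal brute-force sanity check in Python (a sketch, not a proof): it enumerates all $2^{2^n}$ Boolean functions for a small $n$, computes $I(b(X^n);Y^n)$ exactly, and compares the maximum against $1 - H(\alpha)$. The helper names (binary_entropy, mutual_information) are purely illustrative. Note that a dictator function such as $b(x)=x_1$ always achieves exactly $1 - H(\alpha)$, so the printed maximum should be at least that; the conjecture says it is never more.

import itertools
import numpy as np

def binary_entropy(p):
    # Binary entropy H(p) in bits.
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * np.log2(p) - (1.0 - p) * np.log2(1.0 - p)

def mutual_information(b, n, alpha):
    # Exact I(b(X^n); Y^n) for X^n i.i.d. Bernoulli(1/2) and Y^n the output
    # of a memoryless BSC(alpha) driven by X^n.  `b` maps {0,1}^n -> {0,1}.
    xs = list(itertools.product((0, 1), repeat=n))
    p_b1 = sum(b[x] for x in xs) / 2.0**n                  # P(b(X^n) = 1)
    h_b_given_y = 0.0
    for y in xs:
        p_y, p_y_and_b1 = 0.0, 0.0
        for x in xs:
            d = sum(xi != yi for xi, yi in zip(x, y))      # Hamming distance
            p_xy = (0.5**n) * alpha**d * (1.0 - alpha)**(n - d)
            p_y += p_xy
            if b[x] == 1:
                p_y_and_b1 += p_xy
        h_b_given_y += p_y * binary_entropy(p_y_and_b1 / p_y)
    return binary_entropy(p_b1) - h_b_given_y              # I = H(B) - H(B | Y^n)

n, alpha = 3, 0.1
xs = list(itertools.product((0, 1), repeat=n))
best = max(mutual_information(dict(zip(xs, vals)), n, alpha)
           for vals in itertools.product((0, 1), repeat=2**n))
print(best, 1.0 - binary_entropy(alpha))   # conjecture: best <= 1 - H(alpha)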

Entropy Power Inequality for a Triple Sum?

For a random vector $X$ with density on $\mathbb{R}^n$, define its entropy power as

 $N(X) = 2^{2 h(X)/n},$

where $h(X)$ is the differential entropy of $X$.
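
As a quick sanity check of the definition (with $h$ measured in bits, matching the base-2 exponent), a Gaussian $X \sim \mathcal{N}(0,\Sigma)$ has $N(X) = 2\pi e \,(\det\Sigma)^{1/n}$. The short Python sketch below, with the illustrative helper name entropy_power, just confirms this numerically.

import numpy as np

def entropy_power(Sigma):
    # N(X) = 2^(2 h(X)/n) for X ~ N(0, Sigma), with h(X) computed in bits.
    n = Sigma.shape[0]
    h_bits = 0.5 * np.log2((2.0 * np.pi * np.e)**n * np.linalg.det(Sigma))
    return 2.0 ** (2.0 * h_bits / n)

Sigma = np.diag([1.0, 4.0])
print(entropy_power(Sigma))                                   # ~34.16
print(2.0 * np.pi * np.e * np.linalg.det(Sigma)**(1.0 / 2))   # same value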

If $X, Y, Z$ are independent random vectors, each having density on $\mathbb{R}^n$, is it true that

 $N(X+Z)\, N(Y+Z) \geq N(X)N(Y) + N(Z)N(X+Y+Z)~?$

It turns out that this inequality is true if $Z$ is Gaussian, but the only proof I know of is via a more general result (see Theorem 3 of this paper). So, I would also be interested in a direct proof of the inequality in the special case where $Z$ is Gaussian.
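
As a numerical sketch (which only exercises the all-Gaussian case, already covered by the Gaussian-$Z$ result above, and cannot settle the general question), one can plug random covariance matrices into the closed-form Gaussian entropy power $N = 2\pi e\,(\det\Sigma)^{1/n}$ and check the inequality directly. With isotropic covariances $\sigma^2 I$ both sides agree exactly, since $(a+c)(b+c) = ab + c(a+b+c)$. The helper names below are illustrative.

import numpy as np

rng = np.random.default_rng(0)

def gaussian_entropy_power(Sigma):
    # N(X) = 2^(2 h(X)/n) for X ~ N(0, Sigma) equals 2*pi*e * det(Sigma)^(1/n).
    n = Sigma.shape[0]
    return 2.0 * np.pi * np.e * np.linalg.det(Sigma)**(1.0 / n)

def random_covariance(n):
    M = rng.standard_normal((n, n))
    return M @ M.T + 0.1 * np.eye(n)   # positive definite

n = 4
worst_ratio = np.inf
for _ in range(10000):
    A, B, C = random_covariance(n), random_covariance(n), random_covariance(n)
    lhs = gaussian_entropy_power(A + C) * gaussian_entropy_power(B + C)
    rhs = (gaussian_entropy_power(A) * gaussian_entropy_power(B)
           + gaussian_entropy_power(C) * gaussian_entropy_power(A + B + C))
    worst_ratio = min(worst_ratio, lhs / rhs)
print(worst_ratio)   # should stay >= 1 (up to floating-point error)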