
Conjectures, Etc.

This page lists a few fun conjectures/problems and is updated from time to time. If you solve one, let me know!

Which Boolean Functions are Most Informative?

Conjecture: Let $X^n$ be i.i.d. Bernoulli(1/2), and let $Y^n$ be the result of passing $X^n$ through a memoryless binary symmetric channel with crossover probability $\alpha$. For any Boolean function $b:\{0,1\}^n\to\{0,1\}$, we have

$$I(b(X^n);Y^n) \leq 1-H(\alpha).$$

I am offering $100 for a correct proof of (or counterexample to) this conjecture. For more details on the conjecture itself, see the original paper with Gowtham R. Kumar.
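To get a feel for the conjecture, here is a minimal brute-force check in Python (my own sketch, not from the paper): for small $n$ it enumerates all $2^{2^n}$ Boolean functions, computes $I(b(X^n);Y^n)$ exactly from the joint distribution, and compares the maximum against $1-H(\alpha)$. The dictator functions $b(x^n)=x_i$ meet the bound with equality.

```python
import itertools
import math

def H(p):
    """Shannon entropy (in bits) of a probability vector; zero terms are skipped."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def mutual_information(b, n, alpha):
    """I(b(X^n); Y^n) with X^n uniform on {0,1}^n and Y^n = X^n through a BSC(alpha)."""
    xs = list(itertools.product([0, 1], repeat=n))
    joint = {(bit, y): 0.0 for bit in (0, 1) for y in xs}  # P(b(X^n)=bit, Y^n=y)
    for x in xs:
        for y in xs:
            d = sum(xi != yi for xi, yi in zip(x, y))  # Hamming distance
            joint[(b[x], y)] += 2.0 ** -n * alpha ** d * (1 - alpha) ** (n - d)
    p_b = [sum(joint[(bit, y)] for y in xs) for bit in (0, 1)]
    p_y = [sum(joint[(bit, y)] for bit in (0, 1)) for y in xs]
    return H(p_b) + H(p_y) - H(list(joint.values()))

n, alpha = 3, 0.11
xs = list(itertools.product([0, 1], repeat=n))
best = max(
    mutual_information(dict(zip(xs, bits)), n, alpha)
    for bits in itertools.product([0, 1], repeat=2 ** n)  # all 2^(2^n) Boolean functions
)
print(f"max_b I(b(X^n);Y^n) = {best:.6f}")
print(f"1 - H(alpha)        = {1 - H([alpha, 1 - alpha]):.6f}")
```

For $n=3$ the two printed values agree, consistent with the conjecture; of course, exhaustive search of this kind is only feasible for very small $n$.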

Entropy Power Inequality for a Triple Sum?

For a random vector $X$ with density on $\mathbb{R}^n$, define its entropy power as

$$N(X)=2^{2h(X)/n},$$


where $h(X)$ is the differential entropy of $X$.

If $X,Y,Z$ are independent random vectors, each having density on $\mathbb{R}^n$, is it true that

$$N(X+Z)\,N(Y+Z) \geq N(X)\,N(Y)+N(Z)\,N(X+Y+Z)\,?$$

It turns out that this inequality is true if Z is Gaussian, but the only proof I know of is via a more general result (see this paper). So, I would also be interested in a direct proof of the inequality in the special case where Z is Gaussian.
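As a quick sanity check on the direction of the inequality, note that when $X$, $Y$, $Z$ are all Gaussian it holds with equality: in one dimension $N(X)=2\pi e\,\sigma_X^2$, and both sides reduce to $(2\pi e)^2\left[\sigma_X^2\sigma_Y^2+\sigma_Z^2(\sigma_X^2+\sigma_Y^2+\sigma_Z^2)\right]$. The following sketch (my own, one-dimensional) confirms the arithmetic numerically.

```python
import math

def entropy_power_gaussian(var):
    """Entropy power of a 1-D Gaussian: h = 0.5*log2(2*pi*e*var), so N = 2^{2h} = 2*pi*e*var."""
    return 2 * math.pi * math.e * var

vx, vy, vz = 1.7, 0.4, 2.3  # arbitrary variances for X, Y, Z
lhs = entropy_power_gaussian(vx + vz) * entropy_power_gaussian(vy + vz)
rhs = (entropy_power_gaussian(vx) * entropy_power_gaussian(vy)
       + entropy_power_gaussian(vz) * entropy_power_gaussian(vx + vy + vz))
print(lhs, rhs)  # agree up to floating-point rounding: Gaussians achieve equality
```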

The In-Line Helper

Let $m$ terminals be connected in a line, where terminal $i$ observes discrete memoryless source $X_i$ and communicates to terminal $i+1$ at rate $R_i$. The Slepian-Wolf theorem tells us that the constraints

$$R_i \geq H(X_1,\ldots,X_i \,|\, X_{i+1},\ldots,X_m), \qquad 1\leq i<m$$

are both necessary and sufficient for terminal $m$ to recover $X_1,\ldots,X_m$ near-losslessly.
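For concreteness, here is a small Python sketch (my own illustration, with hypothetical helper names) that evaluates these minimal rates for a toy joint distribution, using the chain-rule identity $H(X_1,\ldots,X_i\,|\,X_{i+1},\ldots,X_m)=H(X_1,\ldots,X_m)-H(X_{i+1},\ldots,X_m)$.

```python
import itertools
import math

def entropy(pmf):
    """Shannon entropy (bits) of a pmf given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(pmf, coords):
    """Marginal pmf over the given coordinate indices."""
    out = {}
    for x, p in pmf.items():
        key = tuple(x[i] for i in coords)
        out[key] = out.get(key, 0.0) + p
    return out

def line_network_rates(pmf, m):
    """Minimal rates R_i = H(X_1..X_i | X_{i+1}..X_m) = H(X_1..X_m) - H(X_{i+1}..X_m)."""
    joint = entropy(pmf)
    return [joint - entropy(marginal(pmf, range(i, m))) for i in range(1, m)]

# Toy example: m = 3 binary sources forming a Markov chain,
# X2 = X1 xor noise, X3 = X2 xor noise, each noise bit Bernoulli(0.1).
m = 3
pmf = {}
for x1, n1, n2 in itertools.product([0, 1], repeat=3):
    x2, x3 = x1 ^ n1, (x1 ^ n1) ^ n2
    p = 0.5 * (0.9 if n1 == 0 else 0.1) * (0.9 if n2 == 0 else 0.1)
    pmf[(x1, x2, x3)] = pmf.get((x1, x2, x3), 0.0) + p

print(line_network_rates(pmf, m))  # minimal rates [R_1, R_2] in bits per symbol
```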

Suppose we only demand that terminal $m$ recover $X_1,\ldots,X_{j-1},X_{j+1},\ldots,X_m$ near-losslessly (i.e., everything except $X_j$). What conditions on the rates are necessary and sufficient to do so?