# Conjectures, Etc.

This page lists a few fun conjectures/problems and is updated from time to time. If you solve one, let me know!

## Which Boolean Functions are Most Informative?

Conjecture: Let $$X^n$$ be i.i.d. Bernoulli(1/2), and let $$Y^n$$ be the result of passing $$X^n$$ through a memoryless binary symmetric channel with crossover probability $$\alpha$$. For any Boolean function $$b: \{0,1\}^n\rightarrow \{0,1\}$$, we have

$I(b(X^n);Y^n)\leq 1-H(\alpha).$

I am offering \$100 for a correct proof of (or counterexample to) this conjecture. For more details on the conjecture itself, see the original paper with Gowtham R. Kumar.
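For small $$n$$ the conjectured bound can be checked exactly by brute force. The sketch below is my own illustration (not from the paper): it enumerates all $$(x^n, y^n)$$ pairs to compute $$I(b(X^n);Y^n)$$ for $$n=3$$, confirming that the dictator function $$b(x) = x_1$$ meets the bound $$1-H(\alpha)$$ with equality, while the majority function falls below it.

```python
import itertools
import math

def mutual_info(b, n, alpha):
    """Brute-force I(b(X^n); Y^n): X^n i.i.d. Bernoulli(1/2),
    Y^n the output of a memoryless BSC with crossover alpha."""
    joint = {}  # joint distribution p(u, y) with u = b(x)
    for x in itertools.product((0, 1), repeat=n):
        u = b(x)
        for y in itertools.product((0, 1), repeat=n):
            flips = sum(xi != yi for xi, yi in zip(x, y))
            p = (0.5 ** n) * alpha ** flips * (1 - alpha) ** (n - flips)
            joint[u, y] = joint.get((u, y), 0.0) + p
    pu, py = {}, {}
    for (u, y), p in joint.items():
        pu[u] = pu.get(u, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (pu[u] * py[y]))
               for (u, y), p in joint.items() if p > 0)

alpha = 0.1
bound = 1 - (-alpha * math.log2(alpha) - (1 - alpha) * math.log2(1 - alpha))
mi_dictator = mutual_info(lambda x: x[0], n=3, alpha=alpha)
mi_majority = mutual_info(lambda x: int(sum(x) >= 2), n=3, alpha=alpha)
# dictator achieves I = 1 - H(alpha); majority is strictly below it here
```

The dictator case is easy to verify by hand: $$b(X^n)=X_1$$ depends only on the first coordinate, so $$I(X_1;Y^n) = I(X_1;Y_1) = 1-H(\alpha)$$ by memorylessness.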

## Entropy Power Inequality for a Triple Sum?

For a random vector $$X$$ with density on $$\mathbb{R}^n$$, define its entropy power as

$N(X) = 2^{2 h(X)/n},$

where $$h(X)$$ is the differential entropy of $$X$$.
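As a concrete instance (a standard fact, not stated above): if $$X \sim N(0, \sigma^2 I_n)$$, then $$h(X) = \tfrac{n}{2}\log_2(2\pi e \sigma^2)$$, so $$N(X) = 2\pi e \sigma^2$$, independent of the dimension. A quick numerical sketch:

```python
import math

def entropy_power_gaussian(sigma2, n=1):
    # h(X) = (n/2) * log2(2*pi*e*sigma2) for X ~ N(0, sigma2 * I_n),
    # hence N(X) = 2^{2h/n} = 2*pi*e*sigma2, with no dependence on n.
    h = 0.5 * n * math.log2(2 * math.pi * math.e * sigma2)
    return 2 ** (2 * h / n)

val = entropy_power_gaussian(1.0)  # standard Gaussian: N(X) = 2*pi*e
```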

If $$X,Y,Z$$ are independent random vectors, each having density on $$\mathbb{R}^n$$, is it true that

$N(X+Z) N(Y+Z) \geq N(X)N(Y) + N(Z)N(X+Y+Z) ?$

It turns out that this inequality is true if $$Z$$ is Gaussian, but the only proof I know of is via a more general result (see this paper). So, I would also be interested in a direct proof of the inequality in the special case where $$Z$$ is Gaussian.
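One sanity check I find instructive (my own sketch, not from the paper): in the scalar case with $$X, Y, Z$$ all Gaussian with variances $$a, b, c$$, we have $$N(\cdot) = 2\pi e \cdot \mathrm{Var}(\cdot)$$, and both sides reduce to $$(2\pi e)^2(ab + ac + bc + c^2)$$, so the inequality holds with equality.

```python
import math

TWO_PI_E = 2 * math.pi * math.e

def N(sigma2):
    # Entropy power of a scalar Gaussian with variance sigma2.
    return TWO_PI_E * sigma2

a, b, c = 0.7, 2.3, 1.1  # arbitrary variances for X, Y, Z
lhs = N(a + c) * N(b + c)                 # N(X+Z) N(Y+Z)
rhs = N(a) * N(b) + N(c) * N(a + b + c)   # N(X)N(Y) + N(Z)N(X+Y+Z)
# algebra: (a+c)(b+c) = ab + c(a+b+c), so lhs equals rhs exactly
```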

## The In-Line Helper

Let $$m$$ terminals be connected in a line, where terminal $$i$$ observes a discrete memoryless source $$X_i$$ and communicates to terminal $$i+1$$ at rate $$R_i$$. The Slepian-Wolf theorem tells us that the constraints

$R_i \geq H(X_1, \dots, X_i \mid X_{i+1}, \dots, X_m), \quad 1\leq i \leq m-1$

are both necessary and sufficient for terminal $$m$$ to recover $$X_1, \dots, X_m$$ near-losslessly.
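To make the constraints concrete, here is a hypothetical instance (my own example) with $$m=2$$ and a doubly symmetric binary source: $$X_1 \sim$$ Bernoulli(1/2) and $$X_2 = X_1 \oplus N$$ with $$N \sim$$ Bernoulli($$p$$) independent of $$X_1$$. The single Slepian-Wolf constraint is then $$R_1 \geq H(X_1 \mid X_2) = H(p)$$, the binary entropy of the noise.

```python
import math

def H2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# X1 ~ Bernoulli(1/2); X2 = X1 XOR N, N ~ Bernoulli(p) independent of X1.
p = 0.11
joint = {}
for x1 in (0, 1):
    for noise in (0, 1):
        pr = 0.5 * (p if noise else 1 - p)
        joint[x1, x1 ^ noise] = joint.get((x1, x1 ^ noise), 0.0) + pr

h_joint = -sum(q * math.log2(q) for q in joint.values())
p_x2 = {v: sum(q for (u, w), q in joint.items() if w == v) for v in (0, 1)}
h_x2 = -sum(q * math.log2(q) for q in p_x2.values())

# Slepian-Wolf: terminal 1 must send at rate R1 >= H(X1 | X2) = H2(p).
r1_min = h_joint - h_x2
```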

Suppose instead that we only demand terminal $$m$$ recover $$X_1, \dots, X_{j-1}, X_{j+1}, \dots, X_m$$ near-losslessly for some fixed $$j$$, so that terminal $$j$$ acts only as an in-line helper. What conditions on the rates are then necessary and sufficient?