Edge-isoperimetric inequality
The proof is due to Alex Samorodnitsky.¹

Let $f \colon \{0,1\}^n \to \{0,1\}$ be a boolean function with $\Pr_x[f(x) = 1] = \alpha$. The edge-isoperimetric inequality states that the total influence of $f$ satisfies

$$\mathrm{I}[f] \ge 2\alpha \log_2 \frac{1}{\alpha}.$$

This is significantly better than Poincaré’s inequality $\mathrm{I}[f] \ge \mathrm{Var}[f]$, which is only linear in $\alpha$ when $\alpha$ is small.
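As a quick sanity check (not part of the original argument), we can verify the bound by brute force over every boolean function on a small cube; the helper name `total_influence` is our own, and $f$ is represented as a truth table indexed by the integer encoding of $x$:

```python
import itertools
import math

def total_influence(f, n):
    # I[f] = sum_i Pr_x[f(x) != f(x xor e_i)], with x uniform over {0,1}^n
    # and f a truth table of length 2^n indexed by the integer x
    return sum(
        f[x] != f[x ^ (1 << i)]
        for i in range(n)
        for x in range(2 ** n)
    ) / 2 ** n

n = 3
for bits in itertools.product([0, 1], repeat=2 ** n):
    alpha = sum(bits) / 2 ** n
    if alpha == 0:
        continue  # log_2(1/alpha) is undefined for the constant-0 function
    # the edge-isoperimetric bound, up to floating-point slack
    assert total_influence(bits, n) >= 2 * alpha * math.log2(1 / alpha) - 1e-9
```

For subcube indicators the two sides agree: the dictator $f(x) = x_1$ has $\alpha = 1/2$ and $\mathrm{I}[f] = 1 = 2 \cdot \tfrac12 \cdot \log_2 2$, and AND of two bits has $\alpha = 1/4$ and $\mathrm{I}[f] = 1 = 2 \cdot \tfrac14 \cdot \log_2 4$.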
Method 1: entropy
Here, we will only show the bound $\mathrm{I}[f] \ge 2\alpha \log_2(1/\alpha)$ itself; we will not characterize the extremal sets (subcubes, for which the bound is tight).
Intuition
Suppose you know every coordinate of a uniformly random point $X$ of $A = f^{-1}(1)$ except the $i$-th, i.e. you know $X_{-i}$. If both completions of $X_{-i}$ lie in $A$, then $X_i$ is a fresh uniform bit, contributing a full bit of entropy.

Now, if for some direction $i$ only one completion lies in $A$ (i.e. $f$ is sensitive at $X$ in direction $i$), then $X_i$ is determined by $X_{-i}$ and contributes no entropy at all. Since $H(X) = \log_2|A| = n - \log_2(1/\alpha)$ falls $\log_2(1/\alpha)$ bits short of the full $n$, this missing entropy must be accounted for by sensitive directions, i.e. by boundary edges.
Proof
Let $X$ be uniform on $A = f^{-1}(1)$, so that $H(X) = \log_2|A| = n + \log_2\alpha$. By the chain rule, and since conditioning on more variables can only decrease entropy,

$$n + \log_2 \alpha = H(X) = \sum_{i=1}^n H(X_i \mid X_1, \dots, X_{i-1}) \ge \sum_{i=1}^n H(X_i \mid X_{-i}).$$

Given $X_{-i} = y$, the bit $X_i$ is uniform if both completions of $y$ lie in $A$, and is determined otherwise; since each boundary edge in direction $i$ has exactly one endpoint in $A$,

$$H(X_i \mid X_{-i}) = 1 - \Pr_X[f \text{ is sensitive at } X \text{ in direction } i] = 1 - \frac{\mathrm{Inf}_i[f]}{2\alpha}.$$

Summing over $i$ gives $n + \log_2\alpha \ge n - \mathrm{I}[f]/(2\alpha)$. The result follows by rearrangement.
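The chain-rule step — $H(X) \ge \sum_i H(X_i \mid X_{-i})$ for $X$ uniform on a set $A$ — can be checked numerically. A small sketch (the random set $A$ and the helper name are ours), using the fact that each conditional entropy $H(X_i \mid X_{-i} = y)$ is $1$ or $0$ here:

```python
import math
import random

def cond_entropy_sum(A, n):
    # sum_i H(X_i | X_{-i}) for X uniform on A: conditioned on X_{-i},
    # the bit X_i is uniform iff both completions lie in A, so
    # H(X_i | X_{-i}) = Pr_X[the neighbor of X in direction i is also in A]
    return sum(
        sum(1 for x in A if x ^ (1 << i) in A) / len(A)
        for i in range(n)
    )

random.seed(0)
n = 6
A = {x for x in range(2 ** n) if random.random() < 0.3} | {0}  # nonempty random set
entropy = math.log2(len(A))  # H(X) for X uniform on A
assert entropy >= cond_entropy_sum(A, n) - 1e-9
```

The two sides coincide when $A$ is the whole cube (both equal $n$) and when $A$ is a single point (both equal $0$).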
Method 2: entropy deficit
The intuition of this method is basically the same, except that instead of conditioning on the prefix $X_1, \dots, X_{i-1}$, we work directly with the entropy deficit $D(X) = n - H(X)$, the number of bits by which $X$ falls short of uniform. For $X$ uniform on $A$ we have $D(X) = \log_2(1/\alpha)$.

Let $D(X_i \mid X_{-i}) = 1 - H(X_i \mid X_{-i})$ denote the conditional entropy deficit of the $i$-th bit. We claim that

$$\sum_{i=1}^n D(X_i \mid X_{-i}) = \frac{\mathrm{I}[f]}{2\alpha}.$$

Indeed,
- when $f$ is sensitive at $X$ in the $i$-th direction, then knowing $X_{-i}$ determines $X_i$, which means the entropy deficit “is locally $1$”;
- when $f$ is not sensitive at $X$ in the $i$-th direction, then conditioned on $X_{-i}$, the bit $X_i$ is still uniform, which means the entropy deficit “is locally $0$”.
This means the edge-isoperimetric inequality is equivalent to

$$D(X) \le \sum_{i=1}^n D(X_i \mid X_{-i}),$$

which is true because conditioning can only increase entropy deficit:²

$$D(X) = \sum_{i=1}^n D(X_i \mid X_1, \dots, X_{i-1}) \le \sum_{i=1}^n D(X_i \mid X_{-i}).$$
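Writing $D(X) = n - H(X)$ and $D(X_i \mid X_{-i}) = 1 - H(X_i \mid X_{-i})$ as above, both the identity $\sum_i D(X_i \mid X_{-i}) = \mathrm{I}[f]/(2\alpha)$ and the deficit inequality can be checked on a random set (a sketch with our own variable names):

```python
import math
import random

random.seed(1)
n = 5
A = {x for x in range(2 ** n) if random.random() < 0.4} | {0}  # nonempty random set
alpha = len(A) / 2 ** n

# total influence of the indicator of A, by brute force over the whole cube
influence = sum(
    (x in A) != (x ^ (1 << i) in A)
    for i in range(n)
    for x in range(2 ** n)
) / 2 ** n

# sum_i D(X_i | X_{-i}): the probability, over X uniform on A, that X is
# sensitive in direction i, summed over all n directions
deficit_sum = sum(
    sum(1 for x in A if x ^ (1 << i) not in A) / len(A)
    for i in range(n)
)

assert abs(deficit_sum - influence / (2 * alpha)) < 1e-9  # the identity above
assert math.log2(1 / alpha) <= deficit_sum + 1e-9         # D(X) <= sum of local deficits
```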
1. It first appeared in “Lower bounds for linear locally decodable codes and private information retrieval” by Oded Goldreich, Howard Karloff, Leonard J. Schulman and Luca Trevisan.
2. Note that this is basically the same argument as in the “janky proof” of Poincaré’s inequality.