I summarize the second day of the conference.
Wednesday June 27 Morning Session:
- A Satisfiability Algorithm and Average-Case Hardness for Formulas over the Full Binary Basis by Seto and Tamaki.
- Approximating AC0 by Beame, Impagliazzo, Srinivasan.
- DNF Sparsification and Faster Deterministic Counting by Gopalan, Meka, Reingold.
- Invited talk: Communication Complexity and Information Cost: Foundations and New Directions by Toniann Pitassi.
This was an excellent talk. I didn't realize it was an hour-long invited survey talk rather than a 20-minute regular talk, so I kept waiting for (a) her results, and (b) the talk to end, even though I was enjoying it.
ADDED LATER: Toni emailed me her slides.
- Gaussian Noise Sensitivity and Fourier Tails by Kindler and O'Donnell. They first define a new kind of sensitivity, Rotation Sensitivity. They then show that this type of sensitivity is subadditive. This is used to obtain simple (or at least simpler) proofs of the Gaussian Isoperimetric Inequality in certain cases, and a lower bound on approx max-cut of .8787 (not the best known, but very close, and a simpler proof).
- Junto-Symmetric Functions, Hypergraph Isomorphism, and Crunching by Fischer, Garcia-Soriano, Matsliah. Two Boolean functions are isomorphic if they are the same up to a relabelling of the variables. Given two Boolean functions, can we tell if they are isomorphic? One way is to obtain the truth tables for both of them and then see if some permutation of the variables makes them the same. Even if all we care about is queries, this seems like a lot. Can we do this with fewer queries? That is, we are asking Property Testing questions. What if we restrict the class of functions? Then what? Read the paper to find out!
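The truth-table-and-permutations approach sketched above can be written out directly. This is a brute-force illustration, not the paper's tester; the function names are mine:

```python
from itertools import permutations, product

def is_isomorphic(f, g, n):
    """Brute-force check: are f and g, Boolean functions on n
    variables, the same up to a relabelling of the variables?
    Tries every permutation against the full truth tables, so it
    takes n! * 2^n work -- exactly the 'seems like a lot' above."""
    inputs = list(product([0, 1], repeat=n))
    for perm in permutations(range(n)):
        # Relabel the variables of the input according to perm,
        # then compare f and g on every point of the cube.
        if all(f(x) == g(tuple(x[perm[i]] for i in range(n)))
               for x in inputs):
            return True
    return False

# Example: x0 AND x1 versus x1 AND x2 on 3 variables -- isomorphic.
f = lambda x: x[0] & x[1]
g = lambda x: x[1] & x[2]
```

Property testing asks how few queries to f and g suffice to distinguish "isomorphic" from "far from isomorphic", rather than reading both truth tables in full as this sketch does.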
- Complexity lower bounds through Balanced Graph Properties by Moshkovitz.
This looks hard! They mention Tensors!
- Limitations of Lower-Bound Methods for the Wire Complexity of Boolean Operators by Andrew Drucker. First there were pathetic lower bounds on wire complexity. Then, after what seemed like half a century, there were (just recently) some slightly less pathetic lower bounds. Even so, that was promising. This paper shows NO, don't get too excited: the new technique will not get better results. Gee Andy Drucker, couldn't you have let us dream just a bit longer? The most depressing talk I have ever seen. (It was a GREAT talk and a GREAT result, but still depressing.)
- Limits on Alternation-Trading Proofs for Time-Space Lower Bounds by Buss and Williams. Fortnow, Lipton, van Melkebeek, and Viglas showed that any algorithm for SAT that uses log space must take time at least n^{1.618} (the exponent is the golden ratio). Since the golden ratio is a natural number (not literally) it was plausible that current techniques could not improve that. But Ryan Williams got n^c where c is 2cos(π/7), which is roughly 1.8019. This can't possibly be the best we can do, since it's just such a funny-looking number. Ryan Williams wrote a program to try to fine-tune the constants and do better, but he couldn't. So he said NOBODY can do better!!!! His advisor Manuel Blum believed him, but nobody else did. NOW he has shown, with help from Sam Buss, that NO, nobody can do better. I am impressed that he and Sam Buss were able to formalize the technique in a way that really does pin down ALL prior proofs. Had this paper been written first, there would only be one paper. This talk was NOT depressing since, this being a new problem, I can believe that Ryan or Sam or someone can look at what the old techniques were, come up with some new ones, and break the (I can't believe I am writing this) 2cos(π/7) barrier! And then what? A paper showing that that bound can't be improved, and then an improvement. How far can this go? I've heard that really, really, we won't be able to show quadratic lower bounds. We'll see.
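Just to put the two funny exponents side by side (a quick numerical sanity check, nothing from the paper itself):

```python
import math

# The golden ratio, exponent of the older SAT time-space lower bound.
phi = (1 + math.sqrt(5)) / 2      # ~1.618

# Williams' improved exponent, the current "barrier".
c = 2 * math.cos(math.pi / 7)     # ~1.8019
```

So the barrier sits at roughly n^{1.8019}, still well short of a quadratic lower bound.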
- The Hardness of Being Private. Common problem in crypto: Alice has x, Bob has y, and they want to compute f(x,y), but neither can learn anything about the other one's input except what follows from their own input and f(x,y). Are there any functions that people really want to compute like this? YES: the functions involved in a Vickrey auction. In a Vickrey auction Bob and Alice bid x and y privately. The winner pays the LOSING bid. So if Alice bids 1000 and Bob bids 100, Alice gets to buy it for 100. Formally, f(x,y) contains both WHO won and what they paid. The winner does know what the loser bid, but the loser should not know what the winner bid. This paper is about computing this function approximately privately, in both the worst case and the average case. The most practical paper at the conference, which isn't saying much. (Can the winner use Quantum Money?)
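A minimal sketch of the Vickrey function f(x,y) described above (the tie-breaking rule is my own arbitrary choice, not from the paper):

```python
def vickrey(x, y):
    """Second-price (Vickrey) auction outcome for bids x (Alice)
    and y (Bob): returns who won and what they pay -- the LOSING
    bid. Ties go to Alice, an assumption made for this sketch."""
    if x >= y:
        return ("Alice", y)   # Alice wins but pays Bob's bid
    return ("Bob", x)         # Bob wins but pays Alice's bid

# Alice bids 1000, Bob bids 100: Alice wins and pays 100.
```

Note how the output itself leaks asymmetrically: the winner sees the loser's bid in f(x,y), while the loser learns only that the winner bid more, which is exactly the privacy requirement the paper studies.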