Monday, July 10, 2006

When Karp and Lipton showed that if NP has polynomial-size circuits then the polynomial-time hierarchy collapses, they also gave a general definition of nonuniform complexity.
Let C be a class of languages and F a set of functions. The class C/F is the set of all languages L such that there exist a language A in C and an arbitrary (not necessarily computable) function f mapping lengths n to strings, with the length function n ↦ |f(n)| in F, such that
x is in L ⇔ (x,f(|x|)) is in A
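To make the definition concrete, here is a minimal sketch (the function names are mine, purely illustrative) of the toy fact that every unary language L ⊆ {1}* is in P/1: one advice bit per input length suffices, since f(n) can simply record whether 1^n is in L.

```python
# Toy illustration of the advice definition: every unary language
# L, a subset of {1}^*, is in P/1.  The advice f(n) is a single bit
# recording whether 1^n is in L; it depends only on the input length.

def make_advice(lengths_in_L):
    """Build f: f(n) = '1' iff 1^n is in L.  lengths_in_L is the set
    of lengths n with 1^n in L (it may be arbitrarily complicated;
    advice need not be computable)."""
    return lambda n: '1' if n in lengths_in_L else '0'

def A(x, advice_bit):
    """The polynomial-time base language A: accept iff x is unary
    and the advice bit says its length belongs to L."""
    return all(c == '1' for c in x) and advice_bit == '1'

def decide(x, f):
    """x in L  <=>  (x, f(|x|)) in A."""
    return A(x, f(len(x)))

# Example: L = {1^n : n is a perfect square}, truncated to small n.
f = make_advice({0, 1, 4, 9, 16})
print(decide('1111', f))   # True  (length 4, a square)
print(decide('111', f))    # False (length 3, not a square)
```

The point of the sketch is that all the computational work sits in the uniform machine for A, while f is an unrestricted lookup table indexed by input length.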
This seems natural, but the definition has given complexity theorists headaches for many years. It works fine for the applications in the Karp-Lipton paper, but in general it loses the semantic meaning of complexity classes.

In particular, consider (NP∩co-NP)/poly. The definition above requires an A in NP∩co-NP, which means we need two NP machines that accept complementary languages for all possible advice strings, not just the correct one. In our toolkit paper we give a relativized world where NP/1∩co-NP/1 is not contained in (NP∩co-NP)/poly. We don't even know whether (NP/poly)∩co-NP is contained in (NP∩co-NP)/poly.
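To spell out the syntactic requirement (my own notation, a paraphrase of the definition above, not taken from the Karp-Lipton paper): A ∈ NP∩co-NP means there are NP machines M and M′ with L(M) = A and L(M′) = the complement of A, i.e.

```latex
\forall (x,a):\quad
(x,a)\in A \;\iff\; M(x,a)\ \text{accepts}
\;\iff\; M'(x,a)\ \text{rejects},
```

and this complementarity must hold for every pair (x,a). The semantic class one might naively expect only demands it when a = f(|x|), the correct advice string.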

At least the notation NP/poly∩co-NP/poly nicely captures the class we want. For other classes like BPP/log we have no such clean notation. One could invent new notation (someone suggested C//F), but instead we usually just state early on that we are not using the official Karp-Lipton definition and only require the BPP behavior for the correct advice.

Karp and Lipton did nothing wrong. They used a very natural definition that works for their purposes. Unfortunately the natural definition does not match the natural interpretation for all classes, and it will continue to confuse inexperienced complexity theorists for years to come.

1. The Zoo defines BPP//log as BPP with randomness-dependent advice, which is a different variation on the theme. Or at least, I don't see any equivalence.

You could perhaps convey what you want with parenthesization, i.e., by writing B(PP/log). This would be correct if PP and PP/log were interpreted as classes that produce a probability distribution on the set {accept,reject}.

2. Actually, if you could provide a reference to a paper that defines BPP/log in this more natural but less standard way, that would be great. After a bit of brainstorming, I propose calling it BPP/mlog, for "Merlinized advice". The difference between BPP/mlog and BPP/log is the same as the difference between MA and ∃BPP. In both cases, Merlin is an existentialist who only guarantees the boundedness condition when his message is good.
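For comparison, here is one way to formalize the MA versus ∃BPP distinction the comment alludes to (standard definitions, my notation), with V a polynomial-time verifier, w Merlin's message, and r Arthur's coins:

```latex
L \in \mathrm{MA}:\quad
  x\in L \implies \exists w\;\Pr_r[V(x,w,r)=1] \ge 2/3, \qquad
  x\notin L \implies \forall w\;\Pr_r[V(x,w,r)=1] \le 1/3.

L \in \exists\mathrm{BPP}:\quad
  \text{as above, and additionally}\quad
  \forall x,w:\;\Pr_r[V(x,w,r)=1] \le 1/3 \;\text{ or }\; \ge 2/3.
```

In ∃BPP the bounded-error promise must hold for every message w, good or bad; in MA a bad message on a yes-instance may yield any acceptance probability. The proposed BPP/mlog relaxes BPP/log in exactly the same way, requiring bounded error only on the correct advice.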

3. The Karp-Lipton formulation actually gives the natural class (for NP∩co-NP, BPP, etc.) if you start with a class of *promise problems* (rather than a class of languages). Using classes of promise problems, quantifiers also behave nicely, e.g., it holds that MA = ∃BPP.

See Goldreich's survey "On Promise Problems" (Sec 7, June 05 version):
http://www.wisdom.weizmann.ac.il/~oded/prpr.html

4. The best reason for not using the "official" definition is that in general we'd like being solvable with poly-size circuits to be the same as being solvable with a polynomial amount of advice. Should a definition be standard just for historical reasons?

Of course, as Salil says, dealing exclusively with promise problems gets around all these issues in a very nice way.

Rahul.

5. The promise classes do not have the problem with advice, but they are different classes, sometimes much stronger than the original. For example promise-NP∩co-NP, as Oded notes in his survey, is hard for NP, whereas NP∩co-NP probably isn't.