- Defining TMs, the Time Hierarchy Theorem, NL=coNL, Savitch's theorem.
- Cook's Theorem, some NPC reductions. Mahaney's theorem, PH, the Karp-Lipton theorem.
- If GI is NPC then PH collapses. (This will use PH and Karp-Lipton. Will also need hash functions and the AM protocol for GIbar.)
- If PARITYSAT is in P then SAT is in R. (This will use some of the machinery developed for GI.)
- Everything in PH is \le_T^p #SAT. (Toda's theorem.)
- If CLIQUE can be approximated then P=NP. (STATE PCP, give proofs of some easier cases, but do not prove the full theorem or even try.)
Other topics that would be reasonable to cover: Baker-Gill-Solovay Oracle, Hard vs Random stuff, PARITY not in constant depth, and CLIQUE not in Monotone P. Not doing the BGS-oracle since I'm not doing enough proofs that relativize in the first place. Not doing Hard vs Random since it's a bit hard and a bit random for this level of course. Not doing Circuit Stuff since that does not fit in that well with these topics (concrete vs. abstract complexity). All of these points are debatable.
I'll be giving out my own notes. My experience is that using other people's notes does not work, and others using mine does not work. My notes work for students taking my course, who see my lectures, and who have access to me. But they would not be good for anybody else.
You'd rather prove Toda's theorem than IP = PSPACE?
The key for me is your statement that most of the students will not be doing theory.
In that case, the course outline seems skewed heavily towards theoreticians. I'd break it down something like:
* Why P-time (TMs, invariance of machine models, basic hierarchy theorems)
* Trust but verify: non-d time, non-d space. Savitch's theorem, Cook's theorem, Reductions
* Let's play a game: PSPACE, QBF, ptime-hierarchy in terms of alternating TMs?
* Can someone please prove a lower bound? Circuits, why non-uniformity is needed, lower bounds for circuits, relation to parallel computing
* Should I toss a coin? Randomness, BPP, RP, BPP and circuit classes; hardness vs randomness
* Let's talk! Interaction, IP, PSPACE, PCPs, communication complexity. Link to approximation hardness.
* Wave/particle? Rudimentary quantum complexity: BQP vs NP.
Things deliberately omitted because the audience is not specialist theorists: oracles, natural proofs, sparseness, details of MA/AM/MIP=NEXP, Toda's theorem
For a course intended to convey some "definite things", 3 of the 6 listed topics have the "if pigs can whistle, then horses can fly" flavor :-)
More quibbles/comments:
While your point about abstract vs. concrete is fair enough, I think PARITY not-in AC^0 is a must-do result (esp. with a pretty polynomial based proof). AC^0 is quite "abstract", esp. because of the connection to first-order logic. Speaking of logic, Fagin's theorem is another true gem in complexity theory that must be included.
I don't understand why you say "don't even try" for the PCP theorem. Dinur's proof makes it extremely teachable - if a bit ambitious to teach to non-theorists.
I'd leave out Mahaney, Karp-Lipton, Valiant-Vazirani... and include Parity not-in AC^0, Fagin's theorem, IP=PSPACE, PCP Theorem, and Fortnow+ theorem on time-space lower bounds for SAT... and perhaps include a "showcase" section on four problems that have been at the heart of much of the last two decades' work in complexity theory (randomness vs. determinism):
Primality (RP v P), Undirected Graph Connectivity (RL v L), Graph NonIsomorphism (AM v NP), Perfect Matchings (RNC v NC).
At some point, I'd project a screenshot of the table of contents of Complexity Zoo up on the wall, just to get the point across that there is a *lot* of stuff you're leaving out.
I think that, as well as proving lots of easy results, it gives a better sense of the scope of the field to at least state lots of harder results along the way. For example, I'm fond of the result that DTIME(f(n)) is strictly contained in DSPACE(f(n)) for space-constructible f(n) ("On time versus space"), and it fits in with discussion of the hierarchy theorems. It's good to do PARITY not in constant depth, CLIQUE not in monotone P, and inapproximability results because, well, you might as well cover all the lower bounds we know. Must mention the nebulous status of factoring, GI, and (down inside P) GCD.
I would emphasise an analogy between log-space and "in-place algorithms" and mention the L=P and L=NL problems, and that Savitch's theorem implies NL is in L^2. I also think L=SL is worth stating, but I would formulate it as "you can do undirected graph connectivity in log space, and look, here's a bunch of problems that log-space reduce to it" (not even mentioning symmetric TMs or expanders). I would just state Toda's theorem rather than prove it. I would also talk about EXPTIME-completeness a bit, since it's nice to have some problems that we *know* are outside P. This fits in with Mahaney's theorem, since NP - P containing sparse languages is equivalent to EXPTIME != NEXPTIME (and might as well mention that sparse P-complete languages imply L = P while you're at it). Interactive protocols have to be visited, but I would just state IP = PSPACE (and MIP = NEXPTIME too).
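Spelled out, this is just the standard statement of Savitch's theorem, NSPACE(s(n)) \subseteq DSPACE(s(n)^2) for space-constructible s(n) \ge \log n, instantiated at s(n) = \log n:

    \mathrm{NL} = \mathrm{NSPACE}(\log n) \subseteq \mathrm{DSPACE}(\log^2 n) = \mathrm{L}^2.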
When talking about NP-complete problems, I would mention the bounded Post correspondence problem and an automata problem like regular expression inequivalence to provide a connection to earlier coursework on computability and automata theory.
The primality problem, besides being of interest in itself in its inexorable descent through the classes, provides a good context to frame lots of other topics. It's a good place to introduce randomness and quantum computing (Shor's algorithm). The Miller-Rabin test provides an excellent nontrivial example of a P/poly algorithm - the advice string supplies the witnesses to test, which leads naturally into BPP contained in P/poly.
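To make the advice-string view concrete, here is a rough Python sketch of the strong-pseudoprime test with the bases passed in explicitly (the function name and interface are mine, purely for illustration). Draw the bases at random and you have the familiar randomized test; hard-wire a good base list for each input length and you have the P/poly algorithm.

    def miller_rabin(n, bases):
        # Strong-pseudoprime test: return False if some base in `bases`
        # witnesses that n is composite, True if n passes every base.
        if n < 2:
            return False
        if n in (2, 3):
            return True
        if n % 2 == 0:
            return False
        # Write n - 1 = d * 2^s with d odd.
        d, s = n - 1, 0
        while d % 2 == 0:
            d //= 2
            s += 1
        for a in bases:
            a %= n
            if a == 0:
                continue
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False   # a is a witness: n is definitely composite
        return True            # n passed every base we tried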
Oh, and alternating TMs and their characterization of P, PH, and PSPACE are handy. And just a bit of descriptive complexity, to tie it into logic - Fagin's theorem and a discussion of this picture:
http://www.cs.umass.edu/~immerman/descriptive.jpg
Things I wonder: should PP be introduced in the context of randomness and BPP, or in the context of nondeterminism and #P? To what extent should the beliefs that P != NP, L != P, P = BPP, and the noncollapse of the polynomial hierarchy be justified? How far to delve into approximation algorithms (talk about PTASs and APX-completeness?) There's no need to go into every oracle result under the sun, but it's really important to explain that there's no relativizing proof for P = NP. Hrm, I think I've used up about 3 semesters. :-P
Can anyone suggest a good reference (maybe online lecture notes?) for "PARITY not-in AC^0 ... with a pretty polynomial based proof"?
ReplyDeleteThe Miller-Rabin test provides an excellent nontrivial example for a P/poly algorithm
Please explain; I always thought Miller-Rabin was an excellent example of a coRP (or RP, depending on your language) algorithm.
For that matter, does anyone know a good writeup of Fagin's theorem? It doesn't seem to appear in the standard textbooks...
A good writeup of Fagin's theorem, including visual aids, appears in Immerman's book Descriptive Complexity.
ReplyDeleteThe people taking my course will largely not be doing theory. Hence I want them to come away knowing some very definite things.
Many of them will encounter PSPACE-complete problems in their research. They should understand what this is all about and Savitch's Theorem is a natural/necessary accompaniment. While you're at it, Immerman-Szelepcsenyi is very definite, actually pretty easy to teach, and non-theory students enjoy it.
Please explain; I always thought Miller-Rabin was an excellent example of a coRP (or RP, depending on your language) algorithm.
I did, briefly. This all depends on where the witness values you test come from. If you test all bases up to about log^2 n, you get a deterministic algorithm which is polynomial time conditional on the generalized Riemann hypothesis. If you test a random value, you get an RP algorithm. You can also get a P/poly algorithm by supplying the witnesses as the advice string, since there is a small set of witnesses that works for every input up to a given length. The proof of this follows essentially the same lines as the BPP in P/poly proof: if for every composite number at least 3/4 of all lesser numbers are witnesses, then necessarily all numbers up to a given length have some common witness set of polynomial size. Some of these advice strings have been explicitly determined for small input sizes (up to about 341 x 10^12):
http://primes.utm.edu/prove/prove2_3.html
This has some practical use as a very efficient deterministic primality test for small numbers (without the unreasonable space requirements a trivial lookup table would have).
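Concretely, with the miller_rabin sketch from my earlier comment (again just an illustrative snippet, not production code), the advice for everything up to about 341 x 10^12 reportedly amounts to just seven bases:

    # Bases reported to classify every n < 341,550,071,728,321 correctly.
    SMALL_BASES = [2, 3, 5, 7, 11, 13, 17]

    assert miller_rabin(2**31 - 1, SMALL_BASES)   # 2147483647 is prime
    assert not miller_rabin(561, SMALL_BASES)     # 561 = 3 * 11 * 17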
Thanks for all the comments.
Here are some responses.
Anon 1: Toda's theorem VERSUS IP=PSPACE.
Toda's theorem tells us something interesting about a natural problem: #SAT is harder than we thought. IP=PSPACE is a great result but does not easily lead to results on classifying the difficulty of problems.
Suresh: Some of your points I DO cover, some I do not.
I will do WHY P?, TRUST BUT VERIFY, and IP (while doing GI stuff).
Some I will not.
Anon 3: Which results are `If pigs can whistle?' GI NPC --> PH collapses is not. PARITY SAT in P --> NP=R is not--- especially given Valiant's recent results on Planar-Sat-Mod-7. Clique-approx --> P=NP is not--- there ARE NPC problems that can be approximated. Mahaney and Karp-Lipton certainly are pigs-whistle; however, I need Karp-Lipton to do GI stuff. Mahaney could be omitted, but it's a nice intro to sparse sets and a nice thing to see before Karp-Lipton.
Anon 3: Teaching the full proof of PCP to non-theorists does not have a good time/benefit tradeoff.
Anon 3: ``AC^0 is a must-do''; alas, there are too many ``must-do's'' so we can't do them all.
Derrick (commenter 4) got it right when he pointed out that to do all he wants to do would take 3 semesters.
Paul Beame: YES, PSPACE-completeness results may be a good thing to do. Immerman-Szelepcsenyi is already in the course; note I have NL=coNL.
Upshot of all of your comments: I will try to put in some of Suresh's teaching tips and motivation, and, as Beame suggests, PSPACE-complete problems. Once I finish the course I'll see what seemed to work and not work (e.g., I may find that doing VV is good but doing Toda is not good for time/benefit tradeoffs) and consider putting in PARITY and CLIQUE since they are, as one of the Anons said, the only two lower bounds we know.
Bill Gasarch
An "if pigs can whistle, then horses can fly" result can be viewed as "if horses cannot fly, then pigs cannot whistle" which may look more convenient.
What is the original quote by the way? Shouldn't it involve pigs and donkeys instead? :)
I think teaching randomness is very important. That may also be a good excuse to teach them expanders, which of course are everywhere. (Dinur's PCP proof may be another excuse.)
Anons 5 and 7:
For Parity not-in AC^0 via polynomials, look up "spielman complexity scribe parity polynomial" on Google and see the top two results.
Papadimitriou's book has a very accessible version of Fagin's theorem.
This is Anon 3 == Anon 14:
Bill says:
Anon 3: Which results are `If pigs can whistle?' GI NPC --> PH collapses is not. PARITY SAT in P --> NP=R is not--- especially given Valiant's recent results on Planar-Sat-Mod-7. Clique-approx --> P=NP is not--- there ARE NPC problems that can be approximated.
Really? I am not saying that any of this is "likely" or "unlikely" (though I do realize that in general, a whistling pig in a whistling-pig theorem is considered to be an unlikely hypothesis). We have about as much of a clue about the truth/falsehood of "GI is NPC" or "ParitySAT in P" or "Clique can be approximated" or "PH collapses" or "P = NP" as we do about whether pigs could whistle.
My point (if you were wondering whether I had one) is this: what's the point of teaching these implications in a first course in complexity theory, esp. one for non-theorists? After having worked in complexity for several years, I feel that these results add to the richness of our field and build a rather delicate fabric of interrelationships among complexity classes, but if you ask me what a result like "GI NPC ==> PH collapses" says (in isolation), I'd be hard-pressed to say anything meaningful.
In other words, I suspect that these results give little or no takeaways for general CS students.
Anon 5,
for Parity not-in AC^0 via polynomials
Vollmer's "Introduction to Circuit Complexity" has a detailed and elementary exposition of the polynomial method for AC^0. It's full of details, lengthy, and highly clear.
The Kranakis & Clote book from 2002 also has the polynomial method for AC^0 (with mod q gates). It's very concise but quite unclear.
Hi Bill,
I've been reading the comments on this post, and am actually helping design a complexity reading group based on your structure and the comments here. So thanks a lot for bringing this topic up.