Wednesday, November 30, 2005

The Graduate Complexity Course

After my post on teaching PCPs, a reader questioned the wisdom of spending 6-8 lectures on PCP and asked what topics should be taught in an introductory graduate complexity course. This is not a new issue. In the spring of 1985 I took a graduate complexity course with Juris Hartmanis at Cornell and in the spring of 1986 took a graduate complexity course with Michael Sipser at Berkeley with only the polynomial-time hierarchy in the intersection of the two courses.

Here is my list of the topics and theorems that should be covered in a graduate complexity course. Depending on the school, some of this material is covered in undergraduate courses. More material should be added based on the interests of the instructor.

  • DTIME(f(n)) ⊆ NTIME(f(n)) ⊆ DSPACE(f(n)) ⊆ NSPACE(f(n)) ⊆ ∪_c DTIME(c^f(n)).
  • Basic time and space hierarchies.
  • NSPACE(s(n)) ⊆ DSPACE(s(n)²) and NSPACE(s(n)) = co-NSPACE(s(n)).
  • The P versus NP problem and NP-completeness. SAT is NP-complete.
  • The polynomial-time hierarchy.
  • Alternation (Alternating polynomial-time = PSPACE).
  • Probabilistic Complexity Classes (BPP, RP, ZPP). BPP ⊆ PH.
  • Counting (PH ⊆ P^#P).
  • The PCP theorem. Even if you don't do the proof, state the theorem and the implications for hardness of approximation.
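As one illustration (my addition, not part of the original list): the NSPACE(s(n)) ⊆ DSPACE(s(n)²) item is Savitch's theorem, whose proof is a divide-and-conquer recursion on reachability in the configuration graph:

```latex
% Savitch's theorem: NSPACE(s(n)) \subseteq DSPACE(s(n)^2) for s(n) \ge \log n.
% REACH(u, v, i): can configuration u reach configuration v in at most 2^i steps?
\[
  \mathrm{REACH}(u, v, i) \iff \exists w \,\bigl[\,
    \mathrm{REACH}(u, w, i-1) \wedge \mathrm{REACH}(w, v, i-1)
  \,\bigr]
\]
% The configuration graph has 2^{O(s(n))} nodes, so the recursion depth is O(s(n));
% each stack frame stores one midpoint configuration of O(s(n)) bits,
% giving O(s(n)^2) deterministic space overall.
```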


  1. The lack of circuit complexity in your list misses out on an entire way of looking at complexity problems and any lower bound technique other than diagonalization. I think that any such complexity course should at least expose students to the circuit complexity approach especially since it is the main way to understand low level complexity classes. I would add to your list:

    * Circuit complexity classes: NC, AC^0, AC^0[p], TC^0, P/poly
    * Parity not in AC^0 and/or Mod_p not in AC^0[2]
    * To tie things in, I would include "NP ⊄ P/poly unless PH collapses," although this isn't essential.
    * (The circuit view of PH ⊆ P^PP is also particularly convenient.)

    When I last taught such a course I also taught:
    * #P-completeness
    * IP, AM, MA
    * Graph Isomorphism is not NP-complete unless PH collapses.
    * Time-space tradeoff lower bounds for SAT
    * Clique requires exponential size monotone circuits.

    Paul Beame

  2. You did not comment on which books are best to follow for the material in this course (for self-study). Could you comment on that as well?

  3. Do you remember the list of topics of the 1985 Hartmanis course and the 1986 Sipser course?

  4. Papadimitriou's '94 text is fun and readable (though with a slightly daunting coverage of logic) and covers all but the last two topics (it does describe their importance). In fact, my worry is that for anyone who got hooked on complexity through that book (like me), the proposed course is unexciting.

    Paul's suggestions sound good, though a case has to be made that existing circuit lower-bound techniques are relevant to the project of finding really good ones (the parity bound, for example, has the oracle separation of the polynomial-time hierarchy to recommend it).

    I'll just suggest relativization be added. Complexity theorists need to learn how to use it; other students need some explanation why this course doesn't answer most of the questions it raises.

  5. Lance, I'm curious why you suggest teaching alternation and Toda's theorem. I thought alternation was "passé," except maybe as a way to introduce the PSPACE-complete problem of quantified Boolean formulas. As for Toda's theorem, I know it is considered central, but I never quite understood why...

  6. At the risk of sounding pushy, I truly enjoyed the first podcast with Bill Gasarch and think that this topic presents the opportunity for another.

  7. In addition to circuits as Paul suggested, I would add some topics to emphasize the naturalness of the classes, i.e., some machine-free characterizations: e.g., Bellantoni-Cook's recursion system for FP, or some finite model theory characterizations, like Fagin's theorem.

    @anonymous: Alternation is still the best way to understand the complexity of many problems. E.g., it becomes trivial to see that context-free languages are in P, because the mere definition of a word being generated by a grammar in Chomsky normal form is an alternating logspace algorithm. Also, most problems complete for EXP that I can think of are "really" complete for APSPACE, which just happens to be EXP. Similarly for higher classes like EXPSPACE, 2-EXP, ...
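To make the alternation-to-P point concrete (a sketch of my own, not from the comment): memoizing the alternating definition — existentially guess a rule A → BC and a split point, then universally check that B generates the left part and C the right — yields exactly a CYK-style polynomial-time recognizer. The grammar below is a hypothetical Chomsky-normal-form grammar for {aⁿbⁿ : n ≥ 1}.

```python
# Memoized version of the alternating definition of "variable var generates word".
# Binary rules are pairs ("B", "C"); terminal rules are singletons ("a",).
def generates(grammar, var, word, memo=None):
    if memo is None:
        memo = {}
    key = (var, word)
    if key not in memo:
        memo[key] = any(
            # Terminal rule A -> a: the word must be that single symbol.
            (len(rhs) == 1 and word == rhs[0]) or
            # Binary rule A -> BC: existential split, universal check of halves.
            (len(rhs) == 2 and any(
                generates(grammar, rhs[0], word[:i], memo) and
                generates(grammar, rhs[1], word[i:], memo)
                for i in range(1, len(word))))
            for rhs in grammar[var])
    return memo[key]

# Hypothetical CNF grammar for { a^n b^n : n >= 1 }:
# S -> AB | AC,  C -> SB,  A -> a,  B -> b
G = {
    "S": [("A", "B"), ("A", "C")],
    "C": [("S", "B")],
    "A": [("a",)],
    "B": [("b",)],
}

print(generates(G, "S", "aabb"))  # True
print(generates(G, "S", "aab"))   # False
```

Since there are only O(n²) (variable, substring) pairs and each is resolved once, the memoized recursion runs in polynomial time — this is essentially the CYK algorithm.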

  8. This brings up a game that Bill Gasarch likes to play. We can call it "I can't believe they don't know X." We often play it on undergrad CS majors, but it works for grad students, college graduates, complexity theorists, etc.

    The idea is that each person states a fact (or theorem, or proof technique, or whatever) that they are shocked, SHOCKED, that the appropriate group doesn't all know.

    The answers are interesting because there is ALWAYS more than 4 years' worth of material in the undergraduate example...

  9. "4 years worth of material" is a pretty ambiguous metric.