Thursday, June 22, 2006

Favorite Theorems: Probabilistic Complexity

May Edition

After we saw several randomized algorithms for primality, we needed a way to classify and analyze the power of probabilistic computation. The power of computational complexity comes not just from studying established models of computation but from taking new models and finding ways to analyze their computational power as well.

Computational Complexity of Probabilistic Turing Machines, John Gill, STOC 1974, SICOMP 1977. 
A Complexity Theoretic Approach to Randomness, Michael Sipser, STOC 1983.

Gill gave a formal definition of a probabilistic Turing machine, defined the basic classes PP, BPP, RP (which Gill called VPP) and ZPP, and showed the basic relationships among them.
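For reference, here is a minimal summary of the acceptance conditions in modern notation (my paraphrase; Gill's original thresholds and names differ in inessential ways). For a polynomial-time probabilistic machine M deciding a language L:

    L ∈ PP:  x ∈ L  ⟺  Pr[M(x) accepts] > 1/2
    L ∈ BPP: x ∈ L  ⟹  Pr[M(x) accepts] ≥ 2/3;  x ∉ L  ⟹  Pr[M(x) accepts] ≤ 1/3
    L ∈ RP:  x ∈ L  ⟹  Pr[M(x) accepts] ≥ 1/2;  x ∉ L  ⟹  Pr[M(x) accepts] = 0
    L ∈ ZPP: M never errs and runs in expected polynomial time

These give the now-standard inclusions P ⊆ ZPP ⊆ RP ⊆ BPP ⊆ PP ⊆ PSPACE, along with ZPP = RP ∩ coRP.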
Sipser's main result showed that BPP is contained in the fourth level of the polynomial-time hierarchy, and the paper also includes Gács's improvement putting BPP in Σ2∩Π2. More importantly, Sipser introduced new techniques into the complexity theorists' toolbox, including
  • A new resource-bounded Kolmogorov distinguishing complexity, and
  • Using Carter-Wegman hash functions to focus randomness, perhaps the first extractor (see the sketch after this list).
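Since the hashing idea carries so much weight in what follows, here is a toy sketch of a pairwise-independent Carter-Wegman family (my illustration in Python; Sipser worked with linear maps over GF(2), but the pairwise-independence property that drives the argument is the same as in the affine family below):

    import random

    P = 101  # a small prime; the family maps Z_P to Z_P

    def sample_hash():
        """Draw h(x) = (a*x + b) mod P with a, b uniform in Z_P.

        For x != y the pair (h(x), h(y)) is uniform over Z_P x Z_P,
        so the family is pairwise independent: any two distinct
        inputs collide with probability exactly 1/P.
        """
        a = random.randrange(P)
        b = random.randrange(P)
        return lambda x: (a * x + b) % P

    # Empirically check the collision probability for one fixed pair.
    x, y = 3, 77
    trials = 200_000
    collisions = 0
    for _ in range(trials):
        h = sample_hash()
        if h(x) == h(y):
            collisions += 1
    print(f"observed: {collisions / trials:.5f}   ideal: {1 / P:.5f}")

The point of such a family is that a random (a, b) behaves, on any fixed pair of inputs, exactly like a truly random function; that is what lets hashing "focus" randomness, the property that extractors later make precise.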
Sipser's tools go on to play an important role in the complexity of Arthur-Merlin games, graph isomorphism, statistical zero-knowledge and other areas of complexity. But perhaps most importantly, Sipser showed you can apply the tools of complexity to really understand the power of a new model of computation, the probabilistic machine. How about that newer model, the quantum computer? Bernstein and Vazirani's paper plays the Gill role, formalizing efficient quantum computation and defining the basic classes like BQP. But while we have had some limited success in understanding the computational complexity of BQP, not only do we not know whether BQP is contained in the polynomial hierarchy, but we have also not yet developed great tools for understanding "quantumness" the way Sipser showed we can do for randomness.



1 comment:

  1. Quantum computing seems to be rich in new algorithmic ideas, but it also seems somewhat 'orthogonal' to much of existing complexity theory; e.g., it is difficult to relate to the polynomial hierarchy.

    So, could QC inspire other models that don't share its physical plausibility but that have more use to classical complexity theory? This by way of analogy with PP, which isn't what one 'had in mind' for randomized algs but is nevertheless an upstanding complexity class.

    As one example of what I'm thinking of, Scott Aaronson (quant-ph/0402095) shows you can solve NP-hard problems in BQP if you throw some nonlinearity into QM. And there are others; I'm just wondering if this envelope could still be pushed.
