So what has Valiant done? Here is an incomplete, high-level list, which may be exaggerated.

- Defined #P, showed that the Permanent is #P-complete, and showed that most NP-complete problems have #P-complete analogs.
- Was co-author on the Valiant-Vazirani paper (duh). Main result: given a formula that you KNOW has either 0 or 1 satisfying assignments, telling which is the case is still hard (unless NP=RP). It was also a first step towards Toda's theorem.
- Defined PAC learning.
- Defined superconcentrators, a kind of expander graph.
- Started the study of algebraic analogs of Boolean complexity.
- Did influential work on parallelism, including randomized routing.
- Started the recent Matchgate algorithm paradigm.
- Put up with me being his TA for Combinatorics.
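To make the counting-versus-deciding distinction behind #P concrete, here is a minimal brute-force #SAT counter (toy code, exponential by design; the clause encoding is my own, not from any of the papers above):

```python
from itertools import product

def count_sat(clauses, n):
    """Count satisfying assignments of a CNF formula over n variables.

    clauses: list of clauses; each clause is a list of nonzero ints,
    +i meaning variable i appears positively, -i negated (1-indexed).
    """
    count = 0
    for bits in product([False, True], repeat=n):
        # a clause is satisfied if some literal in it is true
        if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses):
            count += 1
    return count

# (x1 or x2) and (not x1 or x2): exactly the two assignments with x2=True
print(count_sat([[1, 2], [-1, 2]], 2))  # 2
```

Deciding whether the count is nonzero is NP-complete; computing the count itself is the harder, #P-complete task, and Valiant's surprise was that even problems whose decision version is easy (like perfect matching) have #P-complete counting versions.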

I would add the work reported in "Circuits of the Mind" to the list. IMHO, it is an outstanding example of an "extroverted" application of (theoretical) computer science to the key questions underlying another field of science---in this case, the neurosciences.

His fairly recent paper on evolvability is another interesting example of the algorithmic lens at work.

I agree with you, I wonder why he has not won the Turing Award yet. He is certainly someone who has asked himself important questions, and who has had the ability to solve them to boot. But then again, awards and prizes are a scarce resource.

Keep in mind that there are now a number of bright theorists named Valiant. I believe this post merits including a first name. (Even if Les Valiant can presumably be given a small bit of the credit for the abundance of said bright theorists.)

--- Not a Valiant (unvaliant?)

I think that the post is too terse and incomplete. Valiant is one of the giants of the Theory community, who has contributed great ideas and started several subfields of Theory.

Single sentences are very inadequate to describe his contributions. Luca Aceto mentioned "Circuits of the Mind", and Valiant's other work on AI, especially PAC learning, is well known. Thus, I would like to talk about his early research, which might be less familiar to the readers of the blog.

His PhD dissertation was excellent research on formal languages. He attacked the equivalence problem for DPDAs, a problem only solved several decades later, and his theorems were the strongest results until Sénizergues's proof.

He then attacked problems in machine-based computational complexity, and proved the first results showing that space is (slightly) more powerful than time on Turing machines (the same results were obtained independently by Hopcroft and Paul; the final three-author version appeared in JACM).

This paper yielded not only numerous followups, but considerable work on graph pebbling by computational complexity theorists. Valiant then had a truly inspired idea, namely that one should try to prove lower bounds by studying properties of the computation graph of algorithms computing a given function. This was a major new research direction, and a source of numerous papers, models, and some problems that remain open. In particular, he defined the superconcentrators mentioned in the post.
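For readers who haven't seen graph pebbling: in the black pebble game you may place a pebble on a node only when all its predecessors carry pebbles (sources always may), you may remove pebbles at any time, and the question is how few pebbles suffice to ever pebble a target node. A brute-force sketch of that game (my own toy code, exponential in graph size, so only for tiny DAGs):

```python
from itertools import combinations

def can_pebble(preds, target, k):
    """Can `target` be pebbled using at most k pebbles at once?

    preds: dict mapping each node to the frozenset of its predecessors.
    States are sets of pebbled nodes; after each placement we also
    explore every subset, which simulates arbitrary pebble removals.
    """
    seen = {frozenset()}
    frontier = [frozenset()]
    while frontier:
        state = frontier.pop()
        for v in preds:
            if v not in state and preds[v] <= state and len(state) < k:
                if v == target:
                    return True
                new = state | {v}
                for r in range(len(new) + 1):
                    for sub in combinations(new, r):
                        s = frozenset(sub)
                        if s not in seen:
                            seen.add(s)
                            frontier.append(s)
    return False

def min_pebbles(preds, target):
    """Smallest k for which the target can be pebbled."""
    k = 1
    while not can_pebble(preds, target, k):
        k += 1
    return k

# A three-node path a -> b -> c needs 2 pebbles.
path = {"a": frozenset(), "b": frozenset({"a"}), "c": frozenset({"b"})}
print(min_pebbles(path, "c"))  # 2
```

The lower-bound connection is that a small-pebble strategy on an algorithm's computation graph corresponds to a small-space computation, so proving that a graph needs many pebbles proves a space lower bound for that graph.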

The #P results were of course extremely important, and were discussed at greater length by Gasarch (and commented on by many, including myself) in a previous post http://weblog.fortnow.com/2005/12/surprising-results.html

A relatively unknown nice paper from the early years is his contribution to the decidability of certain Lindenmayer systems. The final proof is due to Culik, but Valiant proved an important special case.

His contributions to parallel algorithms include the beautiful probabilistic routing algorithm on the hypercube. Again, this was enormously influential, with followup research by Pippenger, Paul, Ranade, Rabin, Galil and Leighton among others, and considerable influence on the thinking of parallel computer architects.
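The idea behind that routing scheme is easy to state: send each packet first to a uniformly random intermediate node by bit-fixing, then on from there to its true destination; the random detour provably breaks up worst-case congestion with high probability. A toy path-construction sketch (my own code; it builds the routes but does not simulate congestion):

```python
import random

def bit_fixing_path(src, dst, n):
    """Greedy path on the n-dimensional hypercube: correct the
    differing bits of src one at a time, lowest bit first."""
    path, cur = [src], src
    for i in range(n):
        if (cur ^ dst) >> i & 1:
            cur ^= 1 << i
            path.append(cur)
    return path

def valiant_route(src, dst, n):
    """Valiant's two-phase routing: bit-fix to a uniformly random
    intermediate node, then bit-fix from there to the destination."""
    mid = random.randrange(1 << n)
    return bit_fixing_path(src, mid, n) + bit_fixing_path(mid, dst, n)[1:]

route = valiant_route(0b000, 0b111, 3)
assert route[0] == 0 and route[-1] == 7  # valid endpoints, random middle
```

Pure bit-fixing alone has permutations that force hot spots; the point of the random intermediate destination is that every permutation then behaves, in expectation, like a random one.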

Perhaps the most important complexity-theoretic research direction that originated with an idea of Valiant's is the use of algebra to try to obtain lower bounds. Once it became clear that the complexity of computing the permanent is an important problem, it was natural to ask whether it can be computed by algebraic circuits--namely, by + and * operations. Valiant asked the question clearly and cleanly, and hammered home the point that a superpolynomial lower bound (say, on the size of +,* circuits computing the permanent) would be a major advance, yet might be easier than proving #P \ne P. Moreover, one could try to use (or develop) powerful algebraic machinery to prove such a theorem.

I am convinced that the insistence on algebraic techniques had a profound influence on the field--not only in direct followups, including the current Mulmuley programme for proving lower bounds, but as a means of making complexity theorists think about algebraic techniques, leading to such diverse results as IP=PSPACE (through Lipton's permanent-checking algorithm), and PCP theory.

I think that the Mulmuley programme is an excellent idea for a post in this weblog. :)

Dear me (anonymous #4):

http://weblog.fortnow.com/2002/09/ketan-mulmuley-and-milind-sohoni-have.html

>I'm surprised he hasn't won the Turing Award yet, but I'm sure he will.

It looks like these days, the Turing award doesn't go very often to "pure" theorists.

"His fairly recent paper on evolvability is another interesting example of the algorithmic lens at work."

Interesting, yes, but also underwhelming.

>I'm surprised he hasn't won the Turing Award yet, but I'm sure he will.

I echo that.

>I'm surprised he hasn't won the Turing Award yet, but I'm sure he will.

A Turing award seems more than appropriate. Turing Awards require that a community organize a nomination with many supporting letters. I suspect that the 'surprise' that he hasn't won may be more of an indictment of our community for not providing a well-supported nomination than it is a commentary on what the rest of CS thinks about his work. For a Turing award nomination one needs major contributions that would be broadly understood in CS as a whole. That seems easy:

1. The foundation of computational learning theory. (PAC learning.)

2. Showing the fundamental relationships between the difficulty of counting solutions to problems and determining the existence of solutions. (Not only for introducing the question, but for showing that PERM is #P-hard, that easy problems have hard counting versions, and for the random reduction from NP to UP.)

3. Fundamental approaches to data transfer in parallel machines. (His introduction of randomized routing among other aspects of his work.)

(Others have noted many other important items but it seems that the contributions listed here would be best understood outside of theory.)

Another good thing for this nomination now is that this work is timely in the sequence of contributions being honored. The Turing award just honored work done in the early 1980's. The key papers for these contributions are from the late 1970's and early to mid 1980's.

When I was coming up for tenure, I was talking with a senior person elsewhere, and I mentioned the issue of coming up for tenure in a department with two Turing Award winners. This senior person informed me that Les didn't yet have a Turing Award. I was about to bet $50 that he had to be wrong, but followed a long-standing principle of avoiding betting on factual information unless I have absolute knowledge.

I still don't believe Les doesn't have a Turing award; in my mind he's just clearly the most deserving person who does not have one. In my mind, he already has it (but it would be nice if the real world made it official). I say a bit more at my blog.

To anon 7.

ReplyDelete>Interesting, yes, but also underwhelming.

The PAC model by itself would probably be even more "underwhelming". It's the realization of Valiant's deep insights that made the model so important. Fortunately, Valiant does not care much about "overwhelming" per se. In my opinion, his paper on evolvability is certainly a very impressive and valiant (can't resist this pun) attack on some of the mysteries of evolution. It is both thorough and deep. While only the future will be able to tell how good his theory is, I think that this work is at least as important as any single one of his complexity-theoretic contributions.

>Interesting, yes, but also underwhelming.

As usual in TCS, insight is undervalued and technical prowess overvalued.

>Interesting, yes, but also underwhelming.

>As usual in TCS, insight is undervalued and technical prowess overvalued.

Not really. Look at Turing Awards, or Gödel Prizes.

Rabin and Scott got the prize not for their technically difficult work, but for the "simple" paper on finite automata.

It is true that FOCS/STOC committees often look for difficulty, but at least in part that is because recognizing insight takes longer than judging technical merit--and even that is a daunting task under the time constraints committees operate under. However, in the long run, people remember the great ideas, not the complicated proofs.

Valiant proved that context-free parsing reduces to Boolean matrix multiplication, which was considered a major result, at least in the late 70's.

Ref: Valiant, Leslie G. 1975. General context-free recognition in less than cubic time. Journal of Computer and System Sciences, 10:308--315.
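For context, the cubic baseline Valiant improved on is CYK-style dynamic programming over substrings; his result shows that work can instead be charged to Boolean matrix multiplications. A sketch of the cubic baseline only (my own toy grammar and code, not from the paper):

```python
def cyk(word, grammar, start="S"):
    """O(n^3) CYK recognition for a grammar in Chomsky normal form.

    grammar: dict nonterminal -> list of productions, each a tuple of
    one terminal or two nonterminals. table[i][j] holds the
    nonterminals deriving word[i:j+1].
    """
    n = len(word)
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(word):
        for lhs, prods in grammar.items():
            if (ch,) in prods:
                table[i][i].add(lhs)
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):  # split word[i:j+1] at position k
                for lhs, prods in grammar.items():
                    for p in prods:
                        if len(p) == 2 and p[0] in table[i][k] and p[1] in table[k + 1][j]:
                            table[i][j].add(lhs)
    return start in table[0][n - 1]

# Toy CNF grammar for the language a^n b^n.
grammar = {
    "S": [("A", "B"), ("A", "X")],
    "X": [("S", "B")],
    "A": [("a",)],
    "B": [("b",)],
}
print(cyk("aabb", grammar))  # True
print(cyk("abb", grammar))   # False
```

The inner work is exactly a Boolean product of the "which nonterminals derive this substring" tables, which is what lets fast matrix multiplication beat the cubic bound.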

Vapnik also deserves a lot of credit for the foundations of learning theory.


I'm surprised he hasn't won the Turing Award yet, but I'm sure he will.

There may be a positive side to getting a belated Turing award. The cash prize that goes with the Turing award has been growing at a rate (significantly) higher than inflation. Once it was a paltry $2,500 or so, but this year (2009) Barbara Liskov will get a cash award of $250,000, a nontrivial fraction of what the Nobel offers. A sign of our discipline's growing stature?
