Monday, March 09, 2009

Let's Congradulate Gerard Huet! (who?)

Note the following:
The EATCS Award is awarded in recognition of a distinguished career in theoretical computer science. The Committee, consisting of Catuscia Palamidessi (Chair), Paul Spirakis, and Emo Welzl, in charge of evaluating the nominations for the 2009 EATCS Award, has come to the decision to propose Gerard Huet as the candidate for the 2009 EATCS Award in view of the excellent research contributions to theoretical computer science produced throughout his outstanding scientific career. The Committee unanimously shares the motivations contained in the nomination letter. The proposal has been unanimously approved by the EATCS Council. The Award will be assigned during a ceremony that will take place in Rhodes (Greece) during ICALP (July 5-12, 2009).
I have never heard of this guy, which says a lot more about my view of theory (narrow) than about his accomplishments (great--- see later in this post). It may also show that the term theoretical computer science covers a rather large area. Also, I think his area of research is more popular in Europe than in America. However, I'm not trying to make a statement about what theory should or should not be; I am trying to get us educated--- below is more about him (not from me, from the EATCS site). You can also go here

Gerard Huet has made numerous, enduring advances in the foundations of computer science and has been a central figure in several important software systems. He has also had a remarkably active and successful academic and professional life. His distinguished career in theoretical computer science has exerted considerable influence not only on the field but also on the many students and colleagues with whom he has interacted directly.

A hallmark of Huet's research is his talent for taking highly technical material and providing it with a clear and deep analysis. For example, in his paper on the unification of typed lambda-terms (in the first volume of TCS, 1975), Huet took a difficult topic, reshaped and redefined it, and left a solution so well developed that it took more than 15 years of active research before anyone saw a need to extend it. Since that first major result, he has repeated this performance numerous times.

Equally characteristic of Huet's research are the intimate connections he maintains between theoretical computer science topics and their effective implementation. He was one of the first computer scientists who was able to move between these two domains and who felt that there was no alternative to doing so: these two topics absolutely needed to inform each other.

Huet was a leader in the general areas of logical frameworks and constructive typed theories. Thanks in large part to his achievements and efforts, formal mathematics has taken huge strides and is starting to have an impact on the wider world. He has worked extensively at disseminating his view of this bold new world of formalized reasoning: for example, he has organized numerous summer schools and given countless invited talks and tutorials. Many researchers in France and elsewhere count themselves as students of Huet even if they were never formally his PhD advisee.

We list and briefly describe some of the many contributions made by Gerard Huet.
  1. In the early 70's, Huet developed both resolution and unification for higher-order logic: these results have become the core of several modern systems that perform deduction in higher-order logic.
  2. Huet has done fundamental research in the areas of rewriting and Knuth-Bendix completion. His writings in this area are extensive and elegant.
  3. Between 1982 and 1989, Huet directed and contributed to the design and implementation of the CAML functional programming language. That language and its descendants have given academia and industry an efficient and well-structured programming language.
  4. In the 1990s, Huet and his students designed and built the first version of the Coq proof assistant. Today, Coq is one of the most used and trusted platforms for formalized mathematics and formal methods.
  5. Huet is a person of broad culture inside and outside of computer science. He has made important contributions in several other areas as well: he is the author of executable lecture notes; he is the designer of the Zipper data structure; and he has built the Zen toolbox for phonological and morphological segmentation and labeling of Sanskrit.
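
Since the Zipper may be the item on that list most familiar to programmers, here is a minimal sketch of the idea for lists in OCaml. This is my own illustrative toy, not Huet's code (his functional pearl develops the idea for trees): a cursor into a list is kept as the reversed prefix to its left together with the suffix to its right, so moving the cursor and editing at it are constant-time operations.

  (* A list zipper: a cursor into a list, represented as the reversed
     prefix to the left of the cursor and the suffix to its right. *)
  type 'a zipper = { left : 'a list; right : 'a list }

  let of_list xs = { left = []; right = xs }

  (* Move the cursor one step to the right, if possible. *)
  let move_right z =
    match z.right with
    | [] -> None
    | x :: rest -> Some { left = x :: z.left; right = rest }

  (* Move the cursor one step to the left, if possible. *)
  let move_left z =
    match z.left with
    | [] -> None
    | x :: rest -> Some { left = rest; right = x :: z.right }

  (* Insert an element at the cursor in O(1). *)
  let insert x z = { z with right = x :: z.right }

  (* Rebuild the plain list from the zipper. *)
  let to_list z = List.rev_append z.left z.right

The same left/right decomposition generalizes from lists to trees, which is the setting of Huet's original paper, and it is what makes purely functional editing of large structures cheap.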

19 comments:

  1. "Congratulate" is spelled with a t.

    ReplyDelete
  2. Unfortunately, it seems to happen too often that, while there are two branches of TCS (one of which includes, e.g., algorithms, complexity, ... and the other includes, e.g., logic, programming languages, theorem proving, ...), several people in the former branch seldom seem to know/acknowledge that the other branch exists.

    In fact, this sorry state of affairs may have existed since Turing and Church! ("Do you think in terms of bits or in terms of lambdas?")

    I happen to be from the "other branch", where people hold Huet in the same regard as, perhaps, you may hold Karp. (Some of us regularly check our proofs with Coq before submitting papers.)

    While I was an undergrad, I had the fortune of spending a summer at INRIA doing an internship with Huet. Curiously, he seemed to have, just like most of his colleagues on this side of the river, a much deeper appreciation for the other side. (He actually used a lot of automata theory for his later work.)

    I regularly read your blog, as well as some of the others in your blogroll (Scott's, Mihai's, Michael's, ...). In at least some of these blogs (guess?), I keep getting disappointed with the lack of knowledge/acknowledgment of our side of the river. I find this amusing as well as embarrassing.

    I'm sure P =/= NP is an important question, and I realize SODA/FOCS mean the world to some people, but I would advise such people to also check out LICS/POPL/ICFP/CSF... and the world of theory might suddenly seem much cooler.

    ReplyDelete
  3. Avik --

    I think that lack of knowledge often (albeit not always) corresponds to lack of utility. I'm open to using techniques from other fields -- statistics and information theory -- for work in randomized algorithms. But I haven't been made aware of how knowledge of "the other side" could benefit my work.

    This is not to say anything negative about the field -- I also don't know how chemistry would help my work, but there's a lot positive to say about chemistry.

    I'd certainly be happy to have the situation rectified. You're welcome to guest post on my blog about how tools and techniques from logic, programming languages, theorem proving, etc. would be useful to someone working in networks, randomized algorithms, information theory, or other related fields. (And that's a general open invitation -- guest posts are welcome.)

    ReplyDelete
  4. I'm sure P =/= NP is an important question, and I realize SODA/FOCS mean the world to some people ...

    You give too much credit to people on "this side" of the river. In reality very few (if any) of the people who regularly publish in SODA/FOCS devote any time to P vs. NP (other than on occasions when they feel it necessary to hold forth eloquently on the greatness and depth of their subject). Otherwise, I would have expected some discussions on this blog about the merits/demerits of, say, Mulmuley's program following his lectures in Princeton. The research on this side of the river (the Atlantic Ocean?), except for an occasional gem, has degenerated into little paper-producing industries of various sorts -- game theory, economics, e-commerce, the World Wide Web, PCPs, etc. -- while the truly old and venerable problems of theoretical CS, such as P/NP, the exponent of matrix multiplication, better factoring algorithms, and the true complexity of the FFT, to name a few, remain mostly untouched (except perhaps by some hermit types working away in Siberia ...)

    ReplyDelete
  5. This is not to say anything negative about the field -- I also don't know how chemistry would help my work, but there's a lot positive to say about chemistry.

    Chemistry is heavily studied in the US. But why is this other branch of TCS not heavily studied in the US?

    ReplyDelete
  6. Have to agree with anon 1. They gotta make Americans take the TOEFL with foreign grad students!

    ReplyDelete
  7. My first published paper was in concurrency. I think that in the world of multicore processing, an understanding of logic, program proving, etc. is very useful.

    Michael, of course what some of us are currently working on may not make much use of the work done on the eastern side of the pond. But if so, a possible reason could be the narrow focus of our work.

    I think one of the problems which keeps some of Microsoft's leaders awake is finding the right programming paradigm for multicore processors. I think work in logic programming has some notion of efficiency for the case where an algorithm is at least partly asynchronous. Currently, if you have a quad-core machine, it is likely that a core is not fully used, even though there is a lot of work which your computer could do for you.

    The notion of correctness is different, the notion of efficiency is different, and testing of programs is hard in the asynchronous programming world. BTW, testing of programs is an algorithmic question.

    A lot of work has been done on these subjects in the past, especially in high-performance programming. I think there is a lot remaining which will be needed, since this kind of processing power will be available to the masses. People who pursue this line of work may need to utilize TCS concepts from both sides of the pond (and of course from both sides of the other, bigger pond too).

    ReplyDelete
  8. The research on this side of the river (the Atlantic Ocean?), except for an occasional gem, has degenerated into little paper-producing industries of various sorts -- game theory, economics, e-commerce, the World Wide Web, PCPs, etc. -- while the truly old and venerable problems of theoretical CS, such as P/NP, the exponent of matrix multiplication, better factoring algorithms, and the true complexity of the FFT, to name a few, remain mostly untouched (except perhaps by some hermit types working away in Siberia ...)


    Pardon my ignorance. But could you please explain why you think people should work on the complexity of FFT or matrix multiplication, and not on PCPs?

    ReplyDelete
  9. Bill, I have the feeling that you meant well by writing this post, but your choice of title and some of the prefatory text might have given a different impression to some of your readers. The blog medium can be very unforgiving at times, and some of the comments aired on the very interesting blogs in your blogroll, which I read with some regularity, do seem to indicate the "lack of knowledge/acknowledgment of our side of the river" that Avik referred to in his well-worded comment on this post.

    First of all, let me stress that the EATCS is an international organization that covers the whole spectrum of TCS. This means that the prizes and awards it sponsors are meant to reflect, at least in principle, work that is done in any area of TCS, broadly construed.

    TCS is a wide field and, in general, we cannot keep track of the details of the work or the latest advances in areas outside our specializations. However, I feel that a well-read TCS researcher ought to have at least a little appreciation of the general type of results and contributions, and of the people behind them, in other sub-fields of our beautiful subject. Just attending seminars in areas of TCS we do not work on, or reading general introductory pieces on work done in those areas, can help a lot in this respect. We should inspire our students to have a holistic view of (T)CS and build an appreciation for both volume A and volume B research as well as for the work of the giants of our field, who are by and large still active and inspirational figures one can meet by attending conferences, workshops and other meetings.

    This is the reason why, while I fully understand Michael's "utilitarian" comment:

    I think that lack of knowledge often (albeit not always) corresponds to lack of utility. I'm open to using techniques from other fields -- statistics and information theory -- for work in randomized algorithms. But I haven't been made aware of how knowledge of "the other side" could benefit my work.

    I tend to disagree with it a little. I like to think that, in an ideal world, we should all have a little motivation to develop a passing acquaintance with the "other" (I was about to write, the "dark" :-)) side because of intellectual curiosity.

    Coming back to Huet and to whether his line of research is not represented in America, let me say that, together with some other colleagues, I had submitted a competing nomination for the EATCS award. Of course, I would have liked to see my nomination succeed. However, there was never any doubt in my mind that Huet is a worthy recipient of the EATCS award and, as a member of the EATCS Council, I supported the decision of the prize committee without hesitation when it was announced to the Council.

    A look at the list of signatories for Huet's nomination shows immediately that his work is held in high esteem by very distinguished theoretical computer scientists in America, Asia and Europe, including a Turing Award winner the highlights of whose work, I feel, should be known, at least by name, by any member of the TCS community.

    Also I do not think that it is fair or productive to say that Huet's line of work is not popular in North America when people like Bob Constable, Dale Miller, Frank Pfenning and Benjamin Pierce are amongst those singing his praises. (OK, Dale is now based in France with his wife Catuscia Palamidessi, but he is a born and bred American researcher.) There is a lot of excellent work done in America in TCS on topics that these days are not typically covered by conferences like FOCS, SODA and STOC.

    As Avik put it in his closing sentence

    I would advise...people to also check out LICS/POPL/ICFP/CSF... and the world of theory might suddenly seem much cooler.

    I do not know about the "coolness factor", but certainly the world of TCS would look as broad, fascinating and multi-faceted as it really is. Wouldn't this be a good thing? I strongly believe so.

    ReplyDelete
  10. "I happen to be from the "other branch", where people hold Huet in the same view as, perhaps, you may hold Karp."

    Karp who?

    ReplyDelete
  11. L. Aceto wrote: "We should inspire our students to have a holistic view of (T)CS and build an appreciation for both volume A and volume B research as well as for the work of the giants of our field, who are by and large still active and inspirational figures one can meet by attending conferences, workshops and other meetings."

    I think the problem is much deeper than a mere lack of attention. Semantics on the one hand and complexity/algorithms on the other look at computation from very different angles: semantics looks at computation by paying attention to intrinsic program structure, while complexity/algorithms look at computation from the outside, by counting the number of execution steps or the number of memory cells used. The extrinsic viewpoint ignores the intrinsic viewpoint and vice versa. But this lack of attention to the other viewpoint is due to a deep technical difficulty: there are no (or hardly any) known connections between the two points of view, and that's despite computational intuition suggesting that there should be.

    I think forging a technical connection between the two is a deep open problem.

    ReplyDelete
  12. People in US TCS are paper machines. Some of them regularly have 4 or 5 FOCS/STOC/SODA papers in one year and are proud of that. That is why they do not study the exponent of matrix multiplication or integer factorization, but PCPs. Otherwise there are simply not many papers to write.

    In more serious science, if you write too many papers, your colleagues will look at you with suspicion.

    ReplyDelete
  13. Pardon my ignorance. But could you please explain why you think people should work on the complexity of FFT or matrix multiplication, and not on PCPs?

    I don't want to sound very flippant, but just compare the number of matrices being multiplied and FFTs being computed every millisecond with the number of PCPs that have been constructed since the time they were defined.

    ReplyDelete
  14. Semanticist wrote: "The extrinsic viewpoint ignores the intrinsic viewpoint and vice versa."

    Perhaps normalization by evaluation offers a link between the two viewpoints you have in mind? Should I get around to rising to Michael's invitation for a guest blog post, what I write would definitely mention NBE.
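
    To give a rough flavor, here is a toy sketch of my own in OCaml for the untyped lambda-calculus (it loops on terms with no normal form; the treatments in the literature are typed and far more careful). The idea: evaluate syntax into a semantic domain where binders become host-language functions, then read ("reify") the resulting values back into syntax in normal form.

      (* Normalization by evaluation, untyped version: evaluate terms into
         semantic values, then reify values back into beta-normal syntax. *)
      type term =
        | Var of string
        | Lam of string * term
        | App of term * term

      type value =
        | VLam of (value -> value)   (* semantic function *)
        | VNeutral of neutral        (* stuck term headed by a variable *)
      and neutral =
        | NVar of string
        | NApp of neutral * value

      let rec eval env t =
        match t with
        | Var x -> (try List.assoc x env with Not_found -> VNeutral (NVar x))
        | Lam (x, body) -> VLam (fun v -> eval ((x, v) :: env) body)
        | App (f, a) ->
            (match eval env f with
             | VLam g -> g (eval env a)
             | VNeutral n -> VNeutral (NApp (n, eval env a)))

      (* Read a semantic value back into syntax, inventing fresh names. *)
      let gensym =
        let counter = ref 0 in
        fun () -> incr counter; Printf.sprintf "x%d" !counter

      let rec reify v =
        match v with
        | VLam f ->
            let x = gensym () in
            Lam (x, reify (f (VNeutral (NVar x))))
        | VNeutral n -> reify_neutral n
      and reify_neutral n =
        match n with
        | NVar x -> Var x
        | NApp (m, v) -> App (reify_neutral m, reify v)

      let normalize t = reify (eval [] t)

    For instance, normalize (App (Lam ("y", Var "y"), Var "z")) returns Var "z", with the beta-redex evaluated away. The point relevant to this discussion is that a "semantic" evaluator does the work, while what comes out is ordinary syntax that an extrinsic cost model could then examine.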

    ReplyDelete
  15. FFTs are a bad example to cite, since they absolutely have not been ignored. Martin Furer recently improved the complexity of the best known algorithm for integer multiplication (based on FFTs), and this won a best paper award at STOC.

    ReplyDelete
  16. Michael Mitzenmacher in comment 3 and Semanticist in comment 11 make very good points.
    There is at least one recent example showing some convergence between the two sides of TCS: in cryptography, there has been some work connecting computational-complexity-based crypto to the more "semantic" approach from the Dolev-Yao model.
    (I can already hear the screams: "wait a minute, I thought Yao belonged to us!!")

    What about other examples?

    ReplyDelete
  17. Treewidth and related notions such as query width come up in both algorithms and logic.

    ReplyDelete
  18. Pascal Koiran asked:

    What about other examples?

    Another possible example, which may also relate to the very interesting problem raised by Semanticist, is the study of logical approaches to ensuring complexity bounds on lambda-calculus or functional programs. See, for instance, the web page for this course. I am sure that a lot has happened since 2006 in this area, and the experts will be able to point out recent developments.

    The whole field of descriptive complexity links logic and complexity theory in a way that I have always found beautiful.

    In the field of networking, this paper offers an application of formal semantics to the analysis of network protocols based on structural properties, and also to the derivation of working prototype implementations of these protocols.

    The whole field of model checking is based on the rich interplay between logic, models of computation and algorithms.

    Connections between logic, graphs and algorithms are described in, e.g., this survey paper by Martin Grohe.

    I am sure the list could go on and on. One just has to keep an open mind, and look for the connections.

    ReplyDelete
  19. Probably CMU is on the other side of the river :)

    ReplyDelete