Thursday, May 27, 2004

Visas and Titles

Thanks to Technorati I can track who links to this weblog. Recently an Indian student, Nitish Korula, started a new blog, Pseudo-Random Thoughts, where he describes the trials of getting a visa so he can start grad school at Illinois in the fall. Good luck Nitish, we're rooting for you.

Meanwhile I agree with Will Baude at Crescat Sententia that most U. Chicago undergrads address faculty as "Professor" rather than, say, "Mr. Fortnow," which is the official Chicago custom. A decade ago I was more likely to get "Mr. Fortnow," which I never loved since it actually feels more formal than Professor or Doctor.

Graduate students as well as my colleagues call me "Lance," at least to my face. First year grad students sometimes take time to grow out of addressing professors as "Professor." I had this problem myself way back when. A fellow student couldn't shake the professor habit until he started playing sports with them. You just can't say "Throw me the ball, Professor Leighton."

In the end I don't really care that much what you call me. However I do enjoy those letters from Germany that covering all the bases address me as "Herr Dr. Prof. Fortnow."

Tuesday, May 25, 2004

What if P = NP?

A New York Times essay looks at the hardness of understanding math. The essay quotes from the book The Millennium Problems by Keith Devlin, which describes the seven million-dollar Clay Mathematics Institute Millennium Problems, including the P versus NP question. So I took a peek into Devlin's book.

Devlin doesn't hide his feelings about the P versus NP problem as "the one most likely to be solved by an unknown amateur." He does make a point that if P = NP we can break RSA and "the current dependence of the Western economies on secure communications over the Internet demonstrates just how high are the P = NP stakes."

Let's play make believe and assume P = NP in a strong way, say that we can find satisfying assignments of Boolean formulas in nearly linear time with small constants. It will have a dramatic influence on the Western economy but not at all in the way Devlin perceives. We'll lose public-key cryptography but what we gain will make the whole internet look like a footnote in history.

Learning becomes easy by using the principle of Occam's razor--we simply find the smallest program consistent with the data. Near-perfect vision recognition, language comprehension and translation and all other learning tasks become trivial. We will also have much better predictions of weather, earthquakes and other natural phenomena.
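As a toy illustration of that Occam principle (my own sketch, not anything from Devlin's book), here is the search made literal for a tiny hypothesis class: enumerate arithmetic expressions in order of size and return the smallest one consistent with the data. The brute force below takes exponential time; the fantasy of P = NP is that this kind of search over all short hypotheses becomes fast.

```python
from itertools import product

def exprs(size):
    """All expressions with `size` leaves over the variable x and constants 0..3."""
    if size == 1:
        yield 'x'
        yield from range(4)
    else:
        for lsize in range(1, size):
            for left, right in product(exprs(lsize), exprs(size - lsize)):
                yield ('+', left, right)
                yield ('*', left, right)

def ev(e, x):
    """Evaluate an expression tree at the point x."""
    if e == 'x':
        return x
    if isinstance(e, int):
        return e
    op, a, b = e
    return ev(a, x) + ev(b, x) if op == '+' else ev(a, x) * ev(b, x)

def occam_fit(pairs, max_size=5):
    """Return a smallest expression consistent with all (input, output) pairs."""
    for size in range(1, max_size + 1):
        for e in exprs(size):
            if all(ev(e, x) == y for x, y in pairs):
                return e
    return None
```

For example, occam_fit([(0, 1), (1, 2), (2, 5), (3, 10)]) recovers an expression equivalent to x*x + 1. The grammar and function names here are mine, chosen just to keep the sketch self-contained.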

Everything will be much more efficient. Transportation of all forms will be scheduled optimally to move people and goods around quicker and cheaper. Manufacturers can improve their production to increase speed and create less waste. And I'm just scratching the surface.

P = NP would also have big implications in mathematics. One could search for fully formal proofs of theorems, but such proofs are usually extremely long. Instead we can use the Occam's razor principle to recognize and verify mathematical proofs as typically written in journals. We could then find proofs of any theorem that has a reasonable-length proof, say under 100 pages. A person who proves P = NP would walk home from the Clay Institute not with one million-dollar check but with seven.

Monday, May 24, 2004

Informatics in Indiana

Many universities try to integrate information technology into many different disciplines, usually through their computer science departments. Our neighbors to the east are creating a bold experiment in this integration, the Indiana University School of Informatics, with fast-growing departments spread over several of their campuses.

What is informatics? According to Indiana, Informatics is

  • understanding the impact technology has on people.
  • the development of new uses for technology.
  • the application of information technology in the context of another field.
Their research groups already encompass quite a few areas including biological, chemical and social issues of information technology.

I visited the Informatics department in Bloomington a few months ago and sensed the excitement of growing a new discipline and of bringing in information technology researchers from many different scientific disciplines. Note the real distinction between computer science, which studies and improves the nature of computation, and informatics, which aims to integrate information technology into various areas of study.

Mixing researchers from vastly different disciplines has had its share of successes and failures and only time will tell how successful the Indiana experiment will become. But I'm extremely impressed with the commitment from the University and the state to this area of informatics and I expect we'll hear much more from Indiana in this area.

Thursday, May 20, 2004


Some strong comments on Rocco's post on the recent Columbia theory day. From my own highly biased point of view, I find the study of efficient computation critical in a society that grows ever more reliant on computation, both explicitly on computers and implicitly in various biological, economic and physical systems. And how can one study efficient computation without developing reasonable models of computation and analyzing those models?

I don't mean to sound so altruistic; I get paid to do what I love. But I do truly believe one needs to understand the mechanisms that make up our world if we wish to improve them. I write this weblog, in part, to educate about the beauty and applications of theoretical computer science.

A comment about comments. I understand that many of you choose to post anonymously rather than register at Blogger and I'm fine with that. If you don't mind, please add your name at the end of the comment. I like to know who is behind the comments and it's useful to match up different comments by the same person. Of course, I'd rather get your comments anonymously than not at all.

Update 5/21: Stanley Fish, the departing Dean of Arts and Sciences at the University of Illinois at Chicago, argues further for a separation of academic research and policy.

I exit with a three-part piece of wisdom for those who work in higher education: do your job; don't try to do someone else's job, as you are unlikely to be qualified; and don't let anyone else do your job. In other words, don't confuse your academic obligations with the obligation to save the world; that's not your job as an academic; and don't surrender your academic obligations to the agenda of any non-academic constituency -- parents, legislators, trustees or donors. In short, don't cross the boundary between academic work and partisan advocacy, whether the advocacy is yours or someone else's. Marx famously said that our job is not to interpret the world, but to change it. In the academy, however, it is exactly the reverse: our job is not to change the world, but to interpret it.

Wednesday, May 19, 2004

A Part-Time Ph.D.?

A question from a reader (slightly edited):
There are no part-time (or even full time) Ph.D. programs at top universities in computer science or mathematics that can be completed by those who work full time. For various personal reasons I find myself in a position that requires me to work full time; however, I am passionate about theoretical computer science/mathematics. Unfortunately, most American schools do not accommodate Ph.D. students under these circumstances. Is this a decision based upon the assumption that those who work full time will not produce good/enough work, or is this a decision based, simply, upon the fact that professors want to work standard hours and teaching a course from 5:30 - 6:20 is quite non-standard?
Courses are not a major issue. The course requirements for a Ph.D. usually do not differ significantly from those for a Master's, and many universities offer a Master's program in computer science for full-time workers. I do see two other major barriers to a part-time Ph.D.: funding and research.

Nearly all Ph.D. students get funded for tuition and some living expenses via a fellowship, teaching assistantship or research assistantship. Government agencies generally don't give fellowships to part-time students, and a TA or RA requires about twenty hours a week, leaving someone who already has a full-time job with no time for actually completing the Ph.D.

But suppose you felt that a Ph.D. was worth the expense or were independently wealthy and for some reason still had to work a full-time job. Ph.D. level research in math and theoretical computer science requires intense background study and long stretches of thinking, understanding the problem and working through many different ideas until one actually makes significant progress toward original work. For this one needs time and the relationship is not linear. Someone who can spend forty hours a week focusing on research will be far more than twice as successful as one who can only spend twenty.

The dominant limitation on the number of Ph.D. students in CS departments is funding. If you have a record that would have gotten you into a top computer science department as a full-time Ph.D. student and you bring your own money to the table, I suspect at many schools you could work out a part-time schedule. But you'll find doing original research on a part-time basis a daunting if not impossible task.

Monday, May 17, 2004

Randomized Blogspace

A report from Theory Day co-organizer Rocco Servedio

On Friday May 14 a special Columbia/IBM Research/NYU Theory Day was held at Columbia University in New York City. The New York area theory days started at Columbia in 1982; this one was a special event to mark both the 25th anniversary of the CS department at Columbia and the 250th anniversary of Columbia University.

More than 280 attendees came out to hear four talks by outstanding theorists:

  • Richard Karp (UC Berkeley): Current Challenges in Computational Genomics: Haplotyping
  • Shafi Goldwasser (MIT/Weizmann): Proving Hard-Core Predicates using List Decoding
  • Prabhakar Raghavan (Verity/Stanford): Finding Information in Networks
  • Peter Shor (MIT): Quantum error correction and fault tolerant quantum computation
The day ended with a panel discussion on "The Future of CS Theory." Avi Wigderson (IAS) joined the four speakers for the panel, which was moderated by Mihalis Yannakakis (Columbia). Here is a brief summary of what was said.

Mihalis started things off by observing that over the past 50 years CS theory has enjoyed outstanding successes and has had tremendous impact on computing. Indeed, some of the successes were so profound that they gave rise to whole new fields of computer science (databases, security) that are no longer thought of as "CS theory". Mihalis asked each of the panelists to briefly give their views on the future of CS theory. Some highlights of what they said:

Avi observed that CS theory can (and should) have more impact on early education, starting in high school or even earlier. We can give important insights into fundamental ideas such as adversaries, randomness, learning, recursion, games, proofs, and "getting things done efficiently" (which Avi referred to as "the oldest profession in the world"). He also highlighted some specific goals for CS theory at this point, which included showing that BPP ≠ NEXP; coming up with non-natural proof techniques for circuit lower bounds; discovering new types of quantum algorithms; developing a general theory of what types of algorithms can give optimal approximation ratios; and proving that SL = L and that MATCHING is in NC.

Dick Karp warned against taking anyone's advice or predictions too seriously. That said, he advocated for a healthy balance between foundational questions at the core of CS theory and new questions that arise from the role of computation in the world and the sciences. He highlighted three areas of interest for the future: (1) the study of large scale distributed systems such as the Web, incorporating ideas from economics and game theory; (2) connections with areas of natural science, ranging from statistical physics to quantum mechanics to biology; and (3) the "new face" of AI in which stochastic and graphical models and statistical inference are playing a big role.

Peter also commented on the perils of predicting the future; we sometimes tend to think that there will be no more revolutionary ideas simply because we don't know what those ideas will be. But such ideas will come along from "out of the blue" as they always have. He noted that while past predictions for the future of CS theory have tended to be on the doom and gloom side, things have actually turned out pretty well -- there are interesting jobs and demand for theorists in industry; theory is more and more noticed and used by practitioners; and rather than becoming increasingly recondite and inward-looking, theory is building stronger connections with mathematics, physics, and other disciplines.

Prabhakar observed that what we think of as CS theory is really two main thrusts of work with some overlap: there is the theory of computation as an inherent phenomenon (i.e. when we study MOD 17 gates and what they can do even though nobody will ever build one), and the theory of computation as it is practiced (i.e. most of the world's cycles are spent making a billion people happy rather than crunching data for a few thousand scientists). The Web is a paradigmatic aspect of the second thrust; he noted that in this area economic factors may play a role at least as important as traditional resource bounds like time and space. Prabhakar also stressed the importance of backing up claims of practical relevance for our work (and, on an unrelated note, mentioned this weblog in a slide entitled "Randomized Blogspace".)

Shafi observed that CS theory is having an increasing impact on classical mathematics such as coding theory, number theory, and signal processing. On the other hand, we are also dedicating more energy (and having more success) in solving problems in the real world -- both of these trends are good signs for the field. She advised researchers to follow their own tastes and interests rather than anyone else's recommendations when it comes to "the next big challenge for the field."

After these statements the floor opened up to questions and discussion with the audience. A brief summary:

One questioner noted that the theoretical models of parallelism from 20 years ago don't correspond to how large distributed systems work now, and asked whether a similar phenomenon could be taking place with quantum computation -- are we studying the right model? Some panelists responded that while we aren't likely to end up with quantum computers that correspond exactly to quantum circuits, it seems likely that algorithms developed for the quantum circuit model will prove useful if/when we do get quantum computers in one form or another.

There was quite a bit of discussion about the role of CS theory in the undergraduate curriculum and what undergraduate CS majors should know about theory. Some panelists opined that NP-completeness, undecidability, models of computation, and algorithms are core topics that even high school students perhaps should know. A view emerged that there is real (potential) widespread interest out there in the "gems" of CS theory, and that we should do a better job of explaining what is fascinating and beautiful about our field to students.

In response to a question about the future status of the P=NP question, some panelists observed that other great research communities (mathematics, physics) have tussled with unsolved questions for centuries. We seem to be stuck right now, but on the bright side we have some understanding (natural proofs, for instance) of why we are stuck -- perhaps mathematicians should step back and think about why the Riemann hypothesis is still unresolved.

To close, here are three quotes lifted more or less verbatim from the panel discussion (but left anonymous here):

  • "The future for DNA computation is dim" (in response to the question "What is the future for DNA computation?")
  • "Polynomial time computation is a complex object to understand."
  • "We are so much closer to understanding each other's talks than the mathematicians are."

Sunday, May 16, 2004

Cornell's New President

On Friday I went to an alumni reception for Jeffrey Lehman, new president of Cornell University. Besides learning that the cinderblock dorms where I spent my freshman year are finally being demolished, a number of interesting aspects of university life came out of the question and answer session.

One question asked about the lack of student activism on campus. Lehman acknowledged the problem outside of environmental issues and told of his plan for a mock presidential election at Cornell before the real election. This seemed like a weak answer--we had mock elections in high school. I doubt college students could get excited about a mock election when most of them can vote in the real thing.

On the other political end was a question about the liberal bias in faculty. Lehman acknowledged this as well but didn't consider it a problem as long as the conservative voice was not silenced. This was a good answer.

On affirmative action he said that Cornell needed more minority applicants and was working on a suggestion to start attracting students even in middle school. And someone asked a question about whether Cornell should have common core courses for the students, an interesting issue for me since even small changes in the University of Chicago's traditionally strong core have caused major controversy. Lehman said that Cornell will continue its tradition of not having any fixed course requirements for all students (besides the swimming test).

Thursday, May 13, 2004

Favorite Theorems: Probabilistically Checkable Proofs

April Edition

No single topic has dominated computational complexity over the past dozen years more than probabilistically checkable proofs (PCPs). Arora, Lund, Motwani, Sudan and Szegedy, in a paper on my 1994 list, showed that every language in NP has a polynomial-sized PCP that can be verified by a probabilistic polynomial-time verifier using O(log n) random coins and a constant number of queries. Well beyond the complexity interest in this result, PCPs give hardness of approximation results for a variety of NP-complete problems.

Researchers in many exciting papers have improved the parameters of the PCP results in order to get improved limits on approximation. But one paper really puts it all together for some tight results.

Some optimal inapproximability results by Johan Håstad, JACM, Volume 48, 2001.

Håstad's paper shows that every language L in NP has a PCP with O(log n) random coins and 3 queries, where

  1. If x is in L then the verifier is convinced with probability arbitrarily close to one.
  2. If x is not in L then no proof can convince the verifier with probability more than one-half.
These parameters are the best possible.

The paper gives some optimal approximation results. Consider Max-3-SAT, where one wants to find an assignment that maximizes the number of satisfied clauses of a 3-CNF formula. We can satisfy 7/8 of the clauses by choosing a random assignment, a process we can also derandomize. Håstad's result implies that no polynomial-time algorithm can do better unless P = NP. The paper also gives improved lower bounds on approximation for problems like vertex cover and max cut.
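As an aside, the 7/8 random-assignment bound and its derandomization are easy to make concrete. Here is a small sketch (my own illustration, not code from any of these papers) of the method of conditional expectations: fix the variables one at a time, each time choosing the value that keeps the expected number of satisfied clauses, over a uniformly random setting of the remaining variables, as large as possible. For m clauses each containing three distinct variables, the initial expectation is 7m/8, so the final assignment satisfies at least that many clauses.

```python
from fractions import Fraction

def expected_sat(clauses, assignment):
    """Exact expected number of satisfied clauses when every unset
    variable is set uniformly at random.  Clauses are lists of nonzero
    ints; literal v means variable v is true, -v means it is false."""
    total = Fraction(0)
    for clause in clauses:
        satisfied, unset = False, 0
        for lit in clause:
            if abs(lit) in assignment:
                satisfied = satisfied or assignment[abs(lit)] == (lit > 0)
            else:
                unset += 1
        total += 1 if satisfied else 1 - Fraction(1, 2 ** unset)
    return total

def derandomize(clauses, num_vars):
    """Method of conditional expectations: the expectation never drops,
    so the final assignment does at least as well as a random one."""
    assignment = {}
    for v in range(1, num_vars + 1):
        assignment[v] = (expected_sat(clauses, {**assignment, v: True})
                         >= expected_sat(clauses, {**assignment, v: False}))
    return assignment
```

For instance, on the eight possible clauses over three variables, any assignment satisfies exactly seven of the eight, matching the 7/8 bound exactly.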

Håstad's paper pulls in tools from a large collection of research papers. Madhu Sudan's lecture notes describe Håstad's results and the techniques and papers leading up to them. There's also been exciting PCP research since Håstad's paper but I'll have to leave that for another day.

Monday, May 10, 2004

An Auction of Google

For those with an interest in auction theory, the Google IPO auction gives an interesting testbed for auction mechanism design. Instead of having an investment bank set a fixed price for the IPO, Google will auction off the shares.

A New York Times article today describes many of the decisions and possible pitfalls of the various kinds of auctions Google might use. Also check out the Google SEC filing. One can learn quite a bit about auctions as well as the business of search engines from this rather informally written document. I have never had so much fun reading a prospectus.

My prediction: Great interest in Google will highly overvalue the stock whatever auction mechanism they use. If you are interested in investing in Google, hold off until the price settles or you will suffer the dreaded "winner's curse."

Saturday, May 08, 2004

Page Charges

The Journal of the ACM has started asking for page charges.
Author's institutions or corporations are requested to honor a page charge of $60.00 per printed page or part thereof, to help defray the cost of publication. Page charges apply to all contributions. Payment of page charges is not a condition of publication; editorial acceptance of a paper is unaffected by payment or nonpayment.
SIAM also recently asked us for $72/page for a Journal on Computing paper.

I despise page charges. Authors do the research, write the papers, give the journals the copyright and now the journals want us to pay for the privilege. I know the charges are optional and come from research funds but we have other needs for the money. The page charges on a moderate-sized paper could, for example, send a grad student or two to a major conference.

We have problems in our field with expensive for-profit journals and papers that never appear in refereed journals at all. We need to encourage authors to send their articles to journals run by the non-profit societies. We should not then send them a bill for doing the right thing.

Friday, May 07, 2004


A reader asked about the complexity of games like Go and Chess. David Eppstein has a nice site giving a short description and references to a number of specific games.

Let us, though, put such games in a general framework. We have a board, and each player in turn can make one of a list of legal moves that depend on the current placement of pieces on the board. We focus on deterministic games of complete information, as opposed to games like backgammon or poker.

Since games like Go and Chess are played on a fixed board, one could just enumerate all of the possible board combinations and perform perfect play in a constant amount of time. So we need to look at generalized versions of Go and Chess where the size of the board and the set of rules can vary.

Let's place this in a general setting. We have a polynomial-time algorithm that, given a board and the current player, can tell whether the game has ended (and its outcome) or can give a list of legal moves for the player. Chandra, Kozen and Stockmeyer have a seminal paper on these alternating games: if we restrict games to a polynomial number of moves, they characterize PSPACE (problems solvable with polynomial memory and unlimited time). Games of arbitrary length on polynomial-size boards characterize EXP (exponential time).
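As a toy illustration (mine, not from the Chandra-Kozen-Stockmeyer paper) of why polynomial-length games fit in polynomial space: a plain depth-first search for a forced win needs only one stack frame per move, so its memory use is proportional to the length of the game even though its running time may be exponential.

```python
def wins(state, moves, terminal):
    """Does the player to move have a forced win from `state`?
    Memory use: one recursion frame per remaining move."""
    outcome = terminal(state)
    if outcome is not None:
        return outcome
    # a position is winning iff some move leads to a losing position
    return any(not wins(nxt, moves, terminal) for nxt in moves(state))

# Toy game: a pile of stones, players alternately remove 1 or 2,
# and whoever takes the last stone wins.
def pile_moves(state):
    return [state - k for k in (1, 2) if state >= k]

def pile_terminal(state):
    # no stones left: the player to move has already lost
    return False if state == 0 else None
```

In this toy game the losing positions are exactly the piles whose size is a multiple of 3; the solver rediscovers that by search alone.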

So we have results like: given an opening position in generalized Go, it is EXP-complete to determine whether a player has a forced win. But even if the official Go rules allow it, I find it hard to believe that players would play a game for an exponential number of moves. So it makes sense to add artificial stopping rules that cause the game to end after a reasonable amount of time, and such games are usually PSPACE-complete.

The PSPACE-completeness results hold for many very simple games. This mirrors the fact that complexity arises not from complicated actions but from the interactions of many simple actions.

Wednesday, May 05, 2004

New Web Host

I'm moving my web hosting service--if you can read this you are accessing the new host. I will wait a day or two to post again until the changeover is complete.

Meanwhile enjoy this Guardian column by John Sutherland describing how the British higher education system has evolved over the past four decades (via Crooked Timber). Many of the same issues apply in America and I suspect many other countries as well. Sutherland sums it up nicely.

The big question. Is the whole system in better or worse shape than it was in 1964? I don't know. All I do know is that I'd like to do it all again, and get it right this time.

Monday, May 03, 2004

America Losing Its Edge

Some required reading if you haven't seen it yet: a New York Times article on how America has lost some of its scientific leadership role to the rest of the world.

The article does not go much into the reasons behind the change, so let me make some conjectures. For a long while now, the majority of Ph.D. students in the US have come from other countries. As the academic job market in the US got tighter, many of these researchers went back to their home countries and established strong research groups there. Also, recent technological changes have taken away some of the comparative advantage of doing research in the States, as communication and access to research papers have become much easier.

I welcome the added competition: the more globalized science becomes, the more we will all push one another, with scientific research the big winner. In my own field, I like seeing countries like Israel become theory powerhouses and seeing the definite growth of theory in places as diverse as India and Australia. A few years ago it would have been unthinkable to hold STOC or FOCS overseas, but STOC 2001 was held in Greece and the upcoming FOCS will be in Rome.

Most of all I hope the article serves as a wake-up call to American legislators. Time to give NSF that large budget increase that they've been talking about for several years now.