## Thursday, March 30, 2006

### Uniform Derandomization Assumptions

In 1986, during the prehistory of hardness vs. randomness, Sipser showed that if time does not have nontrivial space simulations then one can derandomize. Given a (later-proved) assumption about extractor constructions:
If there is a constant c such that DTIME(2^(cn)) is not contained in DSPACE(2^(.99cn)) then P = RP.
Later results used hardness against nonuniform circuits to derandomize complexity classes. For example, Impagliazzo and Wigderson show
If E does not have circuits of size 2^(o(n)) then P = BPP.
where E = DTIME(2^(O(n))).

I recently discovered that Peter Bro Miltersen in his derandomization survey (page 56) notes that you can use a uniform assumption, a weaker version of Sipser's assumption.

If E is not contained in DSPACE(2^(o(n))) then E does not have circuits of size 2^(o(n)) and thus P = BPP.
Miltersen's proof works by searching all circuits, using the local checkability of E to verify the correctness of the circuit.

You can even assume the circuits have access to QBF or Σ₂^p gates, the latter an assumption we needed in a recent paper. Saying E is not contained in subexponential space is so much nicer than saying E does not have nonuniform subexponential-size circuits with Σ₂^p gates.

Technical Note: For all of the above you need to assume the separations hold at all input lengths.

## Wednesday, March 29, 2006

Many of us seniors are currently choosing among PhD programs. As you probably know, we are expected to come to a decision by April 15.

Would you mind sharing with us your advice on how an aspiring theorist should go about making this very difficult and important decision? I personally would find such a post very informative, and I'm certain many others would agree.

A great question but one where I have a conflict of interest—In my view you should all have Chicago as your first choice.

So instead I will post this advice from Bruce Maggs (from a 2001 interview via Higher Cohomology).

Choosing a university can be sometimes easy, sometimes hard. You may choose a university because there's a particular faculty member that you want to work with. That's risky, because often there's no guarantee that the faculty member will be able to take you on as a supervisor. As a general rule, it would be best to choose a university with a reputation for high-quality research results. This can be measured, well, by your opinion of different research papers. If you look at some papers and find some that you think are good, look where those authors are from. This may be difficult for a student who has yet to begin a research career; then the advice of faculty members at your undergraduate institution can help. I think it is generally a good idea, although I didn't follow this advice, to study for a graduate degree at a different school than your undergraduate institution, because you'll get a different point of view from the faculty and you will meet many new potential collaborators. In choosing a graduate school the most important thing is that you understand what it is that you want to study.
Let me add that you should, if at all possible, visit the schools you are interested in and talk to the faculty and current students. Make sure that you will feel comfortable in that environment; it will make for a much more enjoyable graduate experience.

## Tuesday, March 28, 2006

### Science Without Borders

But suppose you couldn't. A Slashdot reader asks What would we lose from a regionalized Internet?

If the internet was separated into regions, how much would you lose? How often do you visit other countries' web sites? How often do you e-mail people in other countries? What would foreigners lose by not being able to visit US-hosted sites, and how quickly would they be able to recreate what they lost? What other processes that we are not normally aware of depend on a borderless internet?
As an academic, I have seen the international internet go from being a useful tool to a critical part of scientific progress. Yesterday alone I had four conversations with different scientists abroad on topics like collaborations on conference and journal papers, conference organization and recommendation letters. Many of my co-authors live abroad and my research would greatly suffer if I could not so easily communicate with my colleagues. Right now I can collaborate with a researcher in Israel as easily as with one in New York.

Beyond that, I download papers from researchers' homepages abroad and they download my papers from mine. Archive and online journal sites would have to be synchronized across the different parts of a separated internet, a difficult task to maintain. Tools like Citeseer, which seek out online papers, would not function as well. And many of you would not be reading this post—nearly a third of the readers of this weblog reside outside the US. And how would Luca write to us from China about his travels there?

Speaking of Slashdot, Zeev Dvir writes

Just wanted to let you know about the current Slashdot poll on whether P = NP. The comments are hilarious.
Sigh.

## Monday, March 27, 2006

### Making Complexity a Spectator Sport

Computational complexity is not a spectator sport, meaning that to truly understand and appreciate computational complexity research you need to be an active researcher yourself.

But we can't keep the attitude that computational complexity can only be appreciated by complexity theorists or we become closed and irrelevant. We need to sell complexity and the rest of theory to the scientific community and the public at large.

We can learn some lessons from great spectator sports like baseball. You cannot truly and deeply understand baseball unless you have played the game on a regular basis. But professional baseball knows it wouldn't exist without its fan base. It makes the game interesting to fans at different levels of understanding, from casual fans who barely know the basic rules of the game to sophisticated fans who understand the nuances of strategy.

While I never expect to see us proving theorems in front of fifty thousand screaming fans, we should aim to make our work understandable and interesting to fans of theory at different levels of knowledge of the field.

## Friday, March 24, 2006

Bill Gasarch sends in some links on recent activities at Harvard. The Harvard alumni magazine has an "objective" (according to Gasarch) article The End of a Presidency. Also former Harvard College Dean (and Gasarch's Advisor) Harry Lewis has a new book Excellence Without a Soul with an excerpt in the Chronicle, Has Harvard Lost Its Way?

Meanwhile Nature has an online section, 2020 – Future of Computing. An editorial about the section discusses an interesting relationship between science and the computer industry.

The computer industry knows that scientists can come up with strange ideas and requirements that may well, in time, have broader commercial application elsewhere. This is one of the reasons why Microsoft is engaging the scientific community with its new Towards 2020 Science report on computers in science. That report inspired this week's focus on computing in Nature. Microsoft is sponsoring free web access to our articles on the subject, although, as always, the content is exclusively Nature's responsibility.

As computing gets ever cheaper, quicker and more powerful, scientists would do well to remember that, by being a demanding and stimulating "user community" that engages the interest of companies such as Microsoft, Google and Intel, they can influence the development of the field, to everybody's benefit.

## Thursday, March 23, 2006

### Large Search Problems for Small Inputs

At Dagstuhl last week Jehoshua Bruck gave a talk presenting some interesting open combinatorial problems that have real-world applications. For example, the following conjecture of Kotzig has applications to redundant disk arrays.
For every even n, one can partition the edges of the complete graph on n vertices into n-1 perfect matchings such that each pair of perfect matchings forms a Hamiltonian cycle.
Kotzig's conjecture is known to hold for n = 2p and n = p+1 for all primes p, and in a few other special cases. The conjecture remains open for n = 16. One would think we could solve the n = 16 case by brute-force search, but the search space is just too big.

This is an NP problem: one can check a proposed partition quickly. For all those who think they have great heuristics for NP problems, go find the partition and then talk to me.
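To make the conjecture concrete, here is a small Python sketch (my own illustration, not from Bruck's talk) that builds the standard round-robin 1-factorization of K_n and checks whether every pair of matchings forms a Hamiltonian cycle. For n = p+1 with p prime this particular construction works; for n = 16 nobody knows whether any perfect 1-factorization exists.

```python
from itertools import combinations

def round_robin_factorization(n):
    """Standard round-robin 1-factorization of K_n (n even):
    fix vertex n-1 and rotate the remaining n-1 vertices."""
    m = n - 1
    factors = []
    for r in range(m):
        matching = [(r, n - 1)]
        for i in range(1, n // 2):
            matching.append(((r + i) % m, (r - i) % m))
        factors.append(matching)
    return factors

def union_is_hamiltonian(m1, m2, n):
    """The union of two perfect matchings is a disjoint union of even
    cycles; it is a Hamiltonian cycle iff there is exactly one cycle."""
    adj = {v: [] for v in range(n)}
    for u, v in m1 + m2:
        adj[u].append(v)
        adj[v].append(u)
    seen, prev, cur = {0}, 0, adj[0][0]   # walk the cycle containing 0
    while cur != 0:
        seen.add(cur)
        nxt = adj[cur][0] if adj[cur][0] != prev else adj[cur][1]
        prev, cur = cur, nxt
    return len(seen) == n

def is_perfect_factorization(n):
    factors = round_robin_factorization(n)
    return all(union_is_hamiltonian(a, b, n)
               for a, b in combinations(factors, 2))

# this construction is perfect exactly when n-1 is prime:
print([n for n in range(4, 15, 2) if is_perfect_factorization(n)])
# → [4, 6, 8, 12, 14]
```

For n = 16 the round-robin construction fails (15 is not prime), and the number of 1-factorizations to search through is astronomically larger than this toy check suggests.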

Bruck also asks how many gates one needs to compute parity with AND-OR circuits of unbounded fan-in. For n-bit inputs the answer is between 2n and 2.5n−2. The first open case comes at n = 6, which requires either 12 or 13 gates. Six does not seem like a large number, but there are still far too many circuits with 12 gates to check quickly.

## Tuesday, March 21, 2006

### Favorite Theorems: Relativization

February Edition
After the work of Cook and Karp popularized the P versus NP question, computer scientists immediately tried hard to prove P=NP or P≠NP. Baker, Gill and Solovay showed that most of their approaches were doomed to failure.

Theodore Baker, John Gill, Robert Solovay, Relativization of the P=?NP Question, SICOMP 1975.

Baker, Gill and Solovay noted that complexity proofs relativized, that is, they held even if all machines involved could make queries to some fixed set, an "oracle." They created oracles A and B such that
• P^A = NP^A
• P^B ≠ NP^B
If one had a relativizable proof that P ≠ NP then the proof would also show P^A ≠ NP^A, contradicting their theorem. Baker et al. gave a few more relativization results and we've seen hundreds more since.
The paper says very little about the philosophical implications of these results. For a short while some researchers thought the results could lead to true independence of the P versus NP question, but this thinking was quickly abandoned, and we have since seen some theorems that do not relativize, particularly in the area of interactive proof systems.
Relativization results do help us understand which theorems to pursue and which techniques cannot solve our questions. Nearly all the techniques we know for time classes, outside the algebraic techniques used for interactive proofs, do relativize. Only a very few of the known relativization results later had proofs in the opposite direction.
For more read my 1994 survey The Role of Relativization in Complexity Theory.

## Monday, March 20, 2006

### Avoiding South Dakota

Because the state recently banned nearly all abortions, there is a call to boycott South Dakota, coincidentally where my family vacationed last summer.

Suppose there were an interesting conference being held in South Dakota. Would you go? There is some precedent—I knew some computer scientists who refused to attend meetings in South Carolina a few years ago, when the state flew the Confederate flag over its State House.

By avoiding the conference you are mostly hurting the researchers in that state, who likely do not share their government's viewpoints and cannot easily move. Many scientists worldwide don't like much of current US foreign policy, but I would hope they wouldn't avoid American conferences for that reason.

## Sunday, March 19, 2006

### An Interview with Vardi

Alex Lopez-Ortiz points out that this month's SIGMOD Record has an interview with Moshe Vardi from Rice University that touches on several topics of interest to the readers of this weblog. Lopez-Ortiz picked out some highlights of Vardi's views.
1. On relevance of work and best-paper awards:
We now have interesting tools to evaluate the success of work in the long term. Things like Citeseer and Google Scholar suddenly give us a view that we could not have had before. One thing that we discovered from these tools is that we are actually very poor in predicting the long term impact of work. There is very little correlation, for example, between the best paper awards that we give and the test-of-time awards that we give. Sometimes, miraculously, we have the same paper win the best paper and the test-of-time awards. But that is the exception rather than the rule. So I think people should focus on just doing the best work they can.
2. On highly selective conferences and low acceptance rates:
I think the low acceptance rate is a terrible problem. I think that the idea that we are raising the quality is nonsense. I think that actually the quality goes down. I think we are very good at selecting about one third to one fourth of the papers; we do a pretty good job there. As we become more selective, the program committee gets larger, the whole discussion gets more balkanized, and the whole process gets more random. Nobody gets a global view.…Conferences are not scalable. They work nicely with up to roughly 300 submissions and a certain size of program committee. When you try to scale it up, very good papers get lost. It becomes more political. I think we are being held back by our own success.
3. On conference publication vs journal publication (Here I did some serious cut and pasting—Alex):
We are very unique among all the sciences in how we go about publication. We have these selective conferences. (People have stopped calling them refereed conferences. They are not really refereed. You don't get good referee reports.) Our conferences worked well in the world we used to inhabit…I think we had a model that worked successfully for about 30 years, but now we see cracks in the foundations…I don't have a good solution to this problem. We don't even have a good forum in which to discuss the problem… We ought to rethink how we do it. Right now, people try to fix things incrementally by having a larger conference with a bigger program committee, a two-level PC, a three-level PC. Maybe we need to rethink the way we do scholarly communication in our discipline…How can computer science go about changing its publication culture? Are there areas that move just as fast as we do, and have journal papers and conferences, but conferences are not the primary vehicle? I have questions about the basic model of scholarly publications. And I find it fascinating that it is difficult to have a conversation about this on a big scale, and make changes on a big scale. We are very conservative. It is interesting that computer science has been one of the slowest disciplines to move to open access publications. Other disciplines are way ahead of us in using online publications.
There is much more; I definitely recommend reading the whole thing.

## Thursday, March 16, 2006

### The Podcast of Uninformed Decisions

Live from Schloss Dagstuhl, the fifth Complexitycast. Our guest is Eldar Fischer who talks about his love and joy, Property Testing. For more read his survey The Art of Uninformed Decisions. MP3 (18:05, 3.1MB).

## Wednesday, March 15, 2006

### Another Approach to P ≠ NP

Last week's Numb3rs episode "Mind Games" centered on a purported psychic causing the mathematician Charlie Eppes to exclaim "Let's all sit down at the Ouija Board and try to solve P versus NP once and for all."

## Tuesday, March 14, 2006

### Resolving Dagstuhl

This week I am at the Dagstuhl seminar Complexity of Boolean Functions. Schloss Dagstuhl is an isolated conference center in Southwestern Germany that hosts weekly seminars in computer science. They have room and board at the center and it is difficult to go anywhere from here so we are forced to spend time with each other, a good thing for research.

Someone at the workshop pointed out that the STOC 2006 program has been posted and lists the award winners. The best student paper award is going to Anup Rao for Extractors for a Constant Number of Polynomial Min-Entropy Independent Sources and Jakob Nordström for Narrow Proofs May Be Spacious: Separating Space and Width in Resolution. The best paper award is (surprise, surprise) going to Irit Dinur for her PCP Theorem by Gap Amplification.

Nordström gave a talk on his paper Monday. A resolution proof of unsatisfiability of a CNF formula takes a clause containing a variable x and another clause containing the negation of x and resolves them into a clause containing the remaining literals of the two clauses. A CNF formula is unsatisfiable iff there is a resolution proof that derives the empty clause. In 1985, Haken showed that some unsatisfiable CNF formulas have no polynomial-length resolution proofs.

The width of a proof is the size of the largest clause produced. The space of a proof is the number of clauses one needs to keep in memory. There is no immediately obvious reason the two should be related, but in fact the space is an upper bound on the width, and the two were conjectured to be equal up to constant factors. Nordström disproves the conjecture by giving a family of formulas whose width is constant but whose space is not.
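To make the resolution rule concrete, here is a minimal Python sketch (my own illustration) of a single resolution step, used to refute a tiny unsatisfiable formula; the largest clause produced has two literals, so this little proof has width 2.

```python
def resolve(c1, c2, x):
    """Resolution rule: from a clause c1 containing literal x and a
    clause c2 containing its negation -x, derive the clause with all
    remaining literals of both."""
    assert x in c1 and -x in c2
    return (c1 - {x}) | (c2 - {-x})

# literals as ints: 1 means x1, -1 means (not x1), etc.
c1 = frozenset({1, 2})   # (x1 or x2)
c2 = frozenset({-1, 2})  # (not x1 or x2)
c3 = frozenset({-2})     # (not x2)

r1 = resolve(c1, c2, 1)  # resolvent: the clause (x2)
r2 = resolve(r1, c3, 2)  # the empty clause: the formula is unsatisfiable
print(r1, r2)
```

Keeping only the clauses needed at each step, this refutation also uses very little space; Nordström's formulas are ones where no small-width strategy can keep the memory small.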

## Monday, March 13, 2006

It happens every spring: America's favorite binary tree, the NCAA Men's Basketball Tournament bracket, was announced Sunday night. This is a single-elimination tournament; win and move up the tree. In offices across America, people print out these brackets and fill in their guesses for the winner of each game.

How do you score a person's predictions given the final outcomes of the games? One could simply give one point for each correctly predicted game; other pools double the points in each round so that each round is worth a total of 32 points. Or one could base the score on seeds—there are four regions, each with 16 teams in some predetermined order of strength. Some pools give extra points for predicting an upset, like seed 13 beating seed 4.
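As an illustration, here is a minimal Python sketch of the round-doubling scheme; the representation (one set of predicted advancing teams per round) is my own toy choice, not a standard pool format.

```python
def bracket_score(predicted, actual, points_per_round=(1, 2, 4, 8, 16, 32)):
    """Round-doubling scoring for a 64-team bracket: one point per
    correct pick in the first round, doubling each round, so every
    round is worth 32 points in total. `predicted` and `actual` are
    lists with one set of advancing teams per round."""
    return sum(pts * len(p & a)
               for pts, p, a in zip(points_per_round, predicted, actual))

# toy 4-team bracket (two rounds, worth 1 and 2 points per game):
predicted = [{"Duke", "UConn"}, {"Duke"}]
actual    = [{"Duke", "Texas"}, {"Texas"}]
print(bracket_score(predicted, actual, points_per_round=(1, 2)))  # → 1
```

Seed-based and upset-bonus schemes would just replace the fixed `points_per_round` weights with a function of the two teams in each game.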

Is there a mathematically ideal way to score the predictions in the tournament that is still simple enough for the average American office worker to understand? Billions of dollars are wagered on the NCAA tourney, so creating the perfect scheme could make quite a splash in the world of office pools.

## Friday, March 10, 2006

### On P versus NP

I receive several requests to comment on this weblog about various papers claiming to prove P = NP, P ≠ NP, or the independence of the P versus NP question. I have a healthy skepticism about all such claims and simply don't have time to carefully read and evaluate those papers. If there is verified significant progress towards the P versus NP problem you will read about it on this weblog (though the converse is not true).

Sometimes I get a flood of requests about a certain P versus NP attempt, such as the upcoming Berkeley math department Workshop on P vs NP. I don't have much to say. The workshop does not involve any of the Berkeley theoretical computer scientists, and in the past we've seen other, sometimes famous, mathematicians who believed they had an approach to the P versus NP problem that in the end didn't pan out. To the organizers' credit, at least they acknowledge they don't yet have a proof.

## Wednesday, March 08, 2006

### Presidents and Faculty

University presidents come and go, but Lawrence Summers' announcement last month that he will resign as Harvard's president has created, and continues to create, considerable discussion in the media. Summers was best known in the non-Harvard academic world for his politically incorrect suggestion that the low representation of women in the sciences could be partly due to biological differences between men and women.

Within Harvard he managed to upset the faculty in other ways, and the faculty's lack of confidence in Summers was a factor in his resignation. Many of the opinions I see paint the faculty as unfirable zealots unwilling to let a reformer like Summers do his job. For example, consider this excerpt from an op-ed piece in the New York Times.

It now remains to be seen whether Harvard's Faculty of Arts and Sciences is capable of self-critique. Will its members acknowledge their own insularity and excesses, or will they continue down the path of smug self-congratulation and vanity? Harvard's reputation for disinterested scholarship has been severely gored by the shadowy manipulations of the self-serving cabal who forced Mr. Summers's premature resignation. That so few of the ostensibly aggrieved faculty members deigned to speak on the record to The Crimson, the student newspaper, illustrates the cagey hypocrisy that permeates fashionable campus leftism, which worships diversity in all things except diversity of thought.
and this from a professor, Camille Paglia of the University of the Arts in Philadelphia.

The University of Chicago is nearing the end of its own presidential search, as our current president, Don Randel, moves on to head the Mellon Foundation. The search committee is a combination of trustees and faculty, where the faculty members of the committee were chosen by election from the faculty at large. If the faculty doesn't like the new president then we will have no one to blame but ourselves.

Update 3/9: That was quick. Bob Zimmer, a mathematician, was just nominated for the University of Chicago presidency.

## Tuesday, March 07, 2006

### Computation and Geometry

Michael Nielsen returns to blogging, seven months after his last real post. He talks about his new Science paper Quantum Computation as Geometry with Dowling, Gu and Doherty. Last year Nielsen had an arXiv paper, A Geometric Approach to Quantum Circuit Lower Bounds, where he showed that the size of the minimal quantum circuit implementing a unitary operation U can be bounded by the length of the minimal geodesic between the identity and U. The new Science paper shows the other direction: given a short path, one can find an efficient quantum circuit.

Others are also trying geometric approaches to separating complexity classes, but because this result is tight, the minimal geodesic gives us exactly the right bound. Nielsen also shows that finding the minimal geodesic is as hard as solving a lattice closest-vector problem, which means this approach might not satisfy the constructivity requirement of Razborov-Rudich natural proofs. Since quantum circuits can simulate classical circuits, one could possibly prove classical lower bounds as well, maybe even an attack on P versus NP?

Well, I can dream, can't I?

## Monday, March 06, 2006

### Computational Thinking

In this month's CACM, CMU chair Jeannette Wing wrote a neat Viewpoint column, Computational Thinking (with related slides). In the article she argues that many of the techniques we use to reason about computation apply to a much wider range of problems. She gives many aspects of computational thinking, such as:
Computational thinking is using abstraction and decomposition when attacking a large complex task or designing a large complex system. It is separation of concerns. It is choosing an appropriate representation for a problem or modeling the relevant aspects of a problem to make it tractable. It is using invariants to describe a system's behavior succinctly and declaratively. It is having the confidence we can safely use, modify, and influence a large complex system without understanding its every detail. It is modularizing something in anticipation of multiple users or prefetching and caching in anticipation of future use.
Wing goes out of her way to separate computational thinking from thinking about computers.
Computational thinking is a way humans solve problems; it is not trying to get humans to think like computers. Computers are dull and boring; humans are clever and imaginative. We humans make computers exciting. Equipped with computing devices, we use our cleverness to tackle problems we would not dare take on before the age of computing and build systems with functionality limited only by our imaginations.
Jeannette Wing makes a strong case that computational thinking should be as important a part of the learning experience as the three R's, though in CACM she preaches to the choir. She suggests that computer science professors teach a course "Ways to Think Like a Computer Scientist." But how do we convince students they should take it?

## Sunday, March 05, 2006

### Computer-Assisted Proofs

Thomas C. Hales talked at the recent AAAS meeting about his proof of the Kepler conjecture. From a New Scientist item
In 1998 Hales submitted a computer-assisted proof of the Kepler conjecture, a theorem dating back to 1611. This describes the most efficient way to pack spheres in a box, wasting as little space as possible. It appears the best arrangement resembles the stacks of oranges seen in grocery stores.

Hales' proof is over 300 pages long and involves 40,000 lines of custom computer code. When he and his colleagues sent it to the Annals of Mathematics for publication, 12 reviewers were assigned to check the proof. "After four years they came back to me and said they were 99% sure that the proof was correct and that they were exhausted from checking the proof," Hales says.

As a result, the journal then took the unusual step of publishing the paper without complete certification from the referees.

Should we trust such a computer-assisted proof? I have more faith in a computer properly executing its code than in a mathematician verifying a proof. But what should constitute a legitimate proof when part of that proof is verified by a computer?

In most mathematical papers, we don't give formal logical proofs of our theorems. Instead we give a detailed proof that gives all of the necessary ideas to convince the reader that a formal proof would be possible. But, at least with our current technology, that level of informality will not work with computer code. So any proof using computer verification should have a formal proof of the correctness of the code.

This would require significant work from the author, but we are talking about establishing mathematical truths. When the other alternative is publishing the solutions to major open questions without being fully refereed, what choice do we have?

## Friday, March 03, 2006

### Elsevier and TCS

My post A Referee's Boycott generated quite a discussion in the comments, particularly about Elsevier. Paul Beame asked why the EATCS still sponsors the journal Theoretical Computer Science (TCS), published by Elsevier. Don Sannella, editor-in-chief of TCS-B (Logic, Semantics and Theory of Programming), responded to Beame and earlier comments, and Paul sent me a response to Sannella's comments. I'm reposting Sannella's comment followed by Beame's response.
Don Sannella's Comment

Regarding the relationship between EATCS and TCS: EATCS is in the process of changing its statutes to say that it supports the spread of the results of research and exchange of information through scientific publications, without specific mention of TCS or any other journal. This decision has already been made and approved by the membership; the only thing holding up its implementation is the fact that EATCS is legally a Belgian organization, so revision of the statutes involves lawyers etc. I think this is an appropriate change (speaking also as a member of the EATCS Council); the previous situation was simply a result of the way that EATCS and TCS grew up together and were set up by the same people, starting at a time when there were very few journals.

Regarding criticisms of TCS:

• Copyright: There is a lot of misinformation circulating about this issue. I have even caught one of the main advocates of open access publishing making plainly false statements in a public talk. I suggest that there would be more light and less heat if people would take the trouble to find out what the actual situation is before criticizing.
I think the main practical issue is ability of authors to publish their work on their own websites. In this respect Elsevier's copyright agreement is not significantly different from the ACM's, or Springer's, unless there has been a recent change to these that I haven't noticed. There is an explanation of this aspect of the Elsevier copyright, by the Elsevier editor in charge of TCS, in the Bulletin of the EATCS number 75 (Oct 2001). The EPrints organization regards Elsevier as self-archiving-friendly ("green" status) and it reached that status before Springer did.
• Price: I know that TCS is expensive, probably the largest item in any Computer Science library's journal subscription budget. But it is also very large, with 12000 pages published per year. If you look at the price per page (here are 2004 figures from the AMS for mathematics journals, which are by the way substantially different from the price comparison given by Wim van Dam) the cost is $0.42/page, which is comparable with other journals. This doesn't take into account the thousands of pages in ENTCS, which comes free with TCS. The whole issue of journal price is complicated because the primary mode of access these days is electronic, and prices for electronic access are negotiated on a case-by-case basis. If you discuss the issue with Elsevier, the statistic they will give you is that the per-download price of an article in TCS (computed by taking the total cost of subscriptions and dividing by the total number of downloads, I think) is considerably less than $1. According to Elsevier, this is the figure that librarians care about, and the fact that it is a fraction of the cost of an interlibrary loan is the key point.
• Open access: The open access movement advocates journals that are free to readers. In this model, the author is the one who ends up paying; this fact is mentioned much less often and some people who advocate open access don't appear to be aware of it. (I know of one new open-access journal that is free to authors as well because the costs are covered by a university, at least for the moment. The point is that somebody needs to pay; running a journal is not a cost-free spare-time activity. See "Guide to Business Planning for Launching a New Open Access Journal" from the Open Society Institute.) There are major opportunities for unfairness in the editorial process with author-pays but otherwise the only problem I see is that with both models co-existing, few authors with an article that would be accepted by a "normal" journal will be willing to pay for publication in an open access journal. Springer has recently offered authors the choice of paying a fee in order to make a paper open access, or not paying and leaving it as paid access. I hope they publish statistics on how many authors decide to pay!
• Academic Press versus Elsevier: "Academic Press had its flaws but they were not predatory in their pricing." Well, compare AMS's 2004 figure for Information and Computation ($1.07/page, Elsevier-owned) with its 2001 figure ($1.92/page, Academic Press-owned).
• Quality of TCS: As editor-in-chief of TCS-B — which is admittedly probably not the main part of interest to readers of this blog — I am responsible for its quality. I think the quality is pretty good and improving. Opinions on this may vary of course. At least, it is not the case that the alleged decline in quality is because (as Paul Beame asserts) "TCS went to a highly distributed editorial board". The way that the TCS editorial board works has not changed since it was founded in 1975, as far as I know. I wonder where he gets his information. I am unhappy about the implied suggestion that the TCS editorial board members are not exercising proper editorial judgment.
Finally: I am not here to make excuses for Elsevier. My interest is TCS (and EATCS) and replying to some points above that are factually incorrect.
Paul Beame's Response

I am happy to hear about the EATCS change. Let me address the two main points, copyright and price, as well as open access journals.

Copyright I agree that copyright is no worse at Elsevier than at Springer (in fact Springer has gotten worse recently). Copyright transfer is apparently not required given the following text I received from Elsevier regarding a JCSS paper:

Recently, we sent you a Transfer of Copyright form relating to the above-mentioned. We note that you have not yet returned a completed form duly signed. In order to avoid any delay in publication, we ask that you do so immediately. Attached you will find a further copy of the form. Please return the completed and signed original of this form by mail or fax, or a scanned copy of the signed original by e-mail.

If we do not hear from you by return, the article will carry a line in place of the copyright line merely indicating that Elsevier published the article.

This sounds all right, BUT when I explicitly took advantage of the second option, I noticed that when the article was published Elsevier still explicitly claimed copyright on it!

Price Thinking about things as price per page is exactly the problem. TCS was one of the top 2 or 3 theory journals, running around 2000 pages annually, until 1989, when it decided to go to bi-weekly publication with a much larger editorial board and upped its page count to 3500. To keep the same price per page it raised its price drastically overnight (3500/2000 pages means a 75% jump in the subscription cost). The average quality declined markedly at this time as the good papers were swamped with lower-quality fare. TCS still publishes many good papers but it is nowhere near as high quality as it was in the 1980's when it got many of the top papers in the field.

Moreover TCS is just one Elsevier journal. Their behavior with others is part of the problem: In the early 90's I was deciding between publishing in Annals of Pure and Applied Logic (Elsevier) and Journal of Symbolic Logic (ASL). I was told that longer papers were more appropriate for APAL and so submitted there. I made the mistake of not checking prices: JSL was 12 issues a year, each over 300 pages, and cost $400 or so annually. APAL had 4 issues per year, each about 250 pages, and cost more than $2000. The quality of the two was similar.

I speak with librarians who have to purchase journals. Elsevier's pricing for electronic journals is bundled in such a way that librarians feel forced to subscribe electronically to many journals that they do not want to purchase. The comparison with inter-library loan is absurd.

The price comparison should be with society-published journals such as the ACM and SIAM journals. These do provide the same main-office editorial staff that for-profit journals provide.

Open Access I agree that the long-term soundness of the open access model is not yet fully established. (Some things must be paid for beyond the voluntary investment of refereeing, and it is not yet completely clear how to cover them long-term.) However, if you want an example of an open access journal that does not seem to suffer from the flaws you describe, consider JAIR (the Journal of Artificial Intelligence Research), which has been operating for more than a decade and is one of the top couple of journals in AI.

(Theory of Computing is still in its infancy, so it may be too soon to tell, but it already has very high-quality papers.)

Why is it that Elsevier regularly emphasizes comparisons with nascent open-access journals but ignores comparisons with high-quality society-published journals such as those of SIAM and ACM?

## Thursday, March 02, 2006

### The Internet Never Forgets

The ACM announced the 2005 Award Recipients. Looks like it is for real this time; here is the press release on Peter Naur's Turing Award.

What happened last week? I got an email pointing to the awards site and suggesting that I congratulate Omer Reingold in the weblog. I agreed and put up the post and mentioned a few other winners as well. It wasn't until several hours later that I discovered, via an anonymous comment on the post, that the awards site went up by mistake. By that time the damage had long been done so I decided to just leave the post.

Inadvertent announcements have always occurred, but the Internet makes the news travel faster and farther, and makes it impossible to undo.

## Wednesday, March 01, 2006

### Class Times

At the University of Chicago most courses on Monday-Wednesday-Friday run 50 minutes each and on Tuesday-Thursday run 80 minutes. Many other universities have similar timings. Most professors seem to prefer the longer classes, especially for graduate courses: You only have to teach two days a week, you don't have to recap as much, and you get an extra ten minutes a week.

I prefer the 50 minute lectures. Many theorems fit nicely into these smaller lectures. These lectures are easier to prepare. But most importantly I remember struggling to keep focused as a student in those longer lectures and I don't want to subject my students to the same.

There are variations on the theme. I took a graduate cryptography class with Silvio Micali that went for three hours once a week. We did have a muffin break in the middle and Silvio has the personality to pull it off.

During my sabbatical year in Amsterdam I taught a short course with 90-minute lectures. The students insisted on having a break in the middle. Most Dutch movie theaters insert an intermission in the middle of movies. Apparently the Dutch have an attention span no longer than half of a soccer game. My kind of people.