Monday, May 30, 2005

Conference Presentations

The quality of conference presentations has, on average, much improved over the past decade or two. Why? Certainly technological improvements like PowerPoint and advanced LaTeX macros have helped. As our field gets more specialized, talks in general theory conferences have to appeal to a wider audience, which tends to improve the presentation. Or perhaps I'm just remembering only the bad talks from the good old days.

Despite the increase in quality, I find myself going to fewer and fewer talks in general theory conferences. I learn much more talking directly to my fellow computer scientists. As for the presentations, I can read the papers later.

A fellow computer scientist suggested that we hire a company to videotape the talks and make them available on the web. A back-of-the-envelope calculation suggested we could make this happen for about $10 extra per participant for a reasonably sized conference. I am not a fan of making talks available on the web. Outside of a conference, who has time to sit at a computer screen and watch talks? I also worry about giving people yet another reason not to go to a conference. Remember, the most important aspect of a scientific conference is not the talks and papers but bringing members of the community together.

Saturday, May 28, 2005

Newspaper Odds

My cell phone received a breaking news alert yesterday: The FDA is investigating a link between the impotence drug Viagra and blindness. The story also made the front page of today's Chicago Tribune. Look carefully though and you'll notice 38 reports of blindness among the 23 million Viagra users. Even if the drug directly caused the blindness, the numbers translate to a 0.00017% chance of losing your sight using Viagra. Breaking news indeed. You have a much greater chance of losing your sight by not using safety goggles in the workplace, and you won't have as much fun in the process.
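The arithmetic behind that percentage is easy to check; a quick sketch, using only the report and user counts quoted above:

```python
# Sanity check of the odds quoted above: 38 reported cases of
# blindness among 23 million Viagra users, as a percentage.
reports = 38
users = 23_000_000
chance = 100 * reports / users
print(f"{chance:.5f}%")  # 0.00017%
```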

This is an example of what I call newspaper odds. If a type of misfortune is rare enough that individual cases make the newspapers, then the odds of it happening to you are so low that you really shouldn't worry about it. High school mass shootings. Mad Cow Disease. Carbon monoxide deaths. No significant need to worry about these.

When deaths become too common to appear in a newspaper then you need to take notice and act carefully, say with automobile accidents or AIDS. Of course a cause of death might not appear in a newspaper simply because it doesn't happen, like recent US major commercial airline disasters. How do we tell the difference? Celebrities. If a celebrity dies of AIDS or gets seriously injured in an automobile accident, newspapers will cover it and remind us that these remain serious concerns for us all.

Thursday, May 26, 2005

CCC 2005

Ravi Kumar and D. Sivakumar, the local organizers of the upcoming Conference on Computational Complexity in San Jose, ask that I post the following. I hope to see you all there.

The early registration deadline for Complexity 2005 is 5 pm EDT on FRIDAY, MAY 27, 2005 (Eastern Daylight Time == 4 hrs behind Coordinated Universal Time (UTC/GMT)). Please take special note of the time: though the conference is on the Pacific Coast, early registration ends 5pm Eastern Time.

When you register for the conference, if you are not an IEEE Member but a SIGACT/EATCS Member, please enter that number (e.g., SIGACT xxxxx) to qualify for the discounted rate.

Please consider staying at the Conference hotel, Hyatt Sainte Claire; besides being convenient, it will help limit the conference expenses.

In San Jose and around the Silicon Valley, you will experience a unique combination of cultures and cuisines (American, Asian, European, Mexican) like nowhere else. There is a large number of restaurants, coffee shops, and bars within walking distance from the conference venue; these include highly-rated upscale restaurants as well as hole-in-the-wall type places that serve authentic food from around the world. For example, you could even get falafels that pass Ziv Bar-Yossef's stringent standards, and South Indian food certified by your local organizers as the best outside of Chennai. If you're one of those poor souls that happen to be vegetarian/vegan at a theory conference, relax -- there's Good Karma, White Lotus, and Vegetarian House within walking distance.

During mid-June, San Jose is an absolutely pleasant place to be, with daytime highs close to 80 degrees Fahrenheit (about 27 degrees Celsius), and nighttime lows near 55 degrees Fahrenheit (about 13 degrees Celsius). Downtown San Jose, where the conference will be located, has numerous interesting places: the Cesar Chavez Plaza and the Tech Museum of Innovation are right across from the hotel. The Center for the Performing Arts (CPA), the San Jose Repertory Theatre, the San Jose Museum of Art, San Jose State University, as well as the light rail station, are all within walking distance. The CalTrain station (to go to San Francisco) is only about a mile away. The Repertory features Exceptions to Gravity by Avner Eisenberg during some of the conference days. CPA has the Festival of Cultures by the SJ Jazz Society, and an American Musical Theatre show during some of the conference days (see here). The Museum of Art (no entry fee!) has the Blobjects and Beyond: The New Fluidity in Design exhibit during all of June. The Tech Museum of Innovation is a one-of-a-kind museum that you should absolutely not miss when you're in town; during June, the IMAX theater there features a limited-screening edition of BATMAN, plus the Mysteries of the Nile -- be sure to check it out. If you're bringing children along, they will definitely enjoy the Children's Discovery Museum, within walking distance of the conference. Unfortunately, Major League Soccer's San Jose Earthquakes are playing Chivas USA on the road in Los Angeles.

Wednesday, May 25, 2005

Complexity and Sudoku

A Chicago undergrad Amanda Redlich gave a presentation and used the shorthand Complexity (Complex-ity). Clever. Of course this should never be confused with Reality.

Today's Chicago Tribune has an AP article on the British craze over the Japanese number game Sudoku. In this puzzle you have a 9x9 grid subdivided into nine 3x3 subgrids. Given some initial partial setting, the goal is to fill the full grid with the numbers 1 through 9 such that each number appears exactly once in each row, column and subgrid.

As a computational complexity theorist, I immediately wondered how hard the generalized game on an n²xn² grid is. A little googling shows the problem is NP-complete, shown in a 2003 Master's thesis of Takayuki Yato at the University of Tokyo. His proof uses a simple reduction from the Latin Squares problem, proved NP-complete by Charles Colbourn.
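To see that generalized Sudoku is in NP at all: a filled grid is a certificate that can be verified in polynomial time. A minimal sketch of such a verifier (the function name and grid representation are my own, not from Yato's thesis):

```python
# Polynomial-time verifier for a filled n^2 x n^2 Sudoku grid.
# A problem is in NP when a proposed solution, here the completed
# grid, can be checked efficiently; this checks every row, column,
# and n x n subgrid for the values 1..n^2.

def is_valid_sudoku(grid, n):
    size = n * n
    want = set(range(1, size + 1))
    # Every row and every column must contain each of 1..n^2 once.
    for i in range(size):
        if set(grid[i]) != want:
            return False
        if {grid[r][i] for r in range(size)} != want:
            return False
    # Each n x n subgrid must also contain each value exactly once.
    for br in range(0, size, n):
        for bc in range(0, size, n):
            block = {grid[br + r][bc + c]
                     for r in range(n) for c in range(n)}
            if block != want:
                return False
    return True

solved = [
    [1, 2, 3, 4],
    [3, 4, 1, 2],
    [2, 1, 4, 3],
    [4, 3, 2, 1],
]
print(is_valid_sudoku(solved, 2))  # True
```

The hardness, of course, lies in finding the filled grid, not checking it.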

Monday, May 23, 2005

STOC Business Meeting Redux and More

My liveblogging experiment didn't quite work as planned. I seem to have lost half of what I wrote and then my battery died. So here is some basic info from the meeting.
  • Most of the discussion was on theory funding and on the STOC republication policy and most of those discussions survived from yesterday. Check out the new Theory Matters site advocating increased theory funding.
  • The Gödel Prize went to Noga Alon, Yossi Matias and Mario Szegedy for their paper The space complexity of approximating the frequency moments.
  • Omer Reingold and Vladimir Trifonov won the best paper and best student paper awards respectively for their algorithms for undirected connectivity.
  • Future Conferences: Complexity 2005 in San Jose, California June 12-15. Early registration deadline is Friday. FOCS 2005 in Pittsburgh October 23-25, STOC 2006 in Seattle May 20-23, Complexity 2006 in Prague July 16-20, STOC 2007 and Complexity 2007 as part of FCRC in San Diego June 9-16, and STOC 2008 in Victoria.
  • Check out the NP-completeness poster and the new DIMACS Implementation Challenge.
The conference had several good surveys commemorating Larry Stockmeyer who passed away last summer. Stockmeyer's advisor Albert Meyer gave a talk describing how they worked together and giving an interesting small result in Stockmeyer's thesis that certain sets created through diagonalization have i.o.-speedup. I also posted the slides and paper from my Stockmeyer lecture.

Complexity theory is well represented in this year's conference with some very nice papers in extractors, derandomization, PCP construction, hardness amplification and much more. Check out the program to see more.

On Friday and Saturday nights, the STOC hotel hosted proms from local schools. It's easier to explain baseball to non-Americans than the concept of a prom where high school students wear fancy clothes and spend large amounts of money for a single party.

Sunday, May 22, 2005

STOC Business Meeting

10:45 PM: Hal Gabow on STOC republication policy. When can one submit to STOC when a similar paper appeared in a previous conference? Current policy does not allow simultaneous submission of the same (or essentially the same) abstract material to another conference with a published proceedings. Should this be more precise? Who enforces the policy? Should the policy be changed?
SIGPLAN policy allows republication if the publication adds value beyond that of the original paper.

9:57 PM: Michael Foster, Director of CCF at NSF
Proposals tripled over the last five years. CISE budget won't change much in the next few years.
What's Theory For: Hard foundational questions, linkages between disparate fields; incubator.
Expect theory researchers to do theory but also work in other areas.
Theory Program: Maintain strong supporters in complexity. Narrow systems-theory gap in algorithms and consider applied theory co-funded with other groups.
Looking for theory program director and senior advisor to Peter Freeman (head of CISE).
Questions: How is funding allocated in NSF? Need to show theory is necessary to advance the well-being of the country. Resist the urge to take money away from other areas. Avoid entitlement arguments.

9:37 PM: Andy Bernat from CRA.
CRA focuses on increasing funding and helping researchers with their careers.
DARPA cuts in basic research funding, more than half (>$100M) in the last four years. Those researchers are now turning to the NSF.
At the recent Future of Computer Science hearing of the House Science Committee, DARPA argued it is funding the computer science that needs funding. The committee charged the CS community to come up with a list of areas in CS that need funding. Several workshops are planned to address this, including a summit to be organized by Bill Wulf.
What we can do: Become program director, division director, and assistant directors at NSF. Push on advisory committees, participate in CRA and talk to our legislators. Read CRA Blog.

9:10 PM: Discussion on Theory Funding chaired by Sanjeev Arora.
Some background here and a new Theory Matters website.
Theory claims a smaller part of the full NSF CS budget. But CS funding in general is also in crisis. Tension between "Core CS" and "Applications of CS."
TCS's greatest strength: Unexpected Payoffs: NP-completeness led to crypto to zero-knowledge to interactive proofs and PCP to coding theory as well as boosting.
More in Advocacy Document.

8:58 PM: Laszlo Babai
Gödel Prize: Noga Alon, Yossi Matias and Mario Szegedy for their paper The space complexity of approximating the frequency moments.

8:52 PM: Andy Bernat from CRA
Outstanding Undergraduate Award: Male Winner Mihai Patrascu (MIT). The female winner, Andrea Grimes (Northeastern) was awarded at the Computer-Human Interaction conference.

8:36 PM: Program Chair Report:
Best Paper: Omer Reingold, Best Student Paper: Vladimir Trifonov, for their low space algorithms for undirected connectivity.
84 accepted papers out of 290 submitted.
33 out of 80 (41%) in complexity.
31 of 128 (24%) in algorithms.
20 of 82 (24%) in "alternative models."
Submissions from 23 countries. Israel most after US.

8:22 PM EDT: Local Arrangements Report:
256 Participants including 108 students.
Total Income and expenses each about $97K.

No beer!

I'm liveblogging the business meeting. Keep it here.

Friday, May 20, 2005

Welcome Summer

Most US universities have ended their academic year and moved into the summer season. I like summer not so much for the weather (it gets hot and muggy in Chicago) but for the relaxed research atmosphere. Fewer courses and, more importantly, virtually no faculty meetings of any kind give us the time to put some concentrated effort into research.

Summer is also the conference season. We have conferences and workshops year round but many organizers like to have their conferences in the summer when they won't conflict with courses. Instead we have conferences conflicting with each other. Be careful not to go to too many conferences, as they cut into your relaxed summer research atmosphere.

I plan to attend at least two conferences this summer, STOC, which starts this weekend in the beautiful suburbs of Baltimore and, of course, the Conference on Computational Complexity next month in San Jose. Stop by and say hi if you are there.

It's not summer yet in Chicago. We run on quarters at the University of Chicago and have two more weeks of classes followed by finals week. I can go to STOC missing only one day of classes but there are some conferences and workshops later on I will have to skip for finals week and graduation.

We get our revenge in the fall, when most universities start at the beginning of September or earlier and our classes don't start until the last week of September. We hardly see any conferences scheduled in September, particularly in the US, because it is the beginning of most universities' semesters. So I use September to visit faculty at other schools. We used to take vacations in September (crowds are smaller everywhere) but now the kids have school starting in late August, making our effective summer quite short.

Thursday, May 19, 2005

A Long Time Ago in a City 800 Miles Away

I was 13 when I went to see the first Star Wars movie on opening weekend in 1977 in New York City when it was just called "Star Wars" without a subtitle or episode number. Theater staff handed out buttons saying "May the Force be with you." We had no idea what that meant. We then entered the theater and saw a great movie.

That first movie remains my favorite of the Star Wars series to date, with the movie's single tight finale and the "Force" more mysterious than real. Special effects in movies have gotten so good that they can no longer wow you like they could back then.

In the early 90's a Chicago professor gave a welcome lecture to the incoming freshmen and ended by saying "May the Force be with you." Most of the students had no idea what he was talking about. I felt old that day.

As the final Star Wars chapter officially opens in the US today, my oldest daughter is only three years younger than I was when the first movie arrived. The movie has gotten good reviews and I look forward to reliving my youth, being with the Force and traveling one last time to that galaxy far far away.

Tuesday, May 17, 2005

George Dantzig 1914-2005

George Dantzig passed away last Friday. In the 1940s Dantzig invented linear programming and developed the simplex method for solving LP.

Simplex works well in practice but it remains open whether simplex runs in polynomial time on worst-case inputs (though see this paper by Spielman and Teng). Dantzig's death comes just two weeks after the passing of Leonid Khachiyan, who gave the first provably efficient algorithm for linear programming, three decades after Dantzig developed the simplex method. Khachiyan's ellipsoid algorithm is not at all practical compared with the simplex method.

Monday, May 16, 2005

Crisis in Theory Funding

Guest Post by Boaz Barak

The National Science Foundation (NSF) has a program called "Theory of Computing" which is the only program devoted to funding research in theoretical computer science. We use "Theoretical Computer Science" here in a broad sense, which includes research in Algorithms, Computational Complexity, Cryptography, Computational Learning, Network algorithms, etc. (Of course theoreticians can and do also apply to more application-oriented programs such as Cybertrust and others.)

Although the ToC program never had a large budget, looking at the projects and people it funded throughout its history, we can safely say that it had a huge impact much beyond the boundaries of TCS and even beyond the boundaries of Computer Science at large. Indeed, much of the initial research on field-transforming concepts like Quantum Computing, Boosting of learning algorithms, Cryptographic Protocols, Interactive Proofs, and more was supported by this program.

Unfortunately, the budget of the ToC program has been more or less stagnant at approximately $7M per year since 1989 (in dollars not adjusted for inflation, the ToC budget was $6.4M in 1989, $5.1M in 2004, and will be $7.2M in 2005). Needless to say, in these 15 years the costs of research (such as faculty salaries, student stipends, etc.) grew even beyond the general rate of inflation. Also, the overall CS budget at NSF tripled during this time. Thus the ToC program budget, merely insufficient in 1990 and dangerously insufficient in 1999, is now at what can only be described as a crisis level. This is a situation that must be corrected if we wish our field to continue to thrive. Given TCS's achievements in the last few decades and challenges for the future, this should concern not only theoretical computer scientists.

What can we, TCS faculty in the U.S., do to fix this?

  1. First of all, educate ourselves about the funding situation and ways to solve it. If you can, please come to the business meeting at the upcoming STOC, where these issues will be discussed.
  2. We need to participate more in the NSF; this includes reviewing proposals, participating in panels, and volunteering for positions. NSF administrators are actually quite appreciative and supportive of theory, but the situation will not change without active community involvement. In particular, please consider volunteering for the position of director of the ToC program.
  3. We need to educate others about what we do. This includes bright math-inclined high school kids who could potentially become TCS researchers, educated adults (including also other scientists), and also other computer scientists. We need more popular science books, general audience lectures, essays and op-eds.

Sunday, May 15, 2005

Pitfalls of the Tenure System

In my last post you can find some links and comments about the tenure system. Let me add a few concerns that don't often get mentioned.
  • Tenure keeps academic salaries low: Tenure has a definite monetary value as, when combined with life and disability insurance, it removes nearly all risk of future income. A university can then shave the salary in comparison to other lines of work. I suspect the shaving is high in the humanities, where positions outside academia carry a higher risk of long-term under- or unemployment.
  • Tenure allows universities to age discriminate: Because hiring with tenure requires a large long-term commitment, departments have to hold candidates for open tenured positions to a much higher standard than for open assistant professor positions. Often departments will only consider candidates for the assistant professor position. Without tenure, US law would prevent holding candidates to different standards because of age for basically the same position.
  • Tenure Jail: Since most jobs in the US are not secure, one can change careers midstream without entailing much additional risk. Many tenured professors would be reluctant to leave their risk-free position for another career, even if that other opportunity would make them happier and possibly more successful.

Friday, May 13, 2005

A Day of Links

The US House Committee on Science held a hearing yesterday on The Future of Computer Science Research in the U.S. You can watch the webcast or read the testimony. The CRA Blog also has it covered.

Thomas Friedman's column today talks about how the US no longer dominates the ACM programming contest.

A couple of links about tenure. An Inside Higher Ed article on the strange policies for tenure decisions (thanks Jeff) and a Chicago Tribune op-ed piece arguing against tenure altogether.

Thursday, May 12, 2005

Temporal Theory

One of our graduate students wanted to define a property P of classes C to hold if we don't know that C has complete sets. Doing so would make the concept time dependent. IP would have had property P in 1989 but not in 1991, after we knew IP=PSPACE. One can easily show "If P=BPP then BPP has complete sets" but we don't have "If P=BPP then BPP has property P"; rather we would have to say "If we know P=BPP then BPP has property P." Location can also be an issue. If the Martians have a proof that P=BPP then BPP has property P on Mars but not on Earth. Temporal logic might give us a way to reason about such statements but basing them on unknown mathematical facts seems strange.

Suppose someone proves the polynomial-time hierarchy collapses. Then it didn't collapse at that moment; it was always collapsed and in fact was never a true hierarchy. The only reason we call it a hierarchy today is because we don't know that it isn't one.

In mathematical reasoning we can't know whether a statement is true unless we have a proof of that statement. However once we have a proof of a theorem like P≠NP then not only do we have P≠NP now but in fact P≠NP was always true even before we had the proof. Despite this we can't help thinking that theorems become true as we see a proof. A year ago undirected connectivity wasn't in log space and now it is. However the theorems that were proven before I started graduate school were all true since the dawn of time.

Wednesday, May 11, 2005

Complexity of Computer Computations

Deep into the bowels of the Physical Sciences Library I went to retrieve the Proceedings of a Symposium on the Complexity of Computer Computations to track down Karp's famous paper for my last post. Later I discovered that one of my fellow Chicago professors, Janos Simon, attended that conference as a young grad student and still had the proceedings in his office.

The symposium, held March 20-22, 1972 at IBM Yorktown, marked a shift in theoretical computer science away from logic, automata and grammars towards algorithms and complexity. The symposium attracted 225 registrants, on par with our current STOC and FOCS conferences. From the preface:

The symposium dealt with complexity studies closely related to how computations were actually performed on computers. Although this area of study has not yet found an appropriate or generally accepted name, the area is recognized by a significant commonality in problems, approaches, and motivations. The area can be described and delineated by examples such as the following.
  1. Determining lower bounds on the number of operations or steps required for computational solutions of specific problems such as matrix and polynomial calculations, sorting and other combinatorial problems, iterative computations, solving equations, and computer resource allocation.
  2. Developing improved algorithms for the solution of such problems which provide good upper bounds on the number of required operations, along with experimental and theoretical evidence concerning the efficiency and numerical accuracy of those algorithms.
  3. Studying the effects on the efficiency of computation brought about by variations in sequencing and the introduction of parallelism.
  4. Studying the relative complexity of classes of problems with respect to lower bounds on computation time. In this effort, specific problems are classified as having equivalent difficulty of computation; for example, those problems which can be solved in a number of operations which is a polynomial function of the size of the input.
The symposium had a panel discussion on the state of the field, with a panel that included four future Turing Award winners: Robert Floyd, John Hopcroft, Richard Karp and Michael Rabin. The proceedings has a transcription of the panel session. Here are some edited quotes.

Ray Miller (Moderator): There are four general questions.
  1. How is the theory developing from originally being a scattering of a few results on lower bounds and some algorithms into a more unified theory?
  2. What specific examples have been found to demonstrate how real computer computations were improved from studies of this type?
  3. Is the progress of numerical-type computations and understanding of them much ahead of the combinatorial?
  4. Are there important open questions?
Floyd responding to the first question: Slowly.

Karp: We need to find a name for our subject. "Computational Complexity" is too broad in view of the work of Blum and others, at least until we can gather them into our fold; "Concrete Computational Complexity" is too much like civil engineering; "Complexity of Computer Computations" doesn't ring true; Markov has already taken "theory of algorithms" away from us…Getting these materials into university curricula, particularly the undergraduate curriculum, is important and it's going to happen quickly.

There are lots of things that computer people do that aren't mathematical in the nineteenth century sense. We manipulate strings, we prove theorems, we retrieve information, we manipulate algebraic formulas. Looking at the primitives appropriate to these domains, we can get a much richer class of problems.

Charles Fiduccia: Working against unification is the failure of specifying a model for the computation. There seems to be too much emphasis on what is being computed and little emphasis on the model.

Hopcroft (responding to Floyd): You could change "slowly" to "it's not". Karp has shown many of these problems complete, the implication being that these problems are therefore hard. That might not be the case; there could even be algorithms that run in linear time. On the other hand, the fact that we know so little about lower bounds provides a lot of interest in the area.

Albert Meyer: The obstacle is not that we don't have a good formulation of a machine model. We have many different models, but we can't prove lower bounds for any of them.

Mike Paterson: Suppose somebody were to prove P≠NP; then this would be of great personal and dramatic interest. But if it were to be established by a quite traditional, ordinary argument, which it just happened that nobody had thought of, then we would see in retrospect that it was the wrong problem.

Numerical and combinatorial algorithmic people, with some small exceptions, never did integrate well. But Blum and the computational complexity people did come "into the fold," and algorithms and complexity would come to dominate theoretical computer science in the US. The naming issue never did have a satisfactory solution; the Europeans call it Theory A. And I would be very happy with a "traditional ordinary" proof of P≠NP but I strongly doubt the solution is so simple.

As a side note, had I been able to find Karp's paper online I would never have read about this symposium that marked the new direction of theoretical computer science research.

Monday, May 09, 2005

Favorite Theorems: Combinatorial NP-Complete Problems

April Edition

If Cook made the P versus NP question interesting to logicians, Karp made the question important to everyone else.

Richard Karp, Reducibility Among Combinatorial Problems, Proceedings of a Symposium on the Complexity of Computer Computations, 1972.
All the general methods presently known for computing the chromatic number of a graph, deciding whether a graph has a Hamiltonian circuit, or solving a system of linear inequalities in which the variables are constrained to be 0 or 1, require a combinatorial search for which the worst case time requirement grows exponentially with the length of the input. In this paper we give theorems which strongly suggest, but do not imply, that these problems, as well as many others, will remain intractable perpetually.
Karp's paper used Cook's Theorem that satisfiability was NP-complete to prove 21 other problems NP-complete, including Clique, Integer Programming, Hamiltonian Cycle, Chromatic Number, Partition and Max Cut. Many of these problems arise from real-world optimization problems and Karp showed that if any of them have efficient algorithms then they all do. Researchers would later extend Karp's techniques to show hundreds if not thousands of natural problems are NP-complete.

Karp's paper named the classes P and NP and defined the notion of reduction (now often called Karp reduction) where a language A reduces to a language B if there is a polynomial-time computable function f such that x∈A iff f(x)∈B. He defined the notion "polynomial complete" as the set of problems A in NP such that every other NP problem reduces to A, a notion we now call NP-complete. Karp also makes the argument that the definitions are robust to different reasonable encodings of the problem and different models of Turing machines.
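As a toy illustration of a Karp reduction (not one of Karp's 21; the helper names here are mine), INDEPENDENT SET reduces to CLIQUE by complementing the graph: a vertex set is independent in G exactly when it is a clique in G's complement, and the map runs in polynomial time.

```python
# Sketch of a Karp reduction: INDEPENDENT SET <=p CLIQUE.
# The map f takes (G, k) to (complement of G, k); it is computable
# in polynomial time, and S is an independent set of size k in G
# iff S is a clique of size k in the complement.

from itertools import combinations

def reduce_indset_to_clique(vertices, edges, k):
    """The polynomial-time map f: (G, k) -> (G-bar, k)."""
    complement = ({frozenset(e) for e in combinations(vertices, 2)}
                  - {frozenset(e) for e in edges})
    return vertices, complement, k

def has_clique(vertices, edges, k):
    """Brute-force check (exponential; fine for a tiny example)."""
    edges = {frozenset(e) for e in edges}
    return any(all(frozenset(p) in edges for p in combinations(s, 2))
               for s in combinations(vertices, k))

# A path on 3 vertices: {1, 3} is an independent set of size 2.
V, E = {1, 2, 3}, {(1, 2), (2, 3)}
V2, E2, k = reduce_indset_to_clique(V, E, 2)
print(has_clique(V2, E2, k))  # True: {1, 3} is a clique in G-bar
```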

Karp also alludes to the polynomial-time hierarchy.

If P=NP then NP is closed under complementation and polynomial-bounded existential quantification. Hence it is also closed under polynomial-bounded universal quantification. It follows that a polynomial-bounded analogue of Kleene's Arithmetic Hierarchy becomes trivial if P=NP.
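Karp's remark is essentially the base case of the statement we would now make: if P = NP then the whole hierarchy collapses. A sketch in modern Σ-notation (which postdates his paper) goes by induction on the levels:

```latex
\text{Assume } \mathrm{P} = \mathrm{NP}. \text{ Then } \Sigma_1^p = \mathrm{NP} = \mathrm{P},
\text{ and inductively, for each } k \ge 1,
\Sigma_{k+1}^p = \mathrm{NP}^{\Sigma_k^p} = \mathrm{NP}^{\mathrm{P}} = \mathrm{NP} = \mathrm{P},
\text{ so } \mathrm{PH} = \mathrm{P}.
```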
Karp lists three NP problems whose classification was open at the time.
  • LINEAR INEQUALITIES: Basically Linear Programming, shown to be in P in 1979 by the recently departed Khachiyan.
  • NONPRIMES: Shown to be in P in 2002 by Agrawal, Kayal and Saxena.
  • GRAPH ISOMORPHISM: Not known to be in P but if GI is NP-complete then the polynomial-time hierarchy collapses which was shown in 1987 by combining results of Goldreich-Micali-Wigderson, Goldwasser-Sipser and Boppana-Håstad-Zachos.

Saturday, May 07, 2005

Ranking CS Departments

A few years ago, one ranking listed Harvard as the number one engineering school. The schools were ranked by average starting salary of their graduates and both of Harvard's engineering students that year got good jobs.

I could do several posts on rankings but let us focus on rankings of Ph.D. programs in computer science. One cannot give a linear ordering of departments: one department might have a strong theory group and another a strength in systems. Even within theory, one department could have strong algorithms while another has strong complexity. And even then, one's graduate experience depends more on one's individual advisor than on the department as a whole.

Nevertheless Americans like statistics and ordering things, including CS departments. There are two major rankings of Ph.D. programs. The US News and World Report ranking, last updated in 2002, uses a methodology of giving questionnaires to department chairs and directors of graduate study. The NRC ranking looks at a slew of different statistics but hasn't been updated since 1993. I hear rumors that the NRC will do a new ranking soon.

The US News ranking captures perception of strength instead of strength itself, often based on one's opinion of a department from a few years back. On the other hand, the purpose of rankings is perception, as we wish to be perceived as a strong department. We all complain about the rankings but they affect us greatly, in recruiting students (Americans especially use the rankings to choose a Ph.D. program) and recruiting faculty. When hiring faculty we often, rightly or wrongly, give extra weight to Ph.D.s from higher-ranked departments. Deans and provosts use the rankings to allocate resources, as higher rankings lead to more prestige for the university as well.

For example, if a mid-level department wishes to have the strongest quality faculty they should hire mostly in one area, as people like to join groups with people they know, respect and can work with. But this approach won't help much in the rankings so most departments try for a broad faculty with likely lower overall quality but a better ranking.

At least forty departments have a stated goal of being a top-ten department. The pigeonhole principle guarantees many won't end up happy.
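The guarantee is just a one-line pigeonhole count (the numbers below are the ones from the post; this is only a sketch):

```python
# Pigeonhole count: 10 top-ten slots shared among 40 aspiring departments.
aspiring, slots = 40, 10
left_out = aspiring - slots  # departments that cannot make the top ten
print(left_out)  # 30
```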

Friday, May 06, 2005

An Endless Frontier Postponed

Science has a special issue on Distributed High Performance Computing, including the article Service-Oriented Science by Chicago's own Ian Foster. The must-read is the editorial An Endless Frontier Postponed by Ed Lazowska and Dave Patterson.
At a time when global competitors are gaining the capacity and commitment to challenge U.S. high-tech leadership, this changed landscape threatens to derail the extraordinarily productive interplay of academia, government, and industry in IT. Given the importance of IT in enabling the new economy and in opening new areas of scientific discovery, we simply cannot afford to cede leadership. Where will the next generation of groundbreaking innovations in IT arise? Where will the Turing Awardees 30 years hence reside? Given current trends, the answers to both questions will likely be, "not in the United States."
Also from the CRA:
The timing of the issue also couldn't be better, given that the House Science Committee will hold a full committee hearing on "The Future of Computer Science Research in the U.S." on Thursday, May 12th. You can watch it live on the Science Committee's real-time webcast (also archived).

Wednesday, May 04, 2005


While other fields have standardized postdoc programs, computer science is still searching for the right approach to postdocs. We have had postdoc positions for as long as I can remember, but quite often researchers have gone straight from a Ph.D. to a tenure-track position, particularly in times of high growth in CS departments (the early-to-mid 80s and mid-to-late 90s).

We are now seeing a spike in the demand for postdoc positions for several reasons.

  • A tightening job market means fewer tenure-track jobs, so more people opt to do postdocs to build up their CVs. Several researchers are even taking second and third postdocs, until recently a rarity in CS.
  • Many students opt to delay a tenure-track position for a year and do a postdoc first. The commitment goes both ways: if a department is holding a position for a student, then that student is committing to going to that department, and it's not fair to go back on the job market during the postdoc year. Some departments are becoming more reluctant to allow the year's delay because of bad experiences with students not fulfilling that commitment.
  • More and more students attend graduate school in their home countries and hope to eventually return to permanent jobs in those countries but take postdoc positions elsewhere to get a broader view of the field.
Alas, we don't have an increase in postdoc supply to go along with this demand. Many of the industrial research labs have shrunk and hire few or no postdocs. Meanwhile most US academic departments have no permanent postdoc programs, and CS grants are rarely large enough to cover the cost of a postdoc (as opposed to Canadian and European groups, which tend to have more postdocs). Just another way the US is losing strong researchers to other countries.

Tuesday, May 03, 2005

Dilemmas of Prisoners and Professors

Some interesting game theory and philosophy from the last couple of NUMB3RS episodes. Usual spoiler warnings.

In the April 22nd episode Dirty Bomb there were three suspects who wouldn't talk. Charlie, the mathematician, likened the situation to the Prisoner's Dilemma and suggested putting the suspects in the same room, which is usually the wrong thing to do in a prisoner's dilemma. Charlie then computed each suspect's utility for cooperating (with each other, not with the FBI), based on family considerations and their previous records, and convinced the one with the most to lose by cooperating to defect and talk to the FBI. Clever, but I really wonder whether that would work in real life.
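For readers who haven't seen the game, the classic two-player Prisoner's Dilemma can be sketched in a few lines. The payoff numbers below are the standard textbook values, not anything from the episode:

```python
# Classic Prisoner's Dilemma payoffs (higher is better for the row player).
# payoff[(my_move, their_move)] = my payoff. Standard textbook values.
payoff = {
    ("cooperate", "cooperate"): 3,  # both stay silent
    ("cooperate", "defect"):    0,  # I stay silent, partner talks
    ("defect",    "cooperate"): 5,  # I talk, partner stays silent
    ("defect",    "defect"):    1,  # both talk
}

def best_response(their_move):
    """The move maximizing my payoff against a fixed opponent move."""
    return max(("cooperate", "defect"), key=lambda m: payoff[(m, their_move)])

# Defection dominates: it is the best response to either opponent move,
# even though mutual cooperation beats mutual defection (3 > 1).
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"
```

The dilemma is exactly that both players following this dominant-strategy reasoning end up worse off than if both had cooperated.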

Last Friday's episode Sacrifice took a more philosophical direction. A murdered think-tank computer scientist had been developing a program that measured academic potential based on where someone grew up, down to the city block. If such a program actually worked, how should it be used, if at all? How far should one go to stop the project?

Charlie and his physicist friend Larry ruminated on whether scientists are responsible for how their research gets used, and discussed the lonely life of a scientist at a lightly attended memorial service for the murder victim. The episode also had a physics joke I don't quite get.

Applied physicists are from Venus; Theoretical physicists wonder why it spins in the other direction.
I really enjoy those discussions between Charlie and Larry because they ask some interesting questions and add some dimension to the public view of mathematicians and scientists.

Monday, May 02, 2005

Leonid Khachiyan (1952-2005)

Leonid Khachiyan passed away Friday at the age of 52. Khachiyan was best known for his 1979 ellipsoid algorithm, the first polynomial-time algorithm for linear programming. While the simplex algorithm solves linear programs well in practice, Khachiyan gave the first proof of an algorithm efficient in the worst case. The ellipsoid algorithm also applies to more general convex programming questions and can be used to approximately solve semidefinite programs, used for example in the Goemans-Williamson algorithm approximating max-cut.
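The ellipsoid method itself is too involved for a quick sketch, but the underlying problem is easy to illustrate. Here is a toy linear program (my own instance, nothing from Khachiyan's paper), solved by brute-force vertex enumeration, using the fact that an LP optimum is attained at a vertex of the feasible region:

```python
from itertools import combinations

# Toy LP: maximize x + 2y subject to
#   x + y <= 4,  x <= 2,  x >= 0,  y >= 0.
# Each constraint is stored as (a, b, c) meaning a*x + b*y <= c.
constraints = [(1, 1, 4), (1, 0, 2), (-1, 0, 0), (0, -1, 0)]

def intersect(c1, c2):
    """Solve the 2x2 system where both constraints hold with equality."""
    (a1, b1, d1), (a2, b2, d2) = c1, c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None  # parallel constraint lines
    return ((d1 * b2 - d2 * b1) / det, (a1 * d2 - a2 * d1) / det)

def feasible(p):
    return all(a * p[0] + b * p[1] <= c + 1e-9 for a, b, c in constraints)

# The optimum of an LP lies at a vertex of the feasible polytope, so for a
# tiny instance we can simply enumerate pairwise constraint intersections.
vertices = [p for c1, c2 in combinations(constraints, 2)
            if (p := intersect(c1, c2)) is not None and feasible(p)]
best = max(vertices, key=lambda p: p[0] + 2 * p[1])
print(best)  # (0.0, 4.0), objective value 8
```

This exhaustive approach blows up combinatorially; simplex walks between vertices cleverly, and Khachiyan's contribution was a method that provably needs only polynomially many steps in the worst case.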
One of my first exposures to theoretical computer science came in high school when I read a New York Times article describing the algorithm. I later learned that the article has become our standard example of bad science writing, focusing on an unlikely link to NP-complete problems instead of just describing the important theoretical problem Khachiyan's algorithm does solve.
Mr. Khachiyan's method is believed to offer an approach for the linear programming of computers to solve so-called "traveling salesman" problems. Such problems are among the most intractable in all of mathematics…In the past, "traveling salesman" problems, including the efficient scheduling of airline crews or hospital nursing staffs, have been solved on computers using the "simplex method".
New York Times science writing (and science writing in general) has vastly improved since those days.

Sunday, May 01, 2005

Read this Weblog, Get a Job

One of the requirements for a software engineering job at Autodesk:
Good knowledge of common algorithms and data structures; understanding of computational complexity