Innovations in Computer Science 2010 (post #2)

Guest Post by Aaron Sterling

This is the sequel to my previous post on ICS 2010, the new theoretical computer science conference whose Call for Papers emphasized conceptual contributions and the opening of new areas of research. Before diving back into the technical program, though, I'd like to say one thing about travel to Beijing. I found it surprising that I spent almost three hours at the Beijing airport after I landed. Part of the delay was due to a long line at the immigration counter, but I also spent almost 45 minutes waiting for a taxi -- and I was waiting outside, in 15 degrees Fahrenheit. One attendee told me that he had waited for a taxi for well over an hour on Sunday (I arrived Monday), and he thought it was due to the fact that fewer taxi drivers work on Sunday. So if you fly to Beijing in the winter, dress warm and be sure to allow a lot of time between your flight's arrival and anything else you might schedule yourself to do.

I'll start by mentioning "Effectively Polynomial Simulations" by Pitassi and Santhanam. (Rahul may think I'm stealing his thunder by blogging about both of the papers he presented at the conference. Oh well. His talks just rocked, and I want to share a little piece of them. He must be one of the best speakers in all of complexity theory.) Motivated by consideration of SAT solvers as proof systems, Pitassi and Santhanam generalize the well-known notion of p-simulation of one proof system by another, to a reduction where polynomially much preprocessing can be performed on an input (i.e., set of clauses or tautology) before the simulating proof system simulates a proof of the tautology in the simulated proof system. (Intuitively, using this reduction captures what proof complexity has to say about the P?=NP problem, in much the same way as p-simulation captures what proof complexity has to say about the NP?=coNP problem: a proof obtained by effective p-simulation is a witness for the ability of a SAT-solver to find a satisfying assignment for the input with only polynomial time processing and preprocessing.) This technique clarifies and fleshes out the relationships between several well-studied proof systems. For example, Tree Resolution effectively p-simulates several other proof systems, such as Nullstellensatz and Polynomial Calculus, even though Tree Resolution does not p-simulate those systems.

Perhaps the most philosophically motivated talk was Elad Eban's "Interactive Proofs for Quantum Computations," co-authored with Aharonov and Ben-Or. Eban pointed out that D-Wave is claiming it will soon have a working 128-qubit quantum computer, and he asked the reasonable question: "How could we determine whether that machine is actually a quantum computer if we are limited to classical computation ourselves?" His answer was a new interactive proof system, where the prover can perform feasible quantum computation (i.e., BQP), but the verifier is limited to feasible classical computation (i.e., BPP). (Note that what I just said isn't quite right, because their proofs require that the verifier also have access to three qubits that he can send to the prover, and later check the state of. It's open whether the number of such qubits could be reduced, hopefully to zero.) So the prover -- that is, the D-Wave quantum computer -- needs to convince the classical verifier that a quantum computation did, in fact, take place. This classical-quantum IP system can be generalized so that the verifier is "playing" both Alice and Bob, and the prover is "playing" a quantum channel over which Alice communicates to Bob. This yields results about ensuring that quantum computations are performed correctly, even if the party that owns the quantum computer is not trusted.

The last paper I will summarize (and again, apologies to everyone else!) is "A New Approximation Technique for Resource Allocation Problems," by Saha and A. Srinivasan. I missed this talk, but the paper is cool, and the technique shares a theme with the new Strong Linear Programming method I mentioned in my previous post. Essentially, Saha and Srinivasan present a randomized rounding technique that is sensitive to the constraints satisfied exactly (i.e., with equality). Given an LP, and some point x in the polytope, perform a random walk starting at x in such a way that any constraint satisfied exactly by x remains satisfied exactly at every step in the random walk. Provably, the walk will end at a vertex that is closer to an optimal solution than guaranteed by previous methods. They use this technique to obtain improved results for several well-studied problems. Impressive, but my favorite part of the paper is the section on future work. Unlike most authors (including me), who just make general mention of open problems at the end, Saha and Srinivasan provide a detailed plan of attack to settle a matrix theory conjecture using their technique, and they discuss potential application to two other open problems as well.
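To give a concrete feel for constraint-preserving rounding, here is a toy Python sketch of my own (an illustration of the general idea, not the algorithm from the paper). It rounds a fractional vector in [0,1]^n to a 0/1 vector while keeping one equality constraint -- the coordinate sum -- exactly satisfied at every step of the walk, and it preserves each coordinate's marginal in expectation:

```python
import random

def dependent_round(x, seed=None):
    """Round a fractional vector x in [0,1]^n to {0,1}^n while keeping
    sum(x) exactly unchanged at every step.  Assumes sum(x) is an integer,
    so the equality constraint "sum = k" stays exactly satisfied
    throughout the random walk."""
    rng = random.Random(seed)
    x = list(x)
    while True:
        frac = [i for i, v in enumerate(x) if 0 < v < 1]
        if not frac:
            break
        # With an integral sum, fractional coordinates always come in
        # pairs, so picking the first two is safe.
        i, j = frac[0], frac[1]
        # Two candidate moves along the direction e_i - e_j; each keeps
        # sum(x) unchanged and makes at least one coordinate integral.
        up = min(1 - x[i], x[j])      # raise x_i, lower x_j
        down = min(x[i], 1 - x[j])    # lower x_i, raise x_j
        # Probabilities chosen so marginals are preserved:
        # E[change in x_i] = up * down/(up+down) - down * up/(up+down) = 0.
        if rng.random() < down / (up + down):
            x[i] += up
            x[j] -= up
        else:
            x[i] -= down
            x[j] += down
    return x

fractional = [0.5, 0.25, 0.75, 0.5]   # sum = 2
rounded = dependent_round(fractional, seed=1)
print(rounded)  # a 0/1 vector whose sum is still exactly 2
```

The walk terminates in at most n steps, since each move makes at least one coordinate hit 0 or 1. Saha and Srinivasan's method handles far richer constraint systems than this single-sum toy, but the invariant is the same: tight constraints stay tight along the walk.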

There was a two-hour discussion session following the last talk. It was divided into two parts: suggestions of possible research ideas, and comments about this year's and next year's ICS conference. Several well-known researchers spoke. Shafi Goldwasser gave a brief overview of recent innovations in cryptography, focusing especially on the results presented in ICS. Paul Valiant encouraged researchers to investigate biological evolution through an algorithmic lens, calling it "the ultimate algorithm." Ron Rivest said that innovation need not stop with the submissions; rather, the format of the conference itself could be innovative, accepting videos or other material, not just 10-page papers. Mike Saks synthesized several suggestions by saying there could be a rump session in which people present open problems by saying, "I tried to solve this problem by doing X, and I failed because I ran into the following problem."

(As an aside, Bernard Chazelle made the important point that an open problem session should not turn into a list of grievances. I strongly agree. There were a handful of comments in the discussion like, "The community needs to stop rejecting papers about topic X." I think that's a dangerous road to walk down, and, early on, some of the negative blog commentary about ICS was due to concerns that it might just be a way for certain people to get their pet projects published even though that work was not considered significant by the larger community. I saw no evidence of this kind of nepotism at ICS 2010 -- the papers all seemed to "belong there" on their own merits -- and I hope future conferences continue to maintain the high road.)

There were only three posters at the poster session. During the discussion, Cynthia Dwork encouraged students to prepare posters on their current work, and to submit them to ICS 2011. (She was on the ICS 2010 Program Committee, so I think graduate students worldwide should consider that suggestion to be a big phat hint.)

There were 39 accepted papers out of 71 submitted. Between jetlag and the lack of an excursion, many of the western-hemisphere attendees missed a lot of the talks. I attended about 30, because I took one afternoon off to see some of downtown Beijing and meet an online (now real-life) friend. Many people attended fewer. For example, I talked to one person who attended only 12-15 talks, because of sightseeing and needing to sleep during the day. Amazing as the banquets and entertainment were, the ICS 2011 organizers might consider investing those resources into some form of "tourist" excursion instead, to increase attendance at the technical program.

The conference atmosphere was a bit different from that of other conferences I have attended, because, if you were on site, you were expected to be in the lecture hall. There were no break-out rooms to my knowledge, and most of the informal contact took place at the coffee break, on the shuttle bus and during the opening and closing banquets. This strikes me as a matter of taste rather than a problem, and I don't feel strongly one way or the other about changing it. However, if the ICS 2011 organizers would like to encourage on-site research collaboration, a couple of conference rooms near the lecture hall would go a long way toward making that happen.

All of the talks were videotaped, and ITCS has copies of most, if not all, of the slides used in presentations. Andrew Yao said in the discussion that, if authors consent, he would like to make that material publicly available. I have emailed my consent to post both video and slides, and I would like to encourage other authors to do the same. I believe this conference is a great new resource, and, as such, should be widely shared.

ICS 2011 will be held once again at ITCS in Beijing. Bernard Chazelle will be the Program Chair, and a preliminary Call for Papers is already available. I'm extremely grateful to everyone who made ICS 2010 possible, and I am looking forward to the new approaches developed by "innovators" in the coming year.

Your posts are very good!

"I found it surprising that I spent almost three hours at the Beijing airport after I landed."

I found this surprising too. This was totally different in the mid-nineties. I lived in Beijing for more than a year then, traveling a lot, and I had no problem at all finding a taxi at any place, any day, any hour.

On the other hand this comment on your previous post

"I think everyone was pleased to be treated like a first-class scientist, instead of like someone doing 'impractical' things that are less worthy of support"

does not surprise me at all. China is a scholar-based culture, where the elites were devoted to study, for at least a millennium and a half (since the Tang dynasty). Since this is not a political blog we will not discuss other issues, as you suggest.

No link to Chazelle's talk or paper?

Computer scientists can't exactly be blind to the political situation in China. Breaking news today: Google will stop censoring its results and possibly pull out of China entirely because it seems the Chinese government has been hacking into the gmail accounts of human rights activists for some time now...

To take it a step further: if the computer science theory community helps discover attacks on encryption and privacy that could only be undertaken at enormous cost and on a large scale, is China the sort of place that would want to encourage these kinds of advances?


"No link to Chazelle's talk or paper?"

I wasn't able to find it online. I have the copy in the printed proceedings, which is how I was able to quote from it.

I really appreciate the question, because you're the first person to ask about (or comment about) the technical program of the conference. Thank you.

Where's the ICS 2011 call for papers?

Nice informative post. Seems that China and ITCS will become a prominent center for TCS from now on.

Thanks for your answer, Aaron. Because of the title and the author, I've been after a copy of this paper for a while, but could not find it yet.

About Elad Eban's "Interactive Proofs for Quantum Computations", I'm surprised that you wrote: "It's open whether the number of such qubits (that the verifier needs to manipulate) could be reduced, hopefully to zero." In independent work presented at FOCS last year, I showed with Fitzsimons and Kashefi (arXiv:0807.4154) that the verifier only needs to generate randomly polarized single photons. We also showed that in a multi-prover scenario, the verifier does not need any quantum power. The only question thus remaining is the "hopefully to zero" case that you mention: can the single-prover scenario be accomplished with a completely classical verifier?

Thank you, Anne Broadbent. I apologize for the omission. The Eban et al. paper does mention that you obtain "similar implications" using only one qubit, and then describes differences between their paper and yours. I wrote the post on the plane back from Beijing, so I had access only to my memories and the conference proceedings. I didn't understand how closely the results of the two papers were related.
