Friday, July 31, 2009

The Singularity

Last Sunday the New York Times ran a front-page article Scientists Worry Machines May Outsmart Man that read a bit like a bad science fiction story about the dangers of computers that get too smart. The article reported on the AAAI Presidential Panel on Long-Term AI Futures that took place at Asilomar in February with many reasonable AI folks including a few that I know well, EC regulars Michael Wellman and David Parkes, my old NEC boss David Waltz and TTI-Chicago chief David McAllester.
McAllester chatted with me about the upcoming "Singularity", the event where computers outthink humans. He wouldn't commit to a date for the singularity but said it could happen in the next couple of decades and will definitely happen eventually. Here are some of McAllester's views on the Singularity.

There will be two milestones.
  1. Operational Sentience: We can easily converse with computers.
  2. The AI Chain Reaction: A computer that bootstraps itself to a better self. Repeat.
We'll notice the first milestone in automated help systems that will genuinely be helpful. Later on computers will actually be fun to talk to. The point where computers can do anything humans can do will require the second milestone.

McAllester's arguments assume that humans are just fancy computers. Personally I believe we think in ways computers never can, in particular with a self-awareness and reasoning beyond the capability of machines, and so we'll never see a Singularity.

We are more than Turing machines.

Thursday, July 30, 2009

Are you a firstblocker or a lastblocker?

A type of advertisement I think is very odd is the following:
Be the first on your block to have NAME OF PRODUCT!
It might be more honest to say
Be the first on your block to have NAME OF PRODUCT! Be our beta testers! A year from now you can have a better version at a cheaper price!
What are the PROS and CONS of being a firstblocker (candidate for word-of-the-year) as opposed to being a lastblocker (another candidate)?
  1. Advantages of being a lastblocker: you get a cheaper, better version of the product. Or in some cases you realize that you don't want the product (e.g., Betamax, which was actually called that.)
  2. Disadvantage of being a lastblocker: when do you finally buy the product? When I was a kid the first Polaroid cameras came out. The innovation then was that they gave you the picture right away. It was in black and white, poor quality, and you had to shake it while it was developing. At the time I said I will wait until this is in color. Once that happened I said I will wait until it is better quality. Once that happened I said I will wait until you don't have to shake it. Once that happened I said I will wait until the price goes down. Once that happened I said I will wait until you can preview your shots before printing them out. Once that happened I said I will wait until you can do all of this on the computer. This has happened. Now I am waiting for the price to go down. Or the next innovation. Meanwhile, I have never owned a camera.
  3. Advantage of being a firstblocker: you get to tell your friends about it and advise them. Lance has done that with Kindle, KindleDX, and the ultimate wallet. This is what the advertisement had in mind.
  4. Advantage of being a firstblocker: There are times when the first model has some feature (or bug, according to the manufacturer) that you want. One example: an older VCR can tape off of on-demand; newer ones and DVD recorders have a problem with this.
  5. Some of this applies to seeing movies their first weekend--- you get to tell your friends about it, but you may see some really bad movies. I'm a lastlastblocker: I tape movies off TV, watch the first 15 minutes, and then decide if I want to finish it. (Ebert's Rule for Comedies: If you don't laugh in the first 15 minutes, you're not going to laugh in the last hour and 45 minutes.) If not I might go to Wikipedia to see how it ends.
Are you a firstblocker or a lastblocker? Of course, it may depend on the product.

Wednesday, July 29, 2009

QIP = PSPACE

A great new result in quantum complexity: Rahul Jain, Zhengfeng Ji, Sarvagya Upadhyay and John Watrous have shown that QIP is contained in PSPACE, solving a decade-old open problem. The Pontiff has already blessed this result but as someone who has studied both quantum and interactive proofs, let me give you my take. Steve Fenner gives a mostly non-technical outline of the proof below.

QIP stands for quantum interactive proofs. That PSPACE sits in QIP follows from IP = PSPACE and the fact that quantum interactive proofs can easily simulate classical ones. QIP = PSPACE implies IP = QIP, so classical interactive proofs can also simulate quantum ones. Amazing.

Older papers by Watrous and by Kitaev and Watrous showed two already amazing results:
  1. QIP can be simulated by 3-round quantum interactive proofs.
  2. 3-round quantum interactive proofs can be simulated in exponential time.
The second result was proven by creating an exponential-sized semidefinite program that captures the acceptance probability of these proof systems.

In classical interactive proofs, if we could simulate unbounded rounds with bounded rounds then PSPACE would collapse into the polynomial-time hierarchy (highly unlikely). So while we now know classical interactive proofs can simulate quantum proofs, they will need many more rounds to do it.

In an upcoming FOCS paper, Jain, Upadhyay and Watrous show that two-round quantum interactive proof systems can be simulated in PSPACE. QIP in PSPACE uses techniques from all these papers.

Still open: The power of multi-prover quantum interactive proofs where the provers share prior entanglement but otherwise can't communicate. Classically we have MIP=NEXP (nondeterministic exponential time). Does QMIP=MIP? Neither containment is known.



Here is Steve Fenner's outline of the proof. 

The proof that QIP is in PSPACE is in three distinct parts:

  1. Reduce a given QIP protocol P to an equivalent semidefinite programming problem S. In other words, find a semidefinite programming problem S where the Verifier's optimal acceptance probability is the optimal solution to S.
  2. Describe a high-level decision procedure for P based on computing a solution to S that is close to optimal.
  3. Show that this numerical solution can be approximated to sufficient accuracy in PSPACE.

The proof relies in a crucial way on an earlier result of Marriott and Watrous (building on Kitaev and Watrous) that any QIP protocol can be simulated by a much simpler QMAM (quantum Merlin-Arthur-Merlin) protocol. In this protocol, which is a three-round quantum version of a standard Arthur-Merlin protocol, Merlin is an all-powerful quantum process and Arthur is a polynomial-size quantum circuit. It runs as follows:

  • Merlin gives Arthur some quantum state r.
  • Arthur flips a *single* coin and sends the result to Merlin.
  • Merlin gives Arthur another quantum state s in a different register.
  • Arthur performs one of two possible measurements, based on the value of his previous coin flip. He accepts if the result of the measurement is 1 and rejects otherwise.

The proof actually starts with this equivalent QMAM protocol and turns it into a semidefinite programming problem in Part 1, above.

Every semidefinite program has a primal and a dual formulation. The primal is a maximization problem, and the dual is a minimization problem. In Part 2, the authors give an iterative process that finds better and better feasible solutions to both formulations simultaneously. Each solution approaches the optimal one, the primal from below and the dual from above, giving better and better bounds in both directions. The optimal value is Arthur's maximum acceptance probability over all possible behaviors of Merlin. They show that after a certain bounded number of iterations, the solutions are close enough to the optimal to correctly determine the outcome of the QMAM protocol.
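To get a feel for that primal/dual sandwich, here is a toy illustration of my own in Python, not the authors' algorithm (which works with the exponentially large SDP coming from the QMAM protocol). Take the simplest SDP: maximize ⟨C,X⟩ over positive semidefinite X with trace 1, whose optimum is the largest eigenvalue of C. The "Gibbs state" exp(bC)/Tr exp(bC) is always primal feasible, so its value bounds the optimum from below, while (1/b) log Tr exp(bC) bounds it from above; increasing b tightens both sides.

```python
# Toy illustration (not the paper's algorithm): sandwich lambda_max(C), the
# optimum of  max <C,X>  s.t.  Tr(X)=1, X >= 0,  between a primal feasible value
# and a dual-style upper bound, both computed from the spectrum of C.
import numpy as np

def sandwich_bounds(C, b):
    """Return (lower, upper) bounds on lambda_max(C) at 'inverse temperature' b."""
    w = np.linalg.eigvalsh(C)                    # eigenvalues of the symmetric matrix C
    shifted = np.exp(b * (w - w.max()))          # shift by w.max() for numerical stability
    X_eigs = shifted / shifted.sum()             # eigenvalues of the feasible Gibbs state X_b
    lower = float(np.dot(w, X_eigs))             # <C, X_b>  <=  lambda_max(C)
    upper = w.max() + np.log(shifted.sum()) / b  # (1/b) log Tr exp(bC)  >=  lambda_max(C)
    return lower, upper

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
C = (A + A.T) / 2                                # a random symmetric matrix
print("true optimum (largest eigenvalue):", np.linalg.eigvalsh(C).max())
for b in (1, 4, 16, 64):
    lo, hi = sandwich_bounds(C, b)
    print(f"b = {b:3d}   primal lower bound = {lo:.4f}   dual upper bound = {hi:.4f}")
```

In the actual proof the matrices act on exponentially many dimensions, which is why Part 3's observation that the needed matrix operations parallelize, and hence fit in polynomial space, is essential.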

Finally, in Part 3, the authors argue that the iterative process in Part 2 can be implemented efficiently to reasonable accuracy in parallel, using polynomial space. This follows from the fact that basic matrix operations (especially exponentiation and spectral decomposition) can be done in the class NC of polynomial-size, polylogarithmic-depth circuits. Here the circuits work on exponential-size data (in the number of qubits), yet everything can still be executed using polynomial space.

One last comment: The proof improves upon and uses some techniques from a recent result of Jain, Upadhyay, and Watrous that two-message quantum interactive proofs are in PSPACE (to appear in FOCS 2009). The same basic technique was also used in a paper by Jain and Watrous in the most recent CCC conference, showing that another quantum class is contained in PSPACE.

Tuesday, July 28, 2009

Whither to Post, Tweet or Facebook?

On the blog we have a straightforward posting policy: Bill and I try to make one post on nearly every work day. Occasionally we'll have extra posts for breaking news, mostly deaths and truly major new theorems. But occasionally we have short announcements or pointers. Sometimes we would pad a short item with an extra observation, or I would just have a post with a long list of short items.

Right after STOC I started experimenting with Twitter and quickly got hooked. With no fixed schedule I can give quick tweets on anything that interests me or I think important for the community to know. Sometimes I can get what would have been a post down under Twitter's 140 character limit. So now I use the blog for longer posts and tweet the shorter stuff, using Twitter as an extension to the blog. 

Sounds ideal but there's a catch. Only a small fraction of you blog readers also follow me on Twitter. Some of what I tweet would have been in the blog. So sometimes I need to repeat a tweet as a longer post or occasionally post a list of the more important and/or interesting tweets. What an excuse to do that now.

  • The new and improved ECCC 
  • Just told that for recent stimulus-based grants: If it is determined that you are not spending quickly enough the funds can be revoked.
  • Scan (11MB PDF) of shirt design from old Complexity conference (then Structures). Don't ask about the dog.
  • Netflix contest over but no winner for a few weeks. Exciting conclusion (via:@ipeirotis)
  • RT @statpumpkin: Bellkor's Pragmatic Chaos first place in Netflix Prize - contacted by Netflix and in validation phase. (thx @piggymurph)
  • Tron returns and NYT worries about computers taking over. Coincidence?
  • Iran has stopped issuing visas for US citizens including academics. The world got a little smaller.
  • Rare CS article in Science Times: Destroying data with cryptography. 
  • Faded lines on Beamer slides only draw attention to themselves (e.g. pg 2 of this)
  • Congrats to Rafael Pass and Northwestern's Nicole Immorlica, new Microsoft Research Faculty Fellows.
  • Comic Sans: Yes it looks handwritten. Cute the first 100 times I've seen it. Now go use a grown up font. (See also bancomicsans.com)
  • Awesome EC Keynote by Michael Moritz on innovation. "Companies wither when scientists and engineers move further away from the helm"
  • Moritz: Many best and brightest mathematicians and scientists waste their lives working for private equity houses.
  • And from June 3rd: Taught Moser's proof in my intro theory course today! (A comment worthy to be my first twitter post).

Maybe I can talk Bill into Tweeting too. Imagine non-stop van der Waerden numbers. 

Now what about Facebook? In Facebook I can break my friends into roughly three groups.

  1. Academics
  2. Friends and Family
  3. People I haven't seen in 25 years.

Problem is I have little to say to all three groups so I don't change my Facebook status often. Basically a few status updates about where I am traveling or some photos of the family. Facebook allows you to create groups of friends and show updates only within those groups. I need the opposite operation, creating status updates that only go out to a select group or groups. I'll likely use Facebook for my personal stuff, if you are in groups 1 or 3 and hide or unfriend me I won't be insulted.

Monday, July 27, 2009

Why go to conferences?

  1. To find out about papers that might spark my interest and lead to the following:
    1. Papers. I saw Moser's paper at STOC 2009 and I already have a paper in preparation based on it. By contrast I saw the Chandra-Furst-Lipton paper on Multiparty Communication Complexity at STOC 1983 and wrote a paper based on it in 2005 (it's in MFCS 2005).
    2. Surveys. I saw a talk on PIRs at STOC 2000 which led to my interest in them, my survey, and my website on them.
    3. Topics that I want to read up on. In CCC09 the paper on pointer-jumping pointed me to material I want to read up on.
    4. Ideas for projects for students, ideas for homeworks.
  2. To find out random stuff in the hallways. At CCC2009 Scott Aaronson told me about a quantum proof that PP is closed under intersection.
  3. To ask about stuff in the hallways. E.g., At CCC2009 I had some questions on derandomization that I asked around about.
  4. During talks I don't understand I sometimes get other work done.
  5. During talks where its too warm I get to take a nice nap.
  6. I do not log on when I am at conferences. Really! I'm always curious how many emails I will get. This time I was gone for 12 days and came back to 289 emails. Why so few? Because lots of email is in response to email that you send out, so the less you send out the less you get. Of the 289 about 20 were still relevant. I did miss a chance to get $1,000,000,000 from some British Bank. Oh well.
Some say that you learn MORE from the hallways than the talks. And it is true that you can just read the talk later. But there is still something about seeing someone give a good inspiring talk that makes you want to follow up and give the paper a more careful read. The key is to maintain this feeling once you are home.

Friday, July 24, 2009

Time for Computer Science to Grow Up

The August CACM has my viewpoint article Time for Computer Science to Grow Up about the urgent need for conference reform.
Our conference system forces researchers to focus too heavily on quick, technical, and safe papers instead of considering broader and newer ideas. Meanwhile, we devote so much of our time and money to conferences where we present our research that we can rarely attend the conferences and workshops where we work and socialize with our colleagues.

Computer science has grown to become a mature field where no major university can survive without a strong CS department. It is time for computer science to grow up and publish in a way that represents the major discipline it has become.
I argue that computer science uses conferences to play the reputation role that journals play in other fields, but then conferences no longer focus on their more important role of bringing our community together.

You can also download the pre-publication PDF.

Update 8/3: The editors of CACM have made the full text of the CACM article publicly accessible. 

8/7: Collection of related blog posts and other links. Feel free to send me others.

Thursday, July 23, 2009

Patenting P=NP

The book Patent Failure by James Bessen and Michael J. Meurer notes that considerably more money is spent on litigation defense for technology patent infringement than is collected in royalties, and discusses various issues and potential solutions.

From Chapter 9 (PDF) on Software Patents
Patent law assumes that once the words are mapped to a specific set of technologies, one can readily determine which technologies are equivalent and which are distinct. However, for software, this assumption is not always true. Computer algorithms may be equivalent, but computer scientists may not know that they are equivalent. In many cases, it has taken years for them to discover that different techniques are equivalent. For example, it has been shown that the “traveling salesman” problem, which is used for routing delivery trucks among other things, is more or less equivalent to the “map coloring” problem and a whole range of other problems. This means that an algorithm for solving the traveling salesman problem is also, if worded broadly enough, an algorithm for doing map coloring. In other cases, computer scientists suspect that algorithms are equivalent, but they are unable to prove the equivalence.
More than suspect: we can actually construct two equivalent algorithms whose equivalence cannot be proved in a given mathematical theory (assuming the consistency of that theory). In other words, it is theoretically impossible to determine whether one has software patent infringement in general. So if you are writing some code, it can be impossible to determine whether you are infringing on an earlier patent. A theoretical justification for "Patent Failure".
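To make the quoted equivalence concrete, here is a minimal sketch (mine, not the book's) of the standard reduction idea: decide 3-colorability of a graph by translating it into a Boolean satisfiability instance and handing that to a SAT routine, here just brute force. Any SAT algorithm, "worded broadly enough", is thereby also a graph-coloring algorithm.

```python
# Sketch: graph 3-coloring via reduction to SAT, solved here by brute force.
# Any genuine SAT algorithm could be dropped in for brute_force_sat.
from itertools import product

def coloring_to_cnf(n, edges):
    """Variable (v, c) means 'vertex v gets color c'. Returns a list of clauses,
    each clause a list of (variable, polarity) literals."""
    cnf = [[((v, c), True) for c in range(3)] for v in range(n)]          # every vertex gets a color
    cnf += [[((u, c), False), ((v, c), False)]                            # neighbors never share a color
            for u, v in edges for c in range(3)]
    return cnf

def brute_force_sat(variables, cnf):
    for bits in product([False, True], repeat=len(variables)):
        assign = dict(zip(variables, bits))
        if all(any(assign[var] == pol for var, pol in clause) for clause in cnf):
            return assign
    return None

def three_color(n, edges):
    variables = [(v, c) for v in range(n) for c in range(3)]
    assign = brute_force_sat(variables, coloring_to_cnf(n, edges))
    if assign is None:
        return None
    return {v: min(c for c in range(3) if assign[(v, c)]) for v in range(n)}

print(three_color(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))                    # 4-cycle: colorable
print(three_color(4, [(u, v) for u in range(4) for v in range(u + 1, 4)])) # K4: None
```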

What about the other part of this paragraph? Lipton might disagree, but we aren't in any danger of someone having a legitimate efficient algorithm for TSP or Coloring to patent. But suppose that someone managed to have such an algorithm and patented it as solving all NP-complete problems. For 17 years, only that person and his/her licensees could efficiently solve all sorts of problems while the rest of us toil away in exponential time. Which of Russell's worlds does this correspond to, Algorithmica Hell?

Wednesday, July 22, 2009

California Nightmare

I received a tiny raise for the next academic year but at least it is positive. Many of my academic colleagues have salary freezes or small cuts, and many public institutions are having unpaid "furloughs", which for professors means the same amount of work for less pay. But nothing compares to how the best state university system in the country is being hit by the California budget crisis. UC professors are taking salary cuts of 8% or more, are being told their retirement accounts are not fully funded, and hiring is being curtailed. From the middle of a news analysis piece today in the New York Times on the California budget:
Classroom sizes are about to explode, and state universities are furloughing professors, cutting class offerings and reassessing, in the case of the University of California, whether the system can remain one of excellence for residents. 
Luca calls it The end of UC Berkeley as we know it, and the many other strong campuses in the UC system will not fare well either.

I've seen vultures across disciplines lining up just getting ready to pounce on the best and brightest of the UC system. Just giving a UC faculty member what they previously had is a big advantage. Maybe the University of California can weather the storm, but we could see a drop from excellence. And that hurts all of us.

Tuesday, July 21, 2009

ECCC

One of the main topics of discussion at CCC last week centered on the problems with the Electronic Colloquium on Computational Complexity, now 15 years old and showing its age.

In 1994, ECCC truly innovated the way we discover research. ECCC has a large board of editors that in theory quickly accept submissions that reach a minimum quality level in computational complexity. Once ECCC hit critical mass it became the must-check place to see the latest results in complexity. I first heard about Omer Reingold's L=SL result from its ECCC post.

Roughly here is how ECCC works: Once submitted, anyone on the ECCC scientific board can look at the paper and either accept or reject through a clunky email interface. Once accepted the paper appears nearly immediately. Papers by board members are automatically accepted. Papers not acted upon after two months are automatically rejected. A weekly email goes out to the board listing new and about-to-expire papers. 
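For concreteness, here is that policy as I read the description above, rendered as a toy Python function. This is my paraphrase of the rules, not anything ECCC actually runs, and the 60-day figure is just my stand-in for "two months".

```python
# Toy restatement of the ECCC submission policy described above (my paraphrase).
from datetime import date, timedelta

TWO_MONTHS = timedelta(days=60)   # stand-in for "two months"

def eccc_status(submitted, author_on_board, editor_decision=None, today=None):
    """editor_decision is 'accept' or 'reject' if some board member acted, else None."""
    today = today or date.today()
    if author_on_board:
        return "accepted"                              # papers by board members are automatic
    if editor_decision is not None:
        return editor_decision + "ed"                  # 'accepted' or 'rejected' by an editor
    if today - submitted > TWO_MONTHS:
        return "rejected (timed out)"                  # nobody acted within two months
    return "pending (listed in the weekly email)"

print(eccc_status(date(2009, 5, 1), author_on_board=False, today=date(2009, 7, 21)))
```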

Most papers are either accepted immediately if an editor is interested in it, or right before it expires. But often papers fall through the cracks, or the email doesn't get sent and papers time out. Recently many reasonable papers did time out which led to the discussions at the Complexity Conference. Also the site hasn't been updated much and is sometimes down or not working properly.

The main alternative to ECCC is the Computational Complexity section of the Computing Research Repository (CoRR), now part of the arXiv maintained by the Cornell University libraries and thus usually more reliable than ECCC.

I subscribe to the RSS feeds for complexity papers on both ECCC and the CoRR. CoRR has a policy of accepting all papers in scope, including various P v NP "proofs", and most active complexity theorists self-select ECCC so only a few CoRR papers are interesting to me.

Scott Aaronson led a discussion at the CCC business meeting on what to do with ECCC, not that the Complexity conference has any active role in ECCC. There were many thoughts, including changing some of the policies, perhaps by assigning papers to editors or making acceptance instead of rejection the default, or turning ECCC into just an overlay over CoRR to increase reliability. Maybe we just need to find a way to make sure more submissions get processed.

As the number of researchers and papers in complexity has increased, we have come to rely on systems like ECCC to let us learn the latest important results without having to wade through the chaff. When these systems fail us, even in little ways, we feel lost. We can't fault the people who run ECCC; they have served us well for a decade and a half with little remuneration. But we need something, or we will have to go back to the old ways of learning about good papers from conferences and journals.

Update 7/27: ECCC has relaunched with a new and improved website.

Monday, July 20, 2009

The Moon and a Legend

It is my first memory of a major event. Forty years ago today, my brother and a not-quite-six-year-old me were in my grandmother's apartment in New York watching a small TV and seeing Neil Armstrong stepping out on the moon. At the time I couldn't appreciate the fulfillment of JFK's 1961 challenge and the technological achievements that made it happen.

Years later I read Jules Verne's "From the Earth to the Moon" from the perspective of a dream that had already happened. I once read in a sci-fi magazine that hundreds of stories were written about a future moon landing, but none of them had it televised live.

I vaguely remember that day so long ago, but in 1989 I watched the CBS rebroadcast of that day's events with Walter Cronkite. I had watched Cronkite report from the Ford and Carter years through that strange day that combined the inauguration of Ronald Reagan and the release of the Iran hostages. In the fall of 1980 I was asked in an interview by an MIT alum whom I wanted as president, not limited to the current candidates, and I proposed Cronkite, generally regarded then as the most trusted man in America. Choosing Cronkite possibly cost me admission to MIT but I still believe him a better choice than Carter, Reagan or John Anderson. My daughters would later know Cronkite as Benjamin Franklin in the show Liberty's Kids, which Cronkite helped develop to educate youngsters about early American history.

We lost Walter Cronkite on Friday, that great newsman I first remember seeing on that moon landing day. And that's the way it was, July 20, 1969.

Friday, July 17, 2009

Complexity Wrap-Up

I twittered about some interesting papers, the business meeting and various other impressions. Since I don't have much time to write up this post I'll just list my conference-related tweets here in chronological order.

  • At CCC in Paris. Arrived halfway through Ryan giving the talk on our paper. Paris attracted impressive group of researchers.
  • If a talk starts with a warning about simplifying, it won't be simple enough.
  • Steve Cook: Not surprised P vs NP taking so long to solve.
  • Manindra brings back the BH isomorphism conjecture to CCC. I had posted about it back in April.
  • Bill during Manindra's talk. He claims the room is too warm.


  • Prajakta Nimbhorkar gives last CCC talk of the day with log-space alg for planar graph iso. Nice application of L=SL.
  • CCC banquet at Macéo easily beats EC banquet at Google but we got caught in a hail storm on way back to hotel.
  • Scott talks quantum money with no intentional jokes or proofs.
  • Live Tweeting the CCC business meeting. Promised 30 mins/35 if discussion. 100+ attendees with 36 students. USA 28, France 18, Israel 11
  • Hastad PC report: 37 papers accepted out of 113 (32.7%). High because after STOC announcements. PC more fun in person maybe not crucial.
  • CCC 2010 in Boston at Harvard, June 9-12 after STOC and same as EC all in Cambridge. Dieter van Melkebeek PC chair. Salil Vadhan local.
  • CCC 2011 at FCRC (June 4-11) in San Jose. Again with STOC and EC as well as many others.
  • Discussion on paper v. electronic proceedings. Straw vote: 28 electronic vs. 11 paper. Steering committee will decide.
    • @geomblog replies: why do all theory confs have straw polls and let the steering committee decide ? let's be democractic or autocratic, not neither !
  • Scott brings up problems with ECCC, server often down, papers get delayed or rejected. I'll post more next week.
  • New steering committee chair: Peter Bro Miltersen. New members: Johan Hastad and Manindra Agrawal. Meeting over, lasted 50 minutes.

Thursday, July 16, 2009

Our First Complexity Vidcast

Live from the 24th CCC conference in Paris, Bill, Lance and special guests talk about our firsts in our first vidcast. I apologize for the audio quality.

Wednesday, July 15, 2009

Complexity Day 1

I arrived at the Institut Henri Poincaré in time to catch the last half of Ryan Williams giving a talk on our paper with Rahul Santhanam. I missed the first set of talks including Mark Braverman's Poly-logarithmic independence fools AC^0 circuits. To no one's surprise Braverman won the best paper award for this work. The Ronald V. Book Prize for Best Student Paper went to Akitoshi Kawamura for his paper Lipschitz Continuous Ordinary Differential Equations are Polynomial-Space Complete.

A good turnout, about a hundred for the conference, including many notables such as Steve Cook, Manindra Agrawal and of course Bill Gasarch. Eric Allender is here so we are now both at 24 and counting for perfect attendance at Complexity conferences. Great to see many friends, colleagues and former students, including Sophie Laplante, the main local organizer. Something about Paris draws people to the conference.

With my jet lag I realized I would just snooze through any afternoon talk so I had a pleasant time talking research in the Jardin du Luxembourg (near the conference site). Now I'm off to bed and hopefully well rested for day 2.

Tuesday, July 14, 2009

A Complexity Proposal

Tonight I fly off to Paris for the 24th IEEE Conference on Computational Complexity, the namesake conference of the blog. If the plane arrives on time I should just make it for my paper's presentation (which will be given by Ryan Williams in any case).

I (along with Eric Allender) will have attended all 24 conferences. I have had several papers in the conference, served on the PC including once as chair and a six-year stint as conference chair. But my most important Complexity Conference event had little to do with any of the above. 

Twenty years ago, I attended the 1989 Structures Conference (as it was called back then) at the University of Oregon between grad school and my first job at the University of Chicago. The outing consisted of a rafting trip and I landed in a raft with Toda and Razborov. Glad we didn't capsize.

But the moment came when I found a florist in Eugene. I sent a dozen red roses to my then girlfriend Marcy's work place back in Boston with a card that said "Will you marry me?"

We've been married nearly nineteen years.

Monday, July 13, 2009

ICALP

Ryan O'Donnell, reporting from last week's ICALP in Rhodes.

ICALP has a pretty large number of papers, divided into three tracks. Track A is roughly algorithms and complexity ("Amero-theory"?). Track B is roughly Logic, Semantics, and PL Theory ("Euro-theory"?). Track C, formerly cryptography, is now Foundations of Networked Computation; this seems to appeal most to the Track A people, and sometimes there were mixed A/C tracks.

Three highlights for me were:

  • Amin Coja-Oghlan's outstanding paper A better algorithm for random k-SAT, which quite deservedly won Best Paper in Track A. I was very sorry to miss the talk, the earliest in the schedule, as my hydrofoil from Bodrum did not get in till just after it finished. In this paper, Amin gives an O(n^{3/2})-time algorithm for random k-SAT which finds satisfying assignments for clause densities up to (ln k / k)·2^k. Much work has gone into this problem, and the previous best algorithm only worked up to density (1.8 / k)·2^k. Although the threshold for satisfiability is (ln 2)·2^k, Amin's work may in fact be "tight": his FOCS 2008 result with Achlioptas gives significant theoretical evidence that (ln k / k)·2^k may be the "algorithmic barrier". (See the quick numerical comparison of these densities after this list.)
  • The special session devoted to Papadimitriou. This proved to be a birthday Festschrift, although it was not advertised as such. Laci Lovasz gave it away (repeatedly) in his invited talk, and the rumour floating around afterwards was that Papadimitriou is indeed "nearly 60" (cf. this). The invited talks here were excellent: Karp, on algorithmic problems from biology; Lovasz, on graph limits; Nisan, on auctions; Roughgarden, on improved methods for analyzing the price of anarchy; and Yannakakis, on complexity-theoretic aspects of Papadimitriou's research. Normally these events are good for getting amusing anecdotes about the fest-ed person, but most stories were quite heartfelt ones about the influence of Papadimitriou on the speaker's work. Possibly the most interesting thing was learning that one of Papadimitriou's most notable contributions to TCS was convincing fellow PhD-student Yannakakis to switch from EE-style information theory to TCS.
  • Philip Bille and Mikkel Thorup's paper in which they made the first running-time improvement in 17 years on the problem of regular expression matching; they got it down from nm/log(n) to nm/log^{3/2}(n) (modulo log log factors). Naturally, Philip name-checked SLOGN in his very nice talk.
We also saw the Greek and Turkish national youth basketball teams at the conference lunches; I believe they skipped all the talks, though.
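Back to the random k-SAT highlight above: here is a quick numerical look at the three clause densities Ryan quotes, just plugging a few values of k into the stated formulas (my computation, nothing from the paper). Note the (ln k / k)·2^k bound only overtakes the old 1.8/k constant once ln k > 1.8, i.e., around k ≥ 7; the improvement is asymptotic in k.

```python
# Compare the clause densities quoted above for random k-SAT.
import math

print(f"{'k':>3} {'previous ~(1.8/k)2^k':>22} {'new (ln k/k)2^k':>18} {'threshold (ln 2)2^k':>21}")
for k in (3, 5, 8, 12, 20):
    prev = 1.8 / k * 2**k            # roughly where earlier algorithms stopped
    new = math.log(k) / k * 2**k     # density reached by Coja-Oghlan's algorithm
    thr = math.log(2) * 2**k         # (approximate) satisfiability threshold
    print(f"{k:>3} {prev:22.1f} {new:18.1f} {thr:21.1f}")
```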

More from Noam Nisan.

Pictures: view of hotel grounds; view of hotel beach; view of basketball players at the conference lunch.

Friday, July 10, 2009

Will the real Adam Smith please stand up, please stand up, please stand up

I thought about doing a post on ADVICE FOR GOING TO CCC 2009, but I recalled that my advice from a 2007 blog posting will suffice.

CONGRATS to Adam Smith and Sean Hallgren for being among the 100 people who won a Presidential Early Career Award for Scientists and Engineers (PECASE).

Adam Smith (the same one as above) recently began a blog focusing on CRYPTO issues. here it is

To quote Adam Smith:

It was started around the same time as Jon Katz's blog, also with the goal of providing the crypto community a place for online discussions. Here is my explanation of why it's needed: here

If you google Adam Smith you get around 4,700,000 hits. Most are, of course, to the economist or things associated with him (e.g., the Adam Smith Think Tank). However, the FIRST Google page does include our Adam Smith.

Thursday, July 09, 2009

TARK vs. EC

The 12th TARK and 10th EC conferences are both being held at Stanford this week, both look at questions relating economics and computer science, and they shared yesterday's keynote by Susan Athey. But that's where the similarities end.

TARK is held every other year. Next time in 2011 likely in Holland. Krzysztof Apt is the sole PC chair.

EC is held every year. In 2010 at Harvard, June 7-11 overlapping STOC and Complexity, both also in Cambridge. A mini-FCRC a year before all three are in San Jose for the real FCRC. Moshe Tennenholtz and Chris Dellarocas are PC chairs. Yiling Chen and Jenn Wortman are doing local arrangements. David Parkes is general chair.

TARK drew about 50 attendees, EC is drawing about 175.

TARK talks were held in a small classroom in the old GSB building. EC talks are being held in a large ballroom in the Alumni Center. The atmosphere encouraged questions after TARK talks and discouraged them at EC.

TARK has no industry support and few industry participants. EC gets money from several companies and has many people here from Google, Yahoo, Microsoft and others. Yahoo bought us lunch yesterday at EC, tonight's banquet is at the Googleplex.

EC has a new general chair every year. TARK has had one general chair: Joe Halpern.

TARK had traditional printed proceedings which are now available on the ACM DL even though it is not an ACM-sponsored conference. EC, sponsored by ACM SIGecom, has CD-only proceedings not yet available on the DL.

TARK focuses on models, most papers give logical models of knowledge-related concepts and discuss theorems but rarely give proofs. EC has many new models and mechanisms too but more of an emphasis on proofs in the theoretical papers.

EC has a continual identity crisis on broadness and applicability. TARK did change its name a few years ago (from Theoretical Aspects of Reasoning about Knowledge to Theoretical Aspects of Rationality and Knowledge) but is quite happy with what it is.

Wednesday, July 08, 2009

Speedup for Natural Problems (Guest Post)

Speedup for Natural Problems (Guest post by Hunter Monroe.)

Blum proved speedup for an artificially constructed problem; this paper (arXiv link), (ECCC link) presents two results on speedup for natural problems (not in the worst case). First, it identifies a condition, apparently only slightly stronger than P ≠ NP, which implies that accepting any coNP-complete language has an infinitely-often (i.o.) superpolynomial speedup, and furthermore that NP ≠ coNP. We define terms and then state the condition:

BHP = {<N,x,1^t> | there is at least one accepting path of nondeterministic TM N on input x with t or fewer steps};

HP={<N,x>| there is at least one accepting path of NTM N on input x (with no bound on the number of steps)};

T_M is the function that maps a string y to the number of steps M(y) takes.

The condition states:

(*) For any deterministic Turing machine M accepting coBHP, there exists (N',x') ∈ coHP such that the function f(t) = T_M(N',x',1^t) is not bounded by any polynomial.
(*) implies that any coNP-complete language has i.o. speedup. Let M be a TM that decides BHP. By (*) there is an (N',x') such that, for all t, (N',x',1^t) is NOT in BHP. Hence a TM that first checks whether the input is of the form (N',x',1^t), outputs NO if it is, and runs M if it's not, has a superpolynomial advantage over M on infinitely many inputs. (*) also shows that NP ≠ coNP, since Schnorr showed there is an optimal M accepting any NP-complete language.
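Schematically, the wrapper machine in that argument looks like this (a sketch in Python with made-up names, not anything from the paper): wrap an arbitrary decider M for BHP with a front-end check for the special family (N',x',1^t), which by (*) is never in BHP, so the wrapper may answer NO instantly and fall back to M everywhere else.

```python
# Sketch of the "wrapper" speedup argument: shortcut the hard-coded family, defer to M otherwise.

def with_shortcut(M, is_special_instance):
    """Return a decider that answers NO on the special family and defers to M elsewhere."""
    def faster(instance):
        if is_special_instance(instance):
            return False            # (N', x', 1^t) is not in BHP, for every t
        return M(instance)          # everything else: unchanged behavior
    return faster

# Hypothetical usage: M_prime agrees with M everywhere, but on the special family it
# skips whatever superpolynomial work (*) says M must spend there.
# M_prime = with_shortcut(M, lambda inst: inst[:2] == (N_prime, x_prime))
```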

Second, the paper shows that coBHP unconditionally has a weaker type of i.o. speedup. The intuition is that recognizing a nonhalting (N',x') is useful for an M accepting coBHP: M can then accept without reading the 1^t part of the input. However, no M can avoid reading the full input for all (N',x'), as such an M could be used to accept coHP, which is not computably enumerable.

In a similar spirit, the following conjectures suggest a nice linkage between the theories of complexity and computability: If there exists a polynomial-time M accepting a coNP-complete language (for instance coBHP), then M can be modified to accept a language that is not c.e. (for instance coHP). If M can factor integers in polynomial time, then M can be modified to accept the set of true arithmetic statements.

Tuesday, July 07, 2009

Racking Up the Frequent Flyer Miles

This week I'm at Stanford for the TARK and EC conferences, co-located for the first time. Next week I'm in Paris for Complexity. Also a shout out to ICALP in Rhodes, also being held this week, including a special day "to honor the mature period of Christos Papadimitriou contribution to CS". Happy maturity Christos!

First up, TARK, Theoretical Aspects of Rationality and Knowledge. TARK draws researchers from a broad range of fields (theoretical computer science, AI, economics, linguistics, psychology and philosophy) who try to model questions like what it means to know something, to be aware of it, or to be rational about one's decisions, and how these models affect the outcome of various interactions.

Much of my recent research looks at finding the right ways to bring efficient computation (from a complexity theorist's view) to economic models. My TARK papers, Program Equilibria and Discounted Computation Time and A Computational Theory of Awareness and Decision Making (with Nikhil Devanur), are two attempts in this direction.

This will be my first time tweeting conferences, in case you want to follow at least some of what's going on.

Monday, July 06, 2009

Pairs of Celebrities died/born the same day

Farrah Fawcett and Michael Jackson died the same day (June 25, 2009).
  1. Who is the most famous pair of people that died the same day? While it's hard to measure fame objectively, there are two candidates that come to mind. (1) If you give a bonus for being in very different fields then the answer could be Orville Wright and Gandhi (Jan 30, 1948). (2) If you have an American bias then perhaps John Adams and Thomas Jefferson (July 4, 1826). Possible bonus for July 4 being the day. (James Monroe also died on July 4, though in 1831. What is the probability that three of the US presidents died the same day? Prob = 1, since they did.)
  2. If you are famous and want people to notice your death, here are some tips:
    1. DO NOT die the same day as someone more famous. Example: Farrah Fawcett overshadowed by Michael Jackson.
    2. DO NOT die close to the day someone more famous dies. Examples: (1) Ed McMahon died the day before Michael Jackson. (2) Mother Theresa died a few days after Princess Di (It is likely that Mother Theresa wanted to not have a big deal made of her death, so perhaps this was a win for her.)
    3. DO NOT die the last week of the year. James Brown and Gerald Ford both died the last week in 2006. By that time all the lists of famous people who died in 2006 were already made. So James Brown and Gerald Ford never made those lists.
  3. Who were the most famous pair of people that were born the same day? While it's hard to measure fame objectively, a good candidate pair is Abe Lincoln and Charles Darwin (Feb 12, 1809). Others of interest (from this website): George W. Bush and Sylvester Stallone (July 6, 1946) (one is a pretend war hero, the other starred in the ROCKY movies); Sigmund Freud and Robert Peary (May 6, 1856); Winston Churchill and Lucy Maud Montgomery (Nov 30, 1874) (I didn't know who Montgomery was. Either fame is fleeting or I am not up on my writers of books for young girls: she wrote Anne of Green Gables); Ariel Sharon and Fats Domino (Feb 26, 1928).

Friday, July 03, 2009

FOCS Papers

For this pre-Independence Day post Bill suggested I write about the math knowledge of our founding fathers or how "P=NP" will declare itself independent of ZFC. Luckily FOCS announced the accepted papers so I can talk about that instead. First the lists.


PC Chair Dan Spielman discusses the review process, though he doesn't talk about the usefulness of the 2-page addendum or the criteria used to choose the papers, which seem to have tended again toward the technical.

Here are a few of the many interesting looking papers that caught my eye.

  • David Doty. Randomized Self-Assembly for Exact Shapes
    • You just don't see many STOC/FOCS papers on self-assembly or from Iowa State.
  • Ilias Diakonikolas, Parikshit Gopalan, Ragesh Jaiswal, Rocco Servedio and Emanuele Viola. Bounded Independence Fools Halfspaces
    • I like results that just sound neat and can be fully stated in the title.
  • Zeev Dvir, Swastik Kopparty, Shubhangi Saraf and Madhu Sudan. Extensions to the Method of Multiplicities, with applications to Kakeya sets and Mergers
    • And better randomness extractors too.
  • Or Meir. Combinatorial PCPs with efficient verifiers
    • Sounds interesting and the only FOCS paper that cites one of my papers in the abstract.
  • Rahul Jain, Sarvagya Upadhyay and John Watrous. Two-message quantum interactive proofs are in PSPACE
    • QIP is known to sit in between PSPACE and EXP and it's an interesting open question which way it goes. This paper makes some progress in resolving that.
  • Neeraj Kayal and Shubhangi Saraf. Blackbox Polynomial Identity Testing for Depth 3 Circuit
    • Makes progress on deterministic polynomial identity testing, one of the few remaining probabilistic algorithms to derandomize.
  • Ravindran Kannan. A new probability inequality using typical moments and concentration results
    • Looks like something we will all need to add to our toolboxes. The accepted papers list dropped the word "inequality" which made the paper seem unreal.
  • Alexander Sherstov. The Intersection of Two Halfspaces Has High Threshold Degree
    • An exciting result in its own right. The Complexity class analogue goes the other way: Beigel, Reingold and Spielman showed the class PP is closed under intersections.

Thursday, July 02, 2009

Learn PCPs at DIMACS

Prahladh Harsha asked me to post the following announcement on the blog. I don't usually post announcements but this seems like an excellent opportunity to learn about approximation and PCPs from the masters. You can find many more links and short announcements on my Twitter feed, where you would also have learned that the FOCS accepted papers list is out. My thoughts on the FOCS papers tomorrow.
  


DIMACS Tutorial on Limits of Approximation Algorithms: PCPs and Unique Games

When: July 20 - 21, 2009 
Where: DIMACS Center, CoRE Building, Rutgers University

I would like to advertise an upcoming tutorial on "Limits of Approximation Algorithms: PCPs and Unique Games", organized under the auspices of the DIMACS Special Focus on Hardness of Approximation. This tutorial is geared towards graduate students, postdocs and others who are theoretically oriented, but not necessarily familiar with the material.  The aim of the tutorial is to give participants a general overview of approximability, introduce them to important results in inapproximability, including some of the recent developments in the world of probabilistically checkable proofs (PCPs) and the unique games conjecture. 

The list of speakers includes: Matthew Andrews (Alcatel-Lucent Bell Laboratories), Sanjeev Arora (Princeton University), Moses Charikar (Princeton University), Prahladh Harsha (University of Texas, Austin), Subhash Khot (New York University) and Lisa Zhang (Alcatel-Lucent Bell Laboratories).

Registration is free and limited travel support is available for non-local participants (with preference to students and postdocs). More info on the workshop web site.

Wednesday, July 01, 2009

2pi-day? Other holiday possibilities!

June 28 should be Pi-day instead of March 14, since pi should really be what we now call 2*pi (6.28...): 2*pi comes up in more formulas than pi does. (When I blogged about this here, one of the commenters suggested 2*pi*i.)
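For what it's worth, here are a few standard formulas where the factor 2*pi, rather than a bare pi, is the one doing the work (the Fourier transform convention shown is one of several):

```latex
C = 2\pi r, \qquad
e^{2\pi i} = 1, \qquad
\int_{-\infty}^{\infty} e^{-x^2/2}\,dx = \sqrt{2\pi}, \qquad
\hat{f}(\xi) = \int_{-\infty}^{\infty} f(x)\,e^{-2\pi i x \xi}\,dx, \qquad
n! \sim \sqrt{2\pi n}\left(\frac{n}{e}\right)^n
```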

So what-should-be-pi-day was last Sunday. To honor this day I asked people what day or concept they would want to make a holiday. Here is what I got.
  1. Multicultural day. OR give every ethnic group a holiday. This may lead to the 4-day work week.
  2. Mid-autumn day to give us a break.
  3. Pi-Day (March 14)
  4. Mole Day (Oct 23)
  5. Talk like a pirate day (Sept 19.) Did pirates really talk that way in the past? now?
  6. Election day. That way more people will vote.
  7. The day Louis Brandeis got appointed to the supreme court. He was the first Jewish member of the court. This could be a way to celebrate the decline of anti-semitism in America.
  8. Peace Day. Could celebrate the end of some war. But there is always the next war. Oh well.
  9. The day women got the vote. (Aug 26, 1920). To quote Hail to the Chiefs, with regard to the election of Warren G. Harding as president in 1920: "It was the first election women voted in. They needed more practice."
  10. The day the civil rights act passed. Actually there were many civil rights bills passed, so you'd have to pick which one to celebrate. Or celebrate all of them and get a 4-day work week.
  11. Groundhog Day (Feb 2). The movie Groundhog Day has a higher Google rank than the day does.
  12. Chocolate day.
  13. Moon day. They have Earth Day so why not Moon Day?
  14. Children's day. We have Mothers and Fathers day, so why not Children's day?
  15. St. Patrick's day. Or the day after to get rid of your hangover.
  16. Teacher's day. Would teachers get this day off?
  17. Weird Al's birthday (10/23). Since it's the same as Mole Day we can combine them to get Weird Mole Day.
Let's say they made your choice a holiday. And then there was a movement to make it, instead of the actual day, the 2nd Monday of the month it was in (or something like that). Would you mind? On the one hand, your holiday is getting made into just a day off. On the other hand, you get a 3-day weekend!

Veterans day would have been the 2nd Monday in November except that some veterans protested. Or did they? Maybe it was groups that claim to represent them. I wonder if veterans prefer 3-day weekends or prefer having their holiday be on the day WW I ended. What is more important: Efficiency or meaning?

Some days would be hard to move. July 4, Cinco de Mayo and Pi-day are rather tied to the day they are celebrated. Even so, the government (and others) gives July 3 off if July 4 is on a Saturday.