Thursday, December 29, 2011

Complexity Year in Review 2011

Result of the Year goes to the new bounds on Matrix Multiplication by Andrew Stothers and Virginia Vassilevska Williams. It's not every year that we see progress on an important problem where we've had no progress since the 80's. Well there was last year. But not every year.

Some other notable results: Impagliazzo's relativized separation of Algorithmica and Heuristica, The Power of Simple Tabulation Hashing by Pătraşcu and Thorup, and Property Testing Lower Bounds via Communication Complexity by Blais, Brody and Matulef.

We celebrated Les Valiant's Turing Award. We remember Patrick Fischer, Philippe Flajolet, Steve Jobs, John McCarthy and Dennis Ritchie.

Thanks to our guest posters Daniel Apon, Lauren Cowles, Annie and Molly Fortnow, Nadia Jones, Samir Khuller, Ryan O'Donnell, John Rogers, Jeffrey Stein, Aaron Sterling and Anonymous.

2011 will go down as a year when computer science started changing the world (again). Watson won on Jeopardy. Social networks brought down dictators and congressmen. Obama opened a robotics center at Carnegie Mellon. My daughter's high school offered its first computer science class in a very long time, and Stanford's AI course went big time. The New York Times ran sections on Computer Science's Sputnik Moment and the Future of Computing. The director of Mathematical and Physical Sciences at the NSF declared "software is the modern language of science". Is there anything CS doesn't touch?

In 2012 we celebrate the man who started it all with a series of events celebrating the 100th anniversary of the birth of the father of computer science. I'll get it started next week talking on "Turing's Influence on Computational Complexity" during the AMS-ASL Special Session on the Life and Legacy of Alan Turing at the Joint Math Meeting in Boston.

Wednesday, December 21, 2011

Game Changers

Two announcements on Monday, each connected to one of my two alma maters, mark the changing face of universities.

New York City chose Cornell (partnered with the Technion) to build its new applied-sciences campus. It didn't hurt that Cornell got a $350 million donation and that its main competitor, Stanford, dropped out.

American universities have been creating campuses in foreign countries for some time now, like Northwestern's Qatar campus. It's great to see this happening in my own country: New York realizes the importance of technology and knows it must make these investments. May this lead other cities to build tech campuses the way they build sports arenas. Unlike sports, we can have many winners.

What's going to happen on this tech campus? Education, research, start-up incubators? Will there be a separate CS department in New York or just a branch of the one in Ithaca? I can find very few details on the web, though there is this cool fly-over.

Following up on Stanford's online courses, MIT is creating their own tools for teaching online courses and will share these tools with other universities. Those taking the courses may receive a certificate but will have to pay a small fee to do so and the certificate will not bear the MIT name. According to the FAQ "MIT plans to create a not-for-profit body within the Institute that will offer certification for online learners of MIT coursework. That body will carry a distinct name to avoid confusion. MIT awards MIT degrees only to those admitted to MIT through a highly selective admissions process." Nevertheless these courses will allow people to get access to great MIT courses at little cost.

In the 90's, newspapers decided they could better serve the public by putting their news stories online. How did that work out for them? Are universities starting down the same path today?

Monday, December 19, 2011

Romney vs. Aaronson

How are Mitt Romney and Scott Aaronson similar? Different?

How they are similar:
  1. Both live in Massachusetts. Actually, Scott lives there, but it's not clear where Mitt lives since he's been running for president for the last four years.
  2. Both, deep in their heart and soul, believe that Global Warming is a real problem.
  3. Both use money to make a point:
    1. Mitt tried to bet Rick Perry $10,000 that Mitt was never in favor of the individual mandate here.
    2. Scott blogged that if Deolalikar's proof of P ≠ NP is correct, Scott would give Deolalikar $200,000 here.
  4. Both were somewhat misinterpreted: Some thought that Scott was insulting Deolalikar. He was not. He was just expressing his certainty that the proof was not correct. Some thought Mitt showed he was out of touch with Middle Class Americans (who normally can't afford to casually bet $10,000 on anything). While Mitt might be out of touch, I think this was more a way to forcibly express that there is no evidence he was in favor of the individual mandate.
  5. Both Mitt and Scott seem to be right. Deolalikar's proof is no longer believed to be correct, and fact-checkers say that Mitt never supported the individual mandate.
  6. There exist people who say Scott is smart. There exist people who say Mitt is smart. I don't know if this means anything since there exist people who say Newt is smart.
  7. They both seem smarter than Michele Bachmann, Herman Cain, and, uh, uh, I can't think of the third candidate they both seem smarter than. Oops.

How they are different:
  1. Scott stands by his belief that Deolalikar didn't prove P ≠ NP. Mitt has no beliefs.
  2. Mitt can easily afford $10,000. Scott would have to struggle to raise $200,000.
  3. Mitt's bet made him look bad. Scott's offer made him look good. He put his money where his mouth is, unlike other bloggers who just asserted the proof was likely not correct.
  4. Mitt made a bet partially in jest; it was unlikely to really involve an exchange of money. Scott made a real offer; if Deolalikar's proof had been correct he really would have paid out.
  5. There is one of them that I would vote for. The other was once Governor of Massachusetts.
  6. Scott knows a bit more about Quantum Computing than Mitt.
I emailed this post (without this paragraph) to the Romney campaign, along with the information that I would post it on Monday, Dec 19, in case they had a comment. They did not respond. Perhaps Mitt is miffed about my saying Scott knows a bit more about Quantum Computing.

Thursday, December 15, 2011

Algorithmic Driving

In my post last week, my commenters took me to task over my prediction that cars will drive us in ten years. Some thought Americans would wise up and learn to love mass transit. They don't know Americans.

Others thought the hardware cost would remain out of reach even in ten years. But Google did not build an autonomous car by creating new hardware; it did so by harnessing and training good machine learning algorithms. No amount of hardware would have given you a car able to navigate the streets of San Francisco five years ago.

What hardware do you need for an autonomous car, beyond the car itself? A good camera, a GPS device, wireless Internet access, gigabytes of RAM and a fast processor. I carry all that in my pocket. Google does use other sensors, including lasers and radar, but as the algorithms get better, the cost of and need for this hardware can be reduced. Wiring the car to drive itself won't be difficult; already the steering wheel and pedals are mostly just a user interface to a computer that controls the car.

I have no doubts that technologically we will have autonomous cars in ten years adding at most a couple of hundred dollars over the cost of the car itself.

Other problems could get in the way. One is legal, but Nevada is already changing its laws to allow a testbed for autonomous cars in that state. Once the cars are viewed as safe, one would expect the law to expand and other states to open up as well.

The other issue is social. As with every technological change we will have the usual technological life cycle: Innovators willing to pay the big bucks to try stuff first, Early Adopters who love to jump on new technology (where I usually sit), the early and late majorities following the crowd and finally the laggards who still insist on manual transmission and pumping their own brakes.

There are other issues, like patents, and industries, like auto insurance companies, that will try to fight autonomous cars. But autonomous cars will be too much of a win, in terms of parking, fuel efficiency, shorter and more productive travel time, and most of all safety, not to prevail.

Tuesday, December 13, 2011

Solution to the reciprocals problem

In my last post I asked you to look at this problem, try to solve it, and tell me if you think it is too hard for a HS competition. (I meant to post this on WED, so I posted it at 12:06 AM East Coast time. I didn't know that the blog is on Chicago time, so it got posted on Tuesday. Oh well.) Here is the problem:
Prove or disprove: there exist natural numbers x1,...,x10 such that
  1. 2011 = x1 + ... + x10 and
  2. 1 = 1/x1 + ... + 1/x10.
Several solutions and some other points of interest about this problem are here. The answer is YES, and here are the solutions that I know of, both my own and the ones emailed to me. (The explanations of how they were obtained are in the paper pointed to above.)
  1. My Solution: 2,4,5,80,80,80,160,320,640,640. This used known theorems.
  2. Sam's Solution: 2,4,5,40,120,160,300,300,480,600. Sam is a HS senior who is very good at these contests. It took him 30 minutes.
  3. David Eppstein's Solution: 3,4,7,16,16,16,20,43,80,1806. This used a bit of advanced knowledge and some hand computations that were probably above what you want for a UMCP Math Competition.
  4. Matt Howell's Solution: 2,4,5,50,100,100,250,500,500,500. This solution could have been found by a HS student (it is similar to Sam's). Matt has a BS in Math and Engineering and did the problem in 20 minutes.
  5. An anonymous commenter sent me 6,6,10,10,12,15,15,15,62,1860. This solution could have been found by a HS student (it is similar to Sam's).
  6. Another from the same anon: 6,8,8,8,12,15,16,16,62,1860. This solution could have been found by a HS student (it is similar to Sam's).
  7. Mike Roman's Solution: 5,5,6,8,8,12,15,32,960,960.
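Solutions like these are easy to spot-check with exact rational arithmetic; here is a quick Python sketch (the helper name `check` is mine, not from the post) verifying the first two solutions above:

```python
from fractions import Fraction

def check(xs, total=2011):
    """True iff the xs sum to `total` and their reciprocals sum to exactly 1."""
    return sum(xs) == total and sum(Fraction(1, x) for x in xs) == 1

print(check([2, 4, 5, 80, 80, 80, 160, 320, 640, 640]))    # True (my solution)
print(check([2, 4, 5, 40, 120, 160, 300, 300, 480, 600]))  # True (Sam's solution)
```

Using `Fraction` instead of floating point matters here: the reciprocal sums must equal 1 exactly, not up to rounding error.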
So what to make of this?
  1. Much to my surprise, enough people got it right, and in ways that a HS student could have gotten it. And one did: Sam took the real exam.
  2. The systematic solution that I got required very little hand calculation. All of the others, including the ones I think a HS student could have gotten (or did get), required quite a bit of hand calculation. That may be a reason not to ask it.
  3. I wonder how many solutions there are. This could be figured out by a dynamic program, though there may be issues with large ints. (See later in this post.)
  4. I wonder if there are any that use distinct numbers. This can also be figured out by a dynamic program, though there may be an easier way.
  5. I wonder about the computational complexity of the following problems:
    1. Problem 1: Given a, b in N, do there exist x1,...,xa such that x1+...+xa = b and 1/x1+...+1/xa = 1? (Can also ask with the stipulation that the x's are distinct.)
    2. Problem 2: Given a, b in N and r in Q, do there exist x1,...,xa such that x1+...+xa = b and 1/x1+...+1/xa = r? (Can also ask with the stipulation that the x's are distinct.)

Possible dynamic program (more likely you would write it as a memoized recurrence: save every answer found and check whether you have already computed it, to avoid recomputation). Let f(a,b,c,r) (where a,b,c are naturals and r is rational) be the
number of solutions to x1+...+xa = b and 1/x1+...+1/xa = r where c ≤ x1 ≤ ... ≤ xa.
Note that
  1. f(1,b,c,r) = 1 if r = 1/b and b ≥ c, and 0 otherwise.
  2. f(a,b,c,r) = sum over x in {c, c+1, ..., min(b, floor(a/r))} of f(a-1, b-x, x, r-1/x). (Since x1 is the smallest, 1/x1 is the largest of the a reciprocals, so a/x1 ≥ r and hence x1 ≤ a/r.)
(NOTE: use f(a-1, b-x, x+1, r-1/x) if you want to enforce distinct solutions.)
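A minimal Python sketch of this recurrence, with memoization and exact rational arithmetic (the function name `f`, the pruning guard, and the test values are my choices, not from the post):

```python
import math
from fractions import Fraction
from functools import lru_cache

@lru_cache(maxsize=None)
def f(a, b, c, r):
    """Count tuples with c <= x1 <= ... <= xa, x1+...+xa = b,
    and 1/x1 + ... + 1/xa = r (r is a Fraction)."""
    if r <= 0 or b < a * c:  # reciprocals must stay positive; all a terms are >= c
        return 0
    if a == 1:
        return 1 if r == Fraction(1, b) else 0
    total = 0
    # x1 is the smallest term, so 1/x1 is the largest reciprocal: a/x1 >= r, hence x1 <= a/r
    for x in range(c, min(b, math.floor(Fraction(a) / r)) + 1):
        # for the distinct-xi variant, pass x + 1 as the new lower bound instead of x
        total += f(a - 1, b - x, x, r - Fraction(1, x))
    return total

print(f(3, 11, 1, Fraction(1)))  # 1: only 2+3+6 = 11 with 1/2 + 1/3 + 1/6 = 1
```

Calling f(10, 2011, 1, Fraction(1)) would count the competition's solutions, though the search space is large; using Fractions keeps the reciprocal sums exact where floating point would not.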

A math point: Ronald Graham showed that every n ≥ 78 can be written as a sum of natural numbers whose reciprocals sum to 1. The lower bound of 78 is tight: 77 cannot be so written. A sketch of a proof of this is in the file pointed to. (The proof I give is inspired by one of the comments on the blog. YEAH BLOG COMMENTERS!) (ADDED LATER: Ronald Graham actually proved that every n ≥ 78 can be written as a sum of DISTINCT natural numbers whose reciprocals sum to 1. The proof I presented in the pointed-to document just gives natural numbers, not necessarily distinct ones.)

Monday, December 12, 2011

Is this problem too hard for a HS Math Competition?

The Univ of MD HS Math Competition has two parts. Part I is 25 multiple choice questions in 2 hours (4 points for a correct answer, -2 for an incorrect answer). If you do well on it (the threshold changes from year to year) then you can do Part II which is 5 problems in 2 hours, 30 points each. The winner is the person who does best on the sum of the two parts.

This year I submitted a problem for Part II. The people on the committee who tried it couldn't do it, so we decided NOT to put it on the exam. I only knew the answer because I read a theorem and built a problem around it. The people who couldn't do it are very sharp, so since they could not do it, the problem was too hard. But... let's see what you think.

I would like YOU to try it without consulting any resources (and don't look at the comments; someone might post questions that lead to a hint, or the answer). Keep in mind that you can't use advanced techniques (I do not think they would help anyway). See if you can do it so I can get a sense of whether it really is too hard, and post your opinion on whether it's too hard for a HIGH SCHOOL math competition. Here is the problem:
Prove or disprove: there exist natural numbers x1,...,x10 such that
  1. 2011 = x1 + ... + x10 and
  2. 1 = 1/x1 + ... + 1/x10.
(ADDED LATER- A commenter thought that the xi's in the first and second condition could be different. They are not. We want x1,...,x10 that satisfy both of these simultaneously.)

I'll post a pointer to the solution next time I post (probably Wednesday). ADDED LATER: a commenter wants to know if there is a solution or not and can't wait until WED, and also wants to know whether, if there is a solution, it is constructive or just a proof of existence. To answer the question but NOT ruin it for others, I put it in a file you can click on (or NOT) here.

Thursday, December 08, 2011

A Great Time to be a Computer Scientist

Ask your friends if they'll be driving an electric car in ten years. The answer: No, cars will be driving us.

Today is the 105th anniversary of the birth of computing pioneer Grace Murray Hopper and Computer Science Education Week is being held this week in her honor. If only she could see what we have reaped from what she has sown.

The New York Times this week devoted Tuesday's Science Times to the Future of Computing. The section includes a series of essays from CS researchers including Daphne Koller, Stefan Savage, David Patterson, Kai-Fu Lee and fellow theory blogger Scott Aaronson. Scott also blogged about the experience. Though what would you do with a quantum computer on your desk? The number-factoring thing could get boring pretty quickly.

The Times also has a crowdsourced interactive timeline of future computing events. Try to guess which one I submitted. Almost all the advances listed are realistic and should keep CS very active for the long future. The most exciting future computer science advances will be the ones we haven't even thought of.

Finally the new class of ACM Fellows includes many theorists including Serge Abiteboul, Guy Blelloch, David Eppstein, Howard Karloff, Susan Landau, Joe Mitchell, Janos Pach and Diane Souvaine. A great year to be an ACM Fellow because the awards ceremony in June will follow a workshop celebrating the Turing Centenary featuring over 30 former Turing award winners.

Wednesday, December 07, 2011

What is a Breakthrough?

The recent discussion on Matrix Mult inspires the general question: WHAT IS A BREAKTHROUGH? Last year I tried to get an intelligent discussion going on this topic but failed: after stating some criteria I applied them in a silly way to several results, which derailed the discussion. My bad, my fault. SO, I'll try again to get an INTELLIGENT discussion on this topic. (NOTE: some of this post is a reworking of the old post.)

Here are some criteria. The first three are extracted from a comment Gowers made on Scott's blog. I do not know how many of these a result has to satisfy to be a breakthrough, or even if such a threshold makes sense. Perhaps some sort of weighted sum, but that would be hard to define.
  1. The result breaks a long standing barrier.
  2. The techniques introduce a fundamentally different method.
  3. The result is practical (or close to it).
  4. The problem being discussed is important. This may be a bit circular in that it then depends on What is Important?
  5. The result has to make substantial progress on the problem. This may depend on What is substantial? That notion may depend on how long the problem has been open.
  6. There has to be a reason why the problem was thought to be hard, e.g., a proof that a new technique is needed, the problem has been open for a long time, or people in the field say it's hard.
  7. A paper that STARTS an important field could be a breakthrough. Cook's Theorem and Valiant's PAC learning qualify here.
  8. A paper that FINISHES a field could be a breakthrough if the field is important. Again this may be a bit circular in that it then depends on What is Important?
  9. The techniques are new. One might end up debating what is new.
  10. The techniques can be used on other problems.
  11. The paper inspires other papers. For this criterion you need to wait a few years; some papers are not appreciated for a while.
The notions of important, substantial, new are not Boolean. As Gowers pointed out in his comment, the notion of breakthrough is not Boolean.

Do you have other criteria, examples, counterexamples, constructive criticism of my criteria, or anything else that is an intelligent contribution to this topic? If so, please post a comment!

Monday, December 05, 2011


On Saturday, Terrence Fine gave a talk on probability at a workshop at Northwestern. Before the talk he asked who thought probability was subjective (an individual's belief in the chance of an event) and who thought it was frequentist (a probability represents what happens when an experiment is repeated many times). Someone noticed I didn't raise my hand either time, so I said that I had a computational point of view of probability, since I have a computational point of view of everything.

I didn't mean computation as in Turing machines but as a process: a process that creates events according to some distribution. How does this process work? I don't care. That's the beauty of computational thinking: we abstract out the notion of probability and just make use of it. I've written papers on quantum computation having no idea of the physical processes that create entanglement. I study nondeterministic computation, which has no physical counterpart. Probability works the same way, at least for me.

My contribution to the workshop was to explain Kolmogorov complexity, the universal distribution and its relationship to inductive learning to the mostly economics crowd. Perhaps I could have explained things a bit more clearly as one econ student said to me afterwards "You lost me at prefix free". 

Friday, December 02, 2011

Analysis of Boolean Functions blog/book (Guest post by Ryan O'Donnell)


Lance and Bill have graciously let me plug my recently begun book/blog project, Analysis of Boolean Functions. I am writing a textbook on analysis of Boolean functions and serializing it on the blog as I go. When I'm done, the book will be available online; hopefully it will also be published in conventional format. (NOTE FROM BILL: it's also linked from our blog page.)

The topic is sometimes called Boolean Fourier analysis, though my perspective is a bit more from probability theory than harmonic analysis. I hope the book will be accessible and of interest to grad students and researchers in theoretical computer science and other areas of mathematics. Each chapter will end with a "highlight" illustrating the use of Boolean analysis in problems where you might not necessarily expect it. To give you a flavor of the contents, my planned list of highlights is:
  • Testing linearity (the Blum-Luby-Rubinfeld Theorem)
  • Arrow's Theorem from Social Choice (and Kalai's "approximate" version)
  • The Goldreich-Levin Algorithm from cryptography
  • Constant-depth circuits (Linial-Mansour-Nisan's work)
  • Noise sensitivity of threshold functions (Peres's Theorem)
  • Pseudorandomness for F_2-polynomials (Viola's Theorem)
  • NP-hardness of approximately solving linear systems (Hastad's Theorem)
  • Randomized query complexity of monotone graph properties
  • The (almost-)Polynomial Freiman-Ruzsa Theorem (i.e., Sanders's Theorem)
  • The Kahn-Kalai-Linial Theorem on influences
  • The Gaussian Isoperimetric Inequality (Bobkov's proof)
  • Sharp threshold phenomena (Friedgut and Bourgain's theorems)
  • Majority Is Stablest Theorem
  • Unique Games-hardness from SDP gaps (work of Raghavendra and others)
If you're interested, you can think of it like an online course -- I've been publishing Mondays, Wednesdays, and Fridays, and there are even exercises. (Also like a course, I'll go on break for a few weeks around the new year.) Come see why "juntas" are important, what hypercontractivity "really" means, and why functions on Gaussian space are the "easy special case" of Boolean functions...

PS: I'm using MathJax for the posts; if anyone has suggestions for me or for readers on how to make it look nicer or load better, please do let me know.