Friday, October 30, 2009

FOCS News

Last month I suggested that students should go to FOCS even if they didn't have a paper there. I doubt my post had much to do with it, but I heard FOCS did attract many students this year. However, there seemed to be fewer of the older participants, and in particular there was very little news from the blogosphere.

Some news beyond my student reports earlier this week: Lipton posted about the 50th FOCS day of talks. Shaddin Dughmi guest posts for Noam. Joe Fitzsimmons "pretends" to be a computer scientist and tweeted from FOCS. Shiva Kintali promises a post soon on open problems from FOCS talks.

Still sad to see the 50th FOCS with such limited coverage. So if you went, let me know how you enjoyed the conference. What were your favorite talks? Any controversies in the business meeting? What's the gossip? Inquiring minds want to know.

Thursday, October 29, 2009

Proud to be a Monomath

A girl told her father that history was much easier when he was in school. "Why?" he asked. She responded "Because there was so much less of it."

An old joke, but one that ran through my mind as I read the article The Last Days of the Polymath in the Autumn issue of Intelligent Life. That's polymath, not in Tim Gowers's sense of a massive collaborative effort to solve math problems, but in the more traditional sense of people who know a lot about a lot, people like Leonardo da Vinci and Ben Franklin. The article reminisces about an earlier time, when one could learn "a lot" about an area such as physics without needing to know all that much, basically what's covered in a first-year college sequence today. As Bill pointed out, we don't even have many polymaths in the sense of knowing a lot about a lot of math either.

Advances in knowledge have made it impossible to know a lot about a lot. Advances in communication and travel have made polymaths less important since we can now better pool our talents. I might only know a lot about a little, but there isn't much I can't find a lot about quickly.

Richard Posner's quote in the article caught my eye.
“Even in relatively soft fields, specialists tend to develop a specialized vocabulary which creates barriers to entry,” Posner says with his economic hat pulled down over his head. “Specialists want to fend off the generalists. They may also want to convince themselves that what they are doing is really very difficult and challenging. One of the ways they do that is to develop what they regard as a rigorous methodology—often mathematical.

“The specialist will always be able to nail the generalists by pointing out that they don’t use the vocabulary quite right and they make mistakes that an insider would never make. It’s a defense mechanism. They don’t like people invading their turf, especially outsiders criticising insiders. So if I make mistakes about this economic situation, it doesn’t really bother me tremendously. It’s not my field. I can make mistakes. On the other hand for me to be criticizing someone whose whole career is committed to a particular outlook and method and so on, that is very painful.” [Spelling Americanized]
We monomaths develop specialized vocabularies and mathematical tools and models because it helps us deeply understand an area. But much of what Posner says rings true. For example, I've seen these "defense mechanisms" kick in quite often from both computer scientists and economists towards those trying to cross into each other's fields.

Wednesday, October 28, 2009

FOCS Last Day

Last (but surely not least) day at FOCS 09!

Jan Vondrak talked about some nice results that unify previous query complexity approaches to proving lower bounds for submodular function maximization. For a general class of functions he proves that good approximations require an exponential number of queries to the value oracle. This matches known inapproximability results, and also implies new bounds for two particular problems.

Another interesting work was presented by Heiko Röglin, who proved bounds on the size of Pareto-optimal sets in the context of multiobjective optimization. In the general case it is almost always possible to find problem instances with exponentially large Pareto sets, which renders any approach based on analyzing Pareto sets infeasible. However, in many real-world applications these sets turn out to be small. Röglin and Shang-Hua Teng give a possible theoretical explanation for this, in the framework of smoothed analysis, proving that the expected number of Pareto-optimal solutions is polynomially bounded.

To conclude the 50th FOCS, Ryan Williams gave a much-appreciated talk on new approaches to Boolean matrix multiplication. The aim of the paper was to come up with a more efficient combinatorial algorithm for Boolean matrix multiplication. Ryan and Nikhil Bansal succeed in this direction by connecting Boolean matrix multiplication to the triangle removal lemma. Their approach gives the first improvement on general upper bounds for this problem in 40 years!
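For context, the baseline that any combinatorial algorithm is measured against is the straightforward cubic-time Boolean product; a minimal sketch (the function name is mine, not from the paper):

```python
def bool_mat_mult(A, B):
    """Naive O(n^3) combinatorial Boolean product:
    C[i][j] = OR over k of (A[i][k] AND B[k][j])."""
    n = len(A)
    return [[any(A[i][k] and B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]
```

Any improvement has to beat this triple loop by more than constant factors, which is exactly what makes the Bansal-Williams result notable.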

Tuesday, October 27, 2009

FOCS Day 2

After an early morning injection of much needed caffeine, Team Northwestern was ready for the next round of talks.  Today's program consisted of talks more related to our interests than the previous days.

The morning featured two papers on the computational complexity of equilibria. In the first, Xi Chen, Decheng Dai, Ye Du and Shang-Hua Teng considered Arrow-Debreu markets in which agents' utilities are additively separable and piecewise-linear concave. In this context, via an interesting reduction from 2-player games, they showed that computing an equilibrium is PPAD-complete, confirming that even additively separable concave utilities are not easy to deal with. In another talk, Laura J. Poplawski et al. introduced a simple PPAD-complete game, claiming it is a natural candidate for PPAD-completeness reductions. To support their claim they show how it naturally reduces to known (and also to many unknown) PPAD-complete problems.

Also in the morning session an interesting paper by Andrea Montanari and Amin Saberi talked about coordination games and their relation to spread of news, technologies and other social phenomena on networks. They were able to show how convergence times directly depend on certain graph properties.

A bit later in the morning, the winning student papers were presented.  Alexander Sherstov showed his work on strong improvements of the bounds for the degree of intersecting halfspaces.  Jonah Sherman was next, presenting improvements on sparsest cut algorithms.

Bringing the day to a close, session 8B covered three of our favorite topics: social network formation, revenue maximization, and mechanism design. What more could we ask for in one session?

Monday, October 26, 2009

FOCS Day 1

Michele and Michael continue their guest posts from Atlanta.

The first day of FOCS is officially over. 8 sessions, 26 talks and 270 proceedings pages have already gone by.

From following this blog we have learned many useful things. One is not to rank talks; another is not to limit posts to dry descriptions of the presentations we attended.

So you're stuck with our (admittedly not so interesting) impressions and thoughts. The quality and level of the work presented are amazing, and often soar above our heads. However, it has been a very pleasant surprise that the talks are often so good and entertaining that they become manageable even when the topics are not in our background. So only positive impressions came from Day 1.

As a colorful aside, we will mention another conference that is taking place in the next room, aimed at soon-to-be-extremely-wealthy people. The mixing of the two communities during coffee breaks is often entertaining and amusing, and we have to say we spotted a prominent computer scientist carrying around the "proceedings" of this other meeting... Is this some kind of sign? And if so, how should we interpret it?

Sunday, October 25, 2009

FOCS Day 0

Alas, I'm not in Atlanta but in New York City for another meeting. Something different this time--FOCS through the eyes of two Northwestern students, Michele Budinich and Michael Lucas, both attending their first major theory conference.


The organization on day 0 was fantastic. Fox had even reserved a parking lot for us!!

Wait a second.

Are we sure this is where we're supposed to be?

Luckily enough we met more conscious computer scientists who pointed us in the right direction: the FOCS conference. What a novice mistake.

We finally made it to Georgia Tech's LeCraw Auditorium for Day 0 of the FOCS conference. Also called Theory Day, it was a dual celebration marking the 50th anniversary of FOCS and the 20th anniversary of Georgia Tech's ACO program.

Theory Day started off in the most appropriate way with a talk by Richard Karp who shared his favorite algorithms, pointing out the many different qualities that can make an algorithm "great."

After that came two slightly more technical talks, by Mihalis Yannakakis and Noga Alon, on the computational complexity of equilibria and on path disjointness in graphs.

As a very fitting conclusion to a 50th anniversary, the last talk gave some fascinating views and possible directions for computer science: Manuel Blum talked about getting to grips with consciousness, with some new neuroscience models that could shed some light (in a very literal sense) on how computers could eventually prove their own theorems.

Day 0 ended with the first official FOCS 2009 event: a reception, where much food and camaraderie could be found. Afterwards, the attendees of FOCS '09 returned to their hotel rooms to dream of algorithms and optimization.

Friday, October 23, 2009

Notes on Dagstuhl

A few notes on Dagstuhl, which Lance and I were at last week.
  1. I could tell you about the talks, but the website does a better job: here
  2. I value going to talks more than Lance does, which is odd since he probably gets more out of them. Nevertheless, they serve two functions:
    1. Lets me know of new results or even areas I should look into (e.g., Chattopadhyay's talk on Linear Systems over Composite Moduli was news to me and is stuff I want to learn about).
    2. Makes me feel guilty for not having already looked into an area (e.g., Anna Gal's talk on Locally Decodable Codes reminded me that I really should learn or relearn some of that stuff and update my PIR website, and Swastik's talk on parity quantifiers and graphs reminds me that I should read a book I've had on my shelf for about 10 years: The Strange Logic of Random Graphs by Joel Spencer).
  3. I thought that I was the only person going to both Dagstuhl and Ramsey Theory in Logic and ... so that I could give the same talk in both places. But Eli Ben-Sasson will be at both. However, he missed my talk at Dagstuhl; he'll see a better version in Italy since I have made some corrections.
  4. At Dagstuhl there is a book in which people handwrite their abstracts. In this digital, hardwired age do we need this? In case a nuclear bomb goes off and all information stored electronically is erased, we can rebuild civilization based on those handwritten notebooks. The first thing people will want to know after they crawl out of their bomb shelters will be: is there an oracle separating BQP from PH? And Scott Aaronson's handwritten abstract will tell them. (I hope his handwriting is legible.)
  5. Peter Bro Miltersen gave an excellent talk. His open problems were very odd. He did NOT say I want to solve problem such-and-such. He instead said I want to find a barrier result to show that problem such-and-such is hard. It seems to me that one should really try hard to solve a problem before proving it's hard. Then again, by trying to prove it's hard he may solve it. I hope that if he solves it he won't be disappointed. Maybe his paper will be titled A proof that showing that problem such-and-such is hard is itself hard.
  6. One odd benefit of being at Dagstuhl: I spend some time in the library reading articles that I really could have read at home, but somehow don't have time to. Here is one result I read about that I will blog about later: Let d be a distance. If you have n points in a circle of diameter 1, how many pairs of points are you guaranteed to have at distance ≤ d from each other?
  7. Next week I'll be in Italy and not posting, so you get Lance Fortnow all five days! Let's hope they are not all on P vs NP.
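Bill's distance question in item 6 is easy to experiment with by brute force; a minimal sketch (the function name is mine):

```python
from itertools import combinations
from math import hypot

def close_pairs(points, d):
    # Count pairs of points at distance <= d, checking all n*(n-1)/2 pairs.
    return sum(1 for (x1, y1), (x2, y2) in combinations(points, 2)
               if hypot(x1 - x2, y1 - y2) <= d)
```

The interesting part, of course, is the guaranteed count over worst-case placements of the points inside the circle; code like this only helps explore candidate configurations.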

Thursday, October 22, 2009

Need a Roommate for a conference?

(Guest Post by Dave Doty pointing to a blog by ***SORELLE*** which points to a roommate-finding service for conferences.)

Sorelle has announced a new roommate-finding forum for conferences: http://conferences.proboards.com. I think this forum is a great way to help graduate students and other cash-strapped conference attendees save money, and I encourage future conference organizers to link to it.

Wednesday, October 21, 2009

The Humbling Power of P v NP

[Note from Bill: He now has an on-line list of papers that still need reviewers]

Noam Nisan, back in grad school in the 80's, made the following suggestion:
Every theorist should try to prove both P = NP and P ≠ NP early in their career to understand why the problem is so difficult to solve.
So go ahead and find a polynomial-time algorithm to 3-color graphs. Show that clique can't be solved by small circuits. Not because you will succeed but because you will fail. No matter what teeth you sink into P vs NP, the problem will bite back. Think you've solved it? Even better. Not because you did but because when you truly understand why your proof has failed you will have achieved enlightenment.
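To feel the difficulty concretely, here is the obvious brute-force search for a 3-coloring; a minimal sketch (names are mine), which takes time exponential in the number of vertices, and nobody knows how to do fundamentally better in general:

```python
from itertools import product

def three_colorable(n, edges):
    # Try all 3^n assignments of colors {0,1,2} to vertices 0..n-1.
    # Exponential time: this exhaustive search is exactly what P vs NP
    # asks whether we can beat.
    for coloring in product(range(3), repeat=n):
        if all(coloring[u] != coloring[v] for u, v in edges):
            return True
    return False
```

A triangle is 3-colorable; a complete graph on four vertices is not. Finding a polynomial-time replacement for that `for` loop would prove P = NP.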

You will understand that computation is a nasty beast. Very hard to tame, to see what it can't do. Nearly as hard to control, to use it to solve those hard search problems.

Watch Tim Gowers's multiple personalities try to tackle even simpler lower bounds. The P v NP monster humbles the best of us. And that's what makes the P v NP problem exciting and us just human.

Tuesday, October 20, 2009

An unintentional Sociology of Blogs experiment

Yesterday I posted a list of books that I want reviews of as SIGACT NEWS Book Review Column Editor. This resulted in an unintentional study of Sociology and Blogs which would make an awful paper but a reasonable posting. Some comments below, plus response to comments, and some pointers.
  1. So far 12 books have been claimed. More than 12 people made requests but some of the books were already claimed. Of the 12 only 1 has reviewed for me before. The typical column generates 3 or 4 requests, and even those are usually from people who have reviewed before. Why is the blog so much more effective than my column for finding people to write book reviews for me?
  2. The most popular book by far was not Knuth Vol 4 Fascicle 0, nor any of the other algorithms book, or crypto books (usually a popular topic) but was instead the book Concentration of Measure for the Analysis of Randomized Algorithms. Why that one? The catchy title? The hotness of the topic? Perhaps several of the other books on my list people already have from the publishers as they are potential textbooks, but this one is not in that category.
  3. One of the commenters wanted to know if you need to be an expert to review a book. NO. But you have to be interested, have the background, and actually read it.
  4. One of the commenters wanted to know what level the books were at. Alas. In the far future there may be a way to, given a book title, type it into what might be some kind of Engine of Search and find out more about it. Next time I do this I will supply more information about the book, unless by some miracle some sort of Search Technology has evolved to make such unneeded.
  5. Below I have the revised list with the books that are already claimed removed.
  6. Deadline for reviews is Jan 14, 2010, though the intention is to finish them before your next semester starts. If you need more time then tell me. The LaTeX template is here. Plaintext is also fine if the review does not have too much math in it.
Books on Algorithms and Data Structures
  1. Algorithmic Adventures: From Knowledge to Magic by Juraj Hromkovic.
  2. Algorithms and Data Structures: The Basic Toolbox by Mehlhorn and Sanders.
  3. The Algorithms Design Manual by Skiena.
  4. Combinatorial Geometry and its Algorithmic Applications: The Alcala Lectures by Pach and Sharir.
  5. Algorithms for Statistical Signal Processing by Proakis, Rader, Ling, Nikias, Moonen, Proudler.
  6. Nonlinear Integer Programming by Li and Sun.
  7. Binary Quadratic Forms: An Algorithmic Approach by Buchmann and Vollmer.
  8. Parallel Algorithms by Casanova, Legrand, and Robert.
  9. Mathematics for the Analysis of Algorithms by Greene and Knuth.
  10. Concentration of Measure for the Analysis of Randomized Algorithms by Dubhashi and Panconesi.
  11. Vehicular Networks: From Theory to Practice Edited by Olariu and Weigle.
Books on Cryptography
  1. Introduction to Modern Cryptography by Katz and Lindell.
  2. Concurrent Zero-Knowledge by Alon Rosen.
  3. Elliptic Curves: Number Theory and Cryptography by Washington.
  4. Secure Key Establishment by Choo.
  5. Algebraic Cryptanalysis by Bard
  6. A Course in Number Theory and Cryptography by Koblitz.
  7. Cryptanalytic Attacks on RSA by Yan.
Books on Coding Theory
  1. Algebraic Function Fields and Codes by Stichtenoth.
  2. Applied Algebra: Codes, Ciphers, and Discrete Algorithms by Hardy, Richman, and Walker.
Books on Theory of Computation
  1. The Calculus of Computation: Decision Procedures with Applications to Verification by Bradley and Manna.
  2. Models of Computation: An introduction to Computability Theory by Fernandez.
Combinatorics
  1. Applied Combinatorics by Roberts and Tesman.
  2. A Course in Enumeration by Aigner.
  3. Chromatic Graph Theory by Chartrand and Zhang.
  4. Design Theory by Lindner and Rodger.
  5. Combinatorial Methods with computer applications by Gross
  6. A combinatorial approach to matrix theory and its application by Brualdi and Cvetkovic.
Misc Books
  1. Quantum Computer Science: An Introduction by Mermin.
  2. Complex Social Networks by Vega-Redondo
  3. Branching Programs and Binary Decision Diagrams by Wegener.
  4. When Least is Best: How Mathematicians Discovered many clever ways to make things as small (or as large) as possible by Nahin.
  5. Stories about Maxima and Minima by Tikhomirov.
  6. Decision and Elections: Explaining the Unexpected by Saari.
  7. Creative Mathematics by Wall
  8. Is Mathematics Inevitable? A Miscellany Edited by Underwood Dudley.
  9. Comprehensive Mathematics for Computer Scientists 1: Sets and numbers, graphs and algebra, logic and machines, linear geometry by Mazzola, Milmeister, and Weissmann.
  10. Difference Equations: From Rabbits to Chaos by Cull, Flahive, and Robson.
  11. A Concise introduction to Data Compression by Salomon.
  12. Practical Text Mining with Perl by Roger Bilisoly.

Monday, October 19, 2009

List of books I want reviewed

I have been the SIGACT NEWS Book Review Column Editor for a while now. Every issue I include a list of books that I WANT reviewed. This works pretty well, but I recently thought: if only I had a way to let LOTS of people know the list of books I want reviewed. That might work better. Until I do, here is the current list of books I want reviewed. If you want to review one of them, email me by Thursday, Oct 22 (I will be sending in the column, including a hopefully shortened version of this list, on Oct 23). Your email should include your postal address so I'll know where to send the book. If you are in America I will postal-mail you the book; if you are in a different country I will try to get the publisher to send you the book (this works most of the time). For ADVICE on reviewing books see here.
Books on Algorithms and Data Structures
  1. The Art of Computer Programming, Vol. 4, Fascicle 0: An Introduction to Combinatorial Algorithms and Boolean Functions by Donald Knuth.
  2. Algorithmic Adventures: From Knowledge to Magic by Juraj Hromkovic.
  3. Matching Theory by Lovasz and Plummer.
  4. Algorithms and Data Structures: The Basic Toolbox by Mehlhorn and Sanders.
  5. The Algorithms Design Manual by Skiena.
  6. Algorithms on Strings by Crochemore, Hancart, and Lecroq.
  7. Combinatorial Geometry and its Algorithmic Applications: The Alcala Lectures by Pach and Sharir.
  8. Algorithms for Statistical Signal Processing by Proakis, Rader, Ling, Nikias, Moonen, Proudler.
  9. Nonlinear Integer Programming by Li and Sun.
  10. Binary Quadratic Forms: An Algorithmic Approach by Buchmann and Vollmer.
  11. Time Dependent Scheduling by Gawiejnowicz.
  12. The Burrows-Wheeler Transform: Data Compression, Suffix Arrays, and Pattern Matching by Adjeroh, Bell, Mukherjee.
  13. Parallel Algorithms by Casanova, Legrand, and Robert.
  14. Mathematics for the Analysis of Algorithms by Greene and Knuth.
  15. Concentration of Measure for the Analysis of Randomized Algorithms by Dubhashi and Panconesi.
  16. Handbook of Large Scale Random Networks Edited by Bollobas, Kozma, and Miklos.
  17. Vehicular Networks: From Theory to Practice Edited by Olariu and Weigle.
Books on Cryptography
  1. Introduction to Modern Cryptography by Katz and Lindell.
  2. Concurrent Zero-Knowledge by Alon Rosen.
  3. Introduction to cryptography: Principles and Applications by Delfs and Knebl.
  4. Elliptic Curves: Number Theory and Cryptography by Washington.
  5. Secure Key Establishment by Choo.
  6. Algebraic Cryptanalysis by Bard.
  7. An Introduction to Mathematical Cryptography by Hoffstein, Pipher, Silverman.
  8. A Course in Number Theory and Cryptography by Koblitz.
  9. Cryptanalytic Attacks on RSA by Yan.
Books on Coding Theory
  1. Algebraic Function Fields and Codes by Stichtenoth.
  2. Coding for Data and Computer Communications by David Salomon.
  3. Applied Algebra: Codes, Ciphers, and Discrete Algorithms by Hardy, Richman, and Walker.
Books on Theory of Computation
  1. The Calculus of Computation: Decision Procedures with Applications to Verification by Bradley and Manna.
  2. Computability of the Julia Sets by Braverman and Yampolsky.
  3. Models of Computation: An introduction to Computability Theory by Fernandez.
Combinatorics
  1. Applied Combinatorics by Roberts and Tesman.
  2. Combinatorics the Rota Way by Kung, Rota, and Yan.
  3. A Course in Enumeration by Aigner.
  4. Chromatic Graph Theory by Chartrand and Zhang.
  5. Design Theory by Lindner and Rodger.
  6. Combinatorial Methods with computer applications by Gross
  7. A combinatorial approach to matrix theory and its application by Brualdi and Cvetkovic.
Misc Books
  1. Quantum Computer Science: An Introduction by Mermin.
  2. Complex Social Networks by Vega-Redondo
  3. Branching Programs and Binary Decision Diagrams by Wegener.
  4. When Least is Best: How Mathematicians Discovered many clever ways to make things as small (or as large) as possible by Nahin.
  5. Stories about Maxima and Minima by Tikhomirov.
  6. Decision and Elections: Explaining the Unexpected by Saari.
  7. Creative Mathematics by Wall
  8. Is Mathematics Inevitable? A Miscellany Edited by Underwood Dudley.
  9. Comprehensive Mathematics for Computer Scientists 1: Sets and numbers, graphs and algebra, logic and machines, linear geometry by Mazzola, Milmeister, and Weissmann.
  10. Difference Equations: From Rabbits to Chaos by Cull, Flahive, and Robson.
  11. Mathematical Tools for Data Mining by Simovici and Djeraba.
  12. A Concise introduction to Data Compression by Salomon.
  13. Practical Text Mining with Perl by Roger Bilisoly.

Friday, October 16, 2009

Typecasting Again

Lance: Welcome to our second typecast on the last full day of Dagstuhl. I may not see Bill for a while so we'd thought we'd get in one more chat before we go our separate ways.

Bill: Let's wave to the audience. wave

Lance: So Bill have you been enjoying the workshop?

Bill: I enjoyed the hike while you stayed here working on a BOGUS theorem.

Lance: I never do the hike. I enjoy the quietness of a nearly-empty Dagstuhl. Saw the castle, did some shopping.

Bill: We're miles from civilization, or kilometers I suppose in Germany.

Lance: A short drive to beautiful downtown Wadern.

Bill: I thought Wadern was too small to have a downtown.

Lance: Well, it's no Chicago, but it does have a central shopping area. I got some cool German pencils for the kids and chocolate for the rest.

Bill: Are you enjoying the talks you don't sleep through?

Lance: Don't talk so loud, you are waking me up. Yes the talks have (mostly) been quite good. And luckily not too many of them.

Bill: If I can understand the problem being worked on then I'm happy because I can always claim I read the paper later.

Lance: Then why do you stay awake for the last 30 minutes of each talk? 

Bill: There is photographic evidence that I don't. By the way, do you realize we invented a new word yesterday: "TYPECAST"?

Lance: I hate to break it to you but the word is already in the dictionary, first meaning "to cast (type)" which is exactly what we are doing.

Bill: Has the word typecast been used in this way as a contrast to podcast and vidcast?

Lance: Next topic. Let's leave the new words to the Optimizer.

Bill: Do you mean Scott Aaronson's new word "fourellated"?

Lance: Let's not forget his "algebrization". By the way people in this room are now arguing over the spelling of "fourellate". Let's move on please.

Bill: Let's get back to Dagstuhl. Name three talks that you like and one thing that you learned.

Lance: And have someone be angry I left them out? I learned not to rank talks.

Bill: I have no such fears, everyone is already angry at me :-) I liked the Swastik Kopparty talk on 0-1 laws for Random Graphs with First-order Plus Parity, Anna Gal's talk on lower bounds on 3-query linear locally-decodable codes, and Arkadev Chattopadhyay's Linear Systems over Composite Moduli. These talks all had real results, unlike yours.

Lance: Since when does it make sense to judge a paper by its theorems?

Bill: Since the DAWN OF TIME. Also I learned that I can beat Razborov in ping pong, at least with John Rogers's help.

Lance: Let's wrap it up. You know we could typecast again using IM or Google Wave. 

Bill: Once I join the modern age.

Lance: So until next June at Complexity, remember, in a complex world best to keep it simple.

Thursday, October 15, 2009

Thanks for the Fuzzy Memories

In the 1990's Manindra Agrawal and V. Arvind published a paper claiming that if SAT is reducible to a (non-uniform) weighted threshold function then P = NP. Their proof relied on the following splitting lemma:

For all n and q_{ij} ∈ {0,1}^n, 1 ≤ i ≤ j ≤ n+2, there do not exist d_k ∈ {0,1}^n, 1 ≤ k ≤ n+2, such that for all i, j, and k: d_k·q_{ij} > 0 if k = i or k = j, and d_k·q_{ij} < 0 otherwise.
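The splitting condition is concrete enough to check mechanically for candidate vectors, which is essentially what any brute-force search for counterexamples (like Vereshchagin's, mentioned below) must do. A minimal sketch; the function name is mine, and for illustration I let the d_k be arbitrary integer vectors so that negative dot products are possible:

```python
def is_splitting(qs, ds):
    """Check the splitting condition: d_k . q_ij > 0 when k = i or k = j,
    and d_k . q_ij < 0 otherwise, for every pair (i, j) and every index k.
    qs maps pairs (i, j) to vectors; ds maps indices k to vectors."""
    for (i, j), q in qs.items():
        for k, d in ds.items():
            p = sum(a * b for a, b in zip(d, q))  # dot product
            ok = p > 0 if k in (i, j) else p < 0
            if not ok:
                return False
    return True
```

The lemma asserts that no such family of d_k exists for the given q_ij.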

Manindra Agrawal and Thomas Thierauf used the splitting lemma to give a polynomial-time algorithm for factoring. They got quite excited and couldn't find a bug in their proof. Apparently I suggested to Thomas at the time that he patent the algorithm before they published. But later Klaus Wagner showed that the splitting lemma actually implied P = NP; Agrawal went back and found that the proof of the splitting lemma wasn't correct, and Jochen Messner found a counterexample.

Manindra wrote a retraction for Theoretical Computer Science but for some reason it wasn't published.

I've seen this before. If a technical lemma doesn't imply anything particularly surprising, it might not get checked very carefully. But only later when researchers start using it to get amazing results do people go back and realize there was a problem with the lemma after all.

Flash forward more than a decade to this week at Dagstuhl. Harry Buhrman, John Hitchcock and Harry's student Bruno Loff came to Dagstuhl all excited. They proved that if SAT is disjunctively-tt reducible to a sparse set then SAT is reducible to a weighted threshold function, which by Agrawal-Arvind implies P = NP, answering a long-standing open question. We tried to understand the Agrawal-Arvind paper, and Thomas Thierauf "vaguely remembered" there might have been a bug in the splitting lemma.

Arvind, who is here at Dagstuhl, didn't remember the paper well. Manindra, one of the organizers of this week's workshop, is not here at all but had to attend an IIT event in, of all places, Chicago. Eventually Harry called Manindra, who acknowledged the bug.

Once the bug was revealed, Jacobo Torán, Messner's advisor, told me the whole sordid story above, and Nikolai Vereshchagin showed the splitting lemma is true for n ≤ 4 and found counterexamples for n ≥ 5. After I talked about the conjecture at the open problem session last night, Amir Shpilka showed me at breakfast an easy proof that the splitting lemma fails for n = 5 even if you allow an infinite number of points.

Don't worry, Manindra's proof with Kayal and Saxena that Primes are in P is still safe.

Wednesday, October 14, 2009

Bill and Lance's Excellent German Adventure

Lance: Because we don't have a good microphone, Bill and I decided to have our first typecast. Hi Bill.

Bill: Hi Lance. We'll be typecast SUPERSTARS.

L: What's today's topic Bill?

B: DAGSTUHL!!! First question of the day: they seat us at lunch and dinner randomly, but I've had lunch and dinner with Valentine Kabanets four times. What's with that?

L: No need to yell Bill, it's a typecast. Isn't Valentine an expert in pseudorandomness? You should ask him.

B: If I have one more meal with him, I will.

L: So what is your favorite talk so far?

B: Mine of course. But after mine, yours of course.

L: I agree. I like mine the best too :)

B: You seem to have resolved the prisoner's dilemma dilemma. For that you deserve the Nobel prize in Economics. It won't be the only premature Nobel prize given this year.

L:  Too late. Actually you already know my favorite talks from yesterday.

B: Wow, we can do links in a typecast, but not in a vidcast. This is AWESOME!!

L: Quiet down Bill, there are people on the other side of this table trying to prove theorems, not wasting time typing to each other. But actually you can put links in YouTube, if only I knew how. Or so I believe.

B: A real question now. We saw a talk, a really good talk, that proved planar graph isomorphism is in log space. It was a great talk because it gave an outline of where he (Fabian Wagner) was going, so even when I got lost later I got something out of it. Why is it a complexity talk (CCC '09) and not a SODA talk?

L: Hard to say but it felt like a complexity talk. He certainly had complexity co-authors like Thomas Thierauf.

B: Co-authors do not a field make.

L: A field is defined by its people.

B: If you, Lance Fortnow, start doing algorithmic game theory, would that start becoming part of complexity?

Rahul Santhanam: Yes, because the premise won't happen.

Chris Umans: What are you doing?

B: It's a typecast, like a vidcast only we type.

C: Really, are you serious? 

B: If you walk in you are fair game. 

C: How can this be a typecast if you can edit it?

L: Like we don't edit the video? We got a bit off track. What I do is complexity because I write the blog.

B: By that theory Ramsey theory is now complexity because I write the blog too.

L: You don't count.

B: I count MOD P!

L: So Vinod gave a neat talk yesterday showing the equivalence between Kolmogorov extractors and randomness extractors which I conjectured here first.

Meena: So why wasn't my talk Monday mentioned on the blog?

L: Time constraints.

R: No, space constraints.

B: Let's wrap up soon. So are you enjoying Dagstuhl?

L: Any time I get to spend with you and Rahul is a pleasure.

B: Same here.

L: But you get to be with yourself all the time.

B: And don't think I don't appreciate it! 

L: And so until next time, when in a complex world best to keep it simple.

Tuesday, October 13, 2009

Dagstuhl Day One

The talks Monday were after my own heart. The day started with relativization: The always entertaining Scott Aaronson talked on his approach to get an oracle where BQP is not in the polynomial-time hierarchy. He gets some partial progress and a new conjectured language for separation, something he called Fourier checking.

Valentine Kabanets followed up, talking about his work with Impagliazzo and Kolokolova giving a new axiomatic characterization of relativization. In my (seemingly minority) opinion, the number of tools we have that don't relativize is pretty slim, and traditional oracle models tell us plenty about what we can't do.

Fabian Wagner described his log-space algorithm for checking whether two planar graphs are isomorphic. Rahul Santhanam talked about our work (with Harry Buhrman) on the lower bounds we proved at last year's Dagstuhl. I talked about applications of computational complexity to economics and why everyone should walk from the Complexity Conference across the street to Electronic Commerce next summer and see for themselves.

But the best part of Dagstuhl is the open time, when we meet and socialize over coffee, or wine and beer, discussing problems, struggling with old questions and trying to prove new theorems. That's the magic of Dagstuhl.

Monday, October 12, 2009

The Molly Solution

This week Bill and I are both at the Dagstuhl Workshop on Algebraic Methods in Computational Complexity. I'll try to cover some of the talks and discussions on the blog and on Twitter.

I called home last night and had the following conversation with my 11-year-old daughter.

Molly: I was thinking about this P/NP problem. What do P and N stand for?

Lance: You probably won't understand, but P stands for "Polynomial Time" and N stands for "Nondeterministic."

Molly: Well, if P and N were random variables then P=NP if N equals one.

An old joke, but my daughter had figured it out all on her own. So I answered her with the only response I could give.

Lance: Or if P equals zero.

Friday, October 09, 2009

Complexity Vidcast 2

Bill and I battle it out over what should be taught in a complexity course in our new vidcast.

Thursday, October 08, 2009

Publicity for P versus NP

On the New York Times website, John Markoff writes an article Prizes Aside, the P-NP Puzzler Has Consequences motivated by my CACM article on The Status of the P Versus NP Problem. Not a bad job describing the basic problem, at least for the layman.
The issues center around a group of classic computer science problems, like the traveling salesman problem — how to figure out the most efficient route through a number of cities given certain constraints. Factoring a large number is also a good example of what is referred to as an NP problem. At present, there is no understood method for doing it in polynomial time, but given an answer it is easy to check to see if it is correct…Checking to see if any particular solution is correct is simple, but finding the best solution in the case of very large problems is infinitely difficult.
Good to see some press for this great problem and theoretical computer science in general. And getting a mention in the Times impresses my mom more than any JACM article ever could.

One of the sentences is tricky to parse.
An esoteric branch of mathematics known as algebraic geometry may offer an avenue to prove or disprove the problem, which was first outlined by Stephen A. Cook, a University of Toronto computer scientist, in 1971.
As you readers should know, while Cook (and independently Leonid Levin in Russia) first outlined the P vs. NP problem, the much more recent algebraic geometry approach is primarily due to my former U. Chicago colleague Ketan Mulmuley.

Wednesday, October 07, 2009

The last Universal Mathematician was ???

The book Prime Obsession has the following on page 159:
You will see it written that Hadamard was the last of the universal mathematicians--- the last, that is, to encompass the whole of the subject, before it became so large that this was impossible. However, you will also see this said of Hilbert, Poincare, Klein, and perhaps of one or two other mathematicians of the period. I don't know to whom the title most properly belongs, though I suspect the answer is actually Gauss.
  1. I have never heard Hadamard or Klein mentioned as such, though this does not mean it has not been said.
  2. I would add Kolmogorov to the list of candidates.
  3. My wife would add Aristotle and Archimedes to the list.
  4. Gauss seems like a good choice to me.
  5. Do any current mathematicians qualify? One stumbling block--- I do not know of anyone lately who has made serious contributions to logic and some non-logic branch of mathematics.
  6. It is, of course, an ill defined question. Is there a way to better define it and answer it?

Tuesday, October 06, 2009

You are cordially invited... Why?

When you get an invitation via email, should you accept? How well targeted is it? Here are some real examples that I got.

EXAMPLE ONE: I would like to invite you to participate at the workshop Ramsey Theory in Logic, Combinatorics and Complexity which will take place at the Bertinoro International Center for Informatics click here Italy, October 25-30, 2009. The organizers: Pavel Pudlak, Lorenzo Carlucci, Nicola Galesi and Andreas Weiermann. Note that this is well targeted (I do Ramsey Theory AND I do Logic!) and the organizers are people I've heard of (positively!). I have the time to go. This is a clear ACCEPT. (Sorry Lance and Vijay- I'll be missing FOCS.)

EXAMPLE TWO:
It is my pleasure to invite you to publish at the Journal of Emerging Technologies in Web Intelligence (JETWI) (ISSN 1798-0461 ) published by the Academy Publisher in Finland. JETWI is a peer reviewed and indexed international journal and is teamwork of the distinguished editorial board along with the Academy Publisher to respond to the emerging research needs in the evolving area of Web Intelligence and related Technologies. The aim, scope and the targets of this journal are listed at the journal Web Site: click here
This is badly targeted. I am not in the area. So why did I get this email? There is no harm to the sender in sending it to A LOT of people since it doesn't cost anything. I wish that, as a courtesy, they would do better targeting. For me a clear DECLINE. Should I email them and tell them to take me off of their list? I doubt it would work and it might even be counterproductive. I hope they read this blog entry and take me off of the list. (I also hope to show P\ne NP.)

EXAMPLE THREE:
There is a book that won't be written for many years, yet its here today for you to review. Courtesy of Jahred Kammen, comes a short but salient retrospective on the life and impact of Ted Kennedy. The complete work is Available at _Freedompressintl@aol.com_ (mailto:Freedompressintl@aol.com). J. D. Russell, President Freedom Press International
Why did this get sent to me? I get many such emails since I run a book review column for SIGACT NEWS and book publishers are not that good at targeting. While I am tempted to get free books that I have no intention of reviewing, I refrain. I wish they did better targeting, but I do get to hear about books that are out there that I might not have heard about otherwise.

EXAMPLE FOUR: (I leave it for you to judge this one)

BECAUSE YOU DESERVE IT! Is your lack of a degree holding you back from career advancement? Are you having difficulty finding employment in your field of interest because you don't have the paper to back it up, even though you are qualified? If you are looking for a fast and effective solution, we can help! Call us right now for your customized diploma: Inside U.SA.: 1-718-989-5746 Outside U.S.A.: +1-718-989-5746. Just leave your NAME & TEL. PHONE # (with country-code) on the voicemail and one of our staff members will get back to you promptly!

Monday, October 05, 2009

Two Recent Complexity Books omit Mahaney's theorem- oversight or wisdom?

In a prior post (a while back) I pondered whether Mahaney's theorem (SAT \le_m S, S sparse, implies P=NP) should be taught in a basic grad course in complexity. I thought YES.
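For readers who have not seen it, here is the statement written out with the sparseness condition made explicit (my paraphrase of the standard formulation, not a quote from either book):

```latex
% A set S is sparse if there is a polynomial p such that
% |S \cap \Sigma^n| \le p(n) for all n.
\begin{theorem}[Mahaney, 1982]
If $\mathrm{SAT} \le_m^p S$ for some sparse set $S$,
then $\mathrm{P} = \mathrm{NP}$.
\end{theorem}
```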

As a sequel to that I note the following: the theorem is in NEITHER Goldreich's Complexity book NOR Arora-Barak's Complexity book. One of the following is true:
  1. Gasarch is right and Goldreich and Arora-Barak are wrong. It should have been in their books.
  2. Gasarch is wrong and Goldreich and Arora-Barak are right. It's okay that it's not in their books.
  3. It is in the book but Gasarch couldn't find it.
Clearly option (2) is correct. These people have thought long and hard about what should be in such a course and are broader and deeper in the field than I am. The question arises: why did I think this was important anyway? I could argue (as some of the commenters did) that the Berman-Hartmanis conjecture is important (all NPC sets are really THE SAME SET!) but the real reason is that I was impressed with the theorem in my early years and it is hard to shake that off.

When I taught Graduate Algorithms (I really did!) I taught Yao's MST algorithm that ran in time O(|E|log log |V|). The students asked me WHY this was important (there were better algorithms available). Then I realized that I taught it only because Samir Khuller taught it when he taught the class, so I asked him why it was important. He said it very much impressed him when he was a graduate student.

The contents of the Goldreich book and Arora-Barak book are influenced by a deep understanding of the field, and not by things that impressed them in their youth.

Is Mahaney's theorem important? The real question might be: compared to what? As the number of things that you absolutely must have in a course grows, at some point something has to be tossed out.

Friday, October 02, 2009

On Being Narrow

In a tweet a few days ago, David Bacon
wonders why everyone always assumes all he can do is quantum computing? Oh, because that's all he's done. Time to do something new?
What David (aka the Quantum Pontiff) says can apply to any discipline. Are you a narrow researcher? Take my simple one question test.
Do all your co-authors know each other?
Being narrow has some advantages but mostly disadvantages.
  • You can really know an area, particularly in a field like quantum computing which has a steep learning curve before you can become and remain an active researcher.
  • You are part of a very tight community. These people know you well. You look forward to seeing them at workshops and conferences. They can write you wonderful recommendation letters (and vice versa).
  • On the flip side, you don't know many outside your research area and they don't know you. 
  • Narrow researchers often lose themselves in the minutiae of the field, looking at questions whose importance seems obvious to those inside the field but is impossible to explain to outsiders.
  • You tend to go to smaller, focused workshops and conferences because the broader meetings have little that interests you. This further isolates you from the larger community.
  • As David laments, others view you as focused and unable to solve problems outside your field.
  • As your field loses importance, so do you.
  • There are fewer grant programs you can apply to, and you have co-authored with most of the people who could properly evaluate your research, making them ineligible to review your proposals.
It's fine for a graduate student to remain focused on their dissertation research. But even then it's best to take advantage of your university's offerings and take courses in a broad range of topics. As a post-doc or young assistant professor, you should talk to other professors and ask them about their problems. You'd be surprised how much the tools, models and techniques of one field can apply in another. But be careful: others get upset if you try to impose your field's beliefs on their fields.

Many successful senior researchers completely change fields a few times in their careers, taking a year or more off, maybe during a sabbatical, to learn the background and important problems of another area and then start tackling those questions.

Thursday, October 01, 2009

The Journal Manifesto 2.0

Reminder: FOCS early registration deadline today. Go here


Below is a revised version of the Journal Manifesto. I restate the key sentence from my last post and then redo the manifesto in light of the (very enlightening!) comments that my last post got.
However, I have a Manifesto: a list of actions that we as individuals can do to make the academic world a better place.
Keep in mind: I am NOT talking to the NSF or to journal publishers or to conference organizers. I am NOT going to say what any of these people should do. I am talking to US, the authors of papers. If WE all follow this manifesto then the problems of high-priced journals and limited access may partially go away on their own. To be briefer: to the extent that WE are the problem, WE can be the solution.

REQUEST: Leave as comments a simple YES if you agree with the manifesto and a NO and a reason if you do not. I may set this up as a formal online thing that people can sign off on if there is enough interest.

Preamble: We academics publish papers so they can be read. We want anyone in the world to be able to read our papers. We do not expect any money in exchange. Here is what we can all do to facilitate this, not just for our papers but for everyone's papers.
  1. Whenever you publish a paper in a conference or journal, post it on your website AND on some appropriate archive. Also post improvements if there are some (the version on your website may be better than the one in the journal!). You would think this is standard practice by now, but alas, it is not. In particular AS SOON AS YOU SUBMIT the final version to a conference it should go on your website. The fact that 12 papers from FOCS 2009 are still not online shows that there is plenty of improvement to be made on our end.
  2. If you give a talk on a paper then post the slides and if possible a video of the talk, along with the paper, on your website. On the archives perhaps post a link to the slides and video.
  3. If you have old papers that are not available online (it is common to go to a website and see that only papers past, say, 1995 are online) then get them scanned in (some schools do this for you) and put them on your website. Do not be humble. That is, do not think nobody cares about that old paper. Somebody might.
  4. If you go to a website and it has a link to a paper, but the link does not work, then email the website owner. Do not be shy. I have done this with some of the best people in our field and they have been grateful.
  5. When you write a paper make sure that all of the bibliography entries include all links where the paper is available: the journal website, the author's website, some archives. If people follow items 1, 2, 3, 4 above then the issue of what if the journal is not online? will become irrelevant. There may still be a problem with older articles; however, this will also become irrelevant over time.
  6. If you gather up papers in an area for your own use, then you may want to make a website out of them. I have done this with the Erdos Distance Problem, Private Information Retrieval, Constructive Lower Bounds on Ramsey Numbers, Applications of Ramsey Theory, and of course VDW Theorem stuff. There may be some legal issues here, and also some issues of what the publishers will enforce. I have no guidelines to offer, so I leave it to you. However, I will note that I've had my Applications of Ramsey Theory site up for at least 5 years with no problem. Also, note that this point is optional. I suspect that, more than the other items, it's one that people think other people should do. (I understand that it can be a pain in the neck to maintain such sites.) THIS ITEM IS OPTIONAL.
  7. If you get a contract to write a book make sure they allow you to post it free online. Blown to Bits, by Abelson, Ledeen, and Lewis is available this way. So is A=B by Petkovsek, Wilf, and Zeilberger. (I understand that if you are writing an undergrad textbook and expect to make real money on it then you may not want to do this.)