Computational Complexity and other fun stuff in math and computer science from Lance Fortnow and Bill Gasarch
Friday, October 30, 2009
FOCS News
Thursday, October 29, 2009
Proud to be a Monomath
“Even in relatively soft fields, specialists tend to develop a specialized vocabulary which creates barriers to entry,” Posner says with his economic hat pulled down over his head. “Specialists want to fend off the generalists. They may also want to convince themselves that what they are doing is really very difficult and challenging. One of the ways they do that is to develop what they regard as a rigorous methodology—often mathematical.
“The specialist will always be able to nail the generalists by pointing out that they don’t use the vocabulary quite right and they make mistakes that an insider would never make. It’s a defense mechanism. They don’t like people invading their turf, especially outsiders criticizing insiders. So if I make mistakes about this economic situation, it doesn’t really bother me tremendously. It’s not my field. I can make mistakes. On the other hand, for me to be criticizing someone whose whole career is committed to a particular outlook and method and so on, that is very painful.” [Spelling Americanized]
Wednesday, October 28, 2009
FOCS Last Day
Jan Vondrak talked about some nice results that unify previous query complexity approaches to proving lower bounds for submodular function maximization. For a general class of functions he proves that good approximations require an exponential number of queries to the value oracle. This duplicates known inapproximability results, and also implies new bounds for two particular problems.
Another interesting work was presented by Heiko Röglin, who proved bounds on the size of Pareto-optimal sets in the context of multi-objective optimization. In the general case it is almost always possible to find problem instances with exponentially large Pareto sets, which renders any approach based on analyzing Pareto sets infeasible. However, in many real-world applications these sets turn out to be small. Röglin and Shang-Hua Teng give a possible theoretical explanation for this in the framework of smoothed analysis, proving that the expected number of Pareto-optimal solutions is polynomially bounded.
To conclude the 50th FOCS, Ryan Williams gave a much-appreciated talk on new approaches to Boolean matrix multiplication. The aim of the paper was to come up with a more efficient combinatorial algorithm for Boolean matrix multiplication. Ryan and Nikhil Bansal succeed in this direction by connecting Boolean matrix multiplication to the triangle removal lemma. Their approach gives the first improvement on general upper bounds for this problem in 40 years!
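For readers who want to see the baseline these combinatorial algorithms compete against, here is the naive cubic Boolean matrix product in Python. This is only the textbook algorithm, not the Bansal-Williams method; the function name and representation are my own.

```python
def bool_matmul(A, B):
    """Naive Boolean matrix product: C[i][j] = OR over k of (A[i][k] AND B[k][j]).

    Runs in O(n^3) time; this is the cubic baseline that combinatorial
    algorithms for Boolean matrix multiplication try to improve on.
    """
    n = len(A)
    C = [[False] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            C[i][j] = any(A[i][k] and B[k][j] for k in range(n))
    return C
```

Multiplying by the identity matrix returns the other operand unchanged, which makes a handy sanity check.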
Tuesday, October 27, 2009
FOCS Day 2
In the morning, two papers related to the computational complexity of equilibria were presented. In the first one, Xi Chen, Decheng Dai, Ye Du and Shang-Hua Teng consider Arrow-Debreu markets in which agents' utilities are additively separable and piecewise-linear concave. In this context, via an interesting reduction from 2-player games, they were able to show that the problem is PPAD-complete, confirming that even additively separable concave utilities are not easy to deal with. In another talk, Laura J. Poplawski et al. introduced a simple PPAD-complete game, claiming it is a natural candidate for PPAD-completeness reductions. To support their claim they show how it naturally reduces to known (and also to many previously unknown) PPAD-complete problems.
Also in the morning session, an interesting paper by Andrea Montanari and Amin Saberi addressed coordination games and their relation to the spread of news, technologies and other social phenomena on networks. They were able to show how convergence times depend directly on certain graph properties.
A bit later in the morning, the winning student papers were presented. Alexander Sherstov showed his work on strongly improved bounds on the degree of intersections of halfspaces. Jonah Sherman was next, presenting improvements to sparsest-cut algorithms.
Bringing the day to a close, session 8B covered three of our favorite topics: social network formation, revenue maximization, and mechanism design. What more could we ask for in one session?
Monday, October 26, 2009
FOCS Day 1
Following this blog we have learned many useful things. One of them is not to rank talks; another is not to limit posts to dry descriptions of the presentations we attended.
So you're stuck with our (admittedly not so interesting) impressions and thoughts. The quality and level of the work presented is amazing, and often wanders above our heads. However, it has been a very pleasant surprise that the talks are often so good and entertaining that they become manageable even when the topics are outside our background. So only positive impressions came from Day 1.
As a colorful aside, we will mention another conference taking place in the next room, aimed at soon-to-be-extremely-wealthy people. The mingling of the two communities during coffee breaks is often entertaining and amusing, and we have to say we spotted a prominent computer scientist carrying around the "proceedings" of this other meeting... Is this some kind of sign? And if so, how should we interpret it?
Sunday, October 25, 2009
FOCS Day 0
The organization on day 0 was fantastic. Fox had even reserved a parking lot for us!!
Wait a second.
Are we sure this is where we're supposed to be?
Luckily enough we met more conscientious computer scientists who pointed us in the right direction: the FOCS conference! What a novice mistake.
We finally made it to Georgia Tech's LeCraw Auditorium for Day 0 of the FOCS conference. Also called Theory Day, it was a dual celebration marking the 50th anniversary of FOCS and the 20th anniversary of Georgia Tech's ACO program.
Theory Day started off in the most appropriate way with a talk by Richard Karp who shared his favorite algorithms, pointing out the many different qualities that can make an algorithm "great."
After that came two slightly more technical talks, by Mihalis Yannakakis and Noga Alon, on the computational complexity of equilibria and on path disjointness in graphs.
As a very fitting conclusion to a 50th anniversary, the last talk gave some fascinating views and possible directions for computer science: Manuel Blum talked about getting to grips with consciousness, with some new neuroscience models that could shed some light (in a very literal sense) on how computers could eventually prove their own theorems.
Day 0 ended with the first official FOCS 2009 event: the reception, where much food and camaraderie could be found. Afterwards, the attendees of FOCS '09 returned to their hotel rooms to dream of algorithms and optimization.
Friday, October 23, 2009
Notes on Dagstuhl
 I could tell you about the talks, but the website does a better job: here

I value going to talks more than Lance does, which is odd
since he probably gets more out of them. Nevertheless,
they serve two functions:
 Lets me know of new results or even areas I should look into (e.g., Chattopadhyay's talk on Linear Systems over Composite Moduli was news to me and is something I want to learn about.)
 Makes me feel guilty for not having already looked into an area (e.g., Anna Gal's talk on Locally Decodable Codes reminded me that I really should learn or relearn some of that material and update my PIR website; Swastik's talk on parity quantifiers and graphs reminds me that I should read a book that has been on my shelf for about 10 years: The Strange Logic of Random Graphs by Joel Spencer.)
 I thought that I was the only person going to both Dagstuhl and Ramsey Theory in Logic and ... so that I could give the same talk in both places. But Eli Ben-Sasson will be at both. He missed my talk at Dagstuhl, though; he'll see a better version in Italy since I have made some corrections.
 At Dagstuhl there is a book in which people handwrite their abstracts. In this digital, hardwired age do we need this? In case a nuclear bomb goes off and all information stored electronically is erased, we can rebuild civilization based on those handwritten notebooks. The first thing people will want to know after they crawl out of their bomb shelters will be: is there an oracle separating BQP from PH? And Scott Aaronson's handwritten abstract will tell them. (I hope his handwriting is legible.)
 Peter Bro Miltersen gave an excellent talk. His open problems were very odd. He did NOT say I want to solve problem such-and-such. He instead said I want to find a barrier result showing that problem such-and-such is hard. It seems to me that one should really try hard to solve a problem before proving it's hard. Then again, by trying to prove it's hard he may solve it. I hope that if he solves it he won't be disappointed. Maybe his paper will be titled A proof that showing that problem such-and-such is hard is itself hard.
 One odd benefit of being at Dagstuhl: I spend some time in the library reading articles that I really could have read at home but somehow never have time for. Here is one result I read about and will blog about later: given a distance d and n points in a circle of diameter 1, how many pairs of points are you guaranteed to have at distance ≤ d from each other?
 Next week I'll be in Italy and not posting, so you get Lance Fortnow all five days! Let's hope they are not all on P vs NP.
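A quick way to get a feel for the library question above is to count close pairs for concrete point sets. This is a tiny brute-force sketch of my own (the function name and sample points are mine, not from the article I read):

```python
from itertools import combinations
from math import dist

def close_pairs(points, d):
    """Count pairs of points at Euclidean distance <= d (O(n^2) brute force)."""
    return sum(1 for p, q in combinations(points, 2) if dist(p, q) <= d)

# Three collinear points spanning a diameter-1 circle:
# exactly two pairs are within distance 0.5.
print(close_pairs([(0, 0), (0, 0.5), (0, 1)], 0.5))  # prints 2
```

Of course the interesting question is the guaranteed minimum over all placements, which brute force over a single placement does not answer.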
Thursday, October 22, 2009
Need a Roommate for a conference?
Sorelle has announced a new roommate-finding forum for conferences: http://conferences.proboards.com. I think this forum is a great way to help graduate students and other cash-strapped conference attendees save money, and I encourage future conference organizers to link to it.
Wednesday, October 21, 2009
The Humbling Power of P v NP
Every theorist should try to prove P = NP and P ≠ NP early in their career to understand why the problem is so difficult to solve. So go ahead and find a polynomial-time algorithm to 3-color graphs. Show that clique can't be solved by small circuits. Not because you will succeed but because you will fail. No matter what teeth you sink into P vs NP, the problem will bite back. Think you've solved it? Even better. Not because you did but because when you truly understand why your proof has failed you will have achieved enlightenment.
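To make the 3-coloring challenge concrete, here is the brute-force algorithm you are trying to beat, a sketch of my own in Python; its 3^n loop is exactly what a polynomial-time algorithm would have to shortcut.

```python
from itertools import product

def three_colorable(n, edges):
    """Decide 3-colorability by trying all 3^n colorings of vertices 0..n-1.

    Exponential time, of course; whether this loop can be replaced by a
    polynomial-time algorithm is (for 3-coloring) the P vs NP question.
    """
    return any(
        all(c[u] != c[v] for u, v in edges)  # every edge bichromatic
        for c in product(range(3), repeat=n)
    )
```

A triangle is 3-colorable; the complete graph on four vertices is not.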
Tuesday, October 20, 2009
An unintentional Sociology of Blogs experiment
 So far 12 books have been claimed. More than 12 people made requests, but some of the books were already claimed. Of the 12, only one has reviewed for me before. The typical column generates 3 or 4 requests, and even those are usually from people who have reviewed before. Why is the blog so much more effective than my column at finding people to write book reviews?
 The most popular book by far was not Knuth Vol 4 Fascicle 0, nor any of the other algorithms or crypto books (usually a popular topic), but the book Concentration of Measure for the Analysis of Randomized Algorithms. Why that one? The catchy title? The hotness of the topic? Perhaps people already have several of the other books on my list from the publishers, since those are potential textbooks, but this one is not in that category.
 One of the commenters wanted to know if you need to be an expert to read a book. NO. But you do have to be interested, have the background to read it, and actually read it.
 One of the commenters wanted to know what level the books were at. Alas. In the far future there may be a way to take a book title, type it into what might be some kind of Engine of Search, and find out more about it. Next time I do this I will supply more information about the books, unless by some miracle some sort of Search Technology has evolved to make this unnecessary.
 Below I have the revised list with the books that are already claimed removed.
 Deadline for reviews is Jan 14, 2010, though the intention is to finish them before your next semester starts. If you need more time then tell me. The LaTeX template is here. Plain text is also fine if the review does not have too much math in it.
Books on Algorithms and Data Structures
 Algorithmic Adventures: From Knowledge to Magic by Juraj Hromkovic.
 Algorithms and Data Structures: The Basic Toolbox by Mehlhorn and Sanders.
 The Algorithm Design Manual by Skiena.
 Combinatorial Geometry and its Algorithmic Applications: The Alcala Lectures by Pach and Sharir.
 Algorithms for Statistical Signal Processing by Proakis, Rader, Ling, Nikias, Moonen, Proudler.
 Nonlinear Integer Programming by Li and Sun.
 Binary Quadratic Forms: An Algorithmic Approach by Buchmann and Vollmer.
 Parallel Algorithms by Casanova, Legrand, and Robert.
 Mathematics for the Analysis of Algorithms by Greene and Knuth.
 Concentration of Measure for the Analysis of Randomized Algorithms by Dubhashi and Panconesi.
 Vehicular Networks: From Theory to Practice Edited by Olariu and Weigle.
Books on Cryptography
 Introduction to Modern Cryptography by Katz and Lindell.
 Concurrent Zero-Knowledge by Alon Rosen.
 Elliptic Curves: Number Theory and Cryptography by Washington.
 Secure Key Establishment by Choo.
 Algebraic Cryptanalysis by Bard.
 A Course in Number Theory and Cryptography by Koblitz.
 Cryptanalytic Attacks on RSA by Yan.
Books on Coding Theory
 Algebraic Function Fields and Codes by Stichtenoth.
 Applied Algebra: Codes, Ciphers, and Discrete Algorithms by Hardy, Richman, and Walker.
Books on Theory of Computation
 The Calculus of Computation: Decision Procedures with Applications to Verification by Bradley and Manna.
 Models of Computation: An introduction to Computability Theory by Fernandez.
Combinatorics
 Applied Combinatorics by Roberts and Tesman.
 A Course in Enumeration by Aigner.
 Chromatic Graph Theory by Chartrand and Zhang.
 Design Theory by Lindner and Rodger.
 Combinatorial Methods with Computer Applications by Gross.
 A Combinatorial Approach to Matrix Theory and Its Applications by Brualdi and Cvetkovic.
Misc Books
 Quantum Computer Science: An Introduction by Mermin.
 Complex Social Networks by Vega-Redondo.
 Branching Programs and Binary Decision Diagrams by Wegener.
 When Least is Best: How Mathematicians Discovered many clever ways to make things as small (or as large) as possible by Nahin.
 Stories about Maxima and Minima by Tikhomirov.
 Decisions and Elections: Explaining the Unexpected by Saari.
 Creative Mathematics by Wall.
 Is Mathematics Inevitable? A Miscellany Edited by Underwood Dudley.
 Comprehensive Mathematics for Computer Scientists 1: Sets and numbers, graphs and algebra, logic and machines, linear geometry by Mazzola, Milmeister, and Weissmann.
 Difference Equations: From Rabbits to Chaos by Cull, Flahive, and Robson.
 A Concise Introduction to Data Compression by Salomon.
 Practical Text Mining with Perl by Roger Bilisoly.
Monday, October 19, 2009
List of books I want reviewed
Books on Algorithms and Data Structures
 The Art of Computer Programming Vol 4, Fascicle 0: An Introduction to Combinatorial Algorithms and Boolean Functions by Donald Knuth.
 Algorithmic Adventures: From Knowledge to Magic by Juraj Hromkovic.
 Matching Theory by Lovasz and Plummer.
 Algorithms and Data Structures: The Basic Toolbox by Mehlhorn and Sanders.
 The Algorithm Design Manual by Skiena.
 Algorithms on Strings by Crochemore, Hancart, and Lecroq.
 Combinatorial Geometry and its Algorithmic Applications: The Alcala Lectures by Pach and Sharir.
 Algorithms for Statistical Signal Processing by Proakis, Rader, Ling, Nikias, Moonen, Proudler.
 Nonlinear Integer Programming by Li and Sun.
 Binary Quadratic Forms: An Algorithmic Approach by Buchmann and Vollmer.
 Time Dependent Scheduling by Gawiejnowicz.
 The Burrows-Wheeler Transform: Data Compression, Suffix Arrays, and Pattern Matching by Adjeroh, Bell, and Mukherjee.
 Parallel Algorithms by Casanova, Legrand, and Robert.
 Mathematics for the Analysis of Algorithms by Greene and Knuth.
 Concentration of Measure for the Analysis of Randomized Algorithms by Dubhashi and Panconesi.
 Handbook of Large Scale Random Networks Edited by Bollobas, Kozma, and Miklos.
 Vehicular Networks: From Theory to Practice Edited by Olariu and Weigle.
Books on Cryptography
 Introduction to Modern Cryptography by Katz and Lindell.
 Concurrent Zero-Knowledge by Alon Rosen.
 Introduction to cryptography: Principles and Applications by Delfs and Knebl.
 Elliptic Curves: Number Theory and Cryptography by Washington.
 Secure Key Establishment by Choo.
 Algebraic Cryptanalysis by Bard.
 An Introduction to Mathematical Cryptography by Hoffstein, Pipher, and Silverman.
 A Course in Number Theory and Cryptography by Koblitz.
 Cryptanalytic Attacks on RSA by Yan.
Books on Coding Theory
 Algebraic Function Fields and Codes by Stichtenoth.
 Coding for Data and Computer Communications by David Salomon.
 Applied Algebra: Codes, Ciphers, and Discrete Algorithms by Hardy, Richman, and Walker.
Books on Theory of Computation
 The Calculus of Computation: Decision Procedures with Applications to Verification by Bradley and Manna.
 Computability of the Julia Sets by Braverman and Yampolsky.
 Models of Computation: An introduction to Computability Theory by Fernandez.
Combinatorics
 Applied Combinatorics by Roberts and Tesman.
 Combinatorics the Rota Way by Kung, Rota, and Yan.
 A Course in Enumeration by Aigner.
 Chromatic Graph Theory by Chartrand and Zhang.
 Design Theory by Lindner and Rodger.
 Combinatorial Methods with Computer Applications by Gross.
 A Combinatorial Approach to Matrix Theory and Its Applications by Brualdi and Cvetkovic.
Misc Books
 Quantum Computer Science: An Introduction by Mermin.
 Complex Social Networks by Vega-Redondo.
 Branching Programs and Binary Decision Diagrams by Wegener.
 When Least is Best: How Mathematicians Discovered many clever ways to make things as small (or as large) as possible by Nahin.
 Stories about Maxima and Minima by Tikhomirov.
 Decisions and Elections: Explaining the Unexpected by Saari.
 Creative Mathematics by Wall.
 Is Mathematics Inevitable? A Miscellany Edited by Underwood Dudley.
 Comprehensive Mathematics for Computer Scientists 1: Sets and numbers, graphs and algebra, logic and machines, linear geometry by Mazzola, Milmeister, and Weissmann.
 Difference Equations: From Rabbits to Chaos by Cull, Flahive, and Robson.
 Mathematical Tools for Data Mining by Simovici and Djeraba.
 A Concise Introduction to Data Compression by Salomon.
 Practical Text Mining with Perl by Roger Bilisoly.
Friday, October 16, 2009
Typecasting Again
Thursday, October 15, 2009
Thanks for the Fuzzy Memories
Manindra Agrawal and Thomas Thierauf used the splitting lemma to give a polynomial-time algorithm for factoring. They got quite excited and couldn't find a bug in their proof. Apparently I suggested to Thomas at the time that he patent the algorithm before they publish. But later Klaus Wagner showed the splitting lemma actually implied P = NP, Agrawal went back and found that the proof of the splitting lemma wasn't correct, and Jochen Messner found a counterexample.
Manindra wrote a retraction for Theoretical Computer Science but for some reason it wasn't published.
I've seen this before. If a technical lemma doesn't imply anything particularly surprising, it might not get checked very carefully. But only later when researchers start using it to get amazing results do people go back and realize there was a problem with the lemma after all.
Flash forward more than a decade to this week at Dagstuhl. Harry Buhrman, John Hitchcock and Harry's student Bruno Loff came to Dagstuhl all excited. They proved that SAT being disjunctive truth-table reducible to a sparse set implies SAT is reducible to a weighted threshold, and by Agrawal-Arvind thus implies P = NP, answering a longstanding open question. We tried to understand the Agrawal-Arvind paper, and Thomas Thierauf "vaguely remembered" there might have been a bug in the splitting lemma.
Arvind, who is here at Dagstuhl, didn't remember the paper well. Manindra, one of the organizers of this week's workshop, is not here at all but had to attend an IIT event in, of all places, Chicago. Eventually Harry called Manindra, who acknowledged the bug.
Once the bug was revealed, Jacobo Torán, Messner's advisor, told me the whole sordid story above, and Nikolai Vereshchagin showed the splitting lemma is true for n ≤ 4 and found counterexamples for n ≥ 5. After I talked about the conjecture at the open problem session last night, at breakfast Amir Shpilka showed me an easy proof that the splitting lemma fails even if you allow an infinite number of points for n = 5.
Don't worry, Manindra's proof with Kayal and Saxena that Primes are in P is still safe.
Wednesday, October 14, 2009
Bill and Lance's Excellent German Adventure
Tuesday, October 13, 2009
Dagstuhl Day One
Monday, October 12, 2009
The Molly Solution
Friday, October 09, 2009
Thursday, October 08, 2009
Publicity for P versus NP
The issues center around a group of classic computer science problems, like the traveling salesman problem — how to figure out the most efficient route through a number of cities given certain constraints. Factoring a large number is also a good example of what is referred to as an NP problem. At present, there is no understood method for doing it in polynomial time, but given an answer it is easy to check to see if it is correct…Checking to see if any particular solution is correct is simple, but finding the best solution in the case of very large problems is infinitely difficult.
An esoteric branch of mathematics known as algebraic geometry may offer an avenue to prove or disprove the problem, which was first outlined by Stephen A. Cook, a University of Toronto computer scientist, in 1971.

As you readers should know, while Cook (and independently Leonid Levin in Russia) first outlined the P vs. NP problem, the much more recent algebraic geometry approach is primarily due to my former U. Chicago colleague Ketan Mulmuley.
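The "easy to check" half of the quote can be made concrete. Here is a hedged Python sketch (my own illustration, not from the article): verifying a proposed traveling-salesman tour against a budget takes polynomial time, even though finding a good tour may not.

```python
from math import dist

def verify_tour(cities, tour, budget):
    """Polynomial-time check of a TSP certificate: `tour` must visit every
    city exactly once, and the cyclic tour length must be at most `budget`.
    """
    n = len(cities)
    if sorted(tour) != list(range(n)):
        return False  # not a permutation of all the cities
    length = sum(dist(cities[tour[i]], cities[tour[(i + 1) % n]])
                 for i in range(n))
    return length <= budget
```

The checker runs in O(n log n) time; an NP problem is exactly one whose yes-answers admit such a fast certificate check.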
Wednesday, October 07, 2009
The last Universal Mathematician was ???
You will see it written that Hadamard was the last of the universal mathematicians, the last, that is, to encompass the whole of the subject before it became so large that this was impossible. However, you will also see this said of Hilbert, Poincaré, Klein, and perhaps of one or two other mathematicians of the period. I don't know to whom the title most properly belongs, though I suspect the answer is actually Gauss.
 I have never heard Hadamard or Klein mentioned as such, though this does not mean it has not been said.
 I would add Kolmogorov to the list of candidates.
 My wife would add Aristotle and Archimedes to the list.
 Gauss seems like a good choice to me.
 Do any current mathematicians qualify? One stumbling block: I do not know of anyone lately who has made serious contributions to both logic and some non-logic branch of mathematics.
 It is, of course, an ill-defined question. Is there a way to better define it and answer it?
Tuesday, October 06, 2009
You are cordially invited... Why?
EXAMPLE ONE:
EXAMPLE TWO:
It is my pleasure to invite you to publish at the Journal of Emerging Technologies in Web Intelligence (JETWI) (ISSN 1798-0461), published by the Academy Publisher in Finland. JETWI is a peer reviewed and indexed international journal and is teamwork of the distinguished editorial board along with the Academy Publisher to respond to the emerging research needs in the evolving area of Web Intelligence and related Technologies. The aim, scope and the targets of this journal are listed at the journal Web Site: click here

This is badly targeted. I am not in the area. So why did I get this email? There is no harm to the sender in sending it to a LOT of people since it doesn't cost anything. I wish that, as a courtesy, they would do better targeting. For me a clear DECLINE. Should I email them and tell them to take me off of their list? I doubt it would work and might even be counterproductive. I hope they read this blog entry and take me off of the list. (I also hope to show P ≠ NP.)
EXAMPLE THREE:
There is a book that won't be written for many years, yet it's here today for you to review. Courtesy of Jahred Kammen comes a short but salient retrospective on the life and impact of Ted Kennedy. The complete work is available at Freedompressintl@aol.com. J. D. Russell, President, Freedom Press International

Why did this get sent to me? I get many such emails since I run a book review column for SIGACT NEWS and book publishers are not that good at targeting. While I am tempted to get free books that I have no intention of reviewing, I refrain. I wish they did better targeting, but I do get to hear about books that are out there that I might not have heard about otherwise.
EXAMPLE FOUR: (I leave it for you to judge this one)
BECAUSE YOU DESERVE IT! Is your lack of a degree holding you back from career advancement? Are you having difficulty finding employment in your field of interest because you don't have the paper to back it up, even though you are qualified? If you are looking for a fast and effective solution, we can help! Call us right now for your customized diploma: Inside U.SA.: 17189895746 Outside U.S.A.: +17189895746. Just leave your NAME & TEL. PHONE # (with countrycode) on the voicemail and one of our staff members will get back to you promptly!
Monday, October 05, 2009
Two Recent Complexity Books omit Mahaney's theorem: oversight or wisdom?
As a sequel to that, I note the following: the theorem is in NEITHER Goldreich's Complexity book NOR the Arora-Barak Complexity book. One of the following is true:
 Gasarch is right and Goldreich and Arora-Barak are wrong. It should have been in their books.
 Gasarch is wrong and Goldreich and Arora-Barak are right. It's okay that it's not in their books.
 It is in the books but Gasarch couldn't find it.
When I taught Graduate Algorithms (I really did!) I taught Yao's MST algorithm that runs in time O(E log log V). The students asked me WHY this was important (there were better algorithms available). Then I realized that I taught it only because Samir Khuller taught it when he taught the class, so I asked him why it was important. He said it very much impressed him when he was a graduate student.
The contents of the Goldreich book and AroraBarak book are influenced by a deep understanding of the field, and not by things that impressed them in their youth.
Is Mahaney's theorem important? The real question might be: compared to what? As the number of things that you absolutely have to have in a course grows, at some point something has to be tossed out.
Friday, October 02, 2009
On Being Narrow
David wonders why everyone always assumes that all he can do is quantum computing. Oh, because that's all he's done. Time to do something new?

What David (aka the Quantum Pontiff) says can apply to any discipline. Are you a narrow researcher? Take my simple one-question test.
Do all your coauthors know each other?

Being narrow has some advantages but mostly disadvantages.
 You can really know an area, particularly in a field like quantum computing, which has a steep learning curve before you can become and remain an active researcher.
 You are part of a very tight community. These people know you well. You look forward to seeing them at workshops and conferences. They can write you wonderful recommendation letters (and vice versa).
 On the flip side, you don't know many people outside your research area and they don't know you.
 Narrow researchers often lose themselves in the minutiae of the field, looking at questions whose importance seems obvious to those inside the field but is impossible to explain to outsiders.
 You tend to go to smaller, more focused workshops and conferences because the broader meetings have little that interests you. This further isolates you from the larger community.
 As David laments, others view you as focused and unable to solve problems outside your field.
 As your field loses importance, so do you.
 There are fewer grant programs you can apply to, and you have coauthored with most of the people who could properly evaluate your research, making them ineligible to review your proposals.
Thursday, October 01, 2009
The Journal Manifesto 2.0
Reminder: FOCS early registration deadline today. Go here
Below is a revised version of the Journal Manifesto. I restate the key sentence from my last post and then redo the manifesto in light of the (very enlightening!) comments that my last post got.
However, I have a Manifesto: a list of actions that we as individuals can take to make the academic world a better place.

Keep in mind: I am NOT talking to the NSF or to journal publishers or to conference organizers. I am NOT going to say what any of these people should do. I am talking to US, the authors of papers. If WE all follow this manifesto then the problems of high-priced journals and limited access may partially go away on their own. To be briefer: to the extent that WE are the problem, WE can be the solution.
REQUEST: Leave as comments a simple YES if you agree with the manifesto and a NO and a reason if you do not. I may set this up as a formal online thing that people can sign off on if there is enough interest.
Preamble: We academics publish papers so they can be read. We want anyone in the world to be able to read our papers. We do not expect any money in exchange. Here is what we can all do to facilitate this, not just for our papers but for everyone's papers.
 Whenever you publish a paper in a conference or journal, post it on your website AND on some appropriate archive. Also post improvements if there are any (the version on your website may be better than the one in the journal!). You would think this is standard practice by now, but alas, it is not. In particular, AS SOON AS YOU SUBMIT the final version to a conference it should go on your website. The fact that 12 papers from FOCS 2009 are still not online shows that there is plenty of improvement to be made on our end.
 If you give a talk on a paper then post the slides and if possible a video of the talk, along with the paper, on your website. On the archives perhaps post a link to the slides and video.
 If you have old papers that are not available online (it is common to go to a website and see that only papers past, say, 1995 are online), then get them scanned in (some schools do this for you) and put them on your website. Do not be humble. That is, do not think Nobody cares about that old paper. Somebody might.
 If you go to a website and it has a link to a paper but the link does not work, then email the website owner. Do not be shy. I have done this with some of the best people in our field and they have been grateful.
 When you write a paper, make sure that all of the bibliography entries include all links where the cited paper is available: the journal website, the author's website, archives. If people follow items 1, 2, 3, and 4 above then the issue of What if the journal is not online? will become irrelevant. There may still be a problem with older articles; however, this will also become irrelevant over time.
 If you gather up papers in an area for your own use, then you may want to make a website out of them. I have done this with the Erdos Distance Problem, Private Information Retrieval, Constructive Lower Bounds on Ramsey Numbers, Applications of Ramsey Theory, and of course VDW Theorem stuff. There may be some legal issues here, and also some issues of what the publishers will enforce. I have no guidelines to offer, so I leave it to you. However, I will note that my Applications of Ramsey Theory site has been up for at least 5 years with no problem. I suspect that, more than the other items, this is one that people think other people should do. (I understand that it can be a pain in the neck to maintain such sites.) THIS ITEM IS OPTIONAL.
 If you get a contract to write a book, make sure they allow you to post it free online. Blown to Bits, by Abelson, Ledeen, and Lewis, is available this way. So is A=B by Petkovsek, Wilf, and Zeilberger. (I understand that if you are writing an undergrad textbook and expect to make real money on it then you may not want to do this.)