Thursday, January 31, 2008

If you get a new result of interest with little effort...



Over the weekend I made an observation that gives a new (is it new?) lower bound on W(k,c). I will describe what all of this means, but my real interest is in the meta question: if you have an easy proof of an interesting result, what should you make of that?
Van der Waerden's theorem: For all k, for all c, there exists W=W(k,c) such that for all c-colorings of {1,...,W} there exist a and d ≥ 1 such that a, a+d, ..., a+(k-1)d are all the same color.
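To make the definition concrete, W(3,2)=9. Here is a minimal brute-force sketch (Python) that computes W(k,c) straight from the definition; it is only feasible for tiny k and c, since it enumerates all c^n colorings.

    from itertools import product

    def has_mono_ap(coloring, k):
        # coloring is a tuple; coloring[i-1] is the color of i in {1,...,n}.
        n = len(coloring)
        for a in range(1, n + 1):
            d = 1
            while a + (k - 1) * d <= n:
                if len({coloring[a + i * d - 1] for i in range(k)}) == 1:
                    return True  # found a monochromatic k-term AP
                d += 1
        return False

    def vdw(k, c):
        # Smallest W such that EVERY c-coloring of {1,...,W} has a mono k-AP.
        n = 1
        while not all(has_mono_ap(col, k) for col in product(range(c), repeat=n)):
            n += 1
        return n

    print(vdw(3, 2))  # prints 9, the classical value of W(3,2)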
Chandra-Furst-Lipton showed (Lemma 4.4)
If there exists A, a k-free set (a set with no arithmetic sequence of length k) that is a subset of [n], then there is a c-coloring of [n] with no monochromatic arithmetic sequence of length k, where c = O((n log n)/|A|).
Laba and Lacey showed
There is a constant a such that, for all k that is a power of 2 and for all n, there is a k-free set of size n·exp(-a(log n)^{1/log k}). (Their result is sharper than this, but this will suffice for us. They acknowledge that the result was already known in 1960 by Rankin.)
If you combine these two you obtain
W(k,c) ≥ exp((log c)^{Ω(log k)})
I have looked at the literature and used Google, and this appears to be new. I seem to have proven something new and interesting with very little effort. I pose the questions that anyone should pose in this situation and answer them for my case.
  1. Is the result new? I think so- I've been looking at VDW papers for about 5 years now and have not run across it. Even so, I would not be surprised (or even disappointed) if someone points to some paper I missed.
  2. Is the result interesting? YES. Upper bounds on VDW numbers have gotten a lot of attention. Lower bounds less so, but some. And they are a nice check on upper bounds.
  3. Is the proof correct? In this case it's just too easy to have gotten wrong. (Famous last words...)
  4. Are the results you used not commonly known to the same people? While the Chandra et al. paper is not well known to combinatorics people, it's the same principle as the Symmetric Hypergraph Lemma, which is known. However, the k-free set result is not that well known.
  5. Did the other people who knew all this stuff not care? Quite possible that the k-free set folks didn't care about this, or didn't think it was worth writing down.
  6. Are you connecting two areas that have not been connected before? NO- k-free sets were studied because of VDW theorem.
  7. What to do with the result? It deserves to be out there (for one thing, if I publicize it I may find out whether it's already known). I will probably write it up as a short note for some combinatorics journal (will check which ones take notes), and may put it on arXiv. Or I might not bother hassling with referees. (For more on that train of thought, see Zeilberger's Opinion 77.)

Wednesday, January 30, 2008

If 50 is the new 40 then is 100 the new 80?

If you google 50 is the new 40 you get over 11,000 hits (fifty is the new forty gets only 961 hits). What does the phrase mean? It means that what people a while back did in their 40's they are now doing in their 50's. (Dating, marrying, having (perhaps more) kids, changing jobs, taking adult education classes, etc.) I tend to agree with this--- I often think that people in their 50's look like they are in their 40's. (The next generation won't have this problem, as they will have adjusted to the shift--- unless there is another shift.)

So, if 50 is the new 40, then is 60 the new 50? Here is what Google says:
  1. 60 is the new 50 got 3430 hits.
  2. 70 is the new 60 got 788 hits.
  3. 80 is the new 70 got 860 hits.
  4. 90 is the new 80 got 193 hits.
  5. 100 is the new 90 got 8 hits.
  6. 110 is the new 100 got 2 hits.
  7. 120 is the new 110 got 0 hits. You're 120! You don't look a day over 110!
(Note- these may go up because of this post.)

My intuition says that 120 is not the new 110. In fact, I think that 100 is still 100. So we are looking for a function f such that
  1. f(50)=40
  2. for all x ≥ 100, f(x)=x
Linear? Log? Piecewise linear? Is there a way to really find such an f? Only if you want to replace the intuitive statement 50 is the new 40 with a more rigorous one. Here is one example. Not sure what it would yield, or if you would still really have 50 is the new 40 and 100 is the new 100. Let
  1. g1(x) = the life expectancy of someone who was x years old in 1950
  2. g2(x) = the life expectancy of someone who was x years old in 2008
(You may pick other functions that measure something about life then and now.) Then one could define f(x) to be the age y with g1(y) = g2(x), so that today's x-year-old corresponds to the 1950 y-year-old with the same life expectancy.



With some rigorous definition you could really answer this question. But it may be more fun to put your math hat aside and just see what your intuition tells you. Mine says
  1. 50 is the new 40.
  2. 60 is the new 52.
  3. 70 is the new 64. (will you still need me, will you still feed me, when I'm 70?)
  4. 80 is the new 76.
  5. 90 is the new 88.
  6. 100 is the new 100. Or the old 100.
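Amusingly, the guesses above all lie on one line: f(x) = 1.2x - 20 passes through every pair from (50,40) up to (100,100), after which f is the identity. A tiny sketch (Python; the formula is just a fit to the guesses above, not a claim about actual aging):

    def f(x):
        # Fit to the intuition list: linear from (50,40) to (100,100),
        # identity from 100 on.  Below 50 it merely extrapolates.
        return x if x >= 100 else 1.2 * x - 20

    for x in (50, 60, 70, 80, 90, 100, 110):
        print(f"{x} is the new {f(x):g}")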

Tuesday, January 29, 2008

Announcement from source at NSF

(Guest posted by request from unnamed source at NSF.)

The Women in Science Award of the Maria Mitchell Association will recognize an individual who has worked to increase the participation and advancement of girls and/or women in science and mathematics.

To be considered for the Maria Mitchell Women in Science Award an individual must:
  1. Demonstrate consistent leadership and support for the advancement of girls and women in the fields of natural and physical sciences, mathematics, engineering, computer science or technology.
  2. Be someone who served as a mentor, role model or key player in a program designed specifically to encourage and advance girls and women in the fields of science, mathematics and technology.
  3. Be a United States citizen.
For more information on the award, please visit our website here. Nomination forms must be postmarked by March 15, 2008.

Monday, January 28, 2008

The Lance-Food Diet

As my long-time readers know, I enjoy a good Francesinha as much as the next guy, but around the time of FCRC last June I realized I was simply getting too fat and had to do something about it. I would never last eating only so-called "healthy" food, so I tried a different approach, still eating the foods I love, and I dropped forty pounds, reaching my goal weight in November.

I should write a little pamphlet and sell it for $39.99 but for you readers I reveal my secrets for free.

  • I went cold turkey on sweets. No brownies, no cakes, not even any non-fat frozen yogurt. Seminars and conferences offer us too many opportunities to partake of empty calories so best to have a hard and fast rule.
  • Smaller portion sizes: One slice of pizza instead of two. A single hamburger instead of a double. Also fewer side dishes like fries.
  • I cut out between-meal snacks, even ones that don't seem too bad, like pretzels. Do not eat when you aren't hungry.
  • Avoid at all costs those evil green things commonly known as "vegetables." You'll never survive if you force yourself to eat stuff you don't like.
  • I have been running three times a week, but this is no more than I did before the diet began.
Once I reached my goal weight, I had another challenge: how to avoid losing too much weight without bouncing back up. I use the scale like a thermostat: eat less when I weigh more, and eat more when I weigh less, including the occasional ice cream.

Follow these simple rules and you too can lose weight the Lance way.

Friday, January 25, 2008

More on Graph Minors

Guest post by Vahan Mkrtchyan

(Sequel to Graph Minor Theorem Post)

When I was a student, my supervisor used to explain to me that one of the reasons why the P vs NP problem had not been solved, and why it might not be solved in the near future, was the lack of techniques for proving the pure existence of polynomial algorithms for certain algorithmic problems.

In order to explain what he meant, consider a typical problem from an elementary calculus course. Suppose we are given a function, something like

f(x) = sin(exp(x^2) - cos(x^3 - 4x + 7) - x^4 + 6)
and we want to prove that this function attains a maximum at some point of the interval [0,1]. How can we do this?
  1. Possibility 1: just construct the maximum point
  2. Possibility 2: reduce the problem to finding the maximum of another function whose maximum we already know.
  3. Possibility 3: Prove that f(x) is continuous and apply the Weierstrass theorem.
We know that neither the Weierstrass theorem nor its proof implies anything about how one can construct this maximum. It just states that the maximum exists! Now suppose that we have a property P and we would like to show that testing P can be done in polynomial time. How can we do that?
  1. Possibility 1: just design a polynomial algorithm for testing P
  2. Possibility 2: polynomially reduce the problem to testing another property P' for which we already have a polynomial algorithm
These two possibilities cover almost all the approaches that researchers use when trying to prove the existence of efficient algorithms.

Fortunately, they cover only almost all cases, since the Graph Minor Theorem gives us a

Possibility 3: Prove that the property P is minor-closed. The Graph Minor Theory developed by Neil Robertson and Paul Seymour implies that every minor-closed property can be tested in polynomial (even cubic) time! Let us note that a graph-theoretic property P is said to be minor-closed if whenever a graph G satisfies P, then so does every minor of G. This follows from

  1. Graph Minor Theorem: If S is any family of finite graphs, none of which is a minor of another, then S is finite!
  2. The existence of a polynomial (cubic) algorithm for H-minor testing: given a graph G, is H a minor of G? (Here H is fixed.)
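To see how these two facts combine, here is a minimal sketch (Python, using networkx only to build graphs; the routine is_minor is a hypothetical stand-in for the Robertson-Seymour H-minor test, which no standard library provides):

    import networkx as nx

    def is_minor(H, G):
        # Hypothetical stand-in for the Robertson-Seymour O(n^3) test
        # "is H a minor of G?" (H fixed); not provided by networkx.
        raise NotImplementedError

    def has_minor_closed_property(G, obstructions):
        # By the Graph Minor Theorem a minor-closed property has a FINITE
        # obstruction set: G satisfies the property iff no obstruction
        # is a minor of G.
        return not any(is_minor(H, G) for H in obstructions)

    # For planarity the obstruction set is known explicitly:
    PLANARITY_OBSTRUCTIONS = [nx.complete_graph(5),
                              nx.complete_bipartite_graph(3, 3)]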
Incidentally, the following problem is NP-complete: Minor Testing: given two graphs G and H, is H a minor of G? One might wonder whether there are properties for which we can prove the pure existence of a polynomial testing algorithm. As Reinhard Diestel states in his book Graph Theory, an example of such a property is knotlessness, that is, deciding whether a graph can be embedded in 3-dimensional space such that none of its cycles forms a nontrivial knot (see the book for details). Though we do know that the algorithm exists, at this moment we do not have an explicit algorithm (we cannot write a program), even if we have enough enthusiasm to read Graph Minors I, Graph Minors II, Graph Minors III, ..., the series of papers devoted to the mentioned results of Robertson and Seymour.

One might wonder what can be said about the Graph Minor Theorem in the infinite case. Robin Thomas, in A counter-example to Wagner's conjecture for infinite graphs (Math. Proc. Camb. Phil. Soc. 1988, 103, pp. 55-57), constructed an infinite sequence of infinite graphs none of which is a minor of another. Unfortunately, Thomas's proof comes with two "bugs":
  1. The proof relies heavily on the Axiom of Choice from Set Theory. It would be extremely useful to understand whether the existence of such a sequence really depends on this axiom. To put it in a form more understandable for the experts of Mathematical Logic: can the Graph Minor Theorem for infinite graphs be refuted in ZF, Zermelo-Fraenkel set theory without the axiom of choice? This question is interesting not only on its own, but also for the Countable Graph Minor Conjecture.
  2. Thomas's proof implies nothing about the Countable Graph Minor Conjecture, the conjecture stating that there is no infinite sequence of countable graphs none of which is a minor of another.


Recent Developments: Bruce Reed and Ken-ichi Kawarabayashi have recently reduced the complexity of H-minor testing to O(n log n). To "cast a glance at the next advances of our science and at the secrets of its development" (the phrase is taken from David Hilbert's epochal talk before the International Congress of Mathematicians in Paris in 1900), especially the generalization of the Graph Minor Theorem to matroids (the Graph Minors Project), see Jim Geelen's lecture notes from the recent ADONET-CIRM doctoral school on Graphs and Algorithms.

Thursday, January 24, 2008

The End of TV as I Knew It

It was one of the great mysteries of my childhood. You changed TV stations by turning a dial, a dial that started at "2". What happened to channel 1? So off I went to the library and found out the ugly truth: Channel 1 contains the entire FM spectrum. Each TV station took a very wide spectrum to send off their analog signals.

But as of February 2009 the rest of the analog channels will disappear as well. Today the FCC starts the auction for that spectrum. FCC spectrum auctions are the poster child for combinatorial auctions, where bidders bid on subsets of goods, and they raise lots of nifty complexity issues. But I don't want to talk about math today, just to mourn the analog space that carried the classic TV shows I watched as a kid. The moon landing was broadcast live over those airwaves about to be sold off forever.

Much ado is made about Google entering the bidding. But the spectrum will likely go to major telecoms like AT&T and Verizon. They will use the spectrum mainly for high-speed wireless data. And what will we use that high-speed data for? Watching TV on our cell phones. The circle will be complete.

Wednesday, January 23, 2008

Optimizing your vote

First off, I am delighted that Lance is back! I don't think this picture of Lance looks like him. I'm looking forward to meeting Jason and Nicole (at CCC 2008?) to see if they look like themselves. Second off- A post on politics.

This is not a post expressing any particular political viewpoint. Here is the question: Is it always best to vote for the candidate you actually like best in the primary elections?

Consider the following scenario. I use real names, but the ratings are made up just for this problem and do not reflect my opinions or Lance's.

Say you rank the current candidates in the following order, with the following scores on a 10-point scale:
  1. Obama: 8.8 (acceptable) Democrat
  2. McCain: 7.8 (acceptable) Republican
  3. Paul: 7.5 (acceptable) Republican
  4. Clinton: 7.0 (acceptable) Democrat
  5. Edwards: 6.8 (acceptable) Democrat
  6. Thompson: 6.0 (not acceptable) Republican
  7. Huckabee: 5.8 (not acceptable) Republican
  8. Romney: 5.0 (not acceptable) Republican
Also say that you are in a state where you can vote in either the Democratic or the Republican primary (but not both). We could also have numbers for how likely each candidate is to win the nomination (e.g., why vote for Ron Paul when he doesn't have a chance). We could also have numbers for each pair of a Republican and a Democrat as to who would win the general election (e.g., you may vote for Huckabee because you think that Obama, Clinton, and Edwards all have a good chance to beat him in the general election, so you want him to be the nominee. Of course, this is a dangerous strategy, since he might win and you didn't think he was acceptable. Also, you DID like McCain, so...)

Given all of this data, who should you vote for? Can this problem be made rigorous and solved? (A toy sketch follows below.) Of course, the really hard part in the real world might be getting those numbers. And you may have other reasons to vote, as shown by this ad.
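Here is a toy sketch (Python) of one way to make the problem rigorous: treat your vote as a tiny nudge to your chosen candidate's nomination probability and vote so as to maximize the expected score of the eventual president. Only the ratings come from the table above; the nomination odds (nom) and head-to-head probabilities (beats) are uniform placeholders that you would have to estimate.

    from itertools import product

    # Ratings from the (made-up) table above.
    score = {"Obama": 8.8, "McCain": 7.8, "Paul": 7.5, "Clinton": 7.0,
             "Edwards": 6.8, "Thompson": 6.0, "Huckabee": 5.8, "Romney": 5.0}
    dems = ["Obama", "Clinton", "Edwards"]
    reps = ["McCain", "Paul", "Thompson", "Huckabee", "Romney"]

    # Placeholder estimates (uniform, just to make the sketch run):
    nom = {c: 1.0 / len(dems) for c in dems}        # P(c wins Dem nomination)
    nom.update({c: 1.0 / len(reps) for c in reps})  # P(c wins Rep nomination)
    beats = {(d, r): 0.5 for d in dems for r in reps}  # P(d beats r)

    EPS = 1e-6  # how much one primary vote nudges a nomination probability

    def expected_president_score(p):
        # Expected score of the eventual president given nomination odds p.
        return sum(p[d] * p[r] * (beats[(d, r)] * score[d]
                                  + (1 - beats[(d, r)]) * score[r])
                   for d, r in product(dems, reps))

    def value_of_vote(candidate):
        # Nudge the candidate's nomination odds up by EPS, renormalize
        # within the party, and recompute the expected outcome.
        field = dems if candidate in dems else reps
        p = dict(nom)
        p[candidate] += EPS
        z = sum(p[c] for c in field)
        for c in field:
            p[c] /= z
        return expected_president_score(p)

    best = max(score, key=value_of_vote)
    print("Vote in the %s primary for %s"
          % ("Democratic" if best in dems else "Republican", best))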

Tuesday, January 22, 2008

Politics and The Blog

One cannot escape the tight races for the Democratic and Republican presidential nominations. While I would never endorse any specific candidate on this blog, one cannot ignore the race and the various mathematical aspects of the elections. I helped, in a very small way, with the design of the Yahoo Political Dashboard, which really boils the race down to just a bunch of numbers, perhaps a bit too much.

The Electoral College often comes in for criticism, but at least it is essentially just a weighted majority of pluralities. States, on the other hand, allocate their delegates in a variety of different and confusing ways. Right now news agencies seem to just count state wins, but after February 5 expect some interesting analysis of delegate counts.

A little game theory: Why does John Edwards stay in the race when he has a virtually zero chance of getting a majority of the delegates? Because there is a non-zero probability that neither Obama nor Clinton will have a majority either, and then Edwards wields incredible power with his small number of delegates. I don't know what Edwards would do with this power, but he won't give it up by dropping out of the race.

Notice that when we have a surprise victory in a primary, like Clinton's in New Hampshire, much of the talk revolves around why the pundits, polls and prediction markets all "failed." Meanwhile in sports, when we see a surprise victory, like the New York Giants over Dallas and then again in Green Bay, the focus is on what the Giants did right and the Cowboys and Packers did wrong. Sports fans understand probabilities much better than political junkies—upsets happen occasionally, just as they should.

Friday, January 18, 2008

Bobby Fischer (Guest Post by Ken Regan)

When I watched the Fischer-Spassky match on TV back in 1972, there was a young chess prodigy, just a few years older than me, helping with the commentary of the games. That kid grew up to be Complexity theorist Kenneth Regan and I asked Ken to give some personal comments on Bobby Fischer, who passed away yesterday.

Bobby Fischer lived 64 years, one for each square on the chessboard. Unfortunately, for most of those squares he was not playing the game he loved at the highest levels, but the brilliance and spirit of his games and ideas will ensure his board is remembered as more than half full.

What impressed me from Fischer's games was that clear logic and dynamism can both be harnessed. He produced scintillating attacks of the kind we associate with Tal and Kasparov, and positional masterpieces worthy of Capablanca and Karpov—including Game 6 of his 1972 match with Spassky. Kasparov is known for researching new ways of sacrificing pawns in the opening to increase the energy of one's position, while Fischer always maximized the potential of the position to hand. Almost uniquely with him there were no early draws while there was fight left. Other champions are known for how many years they went without losing a game, whereas Fischer won 19 games in a row, in the world championship stages. This ethic rubbed off on me even when I found myself paired against a fellow teen master I'd just shared a long bus ride with from Princeton to NYC. He sensibly proposed an immediate draw so we could rest, but I was there to play—and I lost!

I was attracted to the game just before the "Fischer Boom" years, and had nearly reached master level at age 12 when the Fischer-Spassky match began. I was on the nationwide PBS live broadcast of two of those games as an expert commentator assisting Shelby Lyman's TV coverage. My own rise was aided much more directly by the tournaments organized on a nationwide scale by William Goichberg, who is now President of the US Chess Federation. I never played Fischer—I met him only once in an elevator when he visited one of those tournaments. I was the age to feel the letdown most deeply when he did not play after 1972 and did not defend his title against Karpov in 1975, though what we've learned about his personal travails since then removes blame and much regret over this. Still, a piece of Brooklyn in me died yesterday.

Fischer will also be known for innovations of "Why didn't anyone else think of that?" caliber. The Fischer Chess Clock is now standard equipment. It regulates a player's time allotment in the manner of Social Security so that each move always has some thinking time. The Fischer Castling Rule enables the game to be started from different initial configurations while retaining its character. Fischer Random chess has both players start with the same random choice from 960 placements of pieces on the back row, and is gaining traction as more people agree with Bobby that computers and vast encyclopedias of opening analysis are causing the standard opening configuration to be "played out." I favor "non-random" placements with Black allowed to differ from White in my proposal Baseline chess with Fischer rules. Fischer also feared that computers would ruin the mystery of chess, but I can personally vouch that the game's incredible complexity remains. In response to the challenge compliment from Grandmaster Susan Polgar's premier chess blog, I undertook to tell whether (now ex-) World Champion Vladimir Kramnik missed a win at Move 50 on the slippery slope to losing his title last September. After four months and 300+ pages of analysis, aided by Deep Fritz 10 and two other chess programs running on faster hardware than DF10 used to beat Kramnik a year ago, having sifted over 20 trillion search nodes and tried out almost 100,000 moves, I'm about to throw up my hands and say I have no opinion more definite than "flip-a-coin" on whether White can win!

Lance 2.0

The weblog has called me back. Bill and I will now jointly be posting on this blog.

Many changes in my life since last March, most importantly starting a new job at Northwestern. Jason Hartline, building on an idea of Nicole Immorlica, had a poster made announcing our new group.

Can you guess the movie reference? Hint: I am standing in for Richard Gere.

Don't let the poster scare you—Computational Complexity will always remain my one true academic love.

Thursday, January 17, 2008

If you submitted a CDI proposal READ THIS

Guest REQUEST by Richard Beigel- NSF Guy. (I always forget formal titles.)

The CDI program received around 1300 proposals. You can imagine the difficulty and importance of assigning the proposals to appropriate panels. If you submitted a CDI proposal relevant to the theory of computing, please send an email to Richard Beigel immediately, and in any case by close of business Friday, January 18. You can find his email address here.

If possible, include your Proposal ID Number when writing.

Wednesday, January 16, 2008

Math books you can actually read

How many math books can you read cover-to-cover? How many have you? The problem is that while you can certainly read any discrete math (for ugrads) textbook cover-to-cover, you don't want to- you know most of it.

Computer Science Theory is better than math for this since the field is younger and the books often do not need as much background.

So, which books are just right- hard enough to have things in them of interest, but not so hard that you can't read them? Insisting that you be able to read a book `cover-to-cover' is rather demanding and also ambiguous- do you need to understand everything? I'll define it to mean `read/understood over 90% of the book'.

With that in mind, here are the books I've done that for. It's a short list.
  1. Recursively Enumerable Sets and Degrees by Soare. I had a course in 1980 (by Michael Stob- a PhD student of Soare) that covered about 1/2 of it (in preprint form). I understood about 3/4 of that course. But over the next 20 years I (slowly) read and understood the rest. By 2000 I had it all. I try to retain it by presenting a priority argument to someone once in a while. It's getting harder to find someone who wants to see these things who doesn't already know them. I did drag Stephen Fenner into a room at CCC06 and forced him to see the proof that there is a minimal pair of r.e. degrees, using a tree argument.
  2. Ramsey Theory by Graham, Rothschild, Spencer. I had about 1/3 of this in a course (by Spencer) in 1978. Over the next 30 years I've read more of it. Now I'm about done.
  3. Linear Orderings by Rosenstein. Great book, out of print now. Does lots of model theory (E-F games) and recursion theory in a more concrete setting.
  4. I've read several books by Brams and Taylor about fair division. All are readable. The nice thing here is that the math is easy but probably new to most of us. Hardest theorem: there is an envy-free discrete protocol to divide a cake among 4 (or more) people.
  5. Communication Complexity by Kushilevitz and Nisan. Reviewed it and used it in a course twice. By the second time I had it all read.
  6. Proofs That Really Count by Benjamin and Quinn. This is about proving combinatorial identities by showing that two expressions count the same thing. Great topic, great book.
So how about you- have you found many books that hit that sweet spot between too easy and too hard? And that are well written?

Monday, January 14, 2008

Invited Post on Invited Speakers Website

Iftah Gamzu requested that I post on the following topic. When I began to do it, I realized that his email to me, edited slightly, says it all and says it well. So here is a guest post by Iftah Gamzu:

My name is Iftah Gamzu, and I'm a PhD student at Tel Aviv University. Some time ago, several people brought to my attention the need for a web page listing past invited speakers in theory conferences. They mentioned that such a page would help program committees, and especially program chairs, decide whom to invite to present plenary talks. Consequently, I created such a page:

here

Unfortunately, there are still missing pieces of information that I couldn't track down on the web. So, if you know some of the missing info, please email me at

iftgam@post.tau.ac.il

Thursday, January 10, 2008

Today is Knuth's 70th birthday!!

Today, January 10, is Donald Knuth's 70th birthday. I have some thoughts on Knuth, but I am sure that my readers have more. So, I'm asking you to leave comments on Donald Knuth. Let's break the record for number-of-comments on an entry. Knuth deserves it! (What is the record? If I were as meticulous as Knuth I would know.)
  1. My first exposure to Knuth's work was in 1980 when I was doing a survey on Factoring Polynomials over Finite Fields. I couldn't find the relevant papers in the library (I can hear people under 25 saying weren't they on line? or what's a library?), so I was told to look in Knuth Volume 2. And indeed, there was a wonderful exposition of what I needed to know, and the history behind it. Quite scholarly and, unfortunately, quite rare among other researchers.
  2. Browsing through Knuth's volume I got the impression that his attitude is I want to solve problem X and I'll use whatever math I need to solve it, even if I have to develop it myself. Never shy away from a problem because you don't know the math needed, or even because the math you need hasn't been developed yet.
  3. TeX: Reading the TeX manual you actually learn things of interest outside of TeX. Not only did I learn how to hyphenate in TeX, I also learned that in German when you hyphenate some words, you actually change their spelling. And, of course, TeX can handle this.
  4. TeX and LaTeX: Revolutionized how we write papers. Facilitated the notion of keeping papers on line for easy access.
  5. First Contact!: I got a postcard from Knuth pointing out an error in a paper I wrote. WOW! a postcard from Knuth.
  6. Second Contact!: Knuth asks me to get a review of Volume 4 in my book review column. I was surprised that he emailed me (he does not use email) and amazed that Volume 4 was coming out. The way he asked me was interesting: see this post
  7. I always thought Volume 4 was a myth, like the missing part of the Dead Sea scrolls. How long has the world been waiting for Volume 4? Knuth needed a term for what we know as `NP-completeness' for use in Volume 4. He held a contest, and NP-completeness won. (He ended up not using it in Volume 4.)
  8. Volume 4: Yes, it really is out, sort of. It's coming out as a series (or a sequence) of fascicles (small books) of (I am not making this up) exactly 128 pages. It's on generation of combinatorial objects. The second fascicle, on enumerating all strings of length n (e.g., Gray codes), is fascinating. Includes history and the needed mathematics. History goes back to the mid-1850's. Quite scholarly and, unfortunately, quite rare among other researchers.
  9. What has Knuth's influence been on computer science? He was one of the first people to realize that an algorithm can be analysed in a mathematical and intelligent way without running it. This is one of the most important starting points for computer science theory. Perhaps even for computer science.

Tuesday, January 08, 2008

predictions for 2008 and beyond

Predictions for 2008 and beyond:

  1. In October the Democratic pundits will predict that the Democrat will win the election, and the Republican pundits will predict that the Republican will win the election.
  2. There will be a big breakthrough in theory. It is very hard to predict what it will be- note that this year's big breakthrough, a faster algorithm for integer multiplication, would have been hard to predict.
  3. P vs NP and P vs BPP will not be solved.
  4. Computer science enrollment will rise slightly.
  5. There will be a paper claiming to resolve P=NP. Some students will email me asking if it is worth reading. I'll say no.
  6. Medium Term- The Spam problem will get worse.
  7. Long Term- Self-checkout will become more and more common in grocery stores and other stores.
  8. Long term- the business model for academic publishing, both journals and monographs, will change. It has to.
  9. Long term- women will stop taking their husbands' names, because taking a new name would be google-stupid.
  10. Long term- people will give their kids names based on how easy it is to find on google.

Friday, January 04, 2008

Graph Minor Theorem and Non Const Algorithms

The last comment on the last post had some questions about the graph minor theorem and (implicitly) nonconstructive algorithms in general. Here is some background and answers.
  1. If G is a graph then H is a minor of G if H can be obtained from G by removing vertices, removing edges, and contracting edges (that is, replacing an edge (u,v) with a single vertex that has all the neighbors that u and v had).
  2. The formal statement of the Graph Minor Theorem (GMT) is: the set of graphs with the minor ordering is a well quasi order. This means that you cannot have an infinite descending sequence of graphs, nor an infinite set of pairwise incomparable graphs, in this ordering.
  3. The GMT has a hard nonconstructive proof. It was proven in a sequence of papers by Robertson and Seymour entitled `Graph Minors I', `Graph Minors II', etc.; the proof was completed in Graph Minors XX. This website claims that it was proven in 1988 but not published until 2004.
  4. The proof is not only nonconstructive, but provably nonconstructive in Harvey Friedman's Reverse Mathematics framework.
  5. The following two facts, the second a corollary of the GMT, are what yield polytime algorithms:
    1. For a fixed graph H, there is an O(n^3) algorithm for the problem: Given G, is H a minor of G?
    2. If X is a set of graphs closed under minors then there exists a FINITE set of graphs H_1,...,H_a such that G ∈ X iff NONE of H_1,...,H_a is a minor of G. (This is the corollary to GMT.) EXAMPLE: a graph is planar iff it has neither K_5 nor K_{3,3} as a minor. In this case we know the obstruction set; in general, the proof of GMT does not yield this information.
  6. One easily obtains polytime algorithms (indeed O(n^3)) for many problems. Here are two such:
    1. Fix k. Test if a graph has a vertex cover of size ≤ k. (Call this VC_k.)
    2. Fix g. Test if a graph has genus ≤ g.
  7. There are constructive linear-time algorithms for VC_k. Last time I checked it was down to O(n + (1.34)^k). For the genus problem I don't know what's known. (Commentators- please comment.)
  8. Fellows and Langston showed how to convert most such algorithms (including those for VC and genus) from nonconstructive polytime to constructive polytime. The degree does not go up much (by either 0 or 1), but the constant gets even worse.
  9. NOW for the commentator's question: is the converse true? Does an algorithm for (say) genus ≤ g that runs in time O(n^3) (where the constant may depend on g) imply GMT? I doubt it. It may be provably false, given that GMT has a provably nonconstructive proof.
  10. Are there other nonconstructive algorithms? A cheap example is a function like
    f(k) = 1 if SAT is in TIME(n^k), and 0 otherwise,
    which is in P (it's just a threshold function or the always-0 function) even though we do not know how to compute it. (See the sketch after this list.) Are there examples of problems we care about being in P through nonconstructive means that are NOT from GMT? I do not know. Commentators- please comment.
  11. There are many problems in NP where, if you fix one parameter, they are solvable in time O(f(k)·p(n)) rather than O(n^{f(k)}). Such problems are called FIXED PARAMETER TRACTABLE. Downey and Fellows wrote a book on this a while back, though there are more books out now.
  12. Are there more legit examples? Commentators- please comment.
  13. I will have a later post on nonconstructive things in math.
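The sketch promised in item 10 (Python; purely illustrative). Since SAT in TIME(n^k) implies SAT in TIME(n^j) for all j ≥ k, the function f is either the always-0 function or a threshold function, and every candidate is trivially computable:

    def always_zero(k):
        return 0

    def make_threshold(t):
        # f_t(k) = 1 iff k >= t
        return lambda k: 1 if k >= t else 0

    # Exactly one function in {always_zero} U {make_threshold(t) : t >= 1}
    # equals f(k) = [SAT is in TIME(n^k)].  So f is computable, by one of
    # these constant-time programs, yet we have no idea which one.

This is exactly the sense in which such an algorithm exists nonconstructively.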

Tuesday, January 01, 2008

2007 Complexity Year in Review

Guest Post by Lance Fortnow

We had quite an active year in complexity and Paper of the Year goes to Martin Fürer's Faster Integer Multiplication, making a breakthrough in this most basic of algorithmic questions.

This blog, now under new leadership, hit five years and 1000 posts in August. Most commented post: Vijay Vazirani's post suggesting that submissions to conferences come with an accompanying video, followed closely by Claire Kenyon's post on cover letters.

Back in February I posted about large proposed increases in the NSF budget but with a caveat that it had to survive the congressional appropriation process. It didn't.

In 2007 we mourned theorists Steve Mahaney and Andrej Muchnik as well as others close to our heart: Martin Kruskal, Jim Gray, John Backus and Paul Cohen.

Bill and I would like to thank our guest posters Kamal Jain, Jonathan Katz, Claire Kenyon, Shiva Kintali, Phil Klein, Stuart Kurtz, Clyde Kruskal, Nicole Immorlica, Amir Michail, Mihai Patrascu, Ken Regan, Jim Royer, Alexander Shen and Vijay Vazirani. Most of all thanks to Bill for keeping this blog active and still going strong.

Here's wishing everyone a great 2008!