Monday, July 30, 2007

Away Message

When I won't be in email contact for a while I set up a vacation program so that if someone emails me they get a message. I used to use the following:
I am not in email contact. If you absolutely, positively, have to contact me then get a life.
My wife told me this was offensive, so I changed it to
I am not in email contact. If you absolutely, positively, have to contact me then you have the wrong priorities.
She didn't like that one much either, but it was better. And I think it's cleverer. But this raises the question, what is the proper etiquette for vacation programs?
  1. 15 years ago someone who is not computer savvy was offended by the `get a life' vacation program, thinking that I had sent it personally.
  2. 2 years ago a shy grad student from a different school was terrified by my `wrong priorities' vacation program.
  3. Aside from that, most people tell me they like both of them.
  4. I often email someone, get a vacation program reply, and then within 5 minutes get a real reply. I find that somewhat rude.
  5. Whatever the vacation program etiquette is, it will likely become irrelevant as we are logged on more and more, even on vacation.

Wednesday, July 25, 2007

Suggestion for STOC/FOCS (guest post)

(Guest post from Shiva Kintali. All capital letters, italics, and boldface are from Shiva.) A request to FOCS/STOC PC members: Hi All, I would like to point out a concern I have about the FOCS/STOC conference proceedings. There is a huge gap of around four months between the announcement of accepted papers and the conference date. For example: STOC'07's acceptance date was February 18th and its conference date was June 11; FOCS'07's acceptance date was July 1st and its conference date is October 21.

Most of the authors don't upload their drafts/camera-ready papers on their homepages, for some unknown reasons. Some of them are kind enough to send their drafts if you send them an e-mail. Some don't bother to reply. If there is an exciting result (most of the STOC/FOCS papers have exciting results), most of us would like to know the techniques used, as soon as possible. For example, one of the FOCS'07 results helped me a lot in my research. I knew that the result could be used in my research, but I had to wait for four months. Waiting for four months to know the details of a result is really frustrating.

Also, there is a gap of around 40 days between the acceptance date and the deadline for camera-ready submissions. I guess the difference between the submitted paper and the camera-ready version is latexification and adding the suggestions of the reviewers. This should not take more than a couple of weeks. Once the committee is happy with the camera-ready version, the digital proceedings can be uploaded to the ACM/IEEE portals. I think a gap of one month between the acceptance date and uploading the digital proceedings is reasonable. Of course, this would require some hard work from the authors and the committee. This hard work would not go to waste!!

Can somebody PLEASE propose this at the next FOCS/STOC business meeting!!

Monday, July 23, 2007

Checkers Solved- it's a draw!

The game of checkers seems to have been solved. It's a draw. See here or here if you don't mind seeing an ad for low cholesterol cooking before getting to the article, or here if you subscribe to the nytimes, or here if you trust Wikipedia. The checkers program CHINOOK cannot lose (it can draw). The program has been around for quite some time, being improved over time. The researchers are Jon Schaeffer (the originator), Rob Lake, Paul Lu, Martin Bryant, and Norman Treloar. They say they have a `computational proof not a math proof'. Not sure what that means, but I do believe that Checkers is now done.

There is a very good book called One Jump Ahead about the program Chinook, which plays Checkers very well (now, apparently, perfectly), but it was written a long time ago, before the recent news.

My impression of Chess and Checkers playing programs is that they are very clever engineering but not really much for a theorist to get excited about. However, very clever engineering should not be underrated. I also think that these programs have taught us that (some) humans are very good at these games in a way that is different from machines. When Deep Blue beat Kasparov, rather than thinking (as the popular press did) Oh no, computers are smarter than humans!! I thought Wow, it took that much computing power and that much look-ahead to beat Kasparov. Kasparov must be very good (duh) and the way he plays is different from what a computer would do.

Similarly, the Chinook researchers ended up being very impressed with Marion Tinsley (the best checkers player of all time, since deceased). Analysing his games it seems as though he almost never made a mistake. Chinook and Tinsley had two matches- Tinsley won the first one with 4 wins to Chinook's 2. During the second one Tinsley took ill and had to forfeit- he died a few months later.

Will checkers decline in popularity? I don't think so--- it's already so unpopular that it can't decline much. This story may give it a temporary revival.

Thursday, July 19, 2007

W(6,2) = 1132! (excitement, not factorial)

A PhD student, Michal Kouril, found a new van der Waerden number: W(6,2)=1132. See here for details. I had a list of known VDW numbers in an earlier post, but I redo it here with the new result.

VDW(k,c) is the least number W such that no matter how you c-color the elements {1,2,...,W} there will be k numbers equally spaced (e.g., 3,7,11,15) that are the same color. W(k,c) exists by VDW's Theorem. See Wikipedia or my post on Luca's blog.
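To make the definition concrete, here is a small brute-force checker (my own illustrative Python sketch, not from the original post). It tests whether every c-coloring of {1,...,W} contains k equally spaced numbers of the same color, and confirms the easy value VDW(3,2)=9; it is only feasible for tiny cases.

```python
from itertools import product

def has_mono_ap(coloring, k):
    """coloring[i] is the color of i+1. Return True if some k numbers,
    equally spaced inside {1,...,W}, all get the same color."""
    W = len(coloring)
    for d in range(1, W):
        for a in range(1, W - (k - 1) * d + 1):
            if len({coloring[a + i * d - 1] for i in range(k)}) == 1:
                return True
    return False

def is_vdw_bound(W, k, c):
    """True if EVERY c-coloring of {1,...,W} has k equally spaced
    numbers of the same color."""
    return all(has_mono_ap(col, k) for col in product(range(c), repeat=W))

# VDW(3,2) = 9: all 2-colorings of {1,...,9} work, but not all of {1,...,8}.
assert is_vdw_bound(9, 3, 2) and not is_vdw_bound(8, 3, 2)
```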

The only VDW numbers that are known are the following (see this paper by Landman, Robertson, and Culver from 2005, and the website above for W(6,2)).
  1. VDW(3,2)=9, (easy)
  2. VDW(3,3)=27, (Chvátal, 1970, math review entry)
  3. VDW(3,4)=76, (Brown, Some new VDW numbers (prelim report), Notices of the AMS, Vol 21, (1974), A-432)
  4. VDW(4,2)=35, (Chvátal, ref above)
  5. VDW(5,2)=178, (Stevens and Shantaram, 1978, full article!)
  6. VDW(6,2)=1132. Michal Kouril. 2007. (Not available yet.)
Over email I had the following Q & A with Michal Kouril.

BILL: Why is it worth finding out?

MICHAL: As my advisor Jerry Paul put it, "Why do we climb Mount Everest? Because it is there!" The advances we've made during the pursuit of W(6,2) can have implications for other worthy problems.

BILL: Predict when we will get W(7,2)

MICHAL: September 30, 2034. Or any time before or after. Interest in Van der Waerden numbers has been growing lately and I would not be surprised if we saw W(7,2) a lot sooner than this. Some unknown VDW numbers are already just a matter of the amount of computing power you throw at them in order to prove the exact value. But W(7,2) still needs more analysis to make it provable in a reasonable amount of time.

(Back to Bill's blog:) In a perfect world Michal would be interviewed by Stephen Colbert instead of me. Oh well...

Tuesday, July 17, 2007

Can Jerry Seinfeld crack P vs NP?

The following is a quote from comedian Jerry Seinfeld. The source is Seinfeld Universe: The Entire Domain by Greg Gattuso (publisher of Nothing: The Newsletter for Seinfeld Fans), page 96.
I was great at Geometry. If I wanted to train someone as a comedian, I would make them do lots of proofs. That's what comedy is: a kind of bogus proof. You set up a fallacious premise and then prove it with rigorous logic. It just makes people laugh. You'll find that most of my stuff is based on that system ... You must think rationally on a completely absurd plane.
I doubt that many comedians have seen lots of proofs, though they may have an intuitive sense of logic for their routines. And not all comedians use this style.

I know of one theoretical computer scientist who is a comedy writer. Jeff Westbrook got his PhD in 1989 with Robert Tarjan on Algorithms and Data Structures for Dynamic Graph Algorithms. He was faculty at Yale, and then a researcher at AT&T, before working on the TV shows Futurama and The Simpsons. I actually met him in 1989- he didn't seem that funny at the time.

Are there other theorists or mathematicians who are also professional comedians or comedy writers? I doubt there are many. If you define theorist or mathematician as having a PhD then I assume it's very, very few. If you define it as having majored in math or CS, there would probably be some.

Monday, July 16, 2007

A postal campaign against spam

I will be sending the following letter by snail mail and you should send a similar letter- you may have an effect on spam.

Dear Governor Huckabee,
There is someone trying to destroy America's computer infrastructure and blame it on you! I received an email (excerpts below) that looks like it was from your campaign, but clearly it is not. I know it is not from your campaign since spam is so vile, so disgusting, that a man of high moral character such as yourself would not use it. (Note that even your ethically challenged competitors have not used it.) The spam in question asks the receiver to send a certain email to friends, relatives, and co-workers. This sounds like a chain letter, which is illegal, but more importantly, it could crash America's computers. I urge you to take some action to make sure the public knows it is not you behind this vile spam, and to put some effort into tracking down the people responsible.


Here are excerpts and my comments on it.

Mike Huckabee - The Exploratory Committee
When we launched the barber pole campaign a few weeks ago to raise 400 contributions in 96 hours, we had a tremendous response: 600+ total contributions, 400+ first-time contributors to the campaign and quite a few laughs.
While this is not quite asking for money, that might be the next step in this disgusting scam.
Republicans, Democrats and Independents. I am interested in sharing my vision for America with all comers. I have a clear record that I'm proud of and I am willing to promote it to anyone regardless of their politics.
Another dead giveaway--- during the primaries you target your own party only.
The goal of this new, online campaign is to have 400 online volunteers send emails on the campaign's behalf over the next 72 hours. Please focus only on people you know: friends, family members and co-workers. We have designed a special email that we would like you to send.
This is the real dirt- they want to flood our computers with this email!!!

Now that you are alerted to the danger, please do something about it.

William Gasarch, Concerned Citizen

Thursday, July 12, 2007

An Open Problem wiki!

A blog entry of Lance's on open problems noted that it would be good to have a repository of open problems. Perhaps a wiki or something.

I recently got the following email that may be an answer:
I am writing you in (very belated) response to a post on your blog in mid March. You posted a message called "A Place for Open Problems" where you suggest: "We need some Web 2.0 system. A blog or wiki to post the problems. A tagging method to mark the area and status. A voting system to rank the importance of the problem. A commenting system for discussion. A sophisticated RSS system for tracking. A visual appealing and simple interface. And most importantly, someone willing to put it all together for no compensation beyond the thanks of the community."

Together with Robert Samal, we have just finished the construction of a system which matches your request quite closely. There are still some small modifications we are making, but it is alive and fully functional, and we would greatly appreciate any input/publicity from you and your readers. Our website is called "The Open Problem Garden" and lives at the following url: here it is

Hope you enjoy it.



Best, Matt DeVos
I corrected them: Lance made that posting, not me. Of much more importance- they have a wiki!! Is it good to use? Will we use it? This is one of those chicken-and-egg problems: if enough people use it then it will be a good resource. Of course, Matt and Robert are not innocent bystanders- if it has a good interface and other features then we are more likely to use it. It seems to cover open problems in all of mathematics, though computer science theory is a category. If there were a wiki tailored to Theory would that be better or worse? I would guess worse, because the distinction can be artificial anyway.

And of course there is the issue of- are you better off working on your open problems or posting them? It may come down to this:
Which is greater, your curiosity or your ego?

Tuesday, July 10, 2007

A ``Concrete'' Open problem

(Guest Post by Ken Regan) pdf file available here
Computational complexity theory is the study of information flow and the effort required for it to reach desired conclusions. Computational models like cellular automata, Boolean or algebraic circuits, and other kinds of fixed networks exemplify this well, since they do not have "moving parts" like Turing machine tape heads, so the flow's locations are fixed. Measures of effort include the time for the flow, the amount of space or hardware needed, and subtler considerations such as time/space to prepare the network, or energy to overcome possible dissipation during its operation. These models and measures have fairly tight relations to Turing machines and their familiar complexity measures.
For an example and open problem, consider the general task of moving all "garbage bits" to the end of a string, leaving the "good bits" in their original sequence. We can model this as computing the function f: {0,1,2}* → {0,1,2}* exemplified by f(1020212) = 1001222, f(2200) = 0022, f(101) = 101, etc., with 0,1 as "good bits" and 2 as "garbage." A rigorous inductive definition, using e for the empty string, is f(e) = e, f(0x) = 0f(x), f(1x) = 1f(x), and f(2x) = f(x)2. This is the "topological sort" of the partial order B = {0 < 2, 1 < 2} that is stable, meaning that subsequences of incomparable elements are preserved. The problem is, can we design circuits C_n, each computing f(x) on strings x of length n, that have size O(n)?
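To make f concrete, here is a reference implementation (my own Python sketch for illustration; the post is about circuits, not code). It follows the inductive definition in one left-to-right scan: good bits are kept in order and every 2 is booted to the rear.

```python
def f(x: str) -> str:
    """Stable 'garbage to the rear' map on {0,1,2}*: keep the 0s and 1s
    in their original order and move all 2s to the end."""
    good = [c for c in x if c in "01"]          # good bits, original order preserved
    return "".join(good) + "2" * x.count("2")   # every 2 booted to the rear

# The examples from the post:
assert f("1020212") == "1001222"
assert f("2200") == "0022"
assert f("101") == "101"
assert f("") == ""
```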
The circuits C_n have input gates labeled x_1,...,x_n which receive the corresponding "trits" (0, 1, or 2) of the input string x, and output gates y_1,...,y_n giving y = f(x). The first question is, what interior computational gates can C_n have? A comparator gate g for a partial order (P, <) has two input and two output wires, maps (a,b) either to (a,b) or (b,a), and never maps to (d,c) when c < d. The unique stable comparator g_P maps (a,b) to (a,b) unless b < a. The following slightly extends the famous 0-1 law for comparator networks:

Theorem 1. If a circuit C_n of comparator gates computes f(x) correctly for all x ∈ {0,2}^n (not even including any 1s), then for every partial order (P, <), the circuit C_P with each comparator replaced by g_P computes the stable topological sort of P.

Proof. First suppose C_P errs for a total order (P, <). Then there are x, y ∈ P^n such that C_P(x) = y, but for some j, y_{j+1} < y_j. Take the permutation π such that x_i = y_{π(i)} for all indices i. Define a binary string y′ ∈ {0,2}* by y′_i = 0 if y_i < y_j and y′_i = 2 otherwise, and x′ by x′_i = y′_{π(i)} for all i. Then C_n(x′) = y′ (exercise: prove this by induction, taking gates one at a time), contradicting that the original C_n was correct on {0,2}*.

For (P, <) not a total order, an error C_P(x) = y (which might violate only stability) is also an error in the total order (P_x, <′) with P_x = {(a,i): x_i = a} and (a,i) <′ (b,j) if a < b, or a is not comparable to b and i < j. ∎

Corollary 2. Circuits C_n of comparator gates computing f require size n log_2 n - O(n). ∎

This follows by applying the standard sorting lower bound to C_P. It's interesting that we did not need 1s in x to argue stability, and the lower bound allows gates g in C_n to be arbitrary when either input is 1.
For general circuits, however, the argument doesn't hold, and all bets are off! To see why, consider sorting the total order {0 < 1 < 2}. Clever O(n)-size circuits can count the numbers a, b, c of 0s, 1s, and 2s in the input string x, respectively, and then assemble the correct output y = 0^a 1^b 2^c. For the basic idea see Muller-Preparata, 1975, and various sources on the "Dutch National Flag Problem." Applying this counting idea to our poset B reduces our task to "nice" strings z of length N = 2^k with exactly N/2 2s.
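As a sanity check on the counting idea (again a Python sketch of my own, not the circuit construction itself), the total order {0 < 1 < 2} can be sorted in linear time by counting a, b, c and assembling y = 0^a 1^b 2^c; stability is automatic here because equal elements are indistinguishable.

```python
def counting_sort_012(x: str) -> str:
    """Linear-time sort over the total order {0 < 1 < 2} by counting,
    mimicking the O(n)-size circuit idea: count a, b, c, then assemble."""
    a, b, c = x.count("0"), x.count("1"), x.count("2")
    return "0" * a + "1" * b + "2" * c

assert counting_sort_012("1020212") == "0011222"
```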

Theorem 3. If s(N)-size circuits D_N can compute f(z) for "nice" z, then f has circuits of size at most s(4n) + O(n).

Proof. We can build O(n)-size circuits E_n that on inputs x of length n count b, c as above and find k such that m = 2^(k-1) is the least power of 2 above n. Make E_n(x) output z = x 1^(m+c-n) 2^(m-c), which gives |z| = N < 4n. Then compute y′ = D_N(z) and re-use the computed b, c, m to pluck off the n bits of f(x). ∎
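The padding is easy to mimic at the string level. The sketch below (my own Python, meant to illustrate the reduction rather than reproduce the circuits E_n) pads x to a "nice" z of length N = 2m with exactly m 2s, taking m to be the least power of 2 that is at least n, calls any routine that handles nice inputs, and plucks f(x) back out.

```python
def f(x: str) -> str:
    """Reference implementation of f, as in the earlier sketch."""
    return "".join(c for c in x if c in "01") + "2" * x.count("2")

def pad_to_nice(x: str):
    """Pad x to a 'nice' z of length N = 2m with exactly m 2s,
    where m is the least power of 2 with m >= n (so N < 4n for n >= 1)."""
    n, c = len(x), x.count("2")
    m = 1
    while m < n:
        m *= 2
    z = x + "1" * (m + c - n) + "2" * (m - c)
    assert len(z) == 2 * m and z.count("2") == m
    return z, c, m

def f_via_nice(x: str, nice_routine=f) -> str:
    """Compute f(x) using only a routine that is correct on 'nice' inputs."""
    n = len(x)
    z, c, m = pad_to_nice(x)
    y = nice_routine(z)            # = good(x) + 1^(m+c-n) + 2^m
    return y[: n - c] + "2" * c    # pluck the n symbols of f(x) back out

assert f_via_nice("1020212") == f("1020212") == "1001222"
```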

This reduction to nice z enhances the "flow" metaphor. The m-many 2s in z can be advance-routed to the last m places of y′, so the whole issue is how the m-many 0s and 1s in z flow together into the first m places of y′. Must this flow progress (without loss of circuit-size generality) by "squeezing out 2s" in an intuitively plane-filling fashion, allowing "mileposts" whose forced spacing might mandate having n log_2 n - O(n) gates? Or can linear-size networks rise above the planar view? No one I've asked has known, and the lack of such networks frustrates a desired general linear-size circuit simulation of my "Block Move" model. Issues here may be involved. Nor do I know nicer descriptions of O(n log n)-sized circuits than "use ancillas to tag bits of x and work in P_x as in the proof of Theorem 1, employing ideas of Theorem 3 and/or mapping into the O(n log n)-sized Ajtai-Komlos-Szemeredi networks." Those seeking an o(n log n) upper bound may be my guest, but those believing a super-linear circuit lower bound must reflect that no such bounds are known for string functions whose graphs belong to NP or to E.
The above inductive definition of f yields a linear-time algorithm on any model that simulates each operation of a double-ended queue in O(1) time. But is booting a 2 to the rear in f(2x) = f(x)2 really constant time, even amortized? True, our technical issues shrink away on passing from linear to polynomial time, so all this may seem to have nothing to do with P versus NP. But au contraire, the Baker-Gill-Solovay "oracle" obstacle may mean nothing more than that standard "diag-sim" and timing techniques are insensitive to internal information flow. The "Natural Proofs" obstacle may ultimately say only that network-preparation/"nonuniformity" is a subtly powerful consideration. Honing tools for information-flow analysis on incrementally more-general cases that yield super-linear lower bounds may be the walk to walk before trying to run.




Monday, July 09, 2007

`It's Huffman coded!' does make sense!

On the post Math Terms used in Real Life- Good or Bad I mentioned the following:
On 24, season two, there was a line `we can't break in, it's been Huffman coded!' This makes no sense mathematically but it raises awareness of security issues.
I had thought that Huffman codes were just used to compress data and had nothing to do with hiding information. I was wrong! Yakov Nekrich pointed out the following to me:
Actually Huffman codes can be difficult to break, see for instance this article: On breaking a Huffman code by D.W. Gillman, M. Mohtashemi, and R.L. Rivest.
I'm curious- did the writers of 24 know this or not? I would guess no, and they just lucked out. Unless Gillman or Mohtashemi is moonlighting as a writer for 24 (I doubt Rivest needs the money).

Thursday, July 05, 2007

A Review of THE KLEIN FOUR's CD

As a collector of novelty songs and a math-person I was morally obligated to purchase Musical Fruitcake by The Klein Four, a band consisting of math grad students singing songs about math. They sing a cappella (without instruments). While you are not morally obligated to purchase their CD, you can. Or find out more about them (or go here for samples).

So, how is their CD? I give each song a rating between 1 and 10, 10 being excellent and 1 being unlistenable.

  1. Power of One: A love song that uses Math. Rather pleasant and clever. But the math is fairly easy. lyrics Rating: 8.
  2. Finite Simple Group of Order two: Their signature song, and their best known since it's on YouTube. Another love song that uses math, but much more sophisticated math. Better sung on the CD than on the video. lyrics Rating: 9
  3. Three Body Problem: Sung by a guy about losing his girl to another guy. Lots of Physics-Math involved. Touching. lyrics Rating: 7
  4. Just the four of us: Seems to be autobiographical and partially a Rap Satire. More fun for them than for me. lyrics Rating: 5
  5. Lemma: Lyrics are not online. That's just as well. It sounds like it's a song about liking a lemma- not funny enough for satire, not serious enough for--- how could a math song ever be serious? Rating: 4
  6. Calculating: The best song ever written about algebraic topology. lyrics Rating: 6
  7. XX Potential: Lyrics not online. About women doing math (XX vs XY). Nice rhythms but not much math in the song. Rating: 6
  8. Confuse Me: About how confusing math can be. Mentions some math- mostly group theory. (A commenter corrected me on this- there is no group theory in this song. I was... confused.) lyrics Rating: 7
  9. Universal: Yet another love song that uses math. The math used is intermediate between Power of One and Finite Simple Group. Tune is not catchy. Lyrics are as tedious as Category Theory. Lyrics not online. Rating: 4
  10. Contradiction: Seems to be a guy singing about having lost his girlfriend. But it's hard to tell- which is a problem. Also, no math except `contradiction'. Lyrics not online. Rating: 4
  11. Mathematics Paradise: To the tune of Gangsta's Paradise by Coolio. Weird Al had the song Amish Paradise to that tune, and for a brief time Coolio was mad at him for that (they seem to have made up). I doubt Coolio has heard this album, but you never know. Anyway, this is the BEST song on the CD. Clever words, sung well (at least well enough). About the pain of being a 5th year grad student in math. Hopes, dreams, despair- it's all there! lyrics Rating: 10
  12. Stefanie (The Ballad of Galois): Historically inaccurate, but kind of fun. Has a Country-Western Twang to it. Rating: 8
  13. Musical Fruitcake (Pass it Around): Mostly random words, but kind of interesting. Rating: 6
  14. Abandon Soap: Mostly random words, but not so interesting. The title is a play on `abandon hope'. Very short. Rating: 5

So, what is the final evaluation? I rate CDs by how many songs I really like. I like six of them, which is very good. Based just on their video I had written that they shouldn't quit their day jobs- though since they are grad students in math they probably don't have day jobs. My current opinion is higher. Still, the math novelty song business is brutal- I wish them luck.

The number of times I've bought a CD because the artists had one really good song, and then found out that the one good song was their only good song, is at least VDW(4,2). (Yes, Arrogant Worms, singers of the brilliant CARROT JUICE IS MURDER but nothing else even half as good- I'm talking to YOU!)

As for other math-novelty songs- I'll have a post on that once I get a complete list of all that I know on this topic. Could take a while.

Tuesday, July 03, 2007

Collapsing degrees (Tribute to Mahaney)

Collapsing Degrees
Guest post by Stuart Kurtz and Jim Royer.

Bill Gasarch asked us to write an article about Collapsing Degrees, in the memory and honor of our coauthor, Steve Mahaney.

In 1986, Alan Selman and Steve Mahaney created the Structure in Complexity Conference, now the Conference on Computational Complexity. But in 1986, it was about structure, a term that Paul Young borrowed from computability theory, and which has passed into disuse, but in those days defined us.

The word structure embodied optimism about a particular approach to the P vs. NP problem—that its solution might be found through exploring structural properties of sets and degrees. For example, Berman and Hartmanis had shown that if all NP-complete sets are paddable, then all NP-complete sets are isomorphic under polynomial-time computable and invertible reductions, and hence P ≠ NP. Their result leveraged a structural property of specific sets (paddability) to a structural result about degrees (the complete polynomial-time m-degree of NP consists of a single polynomial-time isomorphism class), to obtain a complexity-theoretic result.

That summer, after the conference, Steve visited us in Chicago, beginning a long and productive collaboration. We beat around the isomorphism conjecture for several days, until Steve mentioned that it wasn't even known that a collapse happened at any nontrivial degree. We smelled blood.

Relativization provided some guidance. Berman had proven that the EXP-complete degree consisted of a single 1-li degree. If P = NP, then 1-li degrees collapse. Of course, if P = NP, our rationale for interest in the Isomorphism Conjecture was mooted, and what we really cared about was the “true” P ≠ NP case.

Our main result from that summer was that collapsing degrees existed, without requiring an additional complexity-theoretic hypothesis. Our proof involved a finite-injury priority argument, and seemed to require it.

It was a joy and a privilege to have had Steve Mahaney as a colleague and friend. Until we meet again, peace.