Friday, August 29, 2008

The Ultimate Wallet

Once in a while a new product comes along, so well-designed, so cool, that I just have to get one. When I visited Yahoo earlier this month, two separate researchers had independently bought one and I instantly fell in love. I dropped some not-so-subtle hints to my family, who got me one for my birthday, and I am now a happy man. I'm talking about the All-Ett, a wallet replacement designed to remain thin even when holding my many cards. My back pocket has never been more excited.

The All-Ett solves a problem I shouldn't have. I count 16 plastic cards currently in my wallet, and that's only because I make a strong effort to keep the number down. We have the technology for me to hold a single card that can be loaded with the information of all of the other cards. Many universities, including Chicago and Northwestern, now have a single card that serves as ID, library card, food service card and more. But there we have a centralized authority. It would drive the privacy advocates mad, but I'd love a national ID with both a smart chip and RFID and a mechanism that would let any other organization embed needed data on the card.

Why stop there? I shouldn't need a wallet at all. Every piece of plastic or paper in my wallet, including business cards, pictures of my kids and cash, could be handled by my all-purpose cell phone. It wouldn't be difficult to even have it open doors and allow me to start my car so I wouldn't need to carry keys either.

But, alas, those days are far away and I'll need to live with my many cards for a while. At least I have a decent place to store them now.

Thursday, August 28, 2008

If a Hobby relates to your work, is it a Hobby?

In yesterday's blog Lance mentioned that he didn't see other math people or trade problems when he was on vacation, and that it wouldn't have been a vacation if he had.

This raises a question: Can you ever use a hobby in your work? I give a few examples below.
  1. Steve Skiena wrote a book, Calculated Bets, about mathematical modeling and betting in sports. It combined Steve's lifelong hobby of watching and betting on Jai-Alai with serious mathematics and computer science.
  2. Ken Regan's hobby is chess and he has some research there, as seen here.
  3. My hobby is collecting novelty songs and I've gotten some blog posts out of that. (Is blogging part of my job?) Other bloggers also use their hobbies on their blogs-- Luca and his photography come to mind.
  4. Steve Rudich's hobby is magic, and I've seen him entertain at conferences. Some of the magic is based on math.
  5. Lance has commented on the show NUMB3RS in the blog, and I reviewed Season 1 in my SIGACT NEWS column. (If I buy the DVD and review it again, is that tax deductible?)

If your hobby is some game then there may be some interesting math surrounding the game that you could turn into research. If your hobby is collecting paintings then you may have a harder time connecting it with your work. (Maybe the game theory of auctions.) Do you even want to connect your hobby with your work? A hobby is almost by definition not work.

Since most of us (I assume) like math, the line between hobby and work may be hard to see. If I work on math problems in my leisure time, is that a hobby or work? If one of them inspires a paper then does it cease being a hobby? How does Sudoku fit into this? If I actually use the ZK Sudoku protocol to convince someone that I solved it, is that work or play?

Wednesday, August 27, 2008

While I Was Gone

Just got back from vacation. Unlike Bill, I didn't see other math people and trade problems with them. Then again it wouldn't have been a vacation if I did.

As reported on a few other theory blogs, computational complexity hit a home run in NSF's new Expeditions Program. A dozen researchers in the New Jersey/New York area won an award for Understanding, Coping with and Benefiting from Intractability. In fact all four of the Expeditions grants have at least some theory connections. I'm not a fan of these big NSF programs but it's good to see the money going to our community.

The September issue of Scientific American focuses on privacy and Anna Lysyanskaya wrote an article on theory-based cryptography. The Further Reading section points to an old post, Zero-Knowledge Sudoku. Thanks for the plug.

Finally Peter Lee writes about the growing theory group at CMU.

Tuesday, August 26, 2008

What to teach in a grad course on Comm Comp- some non-theorists in it

The topic of what a basic graduate complexity course whose audience includes many non-theorists (say 15 non-theorists, 20 theorists) should have in it has been the subject of a prior blog post. This question is relevant to many people since such courses are common.

Today I ask a question that may be of less relevance. Univ of MD's requirements are such that I will be teaching a course on Communication Complexity that non-theorists will be taking. The Complexity Theory course is not a prereq. I will be using the book Communication Complexity by Kushilevitz and Nisan, which will add 10 to their book sales this year (maybe less, depending on the used book market). What should be in such a course? Here is what I am planning, though I am flexible and your answers may influence me.
  1. Overall themes: (1) Given a problem, what is its Comm. Comp.? (2) Applications to other models of computation.
  2. 2-party Comm Comp
    1. Deterministic: Rectangles, fooling sets. SAMPLE: EQ requires n+1 bits. Not hard to prove, but interesting that, unlike complexity theory, we can actually get real results here. Will look at other functions as well, like IP (Inner Product). Not sure if I will do the rank lower bound. Powerful, but somewhat mysterious.
    2. Nondeterministic: covers, relation to deterministic. SAMPLE: NE is in N(log n), so in this setting P ≠ NP. Also, in this setting NP ∩ coNP = P. (Here P means polylog bits.)
    3. Randomized: private coin and public coin. SAMPLE: Randomized is provably more powerful than deterministic, as EQ shows. (A sketch of the public-coin protocol for EQ appears right after this list.)
    4. Relation to Circuits: Lower bounds on monotone circuits for MATCHING and other problems.
    5. Upper and lower bounds on streaming algorithms
    6. Upper and lower bounds on the Cell Probe Model
  3. Multiparty Comm Complexity
    1. Multiparty Comm Complexity, Branching Programs and Ramsey Theory: Go over the Chandra-Furst-Lipton paper on this topic, but fill in all details and background (I'll have my own notes on this). This is a lot of material: multiparty stuff, Ramsey theory, branching programs. Will also do lower bounds on BPs that use the 2-party model (these results may imply the results obtained with multiparty, but I need to look at it more carefully). Will also do some other material on branching programs (might do Dave Barrington's result that NC^1 is in width-5 BPs, but that may take us too far afield).
    2. Other applications of multiparty comm complexity.
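As a teaser for the randomized section (referenced in item 3 above), here is a minimal sketch of the public-coin protocol for EQ; this is my own illustration, not course material. Alice and Bob compare the parities of their strings against shared random strings; k rounds cost k bits and give error at most 2^-k.

```python
import random

def eq_public_coin(x, y, k=20, seed=None):
    """Public-coin randomized protocol for EQ (a sketch, not course code).
    Alice holds bit-string x, Bob holds y, and they share k random strings.
    Alice sends the parity <x,r> for each shared r; Bob accepts iff each one
    matches his own parity <y,r>.  If x == y they always accept; if x != y
    each round catches the difference with probability 1/2, so the error
    is at most 2^-k.  Communication: k bits from Alice (plus the answer)."""
    assert len(x) == len(y)
    rng = random.Random(seed)
    for _ in range(k):
        r = [rng.randint(0, 1) for _ in range(len(x))]
        alice_bit = sum(xi * ri for xi, ri in zip(x, r)) % 2
        bob_bit = sum(yi * ri for yi, ri in zip(y, r)) % 2
        if alice_bit != bob_bit:
            return False   # definitely x != y
    return True            # equal, except with probability at most 2^-k

x = [1, 0, 1, 1, 0, 0, 1, 0]
y = list(x)
z = list(x); z[3] ^= 1
print(eq_public_coin(x, y), eq_public_coin(x, z))
```

Contrast this with the deterministic lower bound in item 1: n+1 bits are required without randomness, so EQ already separates the deterministic and randomized worlds.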

Friday, August 22, 2008

A nice result, but not good for...



When people ask you if theory is practical you should give them a problem that
  1. People actually want to solve.
  2. The algorithm and lower bound for it are relevant at reasonable sizes of the input.
  3. The algorithm for it can be implemented and perhaps actually has been.
Is there a result that fails on all three? That is, is there a result that
  1. People do not care about solving.
  2. The best known algorithm/lower bound only apply when the size of the input is astronomical.
  3. The algorithm for it cannot be implemented.
I have one in mind. It is from an excellent paper by Alon and Azar: Finding an Approximate Maximum
  1. The problem is, given n numbers, find some number in the top half (that is max or 2nd largest or 3rd largest or ... or median). The model is parallel: we get to make n comparisons per round. The question is: How many rounds does it take?
  2. Let r(n) be the number of rounds. They proved log* n - 4 ≤ r(n) ≤ log* n + 2.
  3. The algorithm uses a certain type of expander graph. They prove this type exists by the probabilistic method, so the proof is nonconstructive.
I like this result a lot! I like having very close upper and lower bounds, and the proofs are nice. It gives a nice application of a simple type of expander graph! However, I would not use it to convince someone that theory is practical.
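To see how astronomical the inputs have to be before the bounds say anything, here is a tiny log* calculator (my own illustration, not from the paper):

```python
import math

def log_star(n):
    """Iterated logarithm: how many times log2 must be applied to n
    before the result drops to at most 1."""
    count = 0
    while n > 1:
        n = math.log2(n)
        count += 1
    return count

# log* grows absurdly slowly.  The lower bound log*(n) - 4 only says
# "at least one round" once log*(n) >= 5, i.e. once n >= 2^65536,
# far beyond any input anyone could ever write down.
for n in [2, 16, 2**16, 2**64, 2.0**1000]:
    print(n, log_star(n))
```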

Wednesday, August 20, 2008

Ways Scott could have gotten his question answered

In a blog entry of Scott Aaronson's he asks an interesting question (he often does!) about boy-meets-girl type fiction, in both film and literature. Later, in comment 20, he asks another interesting question (he often has interesting questions in his comments!):
How could I have gotten efficient answers to my question other than by blogging it? I don't know of any existing way to search books and movies by plot structure other than by querying the humans who've read or watched them. (Then again, this is probably not a common want.)
Rather than be comment 90 on Scott's blog, I share my thoughts on this question here.
  1. Scott's audience is, I assume, mostly CS/math/physics people. It's not clear that's really a good target group to ask about literature and film.
  2. Is there some other blog which does have the audience helpful to Scott? I honestly don't know, but that is one answer: Find someone who has a blog with a more appropriate audience and ask to guest blog. (And in exchange that blogger can guest blog on Scott's blog and ask about Quantum Complexity.)
  3. There used to be NEWSGROUPS (actually there still are) with no head person controlling them; people just post stuff. They are not used so much anymore, but Scott could try to post to an appropriate one. (NOTE: that's how I got some of my Bob Dylan satires, by posting a request to rec.music.dementia.)
  4. Find an expert. I wonder if Google or Google++ or whatever will ever replace asking someone who knows stuff. Scott should just ask Roger Ebert about film. Perhaps for money. Or they could barter, since Roger Ebert has always wanted to know about Perfect Completeness for QMA. Actually Ebert does have on his webpage a place for people to ask questions; however, Scott's question is more complex than the usual "How come you recommended movie X when it sucked?"-type questions.
  5. Are there already experts on the web who will answer questions, either for free or for money or for barter?

Tuesday, August 19, 2008

Acknowledging anonymous blog comments

Twice now I have gotten an anonymous comment on this blog that I may want to use in either a paper or my (never-ending) web monograph on VDW stuff. They are:
  1. Anonymous posted a combinatorial proof of a summation. See comment 6.
  2. Former VDW ugrad posted that the exact bound is polyvdw(x^2,3)=29. See comment 11. I asked the obvious people, and they all deny having posted it. (A small program to double-check a bound like this is sketched right after this list.)
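As an aside, here is a minimal backtracking sketch (my own code, not the commenter's) for checking a bound like that: polyvdw(x^2,3)=29 would mean {1,...,28} can be 3-colored with no two numbers a positive square apart sharing a color, while {1,...,29} cannot.

```python
def colorable(n, colors=3):
    """Try to color 1..n with `colors` colors so that no two numbers that
    differ by a positive perfect square get the same color.  Plain
    backtracking; fast for the small n relevant here."""
    squares = [d * d for d in range(1, n) if d * d < n]
    coloring = [0] * (n + 1)          # coloring[i] = 0 means "not yet colored"

    def extend(i):
        if i > n:
            return True
        forbidden = {coloring[i - s] for s in squares if i - s >= 1}
        for c in range(1, colors + 1):
            if c not in forbidden:
                coloring[i] = c
                if extend(i + 1):
                    return True
        coloring[i] = 0
        return False

    return extend(1)

# polyvdw(x^2, 3) = 29 would mean: 28 is colorable, 29 is not.
print(colorable(28), colorable(29))
```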
If I use these proofs then I will reference the blog link (how long do these links last?) and also give the full proof. I would also like to acknowledge the people who came up with those proofs. How should I do this? Acknowledgments like
I would like to thank Anonymous ....
and
I would like to thank Former VDW ugrad...
do not seem like I am really giving them credit. So, what to do? I make the following request:
If you are one of the two people above please email me who you are and which entry you posted.
Will this work? There are two concerns.
  1. That nobody will respond.
  2. That too many people will respond. How do they verify who they are?
Will this be a bigger problem in the future? If so then future textbooks may have
P vs NP was resolved by kittykat17.

Monday, August 18, 2008

Amihood Amir more famous than Dick Cheney

(Guest post by Richard Beigel.)

Top-9 list of internet fame criteria. You know you are famous when ...

    9. You show up on the first page of google hits for your full name (Richard Chang)
    8. You take up all 10 slots on the first page of google hits for your full name (Carolyn Gasarch)
    7. You show up on the first page of google hits for your last name (Lance Fortnow, Johnny Carson)
    6. You take up all 10 slots on the first page of google hits for your last name (Bill Gasarch)
    5. You show up on the first page of google hits for your full name ... even when it is misspelled (Richard Beigel)
    4. You show up on the first page of google hits for your job title (Richard Cheney)
    3*. You show up on the first page of google hits for your first name (Amihood Amir, Don Knuth, Lance Armstrong, Johnny Depp, Johnny Cash)
    2*. You show up on the first page of google hits for your initials (Richard M Stallman)
    1. You show up on the first page of google hits for your middle initial (George W Bush)

*It was hard deciding which of these two should come first, but Richard Stallman shows up under both and, surprisingly, Don E. Knuth doesn't show up under 2.

Disclaimer: These criteria are intended for the purpose of humor only. They do not represent the opinion of the National Science Foundation or the Federal Government, and they will not affect your chances of getting a grant.

Friday, August 15, 2008

Olympic Markets

What do the Olympics have to do with computational complexity? Not much, and while I have spent much more time watching the games than proving theorems this week, I couldn't think of the right way to fit them into the blog. Until I got the following email from David Pennock yesterday.
We implemented Olympics medal count prediction on Yoopick. Since it was your idea, you now have a moral obligation to blog about it, use it, and evangelize it to all your friends. :-)
Where most prediction markets track binary events, like whether Obama will win the election, the Facebook plugin Yoopick, developed at Yahoo Research in New York, is a fake-money market that predicts distributions over a range, like the number of points scored in a basketball game. I suggested using Yoopick to predict the distribution of medals won by country, and now you too can make your predictions and win some Yootles. I hear 100 Yootles and 2 dollars will get you a subway ride in New York.

In other Prediction Market stuff at Yahoo, Sharad Goel used Amazon's Mechanical Turk to offer 100 people three cents each to predict the probability that Obama will win. The predictions were all over the place but average them up and you get exactly the same value as Intrade. Not the first time we've seen this phenomenon and it seems hard to explain.

And don't forget to check out our Electoral Markets Map. Obama has a slight edge as we get closer to the conventions and the start of the real campaign season.

Thursday, August 14, 2008

AI Follows Theory

In one of the 2006 AAAI Outstanding Paper Award winners, Model Counting: A New Strategy for Obtaining Good Bounds, Gomes, Sabharwal and Selman show how to approximately count the number of satisfying assignments of a Boolean formula using a SAT solver. They add random parity constraints à la Valiant-Vazirani and estimate the number of solutions from the number of constraints needed before the formula becomes unsatisfiable.
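Here is a toy sketch of the idea (my own illustration; it brute-forces a tiny formula instead of calling a real SAT solver): each random XOR constraint cuts the solution set roughly in half, so if the formula first becomes unsatisfiable after k constraints, the original count is around 2^k.

```python
import itertools, random

def approx_count(clauses, n_vars, trials=30, seed=0):
    """Toy approximate model counting via random parity constraints.
    A real tool would hand the XOR-augmented formula to a SAT solver;
    here we brute force over assignments, so this only works for tiny
    n_vars and is meant purely as illustration."""
    rng = random.Random(seed)

    def satisfies(a, xors):
        cnf_ok = all(any(a[abs(l) - 1] == (l > 0) for l in cl) for cl in clauses)
        xor_ok = all(sum(a[v] for v in vs) % 2 == p for vs, p in xors)
        return cnf_ok and xor_ok

    def satisfiable(xors):
        return any(satisfies(a, xors)
                   for a in itertools.product([False, True], repeat=n_vars))

    estimates = []
    for _ in range(trials):
        xors = []
        while satisfiable(xors):
            vs = []
            while not vs:                       # skip degenerate empty constraints
                vs = [v for v in range(n_vars) if rng.random() < 0.5]
            xors.append((vs, rng.randint(0, 1)))
        estimates.append(2 ** (len(xors) - 1))  # last level that was still satisfiable
    return sorted(estimates)[trials // 2]       # median over trials

# (x1 or x2) and (not x1 or x3) has exactly 4 satisfying assignments out of 8.
print(approx_count([[1, 2], [-1, 3]], 3))
```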

Sounds like a neat idea that complexity theorists should have come up with in the 80's. And we did, where by "we" I mean Larry Stockmeyer in his 1985 SICOMP paper On Approximation Algorithms for #P. Stockmeyer uses random hash functions from Sipser but it is essentially the same algorithm and analysis as Gomes et al.

Stockmeyer wrote a purely theoretical paper. Since then we have better algorithms and much faster computers, and one can now solve satisfiability on non-trivial input lengths. Gomes et al. write the paper as a practical technique to approximate the number of solutions and even implement their algorithm and contrast it with other programs that exactly count the number of solutions.

Gomes et al. mention Toda's paper on the hardness of exact counting but don't cite Stockmeyer or any other paper in the vast theory literature on approximate counting.

We complexity theorists know many other nifty things one can do with an NP oracle such as uniform sampling and learning circuits. Read them now so you don't have to reinvent them later.

Wednesday, August 13, 2008

Ridiculously hard proof of easy theorem

Justin Kruskal is a high school student working with me on VDW stuff (of course). The following conversation happened recently.

BILL: Justin, you've seen a proof of VDW's theorem, but there are easier things you haven't seen and should. Can you prove that the number of primes is infinite?

JUSTIN: That's easy.

BILL: Good. How does it go?

JUSTIN: By the Green-Tao Theorem there are arbitrarily long arithmetic sequences of primes. Hence there are an infinite number of primes.

What to make of this?
  1. Justin does not know how to prove the Green-Tao theorem (neither do I). However, if the proof does not use the fact that there are an infinite number of primes, then Justin's proof is valid. READERS: does anyone know whether it uses the infinitude of the primes?
  2. Justin now knows the standard proof (sketched after this list, for completeness). However, he should also learn that, at his level of math, you should be able to prove everything you use.
  3. When does one start using theorems whose proofs one does not know? In research this is common. For basic coursework it should be rare.
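For the record, the standard proof referred to above is Euclid's argument:

```latex
% Euclid's proof of the infinitude of primes, sketched for completeness.
Suppose $p_1, p_2, \ldots, p_k$ were all of the primes and let
$N = p_1 p_2 \cdots p_k + 1$.  Since $N > 1$ it has some prime divisor $q$.
But $q \neq p_i$ for every $i$, because dividing $N$ by any $p_i$ leaves
remainder $1$.  So $q$ is a prime not on the list, a contradiction;
hence there are infinitely many primes.
```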

Tuesday, August 12, 2008

Math Problems on vacation

So, as mentioned in my last post, there were two other math folks on the bus tour I was taking. So what did we do? Exchange problems to solve on the bus. We alternated.
  1. I asked them: if there are n couples in a restaurant, and everyone sits either across from or next to their darling at a rectangular table (nobody sits at the ends), then how many ways can they be seated?
  2. They asked me: A marathon is 26.2 miles. A runner runs in such a way that during EVERY 1-mile interval he averages exactly 10 miles an hour, but his overall average is faster than 10 miles an hour. How can this be?
  3. I asked them: Show that no matter how you 3-color the numbers {1,...,2006}, there will be two numbers, a square apart, with the same color.
  4. They asked me: There is a bus where n people have assigned seats. The first person sits in a random seat instead of his assigned seat. Henceforth, every person looks for his assigned seat, and if it is taken, sits in a random seat. What is the probability that the last person sits in his assigned seat?
How did we do on these problems? They got my problems correct. I basically got theirs (though I missed some points).

It was interesting coming up with problems that were not well known. I couldn't ask them truth-teller-and-liar problems or hats problems, since these are well known. Some of the problems above have appeared on this blog before. Not sure if that makes them well known. At least they didn't know them. I tried asking the following problem that I thought was not so well known (I read it in the American Math Monthly and told it to Peter Winkler two years ago-- he had not heard of it), but they had already heard it:

Here is a game: There are initially two piles of stones, with a in one pile and b in the other. On every move a player removes a multiple of the smaller pile from the larger. If either pile has 0 in it then you cannot move. The first player who can't move loses. For which (a,b) does player I have a winning strategy?

Readers: I am not going to post solutions, but you can in the comments!

Monday, August 11, 2008

What is the probability that ...

(I've been on vacation for the last 10 days on a Tauck Bus Tour of Canada with Heli-hiking.)

What is the probability that a bus tour has on it two people that know the proof that S2S is decidable? (Original Proof by Rabin is here. For reviews of several books on the topic see here.)
  1. Not a trick question. The bus tour was not organized by the Association of Symbolic Logic.
  2. The probability seems like it would be low. But this is not the right way to look at it.
  3. This did happen. Suzanne Zeitman was on the tour. I learned about the proof from her (excellent) writeup which, alas, is not online. It is incorporated into the book The Classical Decision Problem by Börger, Grädel and Gurevich.
  4. Should I be saying `WOW! That is so unlikely, yet it happened!'? No. Consider the following fictional conversation:

    BILL: The most amazing thing just happened! I just tossed a coin 40 times and got HHTTTHTHTHTHHTTTTHTHTHTTHTHHHTTHHTHTHHTH.

    LANCE: Why is that remarkable?

    BILL: Because the prob of that particular sequence is so small!
  5. The probability that someone at the distance from me that Suzanne Zeitman is (I have written some math reviews for her and been in some email contact with her) happens to be on the same bus trip as me may be low, but it's not so low that I should be astonished if it happens. This is my third bus trip and the first time it happened. Is the probability 1/3? I doubt that, but it's not so low as to be notable.
  6. If before going on the trip I had said Gee, I wonder if someone who knows the proof that S2S is decidable will be on the tour?, then THAT would be notable.
  7. What is the probability that the maintainer of the Erdos Number Website, Jerry Grossman, was on the trip? This is a trick question-- Jerry Grossman is Suzanne Zeitman's husband.
  8. What is the probability that on the trip there were people who know, through their homeowners association, Stephen Simpson, an eminent logician who works on Reverse Mathematics and whom I know? This did happen. Not a trick question, but again, not that notable.

Friday, August 08, 2008

Discounted Time

A write-up of some ideas I presented at the Complexity Conference Rump Session.


In computational complexity, when we talk about time it usually represents a hard limit on the running time: solving the problem in time t(n). So we are happy, say, if we can solve the problem in one hour and miserable if it takes 61 minutes. But our real gradation of happiness over the running time is not so discontinuous.

Let's take an idea from how economists deal with time. They discount the utility by a factor of δ in each time step for some δ<1. What if we did the same for complexity?

Let δ = 1-ε for ε>0 and very small. Think of ε as about 10^-12. We then discount the value of the solution by a factor δ^t for t steps of computation.

Discounted time gives us a continuous loss due to time. It has the nice property that the future looks like the past: the discount for t steps now is the same as the discount for the t steps already taken.

When t is small, δ^t is about 1-εt, a linear decrease. For t large, δ^t is about e^(-εt), an exponential decrease.
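A quick numerical sanity check of the two regimes (my own illustration, with a larger ε so the effects are visible at modest t):

```python
import math

eps = 1e-6          # think 10^-12 in the post; larger here so the effects show up
delta = 1 - eps

# For small t the discount looks linear in t; for large t it looks exponential.
for t in [10, 1000, 10**6, 10**7]:
    exact = delta ** t
    linear_approx = 1 - eps * t           # good when eps*t is small
    exp_approx = math.exp(-eps * t)       # good when t is large
    print(t, exact, linear_approx, exp_approx)
```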

We can also recover traditional complexity classes: DTIME(O(m(n))) is the set of languages solvable by an algorithm whose discounted value satisfies δ^t > c for some constant c>0, where δ = 1-1/m(n).

I'm not sure what to do with discounted time which is why this is a blog post instead of a FOCS paper.
Some ideas:
  • What does average case and expected time mean in the discounted time model?
  • What if you take the value of the solution of some approximation problem and discount it with the time taken? Can you determine the optimal point to stop?

Thursday, August 07, 2008

The Gaza Fulbright Story

A story that has gotten far less press than it should have.

Seven Palestinians in Gaza received Fulbright grants this year. In May the US State Department cancelled the grants because Israel closed the border between Gaza and Israel and the State Department was afraid they couldn't get the students out. After some noise got made, Condoleezza Rice stepped in and got the grants reinstated. Israel let in four of the seven so they could go to the American consulate to get visas but denied the other three over security concerns. For those three, Rice got the State Department to send a team and equipment into Gaza to help the remaining students, who eventually got visas on July 30th.

Sounds like a happy ending. Alas that's not the end of the story.

A few days later, Fidaa Abed, one of the three, flew from Jordan to Washington and when he landed he learned that his visa was no longer valid. He was put back on a plane to Jordan. The other two hadn't left yet but their visas were also canceled. The decision to revoke the visas came after the US State Department received more information, probably from Israel.

More from the New York Times and the BBC.

Wednesday, August 06, 2008

A Theory of Reductions?

A guest post by Jens Zumbraegel

I am working on cryptography, and I came across complexity theory only recently. My question is whether a general framework for the various types of reduction exists. Let me give motivation:

The concept of a reduction, used to compare the computational hardness of algorithmic problems, is a very important one. However, many types of reductions appear in the literature for different purposes. Here is a rough classification:

  • "Classical" complexity theory: Karp/many-to-one or Cook reductions
  • Average-case complexity: Karp reductions with a domination condition between distributional problems
  • Cryptography: "reducibility arguments" in "provable security", i.e. ad-hoc reductions of the problem BREAKING-THE-CRYPTOSYSTEM to some well-known hard number theoretic problem, like FACTORING
I believe that a reduction theory for cryptographic purposes could simplify and structure many results in the area of provable security. Apparently there is not yet such a theory. One reason might be that although informally stated problems like "breaking the cryptosystem" have been formalized, they do not fit into the framework of decision or search problems (they often involve oracle access, e.g. for deciphering).

Back to my question - I wonder whether there is a general abstract theory of reductions, probably similar to category theory: One defines the class of algorithmic problems (the objects of the "category") and the type of reductions (the morphisms) one would like to consider. For example: (decision problems for languages in {0,1}*, Karp reductions). Such a general framework could help to set up a reduction theory for cryptography.
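To make the category-like picture concrete, here is a minimal Python sketch (entirely my own illustration, with toy problem names): objects are decision problems, morphisms are many-one reductions, and composing reductions is just composing the underlying maps (resource bounds are ignored).

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Problem:
    """A decision problem: a name plus a membership predicate on instances."""
    name: str
    contains: Callable[[str], bool]

@dataclass
class Reduction:
    """A many-one (Karp-style) reduction: a map f with  x in A  <=>  f(x) in B."""
    source: Problem
    target: Problem
    f: Callable[[str], str]

    def then(self, other: "Reduction") -> "Reduction":
        """Composition -- the 'category' structure: reductions compose."""
        assert self.target is other.source
        return Reduction(self.source, other.target,
                         lambda x: other.f(self.f(x)))

# Toy problems over binary strings.
EVEN_ONES = Problem("EVEN-ONES", lambda x: x.count("1") % 2 == 0)
EVEN_ZEROS = Problem("EVEN-ZEROS", lambda x: x.count("0") % 2 == 0)

# EVEN-ONES reduces to EVEN-ZEROS by flipping every bit.
flip = Reduction(EVEN_ONES, EVEN_ZEROS,
                 lambda x: x.translate(str.maketrans("01", "10")))
identity = Reduction(EVEN_ZEROS, EVEN_ZEROS, lambda x: x)

composed = flip.then(identity)
print(composed.source.name, "->", composed.target.name,
      EVEN_ZEROS.contains(composed.f("0110")))
```

Average-case or cryptographic reductions would then be the same picture with richer objects (distributional or oracle problems) and morphisms carrying extra conditions, such as domination.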

Comments most welcome!

Tuesday, August 05, 2008

A Simple Heuristic

When I started off as a professor, my wife worked as a master scheduler at a company called Teradyne, which makes testing equipment. A master scheduler makes the master schedule of which jobs get run on which machines at what time. As you readers already know, almost every interesting variation of job scheduling is NP-complete. As I was teaching intro theory at the time, I thought about bringing her in to give a lecture on how people deal with NP-complete problems in the real world.

So I asked my wife what algorithms she uses to make up the schedule. She had a simple rule:

Whoever yells the loudest gets their job scheduled first.
Needless to say I didn't bring her into class.

Monday, August 04, 2008

Analysis in Complexity

A shout out to my colleagues who have gathered in Banff for the BIRS workshop on Analytic Tools in Computational Complexity.
An important development in the study of computational complexity has been increased role of analytic methods. Fourier analysis has become an essential tool of the field, playing a critical role in the study of interactive proofs, the computational hardness of approximation problems, and the learnability of Boolean functions. The notion of Gowers uniformity (which was introduced by Gowers to give an analytic proof of Szemeredi's theorem on arithmetic progressions, and whose use can be viewed as "generalized Fourier analysis") has also been recently employed in the context of Probabilistically Checkable Proofs and hardness of approximation. A new paradigm in computational complexity is beginning to emerge, which involves reducing high dimensional discrete problems that arise in the study of Boolean functions to high dimensional continuous problems and then applying analytic methods to the resulting continuous problems.
I started graduate work in complexity right before the algebraic revolution that drove Razborov-Smolensky's circuit lower bounds, Toda's Theorem on the power of the permanent, the power of interactive and probabilistically checkable proofs and much more. But now, two decades later, algebraic techniques are producing diminishing returns and we have seen a growth in using real analysis in complexity as highlighted by this workshop. Avi Wigderson is giving a two-hour survey on "The Power of Partial Derivatives." Hard to have imagined a connection between partial derivatives and complexity.
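For those of us who like to see the objects concretely, here is a tiny sketch (my own toy code) of the most basic analytic tool mentioned above: the Fourier expansion of a Boolean function f: {-1,1}^n → {-1,1}, where the coefficient of a set S is E_x[f(x) ∏_{i in S} x_i].

```python
import itertools
from math import prod

def fourier_coefficients(f, n):
    """Brute-force Fourier expansion of f : {-1,1}^n -> R.
    fhat(S) = E_x[ f(x) * prod_{i in S} x_i ], averaging over all 2^n inputs."""
    points = list(itertools.product([-1, 1], repeat=n))
    subsets = itertools.chain.from_iterable(
        itertools.combinations(range(n), k) for k in range(n + 1))
    return {S: sum(f(x) * prod(x[i] for i in S) for x in points) / len(points)
            for S in subsets}

# Majority on 3 bits: MAJ3 = (1/2)(x1 + x2 + x3) - (1/2) x1 x2 x3,
# so all its Fourier weight sits on the singletons and the full set.
maj3 = lambda x: 1 if sum(x) > 0 else -1
for S, c in fourier_coefficients(maj3, 3).items():
    if abs(c) > 1e-9:
        print(S, c)
```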

What about those of us who went into computational complexity because we enjoyed discrete math? Should an old dog like me try to learn new tricks? Ah, the challenge of keeping up with a field that moves in mysterious new ways.

Friday, August 01, 2008

Complexity Special Issue

Here are the results of the vote taken after the special issue debate at the Complexity conference business meeting. The conference committee decided to stay with the Springer journal Computational Complexity for three more years and revisit the issue in 2011.

For those not invited to the special issue: I'd love to see your papers in ToCT but in any case please do submit to any of our community's fine journals. A conference proceedings should not be your paper's final resting place.