Friday, May 29, 2009

The Alpha

"I'm not keen on the hype" says Stephen Wolfram, this from the man whom I once heard exclaim "First there was Euclid. Then there Gödel. Then there was Mathematica". Despite the quote, Mathematica doesn't really help you prove new theorems but I found it quite useful when trying a few simple cases, simplifying messy expressions, optimizing parameters and checking other people's proofs.

But then Wolfram wrote that book that Scott Aaronson read so I didn't have to, and had that silly Turing machine contest. Now he comes out with the modestly named WolframAlpha.

WA is not a search engine; it isn't sure what to do with computational complexity, and functionality for P=NP is under development. Good luck with that.

On the other hand it has considerable built-in data and calculations. Type in sunset Chicago and you'll get useful information that used to take multiple clicks on Google.

More importantly, WA is Mathematica Lite. We have a site license for Mathematica at Northwestern but the convoluted process to download it dominates its value to me. But I can type simplify and maximize requests directly into WA. solve x=4y^2-7, y=x+5 gives clean solutions. I had a little trouble maximizing probabilities with max p(1-2p), but change the p's to x's and everything works fine. It solves all my daughter's homework questions on solving quadratic equations and simplifying radicals (but she's smart enough to know not to use it).
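
For readers without WA handy, here is roughly the same pair of computations done in sympy (a stand-in I'm using purely for illustration, not what WA runs underneath):

    from sympy import symbols, solve, diff

    x, y, p = symbols('x y p', real=True)

    # solve x = 4y^2 - 7, y = x + 5
    print(solve([x - (4*y**2 - 7), y - (x + 5)], [x, y]))

    # max p(1-2p): find the critical point and its value
    f = p*(1 - 2*p)
    crit = solve(diff(f, p), p)                 # [1/4]
    print(crit, [f.subs(p, c) for c in crit])   # maximum value 1/8 at p = 1/4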

So kudos to Wolfram for producing something useful and free. Hope he keeps it that way.

Thursday, May 28, 2009

Randomness in Madison

I'm at the University of Wisconsin for the second FRG Workshop on Algorithmic Randomness. FRG stands for "Focused Research Group", in our case a 12-PI NSF-funded project to study the properties of random reals, essentially Kolmogorov complexity on infinite strings. Most of the PIs are logicians from the computability theory community but a handful of us (Eric Allender, John Hitchcock, Jack Lutz and myself) have a computational complexity background. The workshop draws well beyond the PIs, including foreign researchers and students.

Both the deep theory and the connections of algorithmic randomness to other research areas (such as expert algorithms, extractors, measure theory and Hausdorff dimension) have grown tremendously in the past several years. The Kellogg business school at Northwestern just hired a new assistant professor, Tai-Wei Hu from Penn State Econ, whose job talk paper applies algorithmic randomness (which he learned in a logic class from Steve Simpson) to understand probability in repeated games.

I've only seen one talk here so far but it was a good one: Alexander Shen showed how to convert many basic questions about Kolmogorov complexity into who wins a certain kind of game.

Unfortunately, because of teaching and STOC, I won't be at the workshop long. I'm giving a short talk here tomorrow that doesn't directly relate to the topic at hand but is kind of a cute, not-too-difficult result: There is a computably enumerable, non-computable oracle A such that P^A - P has no computable sets. Ponder it and I'll post the proof someday when I get it written up.

Wednesday, May 27, 2009

Games from the Eco-Daughter

My 11-year-old daughter took some courses on computer gaming that used the Greenfoot system and wrote an eco-themed game Trees (requires Java, click "play" then press "enter" to start). It's no Ribbit but then again when I was 11 I hadn't even seen a computer in person, much less programmed one.

The Greenfoot system uses Java with some built-in objects, so with simple coding my daughter can create interesting games while learning the basics of object-oriented programming. Pretty cool. Earlier she tried out Alice but found it did too much hand-holding.

On the eco theme, my daughter had me watch The Story of Stuff, a twenty minute animation of how consumerism is literally destroying the world. When I tried to argue a few points about the movie with her, she shot back "That's the truth. If you don't like the truth, well neither do I. That's why we have to change it." That's my little Al Gore.

Tuesday, May 26, 2009

Oracle Results are Good For You

For a short time in the 70's, some believed that oracle results would lead to true independence results in computational complexity. Richard Lipton in his post I Hate Oracle Results bemoans the fact that oracles haven't lived up to this hype. But "Hate" is an awfully strong word to apply to one of the most powerful tools in computational complexity.

Let me illustrate by an example. Consider the following open problem:
Does P=NP imply P=PSPACE?
Here's an approach: Chandra, Kozen and Stockmeyer characterized PSPACE by alternating polynomial time and one could possibly use the P=NP assumption to eliminate those alternations one at a time.
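
To be concrete, here is the standard level-by-level collapse the P=NP assumption does give you, and a note on where it stops:

    \[
      \mathrm{P}=\mathrm{NP} \;\Longrightarrow\;
      \Sigma_{k+1}^p \;=\; \mathrm{NP}^{\Sigma_k^p}
      \;=\; \mathrm{NP}^{\mathrm{P}} \;=\; \mathrm{NP} \;=\; \mathrm{P}
      \quad\text{for every fixed } k,
    \]

so the polynomial hierarchy collapses to P. But PSPACE corresponds to polynomially many alternations, and each collapse step hides a polynomial blow-up, so the induction does not obviously carry all the way up.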

This approach doesn't work and in fact no similar approach can work. How do I know that? Because of the following relativization result due to Ker-I Ko.
There is an oracle A such that P^A = NP^A ≠ PSPACE^A.
If you are an oracle-naysayer, Ko's result shouldn't stop you from working on the problem. Let's try non-relativizing techniques. The tools of interactive proofs don't seem to work here. How about other non-relativizing techniques? We don't have any other non-relativizing techniques.  CKS is nearly 30 years old, Ko's oracle is 20 years old and this problem remains completely open.

There are hundreds of problems like this which we cannot settle without non-relativizing techniques. If you feel oracle results no longer play a major role in computational complexity then we should have been able to settle scores of these problems. But in fact we have only twice proven a result in computational complexity when there was a previously known oracle going the other way, both directly related to interactive proofs.

I agree with Lipton that it would be nice to have a formal logical characterization of a "relativizing technique", but just because we don't have one doesn't mean oracle results don't properly guide our research efforts. More in my 1994 relativization manifesto, which I still stand by today.

Monday, May 25, 2009

Raymond Smullyan's Birthday



CLYDE: I met Raymond Smullyan around 1985. He was old then. When did he pass away?

BILL: Let's look it up on Wikipedia. (He does.) OH, he didn't! He is still alive! And his birthday is May 25. He turns 90! (that's not 90 factorial). OH, I can post about it ON his birthday.

CLYDE: What will you say?

BILL: I will say that he wrote many popular books on self-reference which were quite good, but after a while somewhat repetitive. And that his latest book, Logical Labyrinths, came out this year and claims to bridge the gap between the type of logic in his popular books and serious logic. WOW, looks like he is still active!

CLYDE: Isn't that a short post?


BILL: I'll find other stuff to put around the post to make it a bit longer.

Friday, May 22, 2009

Get Your Stimulus Money Here

"Are you taking full advantage of President Obama's stimulus package?" asks a recent robocall. So are you? If not the National Science Foundation has some ways to help.

The NSF gave the CRA $15 million to start the Computing Innovation Fellows project, basically 60 CS postdoc positions for next year. A little late in the season, but if you haven't nailed down a job for next year yet be sure to apply by June 9th. If you are willing to hire a postdoc paid by this program, sign up as a mentor.

Are you still in school? The NSF is greatly expanding the Graduate Research Fellowship Program and the Research Experiences for Undergraduates program.

Already have an academic job? Apply for the CAREER award if you are eligible (CS deadline July 21), a regular theory proposal (August deadline for medium, November for large, December for small) or one of the many other CISE programs.

The NSF budget is on track to improve dramatically over the next decade, but politics has gotten, and can get, in the way. Be sure to take advantage now while you have the chance.

Thursday, May 21, 2009

Gödel Prize

This year's Gödel Prize has been awarded to two papers: Omer Reingold's "Undirected Connectivity in Log-Space" and the Reingold-Vadhan-Wigderson zig-zag product paper.
I consider Reingold's paper the second best result of the decade (after "Primes in P", which won the prize in 2006). Reingold settled a long-standing open question, showing how to deterministically get from point A to point B in an undirected graph using memory equivalent to remembering a constant number of vertices.

The zig-zag paper developed a new construction of constant-degree expanders. We already had simpler expander constructions but the zig-zag technique gave us a recursive way to think about expanders that Reingold drew on for his paper.

This is the first Gödel prize for each of these authors and I'm especially happy to see Wigderson, perhaps the best complexity theorist of our generation, take the prize. Avi could have easily won it earlier for his work on zero-knowledge, derandomization, circuit complexity or many other topics.

Congrats to All!

Wednesday, May 20, 2009

Are FKS and other Data Structure Algorithms actually being used?

MEM PROBLEM: Let n, U be such that n << U. We want to do the following: for all sets A ⊆ {1,...,U} such that |A|=n, we want to store A in s(n,U) cells and be able to answer MEMBERSHIP in q(n,U) probes, where a probe is just LOOKING at the contents of a cell. A cell may have O(log U) bits in it.

Fredman, Komlos, and Szemeredi showed (in this paper) that MEM PROBLEM can be done with s=O(n) and q=O(1).
  1. They actually showed something stronger, that you can get s=n+o(n) and q=O(1). But this does not concern us.
  2. The result with s=O(n) and q=O(1) is a simple algorithm with reasonable constants.
  3. Part of the algorithm involves showing that a randomly picked hash function works with nonzero probability; however, with a slight change in some parameters you can get that most hash functions (of the type they use) work, so this is not an obstacle to actually using their algorithm. (A toy version of the two-level scheme is sketched after this list.)
  4. SO, here is my question/answer/new question:
    1. Is anyone actually using the algorithm?
    2. I would suspect NO because it does not allow for INSERT and DELETE. However, if you just INSERT or DELETE a few things I think it should be fine. So maybe YES.
    3. This paper does MEM in O(1), space O(n), and INSERT/DELETE in amortized expected time O(1).
    4. There has been a lot of work on data structures in this model. The model seems reasonable, and the problems seem like ones people in the real world want to do quickly. SO--- are the data structures being used? If not, then are variants being used? This seems like an area where the gap between theory and practice is small and could be bridged. Has it been? If not, why not?
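
Here is a toy Python version of the two-level FKS scheme referred to in item 2, purely illustrative and not tuned for the n+o(n) space bound: hash into n buckets with a random universal hash function, retry until the squared bucket sizes sum to O(n), then give each bucket a collision-free table of quadratic size.

    import random

    P = (1 << 61) - 1      # a prime larger than the universe we hash from

    def random_hash(m):
        # a random member of the universal family h(x) = ((a*x + b) mod P) mod m
        a, b = random.randrange(1, P), random.randrange(P)
        return lambda x: ((a * x + b) % P) % m

    def build(keys):
        n = len(keys)
        while True:                                   # expected O(1) retries
            h = random_hash(n)
            buckets = [[] for _ in range(n)]
            for k in keys:
                buckets[h(k)].append(k)
            if sum(len(b) ** 2 for b in buckets) <= 4 * n:   # keeps space O(n)
                break
        tables = []
        for b in buckets:
            m = max(1, len(b) ** 2)
            while True:                               # retry until injective on b
                g = random_hash(m)
                slots = [None] * m
                collision = False
                for k in b:
                    if slots[g(k)] is not None:
                        collision = True
                        break
                    slots[g(k)] = k
                if not collision:
                    tables.append((g, slots))
                    break
        return h, tables

    def member(structure, x):
        h, tables = structure
        g, slots = tables[h(x)]
        return slots[g(x)] == x                       # two probes

    if __name__ == "__main__":
        S = random.sample(range(10**9), 1000)
        fks = build(S)
        assert all(member(fks, k) for k in S)
        assert not member(fks, 10**9 + 7)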

Tuesday, May 19, 2009

Lessons from EC

We just posted the talk schedule for the 2009 ACM Electronic Commerce Conference. Early registration deadline is June 12th and hotel deadline is June 5th.

The EC conference covers a broad range of work connecting computer science and economics. While a large part of the EC community draws from the theoretical computer science community, it also draws from AI, human-computer interaction and economics. So the PC process works a little differently than in other conferences. Some thoughts.

PC co-Chairs: We had two PC chairs (Pearl Pu and myself) to spread the workload and add some diversity. The workload wasn't that much less but having different viewpoints definitely helped. I worried about what would happen if we had irreconcilable differences but we managed to work around our disagreements. I have nothing but great things to say about working with Pearl, but I would suggest future conferences avoid having PC co-chairs seven time zones apart.

Multilayered Committee: We had a dozen Senior PC members and a very large PC (about 90 members). We let the Senior PC members choose their PC members. Most papers were assigned to two senior PC members, who each passed it to two PC members, some of whom sent it out to outside reviewers. There was discussion among the PC members and sometimes the senior PC members before Pearl and I made the final decisions. The system seemed to work well for this diverse community.

Author Rebuttal: We sent the reviews to the authors before the final decisions and allowed them to give a 500-word rebuttal. Some papers were helped (or hurt) by the author's ability (or inability) to argue against the criticisms given by reviewers. I found the rebuttals quite useful but we did hear some complaints about the extra work required of the authors and the PC.

Would these ideas work for pure theory conferences? I suspect they would work best for the more diverse conferences like STOC/FOCS/ICALP where for any given paper, most of the PC do not have expertise in that paper's area.

Monday, May 18, 2009

Halt is Undecidable in Verse (YES, you've seen it before)

About a month ago I came across a proof that HALT is undecidable in verse! I thought what a great post that would make! Last week I tried to find it again but, alas, I could not. This link did not work on my school computer, but it did on my home computer. I am tempted to say that's odd, but it's not--- odd computer things that don't make sense are so common that they are no longer odd. Just to make sure I always have a copy, I made my own copy which is here.

While searching the web for this poem I came across a posting about it by Lance here. However, the link there no longer works. (Though Lance may read this and fix it.)

If you find something online that you like, SAVE IT! It may go away. Especially if it's a Dr. Seuss-style poem (see this post).

Friday, May 15, 2009

The Quadratic Formula

A few weeks ago my eighth-grade daughter learned about the quadratic formula for solving ax^2+bx+c=0. She's impressed I can still rattle it off like a song: "Negative bee plus or minus the square root of bee squared minus four-ay-see, all over two-ay". She then asked the question I love to hear, "How did anyone come up with that formula?" So I decided to show her how to derive it but couldn't figure it out on the spot. How embarrassing! I did manage to figure it out later and showed her the next day.
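
For anyone else caught flat-footed, the derivation is just completing the square:

    \[
      ax^2+bx+c=0,\ a\neq 0
      \;\Longrightarrow\;
      x^2+\frac{b}{a}\,x=-\frac{c}{a}
      \;\Longrightarrow\;
      \left(x+\frac{b}{2a}\right)^2=\frac{b^2-4ac}{4a^2}
      \;\Longrightarrow\;
      x=\frac{-b\pm\sqrt{b^2-4ac}}{2a}.
    \]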

Her textbook gives conditions to use the quadratic formula.
  1. b^2-4ac ≥ 0.
  2. a≠0.
The first because they haven't learned complex numbers in the eighth grade. How about the second? After all, the solution for a=0 is well defined at x=-c/b (assuming b≠0). What happens when you take the limit as a goes to zero? For the root whose ± sign is opposite the sign of b, the limit doesn't exist: it goes to plus or minus infinity depending on whether you approach zero from above or below. But for the other root, after applying l'Hôpital's rule and greatly simplifying, you do get -c/b. Possibly the most complicated way to solve bx+c=0.
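
For the curious, here is that limit worked out by rationalizing instead of l'Hôpital; take b>0, so the surviving root is the one with the plus sign:

    \[
      \frac{-b+\sqrt{b^2-4ac}}{2a}
      =\frac{\bigl(-b+\sqrt{b^2-4ac}\bigr)\bigl(-b-\sqrt{b^2-4ac}\bigr)}{2a\bigl(-b-\sqrt{b^2-4ac}\bigr)}
      =\frac{4ac}{2a\bigl(-b-\sqrt{b^2-4ac}\bigr)}
      =\frac{2c}{-b-\sqrt{b^2-4ac}}
      \;\longrightarrow\;
      \frac{2c}{-2b}=-\frac{c}{b}
      \quad\text{as } a\to 0.
    \]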

Thursday, May 14, 2009

The Prime Number Theorem

Recall the Prime Number Theorem (PNT).
Let π(n) be the number of primes that are ≤ n. As n goes to infinity, π(n) approaches n/ln(n).
Note that we did not say Θ(n/ln(n)) or n/ln(n) + Θ(1). We really said n/ln(n). (NOTE: Comments on this post have correctly refuted this remark about "really said n/ln(n)".) However, I will need to refer to a Weak PNT:
Let π(n) be the number of primes that are ≤ n. There exist constants A and B such that, for all sufficiently large n, An/ln(n) ≤ π(n) ≤ Bn/ln(n).
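
To get a feel for how slowly π(n) approaches n/ln(n), here is a quick sieve-based check (just an illustration, not part of any proof):

    import math

    def pi(n):
        # count primes up to n with a sieve of Eratosthenes
        sieve = bytearray([1]) * (n + 1)
        sieve[:2] = b'\x00\x00'
        for p in range(2, int(n ** 0.5) + 1):
            if sieve[p]:
                sieve[p * p::p] = bytearray(len(range(p * p, n + 1, p)))
        return sum(sieve)

    for n in (10**3, 10**4, 10**5, 10**6):
        count = pi(n)
        print(n, count, round(count / (n / math.log(n)), 3))  # ratio creeps toward 1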


In Hardy and Wright's treatment of PNT (and others) they prove the very badly named Bertrand's Postulate (BP) on the way to proving PNT.
(BP) For all n ≥ 3 there is a prime between n and 2n.
  1. The first proofs of PNT used complex analysis and were considered to be not elementary. Erdos and Selberg (independently? not independently? - see this paper for the history) found elementary proofs that call the use of the word elementary into question, since they were quite difficult. It is hard to pin down the term elementary: Oliver Sudac (TCS, The Prime Number Theorem is PRA Provable, Vol 257, NOT online) showed there is a proof in Primitive Recursive Arithmetic, which is weaker than Peano Arithmetic. An interesting article on all of this which IS online is Jeremy Avigad's Number Theory and Elementary Arithmetic.
  2. Applications of PNT. Are there any? The Green-Tao theorem (there are arbitrarily long arithmetic progressions of primes) uses it, though I suspect the Weak PNT would suffice. QUESTION: Is PNT, not Weak PNT, ever actually needed?
  3. WEAK PNT has a much easier proof. Here are my notes on it. The constants in this presentation are reasonable.
  4. From the proof I present of the Weak PNT you can obtain that for large n there is a prime between n and 3n.
  5. Bertrand's Postulate is used in computer science sometimes to show that you can get hold of a prime. (E.g., the proof that EQ has communication complexity O(log n).) QUESTION: Is full BP ever actually needed?
  6. Better results are known: Baker, Harman, and Pintz showed that, for large n, there is always a prime between n and n+n^{0.525}. See their paper.
  7. I have not been able to find out why Bertrand's Postulate is called that. History: Conjectured by Joseph Bertrand in 1845, proven by Chebyshev in 1850.

Wednesday, May 13, 2009

Shaving Logs with Unit Cost

A grad student asked me the following question about how to measure running times.
A paper in POPL 08 claims to have broken the long-standing n^3 barrier on a problem in programming languages, bringing it down to n^3/log n. The algorithm has a lot of details, but the main pillar it stands on is the so-called 'fast set' data structure.

This data structure is used to represent sets as bitmaps of their elements (so the set {1,2} over the domain 0-7 is represented as 01100000), and supports three operations, one of which is important for my question: set difference. Set difference can be performed by going through the bits of each set. Here's the claim I have a question about: the author says that, assuming an architecture with word size Θ(log n) bits, we can split the bitmap of each set into (n/log n) chunks of (log n) bits each; since the word size is Θ(log n), each chunk is stored in a single word, and operating on a word takes constant time. Hence, set difference can be performed in Θ(n/log n) time. This is the whole gist of the algorithm that breaks the previously known barrier.

I was appalled by this argument, since my primitive techniques would easily 'prove' that set difference has a lower bound of Ω(n), since you have to 'touch' each element of the set. I contacted the author, and he said assuming this word size is standard in the algorithms literature; for example, saying that depth-first search takes O(e+v) time ignores the log v time needed to retrieve each vertex.

I talked to a few people in my department and got varied responses. Some said it is cheating, since algorithms such as DFS do not depend on the logarithmic word size, it is just ignored; and some said it is probably a good argument.

The simple question is, what do you think as a complexity researcher? Is the stated complexity valid? I am worried that you will say no, because a lot of people are citing this paper and everybody thinks the n^3 barrier for that problem has been broken. I am also worried that you will say yes, as a person not particularly in the complexity field, but who thinks he has a decent knowledge and sense of algorithms.

I remember complaining as a student that sorting actually takes O(n log^2 n) time since one needs O(log n) time to do a comparison, but my complaints fell on deaf ears.

In algorithms you use the unit-cost RAM model, where basic register operations over O(log n)-bit registers count as a single computation step. There are some good arguments for this: as technology improves to handle larger input sizes, the size of the registers tends to increase as well. For example, registers have grown from 8 to 64 bits on microprocessors over the past few decades.

You can do the set difference of two register bitmaps A and B by taking the bitwise AND of A with the bitwise complement of B, all standard register operations. So the author has a legitimate O(n^3/log n) time algorithm in the standard algorithmic unit-cost model.
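
As a concrete (if simplified) illustration of the word-parallel trick (my own sketch, not the POPL paper's code): store a subset of {0,...,U-1} as an array of 64-bit words and take differences one word at a time.

    WORD = 64                       # plays the role of the Theta(log n) word size

    def make_bitmap(elements, universe):
        words = [0] * ((universe + WORD - 1) // WORD)
        for e in elements:
            words[e // WORD] |= 1 << (e % WORD)
        return words

    def set_difference(a, b):
        # A \ B: AND each word of A with the complement of the matching word of B,
        # so the loop runs universe/WORD times rather than universe times.
        mask = (1 << WORD) - 1
        return [x & (~y & mask) for x, y in zip(a, b)]

    if __name__ == "__main__":
        A = make_bitmap({1, 2, 5, 70}, 128)
        B = make_bitmap({2, 70}, 128)
        D = set_difference(A, B)
        print([i for i in range(128) if (D[i // WORD] >> (i % WORD)) & 1])  # [1, 5]

The Ω(n) intuition counts elements, but each word operation processes Θ(log n) elements at once, which is exactly where the log n savings comes from.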

I understand the student's frustration. The algorithm seems like a cheat and would not run in DTIME(n^3/log n) on a multi-tape Turing machine as I teach it in an introductory complexity course. But as an algorithmic result in the standard algorithmic model, the author stands on solid ground.

Tuesday, May 12, 2009

My Obligatory Star Trek Post

Hard to believe I have yet to make a Star Trek post in this weblog. Bill had a guest post, The End of an Era?, back in 2005 when Enterprise ended its run. The new movie starts another Star Trek era. I saw it Saturday and my three-word review: go see it.

I was but a toddler when the original series premiered but by the time I was in middle school Star Trek fever had taken over. The local TV station showed Star Trek nearly every day and I watched as often as I could (back in the era before Hulu, Tivos, DVDs, VCRs or even remote controls). I memorized the episodes, read the novelizations, tried to catch the animated episodes though time conflicts usually saved me from that disaster. This was the show young nerdy teenagers talked about. 

I had great fun seeing the first movie just to see the cast back together. Caught Wrath of Khan on its first showing in New Jersey at a surprisingly empty theater. Ricardo Montalban was one of those actors on the original series who went on to greater fame, in this case on Fantasy Island. Great to see him reprise his role on Star Trek in the best movie with the original cast. 

In grad school we had a party for the first episode of Next Generation. I watched most of that series as well as Deep Space Nine and Voyager before giving up on Enterprise well before it ended its run. In 2005 Star Trek seemed to have run its course.

J.J. Abrams took Star Trek back to its roots, recasting the original cast and rebooting the series. I forgive the various issues of canon for a very enjoyable movie and the return of the most important Sci-Fi franchise of my generation. 

Monday, May 11, 2009

Request for Info on Prob Hypergraphs

A grad student recently asked me for some refs on probabilistic hypergraphs. I didn't know any, but I suggested we harness the power of the internet and of the blog. Here are his questions. If you know any refs please comment.


I have a problem that deals with coalitions. The basic idea is to have a hypergraph with a bunch of nodes representing actors and edges representing coalitions. Such a hypergraph would have the following properties:
  1. non-regular (edges can have any number of nodes, and different edges in the same graph can have different numbers of nodes)
  2. down-set (the existence of an edge implies the existence of all edges that are a subset of that edge)
  3. finite
  4. probabilistic EDGES (I had found previous work on probabilistic nodes, but not for edges)
Does anyone know of any work on these types of hypergraphs?
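
To make the object concrete, here is one toy encoding of the four properties (my reading, purely illustrative; the semantics of the edge probabilities is exactly what the references should pin down):

    from itertools import combinations

    class ProbEdgeHypergraph:
        """A finite hypergraph on 'actors' with a probability attached to each
        edge (coalition); edges may have any size (non-regular)."""

        def __init__(self, nodes):
            self.nodes = frozenset(nodes)
            self.prob = {}                     # frozenset(edge) -> probability

        def add_edge(self, edge, p):
            edge = frozenset(edge)
            assert edge and edge <= self.nodes and 0.0 <= p <= 1.0
            self.prob[edge] = p

        def is_down_set(self):
            # property 2: every nonempty subset of an edge is itself an edge
            for e in self.prob:
                for size in range(1, len(e)):
                    if any(frozenset(s) not in self.prob
                           for s in combinations(e, size)):
                        return False
            return True

    if __name__ == "__main__":
        H = ProbEdgeHypergraph({1, 2, 3})
        H.add_edge({1, 2}, 0.4)
        print(H.is_down_set())       # False until {1} and {2} are added as edges
        H.add_edge({1}, 0.9); H.add_edge({2}, 0.7)
        print(H.is_down_set())       # True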

Saturday, May 09, 2009

New URL Same Blog

The Computational Complexity weblog has a new URL blog.computationalcomplexity.org. This change should increase reliability of posting, commenting and loading the blog.

All the old weblog.fortnow.com URLs will now forward to the appropriate new URL (or go back to the original files if needed) so you don't have to change your bookmarks, links or feeds. You can access the old files at oldblog.computationalcomplexity.org but that site will no longer be updated.

The URLs computationalcomplexity.org and www.computationalcomplexity.org will continue to point to the Conference on Computational Complexity so you still can easily register for the upcoming conference in Paris.

Friday, May 08, 2009

Doctor for Two Decades

I measure academic age by years since Ph.D. and by that measure I just turned twenty. I passed my Ph.D. defense on May 5, 1989 (though technically I didn't graduate until June).

20 years. Wow. 20 years before my Ph.D. we didn't have a P vs. NP problem. Does the PCP theorem seem as ancient to today's grad students as Cook's theorem was to me?

In those 20 years we had incredible advances in 
  • Complexity of approximation, which includes the development of interactive proofs, probabilistically checkable proofs, the parallel repetition theorem, semi-definite programming, and unique games.
  • Great advances in coding theory, in particular list-decodable codes.
  • Tight connections between circuit hardness and derandomization. And we don't need random coins anymore for primality. 
  • Explicit constructions of extractors, expanders and various other combinatorial objects with SL=L coming out of this theory.
  • Quantum computing.
  • Proof Complexity.
  • Cryptography. Almost anything you can imagine you can implement cryptographically, at least in theory.
  • The applications of computation theory to economic theory and vice versa.
  • Amazing connections between all of the above.
And I got to watch it all in real time.

I've seen the Internet go from E-mail to Twitter. I've seen fax machines and CDs go from amazing technologies to has-beens. And computers got much faster and smaller. I no longer have time for a restroom break when I LaTeX a paper.

Technology that hasn't changed much: Transportation. I still get from point A to point B the same way I did twenty years ago though now without being fed. 

What will the next twenty years bring? Many great theorems. Definitely. Multi-core computers with millions of cores. Probably. Large-scale quantum computers. Doubtful. A proof that P≠NP. Don't count on it.

Thursday, May 07, 2009

New York THEORY DAY 2009


Dear Organizers of IBM Research|NYU|Columbia Theory Day,

You emailed me the ad for Spring 2009 Theory Day. THANKS!
You did this about a week ago. Hence, as is often the
case, this is too short notice for me to go.
Please announce it earlier in the future.

Also, I could not find on your website a pointer to the
abstracts of this year's talks. (If I missed it then please
let me know.  Other readers, also let me know.)

And lastly, your page on Recent and Upcoming Theory Days
should be retitled
Theory Days from 2003-2007 or be updated.

                                        bill gasarch

P.S. The talks look AWESOME!


                 The IBM Research|NYU|Columbia Theory Day
                         Friday, May 22, 2009


The Theory Day will be held at the Davis auditorium,
412 Schapiro (CEPSR) building, Columbia University, New York.

                                  Program

9:30   - 10:00    Coffee and bagels

10:00  - 10:55    Dr. Dana Moshkovitz (IAS)
                  Two Query PCP with Sub-Constant Error

10:55  - 11:05    Short break

11:05  - 12:00    Dr. Craig Gentry (IBM Research)
                  Fully Homomorphic Encryption Using Ideal Lattices

12:00  -  2:00    Lunch break

 2:00  -  2:55    Dr. Mark Braverman (Microsoft Research)
                  Approximating bounded depth circuits with polynomials

 2:55  -  3:15    Coffee break

 3:15  -  4:10    Dr. Muthu Muthukrishnan 
                  (Google Labs and Rutgers University)
                  3 Problems in Internet Ad Systems

For directions, please see DIRECTIONS

To subscribe to their mailing list, follow instructions at HERE

Organizers:
Yevgeniy Dodis   dodis@cs.nyu.edu
Rocco Servedio   rocco@cs.columbia.edu
Tal Rabin        talr@us.ibm.com
Baruch Schieber  sbar@us.ibm.com


Wednesday, May 06, 2009

My Kindle

As an MIT alum I have an automatic subscription to Technology Review, a pretty nice perk. Wrapped around the current issue was a note that I could trade my printed issue for access to the digital issue, with more content and earlier availability. I'll keep the printed issue because I can't read documents on a screen for long periods of time. I get a complimentary digital issue of Scientific American by being part of their blog network but end up just skimming it and printing out any articles I really want to read.

But I found I can read for long periods from the Kindle, which I got when Amazon released their second edition a few months ago. I've fallen in love with the device. I cancelled my paper subscription to the New York Times and now read it on the Kindle, and I have downloaded and read several books on the device. Often I'll send text documents to the Kindle for reading. The Kindle has resparked my interest in reading, and I love many of the features, including automatic bookmarking and easy dictionary lookup, things I rarely put in the effort to do before.

I saw recently that the Kindle attracts a much older age demographic than most new electronic devices. My guess is that we older people have never gotten used to reading off a computer screen and are willing to spend money on a device that just lets you read.

But the Kindle doesn't work for much of my reading—mathematical research papers. The Kindle doesn't do formatting well, just converting PDFs to text. 

Today Amazon will announce the new Kindle DX with a larger screen and true PDF formatting, supposedly textbook friendly and thus, I assume, math friendly. Imagine, for example, putting conference proceedings on a Kindle-like device instead of lugging a heavy proceedings volume or reading it on a laptop. Or putting all the papers you need to referee/review on such a device. Would that make it more or less likely you will get the reviews in on time?

But how do I justify getting a new Kindle when I already own one? Sometimes being slightly ahead of the technology curve works against you.

Tuesday, May 05, 2009

My Publications

I spent some time this weekend updating my publications page. I did well in the summer conference season: An ICALP paper and two each in Complexity and TARK. Not to mention my first ever Algorithmica paper was just accepted for publication.

My proudest 2009 publication: The Complexity of Forecast Testing, written with Rakesh Vohra, appeared in the January issue of Econometrica, generally considered the top journal in economic theory. Consider the problem of testing a forecaster (like a weatherperson) who each day gives a probability of an event that happens the next day. Alvaro Sandroni proved a strong negative result: suppose a tester T makes decisions in a finite number of steps and passes (with high probability) any forecaster that correctly predicts nature. Then there is a probabilistic forecaster that will, with high probability, pass test T without any knowledge of nature, even if nature is adversarial.

Sandroni's forecaster requires solving exponential-sized linear programs. Vohra and I showed that Sandroni's theorem doesn't hold if we limit both the tester and forecaster to be efficiently computable (under generally believed hardness assumptions). We created efficiently computable testers so that for some distributions for nature, a forecaster would have to factor numbers or solve PSPACE-hard problems to pass the test.

This paper represents a large part of my current research: applying the ideas and tools of computational complexity to economic theory, in particular seeing what happens in a wide variety of economic models when we require the agents to be computationally efficient. Given the success of this paper (and others), perhaps some economists are beginning to realize the importance of capturing computation costs in their models.

Monday, May 04, 2009

A new way to educate the public

How well known is the concept of the Turing Test? Readers of this blog know what it is. On Google it gets 2,360,000 hits. It has a Wikipedia entry: here. None of these mean that the public knows what it is.

I didn't think it was well known. However, in Roger Ebert's review of the movie X-men Origins: Wolverine he titles the review
A monosyllabic superhero who wouldn't pass the Turing Test.
In the review he never mentions the Turing Test or what it means. Does he expect his readers to know? Do they? Does the general public know what the Turing Test is? Will readers of Ebert's column go to Wikipedia to find out? Might this be a way to educate people?

CHALLENGE TO MY READERS: find a movie whose review could illustrate a computer science or math concept. Here is one: The Usual Suspects:
As complicated and enjoyable as the classical proof of van der Waerden's theorem.