Friday, October 31, 2008

Focs Wrap Up

(Guest post from Rahul Santhanam)

As promised, I did do a bit of experimenting this FOCS in terms of attending talks from other areas. Here's an encapsulation of the results:

Cryptography - Yevgeny Vahlis spoke about his negative result with Boneh, Papakonstantinou, Rackoff and Waters on black-box constructions of identity-based cryptosystems (cryptosystems in which an arbitrary string can be used as a public key) from trapdoor permutations. ID-based cryptosystems have been extensively studied recently in work of Dan Boneh and others. Security of known constructions is proved in the random oracle model, and it would be interesting to base security on the same assumptions as those used in standard public-key cryptography. This result indicates this might be too much to hope for, at least using standard techniques.

Learning - Adam Klivans gave an excellent talk on "Learning Geometric Concepts via Gaussian Surface Area" (with O'Donnell and Servedio). There's been a lot of interesting work recently on learning in the presence of random or adversarial noise. It's known that functions with low noise sensitivity, or even with low Gaussian noise sensitivity, can be learned efficiently in this setting, but these quantities are hard to bound in general. The current work resolves some open problems about learnability of geometric concept classes by using some deep mathematical results relating Gaussian noise sensitivity to Gaussian surface area, and working with the latter more tractable quantity.

Games with Entangled Provers - I went to a couple of talks on another currently hot topic: the power of two-prover games with entangled provers. A semi-definite programming formulation due to Tsirelson can be used to show that the optimal value can be computed efficiently for a very simple form of 2-prover games, known as XOR games. The first talk that I attended was given by Thomas Vidick on "Entangled Games are Hard to Approximate" (with Kempe and Kobayashi and Matsumoto and Toner) - he showed a reduction from the problem of approximating the value of 2-prover games for classical provers (known to be NP-complete) to the problem of approximating the value of 3-prover games for entangled provers. A complementary talk by Ben Toner on "Unique Games with Entangled Provers are Easy" (with Kempe and Regev), showed that for unique games, the optimal value can actually be approximated efficiently using "quantum rounding" to a semi-definite program. The most interesting open problem in this area seems to be to derive some sort of upper bound on the complexity of approximating the value for general games.

Algorithms - I heard Dimitris Achlioptas speak on "Algorithmic Barriers from Phase Transitions" (with my colleague Amin Coja-Oghlan). This paper tries to understand the reason why simple algorithms for constraint satisfaction problems on random instances fail in the region around the threshold, by showing that this "region of failure" coincides with the region where the structure of the solution space changes from being connected to being (essentially) an error-correcting code. Rather intriguing that a computational fact, the success or failure of standard algorithms, is so closely related to a combinatorial artifact. Finally, I went to Grant Schoenebeck's talk on proving lower bounds for the Lasserre hierarchy of semi-definite programming formulations of CSP problems, which is interesting because a number of different algorithmic techniques including the Arora-Rao-Vazirani formulation of Sparsest-Cut can be implemented in the lower levels of the hierarchy. Grant's result uses a connection to the width complexity of resolution, which I found very surprising, but then my understanding of this area is rather naive...

All interesting stuff, but I'd actually prefer FOCS to be even broader, and representative of all areas of theory. If that requires multiple sessions, so be it. Currently, FOCS seems to be an algorithms, complexity and combinatorics conference with papers from other areas filtering in unpredictably depending on the composition of the committee. It's very creditable that the standard of papers has remained consistently high (or even improved) over the years. But with several major sub-areas (semantics and verification, concurrency theory, database theory, theory of distributed computing, computational geometry, computational biology etc.) represented barely or not at all, it's quite hard to still make the case that FOCS and STOC are absolutely central to theoretical computer science as a whole. I guess the question is how much we value FOCS and STOC being core conferences?

Thursday, October 30, 2008

Election Special

A hodgepodge of election stuff, in my last post before November 4.

The markets and polls both suggest that the election is not that close. There's no question who will win in my (and Obama's) home state. The senate race here is even less interesting, with the Republicans not even bothering to mount a serious campaign against the incumbent and Majority Whip, Dick Durbin. At least the house race in my district is tight, with the incumbent, Mark Kirk, a moderate Republican, running against Dan Seals, an up-and-coming African-American. Sounds familiar.

Before voting, check out the answers Obama and McCain both gave to Science Debate 2008. Keep in mind that science will likely take a back seat given the tight budget constraints that we will have in the next several years. McCain's promise to freeze most federal spending would be particularly bad for science.

In the category of "Good Searching means Less Privacy" comes the Contributions Search Engine on the New York Times site. Type in a relatively unique name like "Fortnow" and you'll discover my $255 donation to the Obama campaign. In full disclosure, I also donated $100 to Obama in the primaries.

So why $255? Did I really mean to donate the maximum I could fit in one unsigned byte? Actually $255 = $1000/4 + $5, with the $5 coming from one of my daughters who wanted to add her own contribution. I still live in a base-ten world.

Wednesday, October 29, 2008

Math NAMES

(A quick note: Howard Karloff is organizing an excursion to the South Pacific during SODA. Sign up by November 3.)

And now for today's post: What do the names Pascal, Hilbert and Mittag-Leffler have in common, aside from all being names of math folks? I have AN answer in mind. You may very well give me other ones. I'll be curious whether anyone gives me the one I had in mind.

Tuesday, October 28, 2008

The Gossip Center

Rahul Santhanam continues to spread the word from FOCS

Conferences are not really about the talks, they're about socializing and gossip. Who's having an affair with whom? Who's been drinking too much? It's for answers to questions like these that we brave jet lag, find others to teach our classes, and endure conference food…

Well, not quite. Poets and artists might gossip about such things, but scientists are on a higher plane, of course; we're beings of pure reason, are we not? The questions that concern us tend to be more of the form "Who's deserted University X for University Y?" and "Who's been spending weeks holed up in his room thinking about Conjecture Z?". And while I'm sure there are those of you who want to know more about the various research addictions triggered off by Razborov's proof of Bazzi's theorem, job news is probably of more general interest. So here's what I've learned in the past week:

Emanuele Viola → Northeastern
Andrej Bogdanov → Chinese University of Hong Kong
Nisheeth Vishnoi → LRI, Paris
Neeraj Kayal → Microsoft Research, Bangalore
Brent Waters → University of Texas, Austin
Shang-Hua Teng → University of Southern California, Los Angeles
[Lance's Note: Rahul Santhanam → Edinburgh]

I rely on commenters to make good the omissions.
It certainly seems as if there's been more movement than usual on the job scene since STOC. The grad students and postdocs I've talked to seem pretty apprehensive about the market for next year, and with fair reason, I think. The market hasn't been that great for theory in the past couple of years, in any case, and the financial crisis seems likely to lead to funding cuts and more hiring freezes.

Perhaps there is some cause for optimism in the increasing number of postdocs available? The improvement in the NSF situation in the past couple of years means that more faculty in North America are able to hire postdocs. The emergence of Microsoft Research, Cambridge and the new Center for Computational Intractability in Princeton certainly won't hurt.

But while having more postdocs around is good for our field, it might not be such a good thing from the point of view of the postdocs themselves. First, there is the intrinsic transience of the position - I had very good research environments in my postdocs at Simon Fraser and Toronto, but I never escaped the feeling of being in Purgatory. Second, there's the fact that job applications and interviews are very time consuming - it's hard to be productive when you know your entire future career might depend on how well you can advertise your research. And do we really want theoretical computer science to become like theoretical physics, where it's normal for graduating students to expect a postdoc apprenticeship of 6-7 years before they can find a permanent job?

For theorists not intent on getting a job in North America, the situation might be a little better. There are increasing opportunities in Europe and especially in Asia, as recent job news indicates. In general, the best attitude might be to be realistic about your prospects and to use the competitiveness of the market as motivation for your research.
Monday, October 27, 2008

Post-Lunch Highlights, Day 1

More from Rahul at FOCS

I realize I could quite easily spend more time writing about the talks than going to them, so I'll keep it short.

• Gil Kalai gave a vastly entertaining talk on "Elections can be Manipulated Often" (with Friedgut and Nisan). Clearly a topic of current relevance, but Gil bravely resisted making an Obama-McCain reference. The main result of the paper essentially says that the cumulative manipulation power of the voters (their capacity to bring about their preferred global outcome by voting in a way that differs from their actual ranking of the candidates) in any "reasonable" election (where the voting scheme is far from yielding a dictatorship, and is insensitive to the names of the candidates) is invariably high, in a setting where voter preferences are modelled by random choices. The main open question from the talk was to prove that the current financial crisis was unavoidable.

• Mihai Patrascu gave a talk on his paper "Succincter", which won the Machtey best student paper award. His paper greatly advances the state of the art in succinct data structures. For instance, the paper shows how to store N "trits" (numbers that are either 0, 1 or 2) in N log₂3 + O(1) bits of memory with only logarithmic query time. The proof uses a clever recursive encoding scheme.

• Dana Moshkovitz spoke about her Best Paper-winning work with Ran Raz on constructing sub-constant error 2-query PCPs of nearly linear size. This has been a major open problem for quite a while now, and its resolution yields improvements to a large family of hardness of approximation results. The proof builds on earlier work by Moshkovitz and Raz constructing sub-constant error low degree tests of nearly linear size.

There's been precious little controversy at this conference, so I feel duty-bound to stir something up.
There's an asymmetry between the two seminar rooms – one is considerably larger, wider and has better acoustics than the other. Is it a coincidence that the more complexity-oriented talks in Session B are in the smaller room, and the algorithms-oriented Session A talks in the larger one? I think not. This is merely the latest instance of the step-sisterly treatment accorded to complexity theory down the ages, right from when Euclid designed his algorithm but failed to analyze its complexity. I urge all right-minded readers to express their outrage at this state of affairs.

More seriously, I can't think of a hitch; even lunch yesterday was good. Kudos to Sanjeev Khanna, Sudipto Guha, Sampath Kannan and the student volunteers.

Business Meeting

Another FOCS report from Rahul

The business meeting was compered by Paul Beame.

1. The local chair Sanjeev Khanna spoke. 270 registered this year, as opposed to 250 last year and 220-230 the year before that. Excelsior!
2. The P.C. report was given by the P.C. chair R. Ravi. Some salient features:
   1. 349: not the number of reviewed papers, but rather the number of external reviewers.
   2. June 13, 2008: the Night of the Long Knives, when as many as 150 papers had their hopes quashed.
   3. "Succincter": in an unprecedented double, Mihai's Best Paper Title awardee also wins the Machtey Best Student Paper award.
   4. Dana Moshkovitz and Ran Raz win Best Paper for "Two-query PCP with Sub-Constant Error".
3. Milena Mihail gave a presentation on FOCS 2009, which will be held in Atlanta from November 8-10 next year.
4. Volunteers were requested for hosting FOCS 2010. Three hands shot up instantly, and after a long and contentious debate culminating in a beer-bottle fight... Nah.
5. We all got to vote nine days before the big day. David Shmoys was elected vice-chair of the IEEE Technical Committee on Mathematical Foundations of Computing.
6. A couple of major prize announcements were made; no suspense involved for readers of this blog.
Volker Strassen is the recipient of the Knuth Prize, and Spielman and Teng have won the Gödel Prize for their paper on smoothed analysis of linear programming.
7. NSF report by Sampath Kannan, who's a Division Director for CCF (Computing and Communication Foundations). We theorists seem to have a nice program of our own now directly under CCF called Algorithmic Foundations, which covers most of the traditional STOC/FOCS areas, and has a budget of over $30 million for this year. Grant deadlines are coming up pretty soon: Large grants on October 31, Medium grants shortly afterward in November, and Small ones in December. There was also some information on relevant cross-cutting funding opportunities.
8. STOC 2009 will be held May 31- June 2 in Bethesda, Maryland. The submission server is already active. Title and short abstract are due November 10, extended abstract is due November 17.
9. Pierre Mckenzie recapitulated CCC 2008 in 30 seconds, and then announced that CCC 2009 would be held in Paris. Paris, France, as a matter of fact; excited murmurs from the audience. For a brief moment there complexity theory was cool.
10. David Johnson announced that SODA 2009 would be held January 4 - January 6 in New York, and that Volker Strassen would be giving his Knuth Prize lecture there.
11. Miscellaneous announcements, including a long-overdue archiving of old FOCS proceedings in IEEE Xplore (I believe the only ones left are '60, '61, '63 and '67) and news about the establishment of the Center for Computational Intractability in Princeton, funded by the NSF/CISE Expeditions grant announced earlier on this blog.
That was one of the duller business meetings in recent memory. Which is a good thing, of course – it means we're all happy and getting along.

Sunday, October 26, 2008

Pre-Lunch Session

More from Rahul

A typical FOCS day divides up the same way as a day in a cricket test match: pre-lunch session, afternoon session and post-tea session. And approximately the same fraction of the audience is awake and/or paying attention?

I set myself the goal of attending the first talk of the day, which naturally meant that I stumbled in around halfway through Manindra Agrawal's talk on "Arithmetic Circuits: A Chasm at Depth Four" (Agrawal and Vinay). This is one of my favorite papers in the conference: the result is surprising, easy to state and not hard to prove. The authors show that an exponential lower bound on arithmetic circuit size for depth-4 circuits actually implies an exponential lower bound on circuit size for general arithmetic circuits. There are precedents for this kind of depth reduction for arithmetic circuits (unlike for Boolean circuits), e.g. the Berkowitz-Skyum-Valiant-Rackoff result that polynomial size circuits can be simulated by polylogarithmic-depth circuits, but the nice thing about this result is that it gives an explanation for why most recent progress on arithmetic circuit lower bounds has been in the very low depth regime. Manindra's talk was marred a little by technical glitches, but he has an unparalleled ability to stay cool in a crisis.

Then came an excellent talk by Madhur Tulsiani on "Dense Subsets of Pseudorandom Sets" (Reingold, Trevisan, Tulsiani and Vadhan). This is a topic that Luca has said a lot about in his blog, and I certainly can't hope to better that. The final talk in the session was by Timothy Chow on "Almost-natural Proofs", about how the natural proofs framework of Razborov and Rudich ceases to be a barrier to proving circuit lower bounds if it is relaxed slightly. The talk was good, but I was even more intrigued by the speaker. He is unusually wide-ranging in his choice of problems to work on - he comments quite often on the FOM mailing list, and I know he's worked among other things on logics for polynomial time. It's good for our heavily algorithms-and-complexity centered community to have people from outside the mainstream working on our problems.

You'll notice that most of the talks discussed so far are complexity-related - old habits die hard. But I did try compensating by going to Mihai Patrascu's talk on "Dynamic Connectivity: Connecting to Networks and Geometry" (Chan, Patrascu and Roditty), about data structures for dynamic connectivity in graphs where both vertex and edge updates are allowed. There did seem to be some elegant ideas involved, but the talk was geared for an audience which knew something already about dynamic data structures, and I didn't follow the details. This reminds me of why I don't tend to go to talks too far afield of areas I work in - presenters do tend to take certain things for granted in their audience, and it makes sense for them to do so, since most members of the audience probably have a reasonable background in the subject of the talk. Also, it wasn't the most enthusiastic of talks, but I am looking forward to Mihai's talks on "Succincter" and "(Data) Structures", both of which concern very natural questions that are interesting even to non-specialists in data structures.

Next for me was another algorithmic talk: Chris Umans on "Fast Modular Composition in any Characteristic" (Kedlaya and Umans), which showed that the composition of two polynomials f and g of degree n modulo a third polynomial of the same degree can be computed very efficiently, in time that's nearly linear in n. The proof goes through an earlier reduction by Chris of the problem to simultaneous evaluation of multilinear polynomials on several points; the new contribution is a clever solution to this simultaneous evaluation problem through recursively applying Chinese remaindering. The talk, as is usual with Chris, was crisp and well-organized.
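To see what's being sped up, here's the naive baseline for modular composition: Horner's rule computes f(g(x)) mod h(x) with deg(f) modular multiplications, each quadratic if done by schoolbook methods, so roughly cubic in n overall, versus the nearly linear time of Kedlaya-Umans. A minimal sketch over GF(p), with polynomials as coefficient lists (lowest degree first), h assumed monic, and helper names that are my own:

```python
def poly_mulmod(a, b, h, p):
    """(a * b) mod h over GF(p): schoolbook multiplication, then reduction
    by the monic modulus h."""
    prod = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            prod[i + j] = (prod[i + j] + ai * bj) % p
    n = len(h) - 1                      # deg h; h is monic of degree n
    for k in range(len(prod) - 1, n - 1, -1):
        c = prod[k]
        if c:                           # x^k ≡ x^(k-n) * (x^n mod h)
            for j in range(len(h)):
                prod[k - n + j] = (prod[k - n + j] - c * h[j]) % p
    return [c % p for c in prod[:n]] or [0]

def modular_composition(f, g, h, p):
    """Compute f(g(x)) mod h(x) by Horner's rule: deg(f) mulmods."""
    result = [0]
    for coeff in reversed(f):           # highest-degree coefficient first
        result = poly_mulmod(result, g, h, p)
        result = [(result[0] + coeff) % p] + result[1:]
    return result
```

For example, with h(x) = x² + 1 over GF(5), composing f(x) = x² with g(x) = x gives x² ≡ 4 (mod h), i.e. the constant polynomial [4, 0].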

That concluded my pre-lunch session – I wouldn't have been alert enough to do justice to the remaining talks. Stay tuned for a report on the rest of the day's play.

Sampling FOCS

I'm in Philly for FOCS, and will be reporting on the extravaganza for this blog, including the drama and suspense of the talks ("Will there be heckling?" "Will he make an Obama joke?" "Will she break the 20-minute barrier?") and the drunken revelry of the business meeting. Typically, I have a few simple rules for attending talks:

1. No conflicts with my regular 4 am - noon bedtime.
2. Attending a double-digit number of talks is just as good as attending no talks at all.
3. The title better have the word "complexity" in it.
But with my reporting duties as an excuse, I've resolved to discard my old inertial lifestyle and tax my brain with the new and unfamiliar. Facility location, electronic commerce, quantum entangled games – bring it on!

You could help me with this. If you're a theory junkie who couldn't make it here and there's some talk you really wanted to hear, let me know and I might be able to give you some idea – the barest idea, or at the very least a fictional idea, of what it was like to be there…

Saturday, October 25, 2008

Knuth Prize to Strassen

The ACM announced that Volker Strassen will receive the Knuth Prize for his algorithmic work, most notably on primality, matrix multiplication and integer multiplication. The Knuth Prize is given for major contributions over an extended period of time. The winner gives a Knuth Prize lecture and Strassen will give his at SODA.

Strassen's matrix multiplication algorithm is a simple but amazing recursion that beats the easy cubic bound. And the probabilistic algorithm for primality with Solovay drove complexity research into probabilistic computation and helped make modern cryptography possible.
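The recursion is short enough to sketch in full: split each n×n matrix (n a power of two) into four blocks and combine seven recursive products instead of the obvious eight, giving O(n^{log2 7}) ≈ O(n^{2.81}) time. A minimal pure-Python sketch (helper names are mine; a practical implementation would cut over to the naive product on small blocks):

```python
def add(X, Y): return [[a + b for a, b in zip(r, s)] for r, s in zip(X, Y)]
def sub(X, Y): return [[a - b for a, b in zip(r, s)] for r, s in zip(X, Y)]

def strassen(A, B):
    """Multiply two n x n matrices, n a power of two, with 7 recursive
    products per level (Strassen's recursion)."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    split = lambda M: ([r[:h] for r in M[:h]], [r[h:] for r in M[:h]],
                       [r[:h] for r in M[h:]], [r[h:] for r in M[h:]])
    A11, A12, A21, A22 = split(A)
    B11, B12, B21, B22 = split(B)
    M1 = strassen(add(A11, A22), add(B11, B22))
    M2 = strassen(add(A21, A22), B11)
    M3 = strassen(A11, sub(B12, B22))
    M4 = strassen(A22, sub(B21, B11))
    M5 = strassen(add(A11, A12), B22)
    M6 = strassen(sub(A21, A11), add(B11, B12))
    M7 = strassen(sub(A12, A22), add(B21, B22))
    C11 = add(sub(add(M1, M4), M5), M7)
    C12 = add(M3, M5)
    C21 = add(M2, M4)
    C22 = add(add(sub(M1, M2), M3), M6)
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot
```

The magic is entirely in the seven M products: additions cost only O(n²) per level, so the recurrence T(n) = 7T(n/2) + O(n²) solves to O(n^{log2 7}).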

Friday, October 24, 2008

Who to vote for?

I have not blogged much on the election because I doubt I can say anything you haven't already heard. I have gathered here reasons for/against the candidates that you might not have heard. They are NOT mine- they are things I've heard or read. If 1/2 of these comments are new to 1/2 of my readers, I'll be happy.

REASONS TO VOTE FOR OBAMA OR AGAINST MCCAIN
1. I've always wanted a Prez younger than me!
2. Joe Biden is one of the few non-millionaires in the Senate. Hence he understands the working man.
3. If Obama wins then Hillary probably won't be able to run for 8 years. (If McCain wins she can run in 4 years.)
4. In the first debate Obama showed that he was willing to agree with McCain on some things. McCain did not.
5. McCain does not know how to use email.
6. I have two bets that Obama will win. (One is with an African-American who thinks the country will never elect an African-American Prez. She hopes she is wrong and will gladly lose the bet.)

Monday, October 13, 2008

Thanks for better upper bound. Still want better lower bound- On Comm Comp of MAX

I posted on the max problem a while back.
Alice has x, an n-bit integer. Bob has y, an n-bit integer. They both want to know max(x,y). This can be done with a deterministic protocol using n + \sqrt{2n} + O(1) bits. Can they do better?
Anon Comment 5 on that post, and possibly a paper that was pointed to, led to a deterministic protocol that takes n + O(log n) bits. I am presenting it carefully in this post, hoping that someone will NOW prove a matching lower bound OR get an even better upper bound.

When I had the upper bound of n + \sqrt{2n} + O(1) I thought it was optimal. Now that I have the upper bound of n + O(log n) I'm not so sure. This is a good example of why lower bounds are so interesting- someone clever may come along with a better algorithm. How do you prove that they can't?
1. Alice has x, Bob has y. L is a parameter to be determined later. We will assume L divides n. Let m = n/L. Alice views x as x1...xm where each xi has length L. Bob views y as y1...ym where each yi has length L.
2. Alice sends Bob a string A of length L that is NOT any of x1,...,xm. Bob sends Alice a string B of length L that is NOT any of y1,...,ym. Note: for such strings to exist we will need 2^L > n/L.
3. So long as it looks like the strings are identical, they alternate sending the next block: Alice sends x1, Bob sends y2, Alice sends x3, Bob sends y4, and so on. (NOTE- nobody has to send a bit saying `we are tied'.) They keep doing this until one of them notices that the strings are NOT equal. If they never discover a difference then they have spent n + 2L bits and discovered x = y. They both know the max since x = y.
4. If Bob notices that xi > yi then he sends B0. Alice knows that B can't be one of Bob's blocks, so she knows why Bob sent this. She then sends the rest of her string, the part she has not already sent. They both know max(x,y) = x. This took n + 3L + O(1) bits.
5. If Bob notices that xi < yi then he sends B1. Alice knows that B can't be one of Bob's blocks, so she knows why Bob sent this. Bob then sends the rest of his string. They both know max(x,y) = y. This took n + 4L + O(1) bits.
What can L be? We needed 2^L > n/L. L = lg n works. So this gives a protocol using n + 4 lg n + O(1) bits. You can do slightly better by taking L = lg n - lg lg n.
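To make the accounting above concrete, here is a small simulation of the protocol that counts the bits exchanged. Variable names are mine, and the symmetric case where Alice spots the difference (and would signal with her string A) is folded into the same accounting:

```python
def run_protocol(x: str, y: str, L: int):
    """Simulate the n + O(log n) MAX protocol on n-bit strings x and y.
    Returns (max value, total bits exchanged)."""
    n = len(x)
    assert len(y) == n and n % L == 0
    m = n // L
    assert 2 ** L > m                    # needed for the sentinels to exist
    xs = [x[i*L:(i+1)*L] for i in range(m)]
    ys = [y[i*L:(i+1)*L] for i in range(m)]
    # Step 2: each party announces a length-L string unlike its own blocks.
    A = next(format(v, f'0{L}b') for v in range(2 ** L)
             if format(v, f'0{L}b') not in xs)
    B = next(format(v, f'0{L}b') for v in range(2 ** L)
             if format(v, f'0{L}b') not in ys)
    assert A not in xs and B not in ys
    bits = 2 * L                         # the two sentinels
    for i in range(m):
        sender = 'Alice' if i % 2 == 0 else 'Bob'   # step 3: alternate blocks
        bits += L
        if xs[i] != ys[i]:               # the receiver spots the difference
            bits += L + 1                # steps 4-5: sentinel plus one bit
            holder = 'Alice' if x > y else 'Bob'    # who holds the max
            if holder == sender:         # block i already sent: n+3L case
                bits += n - (i + 1) * L
            else:                        # must send from block i onward: n+4L case
                bits += n - i * L
            return max(int(x, 2), int(y, 2)), bits
    return int(x, 2), bits               # x == y: n + 2L bits in all
```

On equal strings this spends n + 2L bits; otherwise n + 3L + 1 or n + 4L + 1 bits, depending on whether the holder of the max had already sent the differing block.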

Friday, October 10, 2008

Patents and Copyrights vs Web Traffic (Guest Post by Amir Michail)

Software patents are intended to reward innovation, especially by startups with limited resources. But in today's Web 2.0 world of social sites such as Twitter, a service is more useful when it has more users. Even if the implementation contains something worthy of a patent, getting such a patent is not particularly important, because the service will generally have an overwhelming advantage over its clones in terms of number of users: new users will generally pick the service with the most users, resulting in much more web-traffic growth for the original service.

Similarly, one could argue that copyright is unnecessary on the web -- those who copy are unlikely to get anywhere near as much web traffic as the original source. For example, someone may copy posts verbatim from a popular blog, but it is unlikely that he/she will get anywhere near as much traffic. In particular, people have already linked to the original source, thus giving it higher PageRank.

Admittedly, there are occasions where things don't work out like this and where patents and/or copyright would have been helpful. To address those cases, we could have a law requiring search engines to provide, for each search result, a link back to the original source, if any. This would be done using a completely automated heuristic matching algorithm. That way, those who copy are even less likely to get anywhere near as much traffic as the original source.

Thursday, October 09, 2008

A Victim of the Internet

I come from perhaps the last generation of those who mainly got their news from newspapers. I could spend a morning enjoying the Sunday paper and I still like to pore over the paper at breakfast. But the newspaper industry has been hard hit by the Internet: there is much less advertising, particularly with classifieds going to places like Craigslist, and fewer and fewer people read the physical paper, with young people getting their news from the Internet and the myriad news (and comedy) channels on TV. Newspapers have had to shrink staff and reduce the quantity and quality of their product.

This crisis hit home last week with the redesign of the Chicago Tribune. I understand the need for a paper to freshen up its image and can look beyond the extra color and fluff. But it is what's missing that bothers me. Two sections (Business and Local News) have been merged into the first section, with business news more focused on personal finance than business. The Sunday Perspective (the Tribune's version of the Week in Review) has become a 3-page editorial/op-ed section. But most dramatic is a significant drop in the coverage of real national and international news.

The Tribune hit its nadir last Saturday. The day after the bailout passed the House and was signed by the president, the front page had only one story—on the dating habits of suburban teens. The bailout story was on page 16.

The Tribune picked an interesting time to have their redesign, a month before the elections and with both Chicago baseball teams about to start their (short-lived) march into the playoffs. Perhaps they felt few would unsubscribe during this time. I went ahead and subscribed to the New York Times (instead of just skimming it online) because I still crave real news with my coffee. I'll get both papers for a while and if the Tribune doesn't recover I may have to say goodbye to my old friend.

Wednesday, October 08, 2008

(This is not a partisan post. I am using a Republican example since it illustrates the point so well.)

In most (all?) serious fields of academia it is not acceptable to directly contradict yourself. Not so in the field of political punditry. Karl Rove (and others) recently did this, as the following clip illustrates. My question is, why do they get away with it? Some thoughts.
1. I only saw this on The Daily Show. No other media seems to have picked up on it. (Some blogs did.) Karl Rove will not be challenged on this. FOX NEWS will not call him into their office and ask him how he could say contrary things.
2. Rove is only talking to fellow conservatives. It's like a Cold Fusion workshop or a Parapsychology conference or an Intelligent Design journal or a Bush rally- your audience is pre-picked.
3. People think (perhaps correctly) that all pundits do it, so nobody is particularly criticized if they do it. (I've seen this reasoning used to defend negative ads.)
4. If Karl Rove were challenged he might say The Daily Show ran that piece because they are part of the liberal media elite.
5. Most people have not had a course in basic logic to tell them when two statements are clearly contradictory. (Though common sense should suffice.)
6. There is an end-justifies-the-means mentality. If Karl Rove helps get McCain elected, then that is all that matters.
7. Is there any serious field of academia where people can make contrary statements and not be called on it?

Tuesday, October 07, 2008

Expectation of Expectations

Can we quantitatively judge the effect of a debate on the election? One reasonable approach would look at prediction markets. If the value of the security for Obama to win were to greatly increase after the debate, in the absence of other significant factors, one could conclude that the debate went well for Obama, or at least that he did better than expected.

Intrade, the Irish real-money prediction markets site, went one further and set up a new security 2ND.DEBATE.OBAMA based on the outcome of tonight's presidential debate. In short,

This contract will be settled according to which Presidential Candidate receives the largest increase in value following the Presidential Debate on Tuesday October 7th. The pre-debate value will be calculated for 2008.PRES.OBAMA and 2008.PRES.McCAIN and then compared with the post-debate value for each of these contracts. This contract will expire at 100 if the value of 2008.PRES.OBAMA increases more than the value of 2008.PRES.McCAIN. This contract will expire at 0 if this is not the case.
A similar security for the VP debate paid off zero as there was a slight bump for McCain suggesting Palin did better than expected.

But there is a flaw in the logic for such a security—the market should already take into account the expected effect of the debate. In general, for any random variable A, the expectation of the expectation of A is the same as the expectation of A, i.e., E(E(A)) = E(A).

So, assuming that these securities represent real probabilities, should the value of 2ND.DEBATE.OBAMA before the debate be 50? Not quite. Suppose that there is a 75% chance that the Obama stock goes up 5 and a 25% chance it drops 15. Then the price of 2ND.DEBATE.OBAMA should be 75. The price of the security indicates who has the least chance of a major downside from the debate. But to think one could use this security to predict how well the debate will go makes little sense.
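The arithmetic in that example is easy to check mechanically. A tiny sketch using the numbers above (75% chance of +5, 25% chance of −15):

```python
# The post's toy scenario for the Obama security around the debate.
scenarios = [(0.75, +5), (0.25, -15)]   # (probability, change in price)

# E(E(A)) = E(A): the martingale property means no expected movement
# from the debate itself.
expected_change = sum(p * d for p, d in scenarios)
assert expected_change == 0.0

# Fair value of 2ND.DEBATE.OBAMA: it pays 100 exactly when the Obama
# price moves up more, i.e., with the probability of the upward scenario.
fair_value = 100 * sum(p for p, d in scenarios if d > 0)
print(fair_value)  # 75.0
```

The pre-debate price of the underlying security doesn't move in expectation, yet the derivative contract's fair value is 75, not 50: it measures the asymmetry of the possible moves, not their expected size.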

The press release suggests using this security for real-time debate tracking. The price of 2ND.DEBATE.OBAMA should gyrate dramatically on small changes of 2008.PRES.OBAMA. But one could just derive this directly from 2008.PRES.OBAMA, and since the latter security has much greater liquidity it will likely give better information.

Chris Masse suggested another explanation.

[Intrade] is trying to induce traders into trading more (so as to get more transaction fees), and this new contract offers more upsides for the traders who will bet right. One could double one's initial investment in a matter of 2 days.
In which case all the power to them.

Monday, October 06, 2008

The Unexpected Hanging Paradox: any serious math there?

(20th Maryland TCS Day Tue, Oct 14, 2008 9:30am - 4pm)

The liar paradox is very similar to the serious math of Gödel's Incompleteness Theorem. Berry's Paradox is very similar to the serious math of the noncomputability of the Kolmogorov function (which maps x to the length of the shortest description of x).

Is there any serious math that is similar to the Unexpected Hanging Paradox? Here is the paradox:

A judge tells a condemned man that he will hang on Monday, Tuesday, Wednesday, Thursday, or Friday at 1:00 PM, and that he will be surprised when it happens. His lawyer tells him not to worry: the hanging can't happen on Friday, since if he is still alive at 1:01 PM on Thursday he knows it must be Friday and hence won't be surprised. But then he can't be hanged on Thursday either: Friday is already ruled out, so if he is alive at 1:01 PM on Wednesday he will KNOW it must be Thursday, and again won't be surprised. Hence it can't be Thursday. Proceeding like this he reasons that he can't be hanged at all. On Thursday the judge comes with the noose, and the prisoner is surprised.
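The lawyer's backward induction can be written out as a tiny sketch (a toy rendering of the elimination argument, not a resolution of the paradox; the `surprise_days` helper is mine):

```python
def surprise_days(days):
    """Apply the lawyer's reasoning: repeatedly eliminate the latest
    remaining day, since if the prisoner survives to 1:01 PM the day
    before it, that day is the only candidate left and can't surprise."""
    remaining = list(days)
    eliminated = []
    while remaining:
        eliminated.append(remaining.pop())  # latest remaining day can't surprise
    return remaining, eliminated

remaining, order = surprise_days(["Mon", "Tue", "Wed", "Thu", "Fri"])
# remaining == []  -- the lawyer concludes no day can be a surprise
# order == ["Fri", "Thu", "Wed", "Tue", "Mon"]  -- days ruled out, latest first
```

The sketch terminates with every day eliminated, which is exactly the lawyer's (false) conclusion; the paradox is that this apparently valid argument fails in practice.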

I'm not asking for a resolution (it seems to be a genuinely hard problem for philosophy). I am asking: is there some serious math that is similar to this paradox?

Friday, October 03, 2008

A minor but perhaps indicative failure of technology

In February of 2007 I needed to check the Hallmark Channel website to see if they had made any program changes. The website did not list any program changes, but it did give a number to call: 1-888-390-7474. I called it and heard:
You have reached viewer services at the Hallmark Channel. In order for us to better serve our viewers, this mailbox is specially designed to answer most of your questions. If you are having trouble viewing the Hallmark Channel, press 1. For a list of program changes, press 2.
I thought GREAT! I pressed 2.
The following changes have been made to our program schedule as of Wednesday, June 29, 2005. Jane Doe: The Harder They Fall, originally scheduled to air Friday, July 1, 2005 at 9:00PM Eastern and Pacific, 8:00PM Central, and 7:00PM Mountain time, has been replaced by Matlock: The Power Brokers.
They had not updated their phone message in two years. I left a message ("If you would like to leave a comment, press 4") telling them of their error. I called again on April 1, May 1, and June 1 of 2007 to see if they had corrected it. They had not. I've also emailed them, to no effect. (It has been fixed since then.)

How could a phone line be this out of date? How could the website maintainers not know this?
1. They got careless and are not suffering for it, so they don't know they got careless. I'm not going to stop watching their shows just because their website gives a phone number that is two years out of date.
2. Lack of coordination: the right hand does not know that the left hand stopped working two years ago.
For this particular issue, the breakdown of technology is not important. But it troubles me that in other domains it might be.

Thursday, October 02, 2008

Go Sox

After beating three different teams in three days to make the playoffs, my White Sox join the Cubs in an historic time in Chicago: The first time both Chicago baseball teams are in the postseason since that great 1906 World Series. The Sox open the playoffs today against the surprising AL East winner Tampa Bay.

With the Boston Red Sox, that makes three teams for which I have held season tickets (I lived within walking distance of Wrigley for a year, and I organized season tickets for the theory group at MIT as a student there). Add in Milwaukee, a team I have grown to like: they play in a new stadium that is easier for me to get to than either of the Chicago ballparks, and you can't beat the sausage race. And four more teams that hopefully will all be eliminated in the first round. This is just the ultimate playoff year to keep me distracted from various fiscal crises and political activities. I might not get much research done in October.

The eyes of America look at the Cubs, who last won a World Series in 1908, one hundred years ago. But although I now teach north of the city, my heart remains with the Southsiders. The best outcome: the White Sox win the World Series in five games. Why five and not the full seven? Because it is so much more fun to see the Cubs choke at home.

Wednesday, October 01, 2008

When does Alice deserve to be a co-author?

When has someone done enough work to deserve a co-authorship? Here are a few scenarios I have seen. I will say what did happen, not what should have happened.
1. Alice comes up with the problem and makes some obvious observations about it, she shows it to others, who solve it. She writes it all up. (Alice got the co-authorship.)
2. Alice comes up with the problem and makes some obvious observations about it, she shows it to others, who solve it. She does not write it up but does some proofreading. (I've seen this twice: once she got the co-authorship, once not.)
3. Bob shows Alice a problem, and Alice makes the one observation that turns out to be the key. She has put only 2 minutes of work into the paper. (Alice got the co-authorship.)
4. Bob has already written up the 50 page paper and it has covered all of the cases except for one. Alice solves the one case left. It was not a hard case at all. (Alice did not get the co-authorship.)
5. Bob and Carol solve the problem together. Alice is in the room and does not contribute anything, but her tenure case is coming up soon and she is a bit short on papers. (Alice got the co-authorship. This is a rare scenario.)
6. Alice proofreads a paper, finds a serious flaw in it, and fixes that flaw. It was a rather subtle flaw and needed a very clever argument to fix. (Alice got the co-authorship.)
7. Alice reads Bob's paper and comes up with a generalization that Bob does not care about, but wouldn't really be worth a paper on its own. (The paper did not have the generalization and Alice did not get the co-authorship.)
8. Alice makes a key observation that makes all of the proofs shorter. The conference version has her as a co-author. However, while writing up the journal version it is noticed that Alice's contribution was not correct. It is removed, and the journal version is pretty much what it would have been had Alice not been involved at all. (Alice asked to be taken off the journal version. The other authors were surprised by this and did not mind keeping her as an author; they pointed out that there were other people on the paper who were even less involved. But at her insistence they took her off.)