I'm honored and humbled that Lance Fortnow decided to entrust his weblog to me for the week. Lance's only request was that I obey a few simple ground rules: keep it clean, stay on topic, don't betray confidences, and absolutely no politics.
To demonstrate my commitment to Lance's ground rules, I'd like in this first post to address the Israeli-Palestinian conflict. No, not the conflict itself, but rather a meta-question that it raises: how can so many smart, educated, well-meaning people disagree so vehemently about the most basic facts of an issue? How can they end every conversation not closer together but farther apart, "agreeing to disagree"? We can ask the same question about free markets versus socialism, the interpretation of quantum mechanics, or other issues on which two or more sides are certain of their own arguments.
A 1976 theorem of Robert Aumann has been interpreted as showing that two honest, rational people (who believe each other to be honest and rational) should never disagree about anything. More precisely, let Alice and Bob be Bayesians who assign the same prior probabilities to all random variables (admittedly a big assumption), but who have since gained different knowledge about those variables. Let p be Alice's posterior probability that (say) it will rain tomorrow, conditioned on everything she knows, and let q be Bob's posterior probability. The theorem says that if p and q are common knowledge to Alice and Bob -- meaning that Alice knows q, Bob knows p, Alice knows that Bob knows p, Bob knows that Alice knows q, and so on ad infinitum -- then p = q.
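For readers who want the theorem in symbols, here is a sketch of the usual formalization (the partition notation is mine, introduced for this post, not quoted from Aumann):

```latex
% Standard formalization (notation introduced here):
% \Omega = a finite state space with common prior \mu,
% \Pi_A(\omega), \Pi_B(\omega) = the cells of Alice's and Bob's
%   information partitions containing the true state \omega,
% E = the event in question (e.g., rain tomorrow).
p = \mu\bigl(E \mid \Pi_A(\omega)\bigr), \qquad
q = \mu\bigl(E \mid \Pi_B(\omega)\bigr).
% Aumann (1976): if the values of p and q are common knowledge
% at \omega, then p = q.
```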
The key point here is that "everything Alice and Bob know" includes their knowledge of p and q. So for Alice and Bob to reach consensus, it isn't enough for them just to announce p and q -- for then p and q might change, so they'd need to announce the new values, and so on iteratively. However, so long as the whole probability space is finite, this iterative process will end eventually with Alice and Bob agreeing on the probability that it will rain tomorrow.
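To make the iteration concrete, here's a minimal simulation of this back-and-forth -- the announce-and-refine protocol studied by Geanakoplos and Polemarchakis. Every specific below (the nine-state space, the uniform prior, the partitions, the "rain" event) is an invented example, not anything from Aumann's paper:

```python
from fractions import Fraction

# Finite state space with a common prior (uniform, for simplicity).
STATES = range(9)
prior = {w: Fraction(1, 9) for w in STATES}

# The event of interest, e.g. "it rains tomorrow" (made up).
RAIN = {0, 1, 4, 5, 7}

# Information partitions: which states each agent can distinguish.
alice_partition = [{0, 1, 2}, {3, 4, 5}, {6, 7, 8}]
bob_partition = [{0, 3, 6}, {1, 4, 7}, {2, 5, 8}]

def cell_of(partition, w):
    """Return the partition cell containing state w."""
    return next(c for c in partition if w in c)

def posterior(event, info):
    """Pr[event | info] under the common prior."""
    total = sum(prior[w] for w in info)
    return sum(prior[w] for w in info & event) / total

def refine(partition, announced):
    """Split each cell by a publicly announced event."""
    return [piece for c in partition
            for piece in (c & announced, c - announced) if piece]

def agree(true_state, event, part_a, part_b, max_rounds=20):
    """Alternate public posterior announcements until they coincide."""
    for rnd in range(1, max_rounds + 1):
        # Alice announces p; the announcement itself reveals the event
        # "Alice's posterior equals p", so BOTH agents refine by it.
        p = posterior(event, cell_of(part_a, true_state))
        said = set().union(*(c for c in part_a
                             if posterior(event, c) == p))
        part_a, part_b = refine(part_a, said), refine(part_b, said)

        # Bob replies with his (now possibly updated) posterior.
        q = posterior(event, cell_of(part_b, true_state))
        said = set().union(*(c for c in part_b
                             if posterior(event, c) == q))
        part_a, part_b = refine(part_a, said), refine(part_b, said)

        print(f"round {rnd}: Alice says {p}, Bob says {q}")
        if p == q:
            return p

agree(true_state=4, event=RAIN,
      part_a=alice_partition, part_b=bob_partition)
```

On this toy instance the announcements coincide after two rounds; the role of the finiteness assumption is that the refinements can't go on forever, so some round must be the last.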
Great, you say, but what does this have to do with complexity? Well, if Alice and Bob exchanged everything they knew, then obviously they'd agree on the chance of rain. So the crucial question for me -- and one that seems never to have been addressed in the large economics and philosophy literature on this subject -- is, how many iterations are needed until convergence? Or, if we let Alice and Bob use an arbitrary protocol, how many bits must they exchange before reaching consensus or near-consensus? In communication complexity language: instead of evaluating a function f(x,y), where Alice holds the n-bit string x and Bob holds the n-bit string y, we now merely want Alice's expectation of f(x,y) (conditioned on her input x, which gives her only partial information about y) to equal, or nearly equal, Bob's expectation, given a known prior distribution over (x,y) pairs.
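As a toy illustration of this reformulation, here's what the zero-communication starting point looks like. Every specific choice below -- the inner-product function, the correlated prior -- is mine, made up for the example: Alice and Bob each compute an expectation of f from the known joint prior plus their own input, and in general those expectations differ:

```python
import itertools
import random

n = 3
strings = list(itertools.product((0, 1), repeat=n))

def f(x, y):
    """Example function: inner product of x and y, mod 2."""
    return sum(a * b for a, b in zip(x, y)) % 2

# An invented correlated prior: weight each (x, y) pair by one plus
# the number of positions where x and y agree.
weight = {(x, y): 1 + sum(a == b for a, b in zip(x, y))
          for x in strings for y in strings}
total = sum(weight.values())
prior = {pair: w / total for pair, w in weight.items()}

def alice_expectation(x):
    """Alice's expectation of f(x, Y) given only her input x."""
    mass = sum(prior[(x, y)] for y in strings)
    return sum(prior[(x, y)] * f(x, y) for y in strings) / mass

def bob_expectation(y):
    """Bob's expectation of f(X, y) given only his input y."""
    mass = sum(prior[(x, y)] for x in strings)
    return sum(prior[(x, y)] * f(x, y) for x in strings) / mass

# Draw a pair from the prior and compare the two expectations.
pairs = list(prior)
x, y = random.choices(pairs, weights=[prior[p] for p in pairs])[0]
print(f"x = {x}, y = {y}")
print(f"Alice's expectation of f: {alice_expectation(x):.3f}")
print(f"Bob's expectation of f:   {bob_expectation(y):.3f}")
```

Closing the gap between those two numbers with as few exchanged bits as possible is exactly the question being posed.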
For some f, x, y, and distribution over (x,y) pairs, can we show that (say) n(1-o(1)) bits of communication are needed? Such a lower bound could provide the first compelling explanation for why honest, rational friends can disagree: because they're not Siamese twins, and they don't have their entire lives to talk to each other.