Tuesday, April 06, 2010

Finding the Right Model

Last fall I wrote about the different emphases on models and proofs in the Econ and CS theory communities. Today I'll look at the purpose of a model and what makes a good one.

An economist once explained to me the difficulty of coming up with a good model. A good example is how people prefer one choice to another. In standard utility theory, people assign a value (or utility) to each state of the world and try to maximize their expected utility. But then one notices that people tend to give too much weight to low-probability events, as in the Allais paradox. So one can look at more general models of preferences, such as prospect theory. But these general models might be too general: they have little explanatory value and in particular may not allow us to predict future behavior and outcomes, the ultimate purpose of an economic model.
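To make the tension concrete, here is a minimal sketch in Python contrasting expected utility with a prospect-theory-style weighted value on the standard Allais lotteries. The Tversky-Kahneman weighting function (with gamma = 0.61) and the strongly concave utility v(x) = x^0.25 are illustrative choices of mine, not fitted parameters.

```python
# Allais lotteries, amounts in millions of dollars.
# Experiment 1: A = $1M for sure; B = 89% $1M, 10% $5M, 1% nothing.
# Experiment 2: C = 11% $1M; D = 10% $5M.
A = [(1.00, 1)]
B = [(0.89, 1), (0.10, 5), (0.01, 0)]
C = [(0.11, 1), (0.89, 0)]
D = [(0.10, 5), (0.90, 0)]

def v(x):
    """Illustrative concave utility (an assumption, not a fitted form)."""
    return x ** 0.25

def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting: overweights small p,
    underweights moderate-to-large p."""
    if p in (0.0, 1.0):
        return p
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def expected_utility(lottery):
    return sum(p * v(x) for p, x in lottery)

def weighted_value(lottery):
    return sum(tk_weight(p) * v(x) for p, x in lottery)

# Expected utility ranks B over A and D over C; no single utility
# function can rank A over B while also ranking D over C.
print(expected_utility(A), expected_utility(B))  # 1.000 vs 1.040
print(expected_utility(C), expected_utility(D))  # 0.110 vs 0.150

# The weighted values reproduce the choices people actually make:
# A over B (the certainty effect) together with D over C.
print(weighted_value(A), weighted_value(B))      # 1.000 vs 0.977
print(weighted_value(C), weighted_value(D))      # 0.195 vs 0.279
```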

In computational complexity, we don't use our models to predict future events. Our models of computation are meant more for comparing different types of resources: time versus space, quantum versus classical, parallel versus serial and, of course, nondeterministic versus deterministic. What makes a good model for us is robustness (small changes to the definition don't change the problems it can solve), nice closure properties and some connection to reality. Sure, the class P, polynomial time, is not equal to efficient computation if the constants or exponents of the running time are large, but it's a reasonable approximation.
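As a back-of-the-envelope illustration of both halves of that claim, here is a small Python sketch; the assumed speed of 10^9 operations per second is a round number of my choosing.

```python
# Why P is "a reasonable approximation" of efficiency: a modest
# polynomial stays feasible as n grows while an exponential does not,
# yet P also contains hopeless members like n^100 time.
OPS_PER_SEC = 1e9  # assumed machine speed

for n in (10, 50, 100, 1000):
    poly = n ** 3   # a typical polynomial bound
    expo = 2 ** n   # an exponential bound
    print(f"n={n:>4}: n^3 needs {poly / OPS_PER_SEC:.1e}s, "
          f"2^n needs {float(expo) / OPS_PER_SEC:.1e}s")

# The large-exponent caveat: an n^100-time algorithm is in P, yet at
# n = 2 it already takes 2^100 steps, about 4e13 years at this speed.
print(f"n^100 at n=2: {2 ** 100 / OPS_PER_SEC:.1e} seconds")
```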

Part of my research agenda over the past few years has been trying to find computational models for agents in economic theory. Many of the ideas of our community, such as the connections between randomness and unpredictability, can play an important role in economic theory. But finding the right models is the biggest challenge, since even the purpose of a model differs between our communities. Communication is the key: working directly with people in the other community. One of the great advantages of Northwestern is having a large, strong microeconomics group, many of whom understand that computation is an important resource and are willing to at least listen to us computer scientists.

We also need to keep an open mind: the goal should be making the best connections between communities, not worrying so much about results that impress our direct peers. Having tenure definitely helps.

5 comments:

  1. The models I've seen have been coarse-grained and seem to simulate what people already know. I would like to see fine-grained simulations written in Erlang with lots of message passing that reveal subtleties and make us think, the way ant colonies make us ponder the mechanisms of intelligent behavior. I wonder whether Radhika Nagpal's work on bio-inspired multi-agent systems could be applied to economics.

  2. Sure, the class P, polynomial time, is not equal to efficient computation if the constants or exponents of the running time are large, but it's a reasonable approximation.

    It isn't a reasonable approximation. If we weren't used to it and someone were to suggest it today, the paper would be laughed at and rejected outright.

    Hypothetical PC meeting:

    --Hey, look at this guy, he's proposing that an O(n^100) algorithm is efficient.

    --It wouldn't even work for n = 2!

    --It also ignores constants, as if an algorithm with time complexity O(k*n^3) were worse than one with O(2^k*n^2) complexity. Hasn't this bozo read the papers in parameterized algorithms showing that the former is much preferable? (See the sketch at the end of this comment.)

    --Sure, P, i.e. _polynomial_ time, is closed under _polynomial_ reductions, but that's obvious; how could it be otherwise?

    --Do we even know of a problem in, say, Theta(n^5) that shows such a class is worth studying? Otherwise it's a rigged class with no natural applications.
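
    As a quick numeric check of the parameterized point, here's a small Python sketch comparing the two bounds; the choices n = 1000 and these values of k are arbitrary illustrations.

    ```python
    n = 1_000
    for k in (5, 20, 40, 80):
        fpt_poly = k * n ** 3      # the O(k * n^3)-style bound
        fpt_exp = 2 ** k * n ** 2  # the O(2^k * n^2)-style bound
        better = "k*n^3" if fpt_poly < fpt_exp else "2^k*n^2"
        print(f"k={k:>2}: k*n^3 = {fpt_poly:.1e}, "
              f"2^k*n^2 = {fpt_exp:.1e} -> {better} smaller")
    # For small k the 2^k bound wins, but it blows up quickly:
    # by k = 20 the k*n^3 bound is already far preferable.
    ```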

  3. To the previous anonymous:

    We can give a similar argument that real numbers do not make sense because they are not related to reality. Similarly, very big natural numbers are not realistic. You can go on to criticize almost all of mathematics this way. Who cares whether an unsolved problem like RH fails for numbers we cannot even talk about? The same goes for crypto.

    We idealize reality because then we can make statements about it. Every model is useful only in a restricted domain. Newtonian physics is not useful for subatomic particles, but restricted to day-to-day life, people use it to build almost everything we use today, from bridges to iPhones. It is the responsibility of the engineer to know the conditions under which a result about an abstract model can be applied.

    P is useful because it gives an idealization that we can work with, and as a result we find new questions. Probabilistic algorithms and the like are partially the result of having this abstract model. When you want to use an algorithm in practice, it is your responsibility to check the specific constants of your algorithm and your machine, the range and restrictions of the inputs it will be given, and so on.

    I hope I made the point I was trying to make.

  4. To anonymous 2 (or anyone else who doesn't like P as a model of efficient computation): can you suggest a better model?

  5. To anonymous 2 (or anyone else who doesn't like P as a model of efficient computation): can you suggest a better model?

    For sure; in fact, there are some already out there. Take, for example, the Õ(n^k) hierarchy, which ignores polylogarithmic factors. Then one can define efficient computation as Õ(n^2). Reductions are also more robust under this model.
