Tuesday, May 02, 2006

New Priorities for Computability Theory

Bob Soare, who wrote one of the great textbooks on recursion theory and then almost single-handedly changed the name of the field to computability theory, teaches an intense two-quarter class on the topic every other year in Chicago. To my surprise, halfway through the second quarter (15 weeks into computability theory), he is only now proving the solution of "Post's Problem", the existence of incomplete computably enumerable degrees, the result that excited Gödel in his letter.

When I sat in on Soare's class in the early 90's, by this point in the course he had already covered much more complicated finite injury arguments and was starting the infinite injury constructions like the Sacks Density Theorem: given r.e. sets A and B such that A is reducible to B and B is not reducible to A, there is an r.e. set C that lies strictly between them.
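In standard notation (my rendering, not the course's), the theorem reads:

```latex
\text{Sacks Density Theorem: if } A \text{ and } B \text{ are c.e. sets with } A <_T B,
\text{ then there is a c.e. set } C \text{ with } A <_T C <_T B.
```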

I asked Soare about this last week. He isn't dumbing down his class; rather, he's acknowledging a change in direction in the field, one more along the lines of looking at the complexity of reals (infinite sequences of bits), often by examining those defined by infinite branches of computable trees.

For example, many computability theorists today are studying notions of "random reals", infinite sequences that share some properties of randomly chosen numbers. They have shown neat connections to Kolmogorov complexity and to Chaitin's Ω. For any universal prefix-free machine, Chaitin's Ω is a computably enumerable random real, and Kucera and Slaman showed the surprising result that the converse holds as well: every computably enumerable random real arises as such an Ω.
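For reference, the standard definition of Chaitin's Ω relative to a universal prefix-free machine U (a textbook formulation, not specific to this post) is its halting probability:

```latex
\Omega_U = \sum_{p \,:\, U(p)\downarrow} 2^{-|p|}
```

Here "random" means Martin-Löf random, applied to the binary expansion of the real.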

One used to measure a recursion theorist by the difficulty of their constructions; now we see more of a focus on the beauty of the theorems and their proofs.

Soare is working on a new version of his textbook that will differ in a couple of ways. He is changing terminology (recursive and r.e. become computable and c.e.), but more importantly, he is changing the focus to more strongly develop the theory that drives computability today. A good lesson: fields change over time, and it is better to acknowledge and embrace those changes than to fight them.

8 comments:

  1. Thanks for the very informative post!

    Think complexity will learn from the 'new wave' of computability as it did from the first? (We do have complexity over the reals..)

  2. Actually, we still haven't started Post's Problem yet. Hopefully, we'll get to that before the *third* quarter.

    Other terminologies that we would like to stamp out include "weak truth-table reductions" (a.k.a. bounded Turing reductions) and "hyperimmune-free degrees" (now 0-dominated).

  3. Lance: Is it not Chaitin instead of Chatin?

  4. Sacks, not Sack's

  5. Fixed that too. I could really use a spell check for CS/Math people.

  6. A good lesson: Fields change over time and better to acknowledge and embrace those changes than to fight them.

Only if the changes are of high priority and serious. Here, as you suggested, that is the case, since a lot of work is coming along to support the theorists. :)

  7. Teutsch:

    Who calls a weak truth-table (wtt) reduction a "bounded Turing reduction"? I've only seen wtt in recent papers. Personally, I think "bounded Turing reduction" might be misleading because it doesn't say what you are bounding. A wtt reduction means the query use of the machine is bounded by a computable function, whereas a truth-table reduction means the time/space use of the machine is bounded by a computable function.
