Thursday, June 30, 2005

Research Directions for Theory

Sanjeev Arora asked the "theory blogs" to take up the issue of finding a few new challenges of theory that one can sell to nonspecialists and congressional aides. SIGACT has set up an outreach committee led by Richard Karp that will prepare a list of research directions for the theory community and they want your input. More from Suresh.

I feel a little déjà vu here. Ten years ago Karp led an NSF-sponsored group with the mission of suggesting where the NSF theory group should focus its funding. The group held a panel discussion at the end of the 1995 STOC conference. Representatives from different subfields each gave a short talk on the importance of their fields. After these presentations the panel opened the discussion to the audience.

Now instead of a physical panel discussion, Arora asks for a virtual one in the hope of drawing from a larger base of people. Feel free to leave your ideas as comments on this post, on the committee page of the Theory Matters Wiki (edit password: tcs), or just by email to one of the committee members. Not everyone was happy with the last Karp report, so better to get your comments in now than to complain afterwards.

Wednesday, June 29, 2005

FOCS Accepts

The list of accepted papers for the upcoming FOCS Conference has been posted (via Suresh via Bacon). Given recent comments, the Internet has really raised expectations about how quickly we get to see the list. As I write this the list still has a mysterious "One extra paper" at the end.

In complexity, two of the unique games papers I mentioned on Monday will be at FOCS. Some other interesting-looking complexity papers:

Looks like the big area winners at FOCS are upper and lower bounds on approximation, electronic commerce and cryptography.

Tuesday, June 28, 2005

Defining Theory

Theory Matters points to a definition of Theoretical Computer Science given on the SIGACT Home Page.
The field of theoretical computer science is interpreted broadly so as to include algorithms, data structures, complexity theory, distributed computation, parallel computation, VLSI, machine learning, computational biology, computational geometry, information theory, cryptography, quantum computation, computational number theory and algebra, program semantics and verification, automata theory, and the study of randomness. Work in this field is often distinguished by its emphasis on mathematical technique and rigor.
This definition first appeared in the December 1997 SIGACT News with a slightly different order and missing quantum computation and automata theory.

I dislike these "laundry list" definitions. They tend both to overcompensate by listing too many areas (e.g. "the study of randomness" is really subsumed by the other areas) and to fail to capture all the areas we study (e.g. electronic commerce). Such lists cannot remain stable over time and need constant updating, and we find it hard to delist any areas even when we should.

Most importantly laundry lists don't capture the spirit of a field. If we really wish to sell our field properly we need to start with a clear definition. Here is a suggestion.

Theoretical Computer Science is the formal analysis of efficient computation.
Simplicity should beat complexity every time.

Monday, June 27, 2005

The Unique Games Conjecture

A unique game consists of an undirected connected graph G=(V,E), a color set C, and for each edge {i,j} with i<j a permutation π_{i,j}: C → C. A coloring of the graph c: V → C fulfills an edge {i,j} if π_{i,j}(c(i)) = c(j).

There is also a linear version of unique games where C is a finite field and for each edge {i,j}, π_{i,j}(x) = a_{i,j}x + b_{i,j} with a_{i,j} and b_{i,j} in C and a_{i,j} ≠ 0.

If a coloring fulfills all the edges then knowing the color of one node uniquely determines all of the other colors. One can efficiently determine whether such a coloring exists by trying each possible color at one node and seeing if any of the resulting colorings fulfills all the edges.
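The brute-force check just described is easy to make concrete. Here is a minimal sketch, assuming the game is given as a node count, a color list, and a dict mapping each edge (i, j) with i < j to its permutation (these representation choices are mine, for illustration only):

```python
from collections import deque

def fully_satisfiable(n, colors, perms):
    """Decide whether a unique game has a coloring fulfilling every edge.

    n      : number of nodes 0..n-1 (the graph must be connected)
    colors : the color set C as a list
    perms  : dict mapping an edge (i, j), i < j, to a dict pi where the
             edge is fulfilled exactly when pi[c(i)] == c(j)
    """
    # Adjacency list: for edge (i, j) store pi for the direction i -> j
    # and its inverse for j -> i, so constraints propagate either way.
    adj = {v: [] for v in range(n)}
    for (i, j), pi in perms.items():
        adj[i].append((j, pi))
        adj[j].append((i, {b: a for a, b in pi.items()}))

    # Since the graph is connected, fixing the color of node 0 forces
    # every other node's color; try each of the |C| choices.
    for c0 in colors:
        coloring = {0: c0}
        queue = deque([0])
        consistent = True
        while queue and consistent:
            u = queue.popleft()
            for v, pi in adj[u]:
                forced = pi[coloring[u]]
                if v not in coloring:
                    coloring[v] = forced
                    queue.append(v)
                elif coloring[v] != forced:
                    consistent = False  # two paths force different colors
                    break
        if consistent and len(coloring) == n:
            return True
    return False
```

For the approximate version, where one only asks to fulfill a large fraction of the edges, this propagation trick gives nothing, and that is exactly where the problem becomes hard.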

However it might be difficult to determine whether one can fulfill some large fraction of the edges. Subhash Khot defines the unique games conjecture.

For every constant δ>0 there is a fixed finite color class C such that it is NP-hard to distinguish the following two cases for any unique game with color class C.
  1. There is some coloring that fulfills at least 1-δ-fraction of the edges.
  2. Every coloring fulfills at most a δ-fraction of the edges.
Some results on unique games:
  • Khot, Kindler, Mossel and O'Donnell reduce unique games to approximating Maximum Cut better than the best known approximation due to Goemans and Williamson (about 0.878567). Khot et al. also required a "Majority is Stablest" conjecture which was later proved by Mossel, O'Donnell and Oleszkiewicz. Thus under the unique games conjecture any improvement in approximating Max Cut would imply P=NP.
  • Similar results show that, given the unique games conjecture (and P≠NP), it is hard to approximate Vertex Cover within 2-ε (Khot-Regev) and Sparsest Cut within any constant (Chawla-Krauthgamer-Kumar-Rabani-Sivakumar).
  • Luca Trevisan shows that we can solve unique games in polynomial time if we allow δ=o(1/log n) instead of a constant.
  • In an upcoming FOCS paper, Khot and Nisheeth Vishnoi use unique games to (unconditionally) disprove the conjecture that negative type metrics (metrics that are squares of Euclidean metrics) embed into L1 with constant distortion. They also show a superconstant lower bound on the integrality ratio for Semi-Definite Programming relaxations for Sparsest Cut.
The introduction of Trevisan's paper gives a nice overview of unique games.

Update 6/28: The hardness of approximating Sparsest Cut given the unique games conjecture is also in the Khot-Vishnoi paper, done independently of Chawla-Krauthgamer-Kumar-Rabani-Sivakumar. Also Khot has a recent survey in SIGACT News on PCP-based hardness results that has a section on unique games.

Friday, June 24, 2005

The End of an Era?

On May 13 in the US, the Star Trek franchise (temporarily?) ends 18 straight years of first-run episodes. Bill Gasarch comments.

About a month ago was the final episode of ENTERPRISE. I just saw it last week. I assume that NONE of the readers are saying "Gee, how did he do that!"

At one time many computer scientists were also science fiction fans.

At one time both were small communities (with enrollment dropping computer scientists may return to being a small community).

At one time you couldn't time-shift how you watched TV so people would talk about the same show the next day.

At one time there was not so much Science Fiction out there so all the fans gravitated towards the same materials.

So in the past there was much more cohesiveness to the CS/Sci-Fi community.

There is no longer.

Is this good or bad?

I think it's good to NOT be so homogeneous. New ideas come from all kinds of places. And you don't want people who are not Sci-Fi fans avoiding Comp Sci because they think they have to be fans.

Thursday, June 23, 2005

Herbie: AI Marvel

Rarely do people notice the true technological breakthroughs in science fiction and fantasy movies. Roger Ebert gets it in his review of the rather silly Herbie: Fully Loaded, a new entry in the series about the mischievous car.
I see I have subconsciously stopped calling Herbie "it" and am now calling Herbie "he." Maybe I've answered my own question. If Herbie is alive, or able to seem alive, isn't this an astonishing breakthrough in the realm of Artificial Intelligence? That's if computer scientists, working secretly, programmed Herbie to act the way he does. On the other hand, if Herbie just sort of became Herbie on his own, then that would be the best argument yet for Intelligent Design…

The real story is Herbie's intelligence. The car seems to be self-aware, able to make decisions on its own, and able to communicate with Maggie on an emotional level, and sometimes with pantomime or by example. Why then is everyone, including Lohan, so fixated on how fast the car can go? The car could be up on blocks and be just as astonishing.

It goes to show you how we in the press so often miss the big stories that are right under our noses. There is a famous journalistic legend about the time a young reporter covered the Johnstown flood of 1889. The kid wrote: "God sat on a hillside overlooking Johnstown today and looked at the destruction He had wrought." His editor cabled back: "Forget flood. Interview God."

Wednesday, June 22, 2005

Communicating Open Problems

A famous complexity theorist once said "The hardest part of being an advisor is not working on your student's problems." Good open problems are quite rare and one is often torn between the desire to see a problem resolved as quickly as possible versus giving people a fair chance to work on them. So I put together a set of guidelines for distributing problems.
  1. If you ask someone about what problems they are working on you shouldn't start working on those problems or give them to others without permission. When this rule is violated, even students in the same department are sometimes afraid to discuss their own research with each other.
  2. If someone discusses a problem with you, you shouldn't mention the problem to others without permission. Asking a question like "Do you mind if I tell this problem to my students?" is sufficient.
  3. If you are an advisor and you give a problem to a student you shouldn't work on the problem yourself or give it out to other students without the first student's permission.
  4. Outside the advisor-student relationship the above rule does not apply. You can work on a problem even after giving it to someone else, and distribute it as you wish, unless you've made a prior agreement.
  5. Once you make a problem public (in a talk, in a paper or on the web) the problem is fair game to all.
I realize I have not always followed all of these rules myself and I apologize. One could argue that science advances best by making all problems as widely available as possible, but following these guidelines will open up communication, as researchers will have less need to hide what they work on.

Monday, June 20, 2005

Favorite Theorems: The Polynomial-Time Hierarchy

May Edition

The Equivalence Problem for Regular Expressions with Squaring Requires Exponential Space by Albert Meyer and Larry Stockmeyer, FOCS (then called SWAT) 1972.

The title result of this paper gave an early example of a natural problem that provably does not have an efficient algorithm. But it is the second half of the paper that developed one of the most important concepts in computational complexity.

The class NP consists of those problems with efficiently verifiable solutions. Similar to the arithmetic hierarchy, Meyer and Stockmeyer define a hierarchy above NP inductively as follows:

  • Σ_1^p = NP
  • Σ_{k+1}^p = NP^{Σ_k^p}, where NP^A represents the class of problems solvable in nondeterministic polynomial time with access to an oracle for solving problems in A.
The union of all the Σ_k^p forms the polynomial-time hierarchy. The Meyer-Stockmeyer paper and follow-up papers by Stockmeyer and Celia Wrathall showed many interesting properties of the hierarchy, including:
  • Alternation characterizations of the hierarchy using quantifiers and second-order logic.
  • If for any k, Σ_k^p = Σ_{k+1}^p, then Σ_j^p = Σ_k^p for all j ≥ k. If this happens for some k we say the polynomial-time hierarchy collapses; otherwise we say the hierarchy is infinite.
  • PSPACE contains the polynomial-time hierarchy, and if the converse containment also holds then the hierarchy collapses.
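To make the alternation characterization in the first property concrete: the Stockmeyer-Wrathall result says a language L is in Σ_k^p exactly when membership can be expressed by k alternating polynomially bounded quantifiers in front of a polynomial-time predicate R,

```latex
x \in L \iff \exists y_1\,\forall y_2\,\exists y_3 \cdots Q_k y_k\; R(x, y_1, \ldots, y_k),
```

where each |y_i| ≤ p(|x|) for a fixed polynomial p and the last quantifier Q_k is ∃ for k odd and ∀ for k even. Taking k=1 recovers the usual certificate definition of NP.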
The polynomial-time hierarchy has had a major impact on computational complexity in many areas, including
  • classifying problems, like succinct set cover and VC dimension, that NP does not capture,
  • giving evidence, via the conjecture that the hierarchy is infinite, for statements like NP not having small circuits and graph isomorphism not being NP-complete,
  • motivating attempts to show the polynomial-time hierarchy is infinite in relativized worlds, which led to major results on circuit lower bounds,
  • leading to the concept of alternation, which gave new characterizations of time and space-bounded classes, and
  • inspiring variations on the hierarchy that led to interactive proof systems, which themselves led to probabilistically checkable proofs and hardness of approximation results.
Much more in my recent paper on Larry Stockmeyer.

Saturday, June 18, 2005

An Eulerian Tour

Chris Barwick (aka optionsScalper) is a fan of Euler and traced my academic lineage back to Euler and Gauss through many other great mathematicians. Of course the same legacy applies to the many theoretical computer scientists who descend from Manuel Blum.
  • Lance Jeremy Fortnow was a student of Sipser
    Awarded: 1989. Dissertation: Complexity-Theoretic Aspects of Interactive Proof Systems
  • Michael Fredric Sipser was a student of Blum (1938-)
    Awarded: 1980. Dissertation: Nondeterminism and the Size of Two-Way Finite Automata
  • Manuel Blum was a student of Minsky (1927-)
    Awarded: 1964. Dissertation: A Machine-Independent Theory of the Complexity of Recursive Functions
  • Marvin Lee Minsky was a student of Tucker
    Awarded: 1954. Dissertation: Theory of Neural-Analog Reinforcement Systems and Its Application to the Brain Model Problem
  • Albert William Tucker was a student of Lefschetz (1884-1972)
    Awarded: 1932. Dissertation: An Abstract Approach to Manifolds
  • Solomon Lefschetz was a student of Story
    Awarded: 1911. Dissertation: On the Existence of Loci with Given Singularities
  • William Edward Story was a student of Carl Gottfried Neumann (1832-1925) and Klein (1849-1925)
    Awarded: 1875. Dissertation: On the Algebraic Relations Existing Between the Polars of a Binary Quantic
  • Felix Christian Klein was a student of Julius Plücker (1801-1868) and Lipschitz (1832-1903)
    Awarded: 1868. Dissertation: Über die Transformation der allgemeinen Gleichung des zweiten Grades zwischen Linien-Koordinaten auf eine kanonische Form
  • Rudolf Otto Sigismund Lipschitz was a student of Dirichlet (1805-1859) and Martin Ohm
    Awarded: 1853. Dissertation: Determinatio status magnetici viribus inducentibus commoti in ellipsoide
  • Gustav Dirichlet was a student of Poisson (1781-1840) and Joseph Fourier (1768-1830)
    Awarded: 1827. Dissertation: Partial Results on Fermat's Last Theorem, Exponent 5
  • Simeon Poisson was a student of Lagrange (1736-1813)
    Awarded: Unknown. Dissertation: Unknown.
  • Joseph Lagrange was a student of Leonhard Euler (1707-1783)
    Awarded: Unknown. Dissertation: Unknown.
There is also a path to Gauss, starting at Klein and progressing through Plücker.
  • Felix Christian Klein was a student of Plücker (1801-1868) and Rudolf Otto Sigismund Lipschitz (1832-1903)
    Awarded: 1868. Dissertation: Über die Transformation der allgemeinen Gleichung des zweiten Grades zwischen Linien-Koordinaten auf eine kanonische Form
  • Julius Plücker was a student of Christian Gerling
    Awarded: 1823. Dissertation: Generalem analyeseos applicationem ad ea quae geometriae altioris et mechanicae basis et fundamenta sunt e serie Tayloria deducit
  • Christian Gerling was a student of Johann Carl Friedrich Gauß (Gauss) (1777-1855)
    Awarded: 1812. Dissertation: Methodi proiectionis orthographicae usum ad calculos parallacticos facilitandos explicavit simulque eclipsin solarem die
Notes from Barwick:
  1. My sources are various in print and online, but they originate from The Mathematics Genealogy Project.
  2. Little is known of William Edward Story and Albert William Tucker and their lives.
  3. Martin Ohm is the brother of Georg Simon Ohm, for whom Ohm's Law is named.
  4. I find it interesting that Klein was awarded his doctorate the year of Plücker's death. Klein was Plücker's assistant for nearly three years.
  5. It had been believed, but not shown that Carl Gottfried Neumann was advised by Georg Friedrich Bernhard Riemann. Neumann was, in fact, advised by Otto Hesse and F. Richelot. Hesse was also a friend of Neumann's father, Franz. Many modern mathematicians mistakenly trace their roots through Neumann to Riemann and Gauss. Riemann received his doctorate in 1851 at Göttingen. Riemann was subsequently awarded a post at Göttingen by Gauss in 1851 to allow Riemann to study for his Habilitation. Riemann delivered his lecture to earn the Habilitation under Gauss in 1854. Gauss died the following year (Dirichlet was given his chair). Carl Gottfried Neumann was awarded his doctorate in 1855 at Königsberg.

Thursday, June 16, 2005

Where will you be next year?

As the long computer science recruiting season has pretty much finished, we go around conferences like STOC and Complexity asking "Where will you be next year?" But often you won't find out someone's new job until you see their name tag at a conference in the fall or Google has caught up with their new home page.

So if you have recently taken or will take a new position we want to know. Leave a comment on this post and tell us your new job, whether in industry or academia, at any level (professor, postdoc or even starting graduate school).

To all who post I say in advance: "Congratulations and Good Luck!"

Tuesday, June 14, 2005

Understanding "Understanding"

Yesterday Manuel Blum gave the invited talk on Understanding "Understanding:" Steps towards a Mathematical Scientific Theory of Consciousness. He started with a history of how trying to understand the mind shaped his academic career. His father told him understanding how the mind works would help him academically. So when he went to college he got interested in the work of McCulloch and Pitts that formulated neurons as automata. This led Blum to study recursion theory with Hartley Rogers and then to work with his eventual thesis advisor Marvin Minsky in the new area of artificial intelligence. In the end Blum wrote one of the first theses in computational complexity under Minsky, not to mention doing groundbreaking work in many areas, winning the Turing Award and being the advisor to my advisor (Michael Sipser).

Blum made a strong point that his theory of consciousness is just being developed, emphasizing the word "towards" in the title. Roughly, his theory has an environment (representing the world at a certain time) modeled as a universal Turing machine that interacts with several entities (representing organisms or organizations), each modeled as a (possibly weak) computational device. An entity has CONSCSness (CONceptualizing Strategizing Control System) if it fulfills certain axioms.

  • The entity has a model of its environment and a model of itself.
  • The entity is motivated towards a goal. Blum modeled the goal as a difference between a pleasure and a pain function which looked to me like utility functions used by economists.
  • The entity provides a strategy to head towards the goal.
  • The entity has a simple serial interface with the environment.
Blum briefly defined notions of self-awareness (being able to reason about oneself) and free will. For free will Blum used an example of playing chess: we have free will because we don't know what move we will make until we have had time to think about it, very similar to (though I believe independent of) McAllester's view.

Blum called on complexity theorists to take on the cause of consciousness. He pointed to an extensive bibliography on the general topic maintained by David Chalmers.

My take on the talk: Much of theoretical computer science did get its start from thinking about how the brain works but as computers evolved so has our field and theory has since the 70's focused on understanding efficient computation in its many forms. It's perfectly fine to model humans as efficient computers to understand their interactions in areas like cryptography and economics. But we should leave issues like consciousness, self-awareness and free will to the philosophers since any "theorems" we may prove will have to depend on some highly controversial assumptions.

Monday, June 13, 2005

Conference on Computational Complexity

Howdy from the 20th IEEE Conference on Computational Complexity in San Jose, California. Last night we had a short business meeting with beer and wine but without much controversy. Dieter van Melkebeek was elected to the organizing committee. Next year's conference will be held in Prague July 16-20 and in 2007 we will join STOC and many other conferences at the Federated Computing Research Conference (FCRC) June 9-16 in San Diego. During the Program Committee Chair Report, Luca Trevisan made the point that even by theoretical computer science standards, the computational complexity conference has a small female representation. Something to keep in mind.

My favorite talk on the first day came from the best student paper winner, Ryan Williams, on Better Time-Space Lower Bounds for SAT and Related Problems, though I'm a bit biased since he's improving on some of my earlier work. He shows SAT cannot be solved by a random-access machine using n^c time and n^{o(1)} space for c slightly larger than the square root of 3 (about 1.732), improving on the previous lower bound of 1.618. He had several clever ideas recursing on the previous techniques. One can hope that extending these techniques will push the lower bound to any c<2; above 2 you seem to lose any advantage from doing recursion.

Today Manuel Blum gives an invited talk taking "steps towards a mathematical theory of consciousness." More on that and the rest of the conference later.

Friday, June 10, 2005

Graduation Day

The University of Chicago has four graduation convocations in the spring quarter spread throughout today and tomorrow. The first session (mostly law students) has just marched past my office window. I will march in the second session this afternoon which includes the liberal arts graduate students.

My Ph.D. student Rahul Santhanam (co-advised with Janos Simon) will receive his diploma this afternoon. He did his thesis work on time hierarchies and next year will be a postdoc working with Valentine Kabanets at Simon Fraser University in Vancouver. Rahul is officially my fifth student to receive the Ph.D. following Carsten Lund, Lide Li, Sophie Laplante and Dieter van Melkebeek, all of whom graduated in my pre-weblog days.

Also from our theory group, Daniel Stefankovic graduates today. He did his thesis on "Curves on Surfaces, String Graphs, and Crossing Numbers" and will be an assistant professor at the University of Rochester in the fall.

Call me a romantic but I really enjoy the pageantry of the graduation ceremony. I enjoy putting on the gown and the hood (even with those drab MIT colors), marching past the parents as a member of the faculty and seeing the students come one by one, especially my own students, to receive their degrees. Chicago has a wonderful ceremony, led by bagpipes at the front of the procession, with the nice tradition of rarely having outside speakers (a major exception was Bill Clinton during his presidency). The ceremony was even more impressive when it was held in Rockefeller Chapel, but even with four ceremonies the chapel is not large enough to hold all the family members who want to attend.

Wednesday, June 08, 2005

Growth Causes Shrinking

Jeff Erickson makes an important point in his post on the SoCG (Computational Geometry) business meeting. Links and emphasis are his.
Finally, and most importantly, there was no discussion of the theory community's efforts to increase NSF funding for theoretical computer science, as there was at the (also beer-free) STOC business meeting. One question in particular was never asked: Are we computational geometers still even part of the theory community? The answer should be a resounding NO!, followed by a slap to the back of the head—of course computational geometry is part of theory! Look, we have big-Oh notation! Unfortunately, reality seems to disagree. None of the new material on TheoryMatters mentions computational geometry at all, although it does mention another border community: machine learning. With few exceptions, the computational geometry community rarely submits results to STOC and FOCS; this was not true ten years ago. Lots of geometric algorithms are published at STOC/FOCS by people outside the SOCG community, but nobody calls them computational geometry. (Sanjeev Arora's TSP approximation algorithms are the most glaring example.) For many years, computational geometry has been funded by a different NSF program than the rest of theoretical computer science. (This worked to our advantage when graphics was getting lots of money, but that advantage is now gone.) At one infamous SODA program committee meeting a few years ago, one PC member remarked that nobody at SODA was interested in computational geometry, they have their own conference, they should just send their results there. (This declaration led another PC member to resign.) Apparently, the divorce has been a complete success.
Not just computational geometry: the COLT (Computational Learning Theory) and LICS (Logic in Computer Science) communities also used to have their best papers in STOC/FOCS, but now we see few of their papers in the standard theory conferences. As the theory community grew larger and broader, the STOC and FOCS conferences started to emphasize certain areas in theory. The areas that were not greatly represented felt some resentment and put more and more emphasis on their own specialty conferences, in some cases eventually abandoning STOC and FOCS altogether.

The Conference on Computational Complexity started in 1986 as the Structure in Complexity Theory Conference (Structures), founded by researchers who felt their interests in complexity were not being well represented in STOC and FOCS. This view became self-fulfilling—sometimes very good papers would be turned down from STOC and FOCS because they were considered a "better fit" for Structures. In response we changed the name in 1996, brought a broader view of complexity to the conference (though not without some controversy) and tried to work our way back into the STOC/FOCS community.

Other conferences like COLT, LICS and SoCG have moved in the other direction. Note that SoCG also decided not to join the Federated Conference in 2007, while both STOC and Complexity will be there. I don't expect to see COLT or LICS at FCRC either.

What can we do, if anything? STOC and FOCS cannot properly cover the broad range of areas that have ever been considered theory. Unless we have a major restructuring of how the general theory conferences operate, we will continue to shrink the vision of theory as the area continues to grow.

Tuesday, June 07, 2005

Humor in Talks

Should you have jokes in talks? Too much humor can detract from your real work but a little laughter can lighten up an otherwise dry presentation. You must use jokes with care. Avoid any offensive jokes: nothing sexist, racist or homophobic, and no sexual innuendo. Many jokes are funny only in context, and at a major conference it is hard to find common context among people from different religions, countries and backgrounds, many of whom do not have English as their native language.

Some topics to be careful with:

  • Popular Culture: Most scientists, even many American scientists, have no clue what occupies the minds of most Americans. Even a Michael Jackson joke would likely fall flat at our conference. A Star Wars joke might work on a majority of our crowd, but too many of them feel that anything popular should be ignored. One exception is children's popular culture: not that anyone likes Barney, but you can't avoid him, especially if you have young kids.
  • Politics: Since our field lies in such a narrow band in American politics, political jokes are fine as long as they sit in this band (i.e. making fun of Bush and his cronies). But a seemingly harmless joke outside this band will be considered "offensive". I once talked about a paper by Allender and Gore and said "but this is not the Gore that invented the internet." Didn't go over very well.
  • The P versus NP problem: Some things are too important to joke about.
What can you joke about? Make fun of yourself and your research (without insulting others' research). Make fun of your friends if they can take a joke and other people know who they are. Make fun of George Bush, Donald Rumsfeld and the religious right. Make fun of the French (okay, maybe you shouldn't make fun of the French, though they are such an easy target). Most of all just make fun and keep your talk interesting.

Monday, June 06, 2005

The Wife and The Mistress

An old math joke:
Three friends from college went on to become a doctor, a lawyer and a mathematician. They met at a reunion and the discussion turned to whether it was better to have a wife or a mistress.

The doctor said "a wife" because having a monogamous relationship limited the risk of disease.

The lawyer said "a mistress" to avoid all of those nasty legal obligations of marriage.

The mathematician said "Both." "Both?" echoed the doctor and lawyer simultaneously. The mathematician responded "Of course both. That way your wife thinks you are with the mistress, the mistress thinks you are with the wife and finally you have time to do some math."

In that vein, to everyone at the Oberwolfach Complexity workshop, I wish I could attend but I have a conflict with the ACM Conference on Electronic Commerce. To everyone at EC sorry I couldn't be there but there is a complexity workshop in Oberwolfach. Now leave me alone and let me do some math.

Friday, June 03, 2005

Making Pigs Fly

Toda's famous theorem states that the polynomial-time hierarchy reduces to counting problems (in complexity terms PH ⊆ P^#P). His proof uses two lemmas:
  • PH ⊆ BPP^⊕P
  • BPP^⊕P ⊆ P^#P
Here is a straightforward proof of the first lemma using relativizable versions of previously known results.
  1. ⊕P^⊕P = ⊕P (Papadimitriou-Zachos)
  2. NP ⊆ BPP implies PH ⊆ BPP (Zachos and also here)
  3. NP ⊆ BPP^⊕P (follows easily from Valiant-Vazirani)
  4. NP^⊕P ⊆ BPP^(⊕P^⊕P) (relativize 3 to ⊕P)
  5. NP^⊕P ⊆ BPP^⊕P (apply 1)
  6. NP^⊕P ⊆ BPP^⊕P implies PH^⊕P ⊆ BPP^⊕P (relativize 2 to ⊕P)
  7. PH^⊕P ⊆ BPP^⊕P (use 5 and 6)
  8. PH ⊆ BPP^⊕P (immediate corollary of 7)
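The nested oracles in steps 4 and 5 are the easiest part to misread in flat notation; written out in full, the two steps combine as

```latex
\mathrm{NP}^{\oplus \mathrm{P}}
  \subseteq \mathrm{BPP}^{\left(\oplus \mathrm{P}^{\oplus \mathrm{P}}\right)}
  = \mathrm{BPP}^{\oplus \mathrm{P}},
```

where the containment relativizes step 3 to a ⊕P oracle and the equality collapses the doubled oracle using the Papadimitriou-Zachos result of step 1.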
We often call results like Zachos's (2 above) "pigs can fly" theorems because we don't believe the assumption, in this case that NP is in BPP. This proof shows that relativization can give pigs wings and lead to some interesting containments.

Wednesday, June 01, 2005

On Language

Language has never been my strong suit. I didn't speak full sentences until I was five. I had a 220-point spread between my verbal and math SAT scores. I fumbled through three years of high school French (which required some summer school). This knowledge of French was only useful a couple of times. Wandering the streets of Paris, a woman asked me "Quelle heure est-il?" (What time is it?) and I knew enough to show her my watch but not enough to actually tell her the time. Also I saw Secrets & Lies in France and sometimes the French subtitles made more sense than the heavily accented English.

During my undergraduate years at Cornell I struggled with and gave up on Spanish. Luckily a linguistics professor had a theory that people who had trouble learning English early (like me) would have too much difficulty picking up a new language, so I could take an intro linguistics course to cover my language requirement. Pretty cool, as we covered context-free languages simultaneously in linguistics and in my introduction to theoretical computer science class.

In graduate school my three years of high school French got me out of the Ph.D. language requirement. If English was not the lingua franca of our field, I would be in serious trouble. I've always been impressed how many non-native speakers of English have succeeded in computer science.

I spent an entire year on sabbatical in Amsterdam but only learned enough Dutch to navigate the supermarkets and order in restaurants. Most Dutch speak English (and three or four other languages) and my attempts to say most Dutch words usually got responses in English. Still I definitely missed something: when I left a conversation the language shifted to Dutch and I couldn't get back in.

Suppose I could retroactively master a single foreign language; which language should it be? At times I would have liked to know Dutch, German, Hebrew, Japanese and occasionally French, Spanish, Danish, Italian and Portuguese. In the future I suspect I will visit countries speaking Hungarian, Russian, Chinese, Swedish and many others. I've gotten very good at navigating in countries where I don't know the language. In most European countries I can pass as a local as long as I keep my mouth shut.

The University of Chicago has a rather strict TOEFL requirement that would likely have caused a problem for me had I grown up in, say, Germany. Our department also has a small foreign language requirement for the Ph.D. Foreign language requirements made sense in a different era when papers were written in many languages. I remember a scene in graduate school where my advisor Mike Sipser and some Russian-speaking students pored over the latest paper by Razborov, translating from the Russian and hoping to understand Razborov's next great result. But now, with nearly all papers written in English, the requirement seems like a relic of a bygone era. Perhaps we should require every student to take the test in French, for France still has a few researchers stubborn enough to keep writing in their native tongue.