Monday, May 31, 2010

24: Really Bad Game Theory, technology, and science

The TV show 24 aired its last episode on May 24. The show used computers and technology a lot, more than most shows. Was it realistic? Was it interesting?
  1. Bad Game Theory: When two sides are negotiating you can have interesting scenarios like the Prisoner's Dilemma. OR you can have stupid scenarios like on 24:
    1. The bad guys threaten to do XXX unless the USA does YYY. Here is the problem with this scenario: even if we do YYY we have NO GUARANTEE that they won't still do XXX. Unlike in repeated games, the bad guys have no need to maintain credibility for later deals. This happened a lot, and the USA often got double-crossed in some fashion.
    2. The Good Guys tell some bad guy they have captured: If you tell us the plot we will give you immunity. Why should the bad guy believe that we will let him go? This is not a public court of law.
  2. Bad portrayal of technology:
    1. It takes Chloe less time to break into the NSA's secure computers than it takes me to log on to NSF's FastLane.
    2. Jack had crucial evidence that he wanted to get out to the public! It never dawns on him to upload it to the web. Why not? Because then the show would be called 16 instead of 24.
  3. Bad Science.
    1. Over the course of the show the USA has had two nuclear devices detonated on its soil. That does not seem to have had any real bad consequences.
    2. Can someone really be awake 24 hours and still function pretty well? To be fair, there was a season where Jack was clinically dead for 20 minutes. Does that count as a nap?
  4. Politics.
    1. A Right Wing propaganda show, as it shows that America has real nasty enemies and must use torture (which always works on this show).
    2. (This has been much less commented on.) A Left Wing show, as we often see that the people funding the terrorists are oil companies or even our own Government (only when it's under a Republican Prez). Of the presidents we have seen on the show, the best one was a Democrat (David Palmer) and the worst one was a Republican (Charles Logan).
Having said all this, why did I watch it for 8 seasons?
  1. I watched it on the treadmill, and it was the best treadmill show EVER!
  2. The funniest show on television! Examples: (1) In one room of the White House some people are trying to convince the President to pardon Jack for his torturing of bad guys in the past. In a different room Jack is torturing a senator's aide. (2) An innocent person is arrested without cause. There is much debate about the ethics of this. All of that debate goes out the window when we find out the innocent is not so innocent and is in fact the main bad guy. Hilarious! They make you think they will be making a serious point and then undermine it.

Friday, May 28, 2010

The Structure (or lack thereof) of Data

Many of the various "proofs" of P≠NP follow a common theme: Define an NP problem with a certain structure. Argue that any algorithm that solves that problem must work in a certain way and any algorithm that works that way must examine an exponential number of possibilities.

When you look at most common algorithmic problems, we state them as a structured object, say a graph or a formula. But really every input is just a series of bits. Usually when we write an algorithm to solve a problem, the algorithm relies on the structure of the problem. But it doesn't have to; it could manipulate the bits in any way, shape or form. That's what makes separating complexity classes so tough: we can't assume anything about how an algorithm works. Algorithms can ignore the underlying semantic meaning of the input and focus on the syntactic part, the bits themselves.

Over the past 10-15 years the machine learning community has had some significant success with statistical modeling that puts little focus on the actual structure of the data. Google recently released its Prediction API that does supervised learning (from labeled examples) on unstructured data. Google gives the example of learning which language a text comes from without telling the API that the data represents language samples.
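As a toy illustration of learning from unstructured input (my sketch, not Google's API; the samples and the crude add-one smoothing are made up for the example), here is a character-bigram classifier that is never told its data is language:

```python
from collections import Counter, defaultdict
import math

def train(samples):
    """Build per-label character-bigram counts from (text, label) pairs.
    The model never sees words or grammar, only raw character pairs."""
    counts = defaultdict(Counter)
    for text, label in samples:
        counts[label].update(text[i:i + 2] for i in range(len(text) - 1))
    return counts

def classify(counts, text):
    """Return the label whose bigram distribution best explains the text
    (crude add-one smoothing, fine for a sketch)."""
    def score(c):
        total = sum(c.values())
        return sum(math.log((c[text[i:i + 2]] + 1) / (total + 1))
                   for i in range(len(text) - 1))
    return max(counts, key=lambda label: score(counts[label]))

samples = [("the quick brown fox", "en"),
           ("le renard brun rapide", "fr"),
           ("der schnelle braune fuchs", "de")]
model = train(samples)
print(classify(model, "the lazy dog"))   # -> en
```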

There is a constant back and forth in the machine learning community on how much to consider structure when developing learning algorithms. Nevertheless there are many examples of strong learning algorithms that make few or no assumptions about the structure of the data. In the theoretical CS community we haven't seen as many algorithms that work that way (it's hard to do worst-case analysis on such algorithms) but the possibility of such algorithms makes proving general lower bounds all the more challenging.

Thursday, May 27, 2010

Complexity and Computability

The 5th Conference on Logic, Computability and Randomness is going on this week at Notre Dame. Because of teaching and other commitments I was only able to attend for one day. Bill is there all week, which is why he's not blogging.

This meeting also brought together most of the 13 PIs from the NSF Focused Research Grant on Algorithmic Randomness (Kolmogorov complexity), four of whom come from computer science (including me) and the rest from computability theory. We have gotten together several times formally and informally because of the grant and we are all friendly with each other, but the CS and logic people have not mixed often in their research activities. Why not? We all like Turing machines, complexity has drawn many of its concepts from computability theory, and this crowd all study randomness. A few people, Rod Downey for example, have had recent successes in both areas, but why don't we see a larger group of people doing both computability and complexity?

It didn't use to be that way: until the 90's there were quite a few people who were active in both worlds. But while computability and complexity sound related, the techniques used to study these areas are quite different. Despite the similarities, the P versus NP problem is a different animal than computable versus c.e.

As our fields get more specialized it becomes harder to keep up with the latest tools in other fields. And so complexity and computability have drifted apart. The two areas have developed different cultures, making it hard to jump between them. Even understanding the importance of the problems people work on in other fields takes some effort.

Still, the FRG grant and these meetings get some of us from complexity and computability, all children of Turing, talking together, and at least help us appreciate each other's work.

Tuesday, May 25, 2010

Parallel Thinking

Guy Blelloch from CMU gave a distinguished talk last week at Northwestern on Parallel Thinking, a take on the Computational Thinking paradigm of his past and future chair, Jeannette Wing. Blelloch's thesis: Because of multi-cores and cloud computing, parallel computing is now the norm and we should teach parallel computing ideas early in the undergraduate major. Blelloch gave some examples and described some of the changes in the Carnegie-Mellon curriculum towards his goal.

I'm a bit skeptical--parallel computing has many pitfalls (one has to be careful about data access and order of execution) that could trip up many students if they try to learn it right off the bat. But I'm happy that CMU is experimenting and we'll be watching.

Guy and I chatted a bit about theory's role in parallel computing. We both agree that the PRAM model doesn't really capture today's parallel computers because it assumes exponentially more processors than computation time. Guy would rather see a clean tradeoff: If one has a sequential algorithm that takes time T(n), he would like a parallel algorithm that takes time t(n) with p(n) processors for p(n) = t(n)^ε and p(n)·t(n) = O(T(n) polylog T(n)). That way you have a smooth tradeoff: with fewer processors you still get some advantage, and it scales nicely as the number of processors grows.
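To make the tradeoff concrete, here is a toy sketch (mine, not Guy's; summation rather than shortest paths) of a work-efficient parallel reduction, where p workers each do about T(n)/p sequential work so that p(n)·t(n) stays within a constant factor of the sequential time:

```python
from multiprocessing import Pool

def parallel_sum(data, p):
    """Work-efficient parallel reduction: split into p chunks, sum each
    in parallel, then combine. Each worker does ~T(n)/p sequential work;
    the final combine costs O(p)."""
    n = len(data)
    chunks = [data[i * n // p:(i + 1) * n // p] for i in range(p)]
    with Pool(p) as pool:
        partials = pool.map(sum, chunks)   # p workers in parallel
    return sum(partials)                   # sequential combine, O(p)

if __name__ == "__main__":
    print(parallel_sum(range(1_000_000), 4))   # -> 499999500000
```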

Guy's primary question was s-t shortest path. One can solve shortest paths on a PRAM using a polynomial number of processors and polylogarithmic time via matrix multiplication methods. But can one have an algorithm using p = n^ε processors and time t(n) with p(n)t(n) quasilinear?
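The matrix multiplication method in question is min-plus squaring: each of the n^2 entries of a squaring can be computed independently in parallel, and O(log n) squarings give all shortest paths. A sequential sketch of the idea (toy graph of my own choosing):

```python
INF = float("inf")

def min_plus_square(D):
    """One min-plus squaring: D2[i][j] = min_k D[i][k] + D[k][j].
    On a PRAM every entry gets its own processors, so O(log n)
    squarings yield all shortest paths in polylog time."""
    n = len(D)
    return [[min(D[i][k] + D[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Weight matrix of a 4-node graph (INF = no edge, 0 on the diagonal).
D = [[0, 3, INF, 7],
     [3, 0, 2, INF],
     [INF, 2, 0, 1],
     [7, INF, 1, 0]]
steps = 0
while (1 << steps) < len(D) - 1:   # ceil(log2(n-1)) squarings suffice
    D = min_plus_square(D)
    steps += 1
print(D[0][3])                     # -> 6, via the path 0-1-2-3
```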

Monday, May 24, 2010

The Life of Martin Gardner 1914-2010

The math and science writer Martin Gardner passed away on Saturday. Gardner wrote the column Mathematical Games for Scientific American from 1956 to 1981, and Scientific American reposted a 1995 profile, some Gardner puzzles and a 2002 column on Gardner's old book on pseudoscience. (Full disclosure: This blog is part of the Scientific American Partner Network.)

Mathematical Games was the math blog before we had blogs. I read his columns religiously as a math-minded teenager; they were the best part of Scientific American at the time. Gardner retired from the column shortly before I started college, replaced by the far too philosophical Douglas Hofstadter.

I remember most Gardner's columns on Conway's Game of Life. Gardner would excitedly report on someone who had solved some challenge about the game. These columns inspired one of the first programs I wrote on my TRS-80 in high school, a simulation of the game, first in Basic (too slow) and then in Z80 machine code (too fast). I didn't realize at the time that I had written a universal Turing machine with such a short set of instructions.
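The whole rule fits in a few lines, which is part of its charm. A minimal sketch (generic Python, nothing like my old TRS-80 code):

```python
from collections import Counter

def step(live):
    """One generation of Conway's Game of Life; `live` is a set of (x, y)."""
    neighbors = Counter((x + dx, y + dy)
                        for (x, y) in live
                        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                        if (dx, dy) != (0, 0))
    # Born with exactly 3 live neighbors; survive with 2 or 3.
    return {cell for cell, n in neighbors.items()
            if n == 3 or (n == 2 and cell in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    glider = step(glider)
print(glider)   # the same glider, shifted diagonally by (1, 1)
```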

Thanks Martin for inspiring both math and computation for me. This blog may never have happened if not for yours.

Friday, May 21, 2010

The End of Numb3rs

Sunday marks the end of the TV series that deals with the numbers 4, 8, 15, 16, 23 and 42. Monday marks the end of the TV series called "24". But let's talk about Numb3rs, which CBS recently officially announced would not be renewed for next year; it aired its last episode on March 12th.

Back in December of 2004 I wrote a skeptical post on this new TV series announced by CBS about "a FBI agent who recruits his brother, a math genius, to help solve crimes." The show lasted a surprisingly long six seasons and this is the 17th of our blog posts through the years that have mentioned the show.

After a solid start, the series slowly devolved into a crime procedural where the math became more jargon than substance. I admit that I stopped watching after a while but have since been catching up on DVD. I haven't seen the last season yet.

Theoretical computer science had several mentions on the show with algorithms from Dijkstra to Kruskal to Shor. In the second episode, the mathematician Charlie became obsessed over P versus NP. In the second season Charlie exclaimed to a purported psychic "Let's all sit down at the Ouija Board and try to solve P versus NP once and for all."

What I liked most about the show was the portrayal of the scientists, in particular the interactions between the mathematician Charlie (played by David Krumholtz) and the physicist Larry (Peter MacNicol). Their personalities and discussions seemed real, not unlike various combinations of academics I know (as opposed to, say, The Big Bang Theory).

Now that Charlie no longer needs to catch criminals, he can go back to being obsessed over P and NP, doing some real good in the world.

Thursday, May 20, 2010

Teaching for the first time: SUBRUK's story

(Guest Post by Subrahmanyam Kalyanasundaram)

My First-time Teaching Experience.

Background: This spring semester, here at Georgia Tech, I got the opportunity to be the full instructor for Automata and Complexity, CS 4510 -- a senior-level undergraduate course. I learned about it last summer and started preparing for it during the fall semester. It isn't that common for Ph.D. students here in computing to teach a full course, so I was very excited about this class. The course outline was the following: regular languages and finite automata, context free languages and pushdown automata, Turing machines, Turing recognizable and decidable languages, undecidable languages, the classes P, NP and NP completeness, the classes L, NL, NL completeness, NL = co-NL, and PSPACE completeness. If you read Sipser, these are chapters 1, 2, 3, 4, 5, 7 and 8 (with some minimal additions/deletions). This was the material I was expected to cover.

Beginning: As a student instructor, and since I was doing the class for the first time, I was unsure how much time (more or less) I would have. I was also a bit nervous. At the beginning of the semester, I didn't know if I would be able to do all of the above topics. So I decided not to cover anything extra till I got an idea of my pace. In the past I had held office hours. But this is very different from a class. During office hours, I have the student's attention, and can explain again and again till the student gets it. I didn't have any pressure to finish teaching within a given time. But 2-3 classes into this semester, I realized that I didn't have that luxury anymore. I was responsible for the whole class, and it became imperative that I get the explanation right the first time, or at least mostly right. I realized that good preparation was essential.

The Class: The class had around 20 students. I quickly learned all their names and that helped me develop a rapport with the class. I started teaching the material rather slowly, because I didn't know the level of the students. I soon learned their comfort level and paced myself better. By mid-semester, I had finished decidability theory and started on time complexity, which was better than what I had hoped for at the beginning of the semester. So I could afford to spend more time on the rest. Another thing that I tried to do was to ask the students for feedback regularly. I tried my best to let them know that they were welcome to air whatever views they had about the class.

I tried to do more examples in the latter half of the class. Also, I did certain things differently from the Sipser book. To give an example, I covered space complexity a bit differently from Sipser. I preferred defining the space-bounded TM with a separate worktape up front; Sipser does it after PSPACE completeness, when defining the classes L and NL. Plus we covered L, NL, NL completeness, Savitch's theorem and NL = co-NL before concluding with PSPACE completeness. I finished this with some 2-3 classes left, in which I taught the definition of communication complexity and a few lower bound methods. Why communication complexity? I felt that the students were much more likely to see randomized classes in future courses. So it was just a choice between interactive proofs and communication complexity, a choice I left to the students at the end. I did interact with GASARCH after his earlier post on what I should cover during the extra lectures, but I didn't use any of his suggestions. For anyone interested in how I paced my lectures, look at the class schedule.

Exams: We had 2 midterm exams in addition to the final exam. I didn't do a good job of judging the difficulty before the first midterm exam. As a result, there were lots of students who were unhappy. I didn't want them to feel demotivated, and wanted to encourage them to continue performing well in the class. So I asked H Venkateswaran, who had experience teaching this class before, how I could offer the students a chance to improve their scores. We discussed it and came up with the idea of an optional reading assignment for those who wanted to improve. This worked well: the students were in general happy for the chance, and they got an opportunity to learn additional material.

Overall: It was a wonderful experience for me overall. Really satisfying, and worth all the effort that I put into it. And I think most of the students came away happy too, at least those who cared to tell me. This post would be incomplete if I didn't thank H Venkateswaran, who helped me prepare for the course all through last Fall in a very comprehensive manner, and was readily available to answer my queries of all kinds throughout the course. And thanks to my advisor Dick Lipton who was very supportive and understanding all throughout. Finally, I had great students; teaching is very interactive, and the class would have been much less fun if not for them.

Wednesday, May 19, 2010

What should be in an automata course: Two views based on recent experience

(Joint Post with Subrahmanyam Kalyanasundaram)

In this post, I speculated on what I might put into my automata theory course. That prompted Subrahmanyam Kalyanasundaram, a PhD student at Georgia Tech, who was teaching Automata theory this spring, to begin a private correspondence with me. Now that both of our courses are over we can tell you what happened.
  1. BILL: I did do Decidability of WS1S (weak second order with Successor and less-than). This was a real win-win-win. (1) It was a nice application of Automata Theory. (2) When I spoke about P and NP and they wanted a natural problem that is KNOWN to not be in NP, Dec of WS1S had already been discussed so I could refer to it and they thought it was natural (I would agree). (3) When I spoke about computability theory and they wanted a natural problem that was not computable I could talk about Hilbert's tenth problem. Since they already had the notion of a fragment of math being decidable or not. Will definitely do this again. Another pro - you can ask real questions about it.
    SUBRUK: I didn't do this.
  2. BILL: Non-Graph-Isom is in AM. I had to do it informally and not prove anything about it. That's fine. I did not do enough of it to be able to ask questions about it. That is a problem with the material. It did give them a plausible intermediary problem (in NP, likely not in P, likely not NPC). Will do it again, it's short, but will think harder about asking them questions about it.
    SUBRUK: I thought about doing this but ended up doing Communication Complexity instead. But I might do this in the future. I only had a few lectures at the end for non traditional topics.
  3. BILL and SUBRUK: Neither of us did Ladner's theorem or the time-hierarchy theorem.
    BILL's REASON: These yield unnatural sets and hence are not of that much interest. Or to put it another way: if a student asks ''is there a problem that is decidable but not in NP'' they would much rather see WS1S than a construction of an artificial set. And I agree with them.
  4. BILL and SUBRUK: Neither of us did primitive recursive. There is no real punch-line there. And we won't in the future.
  5. BILL: I didn't do proofs of obvious things. And I won't in the future. I did constructions (e.g., Reg Exp == DFA == NDFA) but did not prove that they work since they obviously work.
    SUBRUK: I did some stuff with TM equivalence that is kind-of obvious which I will skip next time I do the course. Similarly, I may follow BILL's approach on things like Reg Exp == DFA == NDFA in the future.
  6. BILL: Didn't do ZK protocol for SAT. That I would have liked to do and may do in the future; however, we lost two days to snow so that made the semester more time compressed. This I think I COULD make up questions on.
  7. BILL and SUBRUK: We didn't do Mahaney's theorem (if there is a sparse NPC set then P=NP). The PRO of this would have been that there are LOTS I could ask about it. The CON is that there is no punchline for them. I don't really want to go into the Berman-Hartmanis conjecture. BILL personally likes this material (see here) but that is not a good reason to tell them about it. SUBRUK didn't want to start anything that heavy in the few days he had for non-traditional topics.
  8. BILL and SUBRUK: Neither of us did Det PDA's.
  9. BILL and SUBRUK: Both of us did the poly-time algorithm for CFGs (see the sketch after this list). This is nice since it fits CFGs in between REG and P.
  10. SUBRUK did a little bit of Communication Complexity Lower Bounds. It was nice and he says that the students liked it. It did not even occur to BILL to do this, but he may consider it in the future.
  11. SUBRUK pondered doing the poly hierarchy but didn't. When he was talking about Sigma-2-P he found that it was hard to keep students interested, and so decided to not cover PH. BILL speculates that this may be true since there are no natural problems there.
  12. SUBRUK did space complexity, NL-completeness, NL = co-NL, Savitch's theorem, PSPACE-completeness. BILL didn't do any of this but is intrigued by the notion of doing it and may consider it in the future.
  13. BILL's Future: SUBRUK has given me some things to think about doing which I had not thought of before. I may move some in or out in the future. If I teach this course many years in a row I will need to vary it some for my own sanity. (Some would say it's too late to worry about that.)
  14. SUBRUK's Future: As noted above, less details on things that are obvious. I would use the extra time to do more interesting non-traditional topics like Interactive Proofs or Zero Knowledge. I might keep varying some of the topics, to see if some new topics might be worth adding.
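For item 9: the standard poly-time algorithm for CFGs is CYK dynamic programming (whether or not this is exactly the version either of us presented in class). A minimal sketch, with a toy grammar and encoding chosen just for illustration:

```python
def cyk(word, start, unaries, binaries):
    """CYK: decide CFG membership for a grammar in Chomsky normal form
    in O(n^3) time.  unaries: rules A -> a as (A, a);
    binaries: rules A -> B C as (A, B, C)."""
    n = len(word)
    # table[i][j] = set of variables deriving word[i:i+j+1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, a in enumerate(word):
        table[i][0] = {A for A, c in unaries if c == a}
    for length in range(2, n + 1):          # substring length
        for i in range(n - length + 1):     # start position
            for k in range(1, length):      # split point
                for A, B, C in binaries:
                    if (B in table[i][k - 1]
                            and C in table[i + k][length - k - 1]):
                        table[i][length - 1].add(A)
    return start in table[0][n - 1]

# Toy CNF grammar for { a^n b^n : n >= 1 }:
# S -> A T | A B,  T -> S B,  A -> a,  B -> b
unaries = [("A", "a"), ("B", "b")]
binaries = [("S", "A", "T"), ("S", "A", "B"), ("T", "S", "B")]
print(cyk("aabb", "S", unaries, binaries))   # -> True
print(cyk("aab", "S", unaries, binaries))    # -> False
```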

Tuesday, May 18, 2010

Boycotting Arizona

I heard a suggestion that computer science conferences not be held in Arizona because of its new anti-immigration law. Both Bill and I have discussed academic boycotts before in regards to China, Israel and South Dakota. I have not been a fan of academic boycotts because we shouldn't push a political agenda that isn't directly related to the health of our field. Otherwise we start down a slippery slope that could prevent us from having conferences just about anywhere, and we punish local researchers who have played no role in, and often oppose, those very policies.

The Arizona issue literally hit close to home. Our local high school (which my older daughter attends) decided to withdraw the girls' basketball team from an Arizona tournament planned for December. Many of the parents have gotten quite upset about using high school athletics to push a political agenda. Sarah Palin had her say during a visit to Chicago last week. The school board says it wasn't a political decision:
Under long standing constitutional law, all school districts are required to provide an education to all children within the District’s borders regardless of immigration status. District 113 boasts a diverse student population and, as a school district, we believe in equal opportunity for each of our students.  The selection of a girls’ varsity basketball team for the 2010-2011 winter athletic season will take place in November, 2010.  The team has yet to be selected.  When our students travel, the school district is responsible, both legally and ethically, for their safety, security and liberty.  We cannot commit at this time to playing at a venue where some of our students’ safety or liberty might be placed at risk because of state immigration law.  Our athletes will play in a competitive basketball tournament during their winter break.
but that didn't stop a heated debate in last night's school board meeting.

Monday, May 17, 2010

When to go Low Tech

Recently someone asked me to subreferee a paper for a conference. She emailed me a pdf file but when I printed it out it was unreadable: the spacing was all off. What should I do?
  1. Print it out in different ways with different options.
  2. Change printers.
  3. Email the sender (perhaps ask to resend).
  4. Email my staff.
OR I could do the following:
Spend no more than 5 minutes on the problem and, if that does not work, email the sender: ``The paper you sent me does not print correctly on my machine. If you want me to subreferee then FAX it to (I gave the number). I have already spent 5 minutes trying to get it to work and that is my limit.''
I of course opted for this option (she was happy to do it and it worked). I have seen too many people waste too much time on things that either do not work, or eventually do but where the solution discovered does not help at all the next time such a problem occurs.

If the fax machine had not worked I would have asked the person to use a technology you may not be familiar with. It's called Snail Mail. If you don't know this technology then ask one of your older faculty to help you, since this was a common way to send papers 20 years ago.

This is NOT being a Luddite. This is recognizing the limits of technology and knowing when to stop wasting time.

Friday, May 14, 2010

Goodbye PostScript

I was uploading a new PDF file to my publications page and noticed that the paper also had a PostScript link but I didn't have an updated PostScript file. So I asked myself (and my Twitter readers) if anyone still uses PostScript. Apparently not, so I converted my old PS files (including my thesis) to PDF and removed all the PostScript links from my page. I left the old files in place in case others link to them but I won't maintain or update them. No more PostScript for me.
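(If you want to do a similar purge, the batch conversion is only a few lines; a sketch assuming Ghostscript's ps2pdf is installed and on your PATH:)

```python
import subprocess
from pathlib import Path

# Convert every PostScript file under the current directory to PDF,
# leaving the .ps files in place so old links keep working.
for ps in Path(".").rglob("*.ps"):
    pdf = ps.with_suffix(".pdf")
    if not pdf.exists():
        subprocess.run(["ps2pdf", str(ps), str(pdf)], check=True)
        print(ps, "->", pdf)
```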

In college I was one of the early people to typeset a paper on a computer, using a program called Script (no relation to PostScript). In grad school I briefly used troff before turning to LaTeX for my technical papers. LaTeX produced dvi (device independent) files which could then be converted to PostScript, the language used by many printers. PostScript files were also ASCII, meaning we could then (around 1989) easily email papers, marking a major change in research distribution. When I started a web page a few years later, I put up PostScript files of my papers (marking another major change in distribution).

In 1993, Adobe created the PDF format, though for a while it was hard to convert LaTeX into good-looking PDFs. But when I could I would put up both the PS and PDF files on my page. Now we have direct LaTeX-to-PDF compilers and PDF readers abound for all devices, so it's time to say goodbye to the PostScript era. It's just not worth the effort to maintain two formats.

On a different note, as I was doing the conversion I tried to find the online version of my recent STACS paper on the Springer LNCS site. Couldn't find it because in 2008 STACS moved the proceedings to the Dagstuhl server. First time for me to have a conference proceedings paper on an open-access online-only archive site. Another era begins.

Thursday, May 13, 2010

Knowledge is Power!

In my last post I gave and asked for examples of people who didn't know things that they really ought to know. A commenter named Josh posted the following:
Bill's question seems to generate a lot of nonpositive responses, e.g., embarrassing stories of lack of knowledge.

How about a slightly more positive question: are there examples where someone KNOWING something from slightly outside their tiny area of expertise led to significant results?
This is a great question! I give two examples from my own research that almost qualify--- I used knowledge from outside of my tiny area, though I would not call the results significant. Readers: do what I do, put aside the question of significance (a topic for another post perhaps) and just comment on when knowing stuff from a different area or areas helped you in research.
  1. Knowing omega-automata and Hilbert's tenth problem was the key to the paper Learning via Queries which was co-authored with Carl Smith and appeared in FOCS 1988 and JACM 1992. It was the first in a series of papers that added queries to the Inductive Inference model.
  2. There is a trick in recursive graph theory where you build a graph G with two special vertices u and v so that if G is properly k-colored then u and v are the same color (one standard version is sketched after this list). This is NOT a hard trick. I used a variant of this trick to find better bounds for some poly VDW numbers. This led, in turn, to a problem on the Maryland Math Competition, two projects for high school students, one project for college students, and finally a paper (in progress) by Gasarch-Kruskal-Kruskal.
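Here is one standard version of that trick (my reconstruction for illustration, not necessarily the exact variant used in the paper): join both u and v to every vertex of a (k-1)-clique. In any proper k-coloring the clique uses k-1 distinct colors, so u and v are each forced to take the single remaining color. A brute-force check:

```python
from itertools import product

def proper(colors, edges):
    return all(colors[a] != colors[b] for a, b in edges)

k = 3
clique = list(range(k - 1))          # a (k-1)-clique: vertices 0..k-2
u, v = k - 1, k                      # the two special vertices
edges = [(a, b) for a in clique for b in clique if a < b]
edges += [(u, c) for c in clique] + [(v, c) for c in clique]

# Check all k-colorings of the k+1 vertices: whenever the coloring is
# proper, u and v got the same color.
print(all(colors[u] == colors[v]
          for colors in product(range(k), repeat=k + 1)
          if proper(colors, edges)))     # -> True
```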
But never mind me, I want to hear from YOU!

Wednesday, May 12, 2010

What did he know and when did he know it?

Sometimes you learn a theorem in your academic career far later than you should have. Here are some examples.
  1. I didn't know the classic upper bounds on the higher Ramsey numbers until 2 months ago (note that I have been studying Ramsey Theory with a passion for about 8 years). Upon hearing the theorem 8 years ago I came up with a proof of my own (which had TOWER bounds). I later forgot that I came up with it myself and thought it was the classic proof. I recently saw a reference that said the theorem had double-exponential bounds. This confused me, so I looked it up and then realized that I never knew the classic proof! Now I do. And I have written up both my proof and the classic proof here. This is odd--- it is common to hear a proof and then later think you came up with it on your own. This was the opposite: I came up with a proof on my own and assumed I had heard it somewhere. I don't quite know what to make of this- I am happy that I came up with a new proof, but it gives worse bounds and has no advantage whatsoever. (Both proofs are of about the same difficulty, though you can read them and see what you think.)
  2. I know of two professors, one in Cryptography, and one in Parallelism, who did not know that every planar graph has a vertex of degree at most 5 until I mentioned it casually in a lecture. One said I missed the lecture on planarity in my graph theory class. While I believe this to be true I was surprised that he hadn't heard it somewhere else. Also, there is a point where you should stop thinking of knowledge in terms of courses you took. The other said Why should I care? My answer: it gives an easy proof that all planar graphs are 6-colorable, and it is the starting point for a fairly easy proof that all planar graphs are 5-colorable. He was not impressed.
  3. A complexity theorist I knew did not know that if you are working mod a product of L primes, a number can have 2^L square roots (a quick demonstration appears after this list).
  4. An Applied Math Professor I knew did not know that the reals were uncountable until I told him. He was a very practical person and didn't seem to care, calling it a mathematical trick. Oddly enough he used the term measure 0 sometimes.
  5. Someone I knew who worked in Probabilistic Learning Theory did not know what variance is.
  6. I knew a grad student in applied math who did not know that e^{πi} = -1.
  7. There are several complexity theorists who do not know that there is a c.e. set that is not Turing equivalent to HALT and not decidable. When I tell them there is such a set they say Oh, just like Ladner's Theorem. This is true but sort-of backwards.
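To see item 3 concretely (toy numbers of my choosing): mod 105 = 3·5·7, a product of L = 3 primes, the number 4 has 2^3 = 8 square roots.

```python
# Mod a product of L primes, a square can have 2^L square roots: by the
# Chinese Remainder Theorem there are two choices of sign mod each prime.
n = 3 * 5 * 7                  # product of L = 3 primes
roots = sorted(x for x in range(n) if x * x % n == 4)
print(roots)                   # -> [2, 23, 37, 47, 58, 68, 82, 103]
print(len(roots))              # -> 8 = 2^3
```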
This raises the questions: (1) What should everyone know? (2) What does everyone know?

I would have thought that a planar graph having a vertex of degree at most 5 is in both categories, but perhaps I am wrong. I would have thought that knowing the reals are uncountable is in both categories, but here I am RIGHT.

So I ask you readers: name something you or someone you know found out far later in their academic life than is expected.

Tuesday, May 11, 2010

Which Name Do You Play For?

A relative on Facebook posted the following note about baseball player Alex Rodriguez.
He plays for the name on the back of the jersey [Rodriguez] and not the front [Yankees].
How about us academics? Do I play for "Northwestern" or "Fortnow"? Even at Northwestern do I play for the department, the school (McCormick Engineering) or the university?

Americans, in general, change jobs more frequently than people in other countries. Academics change a little less frequently, partly because of the tenure system. Nevertheless academics do move around and I've had my share of job changes. But when we do take a new job we immediately put our loyalties there. Now that I'm at Northwestern I want Northwestern to succeed at every level and I put in effort to help make that happen.

Even though I left University of Chicago, I had many good years there and I still want the department there to succeed. But I no longer play a role in that effort.

In the end you have to do what's best for your career and your family. Occasionally that means considering new opportunities. But when an institution supports your research, you have a responsibility to make it strong as well.

Luckily in baseball and academics these goals are often well aligned. When Alex Rodriguez hits a home run he helps himself and his team. Do great research and you make yourself and your university look good.

Monday, May 10, 2010

NSF at 60

On May 10, 1950, Harry Truman signed Public Law 507 creating the National Science Foundation, based on Vannevar Bush's Science - The Endless Frontier. The NSF is celebrating with the Sensational 60, highlighting the greatest scientific ideas that came out of NSF funding, including RSA, Google, cloud computing and the proof of Fermat's Last Theorem.

While no agency that receives its annual budget from the president and congress can remain politics-free, the NSF's decision makers are career scientists and grants are decided by peer review without having to be cleared by government bureaucrats to meet some outside goal. Many other countries have adopted the NSF model for their own scientific agencies and I feel sorry for the ones that haven't.

In 1986, computer science research came under a new Directorate for Computer and Information Science and Engineering (CISE). NSF CISE provides most of the government research funding in theoretical computer science, including my own.

The entire theory branch of the NSF, save program director Tracy Kimbrel, changes this summer, and who those new people will be remains a mystery. MIT Engineering dean Subra Suresh is rumored to be undergoing vetting as the new director of the NSF. Susan Graham is collecting nominations to replace CISE AD Jeannette Wing. The search goes on for people to take over for CCF Division Director Sampath Kannan and Theory program director Richard Beigel. If you have suggestions for people, let me know and I'll pass them along. The future of theory funding depends heavily on us getting good people in those positions.

On the CCC blog, John King notes how computer scientists are quite harsh in their grant reviews and how that can ultimately hurt the amount of CS funding (Ed Lazowska also chimes in). This probably comes out of our unique conference culture, where program committees usually have a fixed number of slots to fill, so tearing down other papers opens more room for papers you like. But NSF grants are not a zero-sum game. You shouldn't lie about the quality of a grant proposal, but neither should you look for reasons to tear it down.

Friday, May 07, 2010

Is Complexity Math or Science?

In March David Pennock promoted Computer Science as STEAM (Science, Technology, Engineering, Arts and Mathematics), a takeoff on STEM. There are aspects of computer science that fit all five of these areas. But how about theoretical computer science and in particular computational complexity? I don't feel like I do much art, technology or engineering in my research, even though I now sit in an engineering school. But is complexity math or science? That question was also raised in the comments to my post last week.

Computational Complexity studies the power and limitations of efficient computation. So is efficient computation purely an abstract mathematical object or is it trying to model a real world phenomenon? I would argue the latter. Efficient computation occurs not just in computers but in biological systems, physical systems, chemical systems, economic systems and much more. Physics focuses on the "what", computational complexity on the "how".

We don't do experiments but I do see the role of computational complexity similar to theoretical physics, trying to model the world we live in.

Pure math is the study of abstract objects, applied math develops tools used in other disciplines. Computational complexity is neither, rather an attempt to understand fundamental processes that occur in the universe, much like other sciences.

Thursday, May 06, 2010

COLT and CCC no longer take papers in ...

The list of COLT papers is posted here.

Carl Smith claimed that COLT was made possible because of THREE strands of learning theory coming together to form a workshop and later a conference: (1) Inductive Inference (Computability-Theoretic Learning), (2) PAC learning, (3) Query Learning. This seems right to me, but even early on Inductive Inference had far fewer papers than the other areas. The conference has evolved since then and there are types of learning that were not known back then. Inductive Inference seems mostly gone. There is one paper on it (the Kotzing-Case paper Strongly Non-U-Shaped Learning Results by General Techniques). Also, it looks like there is nobody on the Program Committee in that area.

Part of the reason Ind. Inf. is gone is that Algorithmic Learning Theory (ALT) has taken up some of the slack--- there were 4 papers in Ind. Inf. and 2 people in the area on the Program Committee. ALT is actually a pretty big learning theory conference since it's actually ALT/DS -- two learning theory conferences held at the same time and place. See here for info on DS. But looking at ALT as the reason may confuse cause and effect. Was ALT founded partially to take the papers that COLT no longer took?

It has been noted before that CCC does not take papers on computability-theoretic complexity. Similarly, COLT does not take papers on computability-theoretic learning. Such papers in learning theory still have an outlet: ALT. Do such papers in complexity still have an outlet? Perhaps Computability in Europe (CIE).

Why do certain subfields of a field die out and others live on? I abbreviate Inductive Inference by Ind. Inf. and Computability-Theoretic Complexity Theory by CTCT. I define CTCT rather narrowly: arguments similar to those in computability theory- reductions, constructions of weird sets.
  1. If a subfield does not really connect up to the main field then it may die. True for Ind. Inf. even at the beginning of COLT. For CTCT there are some nice connections to complexity theory, such as the very definitions of reductions and also Ladner's theorem. The results on sparse sets might be considered CTCT, but that's pushing it and that was a long time ago.
  2. If the field moves in a more practical direction, the more theoretical subfields may be left behind. True for Ind. Inf. For CTCT this is somewhat true- but here practical means lower bounds on approximation algorithms, which CTCT has had no impact on.
  3. If a subfield runs out of questions of interest then it may die. True for Ind. Inf. Less true for CTCT. People in both fields might say that there are still questions of interest; however, if they are only of interest to people in that subfield, that might not count. Though I agree that of interest is a slippery notion.
  4. The field finds other things more worth studying. True for Ind. Inf. as learning theorists got into other things. Somewhat true for CTCT, as concrete models, PCP, and quantum crowd the field out.

Wednesday, May 05, 2010

Ralph Kramden: Your wait is over! 3D-TV is here!

(This was written before I saw Lance's post on gadgets. This post could be called an unintentional co-post. Is that a word? Now it is!)

In The Honeymooners episode that aired on Oct 1, 1955, Ralph Kramden does not want to buy a TV set because I'm waiting for 3D TV. He could be called the ultimate last blocker.

Well Ralph, your wait is over! 3D TV is here if you have 9000 dollars to spend. And you should look here for more information.
  1. I won't buy it until it's MUCH cheaper and MUCH better. Actually I may not buy it at all. Do I really need it?
  2. Monsters vs Aliens and Ice Age: The Dawn of the Dinosaurs are already out in 3D. The Shrek series should be soon. Some sporting events are being broadcast in 3D. (Information from the most recent Consumer Reports Magazine which had an article on 3D TV.)
  3. In America one needs a cell phone and a computer to function in society (there are exceptions). People selling and improving cell phones and computers are supplying a need. People selling and improving HDTV and 3D-TV are creating demand. Do we really need these things? I would say no. Will our great-great nieces wonder how we ever managed without them? Probably. We wonder how our great-great grandparents managed without TV. Some of us wonder how our grandparents managed without cable.

Tuesday, May 04, 2010

Gadget Love

Many of my fellow CS theorists are surprisingly technophobic. Don't own a cell phone. Begrudgingly got a credit card but still refuse to buy anything online. Still read email by typing "mail" at the unix prompt. Maybe they don't trust technology, but mostly they just can't be bothered--it takes away valuable theorem-proving time.

Then there are those who love to tinker. Spend an hour writing an emacs macro to save ten minutes of manual editing. Write an iPhone app just to see how it is done. I used to be like that.

These days I'm more of a gadget person. An incredible amount of technology put into a little device that does lots of useless things or sometimes one thing very well. And I'm at that point in my life where I can afford them. My wife complains a bit but she realizes that all my gadgets combined are considerably cheaper than a sports car or a mistress.

Some of my favorites (beyond the laptop/Kindle/iPhone/iPad):

TomTom - A GPS combined with some shortest path algorithms can get me from here to there. But it takes a great user interface to make it easy to use. My old TomTom died after many good years. Replaced it with the TomTom app on my iPhone.

Satellite Radio - Commercial-free music. Lots of variety. Great sound that doesn't fade when you leave the city. Worth it for the Met Opera broadcasts alone.

Tivo - I've been time-shifting TV shows since the VCR first came out. But Tivo pioneered the DVR with a great interface that no one else has matched.

Wii - The graphics aren't as cool as the PS3 or Xbox but just pure fun playing with the kids.

I don't really have time to watch that much TV or play the Wii. Too bad.

Squeezebox Duet - Just a simple device that lets me stream music wirelessly into my stereo. I use this device more than the others because I like to work and read to music.

Plenty to keep me happy. At least until those new 3-D TVs come out...

Monday, May 03, 2010

Guest Post on Robin Milner who passed away recently

Robin Milner died on March 20, 2010. For obits see here and here. A review of his most recent book will be in a future SIGACT NEWS book review column; however, you can read it here.

This is a guest post by Rance Cleaveland about Robin Milner.

A REMEMBRANCE OF ROBIN MILNER by Rance Cleaveland.

Robin Milner died March 20, 2010 of a heart attack at age 76. He made numerous contributions in the areas of automated reasoning, programming languages, software verification and concurrency theory and was awarded the Turing Award in 1991. (British readers will also note his election as Fellow of the Royal Society in 1988.) His contributions to the development of the Logic of Computable Functions, the programming language ML, the Calculus of Communicating Systems (CCS) and the pi-calculus are justly revered and amply documented in the many obituaries that populate the media.

I want instead to offer a recollection of my own about Robin, whom I first met in 1987 while I was visiting Edinburgh, where he was at the time a professor. I had just finished my PhD and had arrived in the UK for a two-year postdoc at the University of Sussex, where I was to work on a joint project (the Concurrency Workbench project) involving that university and Edinburgh. I was not quite 26, very green, unpublished at the time, and star-struck at meeting people like Matthew Hennessy (my supervisor at Sussex) and Colin Stirling (another project member at Edinburgh), whose papers I had read during my studies. I could not fathom meeting an Olympian like Robin Milner, and indeed during the initial days meeting at Edinburgh I kept my mouth shut and tried not to succumb to a sense of surreal disconnect.

The meetings came to a close, and the question arose as to where I was to stay that evening. A PhD student at Edinburgh had been drafted to host me, but when Robin heard this, he instead offered accommodation at his house. More comfortable, you can have your own room, he said.

Oh dear.

Of course I accepted his offer, even as the rising thunder in my ears presaged the possibility of a nervous collapse, and we went to his house after a group meal at a restaurant.

Would you like some vodka? he inquired, explaining that a visiting Russian mathematician (recall the Cold War was still ongoing, and meeting a Russian, never mind hosting one in your house, was impossibly exotic to me) had brought it to him a few weeks previously.

Well, yes, I would, thank you very much.

So out came the bottle, and we spent the next two hours talking about process algebra, bisimulations, logical characterizations of system equivalence, you name it. And I began to relax, and enjoy the conversation, because Robin was listening intently, and responding intelligently and respectfully, and offering up intuitions and insights that made concepts I had struggled to understand on my own instantly clear, and even inevitable. And I realized that I could not only follow, but contribute.

The next morning Robin and his wife Lucy made me breakfast, and he and I returned to the university for more meetings, with me leaving that afternoon to return to Sussex. That whole day, though, I remember feeling light as a feather, willing to wheel and dart and engage intellectually with other team members like I had not just the day before. In retrospect, I think I can say I became a scientist that night, sharing a glass of vodka with Robin Milner.

I saw Robin from time to time over the years, every few months during the course of the project, less so as the years passed and my career took its own path. I still can recall with almost crystalline clarity, though, that night, where a great scientist showed a young acolyte one of the greatest kindnesses of all: taking him seriously.

Rest in peace, Robin.