I was wondering if you might comment on the history behind the formation of computer science, and in particular, on the subsequent overwhelming emphasis on implementation (studied via science/math), rather than specification (perhaps better studied as an art?).

Would you make the same arguments about physics or biology? Computer science is foremost a science, and trying to understand the nature of computation has its own beauty, just like trying to understand the fundamental building blocks of the universe.

One might argue that two fields should have formed, analogous to architecture and engineering, say.
Sure, the implementation part is important (e.g., efficiency), but what to implement should be equally important.
Sure, there are a few subfields of computer science where people try to come up with new sorts of applications, but I think this is worthy of much greater emphasis: a different undergraduate program, a different research methodology (perhaps not so "scientific," but more like "art"). Moreover, I suspect that completely different sorts of people would be attracted to such a field.
The MIT Media Lab is perhaps the best example of what such a field would look like.
Can one teach "what to implement" any more than an art class can teach "what to paint"? Best for us to teach the theory and tools of computation and then let the world find neat ways to use computers as they already have in countless ways.
I would say perhaps art is not the right example. I am not so sure that painting classes teach less about tools and more about which landscape to draw. On the other hand, "Mathematics is actually an aesthetic subject almost entirely" (John H. Conway), and like art it teaches tools and their classical usage (the analysis of van Gogh).
The partition between architecture and engineering is artificial and backward. It is appalling to hear architects talk about designing a building and then going to the engineer to ask if it can be built. Safety and cost are important parts of a building design, and there is no justification for an architect designing a building with purely functional/aesthetic concerns in mind while ignoring those two other "minor" ones.
Also, I'd hesitate to call the MIT Media Lab the "best example" of anything. Its main means of publication is the press release, and in no small part because of this, the serious group within the lab (as opposed to the hype-driven vaporware rest) splintered off a few years ago. This was documented in a magazine article at the time.
This dichotomy is false. Building anything successful involves creativity in design but also the ability to analyze (formally or not) what one is doing.
Knuth called his master work on design and analysis in Computer Science "The Art of Computer Programming" for good reason.
Computer Science is as much about doing computation as it is about understanding the nature of computation. We know nothing of the nature of computation, anyway.
ReplyDeleteTake a look at Hugo Liu's papers to get some idea of the sorts of things the MIT media lab does:
http://web.media.mit.edu/~hugo/publications/
Believing in these sorts of dichotomies has hurt physics, in particular by making it harder for physicists to imagine the idea of quantum computing, and then for a while to define it more as an engineering curiosity than as a fundamental problem (about, say, the nature of entanglement or information).
In architecture the dichotomy is not as bad as anonymous #2 implies, other than a few famous exceptions like Gehry. My mom is an architect and routinely does the basic calculations herself without involving engineers (and this is standard among architects). Large projects are done as collaborations between architects and engineers; of course each one says the others don't appreciate their perspective sufficiently.
In CS, of course the central question is "what can we do with computers that we haven't imagined yet?" I think the practical/theoretical division is better drawn between tasks where we can expect private industry/individuals to do a good job, and tasks where university/gov't research is needed.
The MIT Media Lab is perhaps the best example of what such a field would look like.
As a previous commenter mentioned, the Media Lab has employed serious people, but its hype about "inventing the future" was already a joke ten years ago. From The Onion: Experts Predict On-Line World Of 21st Century To Feature More Breasts
The MIT Media Lab has some serious, serious problems, but there are some very good researchers there -- I don't know all of them, but Ros Picard's work, in my area, is regularly interesting and generally very high-quality.
Ros goes to academic conferences. Many Media Lab researchers don't -- and that, combined with the Media Lab's long-standing habit of internal hiring, has led to some *very* disconnected people. I saw an appallingly bad job talk by a Media Lab grad student not long ago which showed complete ignorance of any work after 1965 that hadn't been conducted at the Media Lab. Eeek!
The MIT Media Lab may have problems, but it best demonstrates the sort of field that I have in mind where brainstorming of new kinds of applications is the main focus.
Of course, industry is doing that anyway, but that does not imply that universities should not be involved.
BTW, you may find this interesting:
http://www.research.ibm.com/journal/sj/393/haase.html
it best demonstrates the sort of field that I have in mind where brainstorming of new kinds of applications is the main focus.
The MIT Media Lab brainstorms mostly sexy-sounding press releases, with little attention to actual realizability. How much pre-planning and engineering do you think went into the "$100 crank-powered laptop" before it was announced as a done thing?
If the Media Lab is truly the best example of what you mean, then I want no part of it.
The MIT Media Lab brainstorms mostly sexy-sounding press releases, with little attention to actual realizability. How much pre-planning and engineering do you think went into the "$100 crank-powered laptop" before it was announced as a done thing?
Not all work done at the MIT Media Lab is interesting. Why is this surprising?
Does this imply that research groups should not try out hundreds of wild ideas in the hope that a few of them will be compelling in some way?
Not all work done at the MIT Media Lab is interesting. Why is this surprising?
Much of what is done at the MIT Media Lab wouldn't even qualify as work, much less of the interesting variety. That is the distinction that seems to escape you.
Gullible people buy into the hype; more seasoned professionals know exactly how scientific the "contributions" from the Media Lab really are.
Gullible people buy into the hype; more seasoned professionals know exactly how scientific the "contributions" from the Media Lab really are.
The field I have in mind would not be a science at all, actually, but more like an art.
So looking for scientifically rigorous contributions would not be the point here.
Of course, I agree that hyping something trivial is a bad idea.
Lance sez: Best for us to teach the theory and tools of computation ... It's amazing to me that Lance---and most of the complexity theory community---assumes that "teaching the theory and tools of computation" is a feasible goal, given the exponentially growing complexity of the complexity literature ... and of all branches of scientific literature.
There are wonderful opportunities here. E.g., if a market is open and free, but its efficient operating point is NP-hard to compute, then is it free in any meaningful ethical sense?
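To make "NP-hard to compute" concrete, here is a minimal sketch that reads "efficient operating point" as the welfare-maximizing allocation in a tiny combinatorial market; winner determination for such markets is a classic NP-hard problem (equivalent to weighted set packing). The bids and prices below are hypothetical, chosen purely for illustration:

    from itertools import combinations

    # Hypothetical bids, for illustration only: (bundle of goods wanted, price offered).
    bids = [({"a", "b"}, 5), ({"b", "c"}, 4), ({"c"}, 2), ({"a"}, 3)]

    def best_allocation(bids):
        """Brute-force the welfare-maximizing set of non-conflicting bids."""
        best_value, best_subset = 0, ()
        for r in range(1, len(bids) + 1):
            for subset in combinations(bids, r):
                goods = [g for bundle, _ in subset for g in bundle]
                if len(goods) == len(set(goods)):  # no good sold twice
                    value = sum(price for _, price in subset)
                    if value > best_value:
                        best_value, best_subset = value, subset
        return best_value, best_subset

    print(best_allocation(bids))  # value 7: bundles {a, b} and {c} win here

With four bids the search is instant, but it enumerates all 2^n subsets of bids, so at the scale of any real market exhaustive search is hopeless. That is the sense in which a market's efficient operating point can be computationally out of reach even when the market itself is open and free.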
In the words of Kent Brockman: I for one welcome our new algorithmic overlords!
Note: the "algorithmic overlords" link includes an in-retrospect-hilarious quote: "The keynote address by Jeffrey K. Skilling, President of Enron, featured an intense review of the critical role financial engineering plays in shaping this evolving energy company."
More seriously, tens of thousands of people were impoverished because Mr. Skilling's investments were NP-hard to analyze. The close relationship between complexity, freedom, and ethics deserves closer attention, IMHO.
It is certainly true that during periods of time it was famously caught up in the hype cycle,
I appreciate your candor, and knowing that the Media Lab has a person who acknowledges the Lab's past mistakes bodes well for it.
but I believe that this characteristic is over-hyped and out of date itself.
I'm willing to give the Media Lab another chance, but frankly the entire $100 laptop introduction used too many of the old-school hype tactics. Nine months after being "introduced," Mr. Negroponte announced a set of "specifications" which are in reality nothing but desiderata.
Is there a need for a cheap laptop in the developing world, or would a library with PC workstations be better? Can an LCD alone be purchased for less than $100? Can a laptop be powered via wind-up, or are the strains on the crank too high? Is $100 the critical price point? Why not $200?
Those are the questions that a scientist/engineer would ask and at least partially answer before issuing a single press release.
Janos Simon: the Enron scandal did not go on because their optimizing algorithms were NP-hard to analyze (whatever that is supposed to mean). Those guys were crooks, cooked their books, and the accountants did not do their job. The trickery was not hard to unmask, and in fact the Wall Street Journal did exactly that.
Janos, you might have reached a different conclusion had you applied "engineering-style" complexity theory---as mentioned in the original post---to the analysis of the Enron debacle!
If we start with the anthropic cognition models of, e.g., Frans de Waal's Chimpanzee Politics (which IMHO should be required reading for every young complexity theorist), and we conjoin these cognitive constraints with the Jeffersonian ideal that all citizens have the right to be informed, and thus the conjoined rights to an equal say in the government and to equal opportunity in the economy, then does it not follow that an excessively complex economy---or academic literature!---becomes algorithmically unjust, in the Jeffersonian sense?
And therefore, does it not also follow that the Enron debacle reflected not only the inherent weaknesses of human nature, but a failure of applied complexity theory in the political and economic sphere, specifically, a failure to devise checks and balances sufficient to rein in these weaknesses?
From this point of view, a modern reading of the Federalist Papers sees them as a master's exercise in applied complexity theory, the objective being a social order that is just, secure, prosperous ... and sufficiently simple for these concepts to have individual meaning. Which, when you think about it, is a pretty amazing achievement in applied complexity theory!
To quote mathematician Andrew Hodges' review in the August 2006 Scientific American (p. 97): "[These considerations] spotlight questions that are erased in formal papers and narrow research-group training. They are full of interest for anyone in the business of showing reality."
To which I may add, they have application not only in showing reality, but in constructing reality ... including its pragmatic elements of food, clothing, shelter, and individual liberty.
Hopefully, most courses in complexity theory will attract at least some young students who are, in Hodges' words, "full of interest in the business of reality." IMHO, their interests should not be wholly neglected, so that no matter where in the world the class is being held, we may say to these students "You are young, and vigorous, and your services as informed citizens will be necessary to the peace and prosperity of the world."
It's no joke living on a planet with six-to-ten billion recently-evolved chimpanzees ... even if you're one of them (as we all are)!
I am a little mystified by some of these arguments. Of course it is part of Computer Science to look for novel uses of computation, and, of course, that is often an art. In fact, as Google has shown, it can be a very profitable art.
There is ample opportunity to pursue novel ideas in academic settings (again, look at Google, or network protocols, or HTML for some success stories). It is unclear how much one can teach creativity, other than to provide a milieu where it can flourish. Many academic programs do provide that.
At the same time, academic programs should do a good job of teaching what we already know, providing creative people with the tools of the profession.
Can you point me to a conference or journal exclusively devoted to new application ideas -- including novel ideas not yet validated in a scientific way?
Also, creativity isn't entirely a magical thing. It requires extensive brainstorming experience and multidisciplinary knowledge. Both of those things can be part of an academic program.
The situation in computer science is similar to a music industry where everyone learns to build sound equipment, and on the side, a few engineers decide to compose some music as well!
Of course, you need a separate academic program for music composition, even though there is a creative component that is hard to teach. And of course, this program will attract entirely different sorts of people from those interested in building sound equipment.
To make this discussion a bit more concrete, take a look at these courses from the MIT Media Lab:
http://smg.media.mit.edu/classes.html
Can you point me to a conference or journal exclusively devoted to new application ideas -- including novel ideas not yet validated in a scientific way?
ReplyDeleteThe "Fringe" sessions of ACM CHI is one such place -- DIS (Design of Interactive Systems) is another such place...
The "Fringe" sessions of ACM CHI is one such place -- DIS (Design of Interactive Systems) is another such place...
I know about alt.chi and OOPSLA's Onward! track. Those are certainly steps in the right direction.
However, I still believe that this sort of thing is worthy of a completely separate field in the arts.
What's the point of arguing over whether CS should be one field or two? Do whatever sort of work you like, and convince others to join you. If you build up a community that's large enough and different enough from mainstream CS, it will automatically be viewed as a new field. If you don't, then nobody's ever going to accept an abstract argument for your new field. (That's how academia works: new fields of study are recognized only after they become popular, not in anticipation of future popularity.)
What's the point of arguing over whether CS should be one field or two? ...
It matters because the proposed field would be in the arts. Papers would not necessarily make any scientific contributions. Consequently, it would be difficult for people in CS to do this sort of research and be rewarded for it.
Anonymous posted: That's how academia works: new fields of study are recognized only after they become popular, not in anticipation of future popularity.
Anonymous makes a great point that is well borne out by history.
The process of creating new disciplines in science and engineering is slow: within our University of Washington College of Engineering, the past 30 years have seen precisely one new department created (Bioengineering) and one old department eliminated (Nuclear Engineering).
The UW School of Medicine is very different: new departments come and go like tropical fish on a reef--so much so that boundaries between medical disciplines are dissolving!
IMHO, the UW School of Medicine's rapid evolution of academic diversity is most easily understood via Jared Diamond's world-view: it is being driven by access to an important new resource, that resource being the informatic flood resulting from modern genomic and proteomic instrumentation.
E.g., at present around a million new gene sequences are cataloged every day; the need to exploit these resources is creating new medical disciplines at a furious rate (and needless to say, these new disciplines have a large algorithmic and complexity theory component).
For those of us working to establish quantum system engineering (QSE) as a stand-alone academic discipline, the lesson is clear: the fast path is to link QSE to the creation of new resources. These new resources can be mathematical and/or scientific and/or engineering, but in all three cases, they will gain vigor most rapidly by engaging the exploding diversity of biology and medicine.