For a long time, people got used to being lazy. If computers become twice as fast every 1.5 to 2 years, there is no point in investing much effort in writing efficient code. If something does not run fast enough, simply wait for the next generation of Intel x86 and everything will be resolved. In particular, CPUs became fast enough that traditional programming languages and efficient data structures and algorithms were being abandoned in favor of high-level scripting languages whose most sophisticated data structure is an associative array. Suddenly, every Joe-hoe could become a programmer developing sophisticated Web applications with no effort, with no need for hard-earned computer science degrees anymore.

Let's set aside the question of whether Moore's law is really coming to an end (Moore's law has had 5-10 years left in it ever since Gordon Moore formulated it). Friedman misses the bigger point: an end to Moore's law would do great harm to computer science.
Friedman's mistake is to assume that people always have the same problems to solve. Suppose computers had stopped getting more powerful fifteen years ago. Machine learning as we know it now would not have been possible. Nor would we have the advances in graphics, robotics, and virtually every other area of computer science. A good deal of CS systems research goes into making these new machines even more efficient, secure, and reliable.
Even in theoretical computer science, the advances in computer technology make our work stand out. As we try to tackle larger problems, asymptotic algorithmic techniques start to dominate the running time. While we can now brute-force solve NP-complete problems on moderate input sizes, as computer performance improves, so does the thirst for solving even larger challenges, and the P versus NP problem becomes a harder monster to slay.
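To see concretely why brute force only reaches moderate input sizes, here is a minimal exhaustive SAT check (a sketch of my own; the clause encoding and function name are illustrative, not from the post). Each additional variable doubles the search space, so a machine that is twice as fast buys you exactly one more variable:

```python
from itertools import product

def brute_force_sat(clauses, n_vars):
    """Try all 2^n assignments; return a satisfying one, or None.

    Each clause is a list of nonzero ints, DIMACS-style:
    i means variable i is true, -i means variable i is false.
    """
    for bits in product([False, True], repeat=n_vars):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return bits
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
print(brute_force_sat([[1, 2], [-1, 3], [-2, -3]], 3))
```

Fine at n = 30 or so; hopeless at n = 100 no matter how many process generations Intel delivers.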
Sure, Joe-hoe can wait a year or two for his application to run fast without new coding ideas. But while he waits, Sam-ham takes advantage of new technologies, making Joe-hoe's applications out of date.
Without the increased power of computers, we all become hackers to make our programs run better and that is not computer science.
1. What if Moore's Law has concluded (the nonlinear advance per year in the number of transistors per chip), but there are continued dramatic increases in power efficiency and corresponding decreases in cost, leading to nonlinear growth per year in the number of deployed chips?
2. It's good to recall that once upon a time, only decidability mattered. What's that you're working on, satisfiability? Not interesting, it's decidable. We can look back and laugh, but who's to say that there won't be some beautiful theory of what you call hackers reducing the constants? Can you prove this won't be the case?
Roy Friedman's argument reminds me of the Broken Window Fallacy.
I don't think the old notion of (practical) efficiency applies anymore anyway. Going forward the only way (or at least maybe the only proper way) to be "more efficient" is to take full advantage of parallel processing, multicore processors, etc. That means writing programs that are multi-threaded from day one.
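A minimal sketch of the "multi-threaded from day one" structure this comment describes, assuming Python: partition the input, farm the chunks out to a worker pool, and combine the results. (One caveat the hedge deserves: CPython's GIL means threads only overlap I/O-bound work; a CPU-bound program would swap in `ProcessPoolExecutor` for true parallelism.)

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_sum_squares(chunk):
    # Work on one independent slice of the data.
    return sum(x * x for x in chunk)

def parallel_sum_squares(data, workers=4):
    """Partition the input and combine per-chunk results from a worker pool."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # CPython's GIL limits CPU parallelism for threads; use
    # ProcessPoolExecutor instead for real number crunching.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum_squares, chunks))

print(parallel_sum_squares(range(1000)))  # prints 332833500
```

The important design point is the first line of the function, not the executor: data partitioned into independent chunks from day one is what lets the same program later exploit more cores without a rewrite.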
Robert E Bixby's article "Solving Real-World Linear Programs: A Decade and More of Progress" (Operations Research, 2002) surveys the balance between "Moore's Law" (that is, exponential) progress in computing power versus progress in algorithmic efficiency, and concludes that the two exponents are approximately equal. Good.
More broadly, exponential technical progress is seen across broad spans of STEM capabilities that include:
• algorithmic efficiency (see above),
• computational speed (see above),
• computational energy efficiency,
• channel capacity of data transmission, and
• channel capacity of data acquisition/sensing.
An advance in any one of these areas helps all the others; conversely (but slightly less obviously), obstructions and imbalances in any one of these areas act, in the long run, to hinder all the others.
In my own field of medicine, for example, progress is presently slowed by an imbalance between a burgeoning gene sequencing capacity — with costs dropping 10^6-fold in a generation — versus essentially fixed costs associated with data acquisition relating to the epigenomic mechanisms that regulate gene expression. To the extent that the economic and social vigor of the biomedical sector of the STEM community is compromised by the roadblock, everyone is adversely impacted (not just biomedical researchers, physicians, and their patients).
Recalling that Aergia was the Greek goddess of indolence: key STEM sectors that once were perceived as holding unbounded Moore's Law potential, but that nowadays are perceived as afflicted (temporarily?) by Aergia Syndrome, readily come to mind:
• epigenomic molecular biology,
• regenerative medicine,
• efficient code-writing,
• practical quantum computers,
• rigor within the Complexity Zoo,
• viable fusion power, and/or
• safe nuclear power, and/or
• cheap photovoltaic power, and/or
• "clean coal."
However, Aergia Syndrome is perhaps most strikingly evident in STEM education. In recent decades the perceived value of a STEM education, relative to the costs and duration of that education, has not increased at anything approaching a Moore's Law exponential rate. One wonders: why not?
Why Aergia Syndrome is increasingly prevalent in STEM education is a tough, deep, vital question … and yet in-depth analyses of it are surprisingly hard to find.
Friedman is complaining that the advances in computer technology have been traded for more abstract general-purpose languages. This has been fantastic for theoretical computer science. The lost skills of hand-tuning exquisite data structures were the antithesis of the asymptotic-minded theoretical approach.
Your dismissive attitude towards what you call "hacking" is disturbing. You sir are a "CS" snob.
"Without the increased power of computers, we all become hackers to make our programs run better and that is not computer science."
Because shaving a factor of log log n off the running time of an algorithm *is* computer science...
Is getting constants down "hacking"? Because if it is, then a bunch of "theory" work on median finding needs to get demoted.
@ Previous anons: No one said anything about constants and log log factors; you are the ones who brought all that up. Lance's comment is no more snobbish than Dijkstra's famous quote, "Computer science is no more about computers than astronomy is about telescopes." Computer science is the study of problem-solving techniques, and this is distinct from computer engineering. Clearly there is a spectrum, much work falls somewhere in between, and some exceptional work isn't well described this way at all. But that doesn't make it snobbish to draw a distinction between computer science and computer engineering.
@Anon 1: It isn't feasible to give a tight analysis of an algorithm down to the level of constants for problems significantly more complicated than median finding. It also isn't particularly meaningful, since that detail is lost in Church-Turing equivalences. If you can write a paper that gets down to the level of constants, is interesting at a high level, and isn't just a tome of tedious detail, people may still read it. But it's a tall order. I don't think pinning down runtime constants is likely to be the next frontier, as you seem to suggest.
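For readers who haven't seen it, the median-finding work this thread keeps mentioning is the classic deterministic selection algorithm, whose whole point is a constant-factor analysis: it achieves worst-case linear comparisons by pivoting on the median of the medians of groups of five. A compact, unoptimized Python sketch (names mine):

```python
def mom_select(a, k):
    """Return the k-th smallest element (0-indexed) of a, worst-case O(n).

    Blum-Floyd-Pratt-Rivest-Tarjan selection: the median-of-medians
    pivot guarantees both sides of the partition shrink geometrically.
    """
    if len(a) <= 5:
        return sorted(a)[k]
    groups = [sorted(a[i:i + 5]) for i in range(0, len(a), 5)]
    medians = [g[len(g) // 2] for g in groups]
    pivot = mom_select(medians, len(medians) // 2)
    lo = [x for x in a if x < pivot]
    hi = [x for x in a if x > pivot]
    n_eq = len(a) - len(lo) - len(hi)  # elements equal to the pivot
    if k < len(lo):
        return mom_select(lo, k)
    if k < len(lo) + n_eq:
        return pivot
    return mom_select(hi, k - len(lo) - n_eq)

print(mom_select([9, 1, 8, 2, 7, 3, 6, 4, 5], 4))  # prints 5, the median
```

The "groups of five" constant is exactly the kind of detail being argued over: groups of three break the linear-time guarantee, and a long line of papers has fought over the exact comparison count.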
Moore's law is very good for CS. The most important effect is probably not increasing the maximum speed to solve a problem, but decreasing the minimum price. Remember that in the 1960s, there were a host of "expensive" programs at the cutting edge: the expensive typewriter, the expensive desk calculator, the expensive planetarium, the expensive tape recorder. Many things were just barely possible on the computers of the 1960s, but economically indefensible.
As the cost of computers and computation comes down, more things get turned into computer science problems. Let the boring problems get solved through better hardware, they keep making more of the interesting problems.
I find your negative connotation about hackers to be frustrating. It is this hacking which has led to many improvements in the CS world.
It's a funny question to ask, whether a law is good or bad for a field. Is gravity good for physics? Even if you said "no" (and that would be an interesting defense!), it's not like you could do anything about it.