We lost another computing pioneer of a very different kind. Dennis Ritchie, who developed C and co-developed Unix, passed away last weekend. Ritchie and Ken Thompson received the 1983 Turing Award for their development of Unix.
In the early 80's I programmed extensively in assembly language on the Apple II and IBM 370. In both cases we needed speed a high-level language couldn't get us. C and Unix let us take advantage of high-level constructs and abstraction and yet retain the full power of the underlying machine. Ritchie changed the way we did computing.
If there is a single moment that captures both Jobs and Ritchie, it is 2002, when the Mac moved to the Unix-based OS X, from which Apple's iOS was later derived. As you play with the new iOS 5, you can't help but think of these great pioneers who made it possible.
I obviously regard him on an entirely different level than Steve Jobs, on purely theoretical grounds. He is of a completely different calibre ... a few clouds up!
It is saddening to see him leave this planet. That said, nothing beats von Neumann. Had you told me back in the day that von Neumann had died ... that would have been reason for me to shed a tear or two. What I would give to talk to von Neumann in person....
I agree with the previous commenter: I would never compare Ritchie to Jobs. Ritchie had a much greater impact on our lives than Jobs did, but most people will never know it, because he wasn't selling toys to kids and was not the businessman/showman the media cares about. RIP Ritchie.
Please, stop talking about Steve Jobs. This is an insult to Dennis Ritchie. Also, this is a computer science blog, not a blog about business & marketing.
Isn't one of the big draws of Macs that they make a Unix-like operating system user friendly?
P.S. This is a discussion for all of TCS. Sorry to find out that there is so much censorship of comments.
Censorship of comments?
ReplyDeleteLance and I let almost everything through- so what
are you referring to? You can email me privately
if a comment was blocked that you think shouldn't have
been.
OR are you referring to COMMENTS that try to dictate
what other comments should be allowed. That's not censorship,
that's just being a CENSORED.
I too don't think it's fair to compare one of the great computer scientists to a businessman. The kind of attention and talks/articles/etc. that was given to Steve Jobs was on many occasions way over the top and ridiculous. There was an interview the other day on one of the major radio stations here in which Steve Jobs was called the "Albert Einstein" of computer science, and it appears most people really do think that way. He sure was a great businessman (though from what I have heard and read I am not sure he was a great person to work with or deal with).
One disappointment for me is how little high-level programming languages seem to have progressed since the 80s.
ReplyDeletehttp://blog.oddhead.com/2010/07/19/higher-level-programming/
One disappointment for me is how little high-level programming languages seem to have progressed since the 80s.
Oh brother, don't even get me started.
High-level programming language research suffered three big shocks in the late 70s from which it hasn't yet fully recovered:
First was the discovery that syntactic sugar can be bad, which somehow hardened into the blanket claim that "all syntactic sugar is bad". For over two decades any improvement to a language was dismissed with "that's just syntactic sugar".
Second, they discovered that type checking saves lives (OK, removes some bugs) and has a rich theory behind it. So the false equation PL research == Type Theory took hold.
Last, the field discovered functional languages, with all their elegance, beauty, and impracticality.
The rest, as they say, is history. This is the main reason why, compared to before, nearly no successful languages have come out of academia since Pascal: while academics laboured over automated type checking in ML (which no one in industry really wants), Stroustrup observed that C needed objects, Larry Wall realized that programming languages needed to support strings as first-class citizens, and van Rossum observed that one could merge the best parts of the functional programming paradigm into a language with imperative-style syntax.
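A toy illustration of that last point, sketched in Python (the names here are made up for the example): the same computation written with explicit functional combinators and with the comprehension syntax most working programmers actually reach for.

    # Functional style: first-class functions and combinators.
    words = ["ritchie", "thompson", "kernighan"]
    lengths = list(map(len, filter(lambda w: w[0] in "kr", words)))

    # The same idea in Python's imperative-flavoured comprehension syntax.
    lengths_sugared = [len(w) for w in words if w[0] in "kr"]

    assert lengths == lengths_sugared == [7, 9]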
Thanks, Alex. Very insightful. Maybe there will be a comeback someday inside academia.
On my first day on the job at Bell Labs in 1992, I had some files I needed to copy to my workstation from a 5.25-inch MS-DOS floppy disk. I asked if there was a DOS machine around, and was told there was one upstairs in "the Unix Lab". I was green enough not to realize they meant *the* Unix Lab.
I walked one floor up, found my way to a room with a lot of machines, and asked the guy there if they had a DOS machine I could use. He said yes, and I said "Great, by the way, I'm Dave Lewis" and he said "Hi, I'm Dennis Ritchie". So yes, the first words I ever said to the co-inventor of Unix were to ask where the DOS machine was. :-)
A gracious and brilliant man, and I'm sad to hear of his end.
Steve Jobs was no Dennis Ritchie. Dennis was a humble man who worked in the shadows and changed the world profoundly. Jobs was an arrogant showboater who got rich from other people's work. I know people love their macs and their jesus phones, and I won't begrudge them that, but please, please, please, don't soil the contributions of this great man by comparing him to Steve Jobs.
Actually, there is one common feature of Ritchie and Jobs: they both had fantastically good taste.
Steve Jobs did not invent anything, but he presented existing technology in a wrapper that made it irresistibly attractive. As others have noted, the Mac interface was a version of the Alto from Xerox's Palo Alto Research Center; Pixar was successful because Steve Jobs understood exactly how much the techniques of computer graphics had matured; etc.
His contribution was to select the pieces and put them together in an elegant and beautiful way.
Ritchie was, of course, a hacker's hacker, but he also had fantastic taste. The C language is essentially a descendant of the British BCPL language, designed for systems programming. Unix is a scaled-down rethinking of the Multics operating system developed at MIT. Ritchie's genius was in knowing what to select, and in his ability to get an efficient and wonderful implementation working incredibly fast. Very likely, his programming brilliance offered a good guide to what features were essential, what could be implemented well, and what would be useful to programmers.
A huge difference between the two is that we know the guiding principles that determined Ritchie's choices. The "Unix philosophy" has been extensively and eloquently explained by Ritchie and his collaborators. We can all learn from it. In contrast, Steve Jobs' choices and his taste died with him; this is why some analysts worry about the future of Apple.
Per Steve Yegge's recent samizdat memo, was Ritchie's C-philosophy "Amazonian", and Jobs' Mac-philosophy "Googleian"?
And is a really good new mathematical theorem/algorithm more naturally appreciated as a Ritchie-style "platform" or as a Jobs-style "service"?
One should question our capitalist system after noting that Ritchie contributed to technology at least as much as Jobs did, but probably wasn't even a millionaire in comparison to Jobs's billions.
Is that really fair? Best for society?
Alex, you are certainly free to stick with languages such as C, but you shouldn't blithely dismiss the useful programming languages research that has occurred over the past 30 years.
1. Syntactic sugar isn't bad, but it is more the domain of industry and hobbyists than of serious academics. Should reformatted papers be accepted to theory conferences?
2. Type systems are the true origin of language features, and as such they are the backbone of PL research. See Bob Harper's book for an elaboration:
http://www.cs.cmu.edu/~rwh/plbook/book.pdf
Perhaps you haven't come into contact with a type system that ensures your program is correctly distributed across multiple computers, or one that verifies the correctness of your proofs or generates a substantial part of your program for you, but I think type systems are quite useful.
3. Saying functional programming languages are 'impractical' is uttering an insult that belongs in the '90s. Many industrial-strength functional programming languages are used in large-scale applications. The web is teeming with Javascript programmers who indulge in CPS. If you aren't keen on functional programming, you are unlikely to land a job at any top tech firm.
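For readers who haven't met continuation-passing style (CPS), here is a minimal sketch, written in Python rather than Javascript purely for illustration: instead of returning a value, each function takes an extra "continuation" argument and hands its result to it, which is essentially what callback-heavy Javascript code does.

    # Direct style: the result is returned to the caller.
    def add(a, b):
        return a + b

    # Continuation-passing style: the result is passed to a callback k
    # instead of being returned, much like callback-based Javascript.
    def add_cps(a, b, k):
        k(a + b)

    def square_cps(x, k):
        k(x * x)

    # Compute (2 + 3) ** 2 by threading continuations explicitly.
    add_cps(2, 3, lambda s: square_cps(s, print))  # prints 25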
1. Syntactic sugar isn't bad, but it is more the domain of industry and hobbyists than of serious academics.
This quite nicely proves my point that PL academics dismiss real-life improvements under the "syntactic sugar" label.
Should reformatted papers be accepted to theory conferences?
In fact your opinion of syntactic sugar is so low that you compare it with reformatting!
In reality, some syntactic sugar is bad (e.g. COBOL) and some syntactic sugar is good (e.g. templates). As such, determining which is which, far from being a reformatting exercise, is very much a valid object of scientific inquiry.
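To make that concrete, here is a small Python sketch (the logged helper is just an illustrative name): decorator syntax adds nothing you couldn't already write by hand, yet whether that sugar helps or hurts readability is exactly the kind of design question worth studying.

    def logged(f):
        # An ordinary higher-order function; nothing decorator-specific here.
        def wrapper(*args, **kwargs):
            print("calling", f.__name__)
            return f(*args, **kwargs)
        return wrapper

    # Without the sugar: wrap and rebind by hand.
    def area(w, h):
        return w * h
    area = logged(area)

    # With the sugar: identical semantics, intent stated up front.
    @logged
    def volume(w, h, d):
        return w * h * d

    area(2, 3)       # prints: calling area
    volume(2, 3, 4)  # prints: calling volume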
but I think type systems are quite useful.
I never said otherwise. I took issue with equating type theory with PL research. The subset relationship Type Theory ⊊ PL, on the other hand, definitely holds. In fact, I just finished teaching an entire course on type theory as it relates to programming languages.
Saying functional programming languages are 'impractical' is uttering an insult
Let's keep personal terms like "insult" out of this discussion and argue the subject on its technical merits, ok?
Functional languages are practical insofar as they are not at all like the ones in academia; that is, when they have imperative-style syntax and make heavy use of state.
I don't think that PL academics ignore syntactic sugar. Ergonomics are important. But the study of such things is best conducted in the marketplace and throughout the development of vast software systems. Professionals and hobbyists are certainly capable of thinking scientifically. Their process has advanced us far from UltraVerboseJavaClassName.
I consider radically new approaches, such as visual programming, to be academic topics, but more in the domain of HCI.
Design and presentation are very important topics. Many students will spend entire semesters on the layout of a single document. However, such work doesn't belong at STOC.
Technically, type theory is a central concept in the design and implementation of programming languages. I would put it on the level of optimization in theory.
Would you consider Haskell or Scala to be academic functional languages? They do not really have "imperative" syntax, but they do involve a fair amount of state. Even systems like Isabelle and Coq are being used for industrial verification.
Note: I study machine learning, not PL.
We are straying from the subject of this blog, so this is my last post on the issue. There are two aspects to my claim:
(1) PL people do this, and
(2) it has had bad consequences.
So far the responses agree that syntactic sugar is looked down on as unworthy of study, that type theory is the be-all and end-all of PL, and that functional languages are common in academia and really nice.
The only thing that remains to be argued, then, is whether this focus has been to the detriment of advances in programming languages coming from academia.
In my opinion, the marginal improvements of PLs in the field, as pointed out by David Pennock, as well as the limited impact of academic research on day-to-day PL advances, are quite telling, but I'll let the readers reach their own conclusions.
I think your view is somewhat extreme. Lots of PL research is devoted to "nice" ways of writing things; take a look at the ICFP proceedings on, e.g., parser combinators. Type theory is a central concept, but it's not the be-all and end-all. Pure functional programming isn't especially common in industry, but neither is asymptotic algorithm analysis. Elements of functional programming are just as common as boilerplate algorithm design.