Fuck you apple

Started by Josquius, November 09, 2011, 03:43:30 AM

Josquius



Quote
The tributes to Dennis Ritchie won't match the river of praise that spilled out over the web after the death of Steve Jobs. But they should.

And then some.

"When Steve Jobs died last week, there was a huge outcry, and that was very moving and justified. But Dennis had a bigger effect, and the public doesn't even know who he is," says Rob Pike, the programming legend and current Googler who spent 20 years working across the hall from Ritchie at the famed Bell Labs.

On Wednesday evening, with a post to Google+, Pike announced that Ritchie had died at his home in New Jersey over the weekend after a long illness, and though the response from hardcore techies was immense, the collective eulogy from the web at large doesn't quite do justice to Ritchie's sweeping influence on the modern world.

Dennis Ritchie is the father of the C programming language, and with fellow Bell Labs researcher Ken Thompson, he used C to build UNIX, the operating system that so much of the world is built on -- including the Apple empire overseen by Steve Jobs.

"Pretty much everything on the web uses those two things: C and UNIX," Pike tells Wired. "The browsers are written in C. The UNIX kernel — that pretty much the entire Internet runs on -- is written in C. Web servers are written in C, and if they're not, they're written in Java or C++, which are C derivatives, or Python or Ruby, which are implemented in C. And all of the network hardware running these programs I can almost guarantee were written in C.

"It's really hard to overstate how much of the modern information economy is built on the work Dennis did."

Even Windows was once written in C, he adds, and UNIX underpins both Mac OS X, Apple's desktop operating system, and iOS, which runs the iPhone and the iPad. "Jobs was the king of the visible, and Ritchie is the king of what is largely invisible," says Martin Rinard, professor of electrical engineering and computer science at MIT and a member of the Computer Science and Artificial Intelligence Laboratory.

"Jobs' genius is that he builds these products that people really like to use because he has taste and can build things that people really find compelling. Ritchie built things that technologists were able to use to build core infrastructure that people don't necessarily see much anymore, but they use everyday."

From B to C

Dennis Ritchie built C because he and Ken Thompson needed a better way to build UNIX. The original UNIX kernel was written in assembly language, but they soon decided they needed a "higher level" language, something that would give them more control over all the data that spanned the OS. Around 1970, they tried building a second version with Fortran, but this didn't quite cut it, and Ritchie proposed a new language based on a Thompson creation known as B.

Depending on which legend you believe, B was named either for Thompson's wife Bonnie or BCPL, a language developed at Cambridge in the mid-60s. Whatever the case, B begat C.

B was an interpreted language -- meaning it was executed by an intermediate piece of software running atop a CPU -- but C was a compiled language. It was translated into machine code, and then directly executed on the CPU. But in those days, C was considered a high-level language. It would give Ritchie and Thompson the flexibility they needed, but at the same time, it would be fast.

That first version of the language wasn't all that different from C as we know it today -- though it was a tad simpler. It offered full data structures and "types" for defining variables, and this is what Ritchie and Thompson used to build their new UNIX kernel. "They built C to write a program," says Pike, who would join Bell Labs 10 years later. "And the program they wanted to write was the UNIX kernel."

Ritchie's running joke was that C had "the power of assembly language and the convenience of ... assembly language." In other words, he acknowledged that C was a less-than-gorgeous creation that still ran very close to the hardware. Today, it's considered a low-level language, not high. But Ritchie's joke didn't quite do justice to the new language. In offering true data structures, it operated at a level that was just high enough.

"When you're writing a large program -- and that's what UNIX was -- you have to manage the interactions between all sorts of different components: all the users, the file system, the disks, the program execution, and in order to manage that effectively, you need to have a good representation of the information you're working with. That's what we call data structures," Pike says.

"To write a kernel without a data structure and have it be as consist and graceful as UNIX would have been a much, much harder challenge. They needed a way to group all that data together, and they didn't have that with Fortran."

At the time, it was an unusual way to write an operating system, and this is what allowed Ritchie and Thompson to eventually imagine porting the OS to other platforms, which they did in the late 70s. "That opened the floodgates for UNIX running everywhere," Pike says. "It was all made possible by C."

Apple, Microsoft and beyond

At the same time, C forged its own way in the world, moving from Bell Labs to the world's universities and to Microsoft, the breakout software company of the 1980s. "The development of the C programming language was a huge step forward and was the right middle ground ... C struck exactly the right balance, to let you write at a high level and be much more productive, but when you needed to, you could control exactly what happened," says Bill Dally, chief scientist of NVIDIA and Bell Professor of Engineering at Stanford. "[It] set the tone for the way that programming was done for several decades."
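
A minimal sketch of that balance, with made-up code rather than anything from Bell Labs: the sum() loop reads like ordinary high-level code, while the char-pointer cast at the end inspects the same memory byte by byte, the sort of exact control Dally describes:

#include <stdio.h>

/* High level: a plain counted loop over an array. */
static int sum(const int *values, int n)
{
    int total = 0;
    for (int i = 0; i < n; i++)
        total += values[i];
    return total;
}

int main(void)
{
    int data[4] = { 1, 2, 3, 4 };
    printf("sum = %d\n", sum(data, 4));

    /* Low level: view the same array as raw bytes through a char pointer. */
    const unsigned char *bytes = (const unsigned char *)data;
    printf("first byte of data[0] = 0x%02x\n", bytes[0]);
    return 0;
}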

As Pike points out, the data structures that Ritchie built into C eventually gave rise to the object-oriented paradigm used by modern languages such as C++ and Java.

The revolution began in 1973, when Ritchie published his research paper on the language, and five years later, he and colleague Brian Kernighan released the definitive C book: The C Programming Language. Kernighan had written the early tutorials for the language, and at some point, he "twisted Dennis' arm" into writing a book with him.

Pike read the book while still an undergraduate at the University of Toronto, picking it up one afternoon while heading home for a sick day. "That reference manual is a model of clarity and readability compared to later manuals. It is justifiably a classic," he says. "I read it while sick in bed, and it made me forget that I was sick."

Like many university students, Pike had already started using the language. It had spread across college campuses because Bell Labs started giving away the UNIX source code. Among so many other things, the operating system gave rise to the modern open source movement. Pike isn't overstating it when he says the influence of Ritchie's work can't be overstated, and though Ritchie received the Turing Award in 1983 and the National Medal of Technology in 1998, he still hasn't gotten his due.

As Kernighan and Pike describe him, Ritchie was an unusually private person. "I worked across the hall from him for more than 20 years, and yet I feel like I didn't know him all that well," Pike says. But this doesn't quite explain his low profile. Steve Jobs was a private person, but his insistence on privacy only fueled the cult of personality that surrounded him.

Ritchie lived in a very different time and worked in a very different environment than someone like Jobs. It only makes sense that he wouldn't get his due. But those who matter understand the mark he left. "There's that line from Newton about standing on the shoulders of giants," says Kernighan. "We're all standing on Dennis' shoulders."

http://edition.cnn.com/2011/10/14/tech/innovation/dennis-ritchie-obit-bell-labs/index.html


I just learned the guy had died today. Blimey did the world keep that one quiet.

Monoriu

All of my memories about UNIX are bad.  I was in university, and we were forced to use designated computer terminals to do a number of vital admin tasks.  It was a nightmare.  I looked at that terminal and I had no idea what to do.  In the end I asked an Ultima Online friend to use a remote access programme to sorta hack into it and got the stuff that I needed. 

I hate UNIX  :mad:

Darth Wagtaros

Quote from: Monoriu on November 09, 2011, 04:55:16 AM
All of my memories about UNIX are bad.  I was in university, and we were forced to use designated computer terminals to do a number of vital admin tasks.  It was a nightmare.  I looked at that terminal and I had no idea what to do.  In the end I asked an Ultima Online friend to use a remote access programme to sorta hack into it and got the stuff that I needed. 

I hate UNIX  :mad:
Yeah, well UNIX doesn't like dyed in the wool commies either.
PDH!

Camerus

Didn't read the article at all, but isn't it pretty much de rigueur when a famous inventor or thinker dies for some people to insist that his ideas or inventions were actually stolen / plagiarized?  In other words, don't wake me up till this actually goes anywhere significant.

Slargos

American industrialists have a long and proud history of stealing the ideas or property of their betters and capitalizing on them. It's what made America great.

For this reason, it's hard to care about the crocodile tears shed when Chi-com spies steal American tech.

Barrister

Sounds like the guy made a remarkable contribution, but the bitching that 'why isn't he getting the accolades that Jobs is'?

Sorry, but :rolleyes:

Ritchie helped invent C, which is an underpinning of UNIX and most computer languages.  OS X is UNIX based, and used a variant of C for its programming.  But Ritchie did that in the early 70s - 40 years ago.

Jobs made a huge contribution to computing as well in the 70s - a little thing called the Apple II.  And if that's all he did, when he passed away he'd get some level of recognition from the hardcore geek community too.

But Jobs has been busy since the 70s, and has done an awful lot to build on his legacy besides the Apple II and Macintosh.
Posts here are my own private opinions.  I do not speak for my employer.

frunk

Ritchie didn't just help invent C, he was also vital in creating UNIX and in getting it ported to different systems.  Before that an operating system was custom built for one type of computer and couldn't be easily transitioned from one to another.  This is an innovation that Apple resisted for years, and still isn't too fond of.  Meanwhile you have Linux and BSD that can run on just about anything.

In addition to developing these technologies, through his writing he has taught a couple of generations how to use them.

Jobs might have had a continuing impact on how technology looks, but very few people were more important than Ritchie in getting it to work.

Barrister

Quote from: frunk on November 09, 2011, 09:59:24 AM
Jobs might have had a continuing impact on how technology looks,

Are you deliberately trying to minimize Jobs' influence?

Obviously Jobs had a keen interest in industrial design, but that's hardly his most important influence.  His influence was first in creating the entire personal computer revolution, then in profoundly changing how we interact with computers by popularizing the GUI, and then the touch interface.
Posts here are my own private opinions.  I do not speak for my employer.

chipwich

If Ritchie didn't properly capitalize on his inventions or craft them into a worldwide brand of products known for their sleek design, that sounds like his fault.

frunk

Quote from: Barrister on November 09, 2011, 10:05:31 AM
Are you deliberately trying to minimize Jobs' influence?



No more than you were trying to minimize Ritchie's.

I don't see why it should matter if a person's immediate impact was several years ago or right now.  Your argument seems to be that since Ritchie hasn't done much lately he doesn't deserve the accolades that Jobs has received.  When you get to the meat of it, Ritchie has had a greater influence on what your computer is doing right now than Jobs.  What it looks like, how you interact with it is Jobs.  Jobs was great at finding intuitive and effective ways of interacting with a computer.  How a computer actually is programmed is all Ritchie.

Barrister

Quote from: frunk on November 09, 2011, 10:34:30 AM
Quote from: Barrister on November 09, 2011, 10:05:31 AM
Are you deliberately trying to minimize Jobs' influence?



No more than you were trying to minimize Ritchie's.

I don't see why it should matter if a person's immediate impact was several years ago or right now.  Your argument seems to be that since Ritchie hasn't done much lately he doesn't deserve the accolades that Jobs has received.  When you get to the meat of it, Ritchie has had a greater influence on what your computer is doing right now than Jobs.  What it looks like, how you interact with it is Jobs.  Jobs was great at finding intuitive and effective ways of interacting with a computer.  How a computer actually is programmed is all Ritchie.

I'm certainly not trying to minimize Ritchie's influence.

But it's fairly common knowledge that the world is very much a "what have you done for me recently" kind of place.  Whenever someone passes away it gets far more attention when they are at the height of their career than if their achievements were decades in the past.

And - Jobs did have a tremendous influence on how I'm using the very computer in front of me.  Not only because I'm using a mouse, but because I'm using an entire computer that sits on my desk.  Without Jobs and the PC revolution, I might very well be working on a dumb terminal attached to a mainframe.

And you can say "well if it wasn't Jobs, it would have been someone else", but, you could say the same thing about Ritchie as well.
Posts here are my own private opinions.  I do not speak for my employer.

HVC

Quote from: Barrister on November 09, 2011, 10:36:58 AM

And you can say "well if it wasn't Jobs, it would have been someone else", but, you could say the same thing about Ritchie as well.
Not from what I understand. What made Jobs famous is that he took what others created and repackaged it. He was very good at it, but in essence that's what he did. Ritchie created his contribution to computers.
Being lazy is bad; unless you still get what you want, then it's called "patience".
Hubris must be punished. Severely.

grumbler

Quote from: Barrister on November 09, 2011, 10:05:31 AM
Are you deliberately trying to minimize Jobs' influence?

Obviously Jobs had a keen interest in industrial design, but that's hardly his most important influence.  His influence was first in creating the entire personal computer revolution, then in profoundly changing how we interact with computers by popularizing the GUI, and then the touch interface.
Overstatement much?  :rolleyes:
The future is all around us, waiting, in moments of transition, to be born in moments of revelation. No one knows the shape of that future or where it will take us. We know only that it is always born in pain.   -G'Kar

Bayraktar!

Martinus

Quote from: HVC on November 09, 2011, 10:47:08 AM
Quote from: Barrister on November 09, 2011, 10:36:58 AM

And you can say "well if it wasn't Jobs, it would have been someone else", but, you could say the same thing about Ritchie as well.
Not from what I understand. What made Jobs famous is that he took what others created and repackaged it.

Hollywood repackaged moving pictures. Ford repackaged a fuel engine. St. Paul repackaged Jewish and Greek philosophy.

Finding a creative way of popularizing someone else's invention is often a greater feat than the invention itself - and, appropriately, often carries a greater fame and recognition (not to mention, riches).

HVC

Quote from: Martinus on November 09, 2011, 10:50:34 AM
Quote from: HVC on November 09, 2011, 10:47:08 AM
Quote from: Barrister on November 09, 2011, 10:36:58 AM

And you can say "well if it wasn't Jobs, it would have been someone else", but, you could say the same thing about Ritchie as well.
Not from what I understand. What made Jobs famous is that he took what others created and repackaged it.

Hollywood repackaged moving pictures. Ford repackaged a fuel engine. St. Paul repackaged Jewish and Greek philosophy.

Finding a creative way of popularizing someone else's invention is often a greater feat than the invention itself - and, appropriately, often carries a greater fame and recognition (not to mention, riches).
I'm not saying Jobs wasn't important, just that the likelihood of another Jobs coming along is far greater than another Ritchie coming along.
Being lazy is bad; unless you still get what you want, then it's called "patience".
Hubris must be punished. Severely.