The future according to Dennis Ritchie (a 2000 interview)

By Danny Kalev | Development

[Photo: Dennis Ritchie. Source: Wikipedia]

Dennis M. Ritchie heads the system software research department at Bell Laboratories' Computing Science Research Center.

Ritchie joined Bell Laboratories in 1968 after obtaining his graduate and undergraduate degrees from Harvard University. He assisted Ken Thompson in creating Unix, and was the primary designer of the C language. He helped foster Plan 9 and Inferno.

He is a member of the US National Academy of Engineering and a Bell Laboratories Fellow, and has received several honors, including the ACM Turing Award, the IEEE Piore, Hamming, and Pioneer awards, the NEC C&C Foundation award, and the US National Medal of Technology.

Can you introduce us to Plan 9 (see Resources for a link), the project in which you're currently involved, and describe some of its novel features?

Dennis Ritchie: A new release of Plan 9 happened in June, and at about the same time a new release of the Inferno system, which began here, was announced by Vita Nuova. Most of the system ideas from Plan 9 are in Inferno, but Inferno also exploits the exceptional portability of a virtual machine that can be implemented either standalone as the OS on a small device, or as an application on a conventional machine.

As for Plan 9, it combines three big ideas. First, system resources and services are represented as files in a directory hierarchy. This comes from Unix; it worked even better in Linux, but Plan 9 pushes it hardest. Not only devices, but things like Internet domain name servers look like files. Second, remote file systems -- likewise not a new or unique idea. But if all system resources are files, grabbing bits of another machine's resources is easy, provided the permission gods permit. Third, and unusual, is that the namespace -- the hierarchy -- of files seen by a particular process group is private to it, not machine-wide.

C and Unix have exhibited remarkable stability, popularity, and longevity in the past three decades. How do you explain that unusual phenomenon?

Dennis Ritchie: Somehow, both hit some sweet spots. The longevity is a bit remarkable -- I began to observe a while ago that both have been around, in not astonishingly changed form, for well more than half the lifetime of commercial computers. This must have to do with finding the right point of abstraction of computer hardware for implementation of the applications.

The basic Unix idea -- a hierarchical file system with simple operations on it (create/open/read/write/delete with I/O operations based on just descriptor/buffer/count) -- wasn't new even in 1970, but has proved to be amazingly adaptable in many ways. Likewise, C managed to escape its original close ties with Unix as a useful tool for writing applications in different environments. Even more than Unix, it is a pragmatic tool that seems to have flown at the right height.

Both Unix and C gained from accidents of history. We picked the very popular PDP-11 during the 1970s, then the VAX during the early 1980s. And AT&T and Bell Labs maintained policies about software distribution that were, in retrospect, pretty liberal. It wasn't today's notion of open software by any means, but it was close enough to help get both the language and the operating system accepted in many places, including universities, the government, and in growing companies.
