Hacker News
40 Years of Unix (bbc.co.uk)
73 points by glaze on Aug 20, 2009 | 34 comments


Semi-obligatory companion link to The UNIX-HATERS Handbook:

http://en.wikipedia.org/wiki/The_UNIX-HATERS_Handbook

Now entirely obsolete, of course. Today everyone has come to love everything about Unix, the most perfect OS in this best of all possible worlds. ;)


Alternatively, the worst operating system there is—apart from all the others.


Multics: carefully engineered, fail.

Unix: small and hacked together, success.

Plan 9: carefully engineered, fail.

I'm not sure what the lesson is here... there's that "worse is better" essay... maybe it's that when great engineers let their hair down, they do much better?


The lesson to be learned is, surely, the same one we should have learned from the web: creating small pieces, loosely coupled, is the best way to build large systems. Ecosystems, not clocks.


To add to that:

"A complex system that works is invariably found to have evolved from a simple system that worked. The inverse proposition also appears to be true: A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system."

http://en.wikipedia.org/wiki/Galls_law


A little more than a collection of small pieces... they need to be able to work together in standard ways.


Absolutely, although, looking at the web, it's clearly not fatal if some members of the ecosystem don't properly adhere to those standards; it's just very annoying, time-consuming and expensive.


That requirement is implied by the "ecosystem" bit. And it's important to note that working together doesn't require careful interoperability design. It just means you need to do simple stuff that's easy to understand.


With respect to Multics, unix would run on smaller computers. Moore's Law means we keep getting those, so whatever can get a foothold on them has an advantage. Why unix, rather than one of the other small OSes of the time, is less clear. Luck may have played a part.

With respect to Plan 9, it had to compete with unix which was already established.

Note that during this time, another OS also became popular: DOS. Like unix, it ran on smaller hardware (actually, even smaller than unix could run on, giving DOS a foothold advantage denied unix). And clearly, DOS had some luck relative to all the other small OSes available at the time, being put on the IBM PC. There were two other OSes available for the PC (CP/M and UCSD p-System). Why did DOS win there? I don't know, but it was much cheaper, and possibly MS BASIC compatibility had a role: before the IBM PC, personal computer manufacturers would tout "MS BASIC" compatibility as a benefit because it gave users access to existing software.

Once you establish a platform with applications, it's hard to beat you. People don't want the platform - they want the applications. It's a network effect.

By targeting low-powered hardware, you can get established ahead of others, so that by the time the hardware has grown powerful enough to support a "real OS", you have already won. It's a disruptive innovation.


That doesn't explain why Plan 9 failed; it's arguably a better version of Unix, but it never caught on.


Plan 9 was intended as a "research operating system". Unix actually fulfilled a need at the time - a better multi-user "mini" OS.


"Better" isn't enough to gain traction. You need to provide something completely new.


Better can be enough, if it's ENOUGH better. The rule of thumb is that something has to be at least twice as good to replace a similar but already functioning system (the real costs of retraining and lost productivity during the changeover can almost never be accurately calculated, that's why there is the rule of thumb). For an academic view, see "The Fable of the Keys" http://www.utdallas.edu/~liebowit/keys1.html about qwerty vs Dvorak; and the version that was expanded into a book about Dvorak, Microsoft, and the internet http://www.amazon.com/Winners-Losers-Microsoft-Stan-Liebowit... .

ADDED: And Christensen in "Innovator's Dilemma" http://www.amazon.com/Innovators-Dilemma-Revolutionary-Busin... shows how slight improvements can gradually replace something that was originally better.


There was unlucky timing - it got upstaged by this thing called "Linux" that appeared around the same time.


Never caught on doesn't mean it failed.


I've always been a fan of Gall's Law: http://en.wikipedia.org/wiki/Galls_law


Your link is broken—it needs an apostrophe after "Gall".



Multics didn't "fail"; rather, the Bell Labs guys decided they didn't want to work on it any more and went off to do their own thing. Multics eventually reached a useful state and was in use in various places (by such customers as the US DoD) until around 2000, when the last Multics system was taken offline. However, Multics ran on an extremely small set of expensive Honeywell computers, which is why it did not reach great prominence.

Plan 9 lacks adoption for an entirely different reason. It is very portable, and in the earlier days (~1995) ran on Suns, SGIs, Alphas, some 68k machines, and the PC. As non-PC hardware has become less widely-used, the other ports were not kept updated. However, compilers exist for the MIPS, 68k, ARM, AMD64, Alpha, x86, SPARC, and PowerPC architectures, and it is not particularly difficult to port to a new machine. Unfortunately, Plan 9 was only available via closed license until 2000. Had it been given freely from the earliest days, it very likely could have taken what is now Linux's place as the primary free OS.

The reason UNIX is so widespread is because it was the right solution at the right time. It was freely available to universities and, like Plan 9, pretty easy to port. The interface was simple yet powerful, and the introduction of pipes meant you could write stupid programs, then chain them together for powerful functionality. It really served to end the days of "One OS per Computer" and the kind of monolithic programs one found on such systems as VMS, where every program was an island.
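The pipes point is easy to show concretely. A minimal sketch using only standard POSIX tools (the input text is a made-up example): each program does one small job, and the shell composes them.

```shell
# Rank words by frequency: tr splits the input into one word per line,
# sort groups duplicates together, uniq -c counts each group,
# and a final sort -rn ranks the counts from highest to lowest.
printf 'to be or not to be\n' \
  | tr ' ' '\n' \
  | sort \
  | uniq -c \
  | sort -rn
```

None of the four programs knows anything about the others; the pipeline itself is the "powerful functionality".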

The original Unix was put together in a thoughtful fashion with an eye to both hardware efficiency and powerful features. However, it was unfortunately written before networks and graphical interfaces became widespread--those portions were "hacked on" later, giving us network sockets and X, two systems often considered the worst parts of Unix.


Regarding "Unix: small and hacked together, success."

My favorite part was:

"He allocated one week each to the four core components of operating system, shell, editor and assembler."

Now that is a productive four weeks!

Maybe the lesson is, if you don't have a working prototype after four weeks, you're doing it wrong. At least when it comes to software. I mean, if that's what it takes for one guy to bang out the first draft of Unix...


Maybe you should leave yourself a bit more time if you're not ken. Then again, maybe not.


Unix - Keep It Simple and Scalable (KISS)


Maybe that engineers are better at engineering than managers are at managing?


This article is really sloppy and full of errors; it's not worth reading.

"Unix had computer networking built in from the start"

Bill Joy and his buddies added TCP/IP networking to Unix in 4.2BSD in 1983, 14 years after the start.

"Work on Unix began ... after AT&T, ..., MIT and GE pulled the plug on ... Multics."

AT&T pulled out in April 1969, but the development of Multics continued elsewhere; GE/Honeywell/Bull worked on it until 1990: http://www.multicians.org/chrono.html

"The syntax of many of those commands, such as chdir and cat, are still in use 40 years on."

chdir doesn't exist on Unix. It's cd.
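True for the command name, though chdir survives as the underlying system call, chdir(2). A small illustration of why cd has to be a shell builtin rather than an external program: chdir only changes the calling process's own working directory, so a separate process couldn't do it on the shell's behalf.

```shell
# A subshell is a separate process; its chdir(2) cannot affect the parent.
(cd /; pwd)   # the subshell prints: /
pwd           # the parent shell's working directory is unchanged
```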

"The idea of users directly interacting with the machine was downright revolutionary."

In a way, yes. But this revolutionary idea was also present in CTSS and Multics before Unix started, heavily backed by DARPA, and the major reason for the ARPANET project that began in 1969. By the time Unix V7 was out (in 1979), in addition to many timesharing systems, PARC was pursuing Alan Kay's 1969 Dynabook vision, there were Altos in operation at PARC, the Star was well on its way to production, and thousands of Altairs had been sold, not to mention things like the COSMAC ELF, the IMSAI, and the Apple ][. All of these had, as their central principle, the idea of users interacting directly with the machine.

"What helped this grassroots movement was AT&T's willingness to give the software away for free."

Not to anybody but universities, and I think even an academic license included some derisory fee.

"In May 1975 it got another boost by becoming the chosen operating system for the internet."

Not even close. If there was a chosen operating system for the internet in the late 1970s, it was TOPS-20.

"The wars are over and the Unix specification is looked after by the Open Group - an industry body set up to police what is done in the operating system's name."

That is a severe misrepresentation both of the origin of OSF and of the current activities of the Open Group.


"the Unix philosophy heavily influenced the open source software movements and the creation of the Linux desktop OS"

Of course there are plenty of desktop Linux distros, but most Linux machines in the wild are running servers.


Why is this at -2? It's a perfectly factual, if unprofound, remark on the linked article. I thought the same thing when I read it. Why identify Linux as a "desktop" thing? It's a unix clone, and remains so whether it runs on TVs or phones or telco racks.


I'd say most Linux machines in the wild are running on consumer devices.


Both can be true simultaneously. My server is running on a consumer device.


Scratch that. I just realized that he wasn't referring to desktop computers when he said consumer devices. I now think he meant set-top boxes and things of that nature.


Bingo, the tivo's and tomtom's of the world. and if you go by generic unix-style systems, you can throw in iphones and ipod touches to that list too.


tivos and tomtoms rather.


Yeah really. I have a few thousand "consumer devices" running both Linux and actual UNIX.


A few thousand? Of what nature are these?


I have the book "A Quarter Century of UNIX", published June 1994. I bought it some time later. Still, this is making me feel old. ;)

http://www.amazon.com/Quarter-Century-UNIX-Addison-Wesley-Sy...



