Gruber: The OS Opportunity (daringfireball.net)
71 points by concretecode on Nov 19, 2009 | 33 comments


I was born in 1990. The first computer that I could call my own was a custom built PC with Windows XP. This was in 2003.

I cannot express in words how sad that makes me. I've heard older programmers talk fondly about the BeOS, the classic Mac OS, Amiga Workbench (IIRC), NeXTSTEP and even Win3.1. Even older programmers talk about computers built by Tandy and the BBC, and the legendary machines built by Atari and Commodore. I cannot help but feel that I've missed something. I salute Gruber for making the point I have been trying to make for a long time now.

I've grown up in a world where we have only two major families of operating systems[1]: Windows and UNIX. That makes me a sad panda :(

Even though I hate the flimsy machines Dell make, I would love to try out the DellOS, if they ever decide to build one.

[1] I'm only talking about the desktop space here.


For what it's worth, people who are children now will have missed the wild days of the Web when everyone had their own API, data, and UI stacks and no two apps looked or behaved the same.

I'm not sure I agree with the OP. When there is real competitive advantage in some newish layer of the tech stack, you will see incredible diversity. When the advantage moves on to other parts of the stack (e.g., applications, the web, etc.), the older layers standardize and homogenize.

Today, in 2009, neither I nor my mom give two shits what OS we're using 70% of the time because our work is done inside a browser. I run WinXP in a virtual machine on my Mac. At least once a day I catch myself using the browser inside the Windows instance without realizing it.


And I think this is why hardware vendors could come out with their own spin on an operating system.

We didn't use to have a standard for sharing information. If you wanted to share a file, you had to have the exact same hardware and software. That is what led us to the local minimum that is PCs running DOS/Windows.

Now, however, we have a well-accepted standard. If you have a good TCP/IP stack and can run a browser, you're golden. With just these two things you can be productive on any operating system.

So now, the fact that the OS is the last thing you consider might just free manufacturers to come up with their own designs.

That does however leave one crucial thing out, which might just kill Gruber's argument: games.


I grew up owning a Vic 20 (which I still have, by the way), and my friends all had different types of computers: Tandy, Amstrad, Sinclair, BBC, C64, Atari, etc. Can't even remember them all now, and they were all different, but it was great. Learned to program just about all of them and had a ball.

Many many happy memories of those times

[Edit: thinking about it now, the one thing that was missing was one-upmanship. I don't remember any comparison between machines for the purposes of deciding which was 'better'. We were just happy to have something; didn't matter what brand it was]


If it's diversity in operating systems you seek, get into networking or storage.


I was born in 1946. I didn't get a computer I could call my own until I was 33. Stop whining.

(You could also buy some antique computers, restore them, and explore their OSs).


Read this earlier. Dead on. Any company which cannot build decent software does not have a bright future. Not just in computers -- probably any field. Software is becoming a more pervasive part of life every year. The many who cannot adapt will be easily replaced by the few who can. And with software, it really is easy for one person or company to replace thousands in a single stroke.

For now, though, this effect is probably most visible in the tech field. There's a large and growing disparity in profits between companies which can produce good software, such as Apple, Google and Microsoft, and those which cannot. I suspect this disparity will more and more reflect how the economy functions for individuals. It wouldn't surprise me if the wealth gap further widens every year. I'm not sure if our politicians, society or culture are prepared to deal with it.


I don't see how that is the case, since the main software supplier in the world today, which is Microsoft, is totally dependent on hardware manufacturers.

I don't see Intel (OK, easy example) or any other hardware manufacturer dying anytime soon. The only ones who 'could' die are those whose job is only assembling hardware components. And even they have a role in the low-end market that will be very hard to match for a company aiming to handle the whole chain of building a computer, software included.

It's easy to agree on the fact that software has a much more profound impact on people than hardware. It's why there is a UI in GUI. But to me this point is as moot as saying an engine manufacturer is useless because it doesn't do steering wheels.

It seems to be a fact (I don't have enough information to be sure) that the software economy is wealthier than hardware's. But I'd like facts about that first, because I see a whole lot of software companies dying too. And even if it is true, it doesn't mean we NEED hardware manufacturers to come out and release a ton of more-or-less compatible software and try to impose it on users the exact same way we are stuck with Windows at the moment.

Software building is a field of expertise in itself. A company can convert itself, or invest in a new field. But that certainly doesn't mean it should, or that it will die if it doesn't.

My summary: basic and uncompelling argumentation from the original article. It takes a visionary tone, but in my opinion this is anything but where we are headed at the moment.


Intel makes some very good software -- for example, their compilers.

Car companies that cannot produce good software are indeed dying. The next generation driving systems rely heavily on a software component (traction control, user interface feedback, gas management and hybrid drive, Tesla's power management, etc) and those which cannot make the leap to produce software in tandem with their car hardware are facing irrelevance.


I don't see any reason to think a company like Dell or Sony would be well positioned to create an operating system. What software have they created that suggests they could do it? If Windows is so unsatisfactory doesn't that suggest that merely having resources isn't nearly enough? If Dell decided to create an operating system they would have little advantage over anyone else, and would have the disadvantage of being a huge company attempting an equally huge software project from scratch. They might as well just light piles of cash on fire. The sensible way to do it would be to buy a company that is building an OS.

I think Gruber is underestimating the fact that Apple has been developing not just the Macintosh OS, but the ecosystem and community for like a quarter of a century. I don't know that there is a shortcut to the latter two. When I switched back to Windows from Ubuntu it wasn't because of any failing of the operating system. It was because I was unable to find suitable replacements for all the Windows applications (and drivers) I used.

Web applications have not yet usurped desktop apps across the board, and when they eventually do, won't it be a worse environment for a new proprietary OS? At that point what would a proprietary OS offer that a free OS on inexpensive hardware couldn't?


I was thinking about this the other day. There's a lot of opportunity in terms of other Operating Systems/Interfaces. Computers used to JUST be these beige boxes that sat under our desks. Actually, at one point, they used to be huge mainframes. Now, computers are everywhere, and by computers I mean full-fledged, system-specific GUIs tied to some piece of hardware. They're in our pocket, in our cars, on our desks, in our laps, on our TVs, in the cloud as servers, and probably other places I'm not naming. This is where I see other Operating Systems taking away Microsoft's dinner. Microsoft got us to the first billion, but I'm pretty sure someone else will get us to the first trillion. Now, that's pretty exciting.

As far as the desktop goes? Apple will have its market share, which is the high-end computing market (they have ~90% of that 10%), and Windows will have the other 90%. You can make Ubuntu function/look exactly like Windows/Mac. The problem comes down to the OEMs. Will they support it? Will they make the strong push in marketing to get people to try it out? Your grandmother is not going to do sudo and apt-get commands. If an OEM made beautiful machines with tight software/hardware integration running a customized version of Ubuntu, marketed the hell out of it, and offered the right amount of training like Apple does, it could possibly get off the ground. Even then, Linux is a scary place for the average person and the apps just aren't there.

Honestly, I'm more bullish on other Operating Systems taking us from billions to trillions on devices that aren't beige boxes that sit under our desk.


I don't agree at all with Gruber on this one.

It seems, in fact, that software manufacturers, and for good reasons, are taking more and more control over how hardware is designed. And that is a good thing, because hardware ought to be designed for the software, not the other way around.

But it doesn't mean that what we need is more monolithic entities like Apple. The fact that hardware and software are at least somewhat decoupled is a VERY good thing in my opinion.

What Google is doing with Android is interesting in this regard, and I wonder if their strategy with Chrome OS is going to be similar: they release an OS (open source, and this is quite important in the end), follow the usual strategy of partnerships with hardware manufacturers, and in the end also decide to produce their own smartphone. It's a harder path because your partners are your competitors at the same time (something Apple, for example, doesn't have to deal with concerning OS X). But for the user it's clearly the best path: you have the freedom to use the software as you want (!= Apple), and you can also buy a proprietary solution that supposedly offers a better synergy between software and hardware.


(From your other post below)

I don't see Intel (OK, easy example) or any other hardware manufacturer dying anytime soon.

I don't think that, when the author said PC makers are "busy dying," he meant that they're necessarily on a direct trajectory towards bankruptcy. I took it in the softer, more poetic sense that they're listless, apathetic, lacking in the proud vigor and vibrancy of their youth. This was captured best when he said this:

People today still love HP calculators made 30 or even 40 years ago. Has HP made anything this decade that anyone will remember fondly even five years from now?

I agree with you that hardware and software decoupling is a very good thing. I also do not see this as being mutually exclusive to what Gruber is suggesting. If Dell wanted to invest in being a player in user experience, there's no reason that they would have to tie their system software to their hardware.

The biggest flaw that I see in Gruber's article is that he might be overestimating consumers themselves. Sure, the computer industry is different from the days when you tried to sneakernet a 1-2-3 file to a friend's machine only to find that he was running Visicalc on CP/M. But will consumers accept this change without knowing what an open standard is, or a document format, or a portable runtime, or cross-compiling, or virtual machines?

I'd guess not, given that I've talked to seasoned geeks who haven't yet fully internalized that things are different today. Consumers, recoiling from the fear of "incompatibility", all happily ran to one side of the boat in the 90s and nearly capsized it. Teaching them that diversity doesn't have to mean incompatibility seems like a daunting task.


Building an OS is feasible for a small team or even a single programmer.

Providing enough of a subset of the expected APIs, and the device support, and the databases and web browser and the compilers and file systems, and the rest of the stack that customers and third-party partners expect? That's a bigger project and a bigger budget.

Simply being better isn't enough.

Being faster isn't enough.

Being cheaper isn't enough.

Even running Microsoft Windows on your (non-x86) hardware isn't enough.

You need some combination of "betters" and of application and document compatibility, and you need to get to critical mass of applications and tools and device and hardware support, or you need to get to "massively better" in one or more dimensions to get enough early adopters on-board, or you need enough money to buy the tools and ports you need.

And then you have to get to big volume and to enough of a profit margin to get your prices down to where you attract application developers and resellers, or pay for the developers.

As for competition, you're working against Microsoft Windows on x86 and Apple, and FOSS limits your margins. Or against embedded vendors that excel in one or more dimensions.

There are little-known and niche and embedded operating systems and vendors all over the place. Wind River (now at Intel) is one. HP has at least three operating systems (NSK, HP-UX, VMS) and has retired others including MPE and Tru64 Unix and Domain/OS, and not counting embedded software platforms such as EFI and all those HP printers. IBM has its own OS offerings.

Building an OS is comparatively easy. Anyone with enough skills or enough cash can certainly build or buy one, and various folks can buy enough partners. But to create a self-sustaining environment within your target market and to avoid creating a massive write-off, you need to build an ecosystem around your operating system. That's a much tougher and much bigger effort.


Note that both litl and Chrome OS use Linux under the hood, so you don't have to do all of that to make a "new" OS, for values of "new" used in this article.


I assume he means creating their own user experience, as all the examples he gives of companies creating their own OS are actually using Linux.


An OS is more than a kernel and some drivers.

Unless you would consider Android, Palm's WebOS, Nokia's various OSs, Arch and Ubuntu to all be equivalent.


All the "OS"s in the article run GNU userland, X windows, webkit or gecko browsers. So yes they are the same operating system. They just have different applications in different configurations.

Android is it's own operating system. It has own user space and GUI technology. But the article didn't mention it.


Computer makers that want to succeed? I wouldn't buy a computer from someone that locks me into their product line for ALL my needs with that computer.


Classic Apple fanboy-ism at best, with missing knowledge here and there. If the OS mattered that much, then first, Google couldn't be such a success, and second, Apple couldn't enjoy its current success, because the first two (even three) Mac OS X releases were horribly slow and had very little app support!


Classic anti-Apple fanboy-ism at best, with missing knowledge here and there. The OS mattering != everything else not mattering.


Maybe downvoters didn't get my point and/or Gruber's point. I'm saying that the OS does not matter the way it used to, enough to justify being developed by other players such as HP or Dell. Asking again: what is wrong with this?


The original article makes an excellent point, and expresses it with a great deal of coherency: the reluctance of computer makers to engage in operating system development is based on an outmoded fear that was established in the days where interoperability was hard because we hadn't yet figured out things like open document formats, open network protocols, portable APIs, portable language runtimes, and virtual machines. A lot has changed since then, so why are they still so reluctant?

This is an excellent question.

Your response seems to have a very tenuous grasp on coherency. I'm trying to pull out what you're trying to say. It appears to hinge on the transitive verb "to matter", and I can find only one introductory paragraph where the author uses this verb, and he doesn't say anything objectionable the two times that he uses it:

He says that hardware and software both matter. I find that hard to refute. Then he goes on to say that if you asked him to say which matters more, he'd say software. I'm not surprised that he would say this, since he tends to be a "user experience" guy, and I'm willing to grant him this premise for the rest of what he wrote.

Honestly I can't tell what counter-argument you're trying to make. Yes, Google has been successful. It certainly wasn't because of their hardware. Regardless, you're only responding to the setup of the thesis, not the thesis itself.


A lot has changed since then, so why are they still so reluctant?

It's this premise that I find inaccurate. Sure there's open everything, but writing an operating system from scratch to support all this is still hard. Hell, give me one commercial operating system written from scratch in the last decade.

The point is, why do it? Maintaining an operating system is big money. And you can't stop there - you've got to have a full-stack offering - business apps, fun apps, drivers, the whole set. Add to that support, interoperability with the world, backwards compatibility etc. It's a long-term commitment; you can't back out of it that easily.

The reward-to-risk ratio is pretty small, unless you have some earth-shaking innovation up your sleeve and/or it significantly reinforces or supports your business model.


Why from scratch? If you remove that seemingly pointless requirement, then there are quite a few examples, starting with the Litl OS, which is based on Ubuntu.

Heck, there's plenty of good solid starting points. What about BSD? Linux? Android?


Still doesn't matter. Unless you create an ecosystem around your software, I think it still isn't relevant in the larger scheme of things. For example, can Maemo, Android, WebOS, etc. all share applications, APIs, drivers, and other infrastructure seamlessly?

The point is you need to end up creating an ecosystem around your offering, which is non-trivial. Even if you do, you may still end up as a niche player. Litl is nice; but how well do you think they'll do against traditional netbooks?


Depending on how you architect your OS, it doesn't have to be impossible.

I have a little Exokernel project I devote my Saturdays to, and we'll actually be able to offer a POSIX-compliant libOS, and run anything that Linux does.

Or, there's always the hypervisor route, on desktops anyway.


I must say I'm puzzled by the downvotes. Coming to the main debate, the article's points are examples of the 'word' I used, which I shouldn't have. Even though I wrote without that intention, I sounded rude. Sorry for that. Thanks to pohl and mechanical_fish for the comments.


Can someone please tell me why this is downvoted? Buy it or not, these are my arguments against the ones in the article, and I DO think that the OS does not have the same importance it had in the '80s. What is wrong with this?


Your safest bet is to simply never use the word "fanboy" except to refer to yourself. It's name-calling, and is therefore both rude and a sign of bad rhetoric. (I would say ad hominem, but that's a cliche phrase on HN, so I do so with reluctance. ;)

As for the rest of your post, here is Gruber's thesis statement:

PC makers who want to succeed should create their own OSes

Your statement that "Google succeeds without an OS" does not belong in this argument. First, because Google's success has nothing to do with PC making: That's not their business. Google is an online advertising business. To the extent that the rest of their amorphous, cash-cow-supported business plan has any coherency, it seems to be about using various web and mobile apps to drive advertising traffic, although they also derive a certain amount of revenue from selling software subscriptions and licenses, and of course there's AppEngine. And probably some stuff I've forgotten.

Second, because where Google does have a strategy to venture into the hardware market -- with mobile PCs, marketed as "phones", and perhaps desktop/laptop PCs as well -- the plan seems to be to start out by building an OS, just as Gruber suggests. Google's phone venture is centered around Android. I'm not sure what Chrome OS is all about, but if Google releases a tablet or a laptop it will presumably run Chrome OS. To the extent that Android and Chrome OS are or will be successful, these will be points in favor of Gruber's argument.

Finally... Apple's first couple of OS X releases really weren't that successful. Apple survived that, of course, because they still had enough dedicated users of their previously successful OS -- Mac OS 9 -- to tide the business over. And of course that iPod thing didn't hurt -- that bought a lot of time. [1] The success of Mac OS X was a near thing, though. Mac fanboy that I was, I nevertheless abandoned the Mac myself for dual-boot XP/Ubuntu in the early 2000s, and didn't come back until OS X 10.3 or 10.4, and I don't think I was alone.

---

[1] The iPod, of course, is successful because of iTunes. Which isn't exactly an OS. But is certainly a software ecosystem, common to all Apple hardware, with a circumventable but nonetheless real form of lock-in, built around what was at the time a superior, unified interface for purchasing music, organizing it, and copying it to your devices. The success of the iPod derives mostly from Apple's custom software for the iPod. People have been cloning the hardware forever. Pundits famously didn't see the iPod coming because other companies had beaten Apple to market with apparently equivalent hardware.


“The iPod, of course, is successful because of iTunes.”

Really?! Now maybe. But five years ago? I have a very hard time believing that.


You've got it backwards :)

Right now, with the iPhone and iPod Touch, you don't really need to connect them to a computer at all. When I bought my first iPod in 2003, you absolutely did, and all the existing PC software for organizing your music sucked ass. I had to use fucking Musicmatch Jukebox to sync my iPod, and the alternative library programs were even worse.

The release of iTunes for Windows was a godsend to me -- sure, it's pretty resource hungry and has only gotten more so, but it's pretty fucking fantastic at just getting the basic library management stuff right. Being able to easily rip to AAC was gravy, as it meant that I no longer had to have some of my albums as lossless (MP3's psychoacoustics shit all over gravelly voices). I switched to a Mac a few years later.


See, iTunes never mattered that much to me when I first got an iPod. And I do actually like iTunes.

But everyone I know who ever got an iPod absolutely loathed iTunes. iTunes was not the reason; the hardware was. I have seen that change with the iPhone. Of course, no data, just anecdotes.



