For what it's worth, people who are children now will have missed the wild days of the Web when everyone had their own API, data, and UI stacks and no two apps looked or behaved the same.
I'm not sure I agree with the OP. When there is real competitive advantage in some newish layer of the tech stack, you will see incredible diversity. When the advantage moves on to other parts of the stack (e.g., applications, the web), the older layers standardize and homogenize.
Today, in 2009, neither I nor my mom give two shits what OS we're using 70% of the time because our work is done inside a browser. I run WinXP in a virtual machine on my Mac. At least once a day I catch myself using the browser inside the Windows instance without realizing it.
And I think this is why hardware vendors could come out with their own spin on an operating system.
We didn't use to have a standard for sharing information. If you wanted to share a file, you had to have the exact same hardware and software. That is what led us to the local minimum that is PCs running DOS/Windows.
Now, however, we have a well-accepted standard. If you have a good TCP/IP stack and can run a browser, you're golden. With just these two things you can be productive on any operating system.
So now, the fact that the OS is the last thing you consider might just free manufacturers to come up with their own designs.
That does however leave one crucial thing out, which might just kill Gruber's argument: games.