Hacker News

Yeah, thinking about it: many of the old UNIX workstation vendors used the same 68K CPUs, but then each decided to develop its own RISC architecture, fragmenting the market and reducing the economies of scale.


Well, it's hard to say that Motorola had its heart in the 68K CPUs; it was a wildly diversified and not entirely focused electronics conglomerate back then.

And their product plan was very conservative, with the "odd numbered" chips not being great advances on the even numbered ones. E.g. the 68030, which came out at the same time as the first SPARC, reused the 68020 microarchitecture with its 256-byte instruction cache and used the design shrink to put the MMU on the chip (but not the FPU). The 68040, three years later, wasn't a stunning success, was it? And the 88100 certainly sucked up a lot of corporate oxygen, not to mention put into question the company's commitment to the 68K family.

And it's very dangerous for a company to depend on another company for one of the most critical components of its products, isn't it?

But, yeah, the loss of economies of scale, i.e. much lower unit sales to spread your non-recurring engineering costs over, eventually doomed them to mostly bureaucratic and installed-base niches, once AMD successfully innovated for a short period and then Intel got its act together in response to that threat.


Yeah, I know the 68040 was late. On the other hand, without SPARC, for example, there would have been less competition for HP etc. to deal with, making this less of a problem.


Which does a good job of proving my case. Sun very much didn't want to provide "less competition for HP" (albeit I very seldom heard a good word for HP-UX).


And my point is that in the long term this was probably a bad idea.


Indeed, and it's something I've been thinking about: what I would have done if I were in charge of Symbolics at the beginning.

But it's also in part 20/20 hindsight, e.g.:

Lots of people refused to believe that Moore's Law would last as long as it has; the corollary, that you'd get higher speeds purely from design shrinks, did end about a decade ago.

I'm not sure very many people "got" Clayton Christensen's The Innovator's Dilemma disruptive-innovation thesis prior to his publishing the book in 1997. He really put it all together: how companies with initially cruddy products could in due course destroy you seemingly overnight.

In this case, how a manufacturer of rather awful CPUs (the 286 in particular, but the 8086/8 was no prize except in cost; caveat, Intel's support for people who designed in its chips was stellar back then) could start getting its act together in a big way in 1985 with the 386, then seriously crack its CISC limitations with the P6 microarchitecture (Pentium Pro), etc.

And note their RISC flirtation with the i860 in 1989, their Itanium debacle, etc., etc. More than a few companies would have committed suicide before swallowing their pride and adopting their downmarket copycat's 64-bit macroarchitecture, which competed with the official 64-bit one.

And that's not even getting into all the mistakes they made with memory systems, million-part recalls on the eve of OEM shipments, etc. What allowed Intel to win? Superb manufacturing, and massive Wintel sales, I think.



