
To present something less extreme than an 8086 running DOS and Windows 1.0, consider that in 1999, a PC with a 366 MHz processor, 64 MB of RAM, and a 6 GB hard disk was quite adequate for browsing the Web, running office applications, playing MP3s, encoding said MP3s (faster than real time), and developing software -- all at the same time if I wanted to. (This was my laptop throughout college, albeit later upgraded to 192 MB of RAM.) And yet now, those specs wouldn't even be considered adequate for a smartphone, let alone a PC. How'd we get here? Abstraction gone wild?

EDIT: Pondering this is enough to make me regret that I perpetuated the cycle by buying a newer, faster PC a year ago. It seems to me that the upgrade treadmill and the accompanying waste will only stop if we software developers deliberately develop and test our software on underpowered machines.



I've recently seen an NT 4 server that's still running on old hardware. It's amazing how responsive the system is: every click immediately shows the UI element I want, Windows Explorer loads in under a second, and so on. Around '99 was probably the golden era of the PC, right before it all got messy with mandatory antivirus software, firewalls, and later UAC tying your hands, and people wanting silly things like a desktop system with a touch display. I'm only 29 and didn't quite see the beginnings in the early '80s, but the '90s still give me lots of nostalgia.


This. It's also when Windows development meant Win32 with C, or MFC with C++. No ATL, no .NET, no managed extensions, no WinRT.

You could pick your tech in no time at all and start building something. Then DevDiv went batshit and started spewing frameworks galore. Now we have runtime hell and not a soul knows where to start.

I occasionally use an NT4 VM and it makes me sad that such a clean OS has turned into a bloated pile of crap.


> To present something less extreme than an 8086 running DOS and Windows 1.0, consider that in 1999, a PC with a 366 MHz processor, 64 MB of RAM, and a 6 GB hard disk was quite adequate for browsing the Web, running office applications, playing MP3s, encoding said MP3s (faster than real time), and developing software -- all at the same time if I wanted to.

A 2007 iPhone with similar specs could do all that just fine. Probably the biggest difference is what we ask of our systems. I'm typing this into one of eight browser tabs, each running some JavaScripty thing or another. I'm also running an end-to-end test suite of the software I'm working on today, and that suite hits PostgreSQL, Mongo, Redis, and some other stuff all running on the same laptop. There are a bunch of Internet-connected apps doing their own thing in the background - polling Twitter, listening to HipChat conversations, and downloading a playlist from my iTunes Match account. And through all this, I have an instantly reactive, shiny desktop with drop shadows and transparency in quite a few spots.

Maybe the difference is that we expect our computers to wrap around our needs now and not vice versa. It's been years since I quit one app so that I could run another, and I'd upgrade my laptop in a heartbeat if running that test suite made other stuff sluggish. I effectively don't have to care about my machine's limitations anymore, and I like it that way.


You're right that we can do more with our computers now than we did a decade ago (though I don't care for the transparency and other visual effects; I used the Windows Classic theme until MS took it away in Windows 8). But the increasing resource requirements of software don't always come with an increase in functionality, and the increased requirements sometimes force hardware upgrades even if one doesn't want more functionality. Would an original 2007 iPhone be able to run iOS 7, even if one only expects it to do the things that it could do in 2007? (Hint: Even an iPhone 4 is sluggish on iOS 7.) So maybe it's just an inaccurate perception, but the increased resource requirements of newer software make it feel like we're on an upgrade treadmill. Or as the saying went in the 90s, Andy (Grove) giveth and Bill (Gates) taketh away.


The upgrade treadmill is a good thing. I'm sitting in front of a machine with a quad-core processor and 16 GB of RAM that cost way less than the 486 system I bought back when the specs you described were relevant. Is it really useful to have 15.936 GB of free memory because everything could, if unchanged from 1999, run in 64 MB of RAM?
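(Worked out, under the decimal-units assumption the parent seems to be using: 16 GB - 64 MB = 16.000 GB - 0.064 GB = 15.936 GB free. In binary units it would be 16384 MiB - 64 MiB = 16320 MiB, about 15.94 GiB.)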

Of course, the big difference from 1999 is that I'm also sitting in front of 3840x1080 pixels, something my previous computer from six years ago couldn't handle adequately. I have a game open in a window that uses 23 GB of hard drive space -- and it's totally worth it! I have 60 browser tabs open, and any one of them could be playing HD video in real time from the Internet (in 1999, HD wasn't even a real thing yet). It's never been easier to write software or create any sort of media content. It's not bloat -- we're making use of the power that we have. Anything less than 100% utilization is a waste.


Chill out, dude, it's called progress and it's a good thing. The web is fundamentally different now than it ever was, and software is very different. If you wanna go back to nineteen-dickety-two, there's always ancient operating systems or GNU/Linux with some old-ass WM like Window Maker or something.


I would argue that software is NOT very different. In fact, it's still exactly the same, just wrapped in new words.

The entire web is still client/server. Web apps? Client/server. Databases? Client/server. Just because more apps now insist on having a server to store data instead of making use of the microcomputer that they're running on doesn't change the fact that the last 25 years have been a reinvention of the same concepts again and again.

And even when "Rich Internet Applications" became the craze in the early 2000s, that was still just client/server - the concept that 70s mainframes had been built around: they were servers and served clients.

Sure, there are loads more programming languages now instead of the choice of C or C++ (and maybe Delphi?) for Windows in the 1990s, but underneath it's all still doing the same thing. Except that now we have intermediate virtual machines and runtimes like the CLR interceding, perhaps unnecessarily?
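
To make the point concrete, here's a minimal sketch of that unchanging request/response shape, in Python with only the standard library. The names, port, and payloads are made up for illustration; this shows the pattern, not anyone's actual stack:

    # Hypothetical demo of the client/server pattern described above.
    # Whether the wire carries 3270 screens, SQL, or JSON, the shape is
    # the same: a client sends a request, a server sends back a response.
    import socket
    import threading

    HOST, PORT = "127.0.0.1", 8765   # arbitrary local address for the demo
    ready = threading.Event()

    def server():
        """Accept one connection, read the request, answer it."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind((HOST, PORT))
            srv.listen(1)
            ready.set()  # tell the client it's safe to connect
            conn, _addr = srv.accept()
            with conn:
                request = conn.recv(1024)               # the "ask"
                conn.sendall(b"RESPONSE: " + request)   # the "answer"

    def client(payload: bytes) -> bytes:
        """Connect, send a request, block until the response arrives."""
        ready.wait()  # don't connect before the server is listening
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
            cli.connect((HOST, PORT))
            cli.sendall(payload)
            return cli.recv(1024)

    if __name__ == "__main__":
        t = threading.Thread(target=server)
        t.start()
        print(client(b"GET /the-same-old-thing"))
        t.join()

Swap the bytes for SQL and you've got a 90s database client; swap them for HTTP and JSON and you've got today's web app. The framing and the transport change, but the shape stays client/server.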



