Windows 1.0 and the Applications of Tomorrow (2005) (charlespetzold.com)
125 points by bluedino on Aug 11, 2014 | 64 comments


It's amazing to me that cake.c with minor modifications can still compile and run on modern Windows.

http://jcode.org/screenshots/cake.png


It's amazing in some ways, but I'm concerned that maintaining such a high level of backward compatibility for so long held Windows back.


On the contrary, the Windows API has endured through some pretty big changes, e.g. from 16-bit to 32-bit and then 64-bit, and from a cooperative single-address-space system to preemptive multitasking with memory protection between processes. (I was surprised that each application had its own message pump even in the cooperative multitasking days.) In what ways do you think the Windows developers' hands were tied by the need for backward compatibility?

Seeing how similar the Windows 1.0 API was to the current Windows API for desktop apps, I'm actually sad that Microsoft decided to (mostly) abandon that legacy for the new "modern" Windows apps under Windows 8 and Windows Phone 8.1. There's no reason why the classic Windows API couldn't have been adapted for these new-style immersive apps running inside sandboxed containers. But I guess the Windows API gained too much of a reputation for being old, crufty, ugly, and unapproachable for new developers; that's certainly how I saw it when I got my first copy of Visual C++ in 1998. But now I wish Microsoft hadn't responded to that perception by abandoning that API, which has now evolved for decades.


I doubt he meant that third-party developers' hands were tied. I think he means that Microsoft spent incredible resources on backwards compatibility at the expense of innovation. To take the most extreme example I can think of: going from OS 9 to OS X, Apple preserved a (comparative) handful of APIs as-is, another pile as nearly-the-same-but-requires-modification, and regardless of which they temporarily preserved, made it very clear that you were going to completely rewrite your application if you intended to have a genuinely native OS X experience. By contrast, farking Microsoft OS/2 apps survived into Windows NT 4, and Windows 3.1 apps ran from 1993 through some forms of Windows XP without blinking. Even in the hated Windows 8, Microsoft went out of their way to make sure that they maintained very clear and well-spelled-out migration paths. It is theoretically possible (though you'd be insane) to write a WinRT app in C that mostly uses kernel32.dll, if you want to, because WinRT is unapologetically based on COM, which is in turn based on an old way that vtables happened to be laid out in VC++, which is in turn based on how C structs used to lay out function pointers. This hurts WinRT performance compared to what they could've accomplished with a Swift-like linker.
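
(To make that concrete, here's a minimal sketch of what "COM is C structs of function pointers" means at the source level. The IGreeter interface and its single method are made up for illustration, not a real Windows/WinRT interface, and the IUnknown slots a real COM interface would carry are omitted.)

    #include <stdio.h>

    /* A COM-style "interface" in plain C: an object is a pointer to a struct
       whose first member points to a table of function pointers (the vtable). */
    typedef struct IGreeter IGreeter;

    typedef struct IGreeterVtbl {
        void (*Greet)(IGreeter *self, const char *name);
    } IGreeterVtbl;

    struct IGreeter {
        const IGreeterVtbl *lpVtbl;   /* first member: pointer to the vtable */
    };

    /* A concrete object implementing the interface, no C++ required. */
    static void Greet_impl(IGreeter *self, const char *name) {
        (void)self;
        printf("Hello, %s\n", name);
    }

    static const IGreeterVtbl greeter_vtbl = { Greet_impl };
    static IGreeter greeter = { &greeter_vtbl };

    int main(void) {
        /* The same call a C++ client would write as obj->Greet("world"). */
        greeter.lpVtbl->Greet(&greeter, "world");
        return 0;
    }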

Did this insane focus on backwards compatibility hurt Microsoft? I don't think you can conclusively say. Did it hurt me as a former Windows developer? Only tangentially, at best; all the backwards compatibility crap is pretty abstracted from you if you hang out in .NET land. (Except when it isn't.[1]) And at any rate, I don't think the bleed-through is noticeably worse than e.g. POSIX intrinsics showing up in OS X. But I think the argument that the engineering required to ensure Windows maintained such a strong backwards-compatible stance ate into resources that could've kept Windows more up-to-date has a point.

[1]: http://bitquabit.com/post/zombie-operating-systems-and-aspne...


Needing to maintain compatibility certainly prevented Microsoft from fixing prevalent bugs. The C runtime remained non-POSIX-compliant for a very long time because applications were written to expect the incorrect behavior. Some of those bugs led to security holes that were exploited, yet couldn't be patched for fear of breaking third-party software. That's how we ended up with redistributable runtimes: applications were told not to depend on the out-of-date system libraries.


> And many of you are hesitant about using high-level languages like C because of the performance issues.

Well, we seem to have overcome that worry. I think most programmers of that vintage must be horrified by the amount of bloat in modern computer systems. I gave up worrying about not knowing what every file in my operating system does several years ago, but I still feel bad about throwing in the towel.


> I think most programmers of that vintage must be horrified by the amount of bloat in modern computer systems.

I'm not. I don't miss the days of manually copying data from the floppy drive buffer into the screen device, or word processors writing out bitmaps to print heads.

Some people see bloat. I see not having to care if a file-like object is coming from my hard drive, a URL retrieved via HTTP, a network filesystem, or an in-process fake file generator because they all look exactly the same to the apps I'm writing.
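
(A rough C-level sketch of that uniformity, assuming a POSIX system: fopen, fmemopen, and popen all hand back a FILE*, so the consuming code never knows or cares what's behind the stream. The curl command is just a stand-in for an HTTP fetch.)

    #include <stdio.h>
    #include <string.h>

    /* Works on any stream, regardless of what's producing the bytes. */
    static int count_lines(FILE *f) {
        int lines = 0, c;
        while ((c = fgetc(f)) != EOF)
            if (c == '\n')
                lines++;
        return lines;
    }

    int main(void) {
        /* A real file on disk. */
        FILE *disk = fopen("/etc/hosts", "r");

        /* An in-process fake file backed by a memory buffer. */
        char buf[] = "one\ntwo\nthree\n";
        FILE *mem = fmemopen(buf, strlen(buf), "r");

        /* Bytes produced by another process, e.g. an HTTP fetch via curl. */
        FILE *net = popen("curl -s http://example.com/", "r");

        if (disk) { printf("disk: %d lines\n", count_lines(disk)); fclose(disk); }
        if (mem)  { printf("mem:  %d lines\n", count_lines(mem));  fclose(mem);  }
        if (net)  { printf("net:  %d lines\n", count_lines(net));  pclose(net);  }
        return 0;
    }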

Abstraction is inherently expensive. If you want to support printer A and printer B, the common driver interface has to support the union of both their featuresets. A's driver will be bigger than it has to be because it has to handle stub calls for B's functions. Know what? Having lived the alternative, I have zero desire to ever go back.
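
(A compressed illustration of that cost, using a made-up two-printer ops table: the common interface is the union of both feature sets, so printer A's driver still has to ship a stub for the duplex call that only printer B can honor.)

    #include <stdio.h>

    /* Hypothetical common driver interface: the union of both printers'
       feature sets, so every driver has to fill in every slot. */
    typedef struct printer_ops {
        void (*print_page)(const char *text);
        void (*print_duplex)(const char *front, const char *back);
    } printer_ops;

    /* Printer B actually has a duplex unit... */
    static void b_page(const char *t) { printf("[B] page: %s\n", t); }
    static void b_duplex(const char *f, const char *b) {
        printf("[B] duplex: %s / %s\n", f, b);
    }

    /* ...while printer A doesn't, so its driver carries a stub anyway. */
    static void a_page(const char *t) { printf("[A] page: %s\n", t); }
    static void a_duplex(const char *f, const char *b) {
        (void)b;
        printf("[A] no duplex unit; printing front side only: %s\n", f);
    }

    static const printer_ops printer_a = { a_page, a_duplex };
    static const printer_ops printer_b = { b_page, b_duplex };

    int main(void) {
        const printer_ops *printers[] = { &printer_a, &printer_b };
        for (int i = 0; i < 2; i++)
            printers[i]->print_duplex("front", "back");
        return 0;
    }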


To present something less extreme than an 8086 running DOS and Windows 1.0, consider that in 1999, a PC with a 366 MHz processor, 64 MB of RAM, and a 6 GB hard disk was quite adequate for browsing the Web, running office applications, playing MP3s, encoding said MP3s (faster than real time), and developing software -- all at the same time if I wanted to. (This was my laptop throughout college, albeit later upgraded to 192 MB of RAM.) And yet now, those specs wouldn't even be considered adequate for a smartphone, let alone a PC. How'd we get here? Abstraction gone wild?

EDIT: Pondering this is enough to make me regret that I perpetuated the cycle by buying a newer, faster PC a year ago. It seems to me that the upgrade treadmill and the accompanying waste will only stop if we software developers deliberately develop and test our software on underpowered machines.


I've lately seen an NT 4 server that's still running on old hardware. It's amazing how responsive this system is: every click immediately shows the UI element I want, Windows Explorer takes under a second to load, etc. Around '99 was probably the golden era of the PC, right before it all got messy with required antivirus systems, firewalls, and later UAC tying your hands, and people wanting silly things like a desktop system with a touch display. I'm only 29 and didn't quite see the beginnings in the early '80s, but the '90s still give me lots of nostalgia.


This. It's also when Windows had Win32 and C, or MFC and C++. No ATL, no .NET, no managed extensions, no WinRT.

You could pick tech in no time at all to build something. Then DevDiv went batshit and started spewing frameworks galore. Now we have runtime hell and not a soul knows where to start.

I occasionally use an NT4 VM and it makes me sad that such a clean OS has turned into a bloated pile of crap.


> To present something less extreme than an 8086 running DOS and Windows 1.0, consider that in 1999, a PC with a 366 MHz processor, 64 MB of RAM, and a 6 GB hard disk was quite adequate for browsing the Web, running office applications, playing MP3s, encoding said MP3s (faster than real time), and developing software -- all at the same time if I wanted to.

A 2007 iPhone with similar specs could do all that just fine. Probably the biggest difference is what we ask of our systems. I'm typing this into one of eight browser tabs, each running some JavaScripty thing or another. I'm also running an end-to-end test suite of the software I'm working on today, and that suite hits PostgreSQL, Mongo, Redis, and some other stuff all running on the same laptop. There are a bunch of Internet-connected apps doing their own thing in the background - polling Twitter, listening to Hipchat conversations, and downloading a playlist from my iTunes Match account. And through all this, I have an instantly-reactive, shiny desktop with drop shadows and transparency in quite a few spots.

Maybe the difference is that we expect our computers to wrap around our needs now and not vice versa. It's been years since I quit one app so that I could run another, and I'd upgrade my laptop in a heartbeat if running that test suite made other stuff sluggish. I effectively don't have to care about my machine's limitations anymore, and I like it that way.


You're right that we can do more with our computers now than we did a decade ago (though I don't care for the transparency and other visual effects; I used the Windows Classic theme until MS took it away in Windows 8). But the increasing resource requirements of software don't always come with an increase in functionality, and the increased requirements sometimes force hardware upgrades even if one doesn't want more functionality. Would an original 2007 iPhone be able to run iOS 7, even if one only expects it to do the things that it could do in 2007? (Hint: Even an iPhone 4 is sluggish on iOS 7.) So maybe it's just an inaccurate perception, but the increased resource requirements of newer software make it feel like we're on an upgrade treadmill. Or as the saying went in the 90s, Andy (Grove) giveth and Bill (Gates) taketh away.


The upgrade treadmill is a good thing. I'm sitting in front of a machine with a quad-core processor and 16GB of RAM that cost way less than the 486 system I bought back when the specs you described were relevant. Is it really useful to have 15.93600 GB of free memory because everything could, if unchanged from 1999, run in 64 MB of RAM?

Of course, the big difference from 1999 is that I'm also sitting in front of 3840x1080 pixels -- something my previous computer from 6 years ago couldn't handle adequately. I have a game open in a window that uses 23GB of hard drive space -- and it's totally worth it! I have 60 browser tabs open and any one of them could be playing HD video in real time from the Internet (in 1999, HD wasn't even a real thing yet). It's never been easier to write software or create any sort of media content. It's not bloat -- we're making use of the power that we have. Anything less than 100% utilization is a waste.


Chill out dude, it's called progress and it's a good thing. The web is fundamentally different now than it ever was, and software is very different. If you wanna go back to 19diggedey-two, there's always ancient operating systems or GNU/Linux with some old-ass WM like Window Maker or something.


I would argue that software is NOT very different. In fact, it's still exactly the same, just wrapped in new words.

The entire web is still client/server. Web apps? Client / server. Databases? Client / server. Just because more apps now insist on having a server to store data instead of making use of the microcomputer that they're running on doesn't change the fact that the last 25 years has been a reinvention of the same concepts again and again.

And even when "Rich Internet Applications" was the craze of the 90s, that was still just client/server, the concept that 70s mainframes had been coping with - they were servers and served clients.

Sure, there are loads more programming languages now instead of the choice of C and C++ (and maybe Delphi?) for Windows in the 1990s, but underneath it is all still doing the same thing. Except that now we have intermediate virtual machines and CLRs interceding, perhaps unnecessarily?


How do you feel when you run "top" or its equivalent on a modern OS? I feel like the barbarians have completely taken over, and if not I certainly couldn't stop them if they wanted to.


Anxious. I'm still good at dealing with process lists and so on, so I have a knack for things like malware removal or reversing. But whereas I used to know what was going on, these days I have to rely on intuition more than I used to, and it reminds me uncomfortably of youthful ignorant blundering.

I guess it's like being someone who used to fix their own car and tune the engine etc., only to have modern vehicles increasingly turn into black boxes that require more and more specialized equipment. I don't love such troubleshooting enough to want to make it my full-time occupation, so I suppose it's an inevitable one-way street.


Yeah, I think there should be a middle ground. On one hand, having more resources that enable us to worry less about lower-level minutiae is good. But on the other hand, when I see a recent Android device take up well over 400MB for its UI and services alone, I can't help but think there's something a little wrong here.


Your Android device is likely 32-bit with a full-HD screen. Various bitmap caches would take up a good part of those 400MB anyway.

Compare that to my first PC, which did 800x600 at 8-bit color.


> I certainly couldn't stop them if they wanted to.

May the best fork win.


There is plenty of bloat in modern systems, so we do seem to have overcome that worry in the average programmer's mindset.

There is a significant difference between the asm/C jump and the C/managed jump, though: even though early C compilers had a performance penalty, the C language semantics are very amenable to optimization and the cases where you could do better in asm are pretty rare.

Your typical dynamically typed, garbage collected scripting language, on the other hand, is not very nice to optimize. It may be "good enough," but there's plenty of good reasons people still start new projects in C++, and there's a reason for the recent resurgence of more static languages like Go, Julia, Swift, Rust, etc.


When I'm doing Python, the only real optimization you can get is algorithmic optimization; anything else is just knowing the subtle tips and tricks of Python or, if you really need it, dropping down into actual C code.
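
(As a sketch of that last resort -- the file, function, and library names here are hypothetical -- a tight loop moved into a small C shared library and called from Python via ctypes:)

    /* fastsum.c -- build with:  cc -O2 -shared -fPIC -o libfastsum.so fastsum.c
     *
     * Then from Python (sketch):
     *   import ctypes
     *   lib = ctypes.CDLL("./libfastsum.so")
     *   lib.fast_sum.restype  = ctypes.c_double
     *   lib.fast_sum.argtypes = [ctypes.POINTER(ctypes.c_double), ctypes.c_size_t]
     *   arr = (ctypes.c_double * 3)(1.0, 2.0, 3.0)
     *   print(lib.fast_sum(arr, 3))
     */
    #include <stddef.h>

    /* A tight numeric loop that would be slow as interpreted Python bytecode. */
    double fast_sum(const double *xs, size_t n) {
        double total = 0.0;
        for (size_t i = 0; i < n; i++)
            total += xs[i];
        return total;
    }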


Try PyPy -- Python with a JIT. Any long-running loop will be optimized.


Sadly, PyPy isn't mature enough for my needs. I hope to switch over to it once the NumPy support is closer to 100%.


I've been coding since the mid/late 80s and whilst I do sometimes look back fondly on how much we managed to wring out of the underpowered tech we had then, I don't miss the amount of time and effort it took us to achieve it.

Back in the day, I spent weeks writing a basic DOS-based windowing framework for one of my applications. Today, an equivalent (well, vastly more powerful) framework comes out of the box with 3 clicks of the "create a new app" wizard on just about every platform out there.

I also spent weeks, if not months, fine-tuning my apps so that they would run at a sensible speed on the hardware we had, would fit into the 500K or so of RAM that was available, and would pretty much never crash as issuing a patch involved me manually copying hundreds of floppy disks and posting them to clients.

Modern hardware/OSs/languages may be far less efficient in their use of CPU/memory etc, but they allow me to be far more efficient in the use of my time to deliver actual functionality to users.


> I think most programmers of that vintage must be horrified by the amount of bloat in modern computer systems.

I've been writing apps for computers since 1978, and I'm seriously horrified by bloat. It appears to be an unstoppable force, however, so these days I'm mostly interested in ways to harness bloat. I believe this is also a principal activity of my peers.


How do you harness bloat?


A question that must be answered by every single modern developer, for that is what (a majority of them) are doing ..

Visual Studio, C#, Mono .. these are all big harnesses, keeping the resources tied to the rock.


Have we really? There seems to be a lot of this very sort of pushback against higher-level languages like Haskell & Scala.

I just recently debated a colleague on the merits of AltJS languages. He's of the opinion that JavaScript is all we need and is happy to be a "JavaScript assembly programmer" for the foreseeable (and perhaps unforeseeable) future.


Garbage collection? Pff, it's never going to work.


I am from that vintage.

Having gone through a few cycles of language changes, I look forward to the day C is drinking its retirement tea together with PL/I, and the industry finally understands the path taken by the Algol/Mesa/Cedar languages for memory-safe systems programming.

It might take a few generations, but it will happen.

The same way it happened to many of my fellow developers who saw C and Pascal dialects as high-level bloat.


I've been a C coder for decades now. But I don't think that C is going away. C has far too much traction, and then there is Lua. Things like Lua make it even less likely that we'll be entirely rid of C in the near future, imho ..


On UNIX systems, I agree, after all they are symbiotic.

On other ones, it will eventually change. Microsoft already killed C in their tooling[0], although it lives on as a C++ subset.

Then we have things like MirageOS happening.

In any case, I wasn't talking about the near future. After all, there are still PL/I systems out there.

[0] Current decision; it might of course change, as all decisions do.


What language is Visual Studio written in?

What language is the .NET infrastructure in?

I don't think Microsoft has ever produced any self-hosting language except C/C++. Until that fact changes, C/C++ will still be around.


> What language is Visual Studio written in?

C++, C#, VB.NET and F#

> What language is the .NET infrastructure in?

C++, C#, VB.NET and F#

> I don't think Microsoft has ever produced any self-hosting language except C/C++. Until that fact changes, C/C++ will still be around.

QuickBASIC, QuickPascal

You don't need C and C++ to bootstrap a compiler. Ever.

I only started to use C for real when I started to work with Xenix.

EDIT: This myth that one needs C or C++ to write compilers needs to fade away.

EDIT2: C# 6 and VB.NET 6 are self-hosting, and while .NET Native plugs into the VC++ backend, they can eventually choose another backend.


Not true. The F# compiler is self-hosting, as is the C# compiler. See:

https://github.com/fsharp/fsharp & https://roslyn.codeplex.com/


Is Windows going away though? Do you know anyone who likes Windows 8? What should we write for the desktop now? Which desktop is "the desktop"?


> Is Windows going away though? Do you know anyone who likes Windows 8?

Yes, I do like it. Especially given that WinRT builds on the original idea of .NET (COM-based native programming), before they went MSIL. It is like Microsoft just recovered the Ext-VOS project before their Java love.

http://blogs.msdn.com/b/dsyme/archive/2012/07/05/more-c-net-...

I do consultancy for Fortune 500 companies, and at least for the coming decades it will surely be around.

> What should we write for the desktop now? Which desktop is "the desktop"?

What we always did. Make use of the native languages and frameworks to provide the best user experience.


I was only asking, by the way. I wasn't trying to belittle Windows particularly; just given the general dislike of Windows 8 among those used to the old paradigm, I wonder what will happen to desktop applications as the traditional windowed approach coexists with the tiled full-screen application approach. Just curious really.


> but seemed much saner than trying to recreate an actual vintage 1985-era PC-AT.

That was only a 20-year-old computer at that point. I can play media from 50 years ago easily. I'd argue that software is more appliance (it does things) than media, and appliances from 20, 30, 40 years ago are still serviceable and working.

Yet we are amazed when a Raspberry Pi can serve web pages, despite the fact that it's many times more powerful than many circa-1998 web servers.


And yet we have phones and computers that are arguably as (un)responsive as their counterparts in 1998. I really wish those issues would be addressed rather than making things flashier with sexy transition animations.


But they aren't. I'm pretty sure I've owned multiple machines that couldn't POST in the time it takes my current desktop to be at a usable state, and I have more on-die L2 cache than my first desktop had RAM.

The past was full of a lot more paging to disk and it was awful. I mean, sure, things haven't progressed as much as they could I suppose, but I also don't know the last time I had to wait 5-10 minutes for a computer to shut down.


My first XT clone would POST in less than 3 seconds if I skipped the memory test (all 640K of it), and I stayed with DOS 2 much longer than everyone else because it ran almost all software at the time and, properly configured (big BUFFERS in CONFIG.SYS, for example), it would boot from floppy in 3 more.

I've recently managed to get an Ubuntu server to boot as quickly (with preconfigured network interfaces and a few other things), but for years I couldn't find the magic combination that would boot a usable system that quickly.

And I haven't seen a Windows machine that boots to usable state in less than 15 seconds (SSD, 30 seconds on magnetic hard drive) in a long time. "Usable" means you double click on something and it responds, btw - not that it shows the desktop.


> The past was full of a lot more paging to disk and it was awful.

Those sounds just went through my head. That noise hurts just to think about.

> but I also don't know the last time I had to wait 5-10 minutes for a computer to shut down.

Windows 7 seems to take 5 minutes to shut down, even if I give it a head-start by explicitly closing all applications I can see or understand.

There are 137 "processes" and over 200 "services" running right now on the Windows 7 machine I am at. I can't watch them go away when I shut down, because if I leave the task manager running, Windows 7 complains about the Task Manager still running when I want to shut down.

There are many things that are much better today, but it seems that some things are getting worse.


In theory you could install the Windows Performance Toolkit from the Windows SDK and then trace shutdown with the xbootmgr command, then try to analyse the resulting .etl file (xbootmgr will save it to the current folder) with xperf. In practice, that will probably take more time than you could save by optimizing shutdown.


That sounds painful. Ever since I went to Win 8 my shutdown has been instant. Startup is around 3 seconds.


It's not that we don't have the ability to be responsive, and some things are. My desktop is still laggy at times (though it got a lot better when I switched to a lighter DE, for obvious reasons), and my phone still has issues just loading text messages without me having to wait multiple seconds.

That's the kind of stuff I'm talking about. A text-messaging app shouldn't have a perceivable load time. My phone shouldn't freeze doing a transition between desktops. Computer desktops shouldn't take time to open up a menu.

We've come a long way, and have tons more raw performance, but I just feel that we squander it on things that don't add any value.


You can definitely play media from 50 years ago... but "easily" may be a reach too far. How many people have easy access to a player for eight-track tapes (circa 1964: http://en.wikipedia.org/wiki/8-track_tape), for instance? Or a way to play back old Super-8 (circa 1965: http://en.wikipedia.org/wiki/8_mm_film) home movies? These devices can be tracked down, but it's not like they're sitting at most peoples' fingertips.


8-track was a big flop outside of the US, so it didn't really get the ubiquity that competing formats did. To look at things another way: how many people can play HD DVDs? It's only a few years old but suffered the same commercial failure as 8-track did in Europe. Yet most people still have easy access to VHS players, audio cassette players, and record decks, despite those all being old technologies too.


If you have easy access to a flea market, you've got easy access to those.


Who's amazed at that? I imagine you'd be hard-pressed to produce someone that is impressed that "a tiny computer can run real computer programs".


>> Now, I know when some people see graphical environments they start whining and saying “But I don’t want to use a mouse. There’s no room on my desk for a mouse.”

How small were desks back then, not having room for a mouse?

>> Well, with Windows, for the most part the mouse is optional. I won’t be using a mouse at all during this demonstration.

I remember feeling 'cool' that I could use a Windows machine from only the keyboard back in the '90s.


You still can use Windows with only a keyboard. Until very recently, MS was very good about maintaining UI consistency and having plenty of redundancy -- simply holding down the Alt key in Win7, for example, will display the keyboard shortcuts for all of the functions in the current window by underlining the relevant letter, in the manner that earlier Windows versions displayed at all times.

Having recently switched to Linux, I've been astonished that some X applications seem to require the use of a mouse, and that many graphical environments don't have default keyboard shortcuts assigned to certain tasks. At least it's much easier to create and redefine shortcuts, and you can always open a terminal and type in the full commands as needed.

This is one of the things that's very annoying about applications that hide the menubar or lack a menubar entirely. Cycling through menubar options with a keyboard is far, far more efficient than attempting to use a keyboard with many of the poorly-thought-out experimental UIs that seem to be increasingly prevalent.


That daft "show shortcut letters when I press Alt" can be turned off so that Windows ALWAYS shows those underlined letters. Thank goodness. It is unusable else.

I would agree that X / Linux applications fail in this respect.

Also, Mac OS X by default will only tab between certain UI elements, and the option to force it to tab between all UI elements cannot be reached with the keyboard alone -- meaning that if you start up without a mouse, you can't turn on the relevant option for using it with a keyboard... with a keyboard.


Desks weren't small, computers were enormous.

http://en.m.wikipedia.org/wiki/File:Bundesarchiv_B_145_Bild-... shows a desk that is way larger than the typical desk one had at home, with its separate tray for the keyboard.

Even if you had such a desk, that tray doesn't have room for a mouse.

Edit: also, mice were terribly unresponsive. You had to move the mouse for decimeters to get from one edge of the screen to the other, even if that screen had only 640 pixels.


Bear in mind that computers were seen as glorified calculators back then rather than the business appliance of today - managers didn't have computers for the same reason they didn't have typewriters - they were machines to be operated by subordinates, preferably in the basement. In my first computer job I was using a dumb terminal to a mainframe in another city, and it sat atop some sort of storage cabinet while I perched on a rickety stool.

Can't remember exactly what it was, but this was Ireland in the mid-1980s and the hardware was already pretty creaky, so some low-end model from the 1970s. Print jobs were formatted by placing a loop of punched paper tape on two sprocket wheels before sending the job request to the mainframe, and offline storage was to giant vertically-mounted spinning tapes. Our department's prize possession was a 286 with a 10 MB hard disk, which was what I learned DOS on. I got told off for installing DESQview without permission XD


> How small were desks back then, not having room for a mouse?

Desks weren't necessarily small, but the amount of paper, paper handling and organizing equipment on the typical office desk was unbelievable to the modern office worker.

http://www.devoncornwallexecutive.co.uk/wp-content/uploads/2...

http://www.bbc.co.uk/northyorkshire/content/images/2008/06/2...

http://www.sundaynightimprov.com/files/images/Tom2.preview.i...

Actually, if you want to see a modern version of this, the Japanese have kept the paper office alive and well.

http://www.accountant-tokyo.com/wp-content/uploads/2014/04/J...


Hmm, I wonder if that's why clean desk policies are popular (or rather, unpopular but present) in many companies.

It does look neater, and it's the only enforceable equilibrium given the variety of approaches people take to tidiness. But I'd never made the connection that it may be a form of signalling: our company is modern and doesn't need all the gubbins to process paper.


Good point. The "paperless office" has long been a target and symbol modernity in office environments.

https://en.wikipedia.org/wiki/Paperless_office


Desks were smaller, the big desktop computer and big CRT took up more space, and one was more likely then than today to have a bunch of paper, books, and other stuff taking up space on the desk too.


I remember people at my first job freaking out on how fast I could navigate complex GUIs with the mouse.

It probably helped that I played RTS and FPS for years, lol.


Remember that a large monitor occupied 80% of your desk space back then.


Keep in mind also that today when we plan a desk space, we plan for a mouse or trackpad and a keyboard, and we don't do much else.

People then still did lots of paperwork and needed space for other things. Of course they didn't have space for a mouse yet.


I wasn't doing too much computing in 1985, but I don't think the issue was so much about small desks as it was about large computers. Likewise, mice were larger then, too. The whole "smaller computers" thing is still really recent, despite the recentness of the whole "computer" thing in general.



