It's a skill, not a fact. Skills are learned tacitly through experience, like riding a bicycle: you have to ride to learn. The problem is that everything has changed so much.
When I started, compiling took serious time (sometimes hours), so you were much more careful about making mistakes. Compilers also had bugs, as did linkers and debuggers; you had to know how to spot these things, and when to question your code versus when to question your tools.
Operating system containment and protection were more an intention than a reality, and it was relatively easy to lock up or crash a machine with faulty code. Workstation uptimes of weeks were seen as impressive. These days, things are so stable that "uptime" just tells me when my last power outage was.
When we released software it was on physical media, either mailed to people in physical boxes or preinstalled on physical machines that were shipped out. Not making mistakes mattered far more in that situation, since you couldn't just deploy an update without a lot of cost and ceremony.
It's all changed so fundamentally. I'd be open to an instruction course where people have to target some vintage machine (which we'd likely have to virtualize) and just deal with it for six months. I don't know how many signups you'd get, though.
What's the point, though, of learning about a vintage machine? It's fun for hobbyists, but it's not useful for real life. That's exactly the point: the industry has changed, and old dinosaurs, among whom I count myself, have to adapt.
For example, I hate dependency injection. I despise it; I think it's stupid. But my company does it, so I do it. Many other companies do too. I adapt or die.
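For anyone who hasn't bumped into the term: dependency injection just means a class gets its collaborators handed to it from the outside instead of constructing them itself. A minimal sketch in Java, with made-up class names purely for illustration, looks something like this:

    // A minimal sketch of constructor-based dependency injection.
    // All class names here (PaymentGateway, StripeGateway, OrderService)
    // are hypothetical, just to illustrate the pattern.
    interface PaymentGateway {
        void charge(String customerId, long amountCents);
    }

    class StripeGateway implements PaymentGateway {
        public void charge(String customerId, long amountCents) {
            // talk to the real payment provider here
        }
    }

    class OrderService {
        private final PaymentGateway gateway;

        // The dependency is handed in ("injected") rather than
        // created inside the class with `new StripeGateway()`.
        OrderService(PaymentGateway gateway) {
            this.gateway = gateway;
        }

        void checkout(String customerId, long amountCents) {
            gateway.charge(customerId, amountCents);
        }
    }

    class Main {
        public static void main(String[] args) {
            // Wiring happens at the edge of the program (or in a DI framework).
            OrderService service = new OrderService(new StripeGateway());
            service.checkout("cust-42", 1999);
        }
    }

The usual argument for it is that the concrete implementation can be swapped for a test double or configured by a framework; the cost is the extra indirection and wiring.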
With cloud computing from Amazon, Google, et al., we are back to pay-per-cycle, so at least some of it makes sense if you are, for example, deploying millions of CPUs.
Software development is still in its infancy. It will slow down eventually and stabilize.
The last two decades were a madness of invention: computing power doubling almost every three years, new languages and paradigms invented, new tools, the internet, the web, phones, giant displays, small displays with touch. We surely won't get that much change in the next two decades.
People have been beating this drum for as long as I can remember. Nobody wants to have to keep those pesky engineers around to actually do the work; they want to drag and drop some components, provide an incredibly fuzzy description of what it should do, wave their fingers in the air, and voila, working software materializes.
This is one of those "Next year, in Jerusalem" ideas that is perpetually 20 years away from reality.