
I did do mainframe programming and I do webapps and phone apps these days, so I've kept up.

It's gotta be both approaches (yours and the article's), but the real problem is that the demand for programmers is so high and the barrier to entry so low that quality has suffered: the quality of the libraries, build systems, documentation, designs, interfaces, all of it.

You can't magically drag all modern programmers through the mud, making them use line editors and 16-bit processors for two years to learn everything the painful way.

I honestly don't see a way out. We're in the eternal September of software development, and it's all downhill from here unless we make some commitment to raising the barrier to entry and decreasing the incentives, making it hard again.



"the real problem is the demand for programmers is so high and the barrier to entry is so low that the quality has suffered"

Interestingly, I see a whole world of difference in quality between Python and Perl libraries compared to JavaScript libraries. OK, not all Python libraries are perfect and not all JS libraries are shit, but in general the backend stuff seems of a far higher quality.


There's always a "first" language - the first programming language that gets learned and taught. It was QBasic in the '90s, for instance, then Visual Basic, Java, PHP for a bit, then Ruby - it's certainly JavaScript now.

The first languages, during their reign as first languages, are always derided. When something snatches the crown from JavaScript (it'll happen, however inconceivable it seems), I'm sure things will settle and it won't be so bad any more.

It's like how at the end of its life, most of the people still using AOL instant messenger were respectable computer experts. Same idea.


You can't teach new people one by one, but you can write articles and blog posts, then post them here on HN just like this guy did.


It's a skill, not a fact. Skills are learned tacitly, through experience, like riding a bicycle: you have to ride to learn. The problem is that everything has changed so much.

When I started, compiling took serious time (sometimes hours), so you were much more careful about making mistakes. Compilers also had bugs, as did linkers and debuggers; you had to know how to spot these things, and when to question your code versus when to question your tools.

Operating system containment and protection was more an intention than a reality, and it was relatively easy to lock up or crash a machine with faulty code. Workstation uptimes of weeks were seen as impressive. These days, things are so stable that "uptime" just tells me when my last power outage was.

When we released software it was on physical media, which was either mailed to people in physical boxes or running on physical machines that were shipped out. Not making mistakes was much more important in that situation since you couldn't just deploy an update without a lot of cost and ceremony.

It's all changed so fundamentally. I'd be open to an instruction course where people have to target some vintage machine (which we'd likely have to virtualize) and just deal with it for six months. I don't know how many signups you'd get, though.


What’s the point, though, in learning about a vintage machine? It’s fun for hobbyists, but it’s not useful for real life. That’s the point. The industry has changed, and old dinosaurs, of which I consider myself one, have to adapt.

For example, I hate dependency injection. I despise it; I think it’s stupid. But my company does it, so I do it. Many other companies are doing it. I adapt or die.
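
For readers who haven't run into it: dependency injection just means a component gets its collaborators handed to it from outside rather than constructing them itself. A minimal TypeScript sketch of constructor injection, with PaymentService, EmailSender, and SmtpEmailSender as made-up names for illustration:

    // A minimal dependency-injection sketch (hypothetical names for illustration).
    // PaymentService receives its EmailSender from outside instead of
    // constructing one itself, so a test can pass in a fake implementation.
    interface EmailSender {
      send(to: string, body: string): void;
    }

    class SmtpEmailSender implements EmailSender {
      send(to: string, body: string): void {
        console.log(`SMTP mail to ${to}: ${body}`);
      }
    }

    class PaymentService {
      // The dependency is injected through the constructor.
      constructor(private readonly mailer: EmailSender) {}

      charge(customer: string, amount: number): void {
        // ...charging logic would go here...
        this.mailer.send(customer, `Charged $${amount}`);
      }
    }

    // The "composition root" wires the concrete pieces together.
    const service = new PaymentService(new SmtpEmailSender());
    service.charge("alice@example.com", 42);

Frameworks automate the wiring step, which is usually where both the complaints and the enthusiasm come from.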


With Amazon, Google, et al. cloud computing, we are back to pay-per-cycle, so at least some of it makes sense if you are, for example, deploying millions of CPUs.
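
At that scale the arithmetic is blunt. A back-of-the-envelope sketch in TypeScript, using a made-up hourly rate rather than an actual AWS or GCP price:

    // Back-of-the-envelope: why per-cycle billing makes efficiency matter again.
    // All numbers below are assumed placeholders, not real cloud prices.
    const pricePerVcpuHour = 0.04;   // assumed $/vCPU-hour
    const vcpus = 1_000_000;         // "millions of CPUs"
    const hoursPerMonth = 730;

    const monthlyCost = pricePerVcpuHour * vcpus * hoursPerMonth;
    const savedByTenPercent = monthlyCost * 0.10;  // a 10% speedup is real money

    console.log(`Monthly compute bill: $${monthlyCost.toLocaleString()}`);
    console.log(`Saved by a 10% speedup: $${savedByTenPercent.toLocaleString()}`);

Under those assumed numbers, a 10% efficiency improvement is worth millions per month, which is why the old habit of counting cycles starts to pay again.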


Software development is still in its infancy. It will slow down eventually and stabilize.

The last two decades were a madness of invention, with computing power doubling almost every three years, new languages and paradigms invented, new tools, the internet, the web, phones, giant displays, small displays with touch. We surely won't get that much change in the next two decades.


I believe in the next 2 decades human programmers will be replaced by artificial intelligence.


People have been beating this drum for as long as I can remember. Nobody wants to need those pesky engineers around to actually do the work - they want to drag and drop some components, provide an incredibly fuzzy description of what the software should do, wave their finger in the air, and voila, working software materializes.

This is one of those "Next year, in Jerusalem" ideas that is perpetually 20 years away from reality.



