That's definitely a very spicy take, but I think people need to be more willing to engage with the idea. There is a serious problem: software engineers are building systems that, when built poorly, cause significant harm.
If we don't engage with that problem we're going to get regulated in a way that we may not like.
edit: Look at how many comments are just attacking a straw man policy. This is why we're going to end up getting regulated in some dumb way - people can't or won't even imagine a situation better than this one.
This is one of the few opinions which are consistently rejected by the HN peanut gallery, unsurprisingly since most of us are developers. OP is brave to outright say it!
Any time you bring up the idea of accountability to “engineers” for defects in their software that cause serious, real life harm, you get a litany of excuses in response instead of good-faith debate. If an engineer signs off on a bridge and it collapses, injuring a dozen people, there will at the very least be questions asked, if not more accountability. But if a software engineer implements a shoddy system that gets hacked and results in this kind of very personal data getting exfiltrated, all you get is 1,000 versions of “it’s not the developers’ fault!” and “they were probably under an impossible deadline!” and “a bad manager told them what to implement, they had no choice!” and “the market forces this, careful developers can’t compete!” and so on.
Saying there should be licensure with revocation as a consequence in extreme cases? Now that’s crazy-talk here!
> Any time you bring up the idea of accountability to “engineers” for defects in their software that cause serious, real life harm, you get a litany of excuses in response
I'm generally against it, because I don't trust anyone to regulate this in a sane manner, but I'm open to discussion.
If we're going to have engineers sign off on things and be legally liable, there has to be a defined context of use that they're signing off on as well. If a civil engineer signs off on a pedestrian bridge, and then people decide to drive semi-trucks over it, there's generally no liability for the original engineer. Same situation if an engineer signs off on a sturdy bridge, and someone makes a bunch of changes to materials without telling them.
These seem roughly analogous to someone opening up something designed for private use to the open internet, and someone making changes to a function without telling the original engineer, respectively. These have to be excluded, or nobody can ever sign off on anything.
Yep, context and environment are important. You're signing off on your product being used in a particular context and in a particular environment. The semi-truck driving over a pedestrian bridge is a great example.
Unfortunately for much of software engineering, our "environment" is the open Internet, where largely invisible, international, adversarial attackers are working 24 hours a day, seven days a week. With Internet-connected software we can't just say "Oh, this software's intended environment is a clean-room LAN with no connected devices! That's all I'm signing off on." That's not reality. As for your example, companies should really, really have a hard conversation before taking software designed for private use and just opening it up to the Internet without hardening it sufficiently. Accountability would help make that conversation possible.
Professional ethics generally apply to representing the interests of your clients, not their victims. Engineering ethics and professional licensure don't discourage creating new weapons systems (e.g., armed drones) or more unnecessary prisons. Which means that for the main problem we're facing, the big tech surveillance panopticon, widely understood engineering ethics has basically nothing to say! I wish this were different, but it's not.
There's also the practical problem that this software wasn't even produced in the US. So then also ban the importation of any software that wasn't produced under a similar regime?
So no, the idea that licensing software engineers could correct the problems of our industry seems wholly untenable, and it ultimately comes from the same broken political vein of finding some fall guy to blame rather than directly regulating the behavior of malevolent companies.
The bigger flaw in your argument is assuming that imposing strict regulations on developers in your own country would have magically prevented this software breach from engineers and a company in a foreign country.
I’m perplexed by how many people will see stories about companies having problems in foreign markets and conclude that we need to make things harder for ourselves domestically.
Programmers are more fungible than your local licensed doctor or your local licensed PE structural engineer.
> The bigger flaw in your argument is assuming that imposing strict regulations on developers in your own country would have magically prevented this software breach from engineers and a company in a foreign country.
Foreign companies care about complying with US policy when they sell to the US. It's not that complicated and I've said this repeatedly. Beyond that, one does not need an incredible imagination to think "how would these problems look domestically" so I don't know why so many people are hyperfocused on one problem existing in one country. It's not like the US doesn't see massive breaches constantly.
I have never once written bug-free code. Ever. I can mitigate with a myriad of tools, but sooner or later a bug will pop up. Nobody writes bugs on purpose, so of course the idea is ludicrous.
I don't know what you're talking about wrt controlling what the bug will be, but certifications like SOC2 already exist, and they have nothing to do with bugs.
What if an engineer designs a lock and someone picks it? What if an engineer designs a bridge that someone later destroys by finding a weak point and placing explosives there?
Root cause analysis with findings made public. What exactly was signed off on, in what context, what was the failure, and was it a design defect? If it was a design defect, what action could have prevented it? Did the engineer sign off that the action was conducted?
Was the defect known at the time of shipping? I personally don't think it's fair to hold people accountable for unknown defects, as long as corrective action happens when they become known. You could limit accountability to known defects and still find lots of serious problems. We all know that our companies ship software with known defects all the time!
I like how it works in aviation: Any time something happens, the NTSB swoops in and does a report and publishes their findings so everyone can learn from them. They list primary and contributing causes. If negligence or violations of FAR are found, then the FAA may start down the path of certificate suspension/revocation.
Software is not as serious or deadly as aviation, so you wouldn't take it that far, but I'd say it's directionally where we should be going. Baby steps...
Your opinion is admirable but you're missing something.
I, as a software engineer, sign nothing. I do not have a license, I do not have a labor union, and I do not have the "professional stature" to tell my boss to fuck off when he suggests something dangerous. The reason the engineering disciplines have such benefits is that after enough cars crumpled and bridges fell, the leaders in those industries conceded they needed to listen to professionals. I hear similar bad-faith arguments like "you can always quit!" or "your job is to write good code!". I can't always quit, because I just got done being jobless for three months while interviewing. My job was never to write good code. It was to deliver something on budget. If it so happens I get the time to write good, safe code, it was either an accident by management or something I did in my spare time. If you think this is a "bad faith" argument for software engineers being unlicensed, you have never worked in the industry outside of high-speed military stuff.
I have no stake in the code I write because the code I write means nothing. I have no one to defend me and nothing to fall back on that qualifies me as an expert that can tell a VP of engineering to fuck off. Software engineering is an over-educated phrase for the new factory worker. We don't send factory workers to the brig for screwing up a widget. Neither will we send software engineers.
If you want to change that, it begins at the top. You need a labor union and/or an accreditation board backed by the largest companies guiding the industry. For that, I say, good luck. There's no fundamental physics of software engineering. There's no "basic safety" in software engineering. There are N languages and N+1 ways to blow your own foot off. Standardization would not be well received, because the language itself would need to be blessed. This works well for the military, who likely still run a copy of GCC from 1992. It does not work for an industry evolving by the day. What does a PE look like in software engineering? What language? What planning framework? What compiler? And this isn't even beginning to contend with the fact that a licensing scheme would send every H1B in America home and crush companies like InfoSys overnight. There's a lot of capital in just these two places to fight the licensing battle for several generations.
Conflating engineering with software is a hazard. Software "engineers" are just laborers. Very well paid, but fundamentally no different than the guy who built your house. You may be able to argue language designers, software architects, etc would need to be "licensed" but the actual people writing the code are digital welders, house builders, and painters. These people are bonded but not licensed (usually). Companies implicitly bond their software engineers by eating the N million dollar cost of a mistake.
> I, as a software engineer, sign nothing. I do not have a license, I do not have a labor union, and I do not have the "professional stature" to tell my boss to fuck off when he suggests something dangerous.
I think that's the original-OP's call to action: Our industry should set up this licensure/accountability infrastructure. Software Engineers should sign off that their creations will not cause harm. They should have a license they can point to when the V.P. tells them to ship harmful software. They should have a strong union behind them to give teeth to their ethical position.
Software Engineering is decades old and full of very capable professionals. We should be working to set up the mechanisms of "professional stature" similar to what allows other licensed engineers to seriously push back against doing harm.
4. What does pay look like after bonding and licensing?
5. What do we do about H1Bs and foreign contractors?
6. How do you fight the, likely, trillions in capital that will be deployed from companies that contract engineers?
7. How do you fight the billions of dollars invested into agile project management?
8. What do we do with software written outside the country? Even if it's written by someone licensed in their country?
9. How long does adoption of new software take?
There have already been several failed attempts at licensing and achieving a PE status for software. It occasionally makes the news in IEEE. None of them have worked because none of them address these points. Worse, the ivory tower "we should seriously think about this" people address even fewer points than the organizations that failed. This battle has been fought and lost numerous times.
I think the pro-licensing crowd forgets that licensing implies there will be exactly one (maybe two) blessed languages. It may even mean one or two blessed editors, one blessed UML software, one blessed planning framework (waterfall), etc. The language will either be C++ or Java and that will be how we code for the duration of the licensing scheme. Don't believe it will be either C++ or Java? Well, simply ask what your government uses.
I'm not designing the whole licensure system in an HN post. If there were an actual, serious push for this, people would figure all those things out. I didn't realize there had already been failed attempts; maybe the next try could learn from them. OP and I are fully aware this is a pipe dream, seeing how far along the "wild west" track we've gone.
I also don't believe licensing necessarily implies there will be one or two "blessed" languages or technologies. What might be hard is finding someone willing to sign off on 100,000 lines of unsafe C++ code vs. 5,000 lines of sandboxed Python.
I didn't intend for you to design the system, but rather to demonstrate every single hurdle it would have to overcome.
As for your hypothesis, I'd go the opposite way. Dynamic typing would basically be untenable. I would NEVER sign off on anything hinting at dynamic code. I have had so much apparently well-written, apparently well-tested dynamic Python code completely blow up in my face. That's why I suggest languages like C++ and Java. They have insane adoption levels and are where all the "probable PEs" probably work. While both are still weakly typed in a literal sense, you can box the types in such a way that the compiler, and therefore the engineer, can make promises. This is actually one of my main gripes with the industry in general. I am a Python developer. I predict Python will become the new Javascript: a language we should've, in hindsight, just let rot in the past. Just trying to standardize what safe concurrency and parallelism look like in a post-licensing world gives me a headache.
It's actually a very good argument for the complete destruction of dynamic type systems in favor of a Haskell-esque, Rust-esque, Ada-esque development methodology. I am not entirely opposed to the idea, though the cult of agile will beg to differ. The standard software engineers would have to rise to would necessitate a formal education, and the entire system we have would be upended. Everything outside of RTOS-level development is dynamic these days. It's weird knowing everything can blow up in your face and you'd never be able to predict why. Imagine a bridge builder saying "we tested everything, but there's still a 30% chance the bridge just implodes by itself."
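To make the "compiler can make promises" point concrete, here's a toy sketch (assuming a checker like mypy runs over it; the function and field names are made up for illustration):

    from dataclasses import dataclass

    # Dynamic style: nothing stops a caller from passing the wrong shape.
    # This "works" until, months later, someone passes the wrong dict key
    # and it explodes at runtime in production.
    def total_cents(orders):
        return sum(o["price"] * 100 for o in orders)

    # Boxed/typed style: the type is a promise the checker can enforce.
    @dataclass(frozen=True)
    class Order:
        price_cents: int  # integer cents, no float rounding surprises

    def total_cents_typed(orders: list[Order]) -> int:
        return sum(o.price_cents for o in orders)

    # A type checker rejects this call before it ever ships:
    # total_cents_typed([{"price": 9.99}])  # error: incompatible type

Same amount of code, but now the checker, and therefore the engineer, can actually promise something about what happens at runtime.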
I'm up for Ravenscar-profile Ada. I don't think 95% of the industry is on my side on that one, though :).
I think you're misunderstanding how licensing works. It's very rarely the case that one technology is blessed and another is not. That may happen in cases like FIPS but for the majority of existing software regulation it's far more about threat modeling and showing that the threats are considered.
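For illustration, "showing the threats are considered" usually looks less like a blessed language and more like a recorded, reviewable entry per threat. A minimal sketch (the schema here is invented, not from any particular standard):

    # Toy threat-register entry: invented schema, for illustration only.
    threat_entry = {
        "asset": "customer PII database",
        "threat": "SQL injection via the public search endpoint",
        "mitigations": ["parameterized queries", "input length limits"],
        "residual_risk": "low",
        "reviewed_by": "responsible engineer, with sign-off recorded",
    }

What the auditor cares about is that this record exists and was reviewed, not which language or framework the endpoint was built in.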
Do you think a PE building a bridge doesn't have rushed deadlines? That they don't face pressure from bosses to do it faster and cheaper? That they are not faced with the choice to design something immoral or quit? Regular engineers face all these same problems every day. It's simply a matter of having the fortitude to stand up and call things out when they are irresponsible, and to leave if they stay that way. There is no magic PE stamp that you can flash at management so that suddenly they have to listen to you.
China will care a lot about regulations in the US that prevent them from selling products to us. More so when the myriad US allies enforce similar laws, which the US can very trivially influence them to do.
Anyway, my point isn't "we should regulate" or that "regulating will work". My point is that we need to start thinking much more seriously about our responsibilities because, if we don't, the government will do that thinking for us.
Importing products made from IP theft is illegal. How effective has that been?
(Before you care to answer that, know that I have a product that has been ripped off in China. My code is put onto cloned devices, and you can buy them on Amazon right now. We’ve gone from reporting, to lawyers, to working with customs police in USA, Canada, China, and Europe. We’ve even had multiple state reps, and two US Senators (for both states we produce in) involved. Nothing is effective and China really really really doesn’t care.)
Anyone that says “China will care if we just…” has little idea of this situation in my opinion.
I'm not saying that it would be the right way to go. I'm saying that other companies absolutely care about US policy.
As for IP, this has increasingly led to tensions between the countries. To say that China doesn't care is silly, China cares and they spend a lot of effort making it viable to continue their practices.
>I'm saying that other companies absolutely care about US policy.
Yea? Like Amazon and Walmart? It took a long time to get them to take our counterfeits down, and even then they only applied it to a single vendor, so another pops up instantly.
Here I am with first hand knowledge of the situation over years of difficulty and finally defeat, but your opinion is good too.
Yes, like those companies. They care a lot about the law. That's why they spend so much time and money trying to create new laws or defend themselves against the laws that they break. It takes serious effort for them.
Your personal experience is irrelevant and is clearly biasing your opinion. It's obvious that companies care deeply about the law; even the ones who flout it do so only with great investment or with the belief that it will ultimately be worth it. Just because you were on the receiving end of the 'and it was worth it to them' case does not change that.
I'd be supportive of licensing, but it needs to go after the right things. Currently the industry is dominated by academics who took their CS & CE degrees and made them barriers to entry for jobs. Real software engineering involves very little DS&A. If we made people demonstrate competency around OWASP at the infrastructure and dynamic-application level, I think licensure would go a long way.
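By competency I mean things like spotting the difference below at a glance. A toy sqlite3 sketch (the table and variable names are made up):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'a@example.com')")
    payload = "nobody' OR '1'='1"  # classic injection input

    # Vulnerable: user input is spliced into the SQL string itself.
    leaked = conn.execute(
        f"SELECT email FROM users WHERE name = '{payload}'"
    ).fetchall()  # returns every row: the OR clause matches everything

    # Safe: parameterized query; the driver treats input as data, not SQL.
    safe = conn.execute(
        "SELECT email FROM users WHERE name = ?", (payload,)
    ).fetchall()  # returns []: no user is literally named the payload

That's the kind of thing a meaningful license exam would test, rather than inverting a binary tree on a whiteboard.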