I can easily relate to the frustration with the crude methodologies that dominate software development these days, but I think the author is missing the crux of the whole issue.
Software is pure thought-stuff. Its complexity knows no bounds. The software universe already has many orders of magnitude more complexity than all physical engineering disciplines combined, and that quantity will continue to grow.
To say that we just need to buckle down, measure things, and design rigorous and optimal methodologies is at best naïve. There is no consistency or reproducibility in software development. Every project is different, the ecosystems in which development is done are constantly shifting, the dependencies are often unknown, the big picture is too complex for any one person to fully comprehend, and individual skill sets vary so much as to make any hope of "optimal" architecture all but impossible.
Contrast this with bridge building. Consider the enormous amount of experience and study that has been expended, tens of thousands of times, to fulfill basically ONE requirement that 99% of the population can easily understand: "It needs to be able to support the weight of the traffic going over it." Think of all the complex load-bearing computation, materials science, fluid dynamics, and everything else that goes into meeting this one basic requirement. Sure, nature makes this a challenging task, but at the core you have one simple requirement around which you can design your whole process. You can try different things and iterate toward optimal processes over time, because the requirements are essentially the same every time.
With software you have no such grounding requirement. On the plus side, abstraction allows simple software to be very, very simple. However, for any software of significant complexity, the interactions of concerns and requirements quickly balloon out of control. The only way to manage this complexity is on a case-by-case basis. There are many best practices and common techniques that can be applied to improve quality, but they all have a cost, and they will all have varying degrees of benefit depending on the project and the team. I'd say NASA has very, very good methodologies for ensuring quality, but they cost more than the vast majority of projects could fathom paying even 1% of.
Should everyone who is willing to tolerate some bugs in exchange for a 99% discount get out of the software business? Should all the developers just go home and leave software engineering to be practiced and defined by corporations and government programs with 9-figure budgets?
Good write-up. I'm not a fan of analogies between software and physical construction, mostly because they are often used by people who understand neither and consequently think all physical structures are perfect and software should be too. The author seems to have the right perspective, though (or at least one I share).
I agree the analogy is not right. First, I disagree that the Art of Software Engineering has disappeared, declined, or been lost. Yes, it is continually morphing, and yes, it has its own problems. I have been involved in both 'construction' worlds for over 20 years, and the analogy is misplaced.
'We need an AutoCAD like tool for designing software blueprints.'
No, you don't. AutoCAD replaced manual drafting and took some of the drudgery out of detailing, but it has made construction more 'inefficient'. The ease of change has actually slowed down construction. Project after project in fast-growing areas such as the Middle East has been slowed down by 'design' problems. One 50-storey high-rise in Doha has actually tilted by over 300 mm, and if a solution is not found it will be demolished. Most construction projects use Primavera for planning; one such project that I am involved with has 22,000 activities. No one is using it, due to its sheer volume, and the actual plan is behind the reality on site!
What software engineering and construction both need are better tools to manage teams (not tweet!), tools to manage complexity (not just to-do lists), and tools to manage reusability and the generation of some of the code. You can use AutoCAD to 'draw' out a building floor, produce a blueprint for an aircraft engine, or draw a cartoon. For me, a computer language is my 'Software Engineering Tool'. The 'design methodology and other tools' reside in the computer that lives in my head!
I'm sorry, what? The guy lost me at the supposedly lost art of gathering and packaging software requirements. When was this golden age in which packaged requirements were not almost instantly outdated? Maybe only when the success of software was judged by the speed of shipping versions rather than by whether users need/like/use it. Of course, if you ignore the latter, you can have perfect alignment between requirements and code, but... really?
Yes, we all know that this is the right attitude for certain domains of software development that have become disproportionately prominent as of late.
However, we also need software to guide rockets, monitor nuclear reactors and control medical devices. In such cases, the requirements and the end product have to be complete and flawless the first time around, or people die.
We have not figured out how to do software engineering yet. It's been fun pretending that we will never have to, but we do and we will.
Is that in fact how it happens, though? Complete and flawless the first time around? I'm not that familiar with that sort of embedded software, but my assumption was that they try to do a good job and then use simulators to get the software ship-shape. At least that's what I would do, given my presumption of errors in any human endeavor. I'm much more in favor of building processes that accept failure gracefully, rather than presuming error-free production.
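By "accept failure gracefully" I mean something like the toy sketch below (my own illustration, not anything from the article; read_sensor, the retry count, and the safe default are all made up):

    import random

    class SensorError(Exception):
        pass

    def read_sensor():
        # Stand-in for a flaky hardware read; fails ~20% of the time.
        if random.random() < 0.2:
            raise SensorError("transient read failure")
        return 42.0

    def read_with_fallback(retries=3, safe_default=0.0):
        # Tolerate transient faults: retry a few times, then degrade
        # to a known-safe value instead of assuming reads never fail.
        for _ in range(retries):
            try:
                return read_sensor()
            except SensorError:
                continue
        return safe_default

    print(read_with_fallback())

The point is the design stance: the process assumes errors will occur and plans a degraded-but-safe path, rather than betting everything on error-free production.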
Simulations suffer from the same flaw as any kind of automated testing: you can't verify the correctness of the test, except possibly with another test, which then presents the same problem. At some point, an intuitive leap must be made to call the overall process correct.
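Here's a toy illustration of that regress (a sketch of mine, not from the article): a hand-rolled sort is "verified" against Python's built-in sorted() as an oracle, but the oracle's correctness is itself taken on trust.

    import random

    def my_sort(xs):
        # Simple insertion sort.
        out = list(xs)
        for i in range(1, len(out)):
            j = i
            while j > 0 and out[j - 1] > out[j]:
                out[j - 1], out[j] = out[j], out[j - 1]
                j -= 1
        return out

    for _ in range(1000):
        xs = [random.randint(-100, 100) for _ in range(random.randint(0, 50))]
        assert my_sort(xs) == sorted(xs)  # the oracle is trusted by fiat

You could test sorted() against yet another implementation, but at some point you simply decide to believe one of them.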
This is true of any kind of engineering, but generally the assumptions can be reduced to obvious ones with a reasonable amount of overhead. The complexity and transience of software development make it the exception.
I have little first-hand experience building ultra-critical software, but my impression is that it takes ridiculous amounts of bureaucracy to bring the failure rate down to tolerable levels, making the cost astronomical and prohibiting development on the scale of commodity software.
For the people who have downvoted my comment, it would be helpful to me to know why you did that. It wasn't an ad hominem attack, and my question wasn't all that rhetorical. I thought downvotes were only for being nasty.