
I suspect it is common. And mostly inevitable. When you have 10^N engineers working on something, protecting them from each other (and themselves) grows increasingly important. At some point, it becomes your main priority, as without it you have a totally non-functioning system.

Which partly implies that the only solution is to not grow to that scale. Keep a smaller, more skilled team instead. In most cases I think that'd be better, and most people (and companies) would probably agree, if only they could hire those skilled teams reliably...

... but sometimes throwing more people at a problem really is the best (or only) option, e.g. when you have to deal with a large volume of ridiculous external constraints. For example: my first job was at a company that aggregated loan providers, which meant ingesting hundreds of random PDFs per day and making dozens of weird RPC calls. We had dozens, if not hundreds, of hand-crafted bits of automation to detect when things changed and handle the new format. Many of those APIs needed hand-rolled XML producers and parsers: producers because the bank on the other end would choke if your attributes weren't in a specific order, or if a given field wasn't within a specific number of bytes of the start of the request; parsers because the responses were often invalid XML. It was absolutely ridiculous, but there was no way a bunch of banks were going to fix their APIs for us, so there wasn't much of an option but to do it by hand (a sketch of what that looked like follows).
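
To make that concrete, here's a minimal sketch of the kind of hand-rolled handling this implies. Everything in it (the element names, the attribute order requirement, the Decision field) is hypothetical, not the actual code; the point is the technique: build the request as a raw string so attribute order and byte layout come out exactly as the remote parser expects, and scrape the response with a regex because it may not be well-formed XML.

    # A minimal sketch, assuming a hypothetical bank endpoint. The element
    # names and fields are made up; the real code was per-bank and messier.
    import re

    def build_loan_request(applicant_id: str, amount_cents: int) -> bytes:
        # Assemble the request as a raw string instead of using an XML
        # library: libraries are free to reorder attributes, but this
        # (hypothetical) endpoint chokes unless applicantId comes before
        # amountCents and the opening tag sits at a fixed byte offset.
        body = (
            '<?xml version="1.0" encoding="UTF-8"?>'
            '<LoanRequest applicantId="{aid}" amountCents="{amt}">'
            '<Channel>AGGREGATOR</Channel>'
            '</LoanRequest>'
        ).format(aid=applicant_id, amt=amount_cents)
        return body.encode("utf-8")

    def parse_decision(raw: bytes):
        # The response is frequently not well-formed XML (unescaped
        # ampersands, mismatched tags), so scrape out the one field we
        # need with a regex rather than feeding it to a strict parser.
        match = re.search(rb"<Decision>\s*(\w+)\s*</Decision>", raw)
        return match.group(1).decode("ascii") if match else None

    # Hypothetical usage: POST build_loan_request(...) over HTTPS, then
    # pass the raw bytes of the reply to parse_decision.

Neither half is something a generic library can do for you, which is why it ends up as hand-written code per integration.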


