I'm consistently surprised at the negative comments on EDA (event-driven architecture) on Hacker News, because there are so many examples of major organizations successfully implementing and running EDA at scale. Here are a few examples:

Uber:
- https://eng.uber.com/ureplicator/
- https://eng.uber.com/reliable-reprocessing/

Google:
- https://cloud.google.com/blog/products/gcp/implementing-an-e...

Twilio:
- https://signal.twilio.com/2017/sf/sessions/18530/building-ro...

Stripe:
- https://stripe.com/blog/canonical-log-lines
My hypothesis about why this is: most organizations probably don't need EDA yet. They don't have that many data producers and consumers, and they don't have HA or other requirements that drive the need, so implementing it is overkill and their experiences have been bad.
The amount of horseshit that's exploded all over any simple-ass system that could fit comfortably on one real mid-range server these days is truly astounding. I thought I was used to churn in this field, but the last five or so years are really starting to strain me. The amount of junk one increasingly must know to work on these over-engineered systems is getting to be more than I can handle - and they never have enough personnel to keep everyone from having to know and constantly work with a dozen different tools and interfaces just to get anything done, on top of what you need for actually writing and working with code.
I only hope that if anything good comes out of the next bubble burst, it's that some of this piles-of-cash-burning, counterproductive insanity gets reined in.
I think it has to do with the idea that you aren't an advanced engineer if you don't use words like 'protocol buffers', 'event-driven architecture', and 'distributed systems engineering' in your day-to-day. Talking in clear and plain terms means you don't know enough to justify your generally very high salary. (Or at least, that's the flawed mindset.)
But "trying to decouple, modularize, and choose the right tools for the job" for your applications seems applicable at any scale.
EDA requires a change in testing methodology and software design, plus a bit of reading, but calling it a "google-scale thing" is pretty fallacious. The idea that a monolith is easier to maintain, or that synchronous inter-component communication is easier to reason about, also seems fallacious.
I'd love to see a chart of developers' perceptions of event-driven microservices plotted against their personal operational expectations.
My point was meant to be more general: people try to implement the "google-scale" version of any given conceptual model even when they don't need to (i.e., this is about "choosing the right tools").
For example, one can do a CQRS-style database setup just fine with a single API server, a single worker process, and a single PostgreSQL database - and since all changes are already driven by command objects, building out other datastores later, if you need them, tends to work out well.
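To make that concrete, here's a rough sketch of the single-node version. Everything here - the command class, table names, and psycopg2 usage - is my own illustrative assumption, not a prescription:

    from dataclasses import dataclass
    from datetime import datetime, timezone
    import json

    import psycopg2  # assumes a reachable PostgreSQL instance


    @dataclass(frozen=True)
    class CreateOrder:
        """Write side: every change enters the system as a command object."""
        order_id: str
        customer_id: str
        total_cents: int


    def handle(conn, cmd: CreateOrder) -> None:
        # One transaction updates current state *and* appends to a command log;
        # the log is what lets you backfill a new datastore later if you need one.
        with conn, conn.cursor() as cur:
            cur.execute(
                "INSERT INTO orders (id, customer_id, total_cents) VALUES (%s, %s, %s)",
                (cmd.order_id, cmd.customer_id, cmd.total_cents),
            )
            cur.execute(
                "INSERT INTO command_log (kind, payload, at) VALUES (%s, %s, %s)",
                (type(cmd).__name__, json.dumps(vars(cmd)), datetime.now(timezone.utc)),
            )


    def orders_for_customer(conn, customer_id: str):
        # Read side: plain queries against the same database - no second store yet.
        with conn.cursor() as cur:
            cur.execute(
                "SELECT id, total_cents FROM orders WHERE customer_id = %s",
                (customer_id,),
            )
            return cur.fetchall()

The single worker process would consume the same command objects from a job table or queue; nothing here requires Kafka until the number of producers and consumers actually grows.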
Though I would also point out that, on average, communication within a process is easier to reason about than communication across processes, because there are fewer failure modes - and synchronous calls are easier to reason about than asynchronous ones, for the same reason.
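A toy contrast of what I mean by failure modes (entirely hypothetical names, with queue.Queue standing in for real inter-process transport):

    import queue
    import uuid


    def reserve_stock_sync(inventory: dict, sku: str) -> None:
        # In-process and synchronous: essentially one failure mode,
        # an exception the caller sees immediately.
        if inventory.get(sku, 0) <= 0:
            raise RuntimeError("out of stock")
        inventory[sku] -= 1


    def reserve_stock_async(requests: queue.Queue, replies: queue.Queue, sku: str) -> None:
        # The same operation across a process boundary picks up new failure
        # modes: no reply, a stale reply, and "did it happen anyway?"
        request_id = str(uuid.uuid4())  # correlation id, needed to match replies
        requests.put({"id": request_id, "sku": sku})
        try:
            reply = replies.get(timeout=2.0)
        except queue.Empty:
            # Timeout: the request may have been lost, or applied but unacknowledged.
            raise TimeoutError("no reply - retry, give up, or reconcile later?")
        if reply["id"] != request_id:
            raise RuntimeError("out-of-order or duplicate reply")
        if not reply["ok"]:
            raise RuntimeError("out of stock")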
Admittedly, a monolith does make it easier not to notice overly tight coupling between components as you introduce it... but then again, it's depressingly easy to accidentally end up with what's essentially a distributed monolith even with a theoretically microservice-based design (googling "distributed monolith" will turn up a bunch of articles with disagreeing definitions of the term ;)