DAGs are fine. Their DSL is not, because it abstracts the wrong things in the wrong place. It's a global file with static definitions; why hardcode a KubernetesOperator when you may not want one in a test environment? There's also no type safety between tasks/operators. And it's an extremely dependency-heavy package with no client/server isolation, so bundling Airflow for multiple teams just isn't viable.
Why do you think you define DAGs in Python? The whole point is to be dynamic, exactly so you can do things like switch between operator types based on the environment. Sorry, but you don't seem to know much about Airflow given how strong your opinions against it are. I'm out, no offense intended at all.
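For concreteness, here's a minimal sketch of that kind of environment-based switching (import paths and the `schedule` parameter vary by Airflow/provider version, and `DEPLOY_ENV`, the image name, and the commands are placeholders):

```python
import os
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(dag_id="example_env_switch", start_date=datetime(2024, 1, 1), schedule=None) as dag:
    if os.environ.get("DEPLOY_ENV") == "prod":
        # In prod, run the job in a pod; the import path depends on the
        # installed cncf.kubernetes provider version.
        from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

        run_job = KubernetesPodOperator(
            task_id="run_job",
            name="run-job",
            image="myorg/job:latest",
            cmds=["python", "-m", "job"],
        )
    else:
        # In a test environment, a plain BashOperator is enough.
        run_job = BashOperator(task_id="run_job", bash_command="python -m job")
```

Because the DAG file is just Python, the decision of which operator to instantiate can live in the file itself rather than being baked in at authoring time.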
I'm well aware that's "possible", but if you have to build your own abstractions and CI/CD to make it usable this way, it doesn't seem very well designed.
Note that any feedforward neural network (e.g. anything using attention) is also a DAG.
You can always encapsulate more complex logic in a single subroutine, or even adopt a hexagonal or onion architecture for the business logic.
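A minimal sketch of that idea (the names `PaymentPort`, `settle_invoice`, and `FakePaymentAdapter` are made up for illustration): the business logic depends only on a port interface, and the infrastructure choice is an adapter wired in at the edge.

```python
from typing import Protocol


class PaymentPort(Protocol):
    """Port: the only thing the business logic knows about payments."""

    def charge(self, customer_id: str, amount_cents: int) -> None: ...


def settle_invoice(payments: PaymentPort, customer_id: str, amount_cents: int) -> None:
    """Business logic: pure, testable, no infrastructure imports."""
    if amount_cents <= 0:
        raise ValueError("nothing to settle")
    payments.charge(customer_id, amount_cents)


class FakePaymentAdapter:
    """Adapter used in tests; a real adapter would wrap an external API."""

    def __init__(self) -> None:
        self.charges: list[tuple[str, int]] = []

    def charge(self, customer_id: str, amount_cents: int) -> None:
        self.charges.append((customer_id, amount_cents))


# Wiring happens at the edge, so swapping environments never touches the core.
settle_invoice(FakePaymentAdapter(), "cust-42", 1299)
```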
But what do you suggest, besides DAGs plus the saga pattern, that doesn't turn into a ball of mud over time for distributed systems?
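For reference, the saga pattern being asked about, in a minimal orchestrator form (the step names are hypothetical): each step pairs an action with a compensating action, and a failure rolls back the completed steps in reverse.

```python
from typing import Callable, List, Tuple

# Each saga step: (name, action, compensating action that undoes it).
Step = Tuple[str, Callable[[], None], Callable[[], None]]


def run_saga(steps: List[Step]) -> None:
    done: List[Step] = []
    try:
        for name, action, compensate in steps:
            action()
            done.append((name, action, compensate))
    except Exception:
        # Undo completed steps in reverse order, then surface the failure.
        for name, _, compensate in reversed(done):
            compensate()
        raise


run_saga([
    ("reserve_inventory", lambda: print("reserved"), lambda: print("released")),
    ("charge_card", lambda: print("charged"), lambda: print("refunded")),
    ("ship_order", lambda: print("shipped"), lambda: print("shipment cancelled")),
])
```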