It's good to question what we teach. For example, I never learned any method for computing a square root to arbitrary precision, whereas my parents did.
But autodiff doesn't replace symbolic differentiation. Consider the derivative of cuberoot(x^3). Autodiff struggles at x = 0 because x^3 is flat there (slope 0) and cuberoot is vertical there (slope inf); chained together, autodiff computes 0 * inf and reports slope NaN.
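Here's a minimal sketch of that failure with hand-rolled dual numbers (the Dual class and cuberoot helper are just illustrative; I'm using np.power for IEEE-style pow semantics, where 0.0 raised to a negative power is inf rather than an error):

    import numpy as np

    class Dual:
        # forward-mode AD value: val carries f(x), dot carries f'(x)
        def __init__(self, val, dot):
            self.val, self.dot = val, dot
        def __mul__(self, other):
            # product rule
            return Dual(self.val * other.val,
                        self.val * other.dot + self.dot * other.val)
        def __pow__(self, k):  # k is a plain float exponent
            # power rule: d(u**k) = k * u**(k-1) * du
            return Dual(np.power(self.val, k),
                        k * np.power(self.val, k - 1.0) * self.dot)

    def cuberoot(u):
        return u ** (1.0 / 3.0)

    x = Dual(0.0, 1.0)          # seed dx/dx = 1 at x = 0
    y = cuberoot(x * x * x)     # cuberoot(x^3)
    print(y.val, y.dot)         # 0.0 nan  -- the 0 * inf from the chain rule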
If you wanted to fix that problem within autodiff: first off, it's probably not worth it, because similar problems are inherent to autodiff; but even to try, you'd need an analytic understanding of what's going on.
It's true that most autodiff systems would behave this way, but a human given only the rules of calculus (the same rules AD uses) would have trouble at the same point. Of course, most mathematicians would apply additional insights while differentiating here (such as cancellation of terms or L'Hôpital's rule). But equally, you could add those things to an AD system; they are orthogonal to the basic derivative transformation, so it's really a difference of degree at best.
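For instance, a symbolic system that's allowed to apply the exponent-cancellation identity gets the clean answer. A quick sketch with SymPy (the positivity assumption is what licenses rewriting (x^3)^(1/3) as x; without it the identity fails on other branches, so SymPy leaves the expression alone):

    import sympy as sp

    x = sp.symbols('x', positive=True)
    expr = sp.cbrt(x**3)        # simplifies to x under the assumption
    print(sp.diff(expr, x))     # 1

A numeric AD never sees that cancellation, because it only evaluates the local derivative rules at a point.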
You'll have to put a bit of work into thinking about it if you don't already see it. At the risk of being more rude than effective: I spelled it out pretty clearly above, assuming you understand autodiff, which requires understanding the chain rule, which most people first meet in a calculus class.
If you want to claim that autodiff is a replacement for all our differentiation needs, then you should try to understand when it doesn't work.
Differentiating a fractional power (with the fraction less than 1) produces a negative fractional power. Raising 0 to a negative power involves dividing by zero. This gives you some kind of NaN in, e.g., vanilla languages such as Python. The "issue with automatic differentiation" is really an issue with dividing by zero. Graph x^(1/3): its slope is vertical at zero.
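To spell out the arithmetic in IEEE-754 terms (a quick check using NumPy's pow):

    import numpy as np

    # power rule: d/dx x**(1/3) = (1/3) * x**(-2/3)
    inner = np.power(0.0, -2.0 / 3.0)   # inf, with a RuntimeWarning
    # the chain rule multiplies that by the slope of x**3 at 0, which is 0:
    print(inner * 0.0)                  # nan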
My mistake then. I did not intend to be so precise; I was thinking loosely. I have a very simple hand-rolled autodiff tool (in Python) here to play with. I raise 0 to a negative exponent in the gradient of the cube root at 0. I do not get a NaN. Strictly speaking I get this:
"ZeroDivisionError: 0.0 cannot be raised to a negative power"
Is that satisfying? I do not claim anything more specific here. Sorry to be imprecise.
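For what it's worth, that error is just CPython's float pow, reproducible with no autodiff machinery at all:

    >>> 0.0 ** (-2.0 / 3.0)   # the u**(k-1) factor from the power rule at 0
    Traceback (most recent call last):
      ...
    ZeroDivisionError: 0.0 cannot be raised to a negative power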