If I am wrong, please correct me, but I don't believe traditional electromagnetism can explain the fuzziness (which I think is better described as noisiness) of shadows.
It depends what you mean by "traditional" and "fuzziness".
In ray optics, shadows from point sources are black and white. (If half of the sun is obscured, the surface it illuminates is half as bright, but that's pretty obvious.) That's very old school. Newton could get it right, even though he got particles and waves entirely wrong.
In wave/electromagnetic optics, shadows are shades of grey. Just like water waves can be ripples at one end of a beach, crashing surf at the other, and vary continuously in between.
In quantum optics, shadows are a superposition of different shades of grey, due to shot noise from the photons. That matters if you're doing very precise interference measurements, but not when you're taking photographs, no matter how short the exposure.
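For concreteness, here's a minimal numpy sketch (my illustration, not part of the original answer) of why shot noise stops mattering at photographic exposures: photon arrivals at an idealized pixel are approximately Poisson, so the relative brightness fluctuation falls off as 1/sqrt(N) with the mean photon count N.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pixel collecting a mean of N photons per exposure.
# Photon counts are (to good approximation) Poisson-distributed,
# so the relative fluctuation in brightness is about 1/sqrt(N).
for mean_photons in (100, 10_000, 1_000_000):
    counts = rng.poisson(mean_photons, size=100_000)
    rel_noise = counts.std() / counts.mean()
    print(f"N = {mean_photons:>9,}: relative noise = {rel_noise:.4f} "
          f"(1/sqrt(N) = {mean_photons ** -0.5:.4f})")
```

At the millions of photons per pixel typical of an ordinary photograph, the grey level is deterministic to a fraction of a percent; the superposition of shades only shows up when you count photons one at a time.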
That's just diffraction. Any kind of wave spreads slightly into the shadow behind an obstacle; the waves described by Maxwell's equations are no exception.
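To put a number on how the wave leaks into the shadow, here's a small numerical sketch (mine, using only numpy) of the standard Fresnel knife-edge result: the intensity at the geometric shadow boundary is exactly 1/4 of the unobstructed value, decays smoothly into the shadow, and oscillates (fringes) on the bright side. The Fresnel integrals are evaluated by a simple midpoint sum rather than a library call.

```python
import numpy as np

def fresnel(v, n=200_000):
    """Fresnel integrals C(v), S(v) via a midpoint Riemann sum."""
    dt = v / n
    t = (np.arange(n) + 0.5) * dt
    c = np.sum(np.cos(np.pi * t**2 / 2)) * dt
    s = np.sum(np.sin(np.pi * t**2 / 2)) * dt
    return c, s

def edge_intensity(v):
    """Relative intensity I/I0 behind a straight edge.

    v is the dimensionless Fresnel coordinate: v > 0 is the
    illuminated side, v < 0 is inside the geometric shadow.
    """
    c, s = fresnel(v)
    return 0.5 * ((c + 0.5) ** 2 + (s + 0.5) ** 2)

# At the geometric shadow edge, the intensity is exactly 1/4 of the
# unobstructed value -- grey, not black-and-white:
print(round(edge_intensity(0.0), 4))   # → 0.25
# Deep inside the shadow it decays smoothly toward zero:
print(edge_intensity(-3.0) < 0.01)     # → True
# On the bright side it overshoots 1 in diffraction fringes:
print(edge_intensity(1.25) > 1.0)      # → True
```

So even for a perfect point source and a perfectly sharp edge, Maxwell's equations give a shadow boundary that is a continuous ramp a few Fresnel zones wide, not a step.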
I suspect "fuzziness" here means something specific? E.g. fuzzy-edged shadows from the sun seem pretty obvious - it's not a point light source. So is this referring to... nano-scale edge fuzziness or something? Or the shenanigans needed to do sub-wavelength features (like we do for silicon lithography)?