It seems to me that punters overly excited to declare that everything we've ever done up to this point is wrong and that there's a new paradigm coming that will obsolete it all, as well as people reflexively critical of the new thing, tend to miss a very important property of digital computation that will prevent analog computation from ever simply "taking over": digital computation is essentially infinitely composable.
It does not matter how many gates you throw your ones and zeros through; they remain ones and zeros. While floating point computations carry challenges, they are at least deterministic challenges. You can build something like a SHA-512 hash that ingests gigabytes upon gigabytes, with every step of the computation critically dependent on all previous steps, a cascade of literally billions and billions of operations, and deterministically get the exact same SHA-512 digest for the exact same input every time.
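To make that concrete, here's a tiny Python sketch (the chunk count and sizes are made-up numbers, purely for illustration): stream a pile of bytes through SHA-512, where each chunk's effect depends on the exact state left by every chunk before it, and you still get a byte-identical digest on every run.

    import hashlib

    def sha512_of_stream(chunks):
        # Each update folds the chunk into a state that depends on every
        # previous chunk; one flipped bit anywhere changes the final digest.
        h = hashlib.sha512()
        for chunk in chunks:
            h.update(chunk)
        return h.hexdigest()

    # ~1 GB of input, split into 1 MB chunks (sizes are illustrative).
    data = [b"x" * 1_000_000 for _ in range(1024)]
    print(sha512_of_stream(data) == sha512_of_stream(data))  # True, every time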
This property is so reliable that we don't even think about it.
Analog computers cannot do that. You could never build a hash function like that out of analog parts. You cannot take the output of one analog computation, feed it into the input of another, repeat that billions upon billions of times, and get a reliable result. Such a device would simply be a machine for generating noise.
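For intuition only (this is a toy model, not any real analog machine): suppose each analog stage is mathematically a perfect identity but, like real hardware, is only accurate to within some small tolerance per pass, say 0.1%. Chain a million of those stages and the output has typically drifted far from the input; chain billions and it's just noise.

    import random

    def analog_identity_stage(x, relative_noise=1e-3):
        # Mathematically an identity, but each pass picks up a small
        # random relative error, the way physical analog stages do.
        return x * (1.0 + random.gauss(0.0, relative_noise))

    x = 1.0
    for _ in range(1_000_000):
        x = analog_identity_stage(x)
    print(x)  # typically far from 1.0; the errors compound with every stage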
Analog computing fits into the computing paradigm as another "expansion card". It may take isolated computations and perform them more efficiently, perhaps even important computations, but they will always be enmeshed in some digital computing paradigm. Breathless reports about how analog is "coming back", coming soon, and taking over are just nonsense. (I speak generally; this walled-off article may or may not have made such claims, I dunno.) So many things you take for granted about how digital computers work are simply impossible for analog computers, structurally: something as simple as loading a compressed representation of your analog computer's starting state requires a digital computer, because our best compression algorithms have the same deep data dependencies I mentioned in the hashing case.
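A quick illustration of that data dependency, using zlib purely as an example: corrupt a single byte partway through a compressed stream and everything downstream of it typically becomes undecodable, which is exactly the kind of cascading dependence an analog pipeline can't tolerate.

    import zlib

    original = b"the quick brown fox jumps over the lazy dog " * 10_000
    damaged = bytearray(zlib.compress(original))
    damaged[len(damaged) // 2] ^= 0xFF  # flip the bits of one byte mid-stream

    try:
        zlib.decompress(bytes(damaged))
    except zlib.error as err:
        # Everything after the damaged byte depended on it; the stream
        # no longer decodes (or fails its checksum).
        print("decompression failed:", err)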
Useful, interesting, innovative, gonna make some people some money and create some jobs? Sure. Something we should all go gaga over? No more than a new database coming out. It's going to be a tool, not a paradigm shift.