The mark of a maturing domain is the evolution from only general tools to general + specialized. We've gone from only CPUs, to CPU + GPU, to specialized AI chips (Neural Engine, Tensor chips, etc.), and specialized computing is a big tent that can fit many different architectures together.
Analog computing is the closest thing to bioengineering in fundamental computer science that I know of, so I am confident that it will find a niche. I remember reading about Mythic AI here on HN, who were doing some cool work with analog computing chips for ML. My hunch is that matrix multiplication is the most expensive mathematical operation we do as a society (not unit expensive, but in overall absolute cost) - and our progress in AI is directly proportional to how easy / cheap it is to run.
Maybe I'm missing something, but wouldn't any optical computer still have to funnel its signal through binary logic gates at some point? In what sense is that any more analog than (digital recordings on analog) magnetic tape decoded by a modem? The ultimate computation is still 1/0.
Get a ruler. Draw a line 10 cm long, AB. Now grab end B and draw another line, BC, at a certain angle $\alpha$ to the first. Now measure the distance AC.
Congratulations! You have just built an analog computer that uses the Law of Cosines to solve for line segment AC. Non-dimensionalize your result and it's a general Law of Cosines solver. A problem that (I suspect) would take the majority of modern-day engineering undergrads a week to program without the use of the math lib[1] can be solved by any keen middle schooler.
Now build a robot that measures AC for you and you have an API for your analog computer.
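For comparison, here is the digital equivalent in a few lines (a minimal sketch; the function name and the 60-degree sanity check are my own, not from the example above):

    import math

    # Digital twin of the ruler-and-protractor computer: given AB, BC, and the
    # angle alpha between them at B, the Law of Cosines gives AC.
    def law_of_cosines(ab, bc, alpha_deg):
        alpha = math.radians(alpha_deg)
        return math.sqrt(ab**2 + bc**2 - 2 * ab * bc * math.cos(alpha))

    # Sanity check: AB = BC = 10 cm with alpha = 60 degrees is an equilateral
    # triangle, so AC should come out to 10 cm.
    print(law_of_cosines(10.0, 10.0, 60.0))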
Typically an analog computer is thought of as a set of op-amps and diodes whose currents and voltages solve a set of non-linear ODEs, but that's a very narrow view. An analog computer is, ultimately, any physics experiment whose model is known.
Wind tunnel? A Navier-Stokes analog computer.
Cold atoms traveling through a double slit in a magnetic field? An analog quantum computer.
RCL circuit? An analog computer solving the response of a car's suspension.
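To make that last analogy concrete, both the series RCL circuit and a car's suspension obey the same second-order ODE, so integrating (or physically measuring) one solves the other. A rough sketch, where the quarter-car numbers and the 5 cm curb are assumptions of mine:

    from scipy.integrate import solve_ivp

    # Simplified quarter-car hitting a 5 cm curb: m*x'' + c*x' + k*x = k*road(t).
    # The electrical analog is a series RCL circuit: L ~ m, R ~ c, 1/C ~ k, with
    # charge playing the role of displacement, so measuring the circuit's charge
    # "computes" this same response.
    m, c, k = 300.0, 1200.0, 15000.0          # mass [kg], damper [N*s/m], spring [N/m]

    def suspension(t, y):
        x, v = y                              # displacement [m] and velocity [m/s]
        road = 0.05 if t > 0.1 else 0.0       # 5 cm step input at t = 0.1 s
        return [v, (k * (road - x) - c * v) / m]

    sol = solve_ivp(suspension, (0.0, 2.0), [0.0, 0.0], max_step=1e-3)
    print(f"peak displacement: {100 * sol.y[0].max():.1f} cm")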
[1] Code reuse and libraries are a big reason why digital computers are more popular for solving models nowadays. Cost and bandwidth are others. Ostensibly so is reproducibility. But if computer scientists cannot get reproducible builds, what hope does a humble physicist hacking away in C or Matlab have?
The issues arise when converting the analog outputs of those physical experiments back to the digital domain for processing. High-speed, high-resolution analog-to-digital converters are not cheap, and often require different process node technologies to implement. The cost savings from not computing the models digitally have to be weighed against the cost of the converters you have to introduce.
>> Now build a robot that measures AC for you and you have an API for your analog computer.
This line made me laugh out loud, but what an amazing explanation and set of examples. Thank you.
The original ECU in my 1980 Datsun ... is digital, right? I just never thought of the insane analog vacuum pot system that controls the cold start valve and the air conditioning selector and the cruise control to also be a "computer" but I guess it does compute things, in its humble way ;)
Glad to get a chuckle. I was more proud of my reproducible-builds jab.
Does a 1980's Datsun have an ECU? That's not obvious to me. Even a 6502 would be an expensive add on at the time given car margins. Maybe a one bit controller?
But there are analog solutions to all these control situations. And you don't need vacuum tubes!
Measuring and adjusting the fuel-air mix is the carburetor's job. The carb doesn't just add fuel, it adds the right amount of fuel. Its needle calibrates it for different conditions (namely pressure).
The spark advance "computer" was super cool. Originally spark advance was a lever controlled by the driver: faster rpm -> adjust for more advance. Eventually the lever was attached to a governor (a set of weighted balls on a spinning linkage held by a stiff spring; it measures rpm) and the spark advance control disappeared under the hood.
The automatic version of your car would have had a (very complicated) hydraulic circuit in the transmission pushing pistons that measured throttle, rpm, and speed, and chose the gear accordingly.
The manual version had a vast organic neural network doing the same job using just the pitch of the engine's sound. The manual transmission was cheaper because it didn't include the neural-network shifter, except on the very highest-end cars - think RRs - and then only as a service (an early SaaS model).
German WW2 aircraft were fuel injected, so they probably used a governor attached to the throttle wire plus a pressure gauge to guesstimate the mass flow and therefore the fuel to inject. There was a front-page HN submission about this - the fuel control system of those engines, or maybe the original Mercedes Gull-wing!
All of the hydro-mechanical stuff described above was slowly transistorized (but kept analog) in the 70s. In the late 80s to 90s everything became digital, except the transmissions, which took longer.
Afaik the first mass-produced digital ECU was shipped by GM in 1978 (Motorola 6802-based). Ford also shipped some Toshiba ECUs in 1979. Bosch's first fully digital ECU (Intel 8051-based) was https://en.wikipedia.org/wiki/Motronic in 1979, the same year as Datsun/Nissan. Ford went full ECU in 1983 (https://en.wikipedia.org/wiki/Intel_8061).
huh. So the distinction between analog and digital has nothing to do with whether the logic itself is binary, just whether the delivery of signal to gate is absolute or ranged? Something has to gate the signal, right? I always thought "analog" referred to processes that didn't reduce things to a binary at some step along the way...(?)
As far as the circuits are concerned, there’s no such thing as digital. For human engineers, digital is a convention. Well, technically, there are numerous digital logic conventions based on different voltage standards.
For example, you might decide that 0 volts is a logical 0 and 5 volts is a logical 1. If you get everyone to agree to this convention then you can build components that talk to each other. Unfortunately, it’s very difficult (impossible) to get to exactly 0 or exactly 5 volts. So instead you decide that anything less than 2 volts is a logical 0 and anything greater than 3 volts is a logical 1. This setup makes your circuits quite robust to noise.
To further improve things, you might decide that when you want to output a logical 0 you must produce a voltage less than 1 volt and if you want to output a logical 1 you must produce a voltage above 4 volts. This convention allows your system to continually correct voltages away from the undefined region (between 2 and 3 volts). A marginal input of 3.1 volts gets interpreted as a logical 1 and then output above 4 volts. This “self-correction” is what made digital computers the revolution they are.
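A toy sketch of that convention (the function names are mine; the thresholds mirror the numbers above):

    # Toy model of the logic-level convention described above. Inputs below 2 V
    # read as 0, above 3 V read as 1; outputs are driven back out to the strong
    # levels (< 1 V or > 4 V), which is the "self-correction".
    def read_level(volts):
        if volts < 2.0:
            return 0
        if volts > 3.0:
            return 1
        raise ValueError(f"{volts} V falls in the undefined region (2-3 V)")

    def drive_level(bit):
        # The output spec is tighter than the input spec, so noise gets squeezed
        # out at every stage instead of accumulating.
        return 4.5 if bit else 0.3

    # A marginal 3.1 V input is read as a 1 and re-driven at 4.5 V.
    print(drive_level(read_level(3.1)))   # -> 4.5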
so ..
the only thing I could imagine making a system "analog" would be if each voltage were treated differently, rather than segregated into 0 or 1 as you just described. If a whole range of signal between 0 and 5 volts were directly output to something like a speaker system, that would be analog. I guess I'm wondering how this optical computer would get around the bottleneck of reducing everything to binary as you described with an electrical system.
Digitizing the output of an analog computation doesn't make it a digital computation. It's still analog. The ultimate computation is not binary 1/0; it gets converted to binary after the computation. We may be able to save time or energy by changing representation.
Imagine an NxN matrix-matrix multiply. The computation part is (naively) N^3 multiplies. The conversion back to digital is only N^2 operations, and those operations may be much simpler than digital multipliers. If there's a way to do the N^3 multiplies as analog, then we can potentially save a lot by converting to and from binary to enable the analog phase.
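A back-of-the-envelope sketch of that accounting (the matrix sizes and the cost categories are assumptions of mine, purely for illustration):

    # Count analog multiply-accumulates vs. converter operations for an NxN
    # matrix-matrix multiply. Each output element amortizes N multiplies over a
    # single ADC conversion, which is where the potential savings come from.
    def op_counts(n):
        return {
            "analog multiply-accumulates": n ** 3,     # done in the analog domain
            "DAC conversions (inputs)":    2 * n * n,  # two NxN operand matrices in
            "ADC conversions (outputs)":   n * n,      # one NxN result matrix out
        }

    for n in (64, 1024):
        counts = op_counts(n)
        per_output = counts["analog multiply-accumulates"] // counts["ADC conversions (outputs)"]
        print(n, counts, "multiplies amortized per output conversion:", per_output)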