
One of the things that I fully do not expect to be successful is optical computing. There are just a lot of academic groups doing optics, and they like to invent new reasons why whatever they are up to is relevant. For physics reasons the integration density of optical compute elements is abysmal and will remain so forever. Other technologies like spintronics at least have a chance of working sometime in the future. There were already projects on wafer-scale optical computing at MIT Lincoln Labs in the 80s-90s, so this isn't exactly a new idea either. We have a new group at our institute doing "Neuromorphic Quantum Photonics"; they publish in high-impact glossy journals, but that doesn't change that it is, in my opinion, mostly hype and bullshit.


> For physics reasons the integration density of optical compute elements is abysmal and will remain so forever.

Could you give some details? Claims about "forever" often don't hold up. I guess you're referring to things like component size relative to the wavelength of light used? One could use smaller wavelengths. Integrated photonics is certainly being done and is also commercially relevant (in telecommunications). What integration density would you consider not abysmal? And how much does integration density matter if you have very low loss (which means low power dissipation, a huge problem for semiconductor electronics) and can just make big chips?

There is also research arguing that optoelectronics might eventually be very useful for computing, e.g. recently [1]. (Yes, this is by researchers who need to appear relevant. However, if we dismiss their arguments based on that alone, we can abolish all research altogether.) Why do you disagree? Again, you were talking about forever.

[1]: https://www.nature.com/articles/s41467-022-29252-1


> I guess you're referring to things like component size in relation to the wavelength of light used? One could use smaller wavelengths

The same issues that affect electronic VLSI manufacturing also apply to trying to use light on-chip. The semiconductor industry had to transition to EUV (13.5 nm) light to make it work, but that requires huge and inefficient light sources.

Photonics makes sense if one end of your system has light on it: if you're building a LIDAR system, doing data transmission over fiber, or some such. I have not yet seen anyone doing computation at scale in light.


Visible and near-ultraviolet light have wavelengths of ~400-800 nm, while current-generation transistors have a pitch of ~40 nm. This gets worse because scaling is actually either quadratic (2D) or cubic (future 3D integration), so we are talking about a 100x to 1000x spatial scaling disadvantage at the moment. The only redeeming quality of light is wavelength multiplexing, but that is only useful for a subset of applications, like optical communication and (maybe) convolutions (see below).
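
A quick back-of-the-envelope version of that scaling argument, plugging in the wavelengths and pitch quoted above (illustrative figures, not a precise process comparison):

    # Rough sketch of the density argument: feature size ratio of light
    # vs. current transistor pitch, scaled to 2D and 3D integration.
    optical_wavelength_nm = (400, 800)   # visible / near-UV range quoted above
    transistor_pitch_nm = 40             # current-gen pitch quoted above

    for wl in optical_wavelength_nm:
        linear = wl / transistor_pitch_nm   # 1D feature-size ratio
        print(f"{wl} nm light: ~{linear:.0f}x linear, "
              f"~{linear**2:.0f}x in 2D, ~{linear**3:.0f}x in 3D")
    # 400 nm -> ~10x linear, ~100x (2D), ~1000x (3D)
    # 800 nm -> ~20x linear, ~400x (2D), ~8000x (3D)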

Moreover, even in a hypothetical scenario where we somehow found materials suitable for smaller wavelengths, the de Broglie wavelength of an electron is ~1000x smaller than that of a photon at the same energy. So in terms of integration density, electrons will always have a 10^6 - 10^9 (2D - 3D) theoretical advantage over photons, which means that investment in electron-based computation has a much more likely eventual payoff.
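
A minimal sketch of that de Broglie comparison, assuming a non-relativistic electron and picking 1 eV as an arbitrary illustrative energy:

    # Photon: lambda = hc/E.  Electron (non-relativistic): lambda = h/sqrt(2mE).
    import math

    h = 6.626e-34      # Planck constant, J*s
    c = 2.998e8        # speed of light, m/s
    m_e = 9.109e-31    # electron mass, kg
    eV = 1.602e-19     # J per eV

    E = 1 * eV                                 # illustrative energy
    lam_photon = h * c / E                     # ~1240 nm
    lam_electron = h / math.sqrt(2 * m_e * E)  # ~1.23 nm

    print(f"photon:   {lam_photon*1e9:8.1f} nm")
    print(f"electron: {lam_electron*1e9:8.2f} nm")
    print(f"ratio:    ~{lam_photon/lam_electron:.0f}x")   # ~1000x
    # Squaring (2D) or cubing (3D) that ratio gives the ~10^6 - 10^9 figure.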

Take for example https://www.nature.com/articles/s41586-020-03070-1; they have a bunch of projections for what they hope to achieve over time. The most fantastical figure they give is 50 Peta MAC/s, but this doesn't take into account the PCM programming time.

If you take a look at the supplementary material https://static-content.springer.com/esm/art%3A10.1038%2Fs415... it becomes clear that they currently have a much lower TOPS/Watt figure than current-generation ML ASICs like the TPU, and this neglects all the expensive experimental optical equipment they would need to miniaturise. So even in their most favourable comparison they are 5x worse. Most of these papers are unfortunately full of hype and claims like that.


A 100x loss in density isn't actually necessarily a problem. Optical systems have the potential for much higher clocks (since there's so much less heat), and the actual logic on a CPU is tiny. If you can get 50 GHz clocks, you can lose a lot of density and still win out (I'd take 10x single-core perf over 10 cores any day of the week).
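
A toy illustration of that tradeoff, assuming (purely for the sake of argument) a 5 GHz electronic core, a hypothetical 50 GHz optical core, and a 100x area penalty per optical core:

    # Same area budget, two assumed designs: many slow cores vs. one fast core.
    electronic = {"clock_ghz": 5,  "area_per_core": 1}
    optical    = {"clock_ghz": 50, "area_per_core": 100}

    budget = 100  # arbitrary area units
    for name, chip in [("electronic", electronic), ("optical", optical)]:
        cores = budget // chip["area_per_core"]
        aggregate = cores * chip["clock_ghz"]          # proxy: perf ~ clock
        print(f"{name:10s}: {cores:3d} cores, single-thread ~{chip['clock_ghz']}, "
              f"aggregate ~{aggregate}")
    # electronic: 100 cores, single-thread ~5,  aggregate ~500
    # optical   :   1 core,  single-thread ~50, aggregate ~50

Whether the 10x single-thread advantage is worth the 10x loss in aggregate throughput is exactly the preference stated above; it depends entirely on how parallel your workload is.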


Even optical computing needs lots of standard electronics for storage etc., and as you point out, these consume most of the area. So we are talking about something that consumes 100x more area for a function which is not even the main energy-expensive thing in traditional computers anyway: memory and communication make up at least 1/3 of the energy budget, which doesn't go away by making the rest optical.
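
A minimal Amdahl-style sketch of that energy argument, treating the 1/3 figure quoted above as a given:

    # If memory/communication stay electronic at ~1/3 of the energy budget,
    # making the arithmetic cheaper can never save more than the other ~2/3.
    mem_comm_fraction = 1/3
    compute_fraction = 1 - mem_comm_fraction

    for reduction in (2, 10, float("inf")):
        new_total = mem_comm_fraction + compute_fraction / reduction
        print(f"compute energy /{reduction}: total -> {new_total:.2f}x of original")
    # Even with infinitely efficient optical compute, the total only drops to ~0.33x.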

In fact, reprogramming the optical non-linearities is typically much slower and more energy-intensive than retrieving and flipping some bits, which makes non-static, non-time-multiplexed computation extremely slow compared to whatever the "best"-case static scenario is.


A friend of mine who was researching optical computing around 5 years ago always said "Nah, not even near". Not that it isn't going to happen, but only basic interfaces seemed feasible at the time. (But I'm not familiar with the field and I could be wrong or not remember the exact details.)


More power to them if it works; you can do cool things with it for sure. I am continuously amazed by what you can do in quantum optics etc., but it isn't anywhere near practical except for optical communication.




