Hacker News

A 100x loss in density isn't necessarily a problem. Optical systems have the potential for much higher clock speeds (since they generate far less heat), and the actual logic on a CPU is tiny. If you can get 50 GHz clocks, you can lose a lot of density and still win (I'd take 10x single-core performance over 10 cores any day of the week).
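A rough sketch of the trade-off claimed above; all numbers are illustrative assumptions taken from the comment, not measured figures.

```python
# Back-of-envelope comparison: hypothetical optical CPU vs. a
# conventional electronic one. All figures are assumptions.
electronic_clock_ghz = 5.0    # typical electronic CPU clock (assumption)
optical_clock_ghz = 50.0      # hypothetical optical clock from the comment
density_loss = 100.0          # 100x fewer devices per unit area (from the comment)

clock_gain = optical_clock_ghz / electronic_clock_ghz  # 10x per-core speedup

# For a latency-bound, single-thread workload, density barely matters:
single_thread_speedup = clock_gain

# For a throughput-bound workload, area (and thus core count) matters too:
throughput_per_area = clock_gain / density_loss

print(single_thread_speedup)   # 10.0 -> the "win" case in the comment
print(throughput_per_area)     # 0.1  -> 10x worse throughput per unit area
```

So the argument only holds for workloads dominated by single-thread latency; anything that scales with core count still pays the full density penalty.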


Even optical computing needs lots of standard electronics for storage etc. As you point out, these consume most of the area, so we are talking about something that takes 100x more area for a function that isn't even the main energy cost in traditional computers anyway: memory and communication make up at least 1/3 of the energy budget, and that doesn't go away by making the rest optical.
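The energy argument above is an Amdahl's-law-style bound. A minimal sketch, assuming (as the comment does) that memory and communication are a fixed ~1/3 of the energy budget and optics only improves the remaining logic:

```python
# Amdahl-style bound on whole-system energy savings when only the
# logic fraction benefits from optics. Fractions are the comment's
# rough figures, not measurements.
fixed_fraction = 1 / 3            # memory + communication (stays electronic)
logic_fraction = 1 - fixed_fraction

def system_energy(logic_improvement):
    """Whole-system energy relative to baseline, when only the logic
    part gets `logic_improvement`x more energy-efficient."""
    return fixed_fraction + logic_fraction / logic_improvement

# Even with infinitely efficient optical logic, energy only drops to 1/3:
print(system_energy(float("inf")))   # 0.333...
print(1 / system_energy(10))         # overall gain with 10x better logic: 2.5x
```

So no matter how good the optical logic gets, the system-level energy win is capped at 3x under these assumptions.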

In fact, reprogramming optical non-linearities is typically much slower and more energy-intensive than retrieving and flipping some bits. That makes non-static, non-time-multiplexed computation extremely slow compared to whatever the "best"-case static scenario is.




