Having a solid mental model for “how fast is fast” is, in my opinion, critical to what makes an excellent engineer: knowing when to care about performance up front.

And not even big O notation of algorithms or I/O latency, but a general feel for the performance of higher-level abstractions: the ability to look at a design that involves some input, data processing, a transfer somewhere, rendering, presentation, or whatever, and instantly have an intuition about which parts to worry about most.



I would much rather people know the scale of these numbers than know big O notation. It's frustrating how often I've seen big O used as a hammer because everything looks like a nail.

This is doubly true on embedded systems, where N > 3000 tends to be the uncommon case. Most of the time when I run a performance profile I get a 'flat profile', because the majority of the time is lost thrashing the L1/L2 cache.
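
For what it's worth, here is a minimal C sketch (mine, not the commenter's) of the kind of access pattern that produces that flat, cache-bound profile: the same sum computed row-major and then column-major, where the column-major pass touches a new cache line on nearly every access:

    /* cache.c: compile with e.g. gcc -O2 cache.c */
    #include <stdio.h>
    #include <time.h>

    #define N 4096

    static int m[N][N];  /* ~64 MB of 4-byte ints: far larger than L1/L2 */

    int main(void) {
        long sum = 0;
        for (int i = 0; i < N; i++)   /* fill so the sums can't be folded away */
            for (int j = 0; j < N; j++)
                m[i][j] = i ^ j;

        clock_t t0 = clock();
        for (int i = 0; i < N; i++)   /* row-major: streams through memory */
            for (int j = 0; j < N; j++)
                sum += m[i][j];
        clock_t t1 = clock();
        for (int j = 0; j < N; j++)   /* column-major: ~one cache miss per access */
            for (int i = 0; i < N; i++)
                sum += m[i][j];
        clock_t t2 = clock();

        printf("row-major %.0f ms, column-major %.0f ms (sum=%ld)\n",
               (double)(t1 - t0) * 1000.0 / CLOCKS_PER_SEC,
               (double)(t2 - t1) * 1000.0 / CLOCKS_PER_SEC, sum);
        return 0;
    }

Same big O, same instruction count, wildly different wall time; that gap is what the profiler smears across the whole program.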


When I get onto an unfamiliar platform, I run a test of sequential search vs. binary search on small arrays to see at what element count the crossover happens. Then I know roughly when not to bother with a better algorithm.
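
Roughly what that test looks like; a sketch assuming a sorted int array and libc bsearch(), with the exact crossover depending on platform, compiler flags, and cache:

    /* crossover.c: linear scan vs. bsearch at growing array sizes */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    enum { REPS = 1000000 };

    static int cmp_int(const void *a, const void *b) {
        int x = *(const int *)a, y = *(const int *)b;
        return (x > y) - (x < y);
    }

    static int linear_search(const int *a, int n, int key) {
        for (int i = 0; i < n; i++)
            if (a[i] == key) return i;
        return -1;
    }

    int main(void) {
        for (int n = 4; n <= 1024; n *= 2) {
            int *a = malloc(n * sizeof *a);
            for (int i = 0; i < n; i++) a[i] = 2 * i;  /* sorted, all keys present */

            volatile long sink = 0;  /* keep the loops from being optimized out */
            clock_t t0 = clock();
            for (int r = 0; r < REPS; r++)
                sink += linear_search(a, n, 2 * (r % n));
            clock_t t1 = clock();
            for (int r = 0; r < REPS; r++) {
                int key = 2 * (r % n);
                sink += bsearch(&key, a, n, sizeof *a, cmp_int) != NULL;
            }
            clock_t t2 = clock();

            printf("n=%5d  linear %6.1f ns/lookup  binary %6.1f ns/lookup\n", n,
                   (double)(t1 - t0) * 1e9 / CLOCKS_PER_SEC / REPS,
                   (double)(t2 - t1) * 1e9 / CLOCKS_PER_SEC / REPS);
            free(a);
        }
        return 0;
    }

Below the crossover, the branch-predictable sequential scan usually wins; bsearch() also pays a function-pointer call per comparison, which is exactly the kind of constant factor this intuition is about.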


"How fast is fast" can be known, but how well someone weighs tradeoffs is a better way to test an engineer.



