
I think "concurrence" here refers to serving many requests concurrently, as opposed to executing concurrently on multiple cores. But yeah, I think you're right, it's not concurrent in the strict sense.

If you wanted, you could probably run as many separate Python processes as you have cores. I doubt it would help that much, since the main bottleneck in this sort of application is usually the I/O involved in handling lots of small packets.
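A minimal sketch of that "one process per core" idea, with details that are my own assumptions rather than anything from the thread: the parent opens a single listening socket, forks one worker per CPU core, and each worker runs its own single-threaded select-based loop on the shared socket (a plain TCP echo stands in for the real workload; the port number is arbitrary, and os.fork limits this to Unix-like systems).

    import os
    import socket
    import selectors

    def worker(listener: socket.socket) -> None:
        """Single-threaded event loop; echoes whatever it receives."""
        sel = selectors.DefaultSelector()
        listener.setblocking(False)
        sel.register(listener, selectors.EVENT_READ, data=None)
        while True:
            for key, _mask in sel.select():
                if key.data is None:
                    # New connection on the shared listener; another worker may
                    # have won the accept race, so tolerate BlockingIOError.
                    try:
                        conn, _addr = key.fileobj.accept()
                    except BlockingIOError:
                        continue
                    conn.setblocking(False)
                    sel.register(conn, selectors.EVENT_READ, data=b"conn")
                else:
                    conn = key.fileobj
                    chunk = conn.recv(4096)
                    if chunk:
                        conn.sendall(chunk)  # fine for small replies in a sketch
                    else:
                        sel.unregister(conn)
                        conn.close()

    def main() -> None:
        listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        listener.bind(("0.0.0.0", 11311))  # arbitrary example port
        listener.listen(128)
        for _ in range(os.cpu_count() or 1):
            if os.fork() == 0:      # child inherits the listening socket
                worker(listener)    # never returns
        while True:                 # parent just waits on the children
            os.wait()

    if __name__ == "__main__":
        main()

Even with one loop per core, each worker is still doing a syscall or two per tiny packet, which is the bottleneck being described above.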

When Facebook made memcached multi-threaded (it's also based on libevent), throughput improved, but if I recall correctly, not that dramatically. They're probably getting more benefit out of it now that they've hacked the kernel to address the packets-per-second bottleneck, but for most people that's probably not worth it.


