
I agree there will be some breakthrough (maybe by Nvidia, maybe someone else) that allows these models to run insanely cheap, even locally on a laptop. I could see a hardware company coming out with a specialized card just for consumer-grade inference on common queries. That way the cloud can be reserved for server-side inference and training.
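
For what that split might look like in software, here's a minimal Python sketch of a local-first router with cloud fallback: answer common queries on-device, defer hard ones to a server. Everything here is hypothetical for illustration -- run_local_model, CLOUD_URL, and the confidence threshold are made-up names, not any real API.

    import requests  # any HTTP client works; assumed installed

    CLOUD_URL = "https://example.com/v1/generate"  # placeholder endpoint

    def run_local_model(prompt: str) -> tuple[str, float]:
        # Stand-in for on-device inference (e.g. a quantized model on a
        # dedicated consumer inference card). Returns (answer, confidence).
        return "", 0.0  # stub: always defers to the cloud

    def answer(prompt: str, min_confidence: float = 0.8) -> str:
        text, confidence = run_local_model(prompt)
        if confidence >= min_confidence:
            return text  # common query: handled entirely on-device
        # Fall back to server-side inference for harder queries.
        resp = requests.post(CLOUD_URL, json={"prompt": prompt}, timeout=30)
        resp.raise_for_status()
        return resp.json()["text"]

The interesting design question is the routing heuristic: a real system would need some cheap way to decide when the local answer is good enough before paying for a round trip.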

