> Too small for ML.

What do you mean by this - I assume you mean too small for SoTA LLMs? There are many ML applications where 12GB is more than enough.

Even w.r.t. LLMs, not everyone requires the latest and biggest models. Some "small", distilled and/or quantized LLMs are perfectly usable with <24 GB.
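A rough back-of-the-envelope for whether a given model fits in VRAM: parameter count times bytes per parameter, weights only (this ignores KV cache, activations, and framework overhead, so real usage runs higher). The model sizes below are illustrative examples, not from the thread:

```python
def vram_gib(n_params_billion: float, bits_per_param: float) -> float:
    """Weights-only memory footprint in GiB."""
    bytes_total = n_params_billion * 1e9 * (bits_per_param / 8)
    return bytes_total / 2**30

# A hypothetical 7B model quantized to 4 bits fits comfortably in 12 GB:
print(round(vram_gib(7, 4), 1))   # ~3.3 GiB

# A 13B model at 8 bits is already borderline on a 12 GB card:
print(round(vram_gib(13, 8), 1))  # ~12.1 GiB
```

This is why the parent's point holds: quantized small-to-mid models leave plenty of headroom under 12 GB, while unquantized or larger models do not.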



If you're aiming for usable, then sure, that works. The gains in model ability from doubling size are quite noticeable at that scale, though.

Still... tangibly cheaper than even a second-hand 3090, so there is perhaps a market for it.


Comparing a bottom-of-the-market card with one that has 4+ times the power budget and 6 times the price is, um...



