I don’t agree with the sentiment. Sure, I don’t understand exactly how it works, and I have no way of training one from scratch on my own, but I also couldn’t build a web server on my own. The thing is, I already self-host a small model for my needs (Vicuna-13B) and it works just fine. The next thing I’d like to try is Mixtral 8x7B, which looks about as capable as GPT-3.5. And all that only a year after the field emerged. Who knows what we could build five years from now.