> OpenAI and ChatGPT have been pioneering but they're absolutely going to be commoditized.
I am not sure that it is very interesting that LLM APIs are a commodity. It's not even a situation where they are _going_ to be a commodity; they already are. But so are compute and file storage, and AWS, Google, Microsoft, etc. have all built quite successful businesses on top of selling those at scale. I don't see why LLM APIs won't be wildly profitable for the big providers for quite a long time once the build-out has stabilized, especially since it is quite difficult for small companies to run their own LLMs without setting money on fire.
In any case, OpenAI is building products on top of those LLMs, and ChatGPT is quite sticky because of your conversation history, etc.
There is no way the character licensing survives an hour of contact with the public, unless it is _extremely_ restricted. I can't imagine a worse job than trying to "curate" the torrent of sewage that is going to get created. Deadpool is pretty much the only Disney-owned property this makes sense for.
And I say this as someone who _likes_ using Sora.
> The author doesn't talk at all about the hardware aspect of this stuff such as the surprisingly short lifetime of the GPUs that are being rolled out at a break-neck pace.
I am not sure why that is interesting. Nobody thinks of these chips as long-term assets they are investing in. Cloud providers have always amortized their computers over ~5 years, and it would be very surprising if AI companies were doing much different -- maybe even a shorter timeline.
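Just to make the arithmetic concrete, here's a straight-line amortization sketch with entirely made-up numbers (not any provider's actual figures); the only point is that shortening the schedule from five years to three shifts the same cost into fewer, bigger annual charges:

```rust
// Minimal sketch of straight-line amortization. All figures below are
// hypothetical and only illustrate how the schedule length changes the
// annual charge.
fn annual_depreciation(cost: f64, salvage_value: f64, useful_life_years: f64) -> f64 {
    (cost - salvage_value) / useful_life_years
}

fn main() {
    // Hypothetical cluster: $10M purchase price, $500k assumed resale value.
    let cost = 10_000_000.0;
    let salvage = 500_000.0;

    let per_year_5 = annual_depreciation(cost, salvage, 5.0);
    let per_year_3 = annual_depreciation(cost, salvage, 3.0);

    println!("5-year schedule: ${per_year_5:.0}/year");
    println!("3-year schedule: ${per_year_3:.0}/year");
}
```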
I assume they will produce their own AI once the dust settles, just like they produce their own chips now.
Apple has generally been a company that waits, gets criticized for being behind, and then produces a better version (more usable, better integrated, etc.), claims it is new, and everybody buys it. Meanwhile a few people moan about how Apple wasn't actually the first to make it.
Old Apple wasn't run by ex-Microsoft and ex-consultancy MBAs... a serious cultural rot has set in, and much of the "bottom up" component that powered the innovation is nothing but smoldering coals.
This is one of the best Rust articles I've ever read. It's obviously written from experience and covers a lot of _business logic_ footguns that Rust doesn't protect you against by default, unless you do a bit of careful coding that lets the compiler help you.
So many Rust articles focus on people doing dark sorcery with "unsafe", while this is just normal, everyday API design, which is far more practical for most people.
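As a purely illustrative sketch of what I mean by letting the compiler enforce business logic (my own toy example, not code from the article): you can encode a rule like "only paid orders can be shipped" in the type system, so violating it is a compile error rather than a production bug.

```rust
// Toy typestate example: an order can only be shipped once it has been paid.
// The rule lives in the types, so the rest of the codebase can't route around it.
use std::marker::PhantomData;

struct Unpaid;
struct Paid;

struct Order<State> {
    id: u64,
    _state: PhantomData<State>,
}

impl Order<Unpaid> {
    fn new(id: u64) -> Self {
        Order { id, _state: PhantomData }
    }

    // Consuming `self` means the unpaid order can't be reused afterwards.
    fn pay(self) -> Order<Paid> {
        Order { id: self.id, _state: PhantomData }
    }
}

impl Order<Paid> {
    // `ship` only exists on paid orders, so shipping an unpaid order
    // simply doesn't compile.
    fn ship(&self) {
        println!("shipping order {}", self.id);
    }
}

fn main() {
    let order = Order::<Unpaid>::new(42);
    // order.ship();   // <- would not compile: no `ship` on Order<Unpaid>
    let paid = order.pay();
    paid.ship();
}
```

The nice part is that the business rule is written down exactly once, and everywhere else the compiler checks it for you.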
Two things can be true at the same time:
1) We have barely scratched the surface of what is possible to do with existing AI technology.
2) Almost all of the money we are spending on AI now is ineffectual and wasted.
---
If you go back to the late 1990s, that is roughly where most companies were with _computers_: huge, wasteful projects that didn't improve productivity at all. It sometimes took 10 years of false starts to really get traction.
It's interesting to remember that Microsoft was around back then too; it lost roughly 58% of its valuation and took approximately 14 years to regain it.