> Imagine not understanding that their main way of making money is through their API for other companies, and not through a product. They are focused on doing what they are good at: good AI models. They let other companies take the risk of building products on top of them, and reap the benefits from those products.
There is no moat in an API-gated foundation model. One LLM is as good as any other, and it'll be a race to the bottom.
The only way to mint a new FAANG is to build a platform that captivates and ensnares the populace, like iPhone or Instagram.
The value in AI will be accrued at the product layer, not the ML infra tooling, not the foundation model. The product layer.
It might be too late to do this with LLMs and voice assistants, though. OpenAI is super distracted, and there's plenty of time for Google, Meta, and Apple to come in and fill the void.
Everyone was too busy selling the creation of gods, or spreading FOMO to elevate themselves to lofty valuations. At the end of the day, business still looks the same as it always has: create value for customers, ideally in a big market where you can own a large slice. LLMs and foundation models are fungible and easy.
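To make the "value accrues at the product layer" point concrete, here's a rough sketch - not any particular company's product - of how thin the code on top of a rented foundation model can be, using OpenAI's Python SDK (the model name and the ticket-summarization feature are illustrative placeholders):

```python
# Hypothetical downstream product feature built on a hosted foundation model.
# All of the "AI" lives behind the API; the product company owns the prompt,
# the UX, and the customer relationship.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize_support_ticket(ticket_text: str) -> str:
    """Illustrative feature: condense a customer ticket for a support agent."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any hosted chat model would do
        messages=[
            {"role": "system",
             "content": "Summarize this support ticket in two sentences."},
            {"role": "user", "content": ticket_text},
        ],
    )
    return response.choices[0].message.content

print(summarize_support_ticket("My March invoice was charged twice ..."))
```

Swapping in another provider's model is roughly a one-line change, which is exactly the fungibility problem for anyone whose only asset is the model behind the API.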
> LLMs and foundation models are fungible and easy.
The top couple of LLMs are extraordinarily expensive - and will get dramatically more expensive yet - and are some of the most challenging products that have been created in all of human history.
If what you're claiming were true, it wouldn't cost so much for Meta and OpenAI to build their models, and it wouldn't take trillion-dollar corporations as sponsors to make it all work.
> One LLM is as good as any other, and it'll be a race to the bottom.
Very clearly not correct. There will be very few top tier LLMs due to the extreme cost involved and the pulling up of the ladders regarding training data. This same effect has helped shield Google and YouTube from competition.
> There is no moat in an API-gated foundation model.
Do you have a billion dollars? No? Then you can't keep up over the next decade. Soon that won't even get you into the party. Say hello to an exceptional moat.
> The top couple of LLMs are extraordinarily expensive - and will get dramatically more expensive yet - and are some of the most challenging products that have been created in all of human history.
I disagree. The more we learn about LLMs, the more it appears that they're not as difficult to build as it initially seemed.
You need a lot of GPUs and electricity, so you need money, but the core ideas - dump in a ton of pre-training data, then run layers of instruction tuning on top (sketched below) - are straightforward enough that there are already 4-5 organizations capable of training GPT-4 class LLMs, and it's still a pretty young field.
Compared to human endeavors like the Apollo project, LLMs are pretty small fry.
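A minimal sketch of that two-stage recipe in plain PyTorch, with toy model sizes and random placeholder data standing in for real corpora - nothing like production scale, just the shape of the algorithm. The same next-token-prediction loss drives both pre-training and instruction tuning:

```python
import torch
import torch.nn as nn

VOCAB, DIM, CTX = 1000, 128, 64  # toy sizes, nowhere near production scale

class TinyCausalLM(nn.Module):
    """Miniature GPT-style decoder: embeddings, causally masked attention blocks, LM head."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        layer = nn.TransformerEncoderLayer(DIM, nhead=4, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(DIM, VOCAB)

    def forward(self, tokens):  # tokens: (batch, seq) of token ids
        seq = tokens.size(1)
        # Additive causal mask: position i may not attend to positions > i.
        causal = torch.triu(torch.full((seq, seq), float("-inf")), diagonal=1)
        h = self.blocks(self.embed(tokens), mask=causal)
        return self.head(h)     # logits: (batch, seq, vocab)

def train_next_token(model, batches, steps, lr):
    """Both stages use the identical objective: predict token t+1 from tokens <= t."""
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _, tokens in zip(range(steps), batches):
        logits = model(tokens[:, :-1])
        loss = loss_fn(logits.reshape(-1, VOCAB), tokens[:, 1:].reshape(-1))
        opt.zero_grad()
        loss.backward()
        opt.step()

model = TinyCausalLM()

# Stage 1, "pre-training": a huge stream of unlabelled text, tokenized.
# (Random token ids stand in for a real web-scale corpus here.)
pretrain = (torch.randint(0, VOCAB, (8, CTX)) for _ in iter(int, 1))
train_next_token(model, pretrain, steps=50, lr=3e-4)

# Stage 2, "instruction tuning": far fewer (instruction, response) pairs,
# formatted into token sequences and trained with the same loss at a lower lr.
instruct = (torch.randint(0, VOCAB, (8, CTX)) for _ in iter(int, 1))
train_next_token(model, instruct, steps=10, lr=1e-5)
```

The hard part at real scale is the engineering around this loop (data pipelines, thousands of GPUs, distributed training), not the shape of the algorithm itself.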
100%. I don't think we should at all minimize the decades of research that it took to get to the current "generative AI boom", but once transformers were invented in 2017, we basically found that just throwing more and more data at the problem gave you better results.
And not to discount the other important advances like RLHF, but the reason everyone talks about the big model companies as having "no moat" is that it's not really a secret how to recreate these models. That is basically the complete opposite of, say, other companies that really do build "the most challenging products that have been created in all of human history." E.g. nobody has really been able to recreate TSMC's success, which requires not only billions of dollars but a highly educated, trained, and specialized workforce.
Mistral AI has released their updated Mistral Large model, and it gets basically the same scores on the Chatbot Arena leaderboard as a GPT-4 version from the end of 2023.
OpenAI has to constantly keep moving and improving its models, with zero forgiveness for any complacency, and so far it has only managed a lead of less than a year over an underfunded competitor.
Meanwhile, Anthropic and Google are managing to build commercial models that are on par with GPT-4.