The conditions for AI profitability he suggests as a “new idiom” have fairly exact parallels in the world of software, though here the secret sauce has expanded to include the model’s inputs and outputs, not just the software itself:
> The AI companies that make profits will be ones that either have a competitive moat not based on the capabilities of their model,
This is the “open source” case, where the profitable firm provides ancillary services and support rather than directly selling the capabilities of the software, because the software itself is not at all exclusive.
> OR those which don’t expose the underlying inputs and outputs of their model to customers
This is the “internal software” case, where the profitable firm does not provide access to the software at all, but uses it internally, in a way that is opaque from the outside, to support other products or services it provides. Exclusivity is maintained because no one outside actually gets access to the software itself. (Arguably, closed source software whose source is unavailable might fit here too, though it’s often possible to reverse engineer software you can run from the executable alone, so I’d argue that even without source available it belongs more in the next category.)
> OR can successfully sue any competitor that engages in shoggoth mask cloning.
This is the “proprietary software” case, where the profitable firm gives customers access to the software but protects its exclusivity through legal constraints on those who have access.
Given that the big firms in AI largely either are, or are closely associated with, big firms in software that have successfully used all three models for different parts of their portfolios, I don’t think this “new idiom” poses much of a challenge to their figuring out AI profitability.