Having your own AI model is not an advantage - it’s a liability

The AI strategy mistake I see big companies make all the time

It’s been a while since I wrote an opinion piece. I still have opinions to share, so buckle up.

Disclaimer: if your company has billions of spare cash and a few hundred AI engineers available, this post is not for you.

When ChatGPT launched back in November 2022, I had a call with some investors who wanted to pick my brain on a startup.

They were considering investing, but they were worried that the startup did not have any fancy proprietary AI tech. According to them, that was a problem because it didn’t create a solid barrier to entry.

Those investors weren’t the only ones to share that worry. I heard the same take from countless people convinced that there are 2 kinds of AI companies:

  • The “ChatGPT wrappers” - companies simply building tech on top of the OpenAI API

  • The “real AI companies” - the ones who train complex AI models

The common take is that “ChatGPT wrappers = bad”, and “real AI companies = good”.

I already disagreed back then, and my take was pretty controversial:

  1. AI was quickly going to become “just software”

  2. Having “your own AI model” was not going to be a strategic asset in the future

  3. Startups had to build a competitive advantage with the “good old startup playbook” rather than with fancy tech: build network effects, a compelling UX, lock-in, etc.

At the time I didn’t have enough data to back this up: it was too early in the AI revolution. But I think now we have a few examples to point to that not only confirm my take but make me even more extreme in my views: I now believe that training your own AI models is a liability more often than an asset.

Let’s look at a few examples from the past 2 weeks:

Case 1: The Inflection acquisition (sort of)

Inflection is an AI company that raised $1.5B to build a more “human” LLM (basically it doesn’t sound like ChatGPT). They have 2 products:

  • Inflection API: for developers and businesses that want to integrate Inflection’s LLM into their products, competing with the OpenAI GPT models

  • Pi: their consumer “friendly” version of ChatGPT.

Microsoft has basically acquired them, hiring all the key people and offering their model within the Azure ecosystem. Inflection then pivoted to being an “AI studio”: a fancy way of saying “agency”. My prediction: it’ll shut down within the year (have you ever heard of an agency that raised $1.5B? Exactly).

What do we make of this story? Inflection raised $1.5B to get what they needed to train a custom AI model: tons of researchers, compute, and data. And their model was indeed pretty good. Then someone with better researchers, more compute, and more data made something a bit better (GPT-4).

Strategic advantage: gone.

You either have the confidence that you can compete with Google, OpenAI, Microsoft & Co, or you may as well give up. Actually, not even Microsoft thinks they can compete on models alone: they integrate other companies’ models into their cloud offering.

Case 2: HeyGen (this one has a better ending)

HeyGen is a famous startup building voice cloning and video cloning tools. You upload some recordings of yourself, and you have an AI version of you in minutes.

Anyway, I’d argue they’re the market leader in this space, and they seem to have some pretty great engineers building their tech! Based on their careers page, it looks like they develop their own models too (they hire researchers).

Then OpenAI developed their “Voice Engine”: a crazy new AI model that can clone anyone’s voice. And it’s better than HeyGen’s.

What did HeyGen do? They partnered with OpenAI to integrate OpenAI’s tech into their existing product. “If you can’t beat them, join them”, someone said.

So what do we make of this? HeyGen will probably survive because they have built a cool product, a nice UX, a brand, and a loyal customer base. They’ve also been super smart and agile, immediately flipping to a new AI model as soon as a better one became available.

Imagine if, instead of focusing on UX, marketing, and all the “good old startup stuff”, they had gone all in on R&D, only to be beaten by someone else with better researchers, more compute, and more data.

It must feel really bad to take the models that cost you millions (or billions), right-click, and “move to bin”.
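To make the “flip to a new model” point concrete, here’s a minimal sketch of what keeping the model swappable can look like in practice. It’s an illustration only: the class and method names are hypothetical placeholders, not HeyGen’s actual architecture or any vendor’s real SDK.

```python
from typing import Protocol


class VoiceModel(Protocol):
    """Anything that can turn text plus a voice sample into cloned speech."""

    def clone_speech(self, text: str, voice_sample: bytes) -> bytes: ...


class InHouseVoiceModel:
    """Your own model: expensive to train, yours to maintain."""

    def clone_speech(self, text: str, voice_sample: bytes) -> bytes:
        # Call your internally hosted model here (placeholder).
        raise NotImplementedError


class VendorVoiceModel:
    """Someone else's model behind an API (hypothetical client, not a real SDK)."""

    def __init__(self, api_key: str) -> None:
        self.api_key = api_key

    def clone_speech(self, text: str, voice_sample: bytes) -> bytes:
        # Call the vendor's API here (placeholder).
        raise NotImplementedError


def render_avatar_video(script: str, voice_sample: bytes, model: VoiceModel) -> bytes:
    """The product depends only on the interface, not on who trained the model."""
    audio = model.clone_speech(script, voice_sample)
    # ...lip-sync, compositing, export: the parts customers actually pay for...
    return audio


# Switching models becomes a configuration change, not a rewrite:
# model: VoiceModel = InHouseVoiceModel()
model: VoiceModel = VendorVoiceModel(api_key="...")
```

The point isn’t the code itself: it’s that when the model sits behind a thin interface, a better model from someone else is an upgrade, not an existential threat.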

So what do we do?

In 1986 Oracle had their IPO. Their main product? The Oracle Database.

So in the ’70s and ’80s, if your company had developed an amazing database technology, you were a really cool company. Today, EVERY company has a database, there are dozens of databases to choose from, most of them are free, and no consumer cares which one you use.

This is why Oracle shifted their strategy from their initial “database-only” offering to products like CRMs and ERPs, which are applications built on top of that core technology. What was their only product 40 years ago is today “just a database” powering a bunch of applications.

Today we praise OpenAI, Microsoft, and co. for having the best AI models out there. Tomorrow, those models will just be the engine behind the applications we use every day. Those who build these applications will be in the Fortune 500, and no one will care about the AI models they use. It’ll be “just AI”.

So my recommendation is that you stop chasing “the next big AI”, and start focusing on the “so what?”: the use cases, the applications, and the problems worth solving with it.

Let the giants do the heavy lifting. And let them become “just AI”.

Interested in learning about AI strategy and how to deploy it in your company? AI Academy is launching the Leading Business Growth with Generative AI program, and I’ll be teaching. Reserve your spot here.