The $0 Question: Will ChatGPT’s price drop to zero?

I’m sure many of you, as fellow AI enthusiasts, are as curious as I am about the future of commercial models for AI solutions.

Given the massive compute costs, it’s only logical to wonder whether a freemium model is even viable in the steady state. The cost side of the equation for a sustainable generative AI business was reinforced by the news that users saying “please” and “thank you” is costing millions in compute. As we use AI more, and increasingly treat it like a companion, we’re bound to interact with it the way we do with humans - in verbose dialogue - rather than in the clipped commands we gave the voice agents of days past, further exacerbating the cost of delivering these services.

Perfect substitutes create pricing pressure

There is no question that ChatGPT delivers value, and that value should be monetizable. But for an average person’s usage, how different is ChatGPT’s utility from, say, Gemini, or Claude, or DeepSeek? I’d argue that for a great many “general” tasks, the best-of-breed models perform largely on par with each other. And more often than not, perfect-substitute products lead to aggressive pricing strategies to acquire and retain users.

Simply put, when perfect substitutes exist, price pressure tends to follow.

History suggests price erosion

I like to look at analogues to see if they make sense as models. So I looked at the price of computer memory over time. Driven by relentless innovation, prices plummeted: memory cost roughly $3.7 quadrillion per TB in 1957, and a little over $1,000 per TB in 2023. Are we likely to see that type of trend in the AI space as we begin to see a proliferation of options?
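To put that memory-price trajectory in perspective, here is a quick back-of-the-envelope calculation (my own arithmetic, using only the two figures cited above) of the compound annual rate of decline it implies:

```python
# Implied annual price decline for computer memory, 1957-2023,
# from the two data points cited in the text.

start_price = 3.7e15   # $/TB in 1957
end_price = 1.0e3      # $/TB in 2023
years = 2023 - 1957    # 66 years

# Compound annual rate of change: (end/start)^(1/years) - 1
annual_change = (end_price / start_price) ** (1 / years) - 1
print(f"Implied annual price decline: {-annual_change:.1%}")
```

That works out to prices falling by roughly a third every year, sustained for more than six decades - a useful benchmark when asking whether LLM inference could follow a similar curve.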

Commercialization is not just simple subscriptions

Andrew F. Forbes’s recent article in Forbes highlighted various commercial models, including the “pro” approach. Pay-per-token models seem promising: they tie pricing to value received and could lead to a sustainable operating model. No doubt this approach can drive businesses to build on top of LLMs and reap the benefits of arbitrage between the value they deliver and their cost for the underlying output.

Open source could drive commercial innovation

I’ve made no secret of being fascinated by Meta’s open-source approach to LLMs - I generally opt for Llama 4 as my go-to model. I’ll fully admit my bias (my wife works for Meta), but even so, it’s fascinating to consider that the steady state for generative AI models may require finding an indirect monetization path.

What’s your take?

Share your thoughts on the future of AI commercialization. Will prices drop? Will new models emerge?

For me, the more interesting question isn’t whether prices fall - it’s where the value capture migrates when they do. Memory got cheap and the value moved up the stack: to operating systems, to applications, to cloud services, to platforms. If LLM inference follows the same arc, the next generation of winners isn’t going to be the model providers. It’ll be whoever figures out the new layer that sits on top.