Does more computing power translate into more revenue and guarantee AI dominance? It is a question that has followed OpenAI since DeepSeek triggered a market sell-off with its cost-efficient, open-weight AI models last year. Yet, the ChatGPT maker has only doubled down, and in recent months has inked AI infrastructure deals worth an eye-watering $1.4 trillion.

In a likely attempt to address investor concerns about revenue returns, OpenAI on Sunday, January 18, said that its computing capacity has grown 9.5 times in the last three years – from 0.2 GW in 2023 to about 1.9 GW in 2025. The growth in computing capacity has been matched by rapid revenue growth, with annualised recurring revenue growing ten-fold over the same period – from $2 billion in 2023 to more than $20 billion in 2025.
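As a quick sanity check, the growth multiples follow directly from the figures OpenAI reported (a sketch using only the numbers above):

```python
# Growth multiples implied by OpenAI's reported figures (from the article).
compute_2023_gw = 0.2   # computing capacity in 2023, gigawatts
compute_2025_gw = 1.9   # computing capacity in 2025, gigawatts
revenue_2023_bn = 2     # annualised recurring revenue in 2023, $ billion
revenue_2025_bn = 20    # annualised recurring revenue in 2025, $ billion

compute_multiple = compute_2025_gw / compute_2023_gw
revenue_multiple = revenue_2025_bn / revenue_2023_bn

print(f"Compute grew {compute_multiple:.1f}x, revenue grew {revenue_multiple:.0f}x")
# → Compute grew 9.5x, revenue grew 10x
```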
These figures were shared by Sarah Friar, OpenAI’s chief financial officer (CFO), in a blog post that sought to show a correlation between its growth in computing capacity and revenue growth. Friar also laid out OpenAI’s 2026 strategy for driving monetisation of ChatGPT as well as other AI products and services while securing more compute and establishing the technical infrastructure necessary to scale these products.
Stating that OpenAI’s focus for 2026 is practical adoption, Friar wrote, “The priority is closing the gap between what AI now makes possible and how people, companies, and countries are using it day to day. The opportunity is large and immediate, especially in health, science, and enterprise, where better intelligence translates directly into better outcomes.”
“This is never-before-seen growth at such scale. And we firmly believe that more compute in these periods would have led to faster customer adoption and monetization,” she added.
The finance chief’s remarks come amid growing scrutiny, from investors and others in the industry, of the massive investments AI companies are making to build out data centres and secure energy to power cutting-edge AI models, even as viable monetisation paths remain elusive.
Last week, OpenAI announced that it is starting to test ads within ChatGPT in the United States, marking a significant shift for one of the world’s most widely used AI products, which boasts more than 800 million weekly active users.
While ChatGPT ads are being introduced on a trial basis for now, the move highlights how advertising remains a lucrative revenue model even in the AI era. For consumers, it raises potential concerns about business incentives taking priority over user experience. However, Fidji Simo, OpenAI’s CEO of Applications, has said that the company will not optimise for time spent in ChatGPT, and will continue to prioritise user trust and user experience over revenue.
‘We are in a paradigm shift’
“There is definitely a timing mismatch because I have to make decisions today about making sure we have compute, not even in ‘26 or ‘27, but ‘28, ’29, and ‘30. Because if I don’t put in orders today and don’t give the signal to create data centres, it won’t be there,” Friar said in a recent podcast appearance.
Compute has scaled at roughly 3x per year, while revenue has followed the same curve, according to OpenAI. (Image: OpenAI)
“Today, we feel absolutely constrained on compute. There are many more products that we could launch, many more models we could train, many more multi-modality things we could explore if we had more compute today,” she said. “The signal from the whole environment is AI is real, we are in a paradigm shift, we need to invest to give people the intelligence they need […]” Friar added.
What are the major compute deals made by OpenAI?
Growth in computing capacity is measured in gigawatts (GW), a unit of power. 1 GW can power roughly seven to ten lakh Indian homes, depending on location, load factor, and energy source, according to data analysis website Indiagraphs.
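The seven-to-ten-lakh range can be roughly reproduced with a back-of-the-envelope calculation (the per-home load figures below are illustrative assumptions, not from the article or Indiagraphs):

```python
# Back-of-the-envelope: how many homes can 1 GW of power supply at once?
# Assumed average draw per Indian home (1.0–1.4 kW) is an illustrative guess.
GW_IN_KW = 1_000_000  # 1 GW = 1,000,000 kW

for avg_home_load_kw in (1.0, 1.4):
    homes = GW_IN_KW / avg_home_load_kw
    print(f"At {avg_home_load_kw} kW per home: ~{homes / 100_000:.1f} lakh homes")
# → At 1.0 kW per home: ~10.0 lakh homes
# → At 1.4 kW per home: ~7.1 lakh homes
```

The lower the assumed per-home load, the more homes a gigawatt covers, which is why the article's estimate is a range rather than a single figure.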
In September 2025, Nvidia announced that it would commit $100 billion to support OpenAI as it builds and deploys at least 10 GW of the chipmaker’s systems. More recently, OpenAI signed another computing power deal worth more than $10 billion with Cerebras in order to use the chipmaker’s systems to power ChatGPT and other AI products.
OpenAI CEO Sam Altman is looking to expand the startup’s computing capacity to 250GW by 2033, as per a report by The Information.
“Securing world-class compute requires commitments made years in advance, and growth does not move in a perfectly smooth line,” Friar wrote in the blog post. “Compute is the scarcest resource in AI. Three years ago, we relied on a single compute provider. Today, we are working with providers across a diversified ecosystem. That shift gives us resilience and, critically, compute certainty. We can plan, finance, and deploy capacity with confidence in a market where access to compute defines who can scale,” she further said.
OpenAI is not alone in trying to lock down compute through multi-billion dollar deals. Earlier this month, Meta CEO Mark Zuckerberg announced that Meta was “planning to build tens of gigawatts this decade, and hundreds of gigawatts or more over time.”
