
OpenAI said enterprise use of its AI products has accelerated sharply over the past year, with ChatGPT message volume among business users increasing eightfold since November 2024. The figures arrive as the company faces intensifying competitive pressure from Google and other rivals, following an internal “code red” warning issued by its chief executive.
Enterprise Growth Data and Competitive Position
OpenAI released the new usage figures on Monday, one week after chief executive Sam Altman circulated an internal memo warning staff about the competitive threat posed by Google. The data shows that nearly 36% of U.S. businesses are now customers of ChatGPT Enterprise, compared with 14.3% for rival firm Anthropic, according to the Ramp AI Index.
Despite that lead in enterprise adoption, OpenAI continues to generate most of its revenue from consumer subscriptions. That revenue base faces mounting pressure from Google’s Gemini models. OpenAI also competes with Anthropic, whose revenue is largely derived from business customers, as well as with open-weight model providers that are increasingly attractive to enterprise buyers.
The company has committed about $1.4 trillion to infrastructure investments over the coming years, making sustained enterprise growth central to its financial strategy.
“If you think about it from an economic growth perspective, consumers really matter,” said Ronnie Chatterji, OpenAI’s chief economist, during a briefing. “But when you look at historically transformative technologies like the steam engine, it’s when firms adopt and scale these technologies that you really see the biggest economic benefits.”
Rising API Usage and Token Consumption
OpenAI said adoption among large organisations is not only expanding but becoming more embedded in daily operations. Organisations using OpenAI’s application programming interface (API) are now consuming 320 times more “reasoning tokens” than a year ago. The increase suggests that enterprises are either running more complex workloads through OpenAI’s systems or testing the technology at a much higher intensity.
The increase in reasoning tokens is also associated with higher energy use, which could raise long-term cost concerns for corporate users. TechCrunch said it had asked OpenAI for clarification on how companies are budgeting for AI and whether the current pace of growth in usage is sustainable.
Custom GPT Deployment in Large Organisations
The company also reported a sharp rise in the use of custom GPTs, which organisations deploy to encode internal knowledge into AI assistants or automate workflows. Usage of custom GPTs increased 19-fold over the past year and now accounts for 20% of all enterprise ChatGPT messages, according to the report.
OpenAI cited Spanish banking group BBVA as an example, saying the bank regularly runs more than 4,000 custom GPTs.
“It shows you how much people are really able to take this powerful technology and start to customize it to the things that are useful to them,” said Brad Lightcap, OpenAI’s chief operating officer.
Reported Time Savings and Workforce Impact
According to OpenAI, employees using its enterprise tools report saving between 40 and 60 minutes per day. The company did not specify whether that estimate accounts for time spent training on the systems, crafting prompts, or correcting AI-generated output.
The report also found that three-quarters of enterprise respondents said AI tools now allow them to perform tasks they previously could not, including technical work. OpenAI reported a 36% increase in coding-related messages generated by users outside of engineering, IT, and research teams.
Security, Advanced Tools, and Adoption Gaps
Lightcap acknowledged that wider use of coding tools outside traditional technical teams could introduce new security risks. He pointed to OpenAI’s recently announced agentic security researcher, Aardvark, now in private beta, as a possible mechanism for identifying bugs, vulnerabilities, and exploits.
OpenAI’s data also suggests that even its most active enterprise customers are not yet making full use of its most advanced features, including data analysis, reasoning, and search. Lightcap said deeper adoption requires a shift in how companies integrate AI into their data systems and business processes.
Adoption of these advanced features, he said, will take time as firms adjust workflows and redefine how they operate with AI in place.
Divide Between Advanced and Slower Adopters
The report identified what OpenAI described as a widening gap between “frontier” users and slower adopters. Workers in the frontier group use a broader set of AI tools more frequently and report greater time savings than those in the lagging group.
“There are firms that still very much see these systems as a piece of software, something I can buy and give to my teams and that’s kind of the end of it,” Lightcap said. “And then there are companies that are really starting to embrace it, almost more like an operating system. It’s basically a re-platforming of a lot of the company’s operations.”
OpenAI executives framed the divide as an opportunity for slower adopters to increase usage and efficiency as enterprise integration deepens.
Featured image credits: Dima Solomin via Unsplash
