OpenAI CFO Sarah Friar recently published an article titled "A business that scales with the value of intelligence," which unusually revealed core secrets of this unlisted company: its computing-power scale and revenue growth data. Outsiders have interpreted the move as OpenAI deliberately reshaping the market's understanding of its positioning, attempting to prove that it is not merely a software company but an "AI infrastructure" provider that controls key productivity.
In her article, Sarah Friar systematically explains OpenAI's expansion logic and reveals a surprising linear relationship: the investment in computing power directly translates into a surge in revenue.
Computing power increased tenfold, and revenue increased tenfold in tandem.
The most striking data in the article is OpenAI's growth curve over the past three years. Sarah Friar points out that OpenAI's available computing power tripled from 0.2GW in 2023 to 0.6GW in 2024, and is expected to reach approximately 1.9GW in 2025.
Meanwhile, its annualized recurring revenue (ARR) tracked the same trajectory with remarkable synchronicity: roughly $2 billion in 2023, $6 billion in 2024, and soaring past $20 billion in 2025. Sarah Friar stated that the revenue curve moves in lockstep with compute expansion, which means "demand" is not the bottleneck; "computing power" is the key factor limiting AI's expansion. The more computing power OpenAI can secure, the faster customers adopt its products and the faster monetization scales.
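As a quick sanity check on the "tenfold" framing, the compute figures cited above can be run through a back-of-the-envelope calculation (the GW numbers are the ones reported in the article; the rounding is ours):

```python
# Compute figures cited in Friar's article: available compute by year, in GW.
compute_gw = {2023: 0.2, 2024: 0.6, 2025: 1.9}

# Growth multiple from 2023 to the 2025 projection.
growth = compute_gw[2025] / compute_gw[2023]
print(f"2023-2025 compute multiple: {growth:.1f}x")  # → 9.5x, i.e. roughly tenfold
```

A ~9.5x rise in compute alongside a ~10x rise in ARR is what underpins the article's claim of a near-linear relationship between the two.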
Business Model Loop: From ChatGPT Go to the API
To dispel concerns about the "shift to commercialization," Sarah Friar emphasized that business scale must expand in tandem with intellectual value. She outlined a complete business loop:
• Free tiers and ads: supported by advertising and commerce. This echoes OpenAI's recently launched "ChatGPT Go" plan, which uses an advertising mechanism to subsidize high inference costs, making AI accessible to a wider range of price-sensitive users.
• Subscriptions: covering plans for consumers (Plus/Pro) and enterprise teams.
• Metered APIs: by tying AI to actual workloads, the API can be embedded into a company's engineering, marketing, and financial processes.
In addition, Sarah Friar also predicted that as AI enters fields such as scientific research and drug development, new economic models such as "licensing" and "outcome-based pricing" will emerge in the future.
No longer exclusively Microsoft? Shifting toward a multi-vendor strategy.
Another noteworthy signal is the shift in supply chain strategy. Sarah Friar revealed that three years ago, OpenAI relied on a single computing vendor (hinting at Microsoft Azure), but now it has moved towards a multi-vendor and multi-hardware strategy. She views computing power as a proactively manageable "portfolio" to improve long-term stability and cost efficiency.
Analysis
OpenAI chose to publish this article at the start of 2026 to validate the operating equation "energy = computing power = revenue."
In the past, the market harbored some skepticism about whether LLMs (large language models) really obey the Scaling Laws, questioning whether simply stacking more computing power could still bring a proportional increase in intelligence. OpenAI's $20 billion in revenue, however, has shown the market that, at least in terms of commercial monetization, the Scaling Laws remain valid. This also explains why tech giants have recently been snapping up nuclear power plants and building gigawatt-scale (GW) data centers: whoever controls electricity controls the money-printing machine of the future.
Secondly, the launch of ChatGPT Go and its advertising model marks AI services' official entry into the era of "segmented monetization." OpenAI realized that a $20 monthly subscription fee cannot support exponentially growing electricity bills. The strategy is to acquire massive numbers of users and data through a low-priced, ad-supported tier (ChatGPT Go), then generate high margins through premium subscriptions and enterprise APIs; this "free + paid" hybrid model is likely to become the standard configuration for the AI industry.
Finally, OpenAI's description of itself as an "AI infrastructure" provider shows that its ambitions go well beyond building a chatbot. It aims to be the "utility company" of the AI era: whether you are developing new drugs, writing software, or running marketing, you will connect to OpenAI's pipeline and pay according to usage (computing power).




