According to the newly released quarterly financial reports from Microsoft and Meta, AI spending is not cooling down; it is getting more expensive.
In their latest earnings reports, both Microsoft and Meta indicated that their spending on AI infrastructure has entered a new expansion cycle. Despite the already staggering investment amounts, both companies emphasized that "AI computing power is in short supply" and that costs in 2026 will reach new highs, covering not only the purchase of GPUs for AI acceleration but also higher electricity costs driven by rising AI usage demand.
Meta spares no expense, announcing a staggering $125 billion in spending
In the recently concluded fourth quarter, Microsoft's capital expenditures rose to $37.5 billion, while Meta's reached $22.14 billion, both exceeding market expectations.
Even more striking is the forward guidance: Meta has raised its full-year spending forecast for 2026 to $125 billion, a staggering 73% increase year over year. This money will flow primarily into the construction of data centers, servers, and network infrastructure. JPMorgan analysts point out that, driven by the accelerating deployment of foundation models, AI agents, and commercial applications, existing AI computing power simply cannot meet market demand.
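The guidance figures above imply a rough prior-year base, which can be checked with back-of-envelope arithmetic. This is a sketch using only the two numbers reported in the article (the $125 billion forecast and the 73% year-over-year increase); the implied base is a derived estimate, not a figure either company reported.

```python
# Back-of-envelope check of Meta's 2026 capex guidance.
# Both inputs come from the article; the implied base is derived.
capex_2026 = 125.0   # $B, raised full-year 2026 forecast
yoy_growth = 0.73    # 73% year-over-year increase

implied_prior_year = capex_2026 / (1 + yoy_growth)
print(f"Implied prior-year capex base: ${implied_prior_year:.1f}B")  # ≈ $72.3B
```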
If you can't buy enough, build your own: the custom chip war begins
While NVIDIA and AMD GPUs remain the mainstays of the market, Microsoft and Meta are accelerating their Custom Silicon strategies to improve performance and control exorbitant costs.
• Meta: Meta's custom MTIA chip will continue to iterate. It already supports retrieval-engine inference and is planned to expand to core ranking and recommendation training workloads in the first quarter of 2026, a major benefit for its chip design partner, Broadcom.
• Microsoft: Microsoft is focusing on optimizing the energy efficiency of token processing. Although Marvell was not involved in developing the recently announced MAIA 200, analysts point out that Marvell will assist in developing the next-generation MAIA 300, with mass production expected in the second half of 2026.
Supply constraints become the norm, and the investment boom will continue until 2027
Microsoft and Meta acknowledged that demand is growing exponentially while supply chain capacity grows only linearly, so "supply constraints" are becoming the norm. Meta revealed that its GPU cluster has doubled in size to train the next-generation GEM model, but further expansion is still needed.
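The exponential-versus-linear dynamic both companies describe can be sketched as a toy model: compounding demand eventually overtakes any fixed rate of capacity additions. The growth rates below are illustrative assumptions, not figures from either company.

```python
# Toy model: demand compounds exponentially, supply is added linearly.
# All numbers are illustrative assumptions, not company figures.
demand = supply = 100.0  # arbitrary starting units of compute
demand_growth = 0.15     # assumed 15% compounding growth per quarter
supply_added = 20.0      # assumed fixed capacity added per quarter

for quarter in range(1, 9):  # simulate two years
    demand *= 1 + demand_growth
    supply += supply_added
    shortfall = max(demand - supply, 0)
    print(f"Q{quarter}: demand={demand:.0f}, supply={supply:.0f}, "
          f"shortfall={shortfall:.0f}")
```

Under these assumptions supply keeps up for the first year, then a widening shortfall appears; no plausible linear build-out rate changes the eventual outcome, only when it arrives.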
This means that the hardware investment boom driven by cloud service providers (CSPs) is expected to continue until 2027.