According to Jefferies’ Christopher Wood, global head of equity strategy, the scale of spending by US hyperscalers has reached a point where it is consuming an increasingly large share of their cash flows, particularly on chips and memory. Based on the latest company guidance, capex as a percentage of operating cash flow for the four major US hyperscalers has surged from 41% in 2023 to a projected 92% in 2026.
A significant portion of this is being directed towards memory alone, which is estimated to account for about 30% of total capex. That implies roughly 28% of operating cash flow (30% of the projected 92% capex-to-cash-flow ratio) will be absorbed by memory investments this year, he said in his Greed and Fear report.
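The implied figure follows from straightforward multiplication of the two ratios cited in the report; a quick illustrative calculation (figures rounded as reported):

```python
# Illustrative arithmetic behind the ~28% figure:
# memory spend as a share of operating cash flow equals
# (capex / operating cash flow) * (memory / capex).
capex_to_ocf = 0.92           # projected 2026 capex as a share of operating cash flow
memory_share_of_capex = 0.30  # estimated memory share of total capex

memory_to_ocf = capex_to_ocf * memory_share_of_capex
print(f"Memory spend as share of operating cash flow: {memory_to_ocf:.1%}")
# 0.92 * 0.30 = 0.276, i.e. roughly 28%
```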
This rising intensity of investment brings into focus a more fundamental question: monetisation. A recent Jefferies report led by Edison Lee highlights that the challenges around AI business models remain underestimated. The increasing cost of staying competitive, driven by higher compute, memory, and power requirements, suggests that sustainable profitability for pure AI model players remains distant.
Wood aligns with this view. His base case is that AI may ultimately resemble a capital-intensive industry like airlines, rather than the high-margin, winner-takes-all dynamics seen in the internet era.
Even so, the current phase of spending shows little sign of slowing. Big Tech companies continue to push ahead with aggressive capex plans. Microsoft expects to spend $190 billion this year, including about $25 billion attributed to higher component costs. Alphabet and Meta have both raised their 2026 capex guidance to $180–190 billion and $125–145 billion, respectively, while Amazon has maintained its guidance at $200 billion.
Among these, investor concerns appear more pronounced in the case of Meta, which lacks the same direct cloud-driven benefits from AI spending as peers like Alphabet, Microsoft, and Amazon.

For now, the “picks and shovels” trade remains intact, supported by continued spending and limited pushback from investors on returns.
However, early signs of strain are beginning to surface. A recent report noted that OpenAI has missed internal targets for both user growth and revenues, including a goal of reaching 1 billion weekly active users for ChatGPT by the end of last year. The company has also reportedly fallen short of multiple monthly revenue targets in 2026, while facing increased competition.
Market share trends reflect this shift. In the 12 months to March, Gemini’s share of web traffic in the generative AI market rose sharply from 6% to 25.5%, while ChatGPT’s share declined from 77.4% to 56.7%, according to SimilarWeb data.
At the same time, concerns have been raised about financing structures within the ecosystem, where partners such as Nvidia and Oracle provide funding to OpenAI, which in turn uses that capital to purchase compute from them.
Competition is also intensifying. Anthropic reported in early April that its annualised revenue run rate had exceeded $30 billion, up from around $9 billion at the end of 2025, surpassing the run rate of over $25 billion that OpenAI reported in February.
Taken together, the picture that emerges is one of escalating investment, rising competitive pressure, and unresolved questions around returns. The spending cycle continues, but the strain it places on cash flows and the uncertainty around monetisation are becoming increasingly difficult to ignore.
(Disclaimer: Recommendations, suggestions, views and opinions given by the experts are their own. These do not represent the views of The Economic Times)
