OpenAI’s $207 Billion Lie: Why All Bets Are Off in the AI Arms Race

HSBC’s sobering analysis reveals OpenAI faces a $207B funding gap with no path to profitability by 2030, raising existential questions about the AI business model.

by Andre Banandre

The champagne corks may have popped when ChatGPT captured 100 million users in record time, but the hangover is setting in. According to HSBC’s latest analysis, OpenAI faces a staggering reality: even if the company generates $213 billion in revenue by 2030, it would still need another $207 billion just to stay afloat.

That’s not a typo. It’s the brutal math behind AI’s most celebrated unicorn.

Sam Altman looking contemplative as OpenAI faces financial reality

The Numbers Don’t Lie, But The Business Model Might

HSBC’s semiconductor team, led by Nicolas Cote-Colisson, painted a picture that should terrify anyone holding OpenAI stock (if you could buy any). Their updated forecasts show OpenAI’s cumulative free cash flow remaining negative through 2030, leaving that eye-watering $207 billion funding gap.

The infrastructure costs are simply astronomical. OpenAI is aiming for 36 gigawatts of AI compute power by decade’s end, enough electricity to power a state the size of Texas. Their projected cloud and infrastructure costs between late 2025 and 2030 alone total $792 billion, with data center rentals accounting for $620 billion of that.
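For scale, a one-line check on those figures shows how heavily rentals dominate the projected budget:

```python
# Sanity check on HSBC's projected 2025-2030 infrastructure spend:
# what share of the budget is data-center rentals?
cloud_total = 792e9  # projected cloud & infrastructure costs, USD
dc_rentals = 620e9   # data-center rental component, USD
share = dc_rentals / cloud_total
print(f"Data-center rentals: {share:.0%} of projected infra spend")  # → 78%
```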

The Financial Times’ Alphaville blog put it bluntly: OpenAI is essentially “a money pit with a website on top.”

The “Enough” Problem

Sam Altman’s now-famous single-word answer to questions about AI’s computational hunger, “enough”, takes on new meaning in this context. Turns out “enough” might mean $1.4 trillion in compute spending over the next eight years.

Meanwhile, Microsoft’s quarterly filings reveal the scale of bleeding. In Q1 FY2026 alone, Microsoft’s net income was reduced by $3.1 billion due to losses recognized on its OpenAI investment. Given Microsoft’s 27% stake, this implies OpenAI hemorrhaged roughly $11.5 billion in a single quarter.
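That $11.5 billion figure falls out of simple equity-method arithmetic: scale Microsoft’s recognized share of the loss up by its ownership stake. This is a simplification (the actual accounting treatment may differ), but it shows where the number comes from:

```python
# Equity-method sanity check: infer OpenAI's total quarterly loss
# from the share of loss Microsoft recognized on its investment.

def implied_total_loss(recognized_loss: float, stake: float) -> float:
    """Total investee loss implied by an investor's recognized share."""
    return recognized_loss / stake

msft_recognized = 3.1e9  # Microsoft's Q1 FY2026 hit, USD
msft_stake = 0.27        # reported ownership stake

total = implied_total_loss(msft_recognized, msft_stake)
print(f"Implied OpenAI quarterly loss: ${total / 1e9:.1f}B")  # ≈ $11.5B
```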

Let that sink in: OpenAI is burning through cash at a rate that would make even the most optimistic venture capitalist reach for the antacids.

ChatGPT growth plateau showing declining momentum

The User Growth Paradox

Here’s where it gets interesting. OpenAI projects massive growth: The Information reports the company expects 220 million paying ChatGPT users by 2030. Currently, it has 800 million weekly active users, with 35 million paying for premium tiers.

But even if they hit these ambitious targets, the math still doesn’t work. HSBC notes that doubling the proportion of paid subscribers from 10% to 20% would only add $194 billion in revenue, mere pocket change against their projected costs.
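HSBC doesn’t publish its model, but the shape of the math is easy to illustrate. All inputs below are assumptions for illustration (user base, blended price, forecast window), not HSBC’s figures; even so, the result lands in the same ballpark as their $194 billion:

```python
# Illustrative only: order-of-magnitude check on the subscription upside
# from doubling the paid share of users. Every input is an assumption.
users = 1.4e9            # assumed 2030 weekly active user base
extra_paid_share = 0.10  # going from 10% paid to 20% paid
price_per_month = 20.0   # assumed blended subscription price, USD
years = 5                # rough forecast window

extra_revenue = users * extra_paid_share * price_per_month * 12 * years
print(f"Added cumulative revenue: ${extra_revenue / 1e9:.0f}B")  # → $168B
```

Against $792 billion in projected infrastructure costs, even an optimistic subscription ramp barely moves the needle.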

The Competition That Can Afford the Burn

What makes OpenAI’s position particularly precarious is that its competitors don’t face the same existential math problem. Google dominates the discussion on developer forums, with many noting they’re winning “by simply having ‘f you’ money from all their other revenue streams.”

Microsoft and Amazon can absorb billions in AI losses because they’re diversified tech giants. Google has been preparing for this moment for years with DeepMind and their proprietary TPU infrastructure. As one analyst observed, “It was inevitable they would win in the long run.”

OpenAI’s predicament brings to mind the classic startup dilemma, writ large in trillions of dollars: what happens when your “revolutionary” technology turns out to be so capital-intensive that only the companies you’re trying to disrupt can afford to play?

The Sunk Cost Trap

At this scale of capital requirements, critics argue we’re witnessing the world’s most expensive case of the sunk cost fallacy. When Uber subsidized every ride at a loss, we called it unsustainable. OpenAI’s equivalent is spending billions to maintain an “inferior product”, as some users describe GPT-5 compared to Google’s latest offerings.

The sentiment on developer forums has turned noticeably skeptical. “Hasn’t just plateaued, it’s gotten worse”, notes one power user, while others observe Google is “crushing them on every front.”

Even Altman’s much-hyped partnership with Jony Ive for “next-gen AI hardware” looks increasingly like rearranging deck chairs on the Titanic, especially when the core business model is fundamentally broken.

Can Anybody Actually Pay for This?

The central problem may be simpler than the eye-watering spreadsheets suggest: nobody can afford to pay what it actually costs to run these models.

HSBC analysts noted that raising prices isn’t viable because “Alphabet can severely undercut them.” OpenAI’s current pricing of approximately $1 per million tokens for GPT-5 means even at these rates, which many developers already consider expensive, the company still operates deep in the red.

As one developer calculated, given GPT-5 often requires multiple API calls to accomplish meaningful tasks, profitable pricing might need to approach $800 per million tokens, a price point that would instantly vaporize most of their customer base.
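The gulf between those two price points is easier to feel at the level of a single task. Call counts and token sizes below are assumptions for illustration, not measured figures:

```python
# Illustrative cost-per-task comparison at current pricing vs. the
# hypothetically "profitable" pricing cited above. Token counts assumed.

def task_cost(price_per_million: float, calls: int, tokens_per_call: int) -> float:
    """Total cost in USD for a multi-call task at a given token price."""
    return price_per_million * calls * tokens_per_call / 1e6

calls, tokens = 10, 5_000  # assumed agent-style task: 10 calls, 5K tokens each
print(f"At $1/M tokens:   ${task_cost(1, calls, tokens):.2f} per task")    # $0.05
print(f"At $800/M tokens: ${task_cost(800, calls, tokens):.2f} per task")  # $40.00
```

A jump from five cents to forty dollars per task is why no one expects customers to absorb cost-covering prices.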

The Endgame Scenarios

The Financial Times reports that HSBC sees several potential paths forward:

  • Dramatic efficiency improvements in compute (unlikely, given the compute arms race)
  • Capturing a larger share of digital ad spending (competing with Google and Meta on their home turf)
  • Additional debt financing (complicated by “sharp increases in Oracle’s credit default swaps” and broader market concerns about AI financing)

Meanwhile, productivity gains from AI remain theoretical at best. HSBC drily notes the ironic return to economist Robert Solow’s famous quip: “You can see the computer age everywhere but in productivity statistics.”

With John Williams, then president of the Federal Reserve Bank of San Francisco, observing back in 2017 that modern technology like the internet has “only influenced our consumption of leisure, and hasn’t yet trickled down to offices or factories”, the fundamental question remains unanswered: where exactly is the ROI on this $207 billion gamble?

The Coming Shakeout

OpenAI’s predicament raises uncomfortable questions for the entire AI ecosystem. If the poster child of generative AI can’t find a path to profitability, what does that say about the hundreds of startups building on their API?

The consensus forming among enterprise technology leaders is that the AI bubble correction will be brutal. Companies building “AI wrappers” face existential risk when their foundation model providers are themselves financially unstable.

As one enterprise architect put it: “We’re hedging our bets across OpenAI, Anthropic, and Google because we can’t afford to have our core infrastructure disappear when the funding dries up.”

The inconvenient truth emerging from HSBC’s analysis is that building AGI might require more capital than exists in the venture ecosystem, perhaps more than exists in the entire global financial system.

The Bottom Line

OpenAI stands at a precipice that makes even the most dot-com era excess look prudent. The company needs to find $207 billion, roughly the GDP of Hungary, just to reach breakeven by 2030.

Whether this represents visionary long-term thinking or the greatest speculative bubble in technology history depends entirely on one question: can they actually deliver artificial general intelligence before the money runs out?

For now, the numbers suggest that betting on AGI arriving before bankruptcy might be the riskiest wager in tech history.

The clock is ticking, and the burn rate shows no signs of slowing.
