OpenAI's Convergence Problem: $600 Billion in Commitments, Slowing Revenue, and a Trial That Could Stop the IPO Clock
Three things happened to OpenAI this week that individually would qualify as consequential news. Together, they form something rarer: a legible stress test of the company's entire operating thesis.
On Monday, OpenAI and Microsoft announced a sweeping revision to the partnership that has defined commercial AI since 2019, converting Microsoft's exclusive license to a non-exclusive one and clearing the way for OpenAI to sell models through AWS and Google Cloud.
On Tuesday, a federal jury trial opened in Oakland with Elon Musk seeking $134 billion in damages, the removal of Sam Altman and Greg Brockman, and a court order reverting the company to nonprofit status.
And on the same day, the Wall Street Journal reported that OpenAI missed multiple monthly revenue targets in 2026 after losing ground to Anthropic in the coding and enterprise markets, and failed to reach its internal goal of 1 billion weekly active users for ChatGPT by the end of 2025.
None of these events, read in isolation, is fatal. Together, they make it harder to dismiss the central tension in OpenAI's business. The company has committed to spending approximately $600 billion on compute infrastructure at precisely the moment its revenue growth has decelerated, and its governance structure is being litigated in open court.
The numbers don't lie
OpenAI is currently generating roughly $10 billion in annualized recurring revenue, nearly double the $5.5 billion it recorded in 2024. That trajectory sounds healthy until it is placed against the liability column.
The company holds a $300 billion five-year cloud deal with Oracle and has committed to spending approximately $600 billion on data center buildout across its major infrastructure agreements. OpenAI projects roughly $74 billion in operating losses in 2028 before an anticipated return to profitability by 2030.
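A back-of-envelope sketch makes the tension concrete. The starting revenue and commitment figures come from the reporting above; the assumption that revenue keeps doubling every year through 2030 is an illustrative extrapolation of the 2024-to-now growth rate, not an OpenAI projection:

```python
# Back-of-envelope: can revenue growth plausibly cover the commitments?
# Starting figures are from the reporting; the sustained annual doubling
# is an optimistic assumption for illustration, not a company forecast.
arr = 10e9            # ~annualized recurring revenue entering 2026
commitments = 600e9   # ~total data center / compute commitments

cumulative = 0.0
for year in range(2026, 2031):   # through the projected 2030 return to profitability
    cumulative += arr
    print(f"{year}: revenue ${arr / 1e9:.0f}B, cumulative ${cumulative / 1e9:.0f}B")
    arr *= 2                     # optimistic: doubling every year

print(f"Cumulative 2026-2030: ${cumulative / 1e9:.0f}B "
      f"vs ~${commitments / 1e9:.0f}B committed")
```

Even five consecutive years of doubling produce roughly $310 billion in cumulative revenue, about half the committed spend, before a dollar of operating costs is subtracted.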
CFO Sarah Friar has told other company leaders she is worried OpenAI may not be able to pay for future data center contracts if revenue does not grow fast enough. Board directors have also scrutinized recent data center deals, questioning Altman's push to secure additional computing capacity against weakening revenue. Friar and other executives are now trying to control costs more tightly and instill more business discipline.
In a joint statement, Altman and Friar responded to the WSJ report: "This is ridiculous. We are totally aligned on buying as much compute as we can and working hard on it together every day."
The statement addresses neither the revenue shortfall nor the board scrutiny, both of which feed the internal concern about whether the company can keep pace with the financial commitments required to build out data centers and secure long-term computing capacity.
Distribution restructuring as a revenue signal
The Microsoft deal revision is being framed as a strategic expansion, and structurally, it is. By converting Microsoft's license from exclusive to non-exclusive and explicitly granting OpenAI the right to serve products on any cloud, the new terms retroactively validate the Amazon arrangement and eliminate the legal overhang that had threatened OpenAI's earlier $50 billion deal with AWS.
Amazon CEO Andy Jassy has confirmed that OpenAI models will be available on AWS Bedrock in the coming weeks. Google is reviewing the revised deal terms to assess which partnerships might now be possible.
Under the new terms, revenue-share payments from OpenAI to Microsoft will be subject to a total cap and will continue through 2030, independent of OpenAI's technological progress. Microsoft will continue to have a license to OpenAI IP for models and products through 2032. OpenAI products will still ship first on Azure unless Microsoft decides otherwise.
In exchange, OpenAI agreed to expand its existing cloud agreement with AWS by $100 billion over eight years and committed to making AWS the exclusive third-party distribution provider for Frontier, its new enterprise agent-building platform.
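The scale of that AWS commitment is easiest to see as an annual run rate. The straight-line spread below is a simplification for illustration; the actual payment schedule has not been disclosed:

```python
# The AWS expansion alone, amortized evenly, exceeds current annualized revenue.
# Straight-line amortization is an assumption; the real schedule is not public.
aws_expansion = 100e9   # committed expansion over eight years
years = 8
current_arr = 10e9      # ~current annualized recurring revenue

annual_run_rate = aws_expansion / years
print(f"AWS expansion run rate: ${annual_run_rate / 1e9:.1f}B/yr "
      f"vs current ARR ~${current_arr / 1e9:.0f}B/yr")
```

On that simplified spread, the expansion works out to about $12.5 billion a year, more than the company's entire current annualized revenue, and that is one commitment among several.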
The engineering logic here is sound. Enterprise customers overwhelmingly operate in multi-cloud environments. Being locked into Azure was not just a technical constraint; it was a sales objection that OpenAI's competitors, particularly
Anthropic and Google, exploited relentlessly. But the timing of the revision is itself a data point. OpenAI's revenue chief Denise Dresser acknowledged in a memo that the Microsoft exclusivity arrangement had "limited our ability to meet enterprises where they are."
If the prior structure was a problem the company understood, the question is why the restructuring was not completed before the revenue shortfall became material enough to spook the CFO.
The competitive displacement
OpenAI fell short of several monthly sales targets in 2026 after rival Anthropic gained ground in the coding and enterprise markets. This is a specific claim with identifiable causes. Anthropic's Claude family has captured a meaningful portion of developer workflows, particularly in coding and agentic use cases.
That pressure is structural, not cyclical. The developers and engineering teams that standardize on a model API do not churn quickly, so lost enterprise share is not recovered through a product launch alone.
Altman reportedly called for a "code red" to improve ChatGPT in late 2025. OpenAI has spent 2026 championing its Codex tool and higher availability of compute as the primary drivers of corporate revenue recovery.
The Codex bet is coherent, but it's a bet made under financial constraint: the company is competing in a market where Anthropic, Google, and Amazon all have deep balance sheets and infrastructure advantages of their own.
The trial as governance risk
Musk's lawsuit in the United States District Court for the Northern District of California seeks $134 billion in damages from OpenAI, the removal of Altman and Brockman from its board, and a court order reverting the company to a nonprofit structure.
In October 2025, OpenAI completed a recapitalization that converted the for-profit subsidiary into a public benefit corporation, with the nonprofit retaining oversight and an equity stake. The restructuring cleared the way for an eventual IPO after sign-off from state regulators in California and Delaware. Musk's suit, if it succeeds, would unwind that structure and block the IPO.
The judge is expected to rule on standing before the jury delivers its advisory verdict. If Musk is found to lack standing, the case ends without reaching the merits, and OpenAI's restructuring stands. Most scholars of charitable-trust law expect that outcome. But the trial isn't risk-free regardless of the verdict.
OpenAI executives, including Altman and Brockman, are scheduled to testify in the coming days, exposing internal deliberations about safety governance, Microsoft's role, and the timeline of the for-profit conversion. Every hour those executives spend in a witness chair is an hour they're not running the company or pitching IPO investors.
OpenAI's IPO is expected later this year, and the money it raises could help it dominate an industry in which it had an early lead. OpenAI recently closed a $122 billion funding round at a post-money valuation of $852 billion. A successful public offering at or near a $1 trillion valuation remains OpenAI's primary mechanism for funding the gap between its current revenue and its infrastructure commitments.
The trial doesn't make that offering impossible, but it adds a litigation overhang to a prospectus that already needs to explain $74 billion in projected 2028 losses to institutional investors.
What the confluence means
Each of the three developments this week is defensible on its own terms. The multi-cloud restructuring was a rational commercial decision that should have happened earlier.
The trial is legally weak by most analyses and was always going to happen at some point. The revenue shortfall is real, but it comes against a backdrop of annual recurring revenue that nearly doubled year over year.
The problem isn't any single variable. It's that all three arrive simultaneously against a capital structure that depends on investor confidence remaining stable long enough for the company to execute a public offering before its operating losses peak.
The shortfall has sparked internal concern about whether OpenAI can keep pace with its financial commitments at the same time those commitments are receiving their highest-ever level of public and legal scrutiny.
For developers and engineering teams making infrastructure bets on OpenAI's API, the relevant question isn't whether OpenAI survives. It almost certainly does, in some form.
The relevant question is what the company looks like after it clears these three simultaneous tests: whether the IPO prices at a valuation that gives it the capital runway its projections require, whether the trial resolves without structural remedies that alter the governance stack, and whether the multi-cloud expansion generates the enterprise revenue growth Friar is now treating as a precondition for the company's continued ability to fund its own compute.
All three are answerable. None are answered yet.