Unsealed court documents from the Musk vs. Altman trial provide a rare, unvarnished look at the astronomical costs of sustaining the AGI arms race.
Documents unsealed during the ongoing Musk vs. Altman trial in California lay bare the brutal economics of frontier AI. OpenAI is projected to lose $14 billion in 2026, driven primarily by a 300% increase in inference and pre-training costs for the upcoming GPT-6 architecture.
The financial disclosures highlight two primary bottlenecks: the skyrocketing cost of high-bandwidth memory (HBM3E) and the sheer volume of electricity required to power frontier-scale training clusters. OpenAI's hardware expenditures alone are expected to exceed $8 billion next year as the lab locks in large reservations for NVIDIA Blackwell and Vera Rubin systems.
Furthermore, the cost of reinforcement learning from human feedback (RLHF) and of licensing high-quality training data from publishers has ballooned into a multi-billion-dollar line item. The trial documents suggest that OpenAI's revenue, while growing rapidly, cannot keep pace with the infrastructure spending required to maintain its lead over Anthropic and Google.
The trial centers on Elon Musk's allegation that OpenAI has abandoned its original non-profit mission. However, Sam Altman's internal memos, presented as evidence, argue that the non-profit structure is simply "incapable of holding the weight" of the capital required. To attract the more than $100 billion in investment needed for the "Stargate" supercomputer project, OpenAI is aggressively pursuing a transition to a for-profit benefit corporation.
This $14 billion deficit raises critical questions about the long-term sustainability of current LLM scaling laws. If every marginal improvement in model reasoning requires a 10x increase in capital expenditure, the path to AGI may be accessible only to entities with the credit rating of a sovereign nation. The industry is now watching to see if OpenAI can bridge this "compute chasm" before its cash reserves—and investor patience—are exhausted.
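To make that compounding concrete, here is a minimal back-of-envelope sketch. The 10x-per-generation multiplier is the premise stated above; the $8 billion starting point echoes next year's projected hardware spend, and the four-generation horizon is purely an illustrative assumption, not a figure from the filings.

```python
# Back-of-envelope sketch of the "compute chasm": if each model
# generation requires ~10x the capital expenditure of the last,
# costs compound geometrically. The 10x multiplier is the article's
# premise; the $8B starting point and four-generation horizon are
# illustrative assumptions, not figures from the court documents.

CAPEX_MULTIPLIER = 10   # assumed cost growth per model generation
STARTING_CAPEX_B = 8    # $8B, echoing next year's projected hardware spend

capex_b = STARTING_CAPEX_B
for generation in range(1, 5):
    print(f"Generation {generation}: ~${capex_b:,.0f}B in capital expenditure")
    capex_b *= CAPEX_MULTIPLIER
```

Four generations in, the hypothetical bill reaches $8 trillion. That is the arithmetic behind the sovereign-nation comparison: even the $100 billion-plus Stargate project would cover only the second step on that curve.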