Daily Signal: April 27, 2026

Infrastructure at its limits, AI cost structures crystallizing, and strategic partnerships among giants. All this, and more, in today's Daily Signal.


Welcome to the Daily Signal, our daily sweep of the AI headlines worth reading, with context for why they matter.

Today, we're seeing infrastructure at its limits as the AI industry scrambles to secure the power, deals, and political clearances needed to keep scaling.

We’re also seeing AI’s enterprise cost structure become harder to ignore. The subsidized early-adoption era is giving way to consumption-based pricing, forcing companies to confront what AI actually costs when it moves from pilot projects into everyday operations.

Meanwhile, the OpenAI-Microsoft relationship has quietly been renegotiated from a strategic partnership into something closer to a vendor agreement.

OpenAI and Microsoft renegotiate the terms of their entanglement

The deal that built the current AI boom just got significantly rewritten. OpenAI and Microsoft announced a revised partnership agreement this morning that ends Microsoft's exclusive right to distribute OpenAI's models and products, opening the door for OpenAI to serve customers across any cloud provider, including Amazon, Google, and others.

Microsoft's license to OpenAI's IP continues through 2032 but is now non-exclusive. OpenAI will keep paying Microsoft a 20% revenue share through 2030, now subject to a cap. In exchange, Microsoft stops paying its revenue share to OpenAI and retains its roughly 27% equity stake.

The practical read: OpenAI needed this to honor its $50 billion Amazon deal without triggering litigation from Microsoft. Both sides are spinning it as a win. The more honest framing is that OpenAI has successfully negotiated its way out of a cage it agreed to build around itself, and Microsoft got paid to hand over the key.


China blocks Meta's $2 billion acquisition of Manus

Beijing has ordered the unwinding of Meta's planned $2 billion acquisition of Manus, the Singapore-based AI automation startup with Chinese roots. Meta announced the deal in December.

China's state planner asked both companies to withdraw the transaction after launching a January probe into export controls and technology transfer concerns.

The timing is notable: the block lands weeks before a scheduled summit between Trump and Xi. Manus had been held up as a template for how Chinese-origin startups could build globally, raising money from U.S. VCs and operating out of Singapore.

That template now looks considerably less reliable. For AI founders trying to thread the geopolitical needle between U.S. and Chinese capital markets, this is a concrete data point worth studying.


Meta is going to try to power its data centers with energy beamed from space

This one reads like science fiction, but the engineering rationale is real. Meta announced a deal with startup Overview Energy for up to 1 gigawatt of space-based solar power: satellites in low Earth orbit collect sunlight continuously, convert it to near-infrared light, and beam it down to existing terrestrial solar farms, letting those farms generate electricity around the clock.

No new land, no battery dependency, no grid interconnection queues. The orbital demonstration is planned for 2028, with commercial delivery targeting 2030.

Meta also separately announced a partnership with Noon Energy for 1 GW of ultra-long-duration storage. The underlying problem this addresses is real and accelerating: AI data centers are consuming energy at a pace that terrestrial grid infrastructure cannot match, and the straightforward solutions (buying more solar, adding batteries, waiting for interconnection approvals) are all too slow or too expensive at the scale these companies need.
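For a rough sense of scale, here is some back-of-envelope arithmetic (my own, not figures from either announcement; the 95% and 25% capacity factors are assumptions) on what 1 GW of near-continuous delivery means over a year versus 1 GW of conventional ground solar:

```python
# Back-of-envelope scale check (assumed figures, not from the announcements).
HOURS_PER_YEAR = 8760

def annual_twh(gw: float, capacity_factor: float) -> float:
    """Annual energy in TWh for a given capacity and capacity factor."""
    return gw * capacity_factor * HOURS_PER_YEAR / 1000

# Space-based delivery is pitched as near-continuous (~95% assumed here);
# ground solar alone typically runs at a ~25% capacity factor.
continuous = annual_twh(1.0, 0.95)    # ~8.3 TWh/year
ground_solar = annual_twh(1.0, 0.25)  # ~2.2 TWh/year

print(f"1 GW near-continuous: {continuous:.1f} TWh/yr")
print(f"1 GW ground solar:    {ground_solar:.1f} TWh/yr")
```

The capacity factor, not the nameplate gigawatt, is what makes round-the-clock delivery worth several times an equivalent ground installation.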


Data centers are on track to consume 12% of U.S. electricity by 2028

The Lawrence Berkeley National Laboratory number is getting fresh attention today in light of the Meta space solar announcement: U.S. data centers consumed 4.4% of total national electricity in 2023 and are projected to reach between 6.7% and 12% by 2028, a range of 325 to 580 terawatt-hours annually.

For context, data center power demand more than doubled between 2017 and 2023, largely driven by growth in AI servers. The high end of that projection represents an energy load roughly equivalent to adding Spain's electricity consumption to the U.S. grid in three years.
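The percentage and terawatt-hour figures hang together. As a quick consistency check (the implied total U.S. consumption is my inference, not a number LBNL states), each end of the TWh range backs out to roughly the same projected 2028 total:

```python
# Sanity-check: what total U.S. 2028 consumption do the quoted figures imply?
# (Inferred here for illustration; not stated in the article.)
low_share, high_share = 0.067, 0.12
low_twh, high_twh = 325, 580

implied_total_low = low_twh / low_share     # ~4,851 TWh
implied_total_high = high_twh / high_share  # ~4,833 TWh

print(f"Implied 2028 U.S. total: {implied_total_low:,.0f}-{implied_total_high:,.0f} TWh")
# Both ends back out to roughly 4,800 TWh, so the range is internally consistent.
```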

The constraint isn't just supply. Grid interconnection queues in many regions now run four to seven years. That structural bottleneck is why companies like Meta are looking at space-based solutions, and why nuclear power, small modular reactors, and private power purchase agreements are moving from curiosity to serious infrastructure planning.


Enterprise software is repricing itself around AI consumption, and the bills are coming due

Here's a telling data point reported by The Information that's circulating widely: by the end of 2025, 79 of 500 tracked software companies, including HubSpot, Adobe, and Salesforce, had adopted usage-based AI fees, more than double the number in 2024.

The shift from seat-based licensing to consumption-based billing isn't just a change in pricing model. It restructures the economics of software adoption: costs that were previously fixed and predictable are now variable and tied directly to how aggressively teams use AI features.
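The budget dynamic can be sketched with a toy comparison (all numbers here are hypothetical, chosen only to illustrate the shape of the curve, not any vendor's actual pricing):

```python
# Hypothetical cost comparison: flat seat-based licensing vs. usage-based
# AI fees that scale with how aggressively teams adopt the features.
# All prices below are illustrative assumptions, not real vendor rates.

def seat_cost(seats: int, per_seat_month: float) -> float:
    """Fixed monthly cost under per-seat licensing."""
    return seats * per_seat_month

def usage_cost(seats: int, calls_per_seat_month: int, per_call: float) -> float:
    """Variable monthly cost under consumption-based billing."""
    return seats * calls_per_seat_month * per_call

SEATS = 200
for calls in (100, 1000, 5000):  # monthly AI calls per seat as adoption grows
    fixed = seat_cost(SEATS, 50.0)             # $50/seat/month (assumed)
    variable = usage_cost(SEATS, calls, 0.02)  # $0.02/call (assumed)
    print(f"{calls:>5} calls/seat: seat-based ${fixed:,.0f} vs usage-based ${variable:,.0f}")
```

At low adoption the usage-based bill undercuts the flat license; past a crossover point, heavier use makes it several times larger, which is exactly the surprise landing in enterprise invoices.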

Uber's CTO recently said that Claude Code usage has surged so far beyond internal projections that it has blown through the engineering budget. That pattern is playing out across organizations that adopted AI tools during a period of subsidized pricing and are now receiving invoices that reflect actual compute costs.

The companies that modeled this in advance are in good shape. Those that treated AI tooling as a fixed-cost line item are having a harder conversation.