The New Geography of Power

Published on April 15, 2026 · 5 min read

Five events this week, each celebrated, each adopted without protest, each one drawing a line on a map that no atlas prints but that already governs more territory than most treaties. The lines are not rivers or mountain ranges. They are undersea cables, arctic data centers, fuel-cell contracts, and leaked memoranda. The question they answer is old. The answer they give is new: whoever controls the computation controls the capacity, and whoever controls the capacity controls what intelligence is permitted to do.


OpenAI has stopped pretending that the cyber arms race is hypothetical. GPT-5.4-Cyber is a "cyber-permissive" variant of GPT-5.4 built exclusively for digital defense, with lowered refusal boundaries and binary reverse engineering capabilities reserved for verified security teams. The Trusted Access for Cyber program is expanding to thousands of individual defenders and hundreds of teams responsible for protecting critical software. The launch arrived one week after Anthropic disclosed that Mythos, its most capable model, had completed the UK AI Safety Institute's full thirty-two-step corporate cyber range simulation, succeeding in three of ten attempts and averaging twenty-two of thirty-two steps. The philosophies are opposite: Anthropic restricts; OpenAI arms the defenders. Different method, same message. The laboratories have concluded that cybersecurity deserves its own infrastructure, its own gating, its own deployment strategy. The models are no longer merely intelligent. They are being weaponized, with the best of intentions and the oldest of pretexts. One is reminded that the word "defense" has preceded every escalation in recorded history; the escalation, of course, was always someone else's fault.

Denise Dresser, OpenAI's Chief Revenue Officer, sent a four-page memorandum that someone leaked before the digital ink had dried. The lines that matter: "Multi-product adoption makes us harder to replace." And on Anthropic: "Their story is built on fear, restriction, and the idea that a small group of elites should control AI." Dresser accused Anthropic of inflating its stated run rate by approximately eight billion dollars through accounting treatment. The memorandum reveals a company that has stopped thinking in products and begun thinking in platforms. "Spud," a model tuned for reasoning and reliability. "Frontier," a platform designed to loose autonomous agents on enterprise workflows. GPT-5.4 and Codex integrated into Cloudflare's edge. A London office expanding past five hundred employees. The acquisition of Hiro, a personal finance startup. Greg Brockman preaching the compute-powered economy where human intent is the only bottleneck. It is the strategy of an organization with many products and little margin for error: convert the sum into an ecosystem and the ecosystem into gravity. The user enters through the chatbot door and discovers that the house has many rooms, all furnished, none with a visible exit (this being, of course, the definition of a good hotel and an excellent trap).

The computation is hungry and the kitchens are running low on gas. GitHub Copilot, OpenAI Codex, and Claude are all implementing usage caps as the GPU shortage bites. Users report degraded reasoning depth in Claude, a probable symptom of Anthropic throttling effort settings to manage scarcity. Anthropic co-founder Jack Clark confirmed that the company briefed the Trump administration on Mythos. Nvidia Blackwell GPU rental prices climbed forty-eight percent in two months. The contradiction that no one states aloud is precisely the one that defines the moment: the same companies promising autonomous agents running twenty-four hours a day are rationing the computational cycles needed to make them function. It is the familiar pattern of every promised abundance: the brochure arrives before the infrastructure, the subscription before the capacity, the vision before the kilowatt-hours. The consumer has purchased the ticket. The runway exists. The aircraft, for the moment, waits. One suspects it will wait rather longer than the brochure suggested, though the brochure, naturally, has been updated to reflect "a phased rollout aligned with our commitment to responsible scaling."

Microsoft quietly absorbed OpenAI's Stargate Norway data center, the arctic-circle facility that Sam Altman announced last July, along with its thirty thousand Nvidia Vera Rubin chips leased from Nscale. It is the second OpenAI infrastructure project that Microsoft has taken over in thirty days. The word "partner" has been redefined so many times in this relationship that it no longer means what either party once intended. Microsoft invested over thirteen billion dollars in OpenAI and now holds a twenty-seven percent stake. But the percentage does not tell the complete story. What Microsoft possesses is not merely equity. It is infrastructure: servers, chips, data centers above the Arctic Circle. OpenAI has the models. Microsoft has the ground on which the models live. In every relationship between a brilliant tenant and a patient landlord, history tends to favor the party holding the keys, a tendency the tenant typically discovers at the precise moment the lease terms are renegotiated.

Maine has become the first American state to prohibit large-scale data centers. The moratorium applies to any facility consuming more than twenty megawatts of power and runs until November 2027. The House passed the measure seventy-nine to sixty-two; the Senate, twenty-one to thirteen. Governor Janet Mills has indicated she wants an exemption for a planned five-hundred-and-fifty-million-dollar project at a former paper mill in Jay: "The people of Jay need those jobs, with appropriate guardrails on preserving water resources." Maine ranks fourth nationally in electricity prices, and residents organized against proposals at sites including the Bates Mill complex in Lewiston. Track Policy, an interactive global atlas launched this week, shows that twenty of fifty American states now have active or advancing data-center restrictions. It is the first time a community has said "no" to the computational appetite of artificial intelligence with the force of law. It will not be the last. What is notable is not the prohibition itself but the assumption it interrupts: that the infrastructure of intelligence would be welcomed everywhere, by everyone, because the alternative, being left behind, was presumed to be worse. Maine has, for the moment, chosen the alternative. Whether this constitutes wisdom or merely delay is a question that twenty megawatts of prohibited capacity will not answer, but that the people of Lewiston, whose electricity bills arrive monthly and whose patience does not, have decided they would rather ask than avoid.