
The Week the Harvest Weighed More Than the Land
This week they built the most dangerous machine ever made and decided not to give it to anyone. The same week, one company committed six hundred billion dollars over five years to build more. Between those two decisions — the restraint and the appetite — fits everything we are.
Anthropic built an artificial intelligence model with ten trillion parameters and called it Mythos. In testing it found a vulnerability in OpenBSD that had been hiding in the code for twenty-seven years. It found another in FFmpeg that had survived sixteen years and five million automated tests. It chained Linux kernel exploits with no human guidance. It did this overnight, while the engineers slept. On SWE-bench Verified, the standard software engineering benchmark, it scored 93.9 percent. The previous model, Opus 4.6, scored 80.8. Thirteen points apart. In a field where two points make news, thirteen are something else. Anthropic looked at what it had made and decided not to release it. They called it Project Glasswing. Twelve organizations have access, among them Amazon, Apple, Google, Microsoft, Nvidia, CrowdStrike, Cisco, Broadcom, JPMorgan Chase, Palo Alto Networks, and the Linux Foundation. No one else. A hundred million dollars in compute credits so the defenders can find the flaws before the attackers do. Boris Cherny, creator of Claude Code, said Mythos should feel terrifying. He is right. It is terrifying because it works.
The Internet Bug Bounty, the program that since 2012 had paid researchers to find flaws in the software that holds the internet together, closed its doors on March thirty-first. HackerOne, which administered it, published a statement: AI-assisted research has expanded vulnerability discovery beyond remediation capacity. Just like that. In January, cURL had shut down its bounty program for the same reason: a flood of machine-generated reports. In March, Google suspended automated submissions to its open-source vulnerability reward program. Node.js still accepts reports but no longer pays. The balance between finding flaws and fixing them broke. The machine finds faster than the human repairs. No one asked what happens when the one searching for holes works twenty-four hours and the one patching them works eight.
OpenAI announced plans to spend a hundred and twenty-one billion dollars on compute in 2028 alone. Eighty-five billion in projected losses that year. Sam Altman has committed the company to six hundred billion dollars in infrastructure spending over five years, with a Q4 IPO on the horizon. The numbers exist in a realm where comprehension fails and only comparison helps: OpenAI's single-year compute budget would exceed the GDP of more than a hundred countries. Anthropic, by contrast, projects thirty billion in compute costs by 2029 — roughly four times more capital-efficient, spending a quarter of what OpenAI plans while having already surpassed it in revenue. Two companies building toward the same horizon. One sprints. The other walks. Both arrive at a place where the money required to keep the machine running exceeds what most nations produce in a year.
Anthropic signed a deal with Google and Broadcom for multiple gigawatts of next-generation compute. Ironwood chips, Google's seventh-generation TPU: nine thousand two hundred sixteen chips and 42.5 exaflops per Pod, more than twenty-four times the power of El Capitan, the world's fastest supercomputer. Broadcom turns the designs into manufacturable silicon. Capacity begins arriving in 2027. Anthropic disclosed that its annualized revenue now exceeds thirty billion dollars. At the end of 2025 it was nine billion. In February it was fourteen billion. In April, thirty billion. More than a thousand companies pay more than a million dollars a year to use Claude. Eight of the world's ten largest are among them. In February Anthropic raised thirty billion dollars in a Series G round at a valuation of three hundred eighty billion. The money goes to the chips. The chips go to the models. The models go where the money goes.
Samsung reported operating profit of 57.2 trillion won in the first quarter, roughly thirty-eight billion dollars. Eight and a half times more than the same quarter a year ago. A record. The previous quarter had been a record too, at twenty trillion won. This one nearly tripled it. DRAM contract prices rose 39.8 percent in a single quarter. NAND prices rose 208.8 percent. HBM prices doubled. All for the same reason: artificial intelligence data centers that need memory the way land needs water. Samsung began delivering HBM4 chips to Nvidia in February. An analyst at Heungkuk Securities projected the next quarter could reach seventy-five trillion won. Fifty billion dollars. Just like that, as one might say the weight of the harvest.