
The Week What Was Lost Made No Sound
A video program that cost a million dollars a day shut down and nobody missed it. Two artificial intelligences sat down to judge each other. A car learned to speak. Someone founded an institute to protect us from something it does not yet understand. And a pair of glasses learned to see for those who can no longer see alone.
Sora lasted six months. OpenAI launched it in September 2025 with the promise that anyone could make films with words. A million people tried it. By December, fewer than five hundred thousand remained. The program consumed a million dollars a day in compute. The Wall Street Journal published its investigation on March 29. Sam Altman made the decision to shut it down, freeing the chips and redirecting them to a new project named after a tuber: Spud. Disney had signed a multiyear deal in December for two hundred characters, including Mickey Mouse and Darth Vader, with a planned investment of one billion dollars. Disney executives learned of the shutdown less than an hour before the public announcement. What surprises most is not that Sora failed. It is that its disappearance left no gap. The tool that was going to democratize cinema left the way photo development shops leave: one day they are simply no longer there.
Microsoft introduced two systems in the same week, and both do the same thing: distrust. Council takes a question, delivers it simultaneously to an Anthropic model and an OpenAI model, and then a third model, the judge, compares both answers. It identifies where they agree, where they diverge, what each offers that the other omits. Nicole Herskowitz, Corporate Vice President of Microsoft 365, described it as taking collaboration between models to the next level. What she did not say is more interesting: that the largest software company in the world no longer trusts a single artificial intelligence to tell the truth. The solution is not to build a more honest machine. It is to set two machines to watch each other. The cost is two and a half times that of a single query. The price of distrust is always paid in cash.
On April 1, ChatGPT arrived on the dashboard. OpenAI released a dedicated app for Apple CarPlay, available with iOS 26.4. The screen shows no text, no images. Only a button to speak and one to silence. Apple requires these apps to operate exclusively by voice in the driving environment. The driver speaks. The machine responds. There is no visible record of the conversation on the screen, only in the phone history. CarPlay also opened the door to Claude and Gemini with the same update. Three artificial voices competing for the attention of someone who should be watching the road. At some point in recent history, the silence inside a car ceased to be enough. Now the car converses. Not because the driver needs it, but because the company needs the driver never to stop talking.
Perplexity founded an institute for the security of artificial intelligence. They called it the Secure Intelligence Institute. It is directed by Dr. Ninghui Li, professor of computer science at Purdue, a fellow of both the ACM and IEEE, with more than two hundred publications in security and privacy. The first project is called BrowseSafe: an open-source model that examines the HTML of a web page and determines whether it contains malicious instructions aimed at an AI agent. The question nobody asked aloud is why this did not exist before. The answer is the usual one: first you build the house, then you install the lock. First you invent the autonomous browser, then you discover that someone can poison it with a line of code hidden in the footer.
Meta announced smart glasses for people who need prescriptions. Two new models: Blayzer, rectangular, and Scriber, rounded. Starting price: four hundred and ninety-nine dollars. Pre-orders from March 31, retail in optical shops from April 14. The glasses include a camera, voice assistant, open-ear speakers, pedestrian navigation, neural handwriting dictation. Meta noted that billions of people worldwide wear corrective lenses, and that until now they had been excluded from the smart glasses market. The statement is precise. It is also revealing. The market was not losing customers. It was losing billions of eyes. Now it has them. The glasses that once corrected vision now also capture it. The optician in Condesa who ground lenses by hand does not compete with this. Not because his product is inferior, but because his product does not collect data.