
The Week the Applause Was Automated
Five things happened this week. Each was celebrated. Each was adopted without protest. Each removed a small friction that no one will miss and that no one should have surrendered.
Google released a dictation application called Eloquent onto the App Store without telling anyone it had done so. No press event, no blog post, no announcement of any kind. The app appeared on Monday, was downloaded, and began working. It uses Gemma-based speech recognition models that run locally on the device; the voice never leaves the phone. It filters filler words automatically — the hesitations, the self-corrections, the small verbal debris of being human — and offers four modes of output: key points, formal, short, long. It imports vocabulary from Gmail to better recognize the proper nouns one uses most often. It is free, without subscription or usage cap. The App Store listing mentions a forthcoming Android version with system-wide keyboard integration: a floating button that can be summoned in any text field. One speaks imperfectly; the machine returns polish. The user need not know that anything was corrected. That, of course, is precisely the point.
In the first quarter of 2026, the number of new applications published to Apple's App Store rose by eighty-four percent — two hundred thirty-five thousand eight hundred new apps in three months. Between 2016 and 2024, new app submissions had declined by forty-eight percent. The reversal took a single quarter. The cause has a name that is now a dictionary entry: vibe coding, coined by OpenAI co-founder Andrej Karpathy in February 2025 and selected by Collins as word of the year. The tools that enable it — Claude Code, Codex, Replit — allow a person who has never written a line of code to publish a functional application in hours. Replit users alone have placed nearly five thousand apps in the store. Apple responded by blocking updates to Replit, Vibecode, and an app called Anything, citing Guideline 2.5.2: applications must be self-contained in their bundles. Review times leapt from forty-eight hours to more than thirty days. A new profession has emerged: vibe coding cleanup specialist, hired to repair what the machine assembled. The gate is wider than it has ever been. What comes through it is another question, and one that Apple — to its visible discomfort — would prefer not to answer.
Three in five Americans now say they have used artificial intelligence as a healthcare ally in the past three months. The figure comes from OpenAI's own survey, released alongside ChatGPT Health in January. Chengpeng Mou, OpenAI's Head of Business Finance, supplied the finer detail: six hundred thousand healthcare-related messages arrive every week from people living in what the system calls hospital deserts — communities where the nearest hospital is more than thirty minutes away by car. Seven in ten of these medical conversations happen outside standard clinical hours. A Gallup survey places American satisfaction with healthcare quality at its lowest point in twenty-four years; the public grades access at C-plus and cost at D-plus. The machine did not replace the doctor. It occupied the space the doctor left empty — the 2 a.m. question, the rural parking lot, the waiting room that does not exist. The convenience is real. The question of what was conceded in exchange for it is one that the convenience itself makes difficult to ask.
Utah approved a twelve-month pilot allowing an AI chatbot to renew psychiatric prescriptions. The company is Legion Health, based in San Francisco. The system can renew existing prescriptions for stable, low-risk patients — antidepressants, anxiolytics, the quiet medications of managed unhappiness. It cannot issue new prescriptions, adjust dosages, or touch controlled substances. If it detects suicidal ideation, mania, pregnancy, or severe side effects, it transfers the patient to a human. The approval was quiet. There was no national debate. A chatbot that renews antidepressants. The patient does not need to drive two towns over. Does not need to wait until Monday. Does not need to explain to another human being how he feels. One imagines Huxley would have appreciated the economy of it: the soma dispensed without the pharmacist, the discomfort removed before it could become awareness, the entire transaction completed in the time it takes to read a consent form that nobody reads.
A man in Greenville, South Carolina, created a singer who does not exist and placed him at number one on iTunes. The man's name is Dallas Ray Little. He operates under a label called Crunchy Records. The singer is called Eddie Dalton: gray-haired, soulful, bluesy, a face rendered by artificial intelligence, a voice generated by algorithms trained on patterns of human grief and longing. On March fifteenth, Little published the first single, "Another Day Old." By April fifth, Eddie Dalton occupied eleven positions on the iTunes top one hundred — positions 3, 8, 15, 22, 42, 44, 51, 58, 60, 68, and 79 — with an album at number three. All of it built on six thousand nine hundred actual sales. The YouTube comments are filled with praise: "captivating," "soulful," "music from the heart." Many listeners do not know the singer does not exist. On TikTok, the videos carry an AI-generated content label. On YouTube, they do not. Suno, the largest AI music platform, generates seven million songs per day — the equivalent of Spotify's entire catalog every two weeks. The public did not protest. The public applauded. One suspects that if the applause itself were revealed to be synthetic, the audience would applaud that too.