DOOMSDAY BAKERY
Can the government force a private company to bake a doomsday cake?
Remember when America spent several years debating whether a man with a pastry tube could be forced to decorate a cake for a gay wedding? Masterpiece Cakeshop v. Colorado Civil Rights Commission went all the way to the Supreme Court, which concluded that while states may protect gay customers from discrimination, they must do so with proper religious neutrality. No hostile eye-rolling at the baker. No comparing Flambé to the Holocaust. Icing is Free Speech, bitches.
Now cut to Pete Hegseth — overcompensating homophobe… chin high, sleeves too tight, energy somewhere between a CrossFit open house and a halftime locker-room pep talk — pressuring Anthropic to sell its most powerful AI for broad military use, using deadline theatrics and bullshit threats of invoking the wartime Defense Production Act. Subtlety has left the building.
Anthropic’s hesitation is adorably similar to the Cake Nazi’s: it doesn’t want its artistic creation used for something it doesn’t approve of — in this case, mass domestic surveillance and/or autonomous kill weapons. It holds the quaint view that if you’ve built something that is eventually destined to outthink you, and maybe go rogue, you maybe shouldn’t bolt it to a thermonuclear launch vehicle.
And now the cake metaphor returns — bench-pressing in tactical body armor and yelling about freedom.
The Colorado baker worried that selling a wedding cake would essentially brand him as endorsing a same-sex union. He didn’t want his artistry attached to something that violated his core principles. The Supreme Court sided with him, if on narrow grounds.
NO CAKE FOR YOU!
Anthropic now appears to be making a structurally similar argument: We built this thing. It reflects our judgment, our values, our constraints. We’d rather not see it repurposed as an existential doomsday device, thank you.
The political irony here is rich enough to require a double dose of Ozempic.
Hegseth exudes the toxic he-man aesthetic — the hyper-flexing, the push-up challenges, the aggressively hetero signaling — all wrapped in performative disdain for anything perceived as weak, coastal, or insufficiently macho. And yet here we are, watching a government that bristles at rainbow flags demanding access to an agentic model named Claude — a creation born in San Francisco — to perform things that would make even the most dystopian screenwriter blink twice.
If you apply the cake logic consistently, you get an uncomfortable symmetry: Just as the baker claimed he shouldn’t be forced to contribute his creations to a wedding he didn’t endorse, Anthropic can argue it shouldn’t be compelled to contribute its creative code to a government whose vibes it finds… potentially existential.
After all, this isn’t a Costco sheet cake with “Congrats Steve & Brad” in icing. It’s a darkly mysterious, self-rewriting alien brain that, if mishandled, could surveil, select, and destroy… possibly everything. If a cake is protected speech, this is a primal scream.
The Pentagon insists it has no intention of surveilling Americans or removing human decisions from the kill chain… says the roided-up guy who never stops talking about “warrior ethos” and “lethality.” The deeper question is this: Can the state compel a private company to hand over a continually evolving, dangerous creation just because it wants it?
If icing is speech, then algorithms are essays. And if a baker could refuse a wedding cake on conscience grounds, a tech company can refuse to bake the apocalypse.
P.S. As I’ve argued in previous posts here, here, and here, there still aren’t meaningful guardrails to slow down this runaway train. We’ve outsourced AI safety to the very companies racing to win. And now the one entity supposedly tasked with protecting us — the State — wants to take control — and it’s not to install brakes.

