Executive Summary
The Department of War's supply chain restriction against Anthropic, imposed after the company refused mass surveillance applications, reveals a power dynamic that will reshape AI company valuations. The Pentagon's leverage runs through Anthropic's partners: it could make it a soft, unspoken, or even explicit condition of its contracts with Amazon, Nvidia, Google, and Palantir, the companies Anthropic relies on for chips, funding, and distribution, that those companies stop doing business with Anthropic. This coercive precedent exposes a critical vulnerability: as AI becomes ubiquitous in products by 2028, big tech companies may face binary choices between their AI providers and their government contracts.

The analysis suggests companies with lower government revenue exposure, particularly those focused on consumer and commercial markets, will gain structural advantages as AI regulation intensifies. On the hosts' projection of roughly 10x annual cost declines, nationwide mass surveillance that costs about $30 billion today would cost roughly $300 million by 2028, making technical feasibility a non-constraint; the real constraint becomes political norms and corporate resistance. Companies like Google and Amazon, with significant government cloud business, face the highest regulatory capture risk. Meanwhile, pure-play AI infrastructure providers with diversified commercial revenue streams may command premium valuations as 'regulation-resistant' alternatives. The broader thesis centers on AI alignment becoming a government control mechanism rather than a safety framework, creating a new category of political risk for AI-dependent businesses.
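To make the cost arithmetic concrete, here is a minimal sketch of the hosts' projection, assuming a flat 10x reduction each year from a $30 billion baseline (figures taken from the quote under Key Insights below); the choice of 2026 as the baseline year is an assumption for illustration, not stated in the source.

```python
# Hosts' projection: the cost of nationwide surveillance falls ~10x per year.
# Baseline figure comes from the quoted remarks; the baseline year is assumed.
BASELINE_YEAR = 2026   # assumed "this year" for illustration
BASELINE_COST = 30e9   # $30 billion starting cost

def projected_cost(year: int) -> float:
    """Projected cost in dollars, assuming a 10x reduction every year."""
    return BASELINE_COST / (10 ** (year - BASELINE_YEAR))

for year in range(BASELINE_YEAR, 2031):
    print(f"{year}: ${projected_cost(year):,.0f}")
# 2026: $30,000,000,000
# 2027: $3,000,000,000
# 2028: $300,000,000
# 2029: $30,000,000
# 2030: $3,000,000
```

Under that assumed baseline year, the curve reproduces the quoted milestones: roughly $300 million in 2028 and low single-digit millions by 2030.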
Key Insights
What the hosts said: “while this year might cost $30 billion, next year it will cost $3 billion, the year after that $300 million, and by 2030, it will be less expensive to monitor every single nook and cranny in this country than it is to remodel the White House”
What the hosts said: “The federal government has contracts with all the other big tech companies that Anthropic relies on for chips and for funding. And it could make a soft, unspoken condition, or maybe even an explicit condition of such contracts that those companies no longer do business with Anthropic”
What the hosts said: “The underlying terms here, like catastrophic risk or threats to national security or autonomy risk, are so vague and so open to interpretation that you're just handing a fully loaded bazooka to a future power hungry leader”