In a significant legal battle, Anthropic is receiving support from an unexpected source: employees of competing AI firms. Over 30 employees from OpenAI and Google DeepMind, including Google chief scientist Jeff Dean, have filed an amicus brief arguing that the Pentagon’s decision to label Anthropic as a “supply-chain risk” could have detrimental effects on the entire American AI industry.
The brief, submitted just hours after Anthropic announced two lawsuits against the government, warns that the Pentagon’s blacklist could undermine U.S. industrial and scientific competitiveness in artificial intelligence. “This effort to punish one of the leading U.S. AI companies will undoubtedly have consequences for the United States’ industrial and scientific competitiveness in the field of artificial intelligence and beyond,” the employees stated in the filing. This show of solidarity among employees of rival firms points to a broader dispute over how AI should be governed and who controls it.
The tensions between Anthropic and the Trump administration escalated dramatically last week after the two parties failed to agree on a revised contract detailing the deployment of Anthropic’s AI model, Claude. Anthropic sought to impose “redlines” to prevent the model’s use in domestic mass surveillance and autonomous weaponry. However, the Pentagon insisted on broader language that allowed for “all lawful use” by the military. Anthropic’s refusal to comply with these terms led to the cancellation of its government contracts and its designation as a national security risk.
Shortly after these negotiations collapsed, OpenAI secured its own deal with the Pentagon, apparently agreeing to conditions that Anthropic had rejected. The contrasting outcomes ignited a contentious exchange between the two companies’ CEOs. Anthropic CEO Dario Amodei dismissed OpenAI’s approach as “safety theater” and accused OpenAI CEO Sam Altman of spreading “straight up lies.” In response, Altman suggested that it is “bad for society” when companies operate outside democratic norms, implying that Amodei’s critiques were politically motivated.
This unusual alignment among employees of rival firms is noteworthy. While the brief’s signatories said they were acting in a personal capacity, their filing follows an earlier open letter signed by nearly 900 employees at Google and OpenAI. That letter urged company leadership to reject government requests to use AI for domestic mass surveillance or autonomous lethal targeting—the same redlines Anthropic raised in its Pentagon negotiations.
The fallout from Anthropic’s predicament could extend beyond its immediate legal challenges, potentially igniting a broader employee revolt against corporate management in the tech sector. There is precedent for this concern: Google faced significant employee backlash in 2018 over its collaboration with the Pentagon on Project Maven, a program that used AI to analyze drone surveillance footage. Employee objections ultimately led Google to decline to renew the contract, and the work was subsequently picked up by Amazon and Microsoft.
The ongoing conflict between Anthropic and the Pentagon raises critical questions about the control and ethical use of AI technologies, and about the relationship between the tech industry and government. As the situation develops, it could have lasting implications for governance in the AI sector and for how tech workers assert their role in shaping the future of artificial intelligence.
See also
Anthropic Sues US Government Over Military AI Classification Amid OpenAI’s Defense Deals
China’s AI Strategy Gains Praise at Hong Kong Forum, Boosting Regional Adoption and Innovation
GSA Proposes Draft AI Contract Terms Granting Broad Usage Rights to Federal Agencies
OpenClaw Gains Traction Among Chinese Local Governments Amid Risk Warnings
Germany Eyes Anthropic as US Blacklists AI Giant Amid Digital Sovereignty Debate