The race to regulate AI has sparked a federal vs state showdown

For the first time, Washington is close to a decision on how to regulate artificial intelligence. And the battle underway is not about the technology itself, but about who gets to do the regulating.
In the absence of a meaningful federal AI standard that focuses on consumer safety, states have introduced dozens of bills to protect residents from AI-related harm, including California’s AI safety law SB-53 and the Texas Responsible AI Governance Act, which prohibits intentional misuse of AI systems.
The tech giants and vibrant startups born in Silicon Valley argue that such laws create an unworkable patchwork that threatens innovation.
“It will slow us down in the race against China,” Josh Vlasto, co-founder of pro-AI PAC Leading the Future, told TechCrunch.
The industry, and some of its transplants in the White House, is pushing for a national standard or none at all. In the trenches of that all-or-nothing battle, new efforts have emerged to ban states from enacting their own AI legislation.
House lawmakers are reportedly trying to use the National Defense Authorization Act (NDAA) to block state AI laws. At the same time, a leaked draft of a White House executive order also shows strong support for preventing state efforts to regulate AI.
A sweeping preemption that would strip states of the right to regulate AI is unpopular in Congress, which voted overwhelmingly against a similar moratorium earlier this year. Lawmakers have argued that without a federal standard, blocking states would expose consumers to harm and leave tech companies free to operate without oversight.
To create that national standard, Rep. Ted Lieu (D-CA) and the bipartisan House AI Task Force are preparing a package of federal AI bills that cover a range of consumer protections, including fraud, health care, transparency, child safety, and catastrophic risk. It will likely take months, if not years, for a megabill like this to become law, underscoring why the current rush to limit state authority has become one of the most contentious battles in AI policy.
The Battle Lines: NDAA and the EO

Efforts to prevent states from regulating AI have intensified in recent weeks.
The House has considered putting language in the NDAA that would prevent states from regulating AI, Majority Leader Steve Scalise (R-LA) told Punchbowl News. Congress was reportedly working to finalize a deal on the defense bill before Thanksgiving, according to Politico. A source familiar with the matter told TechCrunch that negotiations have focused on narrowing the scope to potentially preserve state authority in areas such as child safety and transparency.
Meanwhile, a leaked draft of a White House EO reveals the administration’s own potential preemption strategy. The EO, which has reportedly been put on hold, would create an “AI Litigation Task Force” to challenge states’ AI laws in court, direct agencies to review state laws deemed “burdensome,” and push the Federal Communications Commission and Federal Trade Commission toward national standards that override state rules.
Notably, the EO would give David Sacks – Trump’s AI and Crypto Czar and co-founder of the venture capital firm Craft Ventures – co-leadership in creating a unified legal framework. This would give Sacks direct influence over AI policy, replacing the typical role of the White House Office of Science and Technology Policy, and its head Michael Kratsios.
Sacks has publicly advocated blocking state regulation while keeping federal oversight minimal, promoting industry self-regulation to “maximize growth.”
The patchwork argument
Sacks’ stance reflects that of much of the AI industry. Several pro-AI super PACs have emerged in recent months, pouring hundreds of millions of dollars into local and state elections to oppose candidates who support AI regulation.
Leading the Future – backed by Andreessen Horowitz, OpenAI president Greg Brockman, Perplexity, and Palantir co-founder Joe Lonsdale – has raised more than $100 million. This week, Leading the Future launched a $10 million campaign pushing Congress to create a national AI policy that overrides state laws.
“If you’re trying to drive innovation in the tech sector, you can’t have a situation where all these laws keep popping up from people who don’t necessarily have the technical expertise,” Vlasto told TechCrunch.
He argued that a patchwork of state regulations will “slow us down in the race against China.”
Nathan Leamer, executive director of Build American AI, the PAC’s advocacy group, confirmed that the group supports preemption without AI-specific federal consumer protections. Leamer argued that existing laws, such as those addressing fraud or product liability, are sufficient to address the harms of AI. While state laws often try to prevent problems before they arise, Leamer favors a more reactive approach: Let companies act quickly and address problems in court later.
No preemption without representation

Alex Bores, a member of the New York Assembly running for Congress, is one of the early targets of Leading the Future. He sponsored the RAISE Act, which requires major AI labs to have safety plans in place to prevent critical harm.
“I believe in the power of AI, and that’s why it’s so important to have reasonable regulations,” Bores told TechCrunch. “Ultimately, the AI that will win in the market will be a reliable AI, and often the market underestimates investments in security or provides poor incentives in the short term.”
Bores supports a national AI policy, but argues that states can take faster action to address emerging risks.
And it is true that states are acting faster.
As of November 2025, 38 states have adopted more than 100 AI-related laws this year, mainly focused on deepfakes, transparency and disclosure, and government use of AI. (A recent study found that 69% of those laws place no requirements on AI developers at all.)
Activity in Congress provides more evidence for the slower-than-states argument. Hundreds of AI bills have been introduced, but few have been passed. Since 2015, Representative Lieu has introduced 67 bills in the House Science Committee. Only one became law.
More than 200 lawmakers signed an open letter opposing preemption in the NDAA, arguing that “states serve as laboratories of democracies” that “must retain the flexibility to meet new digital challenges as they arise.” Nearly forty attorneys general also sent an open letter opposing a state ban on AI regulation.
Cybersecurity expert Bruce Schneier and data scientist Nathan E. Sanders – authors of Rewiring Democracy: How AI will transform our politics, government and citizenship – argue that the patchwork complaint is exaggerated.
AI companies already comply with stricter EU regulations, they note, and most industries find a way to operate under different state laws. The real motive, they say, is to avoid responsibility.
What might a federal standard look like?
Lieu is drafting a megabill of more than 200 pages that he hopes to introduce in December. It covers a range of topics, including fines for fraud, deepfake protections, whistleblower protection, computing resources for academia, and mandatory testing and disclosure for major language model companies.
The latter provision would require AI labs to test their models and publish the results – something most labs now do voluntarily. Lieu has not yet introduced the bill, but he said it does not direct federal agencies to directly review AI models. That’s different from a similar bill introduced by Sens. Josh Hawley (R-MO) and Richard Blumenthal (D-CT), which would have required a government-run evaluation program for advanced AI systems before they could be deployed.
Lieu acknowledged that his bill would not be as strict, but he said it had a better chance of being passed into law.
“My goal is to get something into law this term,” Lieu said, noting that House Majority Leader Scalise is openly hostile to AI regulation. “I’m not writing a bill that I would have if I were king. I’m trying to write a bill that a Republican-controlled House, a Republican-controlled Senate and a Republican-controlled White House could pass.”




