For the first time, Washington is getting close to deciding how to regulate artificial intelligence. And the race brewing isn’t about the technology itself, it’s about who gets to regulate it.
In the absence of a meaningful federal AI standard focused on consumer safety, states have introduced dozens of bills to protect residents from AI-related harm, including California’s AI safety bill SB 53 and the Texas Responsible Artificial Intelligence Governance Act, which prohibits the intentional misuse of AI systems.
Tech giants and buzzy Silicon Valley startups argue that such laws create an unworkable patchwork that threatens innovation.
“It will slow us down in the race against China,” Josh Vlasto, co-founder of the pro-AI PAC Leading the Future, told TechCrunch.
The industry, and many of its transplants now in the White House, is pushing for a national standard or none at all. In the trenches of this all-or-nothing battle, new efforts have emerged to bar states from enacting their own AI legislation.
House lawmakers are reportedly trying to use the National Defense Authorization Act (NDAA) to block state AI laws. At the same time, a leaked White House draft executive order demonstrates strong support for preempting state efforts to regulate artificial intelligence.
A sweeping proposal that would strip states of their right to regulate artificial intelligence is unpopular in Congress, which voted overwhelmingly against a similar moratorium earlier this year. Lawmakers have argued that without a federal standard, blocking states would leave consumers exposed to harm and tech companies free to operate without oversight.
To create this national standard, Representative Ted Lieu (D-CA) and the bipartisan House AI Task Force are preparing a package of federal AI bills that cover a range of consumer protections, including fraud, health care, transparency, child safety and catastrophic risk. A megabill like this will likely take months, if not years, to become law, underscoring why the current rush to limit state power has become one of the most contentious battles in AI policy.
The battle lines: NDAA and EO
Efforts to prevent states from regulating artificial intelligence have increased in recent weeks.
The House has considered including language in the NDAA that would prevent states from regulating artificial intelligence, Majority Leader Steve Scalise (R-LA) told Punchbowl News. Congress was reportedly working to finalize a deal on the defense bill before Thanksgiving, Politico reported. A source familiar with the matter told TechCrunch that the negotiations focused on narrowing the scope to preserve state authority in areas such as child safety and transparency.
Meanwhile, a leaked draft of a White House executive order reveals the administration’s own potential preemption strategy. The EO, which has reportedly been put on hold, would create an “Operation AI Litigation” to challenge state AI laws in court, direct agencies to evaluate state laws deemed “burdensome,” and push the Federal Communications Commission and Federal Trade Commission toward national standards that override state rules.
Specifically, the EO would task David Sacks – Trump’s AI and crypto czar and co-founder of VC firm Craft Ventures – with creating a single legal framework. That would give Sacks direct influence over AI policy, supplanting the typical role of the White House Office of Science and Technology Policy and its head, Michael Kratsios.
Sacks has publicly advocated curbing state regulation and keeping federal oversight minimal, favoring industry self-regulation to “maximize growth.”
The patchwork argument
Sacks’ position reflects the view of much of the AI industry. Several pro-AI super PACs have emerged in recent months, pouring hundreds of millions of dollars into local and state elections to oppose candidates who support AI regulation.
Leading the Future – backed by Andreessen Horowitz, OpenAI president Greg Brockman, Perplexity, and Palantir co-founder Joe Lonsdale – has raised more than $100 million. This week, Leading the Future launched a $10 million campaign pushing Congress to craft a national AI policy that overrides state laws.
“When you’re trying to drive innovation in technology, you can’t have a situation where all these laws keep coming up by people who don’t necessarily have the technical expertise,” Vlasto told TechCrunch.
He argued that a patchwork of state regulations would “slow us down in the race against China.”
Nathan Leamer, executive director of Build American AI, the PAC’s advocacy arm, confirmed the group supports preemption even without specific federal consumer protections for AI in place. Leamer argued that existing laws, such as those covering fraud or product liability, are sufficient to address the harms of artificial intelligence. Where state laws often seek to prevent harms before they arise, Leamer advocates a more reactive approach: let companies move quickly and address problems later in court.
No preemption without representation
Alex Bores, a New York Assemblyman running for Congress, is one of Leading the Future’s first targets. He supported the RAISE Act, which requires major AI labs to have safety plans to prevent critical harms.
“I believe in the power of artificial intelligence, and that’s why it’s so important to have sensible regulations,” Bores told TechCrunch. “Ultimately, the AI that is going to win in the market will be trusted AI, and often the market undervalues or creates poor short-term incentives to invest in safety.”
Bores supports a national AI policy, but argues that states can move more quickly to address emerging risks.
And it is true that states move faster.
As of November 2025, 38 states have adopted more than 100 AI-related laws this year, mostly targeting deepfakes, transparency and disclosure, and government use of AI. (A recent study found that 69% of these laws impose no requirements at all on AI developers.)
Activity in Congress provides more evidence for the slower-than-states argument. Hundreds of AI bills have been introduced, but few have passed. Since 2015, Rep. Lieu has introduced 67 bills in the House Science Committee. Only one became law.
More than 200 lawmakers signed an open letter opposing preemption in the NDAA, arguing that “states function as laboratories of democracy” and must “retain the flexibility to address new digital challenges as they arise.” Nearly 40 attorneys general also sent an open letter opposing a ban on state AI regulation.
Cybersecurity expert Bruce Schneier and data scientist Nathan E. Sanders – authors of Rewiring Democracy: How AI Will Transform Our Politics, Government, and Citizenship – argue that the patchwork complaint is overblown.
AI companies already comply with stricter EU regulations, they note, and most industries find a way to work around various state laws. The real motive, they say, is to avoid accountability.
What might a federal standard look like?
Lieu is drafting a more than 200-page megabill that he hopes to introduce in December. It covers a range of topics, including penalties for fraud, deepfake protections, whistleblower protections, compute resources for academia, and mandatory testing and disclosure for large language model companies.
That last provision would require AI labs to test their models and publish the results – something most already do voluntarily. Lieu has yet to introduce the bill, but said it does not direct any federal agency to directly review AI models. That distinguishes it from a similar bill introduced by Sens. Josh Hawley (R-MO) and Richard Blumenthal (D-CT), which would require a government evaluation program for advanced AI systems before they are deployed.
Lieu acknowledged that his bill would not be as strict, but said it had a better chance of becoming law.
“My goal is to put something into law this term,” Lieu said, noting that House Majority Leader Scalise is openly hostile to AI regulation. “I’m not writing a bill that I would have if I were king. I’m trying to write a bill that could pass a Republican-controlled House, a Republican-controlled Senate and a Republican-controlled White House.”
