SB 53, the AI safety and transparency bill that California Governor Gavin Newsom signed into law this week, proves that state regulation does not have to stand in the way of AI progress.
So says Adam Billen, vice president of public policy at the youth-led advocacy group Encode AI, on today’s episode of Equity.
“The reality is that policymakers themselves know that we need to do something, and know from working on a million other issues that there is a way to pass legislation that genuinely protects innovation, which I care about, while ensuring that these products are safe,” he said.
At its core, SB 53 is a first-in-the-nation bill that requires large AI labs to be transparent about their safety and security protocols, specifically about how they prevent their models from causing catastrophic harm, such as being used in cyberattacks on critical infrastructure or to build bioweapons. The law also requires companies to stick to those protocols, with enforcement by the Office of Emergency Services.
“Companies are already doing the things that we ask them to do in this bill,” Billen told TechCrunch. “They do safety testing on their models. They release model cards. Are they starting to skimp on that in some areas at some companies? Yes.”
Billen also noted that some AI companies maintain policies that allow them to relax safety standards under competitive pressure. OpenAI, for example, has publicly stated that it may “adjust” its safety requirements if a rival AI lab releases a high-risk system without similar safeguards. Billen argues that policy can enforce companies’ existing safety promises, preventing them from cutting corners under competitive or financial pressure.
While public opposition to SB 53 was more muted than it was for its predecessor, SB 1047, which Newsom vetoed last year, the rhetoric in Silicon Valley and among most AI labs has been that almost any AI regulation is a threat to progress and will ultimately cost the US its race against China.
That is why companies like Meta, VCs like Andreessen Horowitz, and powerful individuals like OpenAI president Greg Brockman are collectively pouring hundreds of millions of dollars into super PACs that back pro-AI politicians in state elections. And it is why those same forces earlier this year pushed for an AI moratorium that would have banned states from regulating AI for 10 years.
Encode AI ran a coalition of more than 200 organizations that worked to defeat the proposal, but Billen says the fight is not over. Senator Ted Cruz, who championed the moratorium, is now attempting a new strategy to achieve the same goal of federal preemption of state laws. In September, Cruz introduced the SANDBOX Act, which would allow AI companies to apply for waivers to temporarily bypass certain federal regulations for up to 10 years. Billen also anticipates a forthcoming bill establishing a federal AI standard, which could be pitched as a middle-ground solution but would in practice override state laws.
He warned that narrowly scoped federal legislation could “delete federalism for the most important technology of our time.”
“If you told me that SB 53 was the bill that would replace all state bills on everything related to AI and all potential harms, I would tell you that it is probably not a very good idea, and that this bill is designed for a particular subset of issues,” Billen said.
While he agrees that the AI race with China matters and that policymakers must regulate in a way that supports American progress, Billen says killing state bills, which focus mainly on deepfakes, transparency, algorithmic discrimination, children’s safety, and security, is not the way to do it.
“Are bills like SB 53 the thing that will stop us from beating China? No,” he said. “I think it is just intellectually dishonest to say that this is the thing that will stop us in the race.”
He added: “If the thing you care about is beating China in the AI race, and I do care about that, then the things you would push for are things like export controls in Congress. You would make sure American companies have the chips they need. But that is not what the industry is pushing for.”
Legislative proposals like the Chip Security Act aim to prevent advanced AI chips from being diverted to China through export controls and device tracking, while the existing CHIPS and Science Act seeks to boost domestic chip production. However, some major tech companies, including OpenAI and Nvidia, have expressed reluctance about or opposition to aspects of these efforts, citing concerns about effectiveness, competitiveness, and security vulnerabilities.
Nvidia has its reasons: it has a strong financial incentive to keep selling chips to China, which has historically accounted for a significant share of its global revenue. Billen speculated that OpenAI may be holding back from endorsing chip export controls in order to stay in the good graces of critical suppliers like Nvidia.
There has also been inconsistent messaging from the Trump administration. Three months after expanding its ban on exporting advanced AI chips to China in April 2025, the administration reversed course, allowing Nvidia and AMD to sell some chips to China in exchange for 15% of the revenue.
“You see people on the Hill moving toward bills like the Chip Security Act that would put export controls on China,” Billen said. “Meanwhile, there is this continued push behind the narrative of killing state bills that are actually quite light-touch.”
Billen added that SB 53 is an example of democracy in action: industry and policymakers working together to reach a version of a bill that everyone can agree on. The process is “very ugly and messy,” he said, but “that process of democracy and federalism is the entire foundation of our country and our economic system, and I hope we will keep doing it successfully.”
“I think SB 53 is one of the best proof points that it can still work,” he said.
This article was first published on 1 October.
