As California Governor Gavin Newsom weighs signing or vetoing the fiercely contested AI safety bill SB 1047, SAG-AFTRA and two women’s groups are pushing him to approve it — adding even more voices to an already frenzied debate. The performers union, the National Organization for Women (NOW), and Fund Her have each sent letters to Newsom, all of which have been obtained by The Verge and are being published here for the first time.

The letters from SAG-AFTRA, NOW, and Fund Her highlight concerns about AI’s potential to cause catastrophic harm if the technology is left unregulated. SAG-AFTRA outlines SB 1047’s mandate for developers to test for and safeguard against AI-enabled disasters, like cyberattacks on critical infrastructure or bioweapon development. NOW and Fund Her cite grave warnings from people at the forefront of AI and discuss the technology’s potentially disproportionate impacts on vulnerable groups. 

SAG-AFTRA, whose 160,000 members include stars like Scarlett Johansson and Tom Hanks, posted a call for support on X yesterday. NOW, the largest feminist organization in the US with around 500,000 members, said it was motivated by expert claims “about how dangerous this incredible technology can be if it is not developed and deployed responsibly.” Fund Her, a PAC that helped elect 12 progressive women to prominent positions in California in 2022, writes of the “race to develop the first independent thinking AI,” at which point “it will be too late to impose safety guardrails.”

SAG-AFTRA and NOW represent the latest power players to weigh in on the California bill, which has become the object of exceptional national interest and scrambled conventional partisan boundaries. 

SB 1047, authored by state Senator Scott Wiener, would be the most significant AI safety law in the US. It establishes civil liability for developers of next-generation AI models like ChatGPT if those models cause disasters and the developers failed to implement appropriate safeguards. The bill also includes whistleblower protections for employees of AI companies, garnering support from OpenAI whistleblowers Daniel Kokotajlo and William Saunders.

“The AI safety standards set by California will change the world”

NOW writes in its letter that “the AI safety standards set by California will change the world,” a view echoed by bill cosponsor Dan Hendrycks, director of the Center for AI Safety. Hendrycks tells The Verge that SB 1047 could be Newsom’s “Pat Brown moment,” referring to California’s then-governor signing a groundbreaking auto tailpipe emissions law in 1966. He invokes what has since become known as the California Effect: “where California leads on important regulation, the rest of the country follows.”

Having passed both houses of the state legislature with strong majorities in late August, the bill now awaits Governor Newsom’s decision, due by September 30th. The governor’s office said it doesn’t “typically comment on pending legislation. This measure will be evaluated on its merits.”

This comment notwithstanding, the fate of SB 1047 may come down to a political calculation — a reality each side appears to recognize as they marshal support in the bill’s final hours. 

The odd political coalitions that have emerged in the fight over SB 1047 augur a topsy-turvy future for AI policy battles. Billionaire Elon Musk aligns with social justice groups and labor unions in supporting the bill, while former House Speaker Nancy Pelosi, progressive Congressman Ro Khanna, Trump-supporting venture capitalist Marc Andreessen, and AI “godmother” Fei-Fei Li are all opposed.

AI is the rare issue that hasn’t yet sorted into clear partisan camps. As the technology grows in importance, the debate over how to govern it is likely to grow in intensity and may continue to scramble the usual allegiances.

These recent letters join support for the bill from organizations like the nearly 2-million-strong SEIU and the Latino Community Foundation. 

SAG-AFTRA has been home to some of the most organic anti-AI sentiment. Many screen actors see generative AI as an existential threat to their livelihoods. The use of the technology was a major sticking point in the 2023 actors strike, which resulted in a requirement that studios get informed consent from performers before creating digital replicas of them (actors must also be compensated for their use). 

“SAG-AFTRA knows all too well the potential dangers that AI poses”

The union writes that “SAG-AFTRA knows all too well the potential dangers that AI poses,” citing harms its members have already experienced in the form of nonconsensual deepfake pornography and theft of performers’ likenesses. It concludes that “policymakers have a responsibility to step in and protect our members and the public. SB 1047 is a measured first step to get us there.”

In a phone interview, NOW president Christian Nunes said the group got involved because it is worried about how unregulated AI can affect women. She and NOW have previously supported efforts to ban nonconsensual deepfakes.

In the NOW letter, Nunes writes that the dangers warned of by AI experts “would disproportionately fall on vulnerable groups, including women.” She highlights Newsom’s “courageous support for us in the face of intense lobbying pressure” on reproductive rights, equal pay, and paid family leave, adding that this support “is one of the reasons why women have voted for [him] time and time again.”

While SB 1047 isn’t explicitly designed to address these groups’ more central concerns, the organizations seem to see strategic value in joining the coalition behind it. Nunes told The Verge she views the bill as part of a broader project to hold Big Tech accountable.

This support for SB 1047 complements other pending AI legislation that more directly addresses these groups’ specific issues. For instance, the federal NO FAKES Act aims to combat deepfakes, while another AI bill on Newsom’s desk, endorsed by SAG-AFTRA, would regulate the use of digital replicas. By backing SB 1047 alongside these more targeted initiatives, these organizations appear to be taking a comprehensive approach to AI governance.

The NOW and Fund Her letters both draw parallels between unregulated AI and the history of social media. Fund Her founder and president Valerie McGinty writes to The Verge, “We have seen the incredible harm social media has imposed on our children and how difficult it is to reverse it. We won’t be stuck playing catch up again if Governor Newsom signs SB 1047 into law.”

It’s unclear if the letters will be enough for the bill to overcome the powerful forces arrayed against it. While Wiener and other advocates describe the regulation as “light-touch” and “common sense,” the industry is, by and large, freaking out. 

The US currently relies almost entirely on self-regulation and nonbinding voluntary commitments to govern AI, and the industry would like to keep it that way. As the first US AI safety regulation with teeth, SB 1047 would set a powerful precedent, which is a likely motivation behind both these letters and the vigorous industry opposition. 

Google, Meta, and OpenAI took the unusual step of writing their own letters opposing the bill. Resistance from AI investors has been even stiffer, with the prestigious startup incubator Y Combinator (YC) and the venture fund Andreessen Horowitz (a16z) leading a full-court press to kill SB 1047. These and other prominent opponents warn that the bill could prompt an exodus from California, cede the US lead in AI to China, and devastate the open source community. 

Naturally, supporters dispute each of these arguments. In a July letter addressing YC and a16z’s claims about the bill, Wiener points out that SB 1047 would apply to any covered AI company doing business in California, the world’s AI hub and fifth-largest economy. Dario Amodei, CEO of Anthropic, a leading AI company and eventual de facto SB 1047 supporter, called the threat to leave “just theater” (a threat that OpenAI, Meta, and Google have nonetheless also invoked).

Nancy Pelosi called the bill “well-intentioned but ill informed”

In her statement opposing the bill, Pelosi called it “well-intentioned but ill informed.” In a phone interview, Wiener said, “I have enormous respect for the Speaker Emerita. She is the GOAT,” but went on to call Pelosi’s statement “unfortunate” and noted that “some of the top machine learning pioneers on the planet support the bill,” citing endorsements from deep learning “godfathers” Geoffrey Hinton and Yoshua Bengio. Wiener also pointed to a supportive open letter published Monday by over 100 employees and alumni of the leading AI companies.

On the merits, the most telling letter might be the one published by Anthropic, which broke from its peers to write that the revised legislation’s “benefits likely outweigh its costs.” That letter followed a round of amendments made in direct response to the company’s earlier complaints. Anthropic’s Claude family of chatbots leads the world on some metrics, and the company will likely be among the handful of AI developers directly covered by the law in the near future.

With key congressional leaders promising to obstruct substantive federal AI regulations and opposing SB 1047, California may go it alone, as it already has on net neutrality and data privacy. As NOW’s Nunes writes, the “AI safety standards set by California will change the world,” giving Governor Newsom a chance to make history and model “balanced AI leadership.”

Fund Her’s McGinty summed up the supporters’ stance in an email to The Verge: “We should listen to these experts more interested in our wellbeing than the Big Tech executives skimping on AI safety.” 

As the September 30th deadline approaches, all eyes are on Governor Newsom to see how he’ll shape the future of AI governance in California and beyond. “My experience with Gavin Newsom is — agree or disagree — he makes thoughtful decisions based on what he thinks is best for the state,” says Wiener. “I’ve always appreciated that about him.”

Correction: The article initially cited deep learning “godfather” Yann LeCun as a supporter of SB 1047. LeCun is opposed to the bill. We regret the error.
