[Thread] California State Senator Scott Wiener defends his AI safety bill, SB 1047, after criticism that it will “crush OpenAI's competitors” and open-source AI
Senator Scott Wiener / @scott_wiener :
Related Coverage
- SB-1047 will stifle open-source AI and decrease safety (Answer.AI)
- The AI safety fog of war (Politico)
- Why OpenAI Should Become Open-Source (HackerNoon)
Discussion
-
@psychosort
Brian Chau
on x
The California senate bill to crush OpenAI's competitors is fast tracked for a vote. This is the most brazen attempt to hurt startups and open source yet. 🧵 [image]
-
@bgurley
Bill Gurley
on x
What's really happening in the background around AI regulation. [image]
-
@quintinpope5
Quintin Pope
on x
@Scott_Wiener Please correct me if I'm wrong, but SB 1047 seems to open multiple straightforward paths for de facto banning any open model that improves on the current state of the art. E.g., - The 2023 FBI Internet Crime Report indicates cybercriminals caused ~$12.5 billion in t…
-
@deanwball
Dean Woodley Ball
on x
Senator Wiener's response on #SB1047 fails to address the myriad ways in which this bill outlaws open-sourcing models close to today's frontier (to say nothing of the frontier in the future). (1/2)
-
@typewriters
Lauren Wagner
on x
We at @contextfund have been following the evolution of SB 1047 for months. Our working group of engineers and policy analysts from the open source community has feedback: 1/While the proposal has good intent, it tries to solve a complex research problem with the legal...
-
@jeremyphoward
Jeremy Howard
on x
By imposing restrictions on open-source AI, SB1047 hurts AI safety, reducing: - Collaboration, which allows a wider range of experts to identify and address potential safety concerns - Resilience; concentrating control creates single points of failure & increases systemic risk
-
@angrynoah
@angrynoah
on x
SB 1047 is so funny. It won't accomplish anything valuable, but it will create a massive incentive for lots of companies to leave California. I endorse this for the comedy value alone
-
@youraimarketer
@youraimarketer
on x
Just read @jeremyphoward's personal response to bill SB-1047. I'm sharing some insights because the aftermath of this bill will affect other regions and countries, too. “It could reduce AI safety, through reducing transparency, collaboration, diversity, and resilience.” [image]
-
@jeremyphoward
Jeremy Howard
on x
SB-1047 creates significant barriers to entry for small businesses and startups looking to innovate in AI. Compliance costs & legal risks could deter entrepreneurs. This would stifle innovation and concentrate power within established corporations
-
@krishnanrohit
Rohit
on x
CA SB 1047 is part of a list of regulations, including the NRC or NEPA or CEQA, all of which were made with great intentions but have since been perverted. It makes sense not to make broad-brush regulations that will kill yet another incredible technology in its crib.
-
@psychosort
Brian Chau
on x
SB 1047, a bill which would ban open source AI, is being fast tracked through the California Senate. We're now expanding our campaign to save open source.🧵 https://twitter.com/...
-
@chrislengerich
@chrislengerich
on x
Open letter to @Scott_Wiener re: SB-1047. A Safe Harbor for Independent AI Evaluation? Hi Scott, Just a personal thought from the investing perspective, 1047 seems likely to be about 1 week - 6 months away from an SBF 2.0-style scandal. The bill sponsors likely aren't being
-
@sambreed
@sambreed
on x
the great thing about AI is that you can quickly summarize shit like the text of this horrendous bill and author a reply to your representative. Here's the text I sent: I am writing to express my opposition to SB 1047, the Safe and Secure Innovation for Frontier Artificial...
-
@opensauceai
Ben Brooks
on x
Lots of anxiety about SB 1047 in CA, and a lot of swirl. The broad contours will be familiar to folks following the US Executive Order and EU AI Act. There's a line in the sand, and if you cross the line, you have to do and disclose a bunch of things before training and releasing…
-
@perrymetzger
Perry E. Metzger
on x
SB 1047 is an extreme measure that will destroy the AI industry in California and de facto ban open source AI. It was written by the Effective Altruism cult, introduced at the behest of the Effective Altruism cult, and everything Brian wrote about the content is correct.
-
@kavinstewart
Kavin Stewart
on x
The biggest issue with SB 1047 is 22602(n) It defines hazardous capability as making it significantly easier to do things like cause >$500M in damage. But any high leverage tech (like search engines) seems like it would meet this standard. Instead, why not redefine this to mean…
-
@rao_hacker_one
Arun Rao
on x
Really terrible SB 1047 by @Scott_Wiener would shut down AI development in CA and lead to heavy flight of companies and engineers. He and his staff need to talk to more companies to understand the issues - disappointing to see a bill like this one. Analysis:...
-
@mahoneymatic
Mahoney
on x
I won't get out of my lane and speak to the specifics of this, but on the whole it should be the #1 concern right now to protect the ability to maintain parity with AI platforms using open tools. https://www.answer.ai/...
-
@nathanleamerdc
Nathan Leamer
on x
Today California State Senator Scott Wiener tried to defend his ill-conceived proposal, SB 1047, which would undermine development of AI in California. He claims you can DM him to continue the dialogue, except his DMs are closed. Why lie about this? [image]
-
@cfgeek
Charles Foster
on x
SB 1047 includes a handful of concessions to open-weight advocates. Unfortunately they're quite weaksauce in light of the rest of the proposed legislation. [image]
-
@thezvi
Zvi Mowshowitz
on x
On review: It seems like most of the actual problems with SB 1047 as written are in the definition of a derivative model. In particular, the fact that the model remains derivative under unlimited additional training. This is a mistake. We can and should fix it.
-
@perrymetzger
Perry E. Metzger
on x
The claim here is that lots of people support SB 1047, the extremist anti-AI bill. I have never even heard of “Lovable Labs.” Where are the actual AI companies in all of this? Where was the testimony from the Open Source Initiative about the effect on open source, from the EFF...
-
@psychosort
Brian Chau
on x
My response to Senator Wiener's QT: I stand by the claims that this is a clear attempt to ban open source and that criminal perjury charges are possible https://twitter.com/...
-
@psychosort
Brian Chau
on x
Their definition of derivative model includes unlimited additional training, as long as it is not “independent”, and everything up to combination with other software. [image]
-
@psychosort
Brian Chau
on x
The bill would make it a felony to make a paperwork mistake for this agency, opening the door to selective weaponization and harassment. [image]
-
@perrymetzger
Perry E. Metzger
on x
I think a lot of people at AI companies don't get how bad California's proposed law (SB 1047) is. If your company wants to be able to do any AI work of note, you need to oppose it; if it passes, the industry is going to be shut down in the state.
-
@adamthierer
Adam Thierer
on x
California's SB-1047 makes hypothetical worst-case thinking the basis of #AI regulation. Only the largest current tech companies would likely be able to handle the bill's compliance costs and liability threats. Smaller developers (especially #opensource) would be decimated by…
-
@psychosort
Brian Chau
on x
They literally specified that they want to regulate models capable of competing with OpenAI.
-
@sftombu
Tom Burns
on x
CA SB-1047 will lead to state regulation of most software. The threshold for regulation sounds high now, but given the rate of progress, it will apply to most software in the near future. Most software will use SOTA AI, so most software will face regulation and litigation. [video]
-
@danhendrycks
Dan Hendrycks
on x
Hinton and Bengio on SB 1047 and a summary of the bill. Hinton: “SB 1047 takes a very sensible approach... I am still passionate about the potential for AI to save lives through improvements in science and medicine, but it's critical that we have legislation with real teeth to…
-
@kevinbankston
Kevin Bankston
on x
A problem with this Cali AI bill I haven't seen mentioned: like the AI EO/Commerce KYC proposed regs, the KYC provisions of SB 1047 appear to violate the federal Stored Communications Act. Gov't needs subpoena or court order for such info from a remote computing provider.
-
@jeremyphoward
Jeremy Howard
on x
There's a new bill, SB-1047 “Safe and Secure Innovation for Frontier Artificial Intelligence Models Act”. I think it could do a great deal of harm to startups, American innovation, open source, and safety. So I've written a response to the authors: 🧵 https://www.answer.ai/...
-
@argleave
Adam Gleave
on x
I support SB 1047: the regulation asks billion-$ tech companies to take reasonable precautions when training models with the greatest capability for misuse, poses few to no costs on other developers, and supports academic & open-source research through compute funding.
-
@adamthierer
Adam Thierer
on x
California is considering a new #AI bill (SB 1047) that is “one of the most far-reaching and potentially destructive technology measures being considered today,” as I argue in this new @RSI analysis. https://www.rstreet.org/...
-
@jonaskonas
Jon Askonas
on x
We are already seeing an explosion of AI regulation that is designed to ban open source while claiming to be neutral. SB 1047 designates a “hazardous capability” to include what a third party can show with infinite fine-tuning and re-training. Meanwhile, closed models get points.…
-
@simonesyed
@simonesyed
on x
Fuck the EAs and the fast-tracking of SB 1047. This would irrevocably demolish not only the burgeoning AI ecosystem in CA, but also generally VC funding, automation and industry, and the desirability to live here. Rationalist ideology has destroyed society. Now these fucks want to destroy the future
-
@anjneymidha
Anjney Midha
on x
I finally read the full SB 1047 draft bill. Holy shit. This is serious Big Tech lobbying. They should just rename it “How To Kill AI Startups and Open Source” To make AI safe, you have to regulate its misuses. Not restrict fundamental research and open scientific progress.
-
@psychosort
Brian Chau
on x
They are trying to sneak in a new AI regulatory agency, like the TSA or Nuclear Regulatory Commission. The primary function will be to harm small players in the AI industry. Or as @adamthierer puts it, NEPA for AI [image]