Richard Branson, Steve Bannon, Susan Rice, and others are among the broad coalition of leaders signing the Future of Life Institute's Pro-Human AI Declaration
An eclectic group including Steve Bannon, Glenn Beck, Richard Branson, Ralph Nader, Susan Rice and Nobel Prize-winning economist Daron Acemoglu signed the letter.
Sources: OpenAI is developing AI tools to generate music from text and audio prompts, including capabilities such as adding guitar accompaniment to vocal tracks
OpenAI recently made a splash with artificial intelligence that generates short videos from text prompts—say, a scene from …
Prince Harry and 800+ public figures sign a Future of Life Institute statement urging a ban on AI superintelligence development until it can be deployed safely
The call, signed by Nobel laureates, ex-military leaders and public figures worldwide, seeks a ban on research that could create machines smarter than people.
Michel Devoret, a Google Quantum AI chief scientist, John Martinis, who left Google in 2020, and John Clarke win the Nobel in Physics for quantum computing work
All 3 are @UofCalifornia professors. Home to groundbreaking physicists, including 2 immigrants leading the world in innovation and possibility, California is proud to dream big and deliver.
Ahead of the Paris AI summit, Macron says companies plan to invest €109B in AI projects in France in the coming years and it is France's equivalent to Stargate
French president speaks ahead of AI summit in Paris involving executives such as OpenAI's Sam Altman
The Royal Swedish Academy of Sciences awards the Nobel Prize in Physics to John Hopfield and Geoffrey Hinton for “foundational discoveries” in machine learning
I had not seen them in any of the sweepstakes ahead of the announcement. But if everything is either physics or stamp collecting, we just promoted machine learning to “physics”.
The Nobel in Chemistry goes to David Baker “for computational protein design” and DeepMind's Demis Hassabis and John Jumper “for protein structure prediction”
- Demis Hassabis, John Jumper share half the $1.1 million award — Remainder goes to David Baker for building new proteins
[Thread] Superalignment team co-lead explains why he has left, says OpenAI's safety culture and processes took a backseat to shiny products over the past years
Yesterday was my last day as head of alignment, superalignment lead, and executive @OpenAI.
Over 700 people, including AI experts and executives, sign an open letter calling for more regulation of deepfakes, such as by criminalizing deepfake child porn
‘AI godfather’, others urge more deepfake regulation in open letter, especially as we head into a major election. https://www.reuters.com/...
EU lawmakers are on a tricky knife edge hammering out the AI Act's final shape, as France and Germany push for a regulatory carve-out for foundation models
Studies show that LK-99 is not a superconductor and that impurities, notably copper sulfide, were responsible for the material's superconducting-like behaviors
How science sleuths solved the mystery: replications pieced together the puzzle of why the material displayed superconducting-like behaviours. https://www.nature.com/...
OpenAI and DeepMind executives, Geoffrey Hinton, and 350+ others sign a statement saying “mitigating the risk of extinction from AI should be a global priority”
Leaders from OpenAI, Google Deepmind, Anthropic and other A.I. labs warn that future systems could be as deadly as pandemics and nuclear weapons.
OpenAI debuts GPT-4, claiming the model “surpasses ChatGPT in its advanced reasoning capabilities”, available in ChatGPT Plus and as an API that has a waitlist
Following the research path from GPT, GPT-2, and GPT-3, our deep learning approach leverages more data and more computation …