Statement on AI Risk
Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.
When the statement was released on May 30, 2023, hundreds of artificial intelligence experts and other notable figures signed it.
The signatories included more than 100 professors, among them Turing laureates Geoffrey Hinton and Yoshua Bengio, two of the most-cited computer scientists, as well as scientific and executive leaders of several major AI companies and experts in pandemics, climate, nuclear disarmament, philosophy, the social sciences, and other fields.
Media coverage has emphasized the signatures of several tech leaders, including Sam Altman, Dario Amodei, Ilya Sutskever, and Bill Gates, as well as more than 600 top executives and thousands of computer scientists and engineers.
Other Important Open Letters
Artists Call for Gov. Newsom to Sign SB 1047
More than 125 household names from Hollywood, including actors, directors, producers, music artists, and entertainment industry leaders, have added their names to a letter urging Gov. Gavin Newsom to sign a bill that would require developers of advanced artificial intelligence models to have safety measures in place to prevent catastrophes.
Pause Giant AI Experiments: An Open Letter
We call on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4.
Autonomous Weapons Open Letter: AI & Robotics Researchers
Autonomous weapons select and engage targets without human intervention. Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is — practically if not legally — feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.
Interviews and Talks
Industry Leaders and Notable Public Figures
Explainers
Learn about the issue through some of the best explainers available.