We call on all labs to immediately stop the development of AI more powerful than GPT-4 for at least 6 months.
This joint open letter by Encode Justice and the Future of Life Institute calls for the implementation of three concrete US policies in order to address current and future harms of AI.
The abhorrent Ukraine war has the potential to escalate into an all-out NATO-Russia nuclear conflict that would be the greatest catastrophe in human history. More must be done to prevent such escalation.
Given our commitment to do no harm, the global health community has a long history of successful advocacy against inhumane weapons, and the World and American Medical Associations have called for bans on nuclear, chemical and biological weapons. Now, recent advances in artificial intelligence have brought us to the brink of a new arms race in lethal autonomous weapons.
The following statement was read on the floor of the United Nations during the August 2018 CCW meeting, in which […]
Nuclear arms are the only weapons of mass destruction not yet prohibited by an international convention, even though they are the most destructive and indiscriminate weapons ever created. We scientists bear a special responsibility for nuclear weapons, since it was scientists who invented them and discovered that their effects are even more horrific than first thought.
Artificial intelligence (AI) is poised to play an increasing role in military systems. There is an urgent opportunity and necessity for citizens, policymakers, and leaders to distinguish between acceptable and unacceptable uses of AI. In this light, we the undersigned agree that the decision to take a human life should never be delegated to a machine.
As companies building the technologies in Artificial Intelligence and Robotics that may be repurposed to develop autonomous weapons, we feel […]
Click here to view the Autonomous Weapons Open Letter for AI & Robotics Researchers.
Autonomous weapons select and engage targets without human intervention. Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is — practically if not legally — feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.
Click here to view the Research Priorities for Robust and Beneficial AI Open Letter.
An open letter by a team of economists about AI’s future impact on the economy. It includes specific policy suggestions […]
There is now a broad consensus that AI research is progressing steadily, and that its impact on society is likely to increase. The potential benefits are huge, since everything that civilization has to offer is a product of human intelligence. Because of the great potential of AI, it is important to research how to reap its benefits while avoiding potential pitfalls.
Inspired by our Puerto Rico AI conference and open letter, a team of economists and business leaders have now launched […]
The emergence of artificial intelligence (AI) promises dramatic changes in our economic and social structures as well as everyday life […]
The Elders, Future of Life Institute and a diverse range of co-signatories call on decision-makers to urgently address the ongoing impact and escalating risks of the climate crisis, pandemics, nuclear weapons, and ungoverned AI.
At a 2017 FLI conference, AI scientists and researchers developed the highly influential Asilomar AI governance principles. Add your signature.
We call on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4.