If Anyone Builds It, Everyone Dies
Authors: | Soares, Nate; Yudkowsky, Eliezer |
ISBN: | 9781847928924 |
Language: | English |
Format: | Hardcover |
Publisher: | Vintage Books UK |
Published: | 02.10.2025 |
Subtitle: | The Threat to Humanity of Superintelligent AI |
Keywords: | 21st century (c 2000 to c 2100); Artificial Intelligence; COMPUTERS / Artificial Intelligence / General; Impact of science and technology on society; SOCIAL SCIENCE / Technology Studies; TECHNOLOGY & ENGINEERING / Social Aspects; Technology: general issues |
Eliezer Yudkowsky is the co-founder of the Machine Intelligence Research Institute (MIRI) and the founder of the field of AI alignment research. He is one of the most influential thinkers and writers on the topic of AI risk, and his 2023 TIME magazine op-ed is largely responsible for sparking the current concern and discussion around the potential for human extinction from AI.

Nate Soares is the president of MIRI and one of its most senior researchers. He has been working in the field of AI alignment for over a decade, following previous experience at Microsoft and Google.