If Anyone Builds It, Everyone Dies

The Threat to Humanity of Superintelligent AI

About Eliezer Yudkowsky

Eliezer Yudkowsky is the co-founder of the Machine Intelligence Research Institute (MIRI) and the founder of the field of AI alignment research. He is one of the most influential thinkers and writers on AI risk, and his 2023 TIME magazine op-ed is widely credited with sparking the current concern and discussion around the potential for human extinction from AI.
Details
  • Imprint: Bodley Head
  • ISBN: 9781847928924
  • Length: 304 pages
  • Price: £22.00