[Cover image: Earth with an ominous red glow]

Praise for If Anyone Builds It, Everyone Dies

"The most important book of the decade."
— Max Tegmark, Professor of Physics, MIT

"The current level of policy discourse over artificial general intelligence (AGI) is dangerously low. If AGI leads to human annihilation — and the authors make a compelling case it almost certainly will — then the imagined benefits of building AGI mean nothing. The incentives for major companies to keep pushing ahead to build superintelligence will win the day unless governments around the world recognize the risks and start to take collective and effective action."
— Jon Wolfsthal, former Special Assistant to the President for National Security Affairs; former Senior Director, White House, National Security Council

"A clearly written and compelling account of the existential risks that highly advanced AI could pose to humanity. Recommended."
— Ben Bernanke, Nobel laureate and former chairman of the Federal Reserve

"This book offers brilliant insights into history's most consequential standoff between technological utopia and dystopia, and shows how we can and should prevent superhuman AI from killing us all. Yudkowsky and Soares's memorable storytelling about past disaster precedents (e.g., the inventor of two environmental nightmares: tetra-ethyl-lead gasoline and Freon) highlights why top thinkers so often don't see the catastrophes they create."
— George Church, Founding Core Faculty, Synthetic Biology, Wyss Institute at Harvard University

"A sober but highly readable book on the very real risks of AI. Both skeptics and believers need to understand the authors' arguments, and work to ensure that our AI future is more beneficial than harmful."
— Bruce Schneier, leading computer security expert and Lecturer, Harvard Kennedy School

"Everyone should read this book."
— Daniel Kokotajlo, OpenAI whistleblower and lead author, AI 2027

"The authors raise an incredibly serious issue that merits — really demands — our attention. You don't have to agree with the prediction or prescriptions in this book, nor do you have to be tech or AI savvy, to find it fascinating, accessible, and thought-provoking."
— Suzanne Spaulding, former Under Secretary for the Department of Homeland Security

"If Anyone Builds It, Everyone Dies isn't just a wake-up call; it's a fire alarm ringing with clarity and urgency. Yudkowsky and Soares pull no punches: unchecked superhuman AI poses an existential threat. It's a sobering reminder that humanity's future depends on what we do right now."
— Mark Ruffalo, actor

"If Anyone Builds It, Everyone Dies may prove to be the most important book of our time. Yudkowsky and Soares believe we are nowhere near ready to make the transition to superintelligence safely, leaving us on the fast track to extinction. Through the use of parables and crystal-clear explainers, they convey their reasoning, in an urgent plea for us to save ourselves while we still can."
— Tim Urban, writer, Wait But Why

"Humans are lucky to have Nate Soares and Eliezer Yudkowsky because they can actually write. As in, you will feel actual emotions when you read this book. We are currently living in the last period of history where we are the dominant species. We have a brief window of time to make decisions about our future in light of this fact. Sometimes I get distracted and forget about this reality, until I bump into the work of these folks and am re-reminded that I am being a fool to dedicate my life to anything besides this question."
— Grimes, artist and musician

"This is our warning. Read today. Circulate tomorrow. Demand the guardrails. I'll keep betting on humanity, but first we must wake up."
— R.P. Eddy, former Director, White House, National Security Council

"This is the best no-nonsense, simple explanation of the AI risk problem I've ever read."
— Yishan Wong, former CEO of Reddit