
All Praise for If Anyone Builds It, Everyone Dies

Science and Academia

"The most important book of the decade."
— Max Tegmark, Professor of Physics, MIT

"A clearly written and compelling account of the existential risks that highly advanced AI could pose to humanity. Recommended."
— Ben Bernanke, Nobel laureate and former chairman of the Federal Reserve

"Essential reading for policymakers, journalists, researchers, and the general public. A masterfully written and groundbreaking text, If Anyone Builds It, Everyone Dies provides an important starting point for discussing AI at all levels."
— Bart Selman, Professor of Computer Science, Cornell University

"I'd like everyone on earth who cares about the future to read this book, debate its ideas, and have its thesis in mind when they're discussing AI."
— Scott Aaronson, Schlumberger Centennial Chair of Computer Science, UT Austin

"This book offers brilliant insights into history's most consequential standoff between technological utopia and dystopia, and shows how we can and should prevent superhuman AI from killing us all. Yudkowsky and Soares's memorable storytelling about past disaster precedents (e.g., the inventor of two environmental nightmares: tetraethyl lead gasoline and Freon) highlights why top thinkers so often don't see the catastrophes they create."
— George Church, Founding Core Faculty, Synthetic Biology, Wyss Institute at Harvard University

"Everyone should read this book."
— Daniel Kokotajlo, OpenAI whistleblower and lead author, AI 2027

"A shocking book that captures the insanity and hubris of efforts to create thinking machines that could kill us all. But it's not over yet. As the authors insist: 'where there's life, there's hope.'"
— Dorothy Sue Cobble, Distinguished Professor Emerita, Labor Studies, Rutgers University

"Claims about the risks of AI are often dismissed as advertising, intended to sell more gadgets. It would be comforting if that were true, but this book disproves it. Yudkowsky and Soares are not from the AI industry, and have been writing about these risks since before it existed in its present form. Read their disturbing book and tell us what they get wrong."
— Huw Price, Bertrand Russell Professor Emeritus of Philosophy, Trinity College, Cambridge

National Security

"The authors raise an incredibly serious issue that merits — really demands — our attention. You don't have to agree with the prediction or prescriptions in this book, nor do you have to be tech or AI savvy, to find it fascinating, accessible, and thought-provoking."
— Suzanne Spaulding, former Under Secretary for the Department of Homeland Security

"The current level of policy discourse over artificial general intelligence (AGI) is dangerously low. If AGI leads to human annihilation — and the authors make a compelling case it almost certainly will — then the imagined benefits of building AGI mean nothing. The incentives for major companies to keep pushing ahead to build superintelligence will win the day unless governments around the world recognize the risks and start to take collective and effective action."
— Jon Wolfsthal, former Special Assistant to the President for National Security Affairs; former Senior Director, White House, National Security Council

"This is our warning. Read today. Circulate tomorrow. Demand the guardrails. I'll keep betting on humanity, but first we must wake up."
— R.P. Eddy, former Director, White House, National Security Council

"While I'm skeptical that the current trajectory of AI development will lead to human extinction, I acknowledge that this view may reflect a failure of imagination on my part. Regardless of where one stands on Yudkowsky and Soares's central argument, this book makes a valuable contribution by reminding us, once again, that every transformative technology in history has carried both promise and peril. Given AI's exponential pace of change, there's no better time to take prudent steps to guard against worst-case outcomes. The authors offer important proposals for global guardrails and risk mitigation that deserve serious consideration."
— Lieutenant General John N.T. "Jack" Shanahan (USAF, Ret.), Inaugural Director, Department of Defense Joint AI Center

"An Artificial Intelligence war cannot be won and must never be fought."
— Brooke Taylor, CEO, Defending Our Country, LLC

Other Endorsements

"The most important book I've read for years: I want to bring it to every political and corporate leader in the world and stand over them until they've read it. Yudkowsky and Soares, who have studied AI and its possible trajectories for decades, sound a loud trumpet call to humanity to awaken us as we sleepwalk into disaster. Their brilliant gift for analogy, metaphor and parable clarifies for the general reader the tangled complexities of AI engineering, cognition and neuroscience better than any book on the subject I've ever read, and I've waded through scores of them. We really must rub our eyes and wake the **** up!"
— Stephen Fry, actor and writer

"A sober but highly readable book on the very real risks of AI. Both skeptics and believers need to understand the authors' arguments, and work to ensure that our AI future is more beneficial than harmful."
— Bruce Schneier, leading computer security expert and Lecturer, Harvard Kennedy School

"If Anyone Builds It, Everyone Dies may prove to be the most important book of our time. Yudkowsky and Soares believe we are nowhere near ready to make the transition to superintelligence safely, leaving us on the fast track to extinction. Through the use of parables and crystal-clear explainers, they convey their reasoning, in an urgent plea for us to save ourselves while we still can."
— Tim Urban, writer, Wait But Why

"This is the best no-nonsense, simple explanation of the AI risk problem I've ever read."
— Yishan Wong, former CEO of Reddit

"Soares and Yudkowsky lay out, in plain and easy-to-follow terms, why the current path we are on to build ever-more-powerful AIs is extremely dangerous."
— Emmett Shear, former interim CEO of OpenAI

"Humans are lucky to have Nate Soares and Eliezer Yudkowsky because they can actually write. As in, you will feel actual emotions when you read this book. We are currently living in the last period of history where we are the dominant species. We have a brief window of time to make decisions about our future in light of this fact. Sometimes I get distracted and forget about this reality, until I bump into the work of these folks and am re-reminded that I am being a fool to dedicate my life to anything besides this question."
— Grimes, artist and musician

"AGI could be here in as little as a few years. This is one of the few books that takes the implications seriously and explains what could be in store without mincing words."
— Scott Alexander, writer, Astral Codex Ten

"A.I. is coming, whether we want it or not. It's too late to stop it, but not too late to keep this handy survival guide close and start demanding real guardrails before the edges start to fray."
— Patton Oswalt, actor

"If Anyone Builds It, Everyone Dies isn't just a wake-up call; it's a fire alarm ringing with clarity and urgency. Yudkowsky and Soares pull no punches: unchecked superhuman AI poses an existential threat. It's a sobering reminder that humanity's future depends on what we do right now."
— Mark Ruffalo, actor