Journal of Futures Studies, Vol. 29 No. 2, December 2024

    The Precipice: Existential Risk and the Future of Humanity (London: Bloomsbury, 2020), by Toby Ord

    Review


    B.V.E. Hyde

    Department of Philosophy, Durham University

    Abstract

This is a review of The Precipice (London: Bloomsbury, 2020) by Toby Ord.

    Keywords

    Longtermism, Effective Altruism, Existential Risk, Extinction, Artificial Intelligence

    Man’s complacent assumption of the future is too confident. We think, because things have been easy for mankind as a whole for a generation or so, we are going on to perfect comfort and security in the future. We think that we shall always go to work at ten and leave off at four, and have dinner at seven for ever and ever. But these four suggestions, out of a host of others, must surely do a little against this complacency. Even now, for all we can tell, the coming terror may be crouching for its spring and the fall of humanity be at hand. In the case of every other predominant animal the world has ever seen, I repeat, the hour of its complete ascendency has been the eve of its entire overthrow.

— H. G. Wells, “The Extinction of Man”, 1894

    Humanity is now on the precipice of extinction. According to Toby Ord, senior research fellow at the University of Oxford’s Future of Humanity Institute, there is a 1 in 6 chance that civilization will come to an end in the next century. Throughout the twentieth century, that chance was 1 in 100.

The chance of an ‘existential catastrophe’, in which intelligent life is completely annihilated, is called ‘existential risk’ and, as mankind advances technologically, it has been increasing. We are now at a uniquely dangerous period in history, characterized by unprecedented destructive capability with neither an understanding of the danger nor the global unity needed to do anything about it. Ord calls this period, which began with the development of the first atomic bomb in 1945, the ‘Precipice’, and it cannot last more than a few centuries: either we develop the policies needed to reduce existential risk, or humanity ends before we manage to.

What will cause the extinction of mankind? An asteroid, like the one that caused the mass extinction of all non-avian dinosaurs 66 million years ago? A supervolcanic eruption? Unlikely: by Ord’s estimation, all natural risks together amount to only about a 1 in 10,000 chance of existential catastrophe per century. The existential risk associated with nuclear war, however, is 1 in 1,000, ten times higher than all natural risks put together, and the risk of extinction by climate change is likewise 1 in 1,000. Much worse is the threat posed by engineered pandemics, which Ord gives a 1 in 30 chance of ending the world, and most dangerous of all is artificial intelligence unaligned with human values, whose existential risk he estimates at 1 in 10 – a figure doubtless influenced by Nick Bostrom, who also judged artificial intelligence a serious threat to human existence in Superintelligence: Paths, Dangers, Strategies (Oxford: Oxford University Press, 2014). Bostrom’s book, along with others like Our Final Invention (New York: Thomas Dunne, 2013) and Human Compatible (New York: Viking Press, 2019), brought concerns about existential risk from artificial general intelligence to public attention, and it is now commonplace to see public figures like Elon Musk and Bill Gates express concern about it. Although some have been skeptical of the risk it poses – like Michio Kaku in Physics of the Future (New York: Doubleday, 2011) – most recent books have not been, and several hypothetical takeover scenarios have been mapped out, such as in Life 3.0 (New York: Vintage Books, 2017) by Max Tegmark.

The threat of artificial intelligence is more real than ever today, but it is perhaps no longer the biggest threat to human survival. Ord’s figures are already somewhat outdated: with the Russia-Ukraine War, which broke out after Ord wrote his book, the chance of nuclear warfare certainly ought to be estimated much higher. In 2023 the Doomsday Clock was set at ninety seconds to midnight, the closest to global catastrophe it has ever been, with the war between Russia and Ukraine and the threat of nuclear warfare cited among the reasons. Either way, what Ord’s estimates mean is that the 1 in 6 chance that the world will end this next century is, pretty much entirely, manmade.
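To see how the per-cause figures quoted above relate to Ord’s headline 1 in 6, one can naively combine them as independent risks. The sketch below does exactly that; it is illustrative only, since Ord’s own total is a holistic judgment covering risks beyond those itemized in this review, which is why the naive combination comes out somewhat below 1 in 6.

```python
# Naively combine the per-century risk estimates quoted in the review,
# treating the risks as independent. This is an illustrative simplification,
# not Ord's own method of arriving at his 1-in-6 total.
risks = {
    "all natural risks combined": 1 / 10_000,
    "nuclear war": 1 / 1_000,
    "climate change": 1 / 1_000,
    "engineered pandemics": 1 / 30,
    "unaligned artificial intelligence": 1 / 10,
}

# P(at least one catastrophe) = 1 - product of (1 - p_i) over all risks
p_none = 1.0
for p in risks.values():
    p_none *= 1 - p
p_any = 1 - p_none

print(f"Combined per-century risk: {p_any:.1%}, about 1 in {1 / p_any:.1f}")
# Combined per-century risk: 13.2%, about 1 in 7.6
```

Even this partial tally shows the risk budget dominated by the two anthropogenic threats, unaligned AI and engineered pandemics, which is the review’s point: the danger is almost entirely manmade.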

The bright side is that this means we can do something about it. Ideally, humanity needs to come together as a coherent agent, take responsibility for its future, and make some strategic choices with the longterm future in mind. At present, less than 0.001% of gross world product is spent on targeted existential risk reduction. The Biological Weapons Convention, for example, the global body founded to reduce the risk of accidental or deliberate releases of biological agents which, recall, have a 1 in 30 chance of extinguishing humanity, has an annual budget of $1.4 million, smaller than that of the average McDonald’s restaurant. Motivation to fund existential risk mitigation is limited not only by ignorance of the dangers at hand but also by insufficient global coordination.

Ord does not suggest that the survival of humanity is only a global policy issue: we can all play a role in safeguarding humanity’s future, he thinks. Two of the most important ways individuals can change the world are through their careers and their charitable donations. 80,000 Hours, a non-profit organization that is part of the Centre for Effective Altruism at the University of Oxford, conducts research on which careers have the largest positive social impact and provides career advice based on that research. Giving What We Can, another Oxford-based effective altruism organization, set up by Ord and William MacAskill in 2009, is a collective of individuals committed to donating at least 10% of their income to the most effective charities, including those with a longtermist agenda. Ord also encourages public discourse about humanity’s longterm future which, he would be right to think, is essential to an international, intergovernmental, unified response to existential risks.

One might, of course, be indifferent to the potential extinction of humanity. This attitude is perhaps most common among people of advanced age, or those who for other reasons expect that an existential catastrophe would only transpire after their death, and who therefore question the relevance of the longterm future to them. This is where ‘longtermism’ comes in as an ethical position. The term was coined by Toby Ord and William MacAskill to refer to their view that positively influencing the longterm future is a key moral priority of our time. It was first popularized by Ord in The Precipice, but What We Owe the Future (London: Oneworld, 2022) by MacAskill has in the end been more influential. According to MacAskill, “distance in time is like distance in space”: your moral circle is big enough to donate to charities helping people across the world, so why not to people in the future? What makes MacAskill and other effective altruists care so much about future people is that there are so many of them. You might disagree with them here, but for effective altruists, numbers count: effective altruism is a utilitarian movement, looking to do the greatest good for the greatest number, and it is completely indiscriminate in this. Because the future is so large, and therefore so populous, “the early extinction of the human race would be a truly enormous tragedy”, says MacAskill. This is also Ord’s view: he thinks that existential catastrophe would betray the efforts of our ancestors, bring great harm upon those in whose lifetimes the end of the world comes about, and destroy the possibility of a vast future filled with human flourishing. “Longtermism”, he says, “is animated by a moral re-orientation toward the vast future that existential risks threaten to foreclose”.

The Precipice is yet another addition to the rapidly growing effective altruist – and, by extension, longtermist – movement. But whether or not you agree with Ord’s estimates, or with what he proposes we do about them, it is difficult to remain indifferent to the existential risks he outlines in the book. If he has succeeded in one thing, it is drawing the attention of the human race to the risks it faces – risks that it has created and, crucially, can mitigate and even eradicate.

    References

    Barrat, James. 2013. Our Final Invention. New York: Thomas Dunne.

    Bostrom, Nick. 2014. Superintelligence. Oxford: Oxford University Press.

    Kaku, Michio. 2011. Physics of the Future. New York: Doubleday.

    MacAskill, William. 2022. What We Owe the Future. London: Oneworld.

    Ord, Toby. 2020. The Precipice. London: Bloomsbury.

    Russell, Stuart. 2019. Human Compatible. New York: Viking Press.

    Tegmark, Max. 2017. Life 3.0. New York: Vintage Books.


The Journal of Futures Studies
Graduate Institute of Futures Studies, Tamkang University
Taipei, Taiwan 251
Tel: 886 2-2621-5656 ext. 3001 | Fax: 886 2-2629-6440
ISSN 1027-6084
