WILLIAM MACASKILL: On “longtermism” and moral responsibility



Our existential risk – the probability that we wipe ourselves out in the next 100 years through AI, bio-engineering, nuclear war, climate change and the like – has been estimated at 1 in 6. Let that sink in! Would you get on a plane if there was a 17% chance it would crash? Would you do everything you could to prevent a calamity if you were presented with those odds?

My chat today covers a wild idea that could – and should – better our chances of existing as a species… and lead to a human flourishing I struggle to even imagine. Longtermism argues that prioritising the long-term future of humanity has exponential ethical and existential boons. On the flipside, if we don’t choose the longtermist route, the repercussions are, well, devastating.

Will MacAskill is one of the world’s leading moral philosophers, and I travel to Oxford, UK – where he runs the Centre for Effective Altruism, the Global Priorities Institute and the Forethought Foundation – to talk through these massive moral issues. Will also explains why right now is the most important time in humanity’s history: our generation uniquely has the power, and the responsibility, to determine which of two diametrically different paths humanity takes. This excites me; I hope it does you, too.

Learn more about Will MacAskill’s work

Purchase his new book, What We Owe the Future: A Million-Year View

If you’d like to know a bit more about me, head to my "about" page.

Subscribe to my Substack newsletter for more conversations like this one.

Get your copy of my book, This One Wild and Precious Life

Let’s connect on Instagram! It’s where I interact the most.

Hosted on Acast. See acast.com/privacy for more information.
