Foresight Institute Radio
Foresight Institute
194 episodes
1 month ago

Foresight Institute Radio features the most cutting-edge talks and seminars from our workshops—fresh insights on advanced AI, nanotech, longevity biotech, and beyond.


See the slides and demos on YouTube, and follow @ForesightInst on X for real-time updates. For polished, in-studio interviews, check out our sister feed: The Existential Hope Podcast


Foresight Institute is an independent nonprofit devoted to steering emerging technologies toward beneficial futures.


Hosted on Acast. See acast.com/privacy for more information.

Technology
Society & Culture
Science
Eliezer Yudkowsky vs Mark Miller | ASI Risks: Similar premises, opposite conclusions
Foresight Institute Radio
4 hours 12 minutes 32 seconds
1 month ago

What are the best strategies for addressing extreme risks from artificial superintelligence? In this 4-hour conversation, decision theorist Eliezer Yudkowsky and computer scientist Mark Miller work through the cruxes of their disagreement.


They examine the future of AI, existential risk, and whether alignment is even possible. Topics include AI risk scenarios, coalition dynamics, secure systems like seL4, hardware exploits like Rowhammer, molecular engineering with AlphaFold, and historical analogies like nuclear arms control. They explore superintelligence governance, multipolar vs singleton futures, and the philosophical challenges of trust, verification, and control in a post-AGI world.


Moderated by Christine Peterson, the discussion searches for the least risky strategy for reaching a preferred end state given the risks of superintelligent AI. Yudkowsky warns of catastrophic outcomes if AGI is not controlled, while Miller advocates decentralizing power and preserving human institutions as AI evolves.


The conversation spans AI collaboration, secure operating frameworks, cryptographic separation, and lessons from nuclear non-proliferation. Despite their differences, both aim for a future where AI benefits humanity without posing existential threats.


Hosted on Acast. See acast.com/privacy for more information.
