Consistently Candid
Sarah Hastings-Woodhouse
19 episodes
2 months ago
In this episode, I chatted with Frances Lorenz, events associate at the Centre for Effective Altruism. We covered our respective paths into AI safety, the emotional impact of learning about x-risk, what it's like to be female in a male-dominated community and more! Follow Frances on Twitter. Subscribe to her Substack. Apply for EAG London!
Technology, Society & Culture, Philosophy
#18 Nathan Labenz on reinforcement learning, reasoning models, emergent misalignment & more
Consistently Candid
1 hour 46 minutes
8 months ago
A lot has happened in AI since the last time I spoke to Nathan Labenz of The Cognitive Revolution, so I invited him back on for a whistlestop tour of the most important developments we've seen over the last year! We covered reasoning models, DeepSeek, the many spooky alignment failures we've observed in the last few months & much more! Follow Nathan on Twitter. Listen to The Cognitive Revolution. My Twitter & Substack.