Podcast artwork: https://is1-ssl.mzstatic.com/image/thumb/Podcasts211/v4/9e/45/69/9e45696e-2e90-4522-79b5-8975c57ac63b/mza_17743532268733695654.jpeg/600x600bb.jpg
"Upstream" with Erik Torenberg
Erik Torenberg
144 episodes
5 months ago
You’ll hear consequential ideas here first, and in the mainstream months later. Upstream is a curated nexus feed from the Turpentine podcast network, bringing you expert-level conversations hosted by some of the most compelling thinkers in the world, including Noah Smith, Samo Burja, Byrne Hobart, Erik Torenberg, and Nathan Labenz. Guests include Marc Andreessen, Balaji Srinivasan, Dario Amodei, Brian Armstrong, David Sacks, Sam Harris, Katherine Boyle, Curtis Yarvin, and many more, in unmissable conversations.
Technology, News, Tech News
All content for "Upstream" with Erik Torenberg is the property of Erik Torenberg and is served directly from their servers with no modification, redirects, or rehosting. The podcast is not affiliated with or endorsed by Podjoint in any way.
Episode artwork: https://image.simplecastcdn.com/images/ec6382/ec638266-7b9d-436e-b6c3-d127d725c930/218c5c2a-8dc3-4e92-a89c-f2c1c80e5c0c/3000x3000/34073854e84d7bfbab4bc9a9733377f8.png?aid=rss_feed
E137: AI Safety vs Speed: Helen Toner Discusses OpenAI Board Experience, Regulatory Approaches, and Military AI [The Cognitive Revolution]
"Upstream" with Erik Torenberg
1 hour 20 minutes 1 second
6 months ago
This week on Upstream, we’re releasing an episode of The Cognitive Revolution. Nathan Labenz interviews Helen Toner, director at CSET, about her experiences with OpenAI, the concept of adaptation buffers for AI integration, and AI's role in military decision-making. They discuss the implications of AI development, the need for regulatory policies, and the geopolitical dynamics of AI competition with China.

📰 Be notified early when Turpentine drops new publications: https://www.turpentine.co/exclusiveaccess

RECOMMENDED PODCASTS:
🎙️ The Cognitive Revolution: a podcast about AI in which hosts Nathan Labenz and Erik Torenberg interview the builders on the edge of AI and explore the dramatic shift it will unlock over the coming decades.
Spotify: https://open.spotify.com/show/6yHyok3M3BjqzR0VB5MSyk?si=7357ec31ac424043&nd=1&dlsi=060a53f1d7be47ad
Apple: https://podcasts.apple.com/us/podcast/the-cognitive-revolution-ai-builders-researchers-and/id1669813431

SPONSORS:
☁️ Oracle Cloud Infrastructure (OCI) is a single platform for your infrastructure, database, application development, and AI needs. OCI has four to eight times the bandwidth of other clouds and offers one consistent price. Oracle is offering to cut your cloud bill in half. See if your company qualifies at https://oracle.com/turpentine
🕵️‍♂️ Take your personal data back with Incogni! Use code UPSTREAM at the link below and get 60% off an annual plan: https://incogni.com/upstream
💥 Head to Squad to access global engineering without the headache and at a fraction of the cost: go to https://choosesquad.com/ and mention “Turpentine” to skip the waitlist.

LINKS:
Helen Toner's appearance on the TED AI Show: https://www.ted.com/talks/the_ted_ai_show_what_really_went_down_at_openai_and_the_future_of_regulation_w_helen_toner
Helen Toner's Substack: https://helentoner.substack.com/
Additional recommended reads:
https://helentoner.substack.com/p/nonproliferation-is-the-wrong-approach
https://cset.georgetown.edu/publication/ai-for-military-decision-making/
https://www.lawfaremedia.org/article/ai-regulation-s-champions-can-seize-common-ground-or-be-swept-aside
https://www.economist.com/by-invitation/2024/05/26/ai-firms-mustnt-govern-themselves-say-ex-members-of-openais-board

X / TWITTER:
@hlntnr
@labenz
@eriktorenberg
@turpentinemedia

HIGHLIGHTS FROM THE EPISODE:
Helen Toner joined OpenAI's board in 2021, bringing AI policy expertise when AGI discussions were still uncommon.
She confirms that rumors about QStar contributing to the board's decision to fire Sam Altman were completely false.
Helen observes contradictions at OpenAI: safety-focused research papers alongside aggressive policy positions.
For AI whistleblowers, she recommends clear disclosure standards rather than vague reporting guidelines.
Helen introduced the concept of "adaptation buffers," noting that while frontier AI development gets more expensive, capabilities become cheaper to replicate once achieved.
Rather than focusing on non-proliferation, Helen advocates using adaptation time to build societal resilience (like improving outbreak detection).
She favors conditional slowdowns (based on risk mitigation) rather than arbitrary pauses or compute limits.
For military AI applications, Helen's research identifies three key considerations: scope (how tightly bound the system is), data quality, and human-machine interaction design.
Helen expresses skepticism about "AI war simulations," arguing that military contexts have too many unknowns to be modeled like games.
She suggests the shift in AI CEOs' rhetoric about China competition is "the path of least resistance" for arguing against regulation.
Helen acknowledges the difficulty of reaching a stable international equilibrium around AI development, given the many unknowns about what superintelligence would mean for political systems.
"Upstream" with Erik Torenberg
You’ll hear consequential ideas here first, and in the mainstream months later. Upstream is a curated nexus feed from the Turpentine podcast network, bringing you expert-level conversations hosted by some of the most compelling thinkers in the world including Noah Smith, Samo Burja, Byrne Hobart, Erik Torenberg, and Nathan Labenz. Guests include Marc Andreessen, Balaji Srinivasan, Dario Amodei, Brian Armstrong, David Sacks, Sam Harris, Katherine Boyle, Curtis Yarvin, and many more unmissable conversations.