The Deeper Thinking Podcast
https://thedeeperthinkingpodcast.podbean.com/
Who is Leading? Who is Learning?: AI at Work - The Deeper Thinking Podcast
22 minutes 45 seconds
Who is Leading, Who is Learning?: AI at Work
A new report from MIT has sent shockwaves through the enterprise AI world. According to the State of AI in Business 2025 study, 95% of generative AI pilots deliver zero return on investment.
#ArtificialIntelligence #MultimodalAI #ExplainableAI #PhilosophyOfTechnology #DigitalEthics #NarrativeStructures
What if the real question of AI was not how powerful it becomes, but what kind of story it tells? This episode frames artificial intelligence as a narrative force—less a technological object and more a co-author of contemporary meaning. From the growing unease around generative AI to the quiet revolutions in healthcare and governance, we explore how intelligence is escaping the lab and inhabiting our daily institutions, expectations, and moral architectures.
We move through philosophical tensions: the trade-off between efficiency and autonomy, the ethical opacity of explainable AI, and the metaphysics of machines that now see, speak, and learn. Drawing on thinkers like Gilbert Simondon, Hannah Arendt, and Bruno Latour, the episode unpacks the architecture of AI not as a technical challenge, but as a civic, cultural, and ontological one.
The aim is not to simplify the story of AI—but to listen more carefully to it. What are its rhythms, its blind spots, its unspoken philosophies? And how might we design with care rather than control?
Reflections
AI is not just a tool—it is a theory of how cognition ought to behave.
Efficiency is not a neutral value; it reshapes institutions and identities.
Machines that perceive change the ethical demand we place on design.
The opacity of AI is not just technical—it is philosophical.
Smaller models challenge our assumptions about scale and significance.
To understand AI is to understand what it means to delegate judgment.
Governance without interpretability is not governance—it is abdication.
Multimodal AI simulates perception, but what does it mean to simulate care?
The future of intelligence is less about code and more about character.
Why Listen?
Understand the philosophical tensions behind AI development and deployment
Explore how narrative, care, and institutional design shape AI's societal role
Engage with the ethical implications of autonomous systems and machine ethics
Reconsider AI as an unfolding civic actor rather than a technical artifact
Listen On:
YouTube
Spotify
Apple Podcasts
Support This Work
If this episode deepened your perspective, you can support the project here: Buy Me a Coffee
Further Reading
Gilbert Simondon, On the Mode of Existence of Technical Objects
Bruno Latour, We Have Never Been Modern
Hannah Arendt, The Human Condition
Explainable Artificial Intelligence
The future will not be decided by machines alone. It will be shaped by the structures we choose to trust—and the rhythms we choose to listen for.
#TheDeeperThinkingPodcast #ArtificialIntelligence #EthicsOfTechnology #PhilosophyOfAI #DigitalHumanism #NarrativeAI #InstitutionalDesign #CivicArchitecture #Simondon #Latour #Arendt #FutureOfWork #TechEthics #AIInSociety #Explainability #Governance