ADAPT Radio
The ADAPT Centre
80 episodes
3 days ago
When your AI agent books a rental car, it needs your driver's license, your credit card, access to your calendar, and permission to message your contacts, creating what Meredith Whittaker calls a "fundamental backdoor" that threatens apps like Signal. At ADAPT ADVANCE 2025, Signal Foundation President and AI Now Institute co-founder Meredith Whittaker joined Dr Abeba Birhane for a fireside chat dissecting why "bigger is better" serves hyperscaler monopolies, not the evidence. They also discuss how AI companions turn the manipulation psychology first seen in the 1970s Eliza chatbot against minors, why "open source AI" became marketing arbitrage exploiting software community goodwill, and what sovereign AI actually requires beyond anxiety signifiers: democratic governance, trusted local data, and an answer to the question "who owns the deployment infrastructure?"

THINGS WE SPOKE ABOUT
* The "bigger is better" AI myth protects hyperscaler monopolies, not users
* Agentic AI demands sweeping permissions, creating existential privacy backdoor threats
* AI companions weaponise known psychological manipulation tactics against vulnerable minors
* "Open source AI" exploits software community goodwill without delivering its benefits
* Sovereign AI requires democratic governance beyond today's geopolitical anxiety signalling

GUEST DETAILS
Meredith Whittaker is President of the Signal Foundation and co-founder of the AI Now Institute, and one of the most trusted voices in AI ethics, transparency and accountability. Her decade of work has profoundly shaped ethical AI frameworks, with impact spanning academia and industry. At Google, Meredith was a core organiser of the 2018 Google Walkouts, in which over 20,000 employees protested military AI use (Project Maven), surveillance, and sexual misconduct, pressuring Google to discontinue its military contract and oust implicated VPs. As co-founder of the AI Now Institute, her research cuts through AI hype and grounds discussion in what truly matters: power concentration, labour exploitation in AI pipelines, and the protection of fundamental rights, including privacy and the rule of law. Her work exposes corporate capture, debunks "bigger is better" myths, reveals sustainability costs, and provides foundational open source research. Meredith has testified before the US Congress and leads Signal, one of the most trusted privacy-friendly messaging apps. Her background building large-scale network measurement systems at Google gives her unique expertise in data quality, the manipulation of evaluation criteria, and how benchmark gaming serves hyperscaler interests over real-world effectiveness.

Dr Abeba Birhane is founder and director of the AI Accountability Lab at Trinity College Dublin. Her groundbreaking research examines AI datasets, uncovering how larger datasets contain higher rates of hateful content and pornography, debunking the assumption that "bigger dissipates problems". Her work on benchmarks and measurement demonstrates that purpose-built smaller models, given appropriate contextual data, often outperform larger models in real-world contexts.

Connect with the guests:
* Signal Foundation: signal.org
* AI Now Institute: ainowinstitute.org
* AI Accountability Lab: contact through the ADAPT Centre
* Follow their research and writing on AI accountability

MORE INFORMATION
You can learn more about the Sea-Scan project and other cutting-edge research at Trinity College Dublin's ADAPT Centre here: www.adaptcentre.ie/

ADAPT Radio is produced by DustPod.io for the ADAPT Centre. For more information about ADAPT's groundbreaking AI and data analytics research, visit www.adaptcentre.ie/

KEYWORDS
#TrustedAI #AIaccountability #AIprivacy #AIgovernance #MeredithWhittaker
Technology
All content for ADAPT Radio is the property of The ADAPT Centre and is served directly from their servers with no modification, redirects, or rehosting. The podcast is not affiliated with or endorsed by Podjoint in any way.
Vacant No More: AI and the Future of Irish Buildings
ADAPT Radio
36 minutes 15 seconds
2 months ago
Ireland is facing a housing crisis, and yet countless buildings sit empty and unused. In this episode of ADAPT Radio, Dr Clare O'Connell speaks with Dr Philip Crowe and Milo Dennehy from University College Dublin about a bold, AI-driven project to map and tackle building vacancy across the country. Discover why data on empty buildings is so patchy, how new tech could unlock hidden opportunities for homes and communities, and why solving this puzzle means more than just better spreadsheets. From town centre revitalisation to the surprising power of open data, this conversation explores how AI might help Ireland turn vacancy into vibrant possibility.

THINGS WE SPOKE ABOUT
* Why Ireland's empty buildings are so hard to track
* How AI and data science are changing the vacancy game
* The challenges, and surprises, of sharing property data
* What Ireland can learn from France, Philadelphia, and beyond
* Transforming vacant spaces into vibrant communities

GUEST DETAILS
Dr Philip Crowe is UCD Assistant Professor for Climate Responsive Design at the School of Architecture, Planning and Environmental Policy (APEP) and the School of Civil Engineering. He is Director of Research at UCD APEP, co-director of the UCD Centre for Irish Towns, and Programme Director of the MSc in Architecture, Urbanism and Climate Action. Philip teaches in areas relating to carbon management, sustainability, urban resilience and urban ethics. As a researcher, he works on a range of EU and nationally funded projects relating to town revitalisation, vacancy and adaptive reuse, compact urban growth, and citizen participation in processes of change. Philip's background is in architecture, and he was previously Director of Sustainable Design at M.CO (Dublin) from 2003 to 2012.

Milo Dennehy is a Research Assistant in the School of Computer Science and a current student in the BSc City Planning and Environmental Policy programme, set to graduate in 2025. Before joining Building Stories, Milo was a research assistant in the School of Architecture, Planning and Environmental Policy at UCD. His research interests lie in building geospatial tools that leverage machine learning and remote sensing techniques to enhance capacity and reduce the barriers local actors face in accessing geospatial technology.

MORE INFORMATION
You can learn more about AI Literacy in the Classroom here: https://ai-literacy-in-the-classroom.adaptcentre.ie/

ADAPT Radio is produced by DustPod.io for the ADAPT Centre. For more information about ADAPT, visit www.adaptcentre.ie/

QUOTES
"We don't actually know what we're dealing with. How do you manage your built environment without reliable, dynamic data?" – Dr Philip Crowe
"It shocked me how much decision-making still relies on throwing darts at the map and manual guesswork." – Milo Dennehy
"If you aggregate all these data sets together, you've got something incredibly powerful to solve the housing crisis." – Dr Philip Crowe
"AI lets us see what's missing in the data, and predict what's possible for Ireland's empty buildings." – Milo Dennehy
"The real challenge isn't technical. It's about building trust and a culture of sharing data for the public good." – Dr Philip Crowe

KEYWORDS
#AI #VacantBuildings #IrelandHousing #DataScience #UrbanRevitalisation #SmartCities #OpenData #HousingCrisis #PropertyData #CommunityImpact #TownCentres #DataGovernance #SustainableCities #Innovation #TechForGood