Plutopia News Network
274 episodes
6 days ago
The Plutopia News Network provides conversation and commentary on news, current events, culture, politics, and weird anomalies. We're all about humans being human!
Society & Culture
News
Sophie Nightingale: Our Minds on Digital Technology
Plutopia News Network
1 hour 2 minutes 7 seconds
6 days ago
The Plutopia podcast hosts Dr. Sophie Nightingale, a psychologist at Lancaster University, to discuss how digital technology — especially social media, generative AI, and the constant flow of online information — shapes human memory, judgment, and vulnerability to deception. She explains that people struggle to critically evaluate the sheer volume of information they encounter, so they are more likely to accept content that aligns with their preexisting beliefs, which helps misinformation spread. Nightingale traces her research from early work on how taking photos can impair memory to current studies showing that most people can spot fake or AI-generated images only slightly better than chance, and that even training improves performance only modestly. She and the hosts dig into the limits of AI "guardrails," the uneven global landscape of AI regulation, the rise of misogynistic online spaces, and the troubling growth of AI-enabled nonconsensual intimate imagery, arguing that legal reform, platform accountability, and public education are all needed to reduce harm.
"One of the things that tends to make people quite susceptible is just information overload — purely that we live in an age where we are accessing so much information all the time that we can't possibly interpret, or critically think about, everything. So we might well just accept things that we wouldn't otherwise. There's quite a lot of evidence showing that's especially the case if that information coincides with your pre-existing beliefs. So for example, if I happen to be a huge fan of Donald Trump, let's say, and I saw some misinformation around Donald Trump that was positive about him, then I would probably be more likely to believe it than somebody who was not already a fan of Donald Trump. So those biases definitely exist — there's a lot of evidence showing that. And I think it also comes back to this: if you want to believe something, you will."