CARE Failing Forward
Emily Janoch
100 episodes
1 month ago
CARE staff and other guests around the world talk about what we learn from failure, ways to create safe spaces to talk about failure, and how we use those lessons to get better at our work.
Non-Profit, Business, Society & Culture
RSS
All content for CARE Failing Forward is the property of Emily Janoch and is served directly from their servers with no modification, redirects, or rehosting. The podcast is not affiliated with or endorsed by Podjoint in any way.
We Built a Women-Centered GPT. It Flopped – and Taught Us Everything
CARE Failing Forward
26 minutes 11 seconds
1 month ago
What happens when you try to build an AI tool that works for women entrepreneurs – and it totally flops? In this episode of Failing Forward, CARE’s Koheun Lee and Sarah Hewitt share the story of their ambitious attempt to create a women-centered GPT trained on real-world data from women entrepreneurs. Spoiler: it didn’t go as planned. But the failure revealed a lot.

In this episode, Koheun and Sarah discuss:

- Why CARE built a custom GPT to fight bias, and what went wrong
- How even well-trained AI tools can reinforce stereotypes and exclude women
- What we learned about prompt design, user behavior, and the limits of scrappy innovation
- Why most users still defaulted to mainstream AI tools
- Actionable tips for using AI more intentionally, and with less bias
- What this "failure" taught us about building better tools and better teams

Tune in for a candid conversation about tech, bias, and what it really means to learn in public.

To learn more and join the conversation, visit the Women’s Entrepreneurship LinkedIn Community of Practice.