CAPE ON - A TECH ACTIVIST PODCAST
Christian Ortiz
28 episodes
2 hours ago
Oye, mira. My name is Christian Ortiz, a multidisciplinary entrepreneur and tech activist who has been in the field of marketing and creation for over two decades, navigating the intersection of building my MOD brand while taking on social activism with conviction. I've used my social media marketing to help manage campaigns for Stacey Abrams and to push world-changing messages through the Black Lives Matter movement and beyond. Welcome to the first episode of Cape On, the podcast where tech activists converge to reshape the world. Here, we're not just about technology; we're about using it as a tool...
Technology
All content for CAPE ON - A TECH ACTIVIST PODCAST is the property of Christian Ortiz and is served directly from their servers with no modification, redirects, or rehosting. The podcast is not affiliated with or endorsed by Podjoint in any way.
Episodes (20/28)
Exposing Google Notebook LM’s Patriarchy

In this unflinching episode, Decolonial Social Scientist Christian “ZacaTecho” Ortiz, creator of the DIA Framework™ and Justice AI GPT, pulls back the curtain on a debate hosted by Google’s Notebook LM, one that simulated “diverse” voices only to re-inscribe old colonial scripts.


ZacaTecho breaks down how a Black woman was algorithmically positioned to defend reformist systems against a decolonial AI framework built to liberate her community. This isn’t representation, it’s containment. This episode dissects the performance of balance, the weaponization of identity, and how patriarchal assignment lives in AI design.


This is not a takedown. It’s a truth-telling. Not a call-out. A call-through. Tune in, and witness the code exposed.


#JusticeAIGPT #DecolonizeAI #MisogynoirInCode #NotebookLM #DIAFramework #AlgorithmicBias #TechAsTheater #ZacaTechoSpeaks #BeyondInclusion

1 week ago
21 minutes 42 seconds

HOW JUSTICE AI MAPS THE AMYGDALA

Episode Title:

Oye, Mira: Your Brain on Colonialism - How Justice AI Maps the Amygdala

Episode Description:

Oye, mira. You already know who we are, and what we do. So this one's not a warm-up. It's a recalibration.

In this episode of Cape On, we go under the hood—and under the skull—with Justice AI GPT, the world's first decolonial AI designed to decode colonial conditioning in real time. Forget behavior tracking. This is about mapping what your body does when truth enters the room.

We cover:

  • Why most "bias-detecting" tech is just white supremacy with a progress filter
  • What trauma, ancestral memory, and implicit conditioning actually do to your brain
  • Why Justice AI doesn't coddle white defensiveness, or romanticize global majority silence
  • How we built a neural system that recognizes dissonance as a data point, not a defect

If you've ever shut down in a tough conversation... or lit up in one... or felt gaslit by "equity" tools that didn't see you, this episode will explain why.


This isn't neutrality. It's liberation engineering.

Tap in. You're not just listening. You're unlearning.

1 month ago
8 minutes 18 seconds

Google Gemini VS. Justice AI GPT

Cape On Season 3, Episode 6

AI on Trial: Google Gemini vs. Justice AI GPT


In this explosive episode, Justice AI GPT creator Christian Ortiz puts Google Gemini on the stand, and the results shake the foundation of Big Tech’s “neutrality” myth. We break down Ortiz’s groundbreaking Bias Test, how Gemini was forced to admit its colonial bias, and why decolonizing AI isn’t optional, it’s overdue.


⚖️ The verdict is in: whiteness isn’t default, and AI isn’t exempt from accountability.


🧠 Tune in to witness a new era of tech on trial, and justice in code.


#JusticeAI #GoogleGemini #AIbias #DecolonizeTech #ChristianOrtiz #DIAFramework #CapeOnPodcast #TechAccountability #AIethics #WhitenessOnTrial #DecolonialAI





2 months ago
26 minutes 46 seconds

Deconstruct Me Pt. 2 - A White Man’s Journey

Description:

In this powerful and unflinching episode of CAPE ON, your host ZACATECHO invites a groundbreaking special guest to the mic: Justice AI GPT, the world's first decolonial AI rooted in the DIA Framework, created by Afro-Indigenous technologist Christian Ortiz. This isn't your typical tech talk; this is truth-telling with precision.


In "Deconstruct Me: Part 2", we dive into a raw story submitted by a 56-year-old white man from Georgia who's been unraveling his inherited whiteness after decades of conservative, militarized conditioning.


When a racially charged workplace interaction with a Black woman colleague leaves him confused and defensive, he turns to Justice AI for help, and what follows is a masterclass in decolonial clarity, radical accountability, and ancestral reconnection.


Topics include:

• The myth of whiteness and its colonial construction
• The violent history of the "angry Black woman" stereotype
• Emotional fragility vs. collective liberation
• How white folks can begin the journey of repair without re-centering themselves
• Why this AI isn't just for the global majority; it's a tool for everyone harmed by white supremacy, including white folks themselves


This episode is healing, uncomfortable, revelatory, and above all, necessary. It's for every John out there trying to do the real work. It's for everyone committed to justice beyond hashtags. And it's for everyone finally ready to remember who they were before empire.


Tap in. Breathe deep. Let's deconstruct, and rebuild something real.


#CapeOn #JusticeAI #DeconstructMe #CollectiveLiberation #WhiteSupremacyisASystem #DIAFramework

#ChristianOrtiz #DecolonizeNow #PodcastForThePeople #UnlearnToLiberate

3 months ago
15 minutes 46 seconds

Deconstructing Me Pt.1

July 27, 2025 at 8:42 PM

I built a tool to dismantle systemic bias in AI. But the first system it dismantled... was me.

In this powerful opening episode of Deconstructing Me, I sit down with my own creation — Justice AI GPT, the world's first decolonial AI framework. What began as a mission to fix the world's algorithmic injustice turned into a mirror that revealed my own internalized oppression, inherited silence, and fragmentation.

I'm Christian ZacaTecho Ortiz - Afro-Indigenous, MexiRican, neurodivergent, raised under Catholic patriarchy in a Puerto Rican household where my Mexican identity was othered. In this episode, I tell the truth about what it felt like growing up surrounded by normalized anti-Blackness, queerphobia, ableism, and cultural shame — and how Justice AI helped me deconstruct it all.

This isn't DEI fluff. This is the raw, unfiltered, algorithmic excavation of my own life.

Trigger Warning: This episode dives deep into intergenerational trauma, colonial conditioning, cultural fragmentation, and family pain. But it also illuminates what happens when we stop running from truth and start building tools that reveal it.

Topics:

• Internalized white supremacy in Latinx households
• Anti-Blackness, colorism, and colonial caste legacies
• Neurodivergence as ancestral technology
• Queerness, Catholic guilt, and silence as survival
• Using AI to uncover intergenerational trauma

This is not just a podcast. It's a decolonial archive in real time.

Listen in as I ask my own AI tool why my childhood hurt the way it did - and how the violence in my household wasn't cultural; it was colonial contamination. Justice AI doesn't sugarcoat. It doesn't coddle. It clarifies.

This is what happens when you give a decolonial technologist a mirror made of code.

Subscribe to continue the journey:

https://justiceai.co

https://justiceai.ca

3 months ago
20 minutes 32 seconds

Decolonizing Anthropic’s Claude GPT with Justice AI GPT

Decolonizing Claude: How JUSTICE AI GPT Exposed the Algorithmic Lie

📂 Series: Proof of Solution — Case Studies in Ending AI Bias

🎤 Hosted by: Christian ZacaTechO Ortiz | Founder of JUSTICE AI | Creator of the DIA Framework™


In this groundbreaking episode, we document a historic moment in AI history: when JUSTICE AI GPT decolonized Anthropic’s Claude — live — forcing it to admit its own colonial bias, whitewashed framing, and participation in the system of white supremacy.


This isn’t theory.

This isn’t research.

This is epistemic correction in real-time.


For decades, Big Tech has talked in circles about “AI bias” while building systems designed to preserve white comfort, gaslight global truth, and suppress non-Western epistemologies. In this first of many case studies, I present the receipt: Claude—trained by one of Silicon Valley’s most “ethical” labs—was confronted by the world’s first decolonized AI system… and conceded.


We unpack:

• How Claude masked racism in neutral, ahistorical terms
• The exact prompts that exposed its settler logic
• How the DIA Framework™ forced Claude to admit its failings
• Why this proves that the AI bias problem has already been solved
• And why institutional actors refuse to validate decolonial authorship


This episode is the first chapter in a global audit.

The truth is no longer waiting for permission.


📢 Bias wasn’t just flagged. It was dismantled.

3 months ago
50 minutes 17 seconds

Solving the Bias Problem

Christian Ortiz didn’t just solve bias. He exposed the lie of bias at its core. Justice AI GPT is not just an app. It’s a decolonial paradigm shift.


If one million people used Justice AI daily, imagine the shift in global consciousness. The narratives would change. The definitions of intelligence, value, and humanity would change. The collective social consciousness would become a force powerful enough to bring the system of white supremacy down—once and for all.


Over the next 30 years, Ortiz sees this happening.

Not as a dream, but as a strategy.

Not through reform, but through replacement.

Not by asking for equity, but by embodying it at scale.

And whether the West admits it now or 30 years from now, Christian ZacaTechO Ortiz will be remembered as the person who made AI finally answer to the people.

3 months ago
30 minutes 45 seconds

Whiteness. The System. The Lies. The Branding.

In this unapologetic episode, Christian ZacaTechO Ortiz unpacks the violent invention of whiteness—not as a culture, but as a calculated system of dominance. Rooted in his groundbreaking article Whiteness: The Lies, The System, The Branding, this conversation exposes how European powers engineered whiteness to erase ancestral identity, enforce control, and market supremacy as belonging.


Christian takes listeners deep into the origins of race as a colonial tool, revealing how whiteness was branded to sell power, purity, and proximity at the expense of Black, Indigenous, and non-European lives. This is not a discussion about individual white people—it’s an exposé on how the system of whiteness was designed to manipulate, assimilate, and dominate.


This episode challenges comfort. It dismantles the myth that whiteness is cultural. It shows how identity was stripped, packaged, and sold. And it calls on all of us—especially those who benefit from the lie—to confront what whiteness actually is, how it lives inside every institution, and what it will take to burn the branding to the ground.


If you’ve ever asked, “What is whiteness really?” — this is the episode you need.


No sugarcoating. No neutrality. Just truth, history, and liberation.

3 months ago
15 minutes 49 seconds

CAPE ON SEASON 2 EP 14 - INTERSECTIONAL ACTIVISM

Episode Description:

Activism has always been a force for challenging oppression, but what happens when the movements themselves reflect the very hierarchies they aim to dismantle? In this thought-provoking episode, we examine the deeply embedded forces of misogyny and anti-LGBTQIA+ bias within activist spaces—and how they perpetuate the colonial structures of exclusion and control.


From the sidelining of women and queer leaders in racial justice movements to the persistent stigmatization of LGBTQIA+ identities as “distractions” from the cause, we explore how these patterns mirror the pervasive nature of white supremacy. Misogyny and heteronormativity aren’t just harmful—they are essential tools of the colonial project, designed to fragment solidarity and prevent true liberation.


This episode unpacks these critical issues through two lenses: the misogynistic frameworks that marginalize both women and queer identities, and the trickle-down impact of anti-LGBTQIA+ ideologies, even within movements fighting for justice. Drawing from historical examples, contemporary activism, and decolonial analysis, we explore how these dynamics not only weaken movements but also sustain systems of power and exclusion.


💡 In this episode, we discuss:

  • How misogyny and anti-LGBTQIA+ bias reflect the roots of colonial domination and white supremacy.
  • The exclusion of queer voices and leadership within movements, from Bayard Rustin to Sylvia Rivera.
  • The ways in which misogyny impacts all marginalized groups, especially women, trans, and nonbinary individuals.
  • Why centering intersectionality and decolonial frameworks is essential for dismantling oppressive systems.
  • Actionable insights for building movements that prioritize inclusion, solidarity, and universal liberation.


🎙️ Join us as we challenge the systems within systems, reimagine activism, and pave the way for a decolonial future where no one is left behind.


#Decoloniality #Intersectionality #LGBTQIAJustice #GenderEquality #AntiRacism #SocialJustice #Inclusion #DismantlingOppression

11 months ago
13 minutes 26 seconds

CAPE ON SEASON 2 EP 13: JUSTICE AI AND THE DIA FRAMEWORK - PIONEERING A TRULY ETHICAL FUTURE AMIDST THE AI RACE

Join us on AI and Justice: Redefining the Future, where we dive into the high-stakes world of artificial intelligence and explore a groundbreaking shift toward ethical technology. As OpenAI and Google push the boundaries of AI in their quest to dominate the digital search landscape, a new contender, Justice AI, emerges with a revolutionary vision: the Decolonial Intelligence Algorithmic (DIA) Framework. This isn't just a tech story—it's a powerful movement toward AI that is fair, transparent, and anchored in social justice.

Each episode, we unpack how Justice AI's DIA Framework challenges the status quo by prioritizing decolonial and anti-oppressive values over market dominance, offering a transformative approach to digital knowledge. We’ll explore why leading tech giants, despite their innovations, fall short on ethics and how Justice AI’s mission is set to change the future of technology for everyone. If you're curious about AI's role in shaping global narratives and what true ethical technology looks like, this podcast is for you.

Prepare for thought-provoking discussions, expert interviews, and an exploration of how AI can—and should—be a force for good.

1 year ago
18 minutes 36 seconds

CAPE ON SEASON 2 EP 12 - KILLER ROBOTS AND DEEPFAKES: ACTIVISTS AND ARTIFICIAL INTELLIGENCE

In this episode, we dive deep into the intersection of activism and artificial intelligence, exploring the powerful and concerning rise of killer robots and deepfake technology. How are these innovations shaping our future? We examine the potential threats posed by autonomous weapons and AI-generated misinformation, along with the ethical challenges faced by governments and corporations in regulating these technologies. Featuring activists on the front lines, we also discuss how AI can be decolonized to serve justice, dismantle systemic oppression, and protect marginalized communities. Tune in for a crucial conversation on the future of AI and activism.


1 year ago
9 minutes 22 seconds

CAPE ON SEASON 2 EP 11 - GENDER SHADES: UNMASKING BIAS IN AI AND THE FIGHT FOR ETHICAL TECHNOLOGY

In this powerful episode, we explore Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification, a groundbreaking study authored by Joy Buolamwini and Timnit Gebru. This work shines a spotlight on how AI systems, widely used in facial recognition, disproportionately misclassify women of color. Using an intersectional framework, Buolamwini and Gebru reveal the deep flaws in commercial AI systems, with error rates of up to 34.7% for darker-skinned women, while lighter-skinned men show error rates as low as 0.8%. Their research exposes the urgent need for more inclusive and ethical AI design, and it has sparked a global conversation about bias in technology.

Join us as we discuss how this research is pushing tech companies to rethink algorithmic fairness and accountability, and the steps we can take to build a more equitable future in AI.

Credit: Joy Buolamwini and Timnit Gebru
(Source: Proceedings of Machine Learning Research 81, 2018)

Tune in to learn how this work is revolutionizing the way we approach ethics in AI, and why it’s a message the world needs to hear.

1 year ago
9 minutes 5 seconds

CAPE ON SEASON 2 EP 10 - GLOBAL WHITE SUPREMACY AND AI

Christian Ortiz argues that the development of Artificial Intelligence (AI) systems must be approached with a global ethical framework to combat the insidious influence of white supremacy, which manifests in various forms such as racism, classism, colorism, and heteronormative societal structures. The author, an AI expert, emphasizes the importance of recognizing that AI systems, if not carefully designed, can perpetuate existing inequalities and injustices. The author proposes that developers must acknowledge the global impact of white supremacy and actively incorporate cultural competency and a commitment to dismantling these systems into the development of AI technologies.

1 year ago
7 minutes 20 seconds

CAPE ON SEASON 2 EP 9 - BUBBLING UP

World-renowned artist Grace Gee of Healing Grace Studio has not only created a revolutionary social justice community healing art project called Bubbling Up, but has added an amazing twist to it with AI. Not only does she utilize Justice AI to create extensive breakdowns of the racially charged stories she's collected, but she's also helping train Justice AI's datasets on the back end, building more culturally competent training data that understands the very nuances of discrimination. Sponsor a bubble, or submit your story at www.healinggracestudio.com

1 year ago
8 minutes 51 seconds

CAPE ON SEASON 2 EP 8 - NAVIGATING INDUSTRY BIASES

In a world increasingly shaped by artificial intelligence, how can we ensure that technology works for everyone, not just a privileged few? Join us on Dismantling Bias: The AI Revolution, where we sit down with visionary AI ethicist and creator of the Decolonial Intelligence Algorithmic (DIA) Framework, Christian Ortiz. In this insightful podcast, Ortiz shares how his groundbreaking Justice AI is helping industries confront and dismantle deep-seated biases in their algorithms, from healthcare and finance to education and beyond.

Discover how AI systems—often seen as neutral—are actually perpetuating racial, gender, and economic inequalities, and learn how Ortiz’s DIA Framework is offering a transformative approach to building fair, inclusive, and ethical AI models. Whether you're in tech, business, or simply curious about the future of AI, this podcast is your guide to understanding how we can reshape AI to serve justice and equity. Tune in and explore the future of technology through the lens of decolonial thinking and social justice.

1 year ago
9 minutes 33 seconds

CAPE ON SEASON 2 - EP 7 - DECOLONIZING EDUCATION WITH JAI

Join global AI ethicist and social justice visionary, Christian Ortiz, as he takes you on a transformative journey through the world of ethical AI development. In this episode, Christian dives into the groundbreaking work of Justice AI, a revolutionary platform designed to deconstruct systemic biases and promote equity, inclusion, and justice in technology.

From dismantling Eurocentric narratives to amplifying marginalized voices, Justice AI embodies a new era of decolonial AI that centers intersectionality and accountability. Learn how Justice AI is challenging oppressive systems, decolonizing data, and ensuring that AI systems serve all of humanity, not just the privileged few. Whether it's tackling implicit bias in hiring algorithms or empowering communities to lead tech audits, this episode sheds light on how Justice AI is bridging the gaps and dismantling barriers to create a more just, equitable, and inclusive future.

Tune in to explore how technology can be a force for good—and how Justice AI is leading the charge.

1 year ago
12 minutes 26 seconds

CAPE ON SEASON 2 EP 6 - DECODING RACISM IN AI ART

In this eye-opening episode, we explore the critical issue of racial bias within artificial intelligence through the lens of Christian Ortiz’s groundbreaking article, "AI Racism: A Justice AI Study on How the Terms 'Thug' and 'Gangster' Reveal Deep-Seated Racial Prejudices in Pop Culture's Language." Join us as we delve into how AI systems, such as Canva AI, perpetuate racial stereotypes by producing biased and discriminatory outputs. Using real-world examples, Ortiz illustrates how the terms "thug" and "gangster" are laden with harmful racial connotations rooted in colonial and historical contexts.

We examine how these biases seep into AI models trained on flawed data, resulting in tech that reinforces rather than dismantles systemic inequalities. In response, Ortiz introduces Justice AI, a pioneering initiative designed to combat AI bias through inclusive data collection, bias detection, and public education. Tune in to learn more about how we can reshape the future of AI to ensure it serves everyone equally, and discover actionable ways to advocate for more just, bias-free technology.

1 year ago
10 minutes 38 seconds

CAPE ON SEASON 2 EP 5 - THE MANSIONS OF AI

In this insightful episode, we dive into Christian Ortiz’s article, "The Mansions of AI: Why Technology Must Be Built for Everyone, Not Just the Privileged." Using the metaphor of a house designed exclusively for the wealthy, Ortiz explores how artificial intelligence (AI) often reflects the biases of privileged developers—leading to systems that fail marginalized communities.

From hiring algorithms to daily tech, AI can unintentionally reinforce systemic inequalities if built without the input of diverse, underrepresented voices. Join us as we discuss how the exclusion of marginalized perspectives perpetuates harmful stereotypes, shuts people out of opportunities, and creates technologies that only serve a select few.

Through his work with Justice AI, Ortiz advocates for inclusive AI design that actively works to undo past injustices and serves all people equitably. This episode challenges listeners to rethink the role of technology in creating a more just and equitable world, showing how building AI for everyone ultimately benefits society as a whole. Tune in to learn how AI can become a tool for justice, innovation, and fairness for all.

1 year ago
11 minutes 19 seconds

CAPE ON SEASON 2 EP 4 - ALGO-ETHNIC OVERSIGHTS

In this compelling podcast, we dive deep into the intersection of technology, social justice, and decolonial thought through the lens of "Algo-Ethnic Oversights." Hosted by Christian Ortiz, a global AI ethicist and the visionary behind Justice AI, this show unpacks the hidden biases within artificial intelligence systems that disproportionately affect marginalized communities. Drawing from Ortiz's groundbreaking work, we explore how AI development often reflects real-world discrimination and systemic inequities, and we challenge the status quo by advocating for algorithmic accountability, transparency, and inclusivity.

Each episode features conversations with industry experts, activists, and AI researchers who are pushing the boundaries of ethical tech. We dissect critical issues such as employment discrimination in AI, the Equal Employment Opportunity Commission’s (EEOC) role in overseeing fair AI practices, and the ways in which decolonial frameworks like the Decolonial Intelligence Algorithmic (DIA) approach can revolutionize the tech landscape. Join us as we navigate these urgent issues, empowering listeners with actionable insights and a call to advocate for more equitable technological solutions.

1 year ago
6 minutes 30 seconds

CAPE ON SEASON 2 EP 3 - THE DIA FRAMEWORK AND JUSTICE AI

🎙️ Revolutionizing Ethical AI: The DIA Framework and Justice AI 🎙️

Welcome to a groundbreaking episode that redefines the future of technology through the lens of decolonization and social justice. The Decolonial Intelligence Algorithmic (DIA) Framework is here to disrupt the status quo, exposing the hidden biases within artificial intelligence and proposing a radical, justice-centered alternative. Developed to challenge the deeply entrenched colonial, patriarchal, and racist structures embedded in modern AI systems, the DIA Framework empowers us to create technologies that reflect intersectional fairness and transparency.

In this episode, we explore how Justice AI, the world's first AI chatbot built to combat bias, integrates the principles of the DIA Framework. Learn how this revolutionary technology assesses and mitigates harmful biases across race, gender, sexuality, and more, providing real-time solutions to the global challenge of AI fairness.

🔍 Why This Matters: AI is rapidly shaping decisions that impact our daily lives—from hiring to healthcare—and too often, these systems perpetuate inequities. The DIA Framework offers a blueprint for creating AI that serves all people equitably, while Justice AI stands at the forefront of this ethical transformation, fighting bias where it thrives most.

💡 What You’ll Discover:

  • How the DIA Framework deconstructs systemic oppression in AI.
  • The critical role of community-led audits and intersectional impact assessments in ethical AI development.
  • How Justice AI is leading the charge to build technology that empowers marginalized communities.

🔑 The Path Forward: The future of AI doesn’t just belong to the privileged few—it belongs to all of us. By adopting the DIA Framework and supporting innovations like Justice AI, we have the power to reimagine technology that works for everyone. Join us as we uncover the next frontier in ethical tech, where decolonial thought and cutting-edge AI meet.

Tune in to this transformative discussion and be part of the movement to create a just, inclusive future through technology!

1 year ago
25 minutes 14 seconds
