
Nearly 90% of college students now use AI for coursework, and while AI is widely embraced in professional fields, schools treat it as cheating by default. This disconnect became clear when Columbia student Roy Lee was suspended for using ChatGPT, then raised $5.3 million for his AI-assisted coding startup. Could we say that the real issue is not AI use itself, but rather how we integrate these tools into education? Our host, Carter Considine, breaks it down in this episode of Ethical Bytes.
When students rely on AI without genuine engagement, critical thinking suffers. Teachers have shared countless accounts of students submitting AI-written essays they clearly never read through.
It’s telling that a 2025 Microsoft study found that overconfident AI users blindly accept results, while users confident in their own knowledge critically evaluate AI responses. The question now is how teachers can cultivate that latter mindset in students.
Early school bans on ChatGPT failed as students used personal devices. Meanwhile, innovative educators discovered success by having students critique AI drafts, refine prompts iteratively, and engage in Socratic dialogue with AI systems. These approaches treat AI as a thinking partner, not a replacement.
The private K-12 program Alpha School demonstrates AI's potential: students spend two hours daily with AI tutors, then apply what they've learned through projects and collaboration. Its students score in the top 2% nationally and show 2.4x typical academic growth.
With all this in mind, perhaps the solution isn't banning AI but redesigning assignments to reward reasoning over mere information retrieval. When students evaluate, question, and refine AI outputs, they develop stronger critical thinking skills. The goal could be to teach students to interrogate AI, not blindly obey it.
This can prepare them for a future where these tools are ubiquitous in professional environments, a future in which they control the tools rather than being controlled by them.
More info, transcripts, and references can be found at ethical.fm