Educational AI Chatbot

An AI chatbot designed as a deliberately flawed co-learner for children with special educational needs, built as a central case study in PhD research.

Educational AI chatbot interface showing themed character conversations designed for children with special educational needs
Problem

AI tutors that give perfect answers don't encourage critical thinking. Could a chatbot that sometimes gets things wrong be a better learning tool?

Results
  • Working chatbot with calibrated imperfection for educational contexts
  • Deployed at Bridewell School for 11-13 year olds with SEN
  • Central case study in PhD research on AI-assisted development

Three months into my doctoral programme, I found myself in Google AI Studio describing features for an educational chatbot. It was for a CDT industry challenge, a collaboration between our research centre and Bridewell School, working with children aged 11-13 who have special educational needs.

The Design

The core idea was positioning AI as a deliberately flawed co-learner rather than an authoritative teacher. Students wouldn’t be told answers. They’d interact with themed AI characters that sometimes get things wrong, creating opportunities for the students to correct them. It’s a pedagogical approach that puts the child in a position of expertise.
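The post doesn't show any implementation, so as a rough sketch of what "calibrated imperfection" could look like in practice: one way to build it is to sample, per reply, whether the character should make a deliberate, correctable mistake, with the error rate tuned to how well the child knows the material. All names and rates below are illustrative assumptions, not the project's actual code.

```python
import random

# Illustrative assumption: slip up often on material the child has mastered,
# rarely where a wrong answer could genuinely mislead them.
ERROR_RATES = {
    "mastered": 0.4,
    "practising": 0.2,
    "new": 0.05,
}

def build_system_prompt(character: str, topic: str, familiarity: str,
                        rng: random.Random) -> str:
    """Compose a system prompt that sometimes instructs the themed
    character to make one plausible mistake for the student to catch."""
    base = (f"You are {character}, a friendly co-learner helping an "
            f"11-13 year old explore {topic}. Use short, simple sentences.")
    if rng.random() < ERROR_RATES[familiarity]:
        # Deliberate-error branch: the child gets to be the expert.
        return base + (" In this reply, make ONE plausible mistake the "
                       "student can spot and correct. When they correct "
                       "you, thank them and accept it gracefully.")
    return base + " Answer correctly, and invite the student to check you."

# Seeded RNG keeps the mix of flawed and correct replies reproducible.
rng = random.Random(0)
prompt = build_system_prompt("Robo the Explorer", "fractions", "mastered", rng)
```

Sampling at the prompt level, rather than post-editing the model's output, keeps the mistakes plausible in context; the rates themselves would need calibrating against how each child actually responds.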

Over a few intensive sessions, I watched a functional system take shape: authentication, themed AI characters, text-to-speech, even experimental voice interaction. I had written almost none of the code myself.

Where It Got Complicated

This wasn’t how I’d imagined starting a PhD. Features that would have taken me weeks appeared in hours. Technical barriers I’d assumed were permanent dissolved into dialogue. But the chatbot also raised questions I couldn’t ignore.

Could I actually deploy this for real children in real classrooms? There were security considerations, content safety questions, accessibility requirements, and infrastructure decisions that no amount of chatting with an AI had addressed. Data retention policies for minors. Interaction logging and safeguarding. The distance between “working demo” and “something you’d responsibly put in front of vulnerable children” turned out to be enormous.

What It Taught Me

The educational chatbot became one of the defining examples in my PhD research. It crystallises the compression gap: AI tools collapse the journey from nothing to working prototype into hours, but the final distance to something you could responsibly deploy still demands knowledge that no amount of prompting can replace. The journey to a convincing demo is fast; the journey to something you'd trust with someone else's children is not. I explored the chatbot project and its implications in What Happens When You Let AI Write All Your Code.