The human brain is the most complex structure in the known universe. With roughly 86 billion neurons, each forming thousands of connections with others, nature has produced a network of hundreds of trillions of pathways. Evolution built something extraordinary. It just took five hundred million years.
Artificial intelligence took considerably less time. The transformer architecture that underpins every modern AI system was published in 2017. ChatGPT launched in late 2022. Less than a decade on, we have tools that can write, analyse, tutor, generate code and hold sophisticated conversations across virtually every subject a student might study. That speed should give us pause. Not because AI is dangerous, but because its arrival has outpaced our ability to think clearly about what it is actually doing to the young people using it.
Barbara Oakley, one of the world’s foremost researchers on learning, describes a familiar scene: a student studying with past papers and a mark scheme, reading an answer, nodding and turning the page. She calls this the illusion of competence. The student is recognising, not learning. Those two things feel identical from the inside, but they produce very different results when the examination paper arrives and the mark scheme does not. AI has taken this problem and amplified it.
A study cited in the OECD Digital Education Outlook 2026 found that students using an AI tutoring tool improved their performance by up to 48 per cent. When the AI was removed, their performance fell to 17 per cent below where they had started: they finished worse than they began. Research from MIT reinforces this. Students who used ChatGPT to write essays showed the lowest levels of brain engagement of any group studied. They remembered less of what they had written. Over time, the habit of reaching for the tool grew stronger as their independent thinking grew weaker. It is the GPS effect: rely on it completely, and your ability to navigate without it erodes.
The danger is that AI produces excellent work without requiring any thinking at all. A polished, well-argued essay tells you nothing about what happened in the mind of the person who submitted it. Yet the only reliable test, asking a student to explain their own work, remains largely absent from formal assessment.
That said, the picture is not uniformly bleak. When AI use is structured and guided, thinking returns. Students who create first, then use AI to seek feedback, think more carefully, revise more substantively and remember more of what they have written.
Across drop-down days and CCAs, students at Epsom College will take part in a carousel of AI-focused sessions, choosing an area that genuinely interests them. Options range from Knowledge Theory using AI and Data Science with machine learning, to vibe coding to build websites and games, to storytelling with AI-generated images, among many others. In every session, the emphasis is the same: think before prompting, question the output and remain the author of your own ideas.
The goal is not to distrust these tools. It is to develop the judgement to use them well. A student who learns to work alongside AI with critical awareness will be far better equipped than one who simply learns to prompt it. That distinction will shape not just how they perform, but how they think.
Dr Terence McAdams
Chief Education Officer