
Tom Barton-Wilkes — Teacher & Builder of ChalkboardAI

Building ChalkboardAI

I’m a teacher who hit a hard wall with AI

I’m Tom Barton‑Wilkes, a UK primary school teacher with over ten years in the classroom. I love teaching — but like most teachers, I’ve spent too many evenings and weekends on admin that adds little value for pupils.

When large language models appeared, I did what many teachers did: I experimented. Drafting resources. Simplifying texts. Planning interventions. It was immediately clear that AI could save time.

It was also immediately clear why schools couldn’t use it.

The real problem isn’t AI — it’s data

Most AI tools require teachers to upload raw pupil data. In schools, that’s a non‑starter. It’s not just risky — it’s indefensible.

I ran straight into that wall myself. The moment pupil data leaves the teacher’s device, schools shut the door. Heads carry the risk. DPOs say no. Teachers are left choosing between unsafe shortcuts or not using AI at all.

That tension isn’t hypothetical. It’s happening in schools every day.

ChalkboardAI started as a constraint, not an idea

So I built ChalkboardAI around the one constraint that couldn’t be ignored:

Identifiable pupil data must never leave the teacher’s device.

ChalkboardAI anonymises data locally, sends only the minimum pseudonymised information required to the model, and re‑identifies results on the teacher’s machine. The AI never sees real names. Nothing is stored. Nothing is reused.
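The sketch below is a simplified illustration of that loop (pseudonymise locally, send only the pseudonymised prompt, re-identify on the device). It is not the production code: the function names and token format here are placeholders for illustration only.

```python
# Illustrative sketch of local pseudonymisation and re-identification.
# Names, token format, and structure are hypothetical, not ChalkboardAI's actual code.
import re
import secrets


def pseudonymise(text: str, pupil_names: list[str]) -> tuple[str, dict[str, str]]:
    """Replace pupil names with random tokens before anything leaves the device."""
    mapping: dict[str, str] = {}
    for name in pupil_names:
        token = f"Pupil_{secrets.token_hex(3)}"
        mapping[token] = name
        text = re.sub(re.escape(name), token, text)
    return text, mapping


def reidentify(text: str, mapping: dict[str, str]) -> str:
    """Swap the tokens back to real names on the teacher's machine."""
    for token, name in mapping.items():
        text = text.replace(token, name)
    return text


# Only the pseudonymised prompt would ever be sent to the model.
prompt, mapping = pseudonymise(
    "Write an intervention plan for Ava Khan, who is working below expectations in reading.",
    ["Ava Khan"],
)
# model_reply = call_model(prompt)          # the model only ever sees "Pupil_xxxxxx"
# print(reidentify(model_reply, mapping))   # real names restored locally
```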

This wasn’t a branding choice. It was the only way the tools could exist.

When I first shared an early prototype with a trust‑level Data Protection Officer, the response wasn’t excitement — it was relief. For the first time, there was a way to use AI in schools that could actually be defended.

What I’m building now

ChalkboardAI is infrastructure for safe AI use in schools.

The tools here are not experiments or demos. They’re real tools, built to handle real constraints: timetables, staffing limits, SEND requirements, and regulatory scrutiny.

I write all the code myself, and I’ve been deliberately backfilling my coding knowledge and deepening my technical foundations so I can own the system end‑to‑end.

Where this is going

AI use in schools is inevitable. Unregulated AI use is not sustainable.

ChalkboardAI exists to resolve that tension — not by asking schools to take more risk, but by removing it.

If you’re exploring how AI can work properly in education — as a teacher, school leader, trust, or partner — I’d be glad to talk.
