Inspection Readiness

What ISI Will Ask About AI — And What They Want to See

ISI inspectors are not AI experts. But they are experts at asking questions schools can't answer. Here's what to expect.

AI is appearing in inspection conversations — not as a separate focus, but woven into existing frameworks. Safeguarding. Staff training. Curriculum. Governance.

Here’s what to expect and how to prepare.

Where AI questions appear

Safeguarding. “How do you ensure AI tools don’t create safeguarding risks?” They want to know you’ve thought about data protection, student privacy, and inappropriate content generation.

Staff training. “How are staff trained to use AI safely?” They want evidence — not just that training happened, but that it was effective and is kept current.

Teaching and learning. “What’s your position on AI in the classroom?” They want a coherent policy that staff understand and follow.

Governance. “How do governors maintain oversight of AI use?” They want to see that the board is informed and has approved your approach.

What good answers look like

Inspectors don’t expect perfection. They expect a defensible position with evidence. That means:

  • A written policy that’s been approved by governors
  • Training records showing who’s completed AI CPD
  • Clear guidance staff can articulate when asked
  • Evidence of regular review and updates

If you can produce these quickly when asked, you’re in good shape. If you have to scramble, that tells them something too.

The questions behind the questions

When inspectors ask about AI, they’re really asking:

  • Does this school think ahead, or react to crises?
  • Is governance functioning properly?
  • Do staff understand the school’s policies?
  • Is student welfare genuinely prioritised?

Your AI approach is a window into how you manage emerging challenges more broadly.

When to worry

If an inspector asks about AI and you hear yourself saying:

  • “We’re still developing our approach…”
  • “Staff use their professional judgment…”
  • “I’d need to check with IT…”

…you have a gap that needs closing before inspection, not during it.

The Pedagogue Standard gives you inspection-ready AI documentation: policy, training evidence, and compliance reports exportable in one click. See how it works.

Angus Griffin

Angus Griffin is CEO and co-founder of Pedagogue. A seasoned AI commercialisation specialist, he spent a decade closing the technology-implementation gap at THG and Pattern, partnering with Pfizer, GSK, P&G, British Council and Mondelez. Angus champions ROI-driven AI solutions that deliver measurable productivity transformation for schools.
