Shadow AI Is Already in Your School. Here’s What to Do About It.
Your staff are using AI tools you haven’t approved. This isn’t a prediction. It’s already happening.
ChatGPT for lesson planning. Free transcription services for meeting notes. AI writing assistants for reports. Browser extensions that summarise documents.
None of these are inherently bad. But when staff use unapproved tools, you have a problem you can’t see — and can’t manage.
The risk isn’t AI. It’s data.
Every time a teacher pastes student information into an AI tool, that data leaves your control. Names. Performance data. SEN information. Safeguarding notes.
Most free AI tools have terms of service that allow them to use input data for training. That means student information could end up in a model that millions of people query.
This isn’t theoretical. It’s a GDPR breach waiting to happen.
Why staff don’t tell you
Teachers aren’t hiding their AI use out of malice. They’re staying quiet because:
- They don’t know it’s a problem
- They’re under time pressure and AI helps
- They assume someone would have told them if it wasn’t allowed
- They don’t want to be the one asking “is this OK?”
The silence isn’t resistance. It’s the absence of clear guidance.
How to get ahead of it
You can’t stop shadow AI by banning it. That just pushes it underground. Instead:
- Acknowledge it exists. Start a conversation. “We know many of you are using AI tools. That’s fine — we want to help you use them safely.”
- Set clear boundaries. Which tools are approved? What data can never be uploaded? What’s the process for requesting a new tool?
- Train on the “why.” Most teachers will comply if they understand the risk. A 10-minute explanation of data protection is more effective than a 10-page policy.
- Create a safe path. If staff have to jump through hoops to use AI legitimately, they’ll keep using the shadow tools. Make compliance easier than non-compliance.
The opportunity
Schools that address shadow AI now will build a culture where staff feel confident using AI productively — without creating risk.
Schools that ignore it will face an incident eventually. And by then, the question will be: “Why didn’t we know this was happening?”
You don’t need to have all the answers today. But you do need to start the conversation.
The Pedagogue Standard includes practical training on which tools are safe, what data never to upload, and how to use AI productively without creating risk. See how it works.