How to Write an AI Policy for Your School That Teachers Will Actually Follow

Most school AI policies gather dust because they were written by committee and read by nobody. Here's a practical framework for writing one that teachers actually use — plus the tools that make it easier.

TL;DR

Most school AI policies fail because they're written to protect institutions rather than support teachers. A practical AI policy should be under five pages, name specific approved tools, and build in termly reviews. Free tools like the K12 AI Policy Generator can produce a compliant draft in minutes. The guiding principle: let the policy handle procedural clarity so teachers can focus on pedagogical decisions. With 60% of teachers now using AI and the AFT investing $23 million in AI training for 400,000 teachers, schools without a usable policy are already behind.

Key Takeaways

  • Most school AI policies fail because they're too long, too vague, or written for compliance rather than classroom use
  • Start with three questions: what are teachers already using, what are you protecting, and who is the audience
  • Write separate, short guidance for each group (teachers, students, governors) — two pages maximum per audience
  • The K12 AI Policy Generator walks leaders through a 13-step framework to produce a compliant draft in minutes
  • SLT AI offers 190+ tools for school leaders drawing on 120+ current DfE and Ofsted documents
  • Good policies name specific approved tools, include honest caveats about AI limitations, and build in termly review dates
  • The AFT announced a $23 million partnership with Anthropic, Microsoft, and OpenAI to train 400,000 teachers on AI

By Dan Fitzpatrick — Forbes contributor, three-time bestselling author, founder of The AI Educator. Published 10 April 2026.

Here's the problem with most school AI policies: they're written to protect the institution, not to help the teacher. The result? A 40-page document that sits in a shared drive, unread, whilst teachers quietly figure out AI on their own. I've seen this pattern in dozens of schools across the UK, US, and internationally. The policy exists. Nobody follows it. And leadership wonders why things feel chaotic.

The American Federation of Teachers just announced a $23 million partnership with Anthropic, Microsoft, and OpenAI to train 400,000 teachers on AI. That's a clear signal — AI in classrooms isn't a question of if. The question is whether your school has a framework teachers can actually use when they start experimenting. If not, now is the time.

How This Guide Was Built

I've advised schools, MATs, and government bodies across three continents on AI adoption. This isn't theory. Every recommendation here comes from working directly with senior leadership teams navigating real implementation challenges. I don't take payment from any tool listed on the AI Educator Tools directory, and I only recommend what I've seen work in practice.

My guiding principle: outsource the doing, not the thinking. Your AI policy should do the same — handle the procedural stuff clearly so teachers can focus their energy on the pedagogical decisions that actually matter.

Start With Three Questions, Not a Template

Before you write a single word, answer these honestly. First: what are teachers already using? If you don't know, you don't have a policy problem — you have a visibility problem. Survey your staff. You'll be surprised.

Second: what are you actually trying to protect? Student data, assessment integrity, and professional reputation are the big three. Name them explicitly. Vague references to "responsible use" help nobody.

Third: who needs to follow this? A policy that tries to cover teachers, students, support staff, and governors in one document will satisfy none of them. Write separate, short guidance for each group. Two pages maximum per audience.

The Tools That Make This Easier

You don't need to start from scratch. The K12 AI Policy Generator walks school leaders through a 13-step framework, asking targeted questions about your context and generating a compliant draft in minutes. It's free and GDPR compliant, and it's considerably better than copying another school's policy and changing the name at the top.

For senior leadership teams drowning in paperwork, SLT AI offers 190+ purpose-built tools covering everything from self-evaluation to governance documentation. Over 13,000 school leaders use it, and every output is tailored to your school's specific context — not generic templates. It draws on 120+ current DfE statutory guidance documents and Ofsted frameworks, which means your policy references will actually be up to date.

Once your policy is in place, tools like SchoolAI let teachers implement it in practice — building monitored AI tutoring spaces where student interactions are visible in real time. That's what good policy looks like: clear boundaries that enable confident use, not a ban dressed up as guidance.

What Good AI Policies Have in Common

The schools getting this right share three traits. They keep the document short — under five pages for the teacher-facing version. They name specific approved tools rather than making vague statements about "AI platforms." And they build in a review date, usually termly, because the technology moves faster than annual policy cycles can handle.

They also include honest caveats. AI works brilliantly for drafting lesson resources, generating differentiated materials, and handling administrative tasks. It's less reliable for nuanced student feedback or anything requiring deep contextual understanding. Say that clearly. Teachers respect honesty more than corporate optimism.

The Bottom Line

Your school needs an AI policy. But more importantly, it needs one that's short enough to read, practical enough to follow, and honest enough to trust. The tools exist to help you build one in an afternoon, not a term. Browse the full directory to find what fits your context, and stop letting perfect be the enemy of functional.

Your teachers are already using AI. Give them a framework worthy of their professionalism.

Dan Fitzpatrick is the founder of The AI Educator, a Forbes contributor, and the author of three bestselling books on AI in education. He advises schools, MATs, and government bodies across three continents on responsible AI adoption. Last updated: 10 April 2026.

Frequently Asked Questions

Does my school need an AI policy in 2026?

Yes. With 60% of teachers now using AI and adoption nearly doubling year on year, schools without a clear AI policy risk inconsistent use, data protection issues, and staff uncertainty. A practical policy provides the framework for confident, responsible AI adoption.

How long should a school AI policy be?

The teacher-facing version should be under five pages. Schools getting this right write separate, short guidance for each audience — teachers, students, support staff, and governors — with a maximum of two pages per group.

What should a school AI policy include?

A good AI policy names specific approved tools, defines what data can and cannot be entered into AI systems, sets expectations for human review of AI outputs, includes honest caveats about AI limitations, and builds in a termly review date.

Are there free tools to help write a school AI policy?

Yes. The K12 AI Policy Generator on AI Educator Tools walks school leaders through a 13-step framework and generates a compliant draft in minutes. It is free and GDPR compliant.

How often should a school AI policy be updated?

At minimum termly, not annually. AI technology and the tools available to educators change faster than traditional policy review cycles. Building in regular review dates ensures your policy stays current and credible.

What is the biggest mistake schools make with AI policies?

Writing a policy designed to protect the institution rather than support the teacher. Policies that are too long, too vague, or focused solely on prohibition end up ignored. The most effective policies are short, practical, and enable confident classroom use.

How is the AFT training teachers on AI?

The American Federation of Teachers announced a $23 million, five-year partnership with Anthropic, Microsoft, and OpenAI to train 400,000 teachers on meaningful AI use through the National Academy for AI Instruction.

Looking for AI tools built for educators? Discover 50+ curated tools at aieducator.tools — the trusted directory built by educators, for educators.
