
Three Quarters of Teachers Are Using AI. Has Your School Got a Policy?

A new survey from the National Education Union paints a clear picture of where schools are in 2026: staff are adopting AI tools quickly, but the governance frameworks to manage that adoption are lagging far behind.

The NEU’s State of Education: AI report, published in April 2026 and based on responses from more than 9,400 teachers in English state schools, found that 76% of teachers now use AI tools in their day-to-day work, up from 53% the previous year. The fastest-growing uses are resource creation, lesson planning, and administrative tasks. Yet 49% of schools still have no AI policy at all, and 66% have no specific guidance for student use.

Why this matters to primary schools

The gap between adoption and governance is sharper in primary schools than you might expect. Primary teachers are adopting AI at similar rates to their secondary colleagues, but the survey suggests they are less likely to have any formal guidance on how to use it safely and consistently across the school.

There is an additional wrinkle specific to primary. While 66% of secondary teachers report that pupils’ critical thinking has declined as a result of AI use, only 28% of primary teachers share that concern. This could reflect a genuine difference in how primary-age pupils interact with AI tools, or it could simply reflect less visibility of the problem at that stage of schooling. Either way, it is not a reason to delay developing clear guidance.

The practical exposure schools face without a clear policy is significant. GDPR obligations, safeguarding considerations, and questions about academic integrity do not disappear because a school has not yet written its guidance. They simply become harder to manage when something goes wrong.

“Pupils’ critical thinking and problem-solving skills are being eroded by AI use”

NEU survey respondents, State of Education: AI (April 2026)

What this could mean in practice

For most primary schools, the immediate task is not to build a comprehensive AI strategy. It is simpler than that: decide what your staff should and should not be doing with AI tools right now, write it down, and share it. A two-page acceptable use document is better than nothing, and it gives you a baseline to build from as the landscape develops.

The longer-term question is whether AI policy should sit separately or form part of your broader data and computing governance. Schools that already have a clear data protection framework will find it easier to extend that into AI guidance. Schools that do not yet have their data house in order will need to address that first, or risk writing a policy that cannot be enforced.

The government’s AI tutor proposal attracted significant scepticism in the survey, with only 14% of teachers in agreement. For primary leaders, this is worth watching rather than acting on immediately. The programme is not yet in place, and the evidence base for AI-powered tutoring at primary level remains thin. What matters now is what is already happening in your school, not what might arrive from Whitehall in twelve months.

What we’re thinking about at iTCHYROBOT

At iTCHYROBOT, we see the policy gap directly when we work with schools on their digital infrastructure. The schools in the strongest position are not necessarily those using the most AI tools, but those that have made deliberate choices about what they use and why. That clarity feeds through into everything from how they communicate with parents about technology to how they configure permissions in the systems we help them build. Schools without that clarity tend to find themselves making reactive decisions rather than considered ones.

The question worth sitting with is this: if a member of your staff used an AI tool this week in a way you would not have approved, would you know about it? If the answer is no, that is your starting point.