
The DfE is building AI tutors for classrooms. Has your school got a policy yet?
The Department for Education has opened bidding for its AI Tutoring Pioneer Group, inviting up to eight EdTech and AI companies to design and test classroom-ready tutoring tools. Each selected organisation receives £300,000 to co-design systems with teachers, starting this summer. A national rollout is planned from 2027, with the stated aim of reaching 450,000 disadvantaged pupils. The announcement focuses on Year 9 and Year 10. The policy implications, however, reach much further.
The programme requires all tools to meet the DfE’s Generative AI Product Safety Standards, a framework currently being developed by DSIT’s Incubator for AI alongside hundreds of teachers. This is the government’s first formal move to define what responsible AI deployment in English classrooms actually looks like. The standards it sets will shape procurement, inspection expectations, and school policy across all phases of education, including primary.
Why this matters to primary schools
The most striking context for this announcement sits not in the government’s own press release but in survey data published separately by the National Education Union. Their 2026 State of Education report, drawn from responses from more than 9,000 teachers in English state schools, found that 49 per cent of schools have no AI policy at all, and 66 per cent have no specific policy covering how pupils use AI. That is a substantial governance gap, and it applies as much to primary schools as to secondary.
Primary school leaders might reasonably feel that AI tutoring tools aimed at Year 9 and 10 are a secondary concern. The risk in that thinking is treating the policy question as someone else’s responsibility. The DfE is now actively setting the terms for what safe AI in classrooms means. That framework will inform what Ofsted expects, how procurement guidance is written, and eventually what parents ask about. Arriving at those conversations without a considered position is not neutral. It is a gap.
The NEU data also shows that 76 per cent of teachers now use AI tools in their day-to-day work, up from 53 per cent the previous year. Much of this is resource creation and lesson planning. School leaders may not have visibility over which tools are being used or how. If staff are already using AI and the school has no policy covering it, that is a governance question worth addressing now rather than later.
49 per cent of schools have no AI policy at all, for staff or pupils.
National Education Union, State of Education: AI, 2026
What this could mean in practice
The immediate practical question for primary headteachers and school business managers is not which AI tools to adopt. It is whether the school has any agreed position on AI at all. That is a more achievable starting point than it might appear. A simple staff protocol covering which AI tools are permissible, how outputs should be verified, and what pupils are told about AI in their learning environment addresses most of the ground that currently matters.
The Pioneer Programme is designed around teacher supervision throughout. That framing is worth noting: the government is not building tools to replace teacher judgement but to extend it. For primary schools thinking about where AI fits in their existing systems, the question is not whether AI replaces anything. It is whether it is being used consistently, visibly, and with appropriate oversight.
From 2027, secondary pupils will start encountering formally approved AI tutoring tools as part of their normal school experience. Their younger siblings in primary will hear about it at home. Getting ahead of those conversations, with a straightforward, honest account of the school’s approach to AI, is worth doing on your own terms, before the question arrives urgently.
What we’re thinking about at itchyrobot
Much of the work we do with primary schools already sits in adjacent territory. When we connect a school’s Wonde data feed to its website, or build a Claude-based workflow for routine admin tasks, we are making decisions about where AI touches school data and how that use is supervised. Those decisions need a policy framework, and most schools we work with do not yet have one in place.
The DfE’s safety standards, when published, will be the natural starting point for any school reviewing its AI approach. Our view is that waiting for them before starting on policy means waiting too long. The core questions a sensible school AI policy needs to answer are not complicated: what tools are in use, by whom, for what purpose, and who is accountable. That is a one-page document most headteachers could draft this term. If yours does not exist yet, it is the most useful thing to produce before September.
Source: https://www.gov.uk/government/news/edtech-and-ai-companies-invited-to-help-build-safe-ai-tutoring-tools-for-disadvantaged-pupils