The Unfiltered Impact of AI & KCSIE Compliance on Schools in 2025


In 2025, schools in England face a convergence of new safeguarding requirements and data protection laws as artificial intelligence (AI) becomes increasingly embedded in education. The latest update to Keeping Children Safe in Education (KCSIE 2025) explicitly addresses generative AI, signalling that AI tools and content now fall under the umbrella of online safety in schools. At the same time, the UK’s new Data (Use and Access) Act 2025 strengthens obligations around children’s data and online services.

For school leaders, safeguarding leads, IT directors, and data protection officers, the message is clear: we must update our people, policies, and processes to keep pupils safe and comply with the law in an AI-powered world. This blog post unpacks the implications of these developments and explains how 9ine’s comprehensive platform can help schools meet their statutory and legal obligations.

KCSIE 2025 – New AI Safeguarding Expectations for Schools

KCSIE 2025 (statutory guidance for schools in England) has, for the first time, made explicit reference to AI in the context of child safeguarding. In particular, paragraph 143 now directs schools to the Department for Education’s “Generative AI: Product Safety Expectations” – guidance which explains how filtering and monitoring requirements apply to the use of generative AI in education. In effect, KCSIE is telling schools to understand and implement the DfE’s expected safeguards for AI tools, ensuring that any AI-based educational technology (EdTech) used is appropriately filtered, monitored, and integrated into your school’s broader safeguarding systems.

This inclusion is significant. It means that AI-generated content and tools are firmly within scope of online safety duties. Governing bodies and proprietors are expected to review the DfE’s AI product safety guidelines, discuss with IT staff how to apply them, and make sure that insights from AI tools (e.g. flags from AI content filters) are fed into the school’s safeguarding processes for early intervention. Schools should perform risk assessments to determine what filtering and monitoring of AI is necessary (as part of their Prevent duty and existing filtering standards). In practice, this might involve ensuring that any AI chatbots or generative tools pupils use have content moderation enabled, that logs of AI interactions are reviewed for any signs of risk, and that staff remain “the human in the loop” to supervise AI use. 
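To make this concrete, here is a minimal, hypothetical sketch (in Python) of how an IT team might triage an exported AI-chat log so the DSL can review flagged interactions. The export format, field names, and risk categories are assumptions for illustration only – they are not part of KCSIE or the DfE guidance, and a real workflow should follow your vendor’s documentation and your school’s safeguarding policies.

```python
# Hypothetical sketch: triage an exported AI-chat log for DSL review.
# Assumes the tool can export interactions as JSON with 'pupil', 'timestamp',
# 'prompt' and vendor-supplied 'moderation_flags' fields - check your own
# vendor's actual export format before relying on anything like this.
import json
from pathlib import Path

RISK_CATEGORIES = {"self_harm", "violence", "sexual_content", "extremism"}  # illustrative only

def triage_log(export_path: str) -> list[dict]:
    """Return interactions whose vendor moderation flags fall in a risk category."""
    records = json.loads(Path(export_path).read_text(encoding="utf-8"))
    needs_review = []
    for record in records:
        flagged = set(record.get("moderation_flags", [])) & RISK_CATEGORIES
        if flagged:
            needs_review.append({
                "pupil": record["pupil"],
                "timestamp": record["timestamp"],
                "categories": sorted(flagged),
                "excerpt": record["prompt"][:200],
            })
    return needs_review

if __name__ == "__main__":
    # A member of staff, not the script, decides what happens next -
    # this is the "human in the loop" KCSIE expects.
    for item in triage_log("ai_tool_export.json"):
        print(f"{item['timestamp']}  {item['pupil']}  {', '.join(item['categories'])}")
```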

The challenge? Many EdTech products with AI capabilities currently do not meet all of the DfE’s product safety expectations, and there is no easy way for a school to centrally consolidate the filtering and monitoring information from multiple AI tools and platforms. However, schools cannot ignore these obligations or ban AI altogether; the benefits of AI (from personalised learning to administrative efficiency) are real, but they must be balanced with robust safeguards. KCSIE stops short of making schools solely responsible for fixing gaps in vendors’ AI safety features, but other DfE and ICO guidance already requires schools to assess AI risks proactively. In other words, KCSIE assumes schools have begun evaluating the risks of AI in their EdTech and taken steps accordingly. The onus is now on each school to integrate AI risk management into its existing safeguarding framework, bridging any gaps between what AI tools provide and what pupils need for protection.
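As an illustration of what that consolidation could look like in practice, the sketch below normalises safeguarding flags from two invented vendor export formats into a single register the safeguarding team can review. It is a simplified, assumption-laden example rather than a description of any particular product or file format.

```python
# Hypothetical sketch: consolidate filtering/monitoring signals from several
# AI tools into one register. Both vendor export shapes below are invented.
from dataclasses import dataclass
from datetime import date

@dataclass
class SafeguardingSignal:
    tool: str        # which EdTech product raised the flag
    pupil_ref: str   # internal pupil reference rather than the raw name
    category: str    # normalised category, e.g. "bullying", "self-harm"
    raised_on: date
    reviewed: bool = False

def from_vendor_a(row: dict) -> SafeguardingSignal:
    # Vendor A (invented) exports {"student_id", "flag_type", "date"}
    return SafeguardingSignal("Vendor A chatbot", row["student_id"],
                              row["flag_type"].lower(), date.fromisoformat(row["date"]))

def from_vendor_b(row: dict) -> SafeguardingSignal:
    # Vendor B (invented) exports {"user", "category", "created_at"}
    return SafeguardingSignal("Vendor B writing assistant", row["user"],
                              row["category"].lower(), date.fromisoformat(row["created_at"][:10]))

register = [
    from_vendor_a({"student_id": "P1234", "flag_type": "Bullying", "date": "2025-09-10"}),
    from_vendor_b({"user": "P5678", "category": "Self-Harm", "created_at": "2025-09-11T08:30:00Z"}),
]

# One consolidated list the DSL can work through, whichever tool raised the flag.
for signal in sorted(register, key=lambda s: s.raised_on):
    print(signal)
```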

Digital literacy and AI awareness are also emerging as key themes. KCSIE 2025 notes that children should be taught how to keep themselves and others safe, including online in the context of AI. Before schools can effectively teach pupils about safe and ethical AI use, they must first train their staff. Teachers and support staff need to understand both the general principles of using AI safely and the specific AI tools being adopted at their school. This includes awareness of risks like AI-generated misinformation, biases, or even AI-facilitated grooming attempts, so that educators can confidently guide pupils and remain vigilant. In short, meeting KCSIE’s expectations means building an informed human safeguard around the technology – your staff are that critical human layer.

The Data (Use and Access) Act 2025 – Prioritising Children’s Data and Online Safety

Hand-in-hand with KCSIE’s AI guidance, the Data (Use and Access) Act 2025 (DUAA) updates UK data protection law (the UK GDPR and the Data Protection Act 2018) with a strong emphasis on protecting children online. For any online service likely to be accessed by children (which includes most educational apps and platforms), the law now requires providers to account for children’s needs and vulnerabilities when handling personal data. This means EdTech vendors, and, by extension, the schools using their services, must ensure that data collection and usage practices consider the best interests of children at different ages. For example, providers should think about how to protect and support child users, recognise that children have less understanding of data risks, and avoid one-size-fits-all approaches in favour of age-appropriate experiences.

The UK’s Age Appropriate Design Code (AADC, also known as the Children’s Code) is explicitly linked to the DUAA through an amendment to Article 25 of the UK GDPR. This amendment requires that “the controller must take into account the children’s higher protection matters” when providing information society services (such as EdTech). These higher protection matters are defined as “how children can best be protected and supported when using the services”.

The ICO, the UK’s data protection regulator, has issued the Age Appropriate Design Code as a statutory code on the basis that children require enhanced protection. The connection broadly works like this (noting that I am not a lawyer, so some technicalities may be missed): because Article 25 of the UK GDPR has been updated to reflect the need for higher protection for children, and the regulator has already issued the AADC, the ICO can use the AADC as the benchmark for determining what constitutes “higher protection matters”. As a result, non-compliance with the AADC could amount to an infringement of the UK GDPR.

The DUAA also introduces or reinforces specific measures around AI and automated decisions. Notably, it requires “meaningful human involvement” in important decisions made by AI. Any significant decision about an individual that is made solely by algorithms (for example, an AI system automatically grading exams or flagging at-risk pupils) now comes with stricter rules. Such automated decisions cannot use sensitive personal data (such as health, ethnicity, or religion) unless one of a few narrow conditions is met. Even when automated decisions are allowed, schools and EdTech providers must implement safeguards if those decisions have a legal or similarly significant effect on a person’s life. These safeguards include informing individuals that AI was used to make a decision about them, allowing them to challenge or express concerns about that decision, providing an option to have a human review the outcome, and ensuring there is a way to appeal or contest the decision. In practice, a school deploying AI (say, a system that analyses pupil performance to recommend interventions) should be prepared to override or adjust any AI-driven decision if a pupil, parent, or staff member raises an issue – the final say cannot rest with the algorithm alone.
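As a rough illustration of those safeguards (not a statement of what the DUAA technically prescribes), the hypothetical record below captures the points a school would want to be able to evidence: that the individual was informed, that they could object, and that a named person reviewed and could override the outcome. All names and fields are invented for the example.

```python
# Hypothetical sketch of a record for a significant AI-assisted decision,
# capturing the DUAA-style safeguards discussed above. Illustrative only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AutomatedDecisionRecord:
    subject: str                    # pupil or staff member the decision is about
    decision: str                   # e.g. "recommended a literacy intervention"
    made_by_system: str             # which AI/EdTech tool produced the decision
    subject_notified: bool = False  # safeguard 1: tell the individual AI was used
    objection_raised: bool = False  # safeguard 2: they can challenge it
    human_reviewer: Optional[str] = None   # safeguard 3: a named person reviews it
    final_outcome: Optional[str] = None    # safeguard 4: outcome after review/appeal

    def request_review(self, reviewer: str) -> None:
        """The algorithm never has the final say - a person can always take the decision over."""
        self.objection_raised = True
        self.human_reviewer = reviewer

record = AutomatedDecisionRecord(
    subject="Pupil P1234",
    decision="Flagged for additional maths support",
    made_by_system="Adaptive learning platform (invented example)",
)
record.subject_notified = True           # the pupil and family are told AI informed the decision
record.request_review("Head of Maths")   # a teacher reviews and can override the recommendation
record.final_outcome = "Confirmed after teacher review"
print(record)
```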

For schools, a crucial implication of the DUAA is the need to have staff empowered as the “humans in the loop.” It’s not enough to have technical safeguards; your teachers, administrators, and safeguarding personnel need the knowledge and confidence to interpret and, if necessary, intervene in AI-driven processes. This means investing in training around AI, data privacy, and cybersecurity so that staff can effectively oversee automated systems and respond to any challenges or objections. The DUAA explicitly ties into this by highlighting the importance of human involvement, a reminder that compliance is not just a technical checkbox, but also an educational effort for those operating the systems.

Alongside new responsibilities, the DUAA does offer some relief in administrative burden for schools. For example, it clarifies that when responding to Subject Access Requests (SARs) – where individuals request all the data held on them – schools are only expected to conduct “reasonable and proportionate” searches. If information is stored in inaccessible archives or would require disproportionate effort to retrieve, schools may be justified in narrowing or even refusing the request, a practical recognition of the resource constraints many schools face. If you’re dealing with a SAR and want to reduce the time and effort usually required, our consultancy team can help.

The DUAA also expands the scenarios in which schools can rely on legitimate interests as the legal basis for processing personal data without consent. It specifically names safeguarding children as a legitimate interest that does not require the usual balancing test between the school’s aims and the risk to an individual’s privacy. In other words, if you need to process pupil data to protect a child’s welfare or prevent harm, the law explicitly acknowledges that interest – giving schools more confidence to act swiftly to protect pupils’ safety. (Other helpful examples include allowing data sharing with law enforcement or using data in emergencies without unnecessary delay.)

Bottom line: Between KCSIE 2025 and the DUAA (which updates the UK GDPR), schools are expected to raise their game on both safeguarding and data privacy in the context of AI. You must ensure the EdTech tools in use not only deliver educational value but also have the necessary protections for children’s data and wellbeing. This means vetting vendors for compliance (do they adhere to the Age Appropriate Design Code and the DfE’s AI safety standards?), updating policies (e.g. incorporating AI usage into your safeguarding and acceptable use policies), and, most importantly, equipping your staff and pupils with the understanding to navigate AI safely. It’s a lot to manage, especially when you likely use dozens of apps and platforms across teaching, learning, and administration. This is where the right technology platform can make a critical difference. Put simply, your school needs the 9ine platform to navigate today’s complex world of compliance. Built with schools, for schools – read on, then book a demo or trial!

How 9ine Helps Meet KCSIE and DUAA Obligations

Given the breadth of new requirements, schools may wonder how to practically track compliance and implement best practices for online safety, AI oversight, and data protection. A comprehensive solution is needed – one that brings together vendor due diligence, staff training, and clear guidance on using technology in the classroom. 9ine’s platform is designed to do exactly this, helping schools respond with confidence to KCSIE 2025, the Data (Use and Access) Act, the DfE’s AI guidance, the DfE’s Generative AI: Product Safety Expectations, and the wider requirements of the UK GDPR. Key components include:

  • Vendor Management: A centralised system to assess and monitor the compliance of all your EdTech vendors. 9ine’s Vendor Management module provides a traffic-light dashboard for each app or platform, showing independent assessments of its safeguarding, data privacy, AI, and cybersecurity risks. It flags which tools align with the DfE’s filtering and monitoring standards and which may have gaps. This intelligence saves schools hundreds of hours of manual review and helps ensure you’re only using EdTech that meets required standards or that you have mitigations in place. If inspectors (e.g. Ofsted or the ISI) ask how your school manages AI risks, you can literally show them the Vendor Management dashboard, demonstrating a rigorous, ongoing evaluation of each tool’s compliance. In short, Vendor Management lets you easily identify risks and take action – whether that means engaging a vendor for improvements, configuring the tool safely, or ultimately choosing an alternative.
  • Application Library: Even when using approved, vetted apps, teachers need guidance on how to use them safely and effectively. 9ine’s Application Library is a central, searchable library of all EdTech in use at your school. For each application, staff can find “how to” guides and best practices for classroom integration, alongside crucial information on safeguarding settings, age-appropriate usage, data privacy, AI features, and any rules for pupil use. This means a teacher preparing to use a new AI-powered tool can quickly check the Library to understand, for example, whether the tool is age-restricted, whether it allows pupils to chat with an AI, what privacy considerations to note, and how to enable its safety features. By providing this one-stop knowledge base, the Application Library empowers educators to be the ‘human firewall’ protecting pupils – they have the awareness to keep children safe online while leveraging innovative apps. It also helps school leaders standardise and control which apps are used and how: if a tool has known risks, guidance in the Library ensures everyone knows how to mitigate them. (As a bonus, schools often discover through the Library where there is duplication of tools or unused apps, helping to eliminate wasteful subscriptions and streamline procurement.)
  • Academy LMS – AI Pathway: Compliance is not just about technology and policies, but also about people. The Academy LMS is 9ine’s online learning platform offering courses and certification pathways for school staff in areas like AI, data protection, and cybersecurity. The AI Pathway includes over 20 courses on AI in Education, ranging from introductory concepts to advanced modules on AI ethics and safeguarding. There are tailored courses for different roles – from teachers and support staff to leaders like Designated Safeguarding Leads (DSLs) or Data Protection Officers. By enrolling your team, you ensure that all staff develop digital literacy and AI awareness, fulfilling the spirit of KCSIE’s guidance that staff should be trained to keep children safe online. Teachers learn how to use AI tools responsibly in teaching, how to spot AI-related risks (like deepfake content or AI-facilitated plagiarism), and how to uphold data privacy in day-to-day tasks. Specialist staff gain deeper expertise to oversee AI and data protection strategy. Ultimately, Academy LMS helps your school maintain the “meaningful human involvement” envisioned by the DUAA – because your people will have the knowledge to confidently interpret, intervene, and guide where AI is in play.

These three components work in harmony as a unified platform. Together, they cover the critical aspects of governance, operational guidance, and capacity-building that schools need to navigate AI and data protection requirements. As noted in a recent independent review, schools using 9ine’s Vendor Management, Application Library, and Academy LMS are benefitting from an ecosystem built to fill exactly these emerging gaps – from understanding the AI functions embedded in EdTech, to ensuring vendors comply with laws, to delivering AI literacy training for staff.

 

Book a meeting to learn more
