AI in Education: The Space Race for AI Literacy in Schools
10 min read
9ine | May 15, 2025 11:36:46 AM
As EdTech vendors create new AI products and introduce AI features to existing ones, schools need to review the compliance of the tools they use to avoid risks and issues. The summer is the perfect time for an EdTech audit, so that schools can enter the new academic year with a safe, secure and compliant EdTech ecosystem. This article takes you through what you need to do to conduct one.
Education technology vendors are in the midst of rapid AI integration. In addition to launching entirely new AI products, many big EdTech players are adding generative AI features to platforms already popular in schools. For example, Google's Workspace for Education now offers AI-powered writing assistance (via its Gemini model) in familiar apps like Docs and Slides, and Khan Academy has introduced an AI tutor called Khanmigo for teachers and students. These enhancements promise to help educators with tasks like lesson planning, grading, and student support.
But with this unprecedented speed of AI tool development in education, schools need to adapt continuously. Education experts have noted that while these developments are exciting, they make the need for teacher training on AI even more urgent. In other words, when an LMS or classroom application that a school has used for years suddenly adds an AI chatbot or automation feature, educators need to quickly understand how it works and how to use it appropriately. This new reality is pushing schools to stay informed and agile in how they manage their EdTech tools and their relationships with EdTech vendors.
Alongside their benefits, AI-driven features introduce fresh privacy, safeguarding, and cybersecurity risks that schools must consider. Many of these risks are extensions of familiar EdTech issues, but AI operates at a larger scale and speed, creating new risks as well as exacerbating existing ones. A recent case in Los Angeles saw an AI-powered education app abruptly shut down, leaving parents worried about what happened to the student data it had collected, and with questions for the school about how it had vetted the vendor and what safeguards it had put in place. This scenario demonstrated how quickly trust can erode when it is unclear whether a school has assessed and managed the risks of using a third-party EdTech vendor's AI platform.
Safeguarding and child protection are another concern. AI assistants might inadvertently produce inappropriate content, or fail to filter harmful material, exposing students to potential harm. There have also been instances of students misusing generative AI provided by the school to create deepfakes and abusive content, underscoring the need for monitoring and education on ethical use.
Cybersecurity risks are also heightened, as every new AI integration can expand a school’s digital attack surface. An unvetted AI plugin or an outdated version of an app with AI could introduce vulnerabilities. Moreover, if a school isn’t fully aware of which apps are handling student data (and how), it’s difficult to ensure compliance with regulations and security standards. Unmonitored or unapproved tools, especially those no longer in use, can become security vulnerabilities or lead to compliance breaches. In short, when your classroom software evolves, your risk assessments and safeguards must evolve with it.
Given the rapid pace of change, it's clear that schools need to review their EdTech portfolios on a regular basis. In fact, many experts recommend treating this as an annual (or even semi-annual) exercise, and the summer break is an ideal time. During summer, IT administrators and digital learning leads can take stock of the past year's changes without the daily pressures of classes. Tech giants effectively hand educators "major homework for the summer" to adjust lesson plans and methods for AI features that arrive in their classrooms at the start of every new academic year. Dozens of new or updated EdTech products may need evaluation before the new school year, and a routine summer review ensures that no major update or new tool catches the school off-guard in September.
For school leaders and IT administrators looking to conduct an EdTech portfolio audit, here are some practical steps and considerations:
By following these steps, schools can complete a thorough audit of their EdTech ecosystem and ensure that it is both effective and safe.
To simplify the ongoing management of multiple vendors and tools, 9ine's Application Library gives all staff access to a central, searchable library of all approved EdTech in the school. The library provides a dashboard for both administrators and educators to stay informed about each tool's features, risks, and usage guidelines, and contains everything staff need to know about any AI in use, as well as privacy, safeguarding and cyber risks. With easy-to-add 'How to' and 'Help' guides, Application Library becomes a single, central digital resource that teachers can use to self-educate on using EdTech tools effectively and safely. By proactively providing the right resources and warnings, Application Library guides staff in adhering to school policies on data security and student safety while using EdTech.
It also helps teachers get the most instructional value out of the technology, as Application Library can be used to collect direct feedback from educators and make it visible to others. For example, teachers can leave reviews or ratings on the apps they use, describing how a tool impacted their classroom and any challenges they encountered. This built-in feedback loop means that before another teacher tries a new app, they can learn from colleagues' experiences. In essence, the platform becomes a knowledge-sharing hub, capturing collective wisdom about what works best for teaching and learning.
Another benefit of Application Library is the ability to identify duplicative or underutilised tools at a glance. When all apps are listed in one place with information on their purpose and usage, it’s easier for administrators to spot overlaps. For example, if two different departments requested similar quiz applications, the duplication will be visible and the school can take action. By auditing EdTech usage through the library, schools have eliminated costly overlaps and ensured each tool is actually needed. This streamlining not only saves money but also reduces the burden on IT teams and data protection officers who must support and secure these tools. Application Library also includes a workflow for the request of new EdTech for staff to follow, powerful search filters (so teachers can find tools by subject, grade level, or specific needs), and integration with privacy and compliance workflows. All of this turns what could be an overwhelming sprawl of apps into an organised ecosystem.
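The overlap-spotting idea above can be sketched in a few lines of code. This is a purely illustrative example, not part of Application Library: the inventory entries, tool names and categories below are invented for demonstration. The approach is simply to group a school's app inventory by purpose and flag any category served by more than one tool.

```python
from collections import defaultdict

# Hypothetical inventory: (tool name, category, requesting department).
# All entries are illustrative, not drawn from any real school system.
inventory = [
    ("QuizWhizz", "quiz", "Science"),
    ("TestMaker", "quiz", "Maths"),
    ("DocPad", "writing", "English"),
]

# Group tools by category of purpose.
by_category = defaultdict(list)
for name, category, department in inventory:
    by_category[category].append((name, department))

# Any category with more than one tool is a candidate overlap
# worth reviewing in the audit.
overlaps = {cat: tools for cat, tools in by_category.items() if len(tools) > 1}
print(overlaps)  # {'quiz': [('QuizWhizz', 'Science'), ('TestMaker', 'Maths')]}
```

In practice a central library does this consolidation for you, but the principle is the same: once every tool is recorded in one place with its purpose, duplication becomes a simple lookup rather than a cross-department investigation.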
Once you have identified the EdTech your school is using and wants to assess for compliance, you are likely to have a long list, which will take significant time and expertise to work through. Vendor Management removes the pain, and the time, from evaluating and vetting third-party vendor contracts, privacy notices, information security policies and other compliance documents. It also provides a thorough, 'traffic light' based approach that clearly flags vendor privacy, cyber, AI, and safeguarding risks. Vendor Management supports you in demonstrating to parents, staff and regulators how you effectively evaluate and manage the technology you choose to deploy.
Even with the best tools and policies in place, people remain at the heart of safe and effective EdTech use. It's essential to invest in ongoing training and professional development so that teachers, IT staff, and administrators stay up to date on AI in education. Many schools have been slow to prepare teachers for the new AI-driven learning environment, leading to knowledge gaps that can create inconsistent or risky practices. For example, a well-meaning teacher might start using a new AI feature without realising they're violating privacy rules, simply because no one informed them.
To help improve AI literacy amongst your staff, 9ine Academy LMS is an on-demand training and certification platform that enables schools to enrol individual staff members or entire groups in comprehensive training courses, modules, and assessments, featuring in-built quizzes for knowledge checks. Our AI Pathway is your school's learning partner for AI ethics and governance. With over 20 differentiated course levels, you can enrol all staff in an introductory course to AI, then enrol those with greater responsibility in Intermediate and Advanced courses. There are also specialist courses for AI in Safeguarding, Child Protection and Technology. Schools can also subscribe to learning pathways in Privacy, Cyber, Tech Operations and Risk Management, or purchase courses on a per-person, per-course basis. We are currently offering free trials for up to three members of a school's leadership team, so contact us if you would like to take advantage of this, or have any questions on Academy LMS.
To improve AI literacy in both staff and students, we have also created Turing Trials, our free-to-download card game designed to help schools explore AI in education in an engaging, interactive way. It supports discussions on AI's impact in education, including on data privacy and child safeguarding. It is perfect for school administrators, educators, and technical staff, and can be used in workshops to increase AI literacy across the whole school. It enables colleagues to collaborate, navigate real-life AI scenarios and make strategic decisions that affect students, staff and school operations. Turing Trials is not just for staff, though: it can be used with students to open conversations about the opportunities and risks of AI, exploring their thoughts and feelings about how it is, and should be, used in the school. You can download it for free here, and look out for the scenario where the Vendor Management Process wasn't followed!
AI is here to stay in the world of education technology, and its influence on the classroom will only grow. For school leaders, digital learning leads, and IT administrators, the challenge is to embrace innovation without compromising student safety, privacy, or educational quality. This requires a proactive and structured approach. By regularly reviewing the school’s EdTech portfolio and staying alert to vendor changes, schools can catch issues early, before a minor app update turns into a major headache. By involving educators in the process (through feedback and training), schools ensure that technology adoption remains grounded in classroom realities and instructional needs. Also, by leveraging tools like an Application Library along with solid vendor management practices, schools gain visibility and control over their burgeoning array of AI-augmented apps. For school leaders and administrators, now is the time to put these plans into action. Plan an EdTech audit before each new school year, establish clear channels for communication with vendors and your own teachers, and make use of platforms that simplify oversight.
9ine equips schools to stay safe, secure and compliant. We give schools access to all the expertise they need to meet their technology, cyber, data privacy, governance, risk & compliance needs - in one simple to use platform. For additional information, please visit www.9ine.com or follow us on LinkedIn @9ine.