
10 Things Heads Need to Know About AI


Key takeaways from the presentation by Steve Bambury, Head of Digital Learning and Innovation at JESS Dubai, at the 2025 AI & Cyber Security Forum, Dubai

This article explores the top 10 critical lessons for schools, sharing tangible advice for school leaders.

1. Governance Requires Sceptics, Not Just Evangelists

An effective AI strategy cannot be a solo mission. Establishing an "AI Steering Committee" to oversee the AI approach ensures alignment and input from across departments. Crucially, this group should include not only change evangelists but also people who are sceptical of AI or resistant to change. Considering all perspectives from the outset makes effective enterprise-wide integration far more likely.

 

2. Policy Must Be Agile and Accessible

Carving AI policies in stone is not recommended: the technology, laws, and curriculum are evolving too rapidly, so agility must be built in. While comprehensive policies still serve a purpose, few students or busy teachers will sit down to read thousands of words. Distilling long-form policies into accessible formats ensures everyone understands the core rules and can comply with them.

 

3. Less is More: Beware the "Bubble"

One of the most striking statistics from the presentation was the procurement funnel. Last year Bambury vetted 237 different AI applications. From those 237, four proceeded to pilot, and only two remain in use.

There’s an "AI bubble": many startups are flooding the market, and many of them are unlikely to exist in a few years. Schools should stick to major players like Microsoft and Google, which offer enterprise data protections ensuring school data isn't used to train public AI models.

 

4. Data is the New Oil, and AI is the Combustion Engine

AI magnifies data protection concerns, making it dangerously easy to fall foul of regulations.

Bambury cited a hack at a school in Scotland as a case study: 3.3 million personal data files were exposed on the dark web. Breaches like this illustrate the identity theft risks students face when schools fail to secure their data.

 

5. The Rise of "Double Bubble" Cyber Attacks

Schools are now prime targets for cybercriminals. Home Office statistics show educational institutions are 43% more likely to be attacked than other sectors. And there’s a "double bubble" threat: hackers not only hold the school to ransom but also threaten to release sensitive student data to parents, forcing families to pressure the school into paying.

 

6. Child Protection: The Tip of the Spear

While headlines focus on tragic stories of chatbots encouraging self-harm, AI risks are far broader, ranging from deepfake pornography to "membership inference", where AI deduces private information about a child based on their prompts.

Canva provides another recent case study: a tool generally trusted by schools generated inappropriate content during a test by a Dubai think tank. Although Canva fixed the exploit immediately, the message is clear: "Even the tools that we trust... you've got to be wary of it from a risk perspective."

 

7. Integrity is an Ethics Issue, Not a Tech Issue

Addressing fears of cheating, Bambury used the "pencil analogy": if a student stabs someone with a pencil, we don't ban pencils; we address the behaviour. He argued that academic integrity is an ethics issue, not a technology problem.

He also cautioned against relying on detection tools like Turnitin. In his own experiments, AI "humanisers" easily defeated detection software, while innocent students, particularly those with English as a Second Language (ESL), were at high risk of being falsely accused because their writing style lacks the "idioms and nuance" that detectors look for.

 

8. Tailor the Curriculum

JESS has rolled out a bespoke AI literacy curriculum from Year 4 to Year 13, covering three strands: Understanding AI, Ethical Use, and Controlling AI Systems. Bambury emphasised that this cannot just be a "computing" subject; it must be woven into pastoral care and other subjects to ensure students understand the real-world implications.

 

9. Preparing for a Disrupted Workforce

The job market is already shifting, with industries like law replacing entry-level roles with AI automation, and AI is likely to keep reshaping the workforce. Schools must therefore adapt their pathways to equip students for a world where careers no longer begin with administrative roles.

 

10. You Can’t Please Everyone

Survey data showed Bambury's staff were split exactly 50/50 on whether AI should be used for report writing. He concluded with a quote from Jurassic Park's Ian Malcolm (Jeff Goldblum): "We're so preoccupied right now... with the fact that we can do all this stuff, I think we need to throw the brakes on sometimes and think about whether we should be doing it."

 

Solving the Vetting Nightmare: How 9ine Can Help

How do you vet hundreds of EdTech apps to find the safe few?

Most school leaders do not have the time or specialised knowledge to manually audit hundreds of privacy policies, test for "enterprise data protection," and monitor for new exploits like the examples given above.

This is where 9ine’s Vendor Management and Application Platform becomes critical.

Streamlining the review process from "237 to 4" is possible with 9ine’s Vendor Management module, which automates this due diligence. It provides schools with a Vendor Library of over 250 pre-assessed EdTech vendors, meaning the heavy lifting of checking for GDPR compliance, data transfers, and security measures is already done for you.

9ine’s platform allows schools to conduct rigorous Processing Operations Assessments (POA) to see exactly where personal data is being transferred and what security measures the vendor has in place. This helps schools avoid the "bubble" companies that Bambury warned against.

9ine’s Application Platform supports this governance by acting as a central repository for approved tools. It allows teachers to see clearly which apps are safe, which have AI features, and what the specific safeguarding risks are, effectively digitising the "Golden Rules" so they are accessible to all staff.

Schools are 43% more likely to face cyber attacks than other sectors. By using 9ine to vet vendors and manage applications, school leaders can ensure they aren't leaving their "digital doors" open to the double-extortion attacks that threaten both school operations and student safety.


To learn more about how to secure your school's AI and data landscape, visit 9ine's Vendor Management page. 
