Beyond the Hype: How UAE School Leaders are Navigating the AI Wave

Written by 9ine | Feb 27, 2026 6:00:00 AM

Insights from the 2025 AI & Cyber Security Forum, Dubai

As Artificial Intelligence (AI) reshapes the educational landscape, heads of schools are facing a dual reality: the immense potential for operational efficiency and the rapidly evolving threats to safeguarding and integrity.

During the Heads’ Panel Discussion, four prominent UAE school leaders shared their candid experiences from the front lines. The panel featured Mark Leppard MBE (Head Teacher, BSAK Abu Dhabi), Tracy Moxley (Principal, iCademy), James McDonald (Principal, Wesgreen International School), and Jim Stearns (Deputy Director of Schools, Victoria International School).

Here are the key takeaways from their discussion on how schools are balancing innovation with safety, and how leaders can manage the risks involved.

1. AI in the Engine Room: The Operational "Win"

While much of the global conversation focuses on students using ChatGPT, the panel revealed that the immediate "win" for schools lies in administration and reducing teacher workload.

The panel shared operational use cases for AI, with specific references to creating engaging resources, refining communications, and analysing data. It was felt that AI brings significant opportunities to support efficiency and greater productivity across all aspects of school life.


2. The Integrity Challenge: Moving from Banning to "Guardrails"

A major concern for all panelists was academic honesty. Schools are moving from banning AI outright to managing it with strict "guardrails". Specific tools can help identify students who have plagiarised using AI, but the panel agreed the focus should be on rehabilitation rather than punishment alone: students found plagiarising complete a course to understand the infraction, and their parents are introduced to the same material. While students know how to use AI, they often lack the skills to use it effectively to support learning. The panel's advice was to focus on ethical use cases and to move students away from simple generation toward prompts that expand their learning and understanding of a topic.


3. The Dark Side: Deepfakes and Safeguarding

Perhaps the most sobering part of the discussion revolved around the emergence of "deepfakes" as a serious safeguarding issue. Panelists shared examples of students creating inappropriate fake videos of staff, incidents that required a serious disciplinary response, and of students using AI-generated content to bully and harass other students. These incidents highlight a new frontier in cyberbullying and reputation management, with the added risk that viewers will believe the fabricated content is real. One thing is clear: these tools can cause real harm to students and staff, and they must be managed carefully.


4. The Governance Gap: Vetting the Flood of Tools

Despite the excitement around AI, the panel repeatedly returned to the theme of risk and governance. While many schools are open to using AI, few have yet formalised the necessary "guardrails" in written policy. A core challenge raised across schools is the lack of resource to vet the hundreds of new AI tools arriving on the market and being requested by teachers. How can schools foster innovation and keep pace with technology trends while maintaining compliance?


Solving the Vetting Crisis: The Role of Vendor Management

As discussed by the panel, schools are under increasing pressure to ensure that the EdTech and AI tools they adopt are safe, compliant, and ethically sound. The sheer volume of applications makes manual vetting nearly impossible for busy school leaders.

This is where 9ine’s Vendor Management module becomes an essential ally for school leaders.

Relevant to the "guardrails" and cyber security checks discussed in the presentation, 9ine’s platform provides a streamlined solution to the complex problem of EdTech governance.

How 9ine Solves the Problems Raised by the Panel:

  • Quick and Comprehensive Due Diligence: Instead of manually reviewing privacy policies and terms of service from every EdTech provider, 9ine’s platform allows schools to vet vendors with comprehensive due diligence across privacy, AI, cyber security, and safeguarding. This directly addresses the "risk analysis" needs discussed.
  • The Vendor Library: Schools gain access to a library of over 250 pre-assessed EdTech vendors. This means you can instantly see if a popular AI tool meets the necessary legal and quality management standards without starting from scratch.
  • Compliance with UAE Law: The platform explicitly captures information on where personal data is transferred and the security measures of the vendor. This is crucial for complying with local regulations like the PDPL and ADEK digital strategy rules.
  • Demonstrating Accountability: 9ine’s tool allows leaders to demonstrate accountability for digital compliance and child protection, ensuring that teachers only use tools that have been assessed as safe.

Conclusion

Ignoring AI is not an option; leaders must educate themselves or risk falling behind in technological innovation. By combining forward-thinking curriculum changes with robust governance tools like 9ine’s Vendor Management, schools can embrace the efficiency of AI while protecting their community from its risks.


For more information on how to streamline your EdTech vetting process and secure your school’s digital environment, visit 9ine's Vendor Management page.