
The Dark Side of Innovation: Uncovering the Hidden AI Risks in Schools


Closing the session at the St. Regis Downtown, Mark Orchison, CEO of 9ine, took the stage to address the room of school leaders. Following discussions on operational efficiency and curriculum changes, the conversation shifted to focus on the emerging AI-related threats that 9ine is currently tracking globally.

Moving beyond the hype of chatbots, this presentation exposed the specific legal and safeguarding dangers lurking in school digital ecosystems. Here are the critical takeaways from the session.

 

1. The New Face of Extortion: AI and Student Images

Perhaps the most alarming trend highlighted is a sophisticated new form of cyber extortion targeting schools. Attackers are systematically downloading images of children from school websites and social media channels.

Using AI and reverse image lookups, perpetrators identify the children and link them to their parents. They then generate "deepfake" abuse images or videos of the students. The attackers demand a ransom from the school to stop the publication of these materials on the dark web. If the school does not engage, the attackers escalate by sending the horrific material directly to the parents and the children themselves.

This is not just a criminal issue; it is a compliance issue. In the UAE, schools that publish unrestricted images of children on social media are potentially failing to comply with the Personal Data Protection Law (PDPL). By publishing these images, the school accepts the risk on behalf of the child, and it is the school's responsibility to understand what those risks mean for children's rights and freedoms.

 

2. The "Trojan Horse" of EdTech Terms & Conditions

Building on the earlier discussion about vetting apps, data was shared from 9ine’s recent assessment of 50 leading AI tools.

A common "legal sleight of hand" used by vendors was revealed. Schools often sign contracts believing the vendor acts as a "processor" (simply handling data on the school's behalf). However, when a student or teacher logs in, they are often presented with a separate set of Terms and Conditions. By clicking "agree," the user unknowingly grants the vendor "controller" rights, allowing them to use that personal data to train their AI models.

This means that while a school leader thinks their data is ring-fenced, the vendor may actually be harvesting it to build their product—a massive privacy violation.

 

3. The 11% Danger: Stranger Danger in EdTech

Of the 27 AI products 9ine fully assessed, a startling 11% facilitate unmonitored contact between users.

These are not just students talking to students; these platforms often allow adults to engage with children. This is the warning that comes with rushing to adopt the latest "shiny" AI tool: schools can inadvertently open digital doors that allow strangers to contact their students.

 

4. Bias and the "Wild West"

The current sentiment is that the AI landscape is the "Wild West", and many of today's vendors will not exist in 24 months. Furthermore, 9ine’s assessments found that many models are "inherently biased" and engineered with "persuasive design" to foster dependency, ensuring students stay hooked on the platform rather than engaging in deep learning.

 

Moving Forward: Gamifying Governance with "Turing Trials"

To help leadership teams navigate these complex scenarios without being overwhelmed, the session introduced "Turing Trials", a card game developed by 9ine. Schools are encouraged to download the game and play it with their teams to raise awareness of the considerations around using AI.

Designed for professional development, the game presents school leaders with hypothetical scenarios based on real-world risks, like AI bias or information breaches. It forces leadership teams to role-play their response, testing their resilience and decision-making in a safe environment before a real crisis hits.

 

Conclusion

Innovation cannot come at the cost of safety. Whether it is restricting image sharing to comply with the PDPL or rigorously vetting vendor contracts to prevent data harvesting, schools must take a "security-first" approach to AI.

How 9ine Can Help

As discussed in the presentation, manual vetting is no longer sufficient. 9ine’s Vendor Management and Privacy modules allow schools to:

  • Identify Risks: See which vendors are acting as "controllers" and harvesting data.
  • Manage Compliance: Ensure alignment with the UAE's PDPL regarding student data and images.
  • Train Staff: Use tools like the Turing Trials and the 9ine Academy to build cyber-resilience.

For more on securing your school, visit 9ine's website. 
