
‘Dad, how did they know more about me than I know about myself?’ Predicting a Child’s Future Before It Happens - AI, Data Aggregators, Schools and EdTech


When I recently became a parent, I did what many new parents do—I bought a baby monitor to watch over my child as they slept. What surprised me wasn’t the convenience, but the extent of data these seemingly simple devices collected. From audio and video feeds to sophisticated biometric monitoring, the intrusion into our home life was unsettling. Even more concerning was realising how this early-life data could potentially follow my child through their schooling and beyond.

The Hidden World of Baby Tech and Data Collection

Today's baby monitors aren’t just audio or visual tools. Many use AI to analyse breathing patterns, sleep cycles, and crying. Products such as the Nanit Pro or Owlet Dream Duo (whose privacy policy, at the time of writing, redirected back to its product page when clicked, making the data it collects even less transparent) continuously stream and store sensitive personal and health data—often to cloud servers managed by third-party companies. These advanced features, though reassuring for parents, mean that a digital profile of a child can begin practically from birth.

From Nursery to Classroom: The Digital Trail Expands

This digital trail, now commonplace in the modern world, doesn’t stop in infancy. As children enter education, the volume and variety of data collected about them expand dramatically. EdTech solutions track attendance, learning progress, behaviour patterns, and even emotions and attention via AI-powered surveillance systems. Companies such as Illuminate Education, whose tagline is ‘addressing the whole child’, have already suffered major data breaches exposing hundreds of thousands of students’ records (including children’s names, birthdays, and special-education and free-lunch statuses), illustrating the risk that accompanies widespread digital data collection in schools.

Predictive Analytics: Joining the Dots

The convergence of early childhood data with educational records and other personal data that schools collect enables sophisticated predictive analytics. AI algorithms can now predict educational outcomes, potential behavioural issues, and even future health or employment prospects based on patterns established in infancy and refined throughout childhood. Even seemingly anonymised data sets can be linked together by advanced AI, raising significant privacy concerns and ethical dilemmas.

Researchers have shown that supposedly anonymised data, like date of birth, postcode, and gender, can often uniquely identify individuals. This means that aggregated data—initially collected for harmless reasons—can become highly personal once linked with school records, medical histories, or behavioural profiles.
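The mechanics of this linkage are simple enough to sketch in a few lines. The example below is entirely hypothetical (both datasets and all field names are invented for illustration): it joins a supposedly anonymised record set against a separate, openly available dataset on the classic quasi-identifier triple of date of birth, postcode, and gender.

```python
# Sketch: re-identifying "anonymised" records by joining on quasi-identifiers.
# All data and field names here are invented for illustration only.

anonymised_health_records = [
    {"dob": "2019-03-14", "postcode": "SW1A 1AA", "gender": "F", "condition": "asthma"},
    {"dob": "2018-07-02", "postcode": "M1 4BT", "gender": "M", "condition": "allergy"},
]

# A separate, openly available dataset (e.g. a marketing list) with names attached.
public_register = [
    {"name": "Alice Example", "dob": "2019-03-14", "postcode": "SW1A 1AA", "gender": "F"},
    {"name": "Bob Example", "dob": "2018-07-02", "postcode": "M1 4BT", "gender": "M"},
]

def reidentify(anon_rows, named_rows):
    """Link two datasets on the (dob, postcode, gender) quasi-identifier triple."""
    index = {(r["dob"], r["postcode"], r["gender"]): r["name"] for r in named_rows}
    matches = []
    for row in anon_rows:
        key = (row["dob"], row["postcode"], row["gender"])
        if key in index:
            # The "anonymous" record now carries a real name.
            matches.append({"name": index[key], "condition": row["condition"]})
    return matches

print(reidentify(anonymised_health_records, public_register))
```

No machine learning is needed here: a plain dictionary lookup is enough, which is precisely why datasets stripped only of names offer such weak protection once a second dataset exists to join against.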

GDPR, CCPA, and the Global Privacy Challenge

Regulations like Europe’s GDPR and California’s CCPA attempt to protect children’s privacy rights. GDPR, in particular, demands explicit parental consent for data collection on minors by products of this kind and restricts profiling or automated decision-making affecting children. The CCPA similarly places controls on selling minors' data, although it provides fewer overall protections than GDPR.

These regulations are vital, but they face challenges. Enforcement is difficult, especially when data crosses borders, is aggregated, or when parents inadvertently consent to complex privacy terms without understanding the implications. I am a parent who is interested in, and well informed about, this space and these issues, and I still find it difficult to understand all of the data being collected and processed about my child (especially when legally required privacy notices are inaccessible); it is harder still for those who are not. This is why schools must educate the whole school community on the risks and issues of AI, privacy, and data capture, so that parents and guardians can make informed decisions about their child’s digital footprint: decisions whose rationale they would be confident explaining once the child becomes an adult and has to reflect on the impact this data has had, and will have, on them.

Real-world Consequences of Data Breaches

I’ve mentioned the Illuminate Education breach, but this was not in isolation and several other notable breaches highlight the real-world risks to schools and students:

  • Edmodo (2017): 77 million student, teacher, and parent accounts were breached, with the data published to a popular hacking forum and made freely available. Edmodo also went on to receive an FTC order in 2023 for collecting personal data from children without obtaining their parents’ consent and using that data for advertising
  • CloudPets (2017): over two million children’s voice messages leaked from unsecured servers, with vulnerabilities similar to those of the Cayla doll, an internet-connected toy that was found to be easily breached and could even be hacked to spy on its owners

These incidents highlight how easily data intended to support children can instead expose them to both immediate and lifelong privacy risks.

How Digital Footprints Become Marketing Tools

Early-life data isn’t just stored—it’s actively traded. Companies like Life360, marketed as a family safety app, have sold precise location data of families and children to data brokers, turning intimate family movements into commercial products. Smart toys, such as connected teddy bears (like CloudPets), record interactions that may be anonymously aggregated but still enable advertisers to build detailed behavioural profiles targeting households with young children.

This data trail continues in schools, where EdTech platforms capture detailed academic, behavioural, and demographic information. Such profiles can influence advertising and content targeting, shaping children's interests and behaviours without their knowledge or explicit consent. Between companies like Edmodo and Life360 selling this data and cyberattackers hacking schools and EdTech products to expose it for free, a wealth of information about your child is available to the highest bidder, or worse.

As previously mentioned, data aggregators today can already identify individuals within a household by their IP addresses and the apps they use. As babies become toddlers and subsequently start school, technology exists to aggregate datasets from parental technology use and combine them with data gathered through educational technologies. This linkage creates an enriched profile of the household, encompassing parents, children, and anyone else residing at the same address.
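A simplified, entirely hypothetical sketch of that household-level enrichment: two datasets that never mention anyone by name can still be merged on a shared household key such as an IP address, and every signal they carry then attaches to the same household profile. All values and field names below are invented for illustration.

```python
# Sketch: merging disparate datasets into a single household profile via a
# shared key (here an IP address). All data is invented for illustration.
from collections import defaultdict

parental_app_data = [
    {"ip": "198.51.100.7", "app": "baby-monitor", "signal": "infant in household"},
    {"ip": "198.51.100.7", "app": "fitness-tracker", "signal": "adult runner"},
]

edtech_data = [
    {"ip": "198.51.100.7", "platform": "homework-portal", "signal": "primary-school pupil"},
]

def build_household_profiles(*datasets):
    """Group every signal from every dataset under its household key."""
    profiles = defaultdict(list)
    for dataset in datasets:
        for record in dataset:
            profiles[record["ip"]].append(record["signal"])
    return dict(profiles)

profiles = build_household_profiles(parental_app_data, edtech_data)
print(profiles["198.51.100.7"])
```

A single key now carries signals about an infant, an adult, and a school-age child: the enriched household profile described above, produced by nothing more sophisticated than a grouped join.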

AI and Behavioural Influence: Predicting and Shaping Futures

Advanced predictive analytics also enable organisations to infer future preferences or behaviours from early collected data, and these analytics grow stronger all the time. Algorithms can forecast everything from academic potential to consumer interests, effectively scripting a child's experiences and opportunities. This capability poses serious ethical questions. Should an AI-driven assessment influence educational pathways or life opportunities based on data collected during infancy or primary school, even if it can? Should schools use EdTech products that mine freely available aggregated data to grow their enrolment pipeline, knowingly gaining deep knowledge of the prospective child and family through technology that is useful but perhaps invasive?

Unless a school places particular emphasis on privacy or is driven by compliance obligations, the depth and quality of data available for AI analysis expands significantly when children start school and their families adopt prescribed technologies to integrate with the school community. In my experience, very few schools recognise or fully comprehend this issue, and even fewer proactively consider their responsibilities regarding the ethical use of personal data.

Protecting Autonomy, Preventing Manipulation

The risk of manipulation through targeted advertising and content recommendations based on predictive analytics is substantial. Such practices can limit children's autonomy, subtly guiding their decisions and interests. Additionally, predictive profiling risks entrenching biases, reinforcing stereotypes, and unfairly limiting opportunities for those identified as "high-risk" by opaque algorithms.

Whilst the personal data acquired through cyberattacks and other means of appropriating personal data may sometimes seem minimal, AI can aggregate it to create vast profiles of individuals. For example, in the Ashley Madison, Dior, and Marks & Spencer cyberattacks (among many others), the personal data accessed was limited in isolation, but the aggregate picture of shopping habits, purchases, and lifestyle choices, and the inferences that could be drawn from it, meant the potential impact was significant.

These attacks will not stop and are likely to increase in frequency. Where EdTech is concerned, they are likely to result in the publication of sensitive information about children that will forever be baked into AI models, searchable, and used to influence their journey through life. Most fraud that could affect you today is one-dimensional: a phishing email, a hoax SMS, or a familiar-looking phone number impersonating your bank to trick you out of money. Where technology is heading creates a whole new universe of ways to influence people through the aggregation of currently disparate data sets: some legal, some in a grey area, and much of it on the dark web, captured and used illegally.

The Role of Schools and Educational Leaders

Schools stand at the intersection of technological innovation and ethical responsibility. In developing software and services aimed at making schools safer, more secure, and compliant, I've always sought to simplify these processes as much as possible. Now, as a parent myself, my motivation has deepened—I have a personal, vested interest in supporting schools by providing stronger protections and practical tools to limit potential harm arising from emerging technologies such as AI. However, regardless of my motivations, schools themselves must undertake much of the heavy lifting. 9ine has developed powerful tools to support this effort, and our future engineering pipeline is focused on making this work easier, faster, and more cost-effective. Nevertheless, school leaders must take the initial steps and lay the groundwork.

A Future Built on Privacy, Not Predictions

The ability to predict a child's future through data and AI is both remarkable and troubling. It’s our collective responsibility as educators, parents, and policymakers to ensure technology enhances education without compromising privacy. By implementing robust privacy frameworks and fostering a culture of digital awareness from infancy, we can protect children’s rights and autonomy, ensuring that their futures remain theirs to shape—not predetermined by data collected in their earliest days.
