Children’s Safety in Tech Part 1: Into the Metaverse

Late last year, Facebook changed its name to Meta and announced its vision for the metaverse. The metaverse is a virtual-reality space in which users interact with a computer-generated environment and with other users. Users can choose any setting, any time period, or even another world in which to meet. In this virtual world, individuals create their own avatar, a digital representation of themselves that is recognisable to others. Users will also be able to create comfortable virtual spaces of their own for alone time, and who knows how this will develop in the future. One day, everything we do could take place within the metaverse: an escape from a world full of issues and human error into one that seems flawless. But with technology advancing so rapidly, how can schools keep children and their data safe?


We have already seen early signs of the metaverse in the workplace, with some offices using virtual meeting rooms in which colleagues from all over the world put on a VR headset and interact through avatars. This leads one to wonder about the next steps in this evolution of technology. There is a chance that, one day, learning will take place in an augmented reality driven by AI and algorithmic recommendations.


Kwang Hyung Lee of the Korea Advanced Institute of Science and Technology (KAIST) shared his thoughts on the possible transfer of education from the physical world to the digital. In his article, Lee argues that students need “the full support of communities and equal access to opportunities. Technological breakthroughs must be used to benefit everyone.” With this in mind, we need to think about how the metaverse could benefit students’ learning. More immersive teaching through 3D visuals, interactive diagrams, and games-based learning could enhance the student experience; Meta has even released an example of how students could study space in an interactive way. However, for the people who will eventually use the metaverse in whatever way they choose, that benefit is only rightfully achieved when privacy, security, and safeguarding are woven into the fabric of its architecture.


©Meta via YouTube

If everything we do does eventually move into the metaverse, we are looking at a world where work, social life, and education all happen in one shared space. That is not so different from the online spaces already created for distance learning. However, it is far easier to control students’ rapidly growing digital footprints in that environment than in a digital universe where their behaviour, whether in the classroom or in social situations, could be tracked more than students or their parents would like.


With the level of technology we have today, the risks most likely to be associated with the metaverse lie in the realm of artificial intelligence, algorithmic recommendations, and behaviour monitoring. The issue that often arises with this kind of behaviour-tracking software is the unpredictability of its recommendations. Currently, algorithms on social media work out the topics an individual is interested in and present them with more of the same kind of content. Within the metaverse, however, users will be more immersed in that content than ever before, which means that if an algorithm were to serve an individual content on self-harm, radical movements, gambling, or drug abuse, it would likely be far more intrusive for the user. China has recently released a regulation made up of 30 articles restricting algorithmic recommendations, which underlines the caution we should take when it comes to AI and algorithmic recommendation systems; you can learn more in our article on the matter.
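To illustrate the feedback loop described above, here is a minimal, purely hypothetical sketch in Python (not the system of any real platform): a toy recommender that ranks a made-up catalogue by the topics a user has already engaged with, so a single click on a risky topic starts to dominate what is served next.

```python
# Purely illustrative sketch: a naive topic-affinity recommender that serves
# more of whatever a user has engaged with, showing how one click on a risky
# topic can start to dominate later recommendations. The catalogue, topics,
# and function names are all hypothetical.
from collections import Counter

CATALOGUE = [
    {"id": 1, "topic": "science"},
    {"id": 2, "topic": "sport"},
    {"id": 3, "topic": "gambling"},
    {"id": 4, "topic": "science"},
    {"id": 5, "topic": "gambling"},
]

def recommend(engagement_history, n=3):
    """Return the n catalogue items whose topics the user engaged with most."""
    affinity = Counter(item["topic"] for item in engagement_history)
    # Items are ranked by how often their topic appears in the user's history,
    # so engagement with a topic is fed straight back as more of that topic.
    ranked = sorted(CATALOGUE, key=lambda item: affinity[item["topic"]], reverse=True)
    return ranked[:n]

# After a single interaction with gambling content, gambling items rank first.
history = [{"id": 3, "topic": "gambling"}]
print(recommend(history))
```

The point of the toy example is simply that nothing in this kind of ranking logic distinguishes harmless interests from harmful ones; that judgement has to be added deliberately, through safeguards around the algorithm.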


In the second part of this blog series, we delve into the current use of technology and edtech services within schools, and how you can protect the rights and freedoms of the children whose data you share with external vendors. We discuss the procedures you can follow to mitigate the risk of data exploitation when sharing children’s sensitive data with third parties. A Child Rights Impact Assessment (CRIA) can help you ensure that the third-party vendors used within your school adequately protect the rights and freedoms of your students. Considering the expedited adoption of edtech services over the course of the pandemic, Children’s Safety in a Technological World Part 2: CRIAs & edtech Services takes you through the benefits of performing a CRIA when deciding whether or not to transfer student data outside of your school.


9ine recognise the challenges of current and future technology, creating software and services that support schools in making the right decisions. Get in touch to learn more.

