The Metaverse Has a Sexual Assault Problem

Raenelle Manning is an IPilogue Writer and 2L JD Candidate at Osgoode Hall Law School.


Meta Platforms (“Meta”), formerly known as Facebook, is the leading developer of the Metaverse. The Metaverse is described as a 3D version of the internet where users can participate in a variety of activities, including attending concerts, playing games, buying digital clothing, and working. People can enter the Metaverse through virtual reality (VR) headsets, augmented reality glasses, smartphone apps, and other devices. While inside, users are represented by digital avatars and experience the virtual world from their avatar’s point of view. Essentially, the Metaverse is a virtual simulation of the physical world. However, a recent report from SumOfUs, a non-profit advocacy organization and online community that campaigns to hold corporations accountable, suggests that the darkest aspects of the physical world have manifested in the Metaverse.

Sexual Assault in the Metaverse

A SumOfUs researcher studying users’ behaviour on Meta’s social networking platform, Horizon Worlds, reported that her avatar was sexually assaulted only an hour after she entered the virtual space. Her avatar was led into a private party room where a male user sexually assaulted her while making lewd comments. Meanwhile, other users outside the room watched and passed around what appeared to be a digital vodka bottle. The researcher said that although she was not physically harmed, the experience left her feeling disoriented and confused.

This is not the first time VR users have reported abuse in the Metaverse. In December 2021, a woman published an article on Medium describing a nightmarish experience in the Metaverse. She explained that she had been harassed and sexually assaulted by a group of male users only one minute after entering Horizon Worlds. She also admitted to feeling helpless and being unable to access the safety features during the encounter.

What Makes the Experience Feel So Real?

The Metaverse is designed to give users an immersive experience; they are meant to feel psychologically present in the environment. This is achieved through immersive VR features like acoustic input for the ears, haptic feedback for touch, and high-resolution imagery. For example, if you have played or seen videos of people playing VR zombie apocalypse games, the experience feels eerily realistic; you feel like you are actually the target of a zombie attack. The VR experience can make it difficult for the mind and body to differentiate between the physical and virtual worlds. This blurring of reality and VR demonstrates the severity of the situation, and how traumatic it must feel to experience sexual violence in the Metaverse. Although the user’s body is not being physically violated, the experience may have significant psychological impacts.

What is Meta Doing About This?

The Metaverse has default safety features like “Personal Boundaries”, which prevent users from coming within four feet of each other. Meta recommends not turning this feature off when interacting with strangers. Another safety feature, called “Safe Zones”, allows users to immediately transport to an isolated area. These features still seem to put the onus on users to protect themselves against cyber assault. Nick Clegg, Meta’s President of Global Affairs, stated in a recently published blog post: “In the physical world, as well as the internet, people shout and swear and do all kinds of unpleasant things that aren’t prohibited by law, and they harass and attack people in ways that are. The metaverse will be no different. People who want to misuse technologies will always find ways to do it.” While this statement may be true, the current minimally moderated nature of the Metaverse will inevitably allow abusive and harmful behaviour to thrive.

Conclusion

There are calls for increased user regulation in the Metaverse. However, the prevailing response is that moderating users’ behaviour will be practically impossible because these incidents occur in real time and are difficult to track. We have seen first-hand, on social media networks like Twitter and Facebook, that as online communities expand, it becomes difficult to monitor harmful behaviour and content. The immersive nature of virtual reality arguably warrants serious consideration of what more can be done to protect users from virtual violence. The Metaverse is still in the early stages of development, and it is anticipated to significantly transform the future of human interaction. Meta and the other companies involved in the Metaverse’s development should work to minimize the potential harms associated with their products and ensure users’ safety, just as companies are expected to do in our non-virtual lives.