The cheerful language with which tech companies describe their platforms is often in stark contrast to the dark possibilities lurking within them. Meta, for example, describes the metaverse as “the next evolution in social connection and the successor to the mobile internet”, a place where “virtual reality lets you explore new worlds and shared experiences”. But for a young girl in the UK recently, that “shared experience” was an alleged gang rape.
British police are investigating the alleged sexual assault of the girl, identified only as being under the age of 16, in what is said to be the first investigation of its kind in the UK. The girl was reportedly wearing a virtual reality headset and playing an immersive game in the metaverse when her avatar was attacked by several others.
Was this really rape? some have asked. The comments on an Instagram post about a New York Post story on the case were characteristically skeptical: “Couldn’t she have just turned it off?” “Can we focus on real-life crime please?” “I was killed in [the war video game Call of Duty],” one person said sarcastically. “Been waiting for my killer to be brought to justice.”
The difference, of course, is that while Call of Duty players can expect to be virtually killed sometimes as part of the game, the girl had no reason to expect that she might be raped. It isn’t yet known what game she was playing when the alleged assault occurred, but obviously there isn’t an online game where the goal for adult players is to rape children. The fact that they are able to do so in the metaverse is the issue at the heart of this case, which has attracted international attention.
The question of whether virtual rape is “really rape” goes back to at least 1993, when the Village Voice published an article by Julian Dibbell about “a rape in cyberspace”. Dibbell’s piece reported on how the people behind avatars that were sexually assaulted in a virtual community felt emotions similar to those of victims of physical rape.
As did the girl whose avatar was attacked in the metaverse, according to a senior police officer familiar with the case, who told the Daily Mail: “There is an emotional and psychological impact on the victim that is longer-term than any physical injuries.” In addition, the immersive quality of the metaverse makes it all the more difficult, especially for a child, to distinguish between what is real and what is make-believe.
So while it is necessary for the police to investigate this case – with the courts to decide on the appropriate punishment for the alleged offenders – it is equally important for the companies behind games in the metaverse to be held accountable for the bad actors on their platforms.
Their track record on conventional social media is hardly reassuring, however. Meta, for instance, has a notoriously bad record when it comes to protecting children and teenagers. In 2021, the whistleblower Frances Haugen revealed that Facebook’s own internal research showed how using Instagram (which the company owns) adversely affects teenage girls’ confidence and body image. In October of last year, a bipartisan coalition of 33 attorneys general filed a lawsuit against Meta in California, alleging that Facebook and Instagram are responsible for a “national youth mental health crisis”.
Left unchecked, sex crimes in the developing world of the metaverse, against both children and adults, will become more common. A police investigator told the Daily Mail that the metaverse is already “rife” with sexual offenses. The Meta game Horizon Worlds has reportedly been the site of several sexual assaults. In 2022, the psychotherapist Nina Jane Patel, who researches the metaverse, wrote of the “surreal nightmare” of being gang-raped in Horizon Venues (now Horizon Worlds). “Unlike in the physical world, there’s a lack of clear and enforceable rules in the metaverse,” Patel said.
A spokesman for Meta has said that users in the metaverse have “an automatic protection called personal boundary, which keeps people you don’t know a few feet away from you”. But apparently this feature isn’t doing enough to protect users from harm. The alleged rape of a girl in the metaverse will be an important test for the UK’s new Online Safety Act, a recently passed set of laws intended to protect children and adults online. Some experts have expressed concerns that the law doesn’t go far enough, focusing more on the content users publish than on their actions.
New research suggests that the next generation of kids will spend an estimated 10 years in virtual reality over the course of their lifetimes – close to three hours a day. It may be that lawmakers need to add further protections to keep them safe. In the meantime, Meta could surprise everyone by stepping up and making the metaverse a place that lives up to its upbeat marketing.
Nancy Jo Sales is the author, most recently, of Nothing Personal: My Secret Life in the Dating App Inferno