The horror film M3gan – and its namesake demon doll – has charmed critics and packed movie theaters, earning $30.4m on its opening weekend. The film’s trailer, which went viral last fall for a scene that shows the robot pausing, mid-kill, to try out a TikTok dance, proved that M3gan is a villain made for memes. But it’s not all bloodthirsty fun: the movie raises questions about parenting and digitized playtime.
A quick summary, with light spoilers: M3gan is a robot doll who can do just about anything (walk, talk, twerk, murder). She’s created by Gemma (Allison Williams), a work-obsessed roboticist who suddenly has to care for her orphaned niece, Cady. At first, M3gan is a hit: Gemma’s bosses at a Hasbro-esque toy company tout her as the Tesla of dolls. M3gan and Cady become inseparable. Then M3gan gets too smart.
Evil toys have earned their spot in the canon of horror, from The Twilight Zone’s Talky Tina baby doll to Chucky, the doll possessed by the soul of a serial killer. As robot ethicists continue to debate how people should interact with the AI companions of the near future (Should we have sex with them? Should we let them raise our kids?), M3gan feels especially timely.
But what do robot ethicists make of M3gan? Kate Darling, a leading expert in tech ethics and a research scientist at the MIT Media Lab, says the world is far from a real-life version of the doll.
“I don’t think we’re going to have something that’s on that level of sophisticated AI in the next decade or two,” she said. “People have completely skewed expectations of what robotics can do at this point in time, thanks to movies like this.”
Darling does, however, believe that people should start interrogating the way robot toys will be marketed and sold. “I’m not concerned about what I saw in the trailer happening in real life – the AI becoming too intelligent and not listening to commands,” Darling said. “I am concerned about whether AI should be used to replace human ability in relationships, and the answer is no.”
Machines don’t think or act like people do, Darling said. Sure, certain kinds of robot nannies do exist, such as the iPal, a 3ft-tall companion that can sing, dance, answer questions, and, according to its makers, keep children occupied for “a couple of hours” while their parents are away. (Unlike M3gan, it can’t push schoolyard bullies in front of moving cars.)
But caretaking doesn’t just mean supervision: there’s an emotional aspect to raising children that, Darling says, only humans can provide. “Robots can be used as a supplement, like we would use a pet – not to directly replace a relationship that’s human,” Darling said.
Ronny Bogani, an artificial intelligence ethicist and attorney for children’s rights, believes that robot caretakers could “completely change the family dynamic”. For example: what if a child asks to go to the store at night, a parent says no, and then a robot nanny shoots back evidence about the nice weather and the lack of crime outside? “If a robot gives empirical evidence that shows that the parent’s rules are wrong, how long does that have to happen to an adult before they’re tired of being embarrassed by a toaster?” Bogani asked.
Darling also worries about how businesses could exploit a child’s attachment to their robot friends. “If a child has a relationship with this type of doll, corporate capitalism could hold that relationship hostage,” she said. “They could say, ‘Now we’re doing a software upgrade to the robot that costs $10,000,’ and it cuts off access if you don’t pay that subscription. There are so many ways companies could manipulate people with robots.”
Bogani added that robots could police children through their adolescence. It will be a lot harder for a teenager to have a rebellious phase if a walking, talking device surveils their every move. “Breaking laws and civil disobedience is part of growing up,” he said. “At what point is a robot nanny required to report on a child?”
Though M3gan’s signature look (blonde curls, khaki dress, plaid scarf) will undoubtedly become a Halloween staple, many ethicists do not believe robots should look like people. “It’s one of my pet peeves that we’re trying to make them look human,” Darling said. “There are so many other ways to create a robotic design that’s compelling, but not setting up false expectations of a robot behaving like a human.”
Roboticists can draw on animators’ tricks to infuse human emotion into anthropomorphic figures. “I hate that we just default to human form for marketing purposes,” Darling said. And why do robot nannies have to traffic in maternal tropes about female caretakers? “Customers might want a Mary Poppins robot to take care of their kids, but it doesn’t even work because a robot won’t behave like Mary Poppins,” she added.
Bogani agrees that robots should not have humanoid features or names. And he thinks that, in the future, robots used for childcare or by the government should come in a standard color indicating a separate class of machine equipped with greater data protection. “I don’t know why a robot needs to have a head,” he said. “Look at humans: we’re a piss-poor design of a product. Why are we copying it with robots?”
It’s easy to understand why the M3gan film-makers copied the look of a precocious young girl for their robot: it’s endlessly creepy. But while the AI morality tale might freak audiences out today, there will certainly be a market for (hopefully law-abiding) robot helpers.
“It takes a village to raise a child, and we’re entering a time of a digital village,” Bogani said. “This technology is phenomenal and it’s so incredible, but without the right protections it could be dangerous, too.”