The horror film M3gan, and its eponymous demon doll, has charmed critics and packed theaters, earning $30.4 million in its opening weekend. The trailer for the film, which went viral last fall with a scene showing the robot pausing, mid-kill, to try out a TikTok dance, proved that M3gan is a villain made for memes. But it’s not all bloodthirsty fun: the film raises questions about parenting and digitized playtime.

A quick recap, with light spoilers: M3gan is a robot doll that can do almost anything (walk, talk, twerk, assassinate). She was created by Gemma (Allison Williams), a work-obsessed roboticist who suddenly has to care for her orphaned niece Cady. At first, M3gan is a hit: Gemma’s bosses at a Hasbro-style toy company tout her as the Tesla of dolls. She and Cady become inseparable. Then M3gan gets too smart.

Evil toys have earned their place in the horror canon, from Talky Tina, the talking doll of The Twilight Zone, to Chucky, the doll inhabited by the soul of a serial killer. As robot ethicists continue to debate how people should interact with imminent AI companions (Should we have sex with them? Should we let them raise our children?), M3gan feels especially timely.

But what do ethicists think of M3gan’s robots? Kate Darling, a leading tech ethicist and research scientist at the MIT Media Lab, says the world is a long way from a real version of the doll.

“I don’t think we’re going to have something that’s at that level of sophisticated AI in the next decade or two,” she said. “People have completely skewed expectations of what robotics can do right now, thanks to movies like this.”

Darling, however, believes that people should start questioning the way robotic toys will be marketed and sold. “I’m not worried that what I saw in the trailer is happening in real life: the AI gets too smart and doesn’t listen to commands,” Darling said. “I am worried about whether AI should be used to replace human capacity in relationships, and the answer is no.”

Machines don’t think or act like people do, Darling said. Sure, there are robot babysitters of a sort, like the iPal, a 3-foot-tall companion that can sing, dance, answer questions and, according to its creators, keep kids busy for “a couple of hours” while their parents are out. (Unlike M3gan, it can’t push schoolyard bullies in front of moving cars.)

But nurturing doesn’t just mean supervision: there’s an emotional aspect to raising children, Darling says, that only humans are capable of. “Robots can be used as a supplement, like we would use a pet or a companion, not to directly replace a relationship that is human,” Darling said.

Ronny Bogani, an artificial intelligence ethicist and children’s rights advocate, believes that robotic caregivers could “completely change family dynamics.” For example: what if a child asks to go to the store at night, a parent says no, and then a robot babysitter chimes in with data on the mild weather and low crime rate outside? “If a robot provides empirical evidence that parental rules are wrong, how long before the adult gets sick of being embarrassed by a toaster?” Bogani asks.

Darling also worries about how companies could exploit a child’s attachment to their robot friends. “If a child has a relationship with these types of dolls, corporate capitalism could take that relationship hostage,” she said. “They could say, ‘Now we’re doing a software upgrade for the robot that costs $10,000,’ and if you don’t pay that subscription, access is cut off. There are so many ways companies could manipulate people with robots.”

Dancers dressed as the robot perform on the black carpet during the M3gan premiere in Los Angeles. Photo: Jim Ruymen/UPI/REX/Shutterstock

Bogani added that robots could keep an eye on children well into their teenage years. It will be much harder for a teenager to have a rebellious phase if a walking, talking device watches their every move. “Breaking laws and civil disobedience is part of growing up,” he said. “At what point is a robot babysitter required to report on a child?”

Although M3gan’s signature look—blonde curls, khaki dress, plaid scarf—will no doubt become a Halloween staple, many ethicists don’t believe robots should look like people. “It’s one of my pet peeves that we keep trying to make them look human,” Darling said. “There are many other ways to create a robotic design that is convincing, but doesn’t create false expectations that a robot will behave like a human.”

Robotics specialists can borrow animators’ tricks to infuse human emotion into non-humanoid figures. “I hate that we just default to human form for marketing purposes,” said Darling. And why do robot nannies have to traffic in motherly tropes about female caretakers? “Customers may want a Mary Poppins robot to look after their children, but it doesn’t even work, because a robot won’t behave like Mary Poppins,” she added.

Bogani agrees that robots should not have humanoid features or names. He also thinks that in the future, whether robots are used for childcare or government, there should be a standard color indicating a separate class of robot equipped with stronger data protections. “I don’t know why a robot needs to have a head,” he said. “Look at humans: we are a poorly designed product. Why are we copying it with robots?”

It’s easy to understand why the filmmakers behind M3gan modeled their robot on a precocious child: it’s endlessly creepy. But while this AI morality tale might scare audiences today, there will certainly be a market for (hopefully law-abiding) robot helpers.

“It takes a village to raise a child, and we are entering the age of a digital village,” Bogani said. “This technology is phenomenal and incredible, but without the proper protections it could also be dangerous.”