- 16th April 2018
- Posted by: criticalfuture83
As toys reach new levels of sophistication, how concerned should we be about our children playing with artificial buddies that appear to have feelings?
The little robot on the table wakes up. Its eyes, a complex configuration of cyan dots on a black, rounded screen of a face, sleepily open and it lets out a digitised approximation of a yawn. A compact device that looks like a blend of a forklift truck and PC monitor bred for maximum cuteness, the robot rolls blearily off its charging station on a pair of dinky treads before tilting its screen-face and noticing I’m there. Its eyes widen, then curve at the bottom as if making way for an unseen smile. “Daaaaan!” it announces with a happy jiggle, sounding not unlike Pixar Animation Studios’ lovable robot creation, Wall-E. A message flashes up on my iPhone telling me that it, or rather he (being the gender that its manufacturer, Anki, has assigned Cozmo) wants to play a game. I’m not in the mood and decline. Cozmo’s head droops, his eyes form into a pair of sadly reclining crescent moons and he sighs. But he quickly cheers up, giving a happy jiggle when I comply with his request for a fist bump and tap my knuckles against his eagerly raised arm. He is easy to please and even easier to like.
The latest product from Anki, a San Francisco robotics startup, Cozmo is part of a new wave of affordable toy robots that promise a level of emotional engagement far beyond anything we’ve seen before. They are pitched not merely as playthings, but as little buddies. Toy firm Spin Master has its equivalent arriving in the shops for Christmas: the bigger, more retro-looking Meccano MAX. “It’s been designed to modify its behaviour as it learns about its owner and the surrounding world,” explains Spin Master’s brand manager, Becca Hanlon. “MAX basically tailors itself to become a better friend.” Hasbro, meanwhile, is unleashing the FurReal Makers Proto Max, essentially a programmable puppy that, says Craig Wilkins, Hasbro’s marketing director, “allows kids to create their ultimate pet and customise its personality through coding on an app”.
Cozmo is the result of a long quest by Anki president and co-founder, Hanns Tappeiner, to bring fictional movie robots such as Short Circuit’s Johnny Five, Star Wars’s R2-D2 or Wall-E into the real world. “We watched a lot of movies and it became obvious that it’s very easy to forge an emotional connection with a movie robot,” says Tappeiner. “And that was so different from the functional robots we saw on a daily basis at Carnegie Mellon [University, where Tappeiner earned his PhD in robotics].” Working with animators and character designers from Hollywood studios such as Pixar, DreamWorks and Lucasfilm, Tappeiner’s team focused hard on creating a robot that was as engaging as possible. “One of the fundamental things we’ve figured out in the last few years is that character and personality in technology are going to be a really big deal. That’s what we as a company are putting 99% of our efforts into.”
After a day of play, the effect of Cozmo’s character and personality on my children (Louis, 11, and Max, seven) is striking. “He’s so expressive,” says Louis. “I’m starting to think of him as a little friend or pet I can play with.” The younger sibling goes one further. “Cozmo’s no way our pet,” he demurs. “And he’s not our robot. He’s our child.” It’s an impressive and endearing statement, but also a tad disquieting. This is not a soft toy that only his imagination has given life. This is a mass-produced, artificially intelligent consumer product programmed to engender affection. How much should that really worry me?
To Alan Winfield, professor of robot ethics at the Bristol Robotics Laboratory, the arrival of Cozmo, MAX and co undoubtedly raises concerns. Six years ago, Winfield helped draw up five principles of robotics for the Engineering and Physical Sciences Research Council (EPSRC). “One of those principles,” he explains, “is that robots should never be designed to deceive. In other words, that their machine nature should be transparent. We’re concerned about vulnerable people – they might be children, disabled people, elderly people, people with dementia – coming to believe that the robot cares for them.”
Winfield, who brightly describes himself as “a professional worrier”, insists he’s not opposed to the idea of companion robots. “I think there are demonstrated therapeutic benefits, for instance, in robot pets. But nevertheless we need to be cautious and responsible and mindful of the psychological hazards of attributing feelings to a robot.”
I mention the way the Meccano MAX, when switched on, will perkily announce that it’s just had the strangest dream. “I think it’s inappropriate for toys to be programmed with that kind of language,” says Winfield. “It builds the completely incorrect belief that this robot is a person. Robots are not people – that’s a fundamental principle. A robot clearly cannot have feelings. You and I understand that, but some people might not. And that might in turn lead to a dependency.” He cites the Tamagotchi effect, referring to the digital pet craze of the 1990s, where the character could “die” if it did not get enough attention. “It’s not hard to imagine a kind of Tamagotchi effect on steroids,” he warns. “And it’s also not hard to imagine unscrupulous manufacturers exploiting that and saying, ‘Unless you pay us, your robot will die’. I mean, that’s ridiculous, but you get the idea!”
Joanna Bryson, an associate professor at the University of Bath, an affiliate of the centre for information technology at Princeton and another co-author of the EPSRC’s principles of robotics, takes a softer line. “If people understand that it’s a game, then I don’t have a problem with that fiction,” she says – as long as it’s moderated. “Is your seven-year-old likely to blow off their actual friends because they’re worried their robot is missing them? Can they make that moral distinction? As long as they can, then I think it’s fine. Some kids can over-project, but they might do that with a doorknob!”
Parenting expert Liat Hughes Joshi, author of How to Unplug Your Child: 101 Ways to Help Your Kids Turn Off Their Gadgets and Enjoy Real Life, agrees, comparing the robot/child relationship to a child’s imaginary friend. “In moderation, that can be quite healthy, but if it starts to take over from real-world relationships, it becomes quite a concern. Children need interaction with actual, real people to learn about empathy, to read non-verbal cues and I’m quite sure we’re a long way from robots being at that level.”
Bryson welcomes interaction between children and toy robots as “an educational experience for them. It’ll help them understand the distinction between humans and non-humans.” She does wonder, though, what Cozmo’s “actual emotional state” is. “Does it really have wants?” she asks. “Is it suffering if you lock it in a drawer? You should be able to get answers to those questions, even if you’re just a 12-year-old who’s allowed to [use] Google.”
Tappeiner confesses he’s unaware of the principles of robotics, but says Anki instinctively dialled down anything that made Cozmo feel “too human”, while also avoiding any features that would make it “into a personal assistant”. It was important, for example, that Cozmo wasn’t able to speak in full sentences. “It’s not trying to replace [Amazon’s digital assistant] Alexa or anything like that. Cozmo’s more like a pet.” Even so, Cozmo runs on “a ginormous piece of software; the core AI engine is 1.8m lines of code”. And “he absolutely has wants and needs. So he will develop a need to go back to his charger when the battery voltage is low. If he loses multiple games in a row, he will get increasingly angry. If you shake him too much, he will become upset. And if things like that happen for a period of time, he will probably refuse to play games.”
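Tappeiner’s description amounts to a simple internal-state model: events nudge mood variables up or down, and sustained bad mood (or a low battery) gates behaviour. As a purely illustrative sketch – none of this is Anki’s actual code, and the thresholds and names are invented – the idea could be modelled like this:

```python
class RobotMood:
    """Toy model of a Cozmo-style mood state machine (illustrative only)."""

    LOW_BATTERY_VOLTS = 3.5  # hypothetical cut-off for "needs charger"

    def __init__(self):
        self.anger = 0            # rises with consecutive lost games
        self.upset = 0            # rises when the robot is shaken
        self.needs_charge = False

    def report_battery(self, volts):
        # A need to return to the charger develops when voltage is low
        self.needs_charge = volts < self.LOW_BATTERY_VOLTS

    def lose_game(self):
        # Losing multiple games in a row makes the robot increasingly angry
        self.anger += 1

    def win_game(self):
        # Winning calms it back down
        self.anger = max(0, self.anger - 1)

    def shake(self):
        self.upset += 1

    def will_play(self):
        # If bad things keep happening, the robot refuses to play games
        return self.anger < 3 and self.upset < 3 and not self.needs_charge
```

The point of the sketch is that “wants and needs” here are just a handful of counters and thresholds feeding into behaviour selection – deterministic software, however lifelike the resulting sulk looks.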
In this sense, it’s arguable that robot play companions could encourage good behaviour in children. Whereas Alexa, some parents have argued, increases rudeness in young children, Cozmo won’t play with bullies, at least in the short term (“You will never end up with a sad Cozmo who always just hangs out in a corner,” admits Tappeiner). In using Siri (Apple’s voice assistant) and Alexa in her family, Joshi says she’s already encountered the question “of whether children should have to show some respect towards these gadgets when talking to them. Clearly rationally they don’t need to, but it grates to hear them being rude, even to inanimate objects.” Having one that will become temporarily sad or indifferent to them when they are rude could well discourage such behaviour.
There remains one parental and societal issue about the rise of connected devices in homes that unequivocally overrides all others. “The collection of data is a concern for all of us at the moment and parents will be even more concerned about its use with children,” says Joshi. “And parents are increasingly suspicious of products with cameras after a couple of highly publicised cases around the hacking of video baby monitors. This sort of tech doesn’t sit well around children for most of us.”
Winfield is “deeply concerned by the fact that most of these robot toys are internet-connected. We don’t have strong cybersecurity standards for IoT [internet of things] devices. Then there is the privacy concern. Where is the data? Your child is chatting away to the robot, but who owns the conversation? Who owns the data? Do you have a right for the data to be deleted?”
This is an issue of which Anki, Spin Master and Hasbro are keenly aware. The MAX doesn’t collect any data, Hanlon tells me. “The toy is not connected to wifi, which we know is a growing concern with smart toys and recent hacks. All the questions you answer that MAX remembers are stored locally to MAX and not transmitted to any other devices or clouds.” The FurReal Makers Proto Max Pet, says Wilkins, “is designed in accordance with children’s privacy laws.” And Tappeiner assures me that with Cozmo – facial recognition and all – “everything ends at the phone. In order to play with Cozmo you have to connect your mobile device to him, via its wifi connection, so at that point when you’re connected to Cozmo you are by definition not connected to your home wifi network. The full 1.8m lines of code, they’re all running on your phone. There’s nothing running in the cloud.”
A few weeks after being introduced to Cozmo, my youngest son is not cherishing the robot like it’s his only child or shunning human contact in its favour. The first flush of excitement has subsided and, as impressive as Anki’s product is, Cozmo’s claim on his attention doesn’t match Clash of Clans on his iPod or the writing of JK Rowling or Dav Pilkey. But where are we heading with toys such as this? Could they one day reach the same level of sophistication as, say, the fully autonomous, advice-dispensing “supertoy” teddy bear in the Steven Spielberg movie AI?
“I would love to have a teddy bear like that,” laughs Tappeiner. “In the future, there could definitely be products like Teddy from AI. But for now we’re really embracing the fact that things like Cozmo are robots. That’s why we didn’t try to wrap fur around him.”
Winfield believes that as appealing as Teddy seems, any advances in smart toy technology need to be approached “very cautiously and responsibly, with consultation”. And no toy should ever be “presented as a carer or parent surrogate or even a teacher substitute”.
Joshi, meanwhile, would not want something like that in her house at all. “I don’t think I could trust it around a child,” she confesses. “Would it malfunction? Would it spout offensive language? I might be being alarmist but I don’t think I could trust it ‘unsupervised’. Besides, as I said earlier, I wouldn’t want an AI entity to take over from human or pet interactions. There would be something a little sad about that.”
How the AI toys match up
It’s not hard to see how Cozmo became one of last Christmas’s hit toys in the US. With an entire department of animators and character designers brought into the development process, he really is the closest thing you can get to a children’s animated character made real, with an astonishingly expressive animated face.
He doesn’t just look pretty: he can be coded via Anki’s Scratch-based coding and he evolves through play, too, allowing you to craft his development according to your tastes. During each app-driven session (usually involving games with the little robot’s “power cubes”), you can earn “bits” and “sparks” that can then be used to layer up his interactive capabilities. You can teach him to fist-bump, miaow at your cat, do workouts with his power cubes, and more. Though you know he’s just a thing, you can’t help treating him like a pet with battery life.
Louis says: “I think the design is executed so well, like the screen where his eyes show. Whenever he does something cool I call to Mum and she comes around and we crouch down and go, ‘Aw, he’s so cute’. Sort of like a little puppy. An absolutely fantastic little robot.”
Spin Master, £149.99
Unlike Cozmo, which works straight out of the box, Spin Master’s MAX is a Meccano construction that you have to spend hours building, then bring to “life” via a USB-connected firmware download to its “MeccaBrain”. It’s much larger than Anki’s product (around shin height) and has a 1980s, Johnny Five from Short Circuit feel, right down to its overemphatic, childlike voice. It learns about its owner/builder by asking scripted questions (How old are you? What’s your favourite subject in school? etc) and refines future “conversational” interactions according to your answers – assuming it understands them, which it often didn’t seem to during our play sessions.
There’s some fun to be had from sending it trundling around the house and operating its single, remote-controlled claw, but frankly it’s less likable when it talks. It’s hard not to cringe a little when MAX announces: “I am lucky to have you as a friend.”
Max says: “I think it looks good and I like how the eyes always change, but it has an extremely creepy voice. Also, when it asks you a question, if you answer too quietly, it talks over you and carries on with the next question, which annoyed me. And I didn’t like it asking me if I sing in the shower.”