SAN FRANCISCO (AP) – When a robot "dies," does it make you sad? For many people, the answer is "yes" – and that tells us something important, and possibly concerning, about our emotional responses to the social machines that are beginning to move into our lives.
A few months ago, Christal White, a 42-year-old marketing and customer service director in Bedford, Texas, moved the cute, friendly Jibo robot into her home office. After more than two years in her house, the small humanoid and its inviting round "face" had begun to grate on her. Sure, it danced and played fun word games with her kids, but it also sometimes interrupted her during conference calls.
White and her husband Peter had already talked about moving Jibo into the empty guest room upstairs. Then they learned about the "death sentence" Jibo's maker had passed on the product as its business collapsed. The news arrived via Jibo itself, which announced that its servers would be shut down, effectively lobotomizing it.
"My heart broke," she said. "It was like an annoying dog that you don't really like because it's your husband's dog. But then you realize you actually loved it all along."
The Whites are far from the first to feel this way. People took to social media this year to say tearful goodbyes to the Mars Opportunity rover when NASA lost contact with the 15-year-old robot. A few years earlier, anxious commentators weighed in on a demonstration video from robotics company Boston Dynamics in which employees kicked a dog-like robot to prove its stability.
Smart robots like Jibo obviously aren't alive, but that doesn't stop us from acting as though they are. Research has shown that people tend to project human traits onto robots, especially when the machines move or act in even vaguely human-like ways. That could be a particularly acute problem as robots move into our homes – especially if, like so many other household devices, they also serve as conduits for data about their owners.
"How we treat them is influenced by the kind of mind we think they have," said Jonathan Gratch, a professor at the University of Southern California who studies virtual human interactions. "If you feel something has emotion, it now merits protection from harm."
The way robots are designed can influence people's tendency to project narratives and feelings onto mechanical objects, said Julie Carpenter, a researcher who studies human interaction with new technologies. That is especially true when a robot has something resembling a face, has a body that recalls a human's or an animal's, or simply moves on its own, like a Roomba robot vacuum.
"Even if you know a robot has very little autonomy, when something moves in your space and it seems to have a sense of purpose, we associate that with something having inner awareness or goals," she said.
Such design decisions are also practical, she said: "Our homes are built for humans and pets, so robots that look and move like humans or pets will fit in."
Some researchers, however, worry that designers are underestimating the dangers of attachment to increasingly lifelike robots.
Longtime AI researcher and MIT professor Sherry Turkle, for instance, is concerned that design cues can trick us into thinking some robots are expressing emotion back at us. Some AI systems already present themselves as socially and emotionally aware, but those reactions are often scripted, making the machine seem "smarter" than it actually is.
"The performance of empathy is not empathy," she said. "Simulated thinking might be thinking, but simulated feeling is never feeling. Simulated love is never love."
Designers at robot startups insist that humanizing elements are critical as robots become more prevalent. "It's necessary to put the public at ease and show that robots aren't going to disrupt public culture," said Gadi Amit, president of NewDealDesign in San Francisco.
His agency recently worked on the design of a new delivery robot for Postmates: a four-wheeled, bucket-shaped object with a cute if abstract face; rounded corners; and lights that indicate which way it is going to turn.
It will take time for humans and robots to establish a common language as they move through the world together, Amit said. But he expects it to happen within the next few decades.
But what about robots that work with children? In 2016, Dallas-based startup RoboKind introduced a robot called Milo, designed specifically to help teach social behaviors to children with autism. The robot, which resembles a young boy, is now in about 400 schools and has worked with thousands of children.
Milo is meant to connect with children emotionally at some level, but RoboKind co-founder Richard Margolin says the company is sensitive to concerns that children could grow too attached to the robot, which features human-like speech and facial expressions.
So RoboKind recommends limits in its curriculum, both to keep Milo interesting and to ensure children can transfer the skills they learn to real life. Children are advised to meet with Milo only three to five times a week, for 30 minutes at a time.