It’s not surprising that Japanese computer scientists have taken a different approach to AI research than their Western counterparts. In particular, they have focused on the emotional or character components of robots rather than strictly on questions of efficiency and intelligence. Recently, these efforts have resulted in the creation of something called an “emotion engine” for AI, the product of a collaboration between Honda and SoftBank, the Japanese mobile network provider.
While Japanese robots have lagged behind in some respects, in artificial emotions they are now showing signs of leading the field. The reason may be a willingness to ask seemingly nonsensical questions, like “if a robot could feel, what emotion would it be feeling?” Though some might laugh, on an everyday basis humans are notorious for imputing emotions and motives to things in which no emotions really exist. Some neuroscientists have speculated that this is due to an overactive social brain. For many millennia, human survival depended upon reading other people’s emotions. This led to the development of a hypersensitive emotion-detection system, prone to seeing emotion and motive where none exists.
The success or failure of commercial AI might well depend on how robots and autonomous vehicles respond to people’s emotions. The emotion engine Honda is developing will serve that precise purpose. Using sensors and cameras, the AI will gauge the user’s emotional state and respond with an emotional judgement of its own. Not surprisingly, the technology will first see use in a self-driving concept car Honda is developing called the NeuV. However, there is little reason to believe such a system would not find broader application in robots, like Pepper, that are specifically designed for human-robot interaction.
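Honda has not published technical details of the emotion engine, but the basic loop the description implies — read sensors, classify the user’s emotional state, choose a response — can be sketched. Everything below (the sensor fields, emotion labels, thresholds, and responses) is a hypothetical illustration, not Honda’s actual design:

```python
# Toy sketch of an emotion-engine loop: classify a driver's state from
# sensor readings and pick a matching response. All labels, thresholds,
# and responses here are illustrative assumptions, not Honda's system.

from dataclasses import dataclass


@dataclass
class SensorReading:
    heart_rate: int        # beats per minute, e.g. from a seat sensor
    voice_pitch_hz: float  # average vocal pitch from a cabin microphone


def classify_emotion(reading: SensorReading) -> str:
    """Map raw readings to a coarse emotional label (toy heuristic)."""
    if reading.heart_rate > 100 and reading.voice_pitch_hz > 220:
        return "stressed"
    if reading.heart_rate < 65:
        return "calm"
    return "neutral"


# The "emotional judgement" step: each detected state maps to an action.
RESPONSES = {
    "stressed": "Suggest a break and soften the cabin lighting.",
    "calm": "Maintain current settings.",
    "neutral": "No intervention.",
}


def respond(reading: SensorReading) -> str:
    return RESPONSES[classify_emotion(reading)]
```

A real system would replace the hand-written thresholds with a trained classifier over camera and audio features, but the sense–classify–respond structure stays the same.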
“To control, in effect, is to be controlled: by driving the car properly, I enable it to play a safe and useful role in life,” Masahiro Mori, the leading Japanese roboticist, said. “But by controlling me, the automobile enables me to be a reliable and effective driver. The same relationship links human beings with all machines. They don’t do what you want them to do unless you do what they force you to do.”
If we spend too much time operating machines that lack emotions, the cognitive faculties that govern our own emotions could begin to atrophy. Granting emotions to machines, therefore, may ultimately — and counter-intuitively — be more about preserving our own emotions than about passing them on to robots.