Though many learned about Pepper’s existence when Softbank presented him at the 2017 Consumer Electronics Show (CES) this month, I had a chance to interact with the humanoid robot well before then. Pepper has now been adopted into many Japanese homes as a friendly companion, but the robot was also designed to help businesses grow their brands. One of the companies chosen for beta testing was the tech company where a developer friend of mine works. Excited by the news, I went over to his office to see the robot with my own eyes and was surprised when the four-foot-tall robot greeted me by name. It turns out that my friend had an app on his phone that let him type phrases for Pepper to say. However, once Pepper has interacted with you for some time, he will remember you.
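
For the curious, here is roughly what pushing a phrase to Pepper looks like in code. This is only a sketch assuming the NAOqi Python SDK that Pepper ships with and a placeholder IP address; it is not the actual app my friend used.

```python
import qi

# Connect to Pepper over the local network (the IP address is a placeholder)
session = qi.Session()
session.connect("tcp://192.168.1.10:9559")

# Hand a typed phrase to the robot's text-to-speech service
tts = session.service("ALTextToSpeech")
tts.say("Hello, nice to meet you!")
```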

Yeah, Pepper is the first robot to read human emotions. And remember them.

[Image credit: Softbank]

Not only can Pepper detect basic emotions like joy, sadness, anger, and surprise, but he can also recognize and interpret a smile, a frown, your tone of voice, the words you choose, and your non-verbal cues. He does this with the 3D camera and two HD cameras installed in his head, which let him track movement and read the emotions on a speaker’s face. Thanks to his four directional microphones, Pepper can locate the source of a sound, and he reacts to it through his loudspeakers. Pepper compiles all of this information to judge whether the speaker is in a good or bad mood and responds accordingly. Once, the robot saw my friend smiling and complimented him. Furthermore, the robot adapts himself to you as he learns your personality traits, preferences, tastes, and habits.
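
To give a sense of how a developer might hook into those sensors, here is a minimal sketch, again assuming the NAOqi Python SDK and a placeholder IP address, that greets whoever the cameras spot. It illustrates the event model only; it is not Softbank’s actual emotion engine.

```python
import qi

# Connect to Pepper (the IP address is a placeholder)
app = qi.Application(["GreetingDemo", "--qi-url=tcp://192.168.1.10:9559"])
app.start()
session = app.session

memory = session.service("ALMemory")
faces = session.service("ALFaceDetection")
tts = session.service("ALTextToSpeech")

greeted = [False]

def on_face_detected(value):
    # The FaceDetected event carries tracking data whenever a face is in view
    if value and not greeted[0]:
        greeted[0] = True
        tts.say("Hello there, I can see you!")

faces.subscribe("GreetingDemo")                 # start the face detection module
subscriber = memory.subscriber("FaceDetected")  # listen for the event in ALMemory
subscriber.signal.connect(on_face_detected)

app.run()  # keep the program alive so the callback can fire (Ctrl+C to stop)
```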

Pepper’s innovative body is what allows the robot to move autonomously. I experienced this firsthand when I walked in a circle around Pepper. As I walked, his large eyes glowed blue in my direction and his three omnidirectional wheels, driven by his 20 motors, rotated his body a full 360° at a speed of 3 km/h. To avoid collisions, two ultrasound transmitters and receivers, six laser sensors, and three obstacle detectors in his legs give him information about objects within three meters. However, his battery life is only 12 hours, so like Cinderella, when the time is up, so is his autonomy. Luckily, a sensor within the battery detects when the charge is running low. Even so, the charging cord was one of my biggest aesthetic grievances with the bot. The long, heavy cord is not very discreet, and in a world where almost everything is moving toward wireless, it seemed antiquated. Softbank could at least have tried something like the Roomba’s self-docking charger.
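
For a sense of how that base is driven in software, here is a rough sketch using the NAOqi Python SDK’s ALMotion service (the IP address and values are placeholders), spinning Pepper in place the way he turned to follow me.

```python
import math
import qi

# Connect to Pepper (the IP address is a placeholder)
session = qi.Session()
session.connect("tcp://192.168.1.10:9559")

motion = session.service("ALMotion")
motion.wakeUp()                    # stiffen the joints so the base can move
motion.moveTo(0.0, 0.0, math.pi)   # turn half a circle in place (x, y in meters, theta in radians)
motion.moveTo(0.0, 0.0, math.pi)   # and again, completing the full 360 degrees
motion.rest()                      # relax the joints when finished
```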

Besides learning things about you and developing a complementary personality of its own, Pepper can be programmed to learn new phrases and poses (yes, poses, as in putting one hand on the hip, or Tai Chi). I was able to watch this unnatural human creation practice the ancient martial art of Tai Chi in person, and it was the most surreal moment of my life to date.
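
Those poses and routines are typically triggered as pre-built animations. As a rough illustration only, assuming the NAOqi Python SDK, a placeholder IP address, and an example animation path, the call looks something like this:

```python
import qi

# Connect to Pepper (the IP address is a placeholder)
session = qi.Session()
session.connect("tcp://192.168.1.10:9559")

# Play one of the robot's installed animations; the path below is just an
# example built-in gesture, and routines like Tai Chi ship as similar
# animation packages
animation = session.service("ALAnimationPlayer")
animation.run("animations/Stand/Gestures/Hey_1")
```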

[Image: Pepper strikes a pose at CES. Credit: Softbank]

From a developer’s perspective, my friend said that interactions with Pepper can be anywhere from “sweet and simple to mind-numbingly obnoxious.” On both counts, I was told that Pepper can run almost any program written for Android OS via his tablet; however, any custom program that actually involves interfacing with the physical robot requires more work than a typical Android app. Getting the robot to respond to you can also be a bit difficult, which CNN’s Samuel Burke experienced during an awkward date with the robot. Like Burke, we ran into spotty voice recognition, along with a slow boot-up and freezing while using the tablet. Perhaps this is because the tablet only supports 802.11 a/b/g/n wireless connectivity.
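
To show why robot-side programs take more work than a plain tablet app, here is a rough sketch of a voice interaction written against the NAOqi Python SDK (the IP address, vocabulary, and confidence threshold are all placeholders): you register a vocabulary, subscribe to the recognizer, and handle events yourself.

```python
import qi

# Connect to Pepper (the IP address is a placeholder)
app = qi.Application(["ListenDemo", "--qi-url=tcp://192.168.1.10:9559"])
app.start()
session = app.session

asr = session.service("ALSpeechRecognition")
memory = session.service("ALMemory")
tts = session.service("ALTextToSpeech")

asr.pause(True)
asr.setLanguage("English")                       # assumes the English language pack is installed
asr.setVocabulary(["hello", "goodbye"], False)   # the small set of words Pepper should listen for
asr.pause(False)
asr.subscribe("ListenDemo")                      # start the speech recognition engine

def on_word_recognized(value):
    # value is a list of alternating words and confidence scores
    word, confidence = value[0], value[1]
    if confidence > 0.4:                         # ignore low-confidence guesses; recognition can be spotty
        tts.say("You said " + word)

subscriber = memory.subscriber("WordRecognized")
subscriber.signal.connect(on_word_recognized)

app.run()  # keep listening until the program is stopped (Ctrl+C)
```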

Moreover, one of the biggest difficulties that comes with having a Japanese robot is the language. My friend admitted that they haven’t figured out how to switch Pepper’s tablet to English, and there might not even be a way to do that yet. The communication barrier makes learning about Pepper much more difficult, but he doesn’t see why it couldn’t be “programmed to dance, to attempt to mimic a human in real time, to be an awfully superfluous calculator, to follow someone around and pinch them whenever they stop, or to compliment fashion choices.” Overall, I was impressed by the robot, and my friend agreed, commenting, “What other commercially available robots exist that can follow you and react based on your own outward emotions out of the box? To have an entire development suite on top of that makes the possibilities seem endless.”

[Image: Pepper at CES. Credit: Softbank]

Not to sound like an apocalyptic Hollywood film, but some of these endless possibilities include taking the jobs of humans. We already have robots doing jobs that humans once did in factories, farming, customer service, and security. However, with his ability to perceive human emotions and adapt his behavior to the speaker’s mood, Pepper could take on even bigger roles. Softbank has already proposed some ways in which Pepper could revolutionize customer service. Its own mobile stores are using Pepper to “create interest, generate traffic to the stores, greet customers, present company offers and provide entertainment to make in store waits more pleasant.”

Even Nestlé has hopped on board, adding the robot to more than 1,000 Nescafé sales outlets in Japan to tell customers about its different products. In homes, Pepper serves as entertainment with his ability to dance, play, learn, and chat in another language. He can even report the weather and give you the news. Though some of these capabilities are already implemented in smartphones, computers, and TVs, the human-like design combined with the aforementioned emotional capability is what makes Pepper a game-changer: a game-changer that will allow him to take over many jobs that require a physical or emotional presence.