[HEART#1] Social Robotics Today and Tomorrow: An Overview

On 12 October 2020, SBRE hosted the first webinar of the HEART series, Social Robotics Today and Tomorrow: An Overview. In the webinar, three international experts in social robotics unraveled the why, what and how of the present and future of social robots, and discussed the many opportunities and challenges offered by the multidisciplinarity of this domain.
Our speakers:
Our first speaker, Ioana Ocnarescu, is in charge of Strate Research, the research and innovation department of Strate School of Design.
As a teacher-researcher, Ioana promotes research among design students and teaches courses on experience design and robotics. She co-directs several PhD projects and is responsible for the "Robotics by Design Lab - AI, Robots & Humans - ecologies of living together".
"I arrived at Strate 7 years ago and my first project was actually a project with SoftBank robotics [...] and I started to understand what is this field of social Robotics".
She explained what designers bring to this field: an effective methodology for developing new concepts. "We, designers, know how to conduct field studies, how to observe, how to talk to people, and then we iterate through prototyping concepts, exploring and testing them in real life."

We also hosted Chloé Clavel, a Professor in Affective Computing at Telecom-Paris. Her research revolves around audio and text mining applications and deals with interactions between humans and virtual agents. Chloé recently published a book called "Opinion Analysis in Interactions" [1].
“The idea of the book is to give an overview of natural language processing methods in a human-agent interaction and in particular natural language processing methods that are dedicated to the analysis of social emotional behaviors - e.g. behaviors corresponding to moods, sentiments, opinions, attitudes, emotions or social states - which are very important in a human-agent and human-robot interaction".
"The robot must understand the user’s social emotional behavior to detect for example when the user expresses frustration or boredom during the interaction and be able to react to it in a fluid and natural manner to ensure the user enjoys the experience”.The third speaker was Victor Paléologue, an R&D Software Engineer at SBRE who recently graduated with a PhD at Sorbonne University, demonstrating that users at home can teach new behaviors to robots using the spoken language. Victor is one of the rare lucky ones who had an opportunity to live with a robot as part of SBRE’s Pepper@Home program, internal program for developers to collect feedback and ideas for apps and new features:
“At the beginning you are really amazed by having a sort of living creature at home that can talk to you, that can answer questions and wake you up in the morning. But actually, you really quickly figure it all out and realize that the robot doesn’t get it actually, the interaction is not as deep as it should be [...] That was the motivation of my PhD, to teach the new behaviors to the robots so the robots can evolve with time.”
The present and the future of social robotics
The webinar started with a discussion about what makes a robot social. The speakers agreed it is related to the robot's social skills, for example the ability to analyze the user's social emotional behavior and respond appropriately. Victor also noted that a robot may become social simply because a human is in the loop: explaining something to a person and making sure the message is understood already turns the interaction into a social one. We humans are complex social beings, and a task as simple as giving an instruction requires more than just speech. It also demands understanding social cues, such as boredom, and responding to them, for example by trying to recapture the user's attention.
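As a purely illustrative sketch of that idea (the cue labels and robot reactions below are assumptions, not a real robot API), an instruction-giving loop might check for such cues between steps and try to re-engage the user before continuing:

```python
# Toy sketch of the reactive loop described above: if a social cue such as
# boredom is detected while giving an instruction, the robot tries to
# recapture the user's attention before continuing. Cue labels and robot
# reactions are illustrative assumptions, not a real robot API.

def detect_cue(observation: str) -> str | None:
    """Stand-in for a perception module that maps an observation to a cue."""
    cues = {"looking away": "boredom", "frowning": "confusion"}
    return cues.get(observation)

def give_instruction(steps: list[str], observations: list[str]) -> None:
    """Deliver instruction steps, reacting to social cues between them."""
    for step, observation in zip(steps, observations):
        cue = detect_cue(observation)
        if cue == "boredom":
            print("Robot: (waves) Hey, this next part matters!")
        elif cue == "confusion":
            print("Robot: Let me rephrase that.")
        print(f"Robot: {step}")

give_instruction(
    ["First, press the green button.", "Then, hold it for three seconds."],
    ["attentive", "looking away"],
)
```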
From another perspective, Ioana said that a social robot is what makes a person social: in that logic, “a robot is a way for us to open to the world”, a mediator [2]. She also noted that the definition and the use of social robots can differ depending on the context and the environment. For example, in a robot-child interaction during rehabilitation, the robot's social skills may be the key contribution, whereas the mere presence of a robot as a mediator between two humans (in telepresence, for instance) is already enough to make it social [3].
But what makes a robot truly socially intelligent? For Victor, the key lies in the flexibility of the system: its capability for continuous learning and adaptation [4].
In terms of machine learning, Chloé explained that the challenge lies in defining and developing methods that take into account asymmetrical and sequential interactions [1].
Ioana brought other elements to the table, such as the design, the functionality, the overall experience and the underlying need the robot answers. What remains essential, from her point of view, is that a social robot is conceived for a certain ecosystem: “when we design robots for people in a hospital, we should design robots for people taking care of the people”. The main idea is to know the users and the environment and to feed this knowledge into the design process [5].
Another challenge is how to evaluate whether a robotic system is actually social. As a pluridisciplinary field, social robotics relies on several kinds of evaluation. One can, for example, evaluate the success of an interaction and how much of it is due to the robot's social component; standard questionnaires [6][7] are used to measure engagement, social presence, usability or acceptability. It can also be useful to measure whether the robot's detection system for analysing the user's social emotional behavior is accurate. In Chloé's opinion, however, what these metrics often miss is the context of the human-robot interaction: experiments are usually run in the lab or in limited settings. Ioana agrees that the answer could lie in “in the wild” experimentation [8].
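To illustrate the questionnaire side of such evaluations (the construct names and the 1-5 scale below are assumptions for the sketch, not the actual items of the instruments in [6] or [7]), a construct score is typically just an average of Likert-style item ratings:

```python
# Toy sketch: averaging Likert-style questionnaire items per construct.
# Construct names and the 1-5 scale are illustrative assumptions,
# not the actual items of the questionnaires cited in [6] or [7].

from statistics import mean

responses = {
    "engagement":      [4, 5, 3, 4],   # one participant's 1-5 ratings
    "social_presence": [3, 3, 4],
    "usability":       [5, 4, 4, 5],
    "acceptability":   [4, 4, 3],
}

scores = {construct: mean(items) for construct, items in responses.items()}
for construct, score in scores.items():
    print(f"{construct}: {score:.2f} / 5")
```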
Research & Industry
“Design is bigger than us”, Ioana started. She clarified that the designers are efficient in terms of creating scenarios and initial prototypes but to create an optimal final product they need to work closely with researchers in AI, sociology, psychology and other relevant disciplines [5]. Following this idea, Ioana and her team started a multidisciplinary laboratory “Robotics by Design Lab”, created with 5 industrial and 3 academic partners as well as a design consultancy and PhD students with different backgrounds.
Chloé pointed out another benefit of industry–university collaboration: data sharing. Building on the earlier point about evaluating performance in real-life settings, she added that industrial partners can help design evaluation metrics appropriate to the application, the type of robot and the context.
Victor agreed and added that researchers should always try to apply their research to real use cases, gather data and run studies “in the wild”.
Affective computing
After a brief introduction to affective computing [9] and the original “big 6” emotions [10], Chloé explained how the research domain has evolved [11]. Initially the field was studied using only acted data, whereas more and more emotion-related data is now collected “in the wild”. Real-life use cases rarely involve the big 6: people interacting with a robot at a train station will hardly ever show fear, for instance. With real-life data, research can, and should, target a broader set of emotions, including mixed emotions.
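As a toy illustration of what a broader, non-exclusive label set could look like in practice (the labels and keyword cues below are assumptions, and keyword matching stands in for the trained NLP models a real system would use), mixed states outside the big 6 can be tagged like this:

```python
# Toy sketch of tagging utterances with a broader, non-exclusive set of
# affective labels. Keyword matching is a stand-in for trained NLP models;
# the labels and cue words are illustrative assumptions.

BROAD_LABELS = {
    "frustration": {"annoying", "not working", "again", "ugh"},
    "boredom":     {"whatever", "boring", "never mind"},
    "interest":    {"tell me more", "interesting", "how does"},
    "confusion":   {"what do you mean", "don't understand"},
}

def tag_affect(utterance: str) -> list[str]:
    """Return every label whose cues appear; mixed emotions are allowed."""
    text = utterance.lower()
    return [label for label, cues in BROAD_LABELS.items()
            if any(cue in text for cue in cues)]

print(tag_affect("Ugh, this is not working again... but how does it decide?"))
# -> ['frustration', 'interest']  (a mixed state, outside the big 6)
```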
Personalized robots
Victor believes in the importance of personalization (his personal favorite is a butler robot :)) and it is a driving force of his research. One approach he proposes is using spoken language to explain to the robot what we expect from it.
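A minimal, hypothetical sketch of that idea is shown below: a spoken-style “when X, do Y” instruction is parsed into a stored trigger-action rule. The parsing pattern and function names are illustrative assumptions, not the system described in [4].

```python
# Minimal, hypothetical sketch of personalization through spoken language:
# turning a "when <trigger>, <action>" instruction into a stored rule.
# This is an illustration only, not the system described in [4].

import re

rules: dict[str, str] = {}

def teach(utterance: str) -> None:
    """Store a trigger -> action rule from a spoken-style instruction."""
    match = re.match(r"when (.+?), (.+)", utterance.lower())
    if match:
        trigger, action = match.groups()
        rules[trigger] = action

def react(event: str) -> str | None:
    """Return the taught action for an observed event, if any."""
    return rules.get(event.lower())

teach("When I say good morning, open the blinds and read the news")
print(react("I say good morning"))
# -> open the blinds and read the news
```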
Ioana is more sceptical. In such a young field of research, she believes we need to be especially careful when working on the personalization of a new robot: how can we, and can we at all, ask the right questions from the start to be certain we are answering the real need?
The webinar finished with questions from the audience. You can find the speakers’ answers to these questions here.
You can also rewatch the webinar on our YouTube channel! Subscribe to it and enjoy more robotics-related videos.
References
[1] Clavel, Chloe. Opinion Analysis in Interactions: From Data Mining to Human-Agent Interaction. John Wiley & Sons, 2019.
[2] Feil-Seifer, David, and Maja J. Mataric. "Defining socially assistive robotics." 9th International Conference on Rehabilitation Robotics (ICORR 2005). IEEE, 2005.
[3] Forlizzi, Jodi, Carl DiSalvo, and Francine Gemperle. "Assistive robotics and an ecology of elders living independently in their homes." Human–Computer Interaction 19.1-2 (2004): 25-59.
[4] Paléologue, Victor. Teaching Robots Behaviors Using Spoken Language in Rich and Open Scenarios. Diss. Sorbonne Université, 2019.
[5] Ocnarescu, Ioana, and Isabelle Cossin. "Rethinking the Why of Socially Assistive Robotics Through Design." International Conference on Social Robotics. Springer, Cham, 2017.
[6] Heerink, Marcel, et al. "Assessing acceptance of assistive social agent technology by older adults: the Almere model." International Journal of Social Robotics 2.4 (2010): 361-375.
[7] Nomura, Tatsuya, Takayuki Kanda, and Tomohiro Suzuki. "Experimental investigation into influence of negative attitudes toward robots on human–robot interaction." AI & Society 20.2 (2006): 138-150.
[8] Salter, Tamie, Iain Werry, and Francois Michaud. "Going into the wild in child–robot interaction studies: issues in social robotic development." Intelligent Service Robotics 1.2 (2008): 93-108.
[9] Picard, Rosalind W. Affective computing. MIT press, 2000.
[10] Lewis, Michael, Jeannette M. Haviland-Jones, and Lisa Feldman Barrett, eds. Handbook of emotions. Guilford Press, 2010.
[11] Barrett, Lisa Feldman. How emotions are made: The secret life of the brain. Houghton Mifflin Harcourt, 2017.