In the future, robots may serve in a variety of support roles, such as home assistance, office support, nursing, childcare, education, and elder care. When we reach that point, people may share their personal lives with robots, which, in turn, may create long-term personal relationships in the minds of humans. Home robots, for example, could help with household chores, entertain, teach new skills, or encourage exercise. Robots may assist people with hobbies, such as carpentry or jewelry making, or help children with their homework and music lessons. In any of these roles, robots may be required to monitor the humans they interact with and engage in supportive interactions.
For example, a robot serving in a care facility might provide support by listening to the experiences and memories of elderly people. The way a robot responds to the human’s communication in such scenarios may have a profound effect on various personal and relationship outcomes, including the human’s perception of the robot, the human’s sense of support and security, the human’s willingness to continue to interact with the robot, and the human’s overall well-being.
We know from social psychology research that perceiving another person as responsive to one’s needs is central to the formation of emotional bonds. A partner’s perceived responsiveness, that is, their support and validation of one’s emotional needs, benefits personal and relationship well-being because it signifies the belief that the other person can be counted on for reliable support.1 Unfortunately, the social skills displayed by many caregiving robots are not effective enough to evoke the sense of responsiveness that characterizes supportive human interaction.2 In a newly published research study,3 we explored whether implementing responsiveness cues in a robot would be compelling enough for these building blocks of human bonding to emerge even in interaction with an inanimate object. Specifically, we examined whether humans would be receptive to responsive support from a robot, using it as a safe haven in times of need and as a secure base for becoming more confident in a subsequent stressful interaction.
In two studies, participants recounted a personal event to a small desktop robot. For half the participants, the robot responded with responsiveness cues: gestures (maintaining a forward focus toward the participant, gently swaying back and forth to display animacy, and nodding affirmatively in response to human speech) and on-screen text containing positively responsive speech acts (e.g., “You must have gone through a very difficult time”). The other half interacted with an unresponsive robot, which looked “alive and listening” but did not respond with body language and acknowledged that it was listening only with generic text (“Please go on to the next part of your story”).
We found that people who interacted with a responsive robot (a) felt more positive about the robot; (b) expressed more desire to use the robot as a companion in stressful situations (e.g., visiting the dentist); and (c) exhibited more approach behaviors toward the robot in their body language (e.g., leaning, smiling, and eye contact). We see these behaviors as signaling warmth and interest in close contact. Moreover, when participants had to undergo a stress-generating task (in our case, introducing oneself to potential romantic partners) after interacting with the robot, those who had interacted with the responsive robot showed improved self-perception. That is, a responsive robot made participants feel better about themselves, enhancing their perception of their own value as potential mates in the subsequent self-presentation task.
Hence, people seem to find the “responsive” robot compelling and respond to it in ways they typically reserve for social partners, for example, seeking the robot’s psychological proximity through their body language. In addition, people can leverage responsive social interactions with a robot to become more confident and appealing romantic partners. Overall, our study indicates that a responsive robot can be reassuring and compelling enough to build a sense of security that, in turn, leads to better functioning under threatening circumstances.
1Reis, H. T., & Clark, M. S. (2013). Responsiveness. In J. A. Simpson & L. Campbell (Eds.), The Oxford handbook of close relationships (pp. 400-423). New York: Oxford University Press.
2Torta, E., Oberzaucher, J., Werner, F., Cuijpers, R. H., & Juola, J. F. (2012). Attitudes towards socially assistive robots in intelligent homes: Results from laboratory studies and field trials. Journal of Human-Robot Interaction, 1, 76-99.
3Birnbaum, G. E., Mizrahi, M., Hoffman, G., Reis, H. T., Finkel, E. J., & Sass, O. (2016). What robots can teach us about intimacy: The reassuring effects of robot responsiveness to human disclosure. Computers in Human Behavior, 63, 416-423.
Prof. Gurit Birnbaum works at the Baruch Ivcher School of Psychology, the Interdisciplinary Center (IDC) Herzliya (Israel). Her research focuses on the underlying functions of sexual fantasies and on the convoluted role played by sexuality in the broader context of close relationships.