In this study, adult participants from around the world observed and evaluated scenes of social interactions between two humans or a human and a robot, described as familiar peer relationships. In different experimental conditions, one of the characters expressed emotional vulnerability by saying, "I am sad," and the other performed a comforting gesture by touching their arm. In response, the character who was touched reciprocated the touch. Participants were asked to assess how trustworthy the character providing or receiving comfort was in terms of ability, benevolence, and integrity. Additionally, observers rated the realism of the interaction, the appropriateness and pleasantness of the touch, and the valence and arousal attributed to the characters, distinguishing between the phases in which touch was initiated and reciprocated. Our paradigm, which uses a social touch exchange to emphasise the interactive nature of the scene, supports the idea that trust is an interpersonal bond. We show that trust promotes positive appraisal of social touch: the experimental manipulations of the social scenario, mediated by observers’ propensity to trust, resulted in differences in perceived trustworthiness, in perceptions of the interaction, and in the associated affective states. In addition, we shed light on the limitations of applying these concepts to companion robots. Nevertheless, propensity to trust is a subjective and potentially plastic trait that can be leveraged to facilitate acceptance of technologies through positive experience. By showing that if we trust, social touch is perceived as more appropriate and pleasant, we take a complementary perspective to previous studies that have investigated the reverse relationship (if we touch, we trust).
First and foremost, we did not find differences in the perceived trustworthiness of individuals based on their role in the interaction: people perceive as equally trustworthy someone who comforts another in a moment of vulnerability and someone who expresses their own vulnerability. This finding significantly expands our understanding of trust, which has mainly been conceived as a one-way perception and behaviour directed from the trustor to the trustee10,17. Trust is rather an interpersonal, interactive mechanism built upon the willingness to share one's vulnerabilities with the other. Observing how these mechanisms operate in human-robot interactions allows us to understand whether they are specific to human interactions or rather fundamental principles that can be leveraged to build trust in technologies designed for social presence. We found that the robot is overall perceived as less trustworthy than human interaction partners, especially when it expresses vulnerability. This suggests that people may not desire to interact with a social robot that, like a human being, can express vulnerability and receive comfort. Symmetry in human relationships among peers is fundamental for various social processes, including perspective taking and empathy. Instead, we should perhaps consider robots as partners in asymmetric, more unidirectional relationships, in which they need to possess specific social skills to provide emotional support to humans. This clearly imposes limitations on the social relationship with a robot and raises important questions about the foundations of human-robot trust and about the design and implementation of companion robots. According to previous literature, robots are perceived as less reliable if designed in a more anthropomorphic way52.
Anthropomorphism of a robot has been found to be implicitly associated with lower agency and a lower capacity to sense and feel compared to humans53, potentially because of the mismatch between affordances (what I expect the robot to do given its appearance and features) and actual performance. Indeed, our results indicate that, when comforting one another, humans are perceived as trustworthy especially for their ability to provide support and assistance to another person, whereas robots are perceived as less skilled for social exchanges. Promising alternatives for robots that can receive touch and comfort are pet robots, which can be used in healthcare to promote patients’ well-being32,33,54. Notably, though, the perceived trustworthiness of an agent in a specific situation is influenced by observers’ dispositional attitudes, such as their general propensity to trust. Our data suggest that there are two somewhat distinct systems for trusting other people and trusting technology, which specifically come into play in these two different types of (social) interactions.
Secondly, we see that our manipulations of the social scenario, mediated by propensity to trust, result in differences in how the interaction was perceived, with trust promoting a positive appraisal of social touch. In human-to-human scenarios, propensity to trust others is positively associated with the perceived trustworthiness of the characters, the realism of the interaction, and the appropriateness and pleasantness of the touch. In human-robot scenarios, ratings of realism, appropriateness, and pleasantness are lower, especially when the robot assumes the vulnerable role. Nevertheless, individual propensity to trust technology reduces the gap between humans and robots. These insights offer a new perspective on the link between touch and trust: researchers have primarily investigated the role of social touch in promoting and facilitating interpersonal trust, whether mediated by technology or not (see Valori et al., in press, for a systematic review). Here we look at the other side of this presumably two-way relationship. We propose that trust is a prerequisite for positively perceiving tactile social interactions and that there are two somewhat distinct systems for trusting other people or technology, which specifically influence these two different types of (social) interactions. Additionally, propensity to trust is a subjective and plastic trait with the potential to influence acceptance of technologies through positive experience. It can be hypothesised that, as technology becomes more advanced and widespread in everyday life, people's overall trust in technologies will also increase. If trust is moderated by familiarity with specific tools55,56, we may have to wait for companion robots to appear more regularly in our daily contexts to understand whether future humans will be more inclined to trust and interact with them in affective ways.
Studies on the development of trust in children show that familiarity is particularly important for novice learners and that, with increasing social experience, discrimination between more and less trustworthy informants is refined to become increasingly driven by the other’s competence, even when the informant is a robot57. Trust towards others and robots is therefore plastic, and understanding individual differences can aid in personalising robot touch behaviours to optimise interactions.
Lastly, we investigated which affective states are associated with the different social scenarios, particularly in terms of valence and arousal, which are key dimensions for understanding social touch58,59. In our paradigm, social touch is used to amplify the interactive nature of a peer-to-peer comforting exchange. We see that reciprocity of touch influences the affective experience, alleviating feelings of sadness (as shown by less negative valence and reduced arousal). Observers with a higher propensity to trust others also attributed less arousal to the characters in the human-to-human scenarios. The power of reciprocal touch and trust is lessened in human-robot interactions, where we see less arousal, especially when the robot assumes the vulnerable role. Previous research found that interpersonal touch is more arousing than object-based touch, suggesting that human-to-human touch is experienced as more intense59; the robot in our study may thus have been perceived more as an object than as a social partner. Such human-robot interaction is therefore perceived as less realistic, appropriate, and pleasant, as well as less emotionally meaningful. We also found that observers with higher aversion towards social touch perceived the scenarios as overall less arousing. If higher touch aversion were associated with higher vigilance towards observed social touch (as suggested by the neural responses reported in60), we would expect the opposite relation between touch aversion and arousal. On the other hand, it is possible that less touchy-feely people are simply less activated by scenarios of vicarious touch, without necessarily experiencing discomfort or hyper-vigilance. Indeed, valence does not appear to be influenced by individuals’ touch aversion in our data.
It is worth mentioning that this study has some limitations, which open the door to future research. We focused on the perception of observed social tactile interactions between two humans or a human and a robot. To safeguard the simplicity of the experimental design and statistical models, we did not include a control condition in which the interaction did not involve touch. Moreover, we used static pictures instead of animations to avoid confounds such as touch velocity; comforting touch has well-known optimal velocity ranges in human-to-human interactions5. Robots can also be programmed to execute movements with spatio-temporal patterns designed to represent different emotions (e.g., in61). However, the movements of real robots are still far from the smoothness of human ones, and creating animations in which a robot moves to touch as a human would can lead us into an uncanny valley. To further probe the role of social touch in human-robot interactions, future studies might not only compare touch and no-touch conditions but also explore different types of touch. Different combinations of the physical parameters of touch, such as velocity, intensity, duration, and contact area, result in different gestures (e.g., stroking, holding, shaking, tapping, squeezing) that convey different emotional meanings, from sadness to joy, gratitude, and love62. It is also crucial to disentangle the importance of the robot being able both to understand and to communicate through touch. To become a socially intelligent partner, a robot must be able to capture and classify human touch and respond to it appropriately, interpreting not only tactile features but also contextual factors63. At the same time, the robot should also be able to touch the human in an affective way, producing tactile gestures that the human can understand64.
The present study is based on an observational task in which participants are exposed to images of social interactions that include touch. Although the participants play the role of simple observers of scenes taking place between two characters, the literature suggests that the mere observation of others' touch leads to pleasantness ratings65 and brain activity similar to those associated with a first-person experience of touch (e.g., as found in monkeys66). Therefore, participants' evaluations of the proposed stimuli can be interpreted as an indicator of how they would perceive the social situation themselves. Nonetheless, future studies should conduct lab-based experiments in which participants interact with robots. This possibility is challenged by the limited skills and capacity for actual interactivity that robots have at present, especially with regard to exchanges involving social touch32,63. Among the most fascinating possibilities this set-up would open up is the integration of neural, physiological, and kinematic measurements to characterise human cognition, perception, and action during social interactions with robots.
Although there has been significant progress in creating more advanced and socially adept robots in recent years, there are concerns that the field is entering a winter phase of disillusionment67. Researchers are devoting substantial resources to enhancing the naturalness and authenticity of robot behaviours (e.g., designing robots to display emotions and responses that are as realistic as possible), with the idea that this will foster more genuine and meaningful interactions with humans. For instance, robots are being programmed to recognise touch gestures68 and to perform touches with optimal sensorimotor features so as to be perceived as pleasant and non-intrusive50,69. However, touch is a communicative signal that takes on various nuances, uses, and interpretations depending on the context and on the person giving or receiving it70. Our society has yet to establish social norms for digital social touch, through a dialogue between what is technologically feasible and what is truly desired by, and beneficial for, the human in the loop71,72. It is crucial that we understand under which conditions and in which contexts human-robot interactions can benefit from social touch. To address this, it is essential to clearly define the neurocognitive processes that underpin human-robot interactions, employing neuroscience and psychophysiology techniques to uncover the genuine capabilities and limits of social robots73.
In conclusion, perceiving other individuals as trustworthy is crucial in affective exchanges that involve social touch, where barriers between the self and the other are reduced and we share vulnerabilities and offer closeness and comfort. Here we provide evidence that trust is an interpersonal, interactive tango rather than the one-way mechanism from trustor to trustee studied in previous literature. We also show that trust promotes positive appraisal of social touch, offering a complementary perspective to studies that have shown the reverse effect of touch as a trust booster. Looking to the future, we see our lives increasingly intertwined with technologies such as robots, which are not only tools but also partners in social exchanges. Yet we still do not know what social norms apply to these new interactions. The present findings show potential limits to the social power of trust and touch in human-robot interactions, while suggesting that leveraging individuals' positive attitudes and trust towards technology can reduce the distance between humans and robots. This will help to shed light on crucial challenges in designing robots that we humans could perceive as partners to trust and touch.