Modeling the Dynamics of Nonverbal Behavior on Interpersonal Trust for Human-Robot Interactions

We describe research towards creating a computational model for recognizing interpersonal trust in social interactions. We found that four negative gestural cues (leaning backward, face touching, hand touching, and crossing arms) are together predictive of lower levels of trust. Three positive gestural cues (leaning forward, having arms in lap, and open arms) are predictive of higher levels of trust. We train a probabilistic graphical model using natural social interaction data, a "Trust Hidden Markov Model" that incorporates the occurrence of these seven important gestures throughout the social interaction. This Trust HMM predicts with 69.44% accuracy whether an individual is willing to behave cooperatively or uncooperatively with their novel partner; in comparison, a gesture-ignorant model achieves 63.89% accuracy. We attempt to automate this recognition process by detecting these trust-related behaviors through 3D motion-capture technology and gesture-recognition algorithms. We aim to eventually create a hierarchical system, with low-level gesture recognition feeding high-level trust recognition, that is capable of predicting whether an individual finds another to be a trustworthy or untrustworthy partner through their nonverbal expressions.
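The abstract's "Trust HMM" can be illustrated with a minimal forward-algorithm sketch: two hidden trust states emit the seven gesture cues named above, and the forward pass yields a posterior over the trust state given an observed gesture sequence. All parameter values below (initial, transition, and emission probabilities) are illustrative assumptions for demonstration only, not the authors' trained model.

```python
import numpy as np

# Seven gesture cues from the paper: four negative, three positive.
GESTURES = ["lean_back", "face_touch", "hand_touch", "cross_arms",  # negative cues
            "lean_forward", "arms_in_lap", "open_arms"]             # positive cues

# Hidden states: index 0 = low trust, index 1 = high trust.
pi = np.array([0.5, 0.5])            # initial state distribution (assumed)
A = np.array([[0.8, 0.2],            # transition probabilities (assumed):
              [0.2, 0.8]])           # trust state tends to persist over time

# Emission probabilities (assumed): negative cues are more likely in the
# low-trust state, positive cues in the high-trust state. Rows sum to 1.
B = np.array([
    [0.20, 0.20, 0.15, 0.15, 0.10, 0.10, 0.10],  # low trust
    [0.05, 0.05, 0.05, 0.05, 0.30, 0.25, 0.25],  # high trust
])

def forward_posterior(obs):
    """Forward algorithm: P(hidden state at final step | gesture sequence)."""
    alpha = pi * B[:, obs[0]]                # initialize with first observation
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]        # propagate and absorb next cue
    return alpha / alpha.sum()               # normalize to a posterior

# A sequence of positive cues should favor the high-trust state.
seq = [GESTURES.index(g) for g in ["lean_forward", "open_arms", "arms_in_lap"]]
post = forward_posterior(seq)
print(post[1] > post[0])  # True: high-trust state is more probable
```

In the paper, such state posteriors (estimated from gestures detected over the whole interaction) are used to predict cooperative versus uncooperative behavior; here the parameters are hand-set rather than learned from interaction data.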


Bibliographic Details
Main Authors: Lee, Jin Joo (Contributor), Knox, Brad (Contributor), Breazeal, Cynthia Lynn (Contributor)
Other Authors: Massachusetts Institute of Technology. Media Laboratory (Contributor), Program in Media Arts and Sciences (Massachusetts Institute of Technology) (Contributor)
Format: Article
Language: English
Published: Association for the Advancement of Artificial Intelligence, 2014-12-18T17:35:33Z.
Subjects:
Online Access: Get fulltext
LEADER 02201 am a22002293u 4500
001 92378
042 |a dc 
100 1 0 |a Lee, Jin Joo  |e author 
100 1 0 |a Massachusetts Institute of Technology. Media Laboratory  |e contributor 
100 1 0 |a Program in Media Arts and Sciences   |q  (Massachusetts Institute of Technology)   |e contributor 
100 1 0 |a Lee, Jin Joo  |e contributor 
100 1 0 |a Knox, Brad  |e contributor 
100 1 0 |a Breazeal, Cynthia Lynn  |e contributor 
700 1 0 |a Knox, Brad  |e author 
700 1 0 |a Breazeal, Cynthia Lynn  |e author 
245 0 0 |a Modeling the Dynamics of Nonverbal Behavior on Interpersonal Trust for Human-Robot Interactions 
260 |b Association for the Advancement of Artificial Intelligence,   |c 2014-12-18T17:35:33Z. 
856 |z Get fulltext  |u http://hdl.handle.net/1721.1/92378 
520 |a We describe research towards creating a computational model for recognizing interpersonal trust in social interactions. We found that four negative gestural cues (leaning backward, face touching, hand touching, and crossing arms) are together predictive of lower levels of trust. Three positive gestural cues (leaning forward, having arms in lap, and open arms) are predictive of higher levels of trust. We train a probabilistic graphical model using natural social interaction data, a "Trust Hidden Markov Model" that incorporates the occurrence of these seven important gestures throughout the social interaction. This Trust HMM predicts with 69.44% accuracy whether an individual is willing to behave cooperatively or uncooperatively with their novel partner; in comparison, a gesture-ignorant model achieves 63.89% accuracy. We attempt to automate this recognition process by detecting these trust-related behaviors through 3D motion-capture technology and gesture-recognition algorithms. We aim to eventually create a hierarchical system, with low-level gesture recognition feeding high-level trust recognition, that is capable of predicting whether an individual finds another to be a trustworthy or untrustworthy partner through their nonverbal expressions. 
546 |a en_US 
655 7 |a Article 
773 |t Proceedings of the 2013 AAAI Spring Symposium Series