Automatic recognition of touch gestures in the corpus of social touch
Abstract
For an artifact such as a robot or a virtual agent to respond appropriately to human social touch behavior, it should be able to automatically detect and recognize touch. This paper describes the data collection of CoST: Corpus of Social Touch, a data set containing 7805 captures of 14 different social touch gestures. All touch gestures were performed in three variants (gentle, normal and rough) on a pressure sensor grid wrapped around a mannequin arm. Recognition of these 14 gesture classes with various classifiers yielded accuracies of up to 60%; gentle gestures proved harder to classify than normal and rough ones. We further analyzed how the choice of classifier, interpersonal differences and gesture variant affected recognition accuracy, and which gestures were most often confused. Finally, we outline directions for further research to ensure a proper transfer of the touch modality from interpersonal interaction to areas such as human–robot interaction (HRI).
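To make the classification setup concrete, the sketch below shows one way such a pipeline could look in Python with scikit-learn. It is not the authors' implementation: the 8×8 grid resolution, the hand-crafted summary statistics, the synthetic captures and labels, and the RBF-kernel SVM are all illustrative assumptions chosen for brevity.

```python
# Illustrative sketch of a touch-gesture classification pipeline on pressure-grid data.
# Dimensions, features, and data are assumptions, not the authors' exact setup:
# each capture is taken to be a sequence of 8x8 pressure frames, summarized with a
# few global statistics and classified with an SVM.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

N_CLASSES = 14      # number of touch gesture classes, as in CoST
GRID = (8, 8)       # assumed sensor grid resolution

def capture_features(frames: np.ndarray) -> np.ndarray:
    """Summarize a (n_frames, 8, 8) pressure capture with simple global statistics."""
    total = frames.sum(axis=(1, 2))              # overall pressure per frame
    return np.array([
        total.mean(), total.max(), total.std(),  # intensity statistics over time
        frames.max(),                            # peak reading of any single sensor
        float(len(frames)),                      # gesture duration in frames
    ])

# Synthetic placeholder captures and labels standing in for real CoST recordings;
# with random labels the score stays at chance level, which is expected here.
rng = np.random.default_rng(0)
captures = [rng.random((rng.integers(20, 120), *GRID)) for _ in range(280)]
labels = rng.integers(0, N_CLASSES, size=len(captures))

X = np.stack([capture_features(c) for c in captures])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

The point of the sketch is only the shape of the pipeline (capture → fixed-length feature vector → classifier with cross-validated evaluation); any real use of CoST would replace the synthetic data and the toy feature set.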