Facilitating Child–Robot Communication by Endowing the Robot with the Ability to Understand the Child's Engagement: The Case of the Mio Amico Robot
Abstract
Social robots (SRs) are becoming an increasingly important part of modern society, owing to their widespread use in application domains that include education, communication, assistance, and entertainment. A central challenge in human–robot interaction is achieving communication that is human-like and emotionally engaging for both parties. This study aims to endow social robots with the ability to assess the emotional state of their interlocutor by analyzing the interlocutor's psychophysiological signals. The method centers on the contactless assessment of the subject's peripheral autonomic activity by means of thermal infrared imaging. The approach was developed and tested on a particularly challenging use case: the interaction between children and a commercial educational robot, the Mio Amico Robot, manufactured by LiscianiGiochi©. The emotional states classified from the thermal signal analysis were compared with those identified by the Facial Action Coding System. The proposed approach proved reliable and accurate, and it fosters a personalized and improved interaction between children and social robots.
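To make the kind of pipeline summarized above more concrete, the following minimal Python sketch illustrates one possible way to classify a coarse emotional/engagement state from facial-temperature features extracted from thermal infrared frames. The region-of-interest features, the class labels, and the multilayer-perceptron classifier are illustrative assumptions for this sketch, not the authors' published implementation.

```python
# Hypothetical sketch (not the paper's code): classify a child's coarse
# affective/engagement state from facial-temperature features that are
# assumed to have already been extracted from thermal IR frames.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in data: each row holds the mean temperature (in °C) of
# three facial regions of interest (e.g., nose tip, forehead, periorbital
# area) over a short time window; the ROIs are illustrative assumptions.
n_samples = 300
features = rng.normal(loc=[34.0, 35.5, 36.0], scale=0.4, size=(n_samples, 3))
labels = rng.integers(0, 3, size=n_samples)  # 0=neutral, 1=positive, 2=negative

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, random_state=0
)

# Small multilayer perceptron as a placeholder classifier; hyperparameters
# are arbitrary and would need tuning on real thermal recordings.
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

In the use case described in the abstract, a prediction of this kind could be fed back to the robot's behavior controller so that the activity or dialogue adapts to the child's detected engagement, which is how the approach supports a personalized interaction.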
Keywords
#Social robot #Human–robot interaction #Emotional state #Psychophysiological signal analysis #Educational robot