Eye tracking in Child Computer Interaction: Challenges and opportunities
References
Abe, K., Hamada, Y., Nagai, T., Shiomi, M., & Omori, T. (2017). Estimation of child personality for child-robot interaction. In 2017 26th IEEE international symposium on robot and human interactive communication.
Al-Wabil, A., Al-Husian, L., Al-Murshad, R., & Al-Nafjan, A. (2010). Applying the retrospective think-aloud protocol in usability evaluations with children: Seeing through children’s eyes. In 2010 international conference on user science and engineering.
Al-Wabil, A., Alabdulqader, E., Al-Abdulkarim, L., & Al-Twairesh, N. (2010). Measuring the user experience of digital books with children: An eyetracking study of interaction with digital libraries. In 2010 international conference for internet technology and secured transactions.
Al-Zeer, S., Al-Ghanim, A., & Al-Wakeel, L. (2014). Visual attention in interaction with Arabic augmentative and alternative communication apps. In 2014 3rd international conference on user science and engineering.
Almourad, 2019, Comparing the behaviour of human face capturing attention of autistic & normal developing children using eye tracking data analysis approach, 221
Almourad, 2020, Visual attention toward human face recognizing for autism spectrum disorder and normal developing children: An eye tracking study, 99
Axelsson, M., Racca, M., Weir, D., & Kyrki, V. (2019). A participatory design process of a robotic tutor of assistive sign language for children with autism. In 2019 28th IEEE international conference on robot and human interactive communication.
Bekele, 2016, Multimodal adaptive social interaction in virtual environment (MASI-VR) for children with autism spectrum disorders (ASD)
Belén, J. G., Barzallo, P., & Alvarado-Cando, O. (2018). A software based on eye gazed to evaluate vowels in children with cerebral palsy in inclusive education. In 2018 IEEE ANDESCON.
Birkett, 2011, How revealing are eye-movements for understanding web engagement in young children, 2251
Book, G., Stevens, M., Pearlson, G., & Kiehl, K. (2008). Fusion of fMRI and the pupil response during an auditory oddball task. In Conference of the cognitive neuroscience society.
Boyraz, P., Yiğit, C. B., & Biçer, H. O. (2013). UMAY1: A modular humanoid platform for education and rehabilitation of children with autism spectrum disorders. In 2013 9th Asian control conference.
Breen, 2014, An evaluation of eye tracking technology in the assessment of 12 lead electrocardiography interpretation, Journal of Electrocardiology, 47, 922, 10.1016/j.jelectrocard.2014.08.008
Brooks, R. (2002). Humanoid robot models of child development. In Proceedings 2nd International conference on development and learning.
Caridakis, G., Asteriadis, S., Karpouzis, K., & Kollias, S. (2011). Detecting human behavior emotional cues in natural interaction. In 2011 17th international conference on digital signal processing.
Chen, 2014, Towards improving social communication skills with multimodal sensory information, IEEE Transactions on Industrial Informatics, 10, 323, 10.1109/TII.2013.2271914
Chen, 2015, Eye-hand coordination strategies during active video game playing: An eye-tracking study, Computers in Human Behavior, 51, 8, 10.1016/j.chb.2015.04.045
Cholewa, 2018, Precise eye-tracking technology in medical communicator prototype, Procedia Computer Science, 138, 264, 10.1016/j.procs.2018.10.038
Chukoskie, L., Soomro, A., Townsend, J., & Westerfield, M. (2013). ‘Looking’ better: Designing an at-home gaze training system for children with ASD. In 2013 6th international IEEE/EMBS conference on neural engineering.
Clemotte, 2014
Colombo, L., Landoni, M., & Rubegni, E. (2014). Design guidelines for more engaging electronic books: insights from a cooperative inquiry study. In Proceedings of the 2014 conference on interaction design and children.
Currie, 2019, Wearable technology-based metrics for predicting operator performance during cardiac catheterisation, International Journal of Computer Assisted Radiology and Surgery, 14, 645, 10.1007/s11548-019-01918-0
Currie, 2017, Eye tracking the visual attention of nurses interpreting simulated vital signs scenarios: mining metrics to discriminate between performance level, IEEE Transactions on Human–Machine Systems, 48, 113, 10.1109/THMS.2017.2754880
de Mooij, 2020, Should online math learning environments be tailored to individuals’ cognitive profiles?, Journal of Experimental Child Psychology, 191, 10.1016/j.jecp.2019.104730
Dickstein-Fischer, L., Alexander, E., Yan, X., Su, H., Harrington, K., & Fischer, G. S. (2011). An affordable compact humanoid robot for autism spectrum disorder interventions in children. In 2011 Annual international conference of the IEEE engineering in medicine and biology society.
Dickstein-Fischer, L., & Fischer, G. S. (2014). Combining psychological and engineering approaches to utilizing social robots with children with Autism. In 2014 36th annual international conference of the IEEE engineering in medicine and biology society.
DiPaola, D., Payne, B. H., & Breazeal, C. (2020). Decoding design agendas: an ethical design activity for middle school students. In Proceedings of the interaction design and children conference.
Eckstein, 2017, Beyond eye gaze: What else can eyetracking reveal about cognition and cognitive development?, Developmental Cognitive Neuroscience, 25, 69, 10.1016/j.dcn.2016.11.001
Eom, Y., Furukawa, K., Shibata, S., Mu, S., & Karita, T. (2019). Class participation support system on an avatar robot for long-term absent students. In 2019 IEEE 8th global conference on consumer electronics.
Feng, H., Gutierrez, A., Zhang, J., & Mahoor, M. H. (2013). Can NAO robot improve eye-gaze attention of children with high functioning autism? In 2013 IEEE international conference on healthcare informatics.
Fowler, A., Nesbitt, K., & Canossa, A. (2019). Identifying cognitive load in a computer game: An exploratory study of young children. In 2019 IEEE conference on games.
Friedman, M. B. (1983). Eyetracker communication system. In Proceedings of the seventh annual symposium on computer applications in medical care.
Frutos-Pascual, 2015, Where do they look at? Analysis of gaze interaction in children while playing a puzzle game
Giannakos, 2020, Mapping child–computer interaction research through co-word analysis, International Journal of Child-Computer Interaction
Giannakos, 2020, Monitoring children’s learning through wearable eye-tracking: The case of a making-based coding activity, IEEE Pervasive Computing, 19, 10, 10.1109/MPRV.2019.2941929
Gomes, 2012, Ilha musical: a CAVE for nurturing cultural appreciation, 232
Gossen, 2014, Usability and perception of young users and adults on targeted web search engines, 18
Haas, 2017, Exploring different types of feedback in preschooler and robot interaction, 127
Hammad, H. B. E., & Mahgoub, R. M. S. (2019). A robust and adaptable method for moving the mouse cursor based on eye gazing technique. In 2019 International conference on computer, control, electrical, and electronics engineering.
Hessels, 2019, Eye tracking in developmental cognitive neuroscience – The good, the bad and the ugly, Developmental Cognitive Neuroscience, 40, 10.1016/j.dcn.2019.100710
Holmqvist, 2017, Eye tracking: A comprehensive guide to methods, paradigms and measures
Holmqvist, 2011
Holmqvist, K., Nyström, M., & Mulvey, F. (2012). Eye tracker data quality: what it is and how to measure it. In Proceedings of the symposium on eye tracking research and applications.
Hornof, 2005, EyeDraw: enabling children with severe motor impairments to draw with their eyes, 161
Hornof, 2004, EyeDraw: a system for drawing pictures with the eyes, 1251
Just, 1980, A theory of reading: from eye fixations to comprehension, Psychological Review, 87, 329, 10.1037/0033-295X.87.4.329
K B, P. R., & Lahiri, U. (2016). Design of eyegaze-sensitive virtual reality based social communication platform for individuals with autism. In 2016 7th international conference on intelligent systems, modelling and simulation.
Khamassi, M., Chalvatzaki, G., Tsitsimis, T., Velentzas, G., & Tzafestas, C. (2018). A framework for robot learning during child-robot interaction with human engagement as reward signal. In 2018 27th IEEE international symposium on robot and human interactive communication.
Kleberg, 2020, Delayed gaze shifts away from others’ eyes in children and adolescents with social anxiety disorder, Journal of Affective Disorders
Kocejko, T., Ruminski, J., Bujnowski, A., & Wtorek, J. (2016). The evaluation of eGlasses eye tracking module as an extension for Scratch. In 2016 9th International Conference on Human System Interactions (HSI).
Kornev, 2018, The strategic reading brain development: An eye-tracking study of the text reading in typically-developing and dyslexic children, International Journal of Psychophysiology, 131, S50, 10.1016/j.ijpsycho.2018.07.153
Koskinen, 2011
Kozima, H., Nakagawa, C., Kawai, N., Kosugi, D., & Yano, Y. (2004). A humanoid in company with children. In 4th IEEE/RAS international conference on humanoid robots, 2004.
Krejtz, 2012, Multimodal learning with audio description: an eye tracking study of children’s gaze during a visual recognition task, 83
Krøjgaard, 2019, Eight-year-olds, but not six-year-olds, perform just as well as adults when playing concentration: Resolving the enigma, Consciousness and Cognition, 69, 81, 10.1016/j.concog.2019.01.015
Lahiri, 2013, Design of a virtual reality based adaptive response technology for children with autism, IEEE Transactions on Neural Systems and Rehabilitation Engineering, 21, 55, 10.1109/TNSRE.2012.2218618
Lahiri, U., Warren, Z., & Sarkar, N. (2011). Dynamic gaze measurement with adaptive response technology in Virtual Reality based social communication for autism. In 2011 international conference on virtual rehabilitation.
Levantini, 2020, EYES Are the window to the mind: Eye-tracking technology as a novel approach to study clinical characteristics of ADHD, Psychiatry Research, 290, 10.1016/j.psychres.2020.113135
Liao, 2020, Electronic storybook design, kindergartners’ visual attention, and print awareness: An eye-tracking investigation, Computers & Education, 144, 10.1016/j.compedu.2019.103703
Little, G. E., Bonnar, L., Kelly, S. W., Lohan, K. S., & Rajendran, G. (2016). Gaze contingent joint attention with an avatar in children with and without ASD. In 2016 joint IEEE international conference on development and learning and epigenetic robotics.
Liu, 2020, Eye-tracking based performance analysis in error finding programming test
Lohan, 2018, Toward improved child–robot interaction by understanding eye movements, IEEE Transactions on Cognitive and Developmental Systems, 10, 983, 10.1109/TCDS.2018.2838342
Lohan, K. S., Vollmer, A., Fritsch, J., Rohlfing, K., & Wrede, B. (2009). Which ostensive stimuli can be used for a robot to detect and maintain tutoring situations? In 2009 3rd international conference on affective computing and intelligent interaction and workshops.
Lukasiewicz, 1979, Eye tracking communication systems (tests of non-verbal cerebral palsy children on RPI’s eye motion analysis and tracking system), 332
Mark, 2020, NOTES, 319
Markopoulos, 2008, Chapter 10 - observation methods, 164
Masood, 2015, The usability of mobile applications for pre-schoolers, Procedia - Social and Behavioral Sciences, 197, 1818, 10.1016/j.sbspro.2015.07.241
McChesney, 2019, Eye tracking analysis of computer program comprehension in programmers with dyslexia, Empirical Software Engineering, 24, 1109, 10.1007/s10664-018-9649-y
McLaughlin, 2018, Digital training platform for interpreting radiographic images of the chest, Radiography, 24, 159, 10.1016/j.radi.2017.12.010
Mei, C., Zahed, B. T., Mason, L., & Quarles, J. (2018). Towards joint attention training for children with ASD - a VR game approach and eye gaze exploration. In 2018 IEEE conference on virtual reality and 3D user interfaces.
Othman, A., & Mohsin, M. (2017). How could robots improve social skills in children with Autism? In 2017 6th international conference on information and communication technology and accessibility.
Papavlasopoulou, 2018, How do you feel about learning to code? Investigating the effect of children’s attitudes towards coding using eye-tracking, International Journal of Child-Computer Interaction, 17, 50, 10.1016/j.ijcci.2018.01.004
Papavlasopoulou, 2020, Coding activities for children: Coupling eye-tracking with qualitative data to investigate gender differences, Computers in Human Behavior, 105, 10.1016/j.chb.2019.03.003
Papavlasopoulou, 2017, Using eye-tracking to unveil differences between kids and teens in coding activities, 171
Potvin Kent, 2019, Children and adolescents’ exposure to food and beverage marketing in social media apps, Pediatric Obesity, 14
Pretorius, 2010, Using eye tracking to compare how adults and children learn to use an unfamiliar computer game, 275
Rahmadiva, M., Arifin, A., Fatoni, M. H., Baki, S. H., & Watanabe, T. (2019). A design of multipurpose virtual reality game for children with autism spectrum disorder. In 2019 international biomedical instrumentation and technology conference.
Read, 2008, Jabberwocky: children’s digital ink story writing from nonsense to sense, 85
Read, 2008, Validating the fun toolkit: an instrument for measuring children’s opinions of technology, Cognition, Technology & Work, 10, 119, 10.1007/s10111-007-0069-9
Schiavo, 2015, Gary: combining speech synthesis and eye tracking to support struggling readers, 417
Schindler, 2019, Domain-specific interpretation of eye tracking data: towards a refined use of the eye-mind hypothesis for the field of geometry, Educational Studies in Mathematics, 101, 123, 10.1007/s10649-019-9878-z
Sean, 2020, Designing, developing, and evaluating a global filter to work around local interference for children with autism, 1
Shahid, 2012, Video-mediated and co-present gameplay: Effects of mutual gaze on game experience, expressiveness and perceived social presence, Interacting with Computers, 24, 292, 10.1016/j.intcom.2012.04.006
Sharafi, 2020, A practical guide on conducting eye tracking studies in software engineering, Empirical Software Engineering, 1
Sharma, 2019, Coding games and robots to enhance computational thinking: How collaboration and engagement moderate children’s attitudes?, International Journal of Child-Computer Interaction, 21, 65, 10.1016/j.ijcci.2019.04.004
Shic, 2014, Speech disturbs face scanning in 6-month-old infants who develop autism spectrum disorder, Biological Psychiatry, 75, 231, 10.1016/j.biopsych.2013.07.009
Silva, 2013, Morphological constraints in children’s spoken language comprehension: A visual world study of plurals inside compounds in English, Cognition, 129, 457, 10.1016/j.cognition.2013.08.003
Silva, P. R. S. D., Tadano, K., Saito, A., Lambacher, S. G., & Higashi, M. (2009). The development of an assistive robot for improving the joint attention of autistic children. In 2009 IEEE international conference on rehabilitation robotics.
Soleimani, 2019, CyberPLAYce—A tangible, interactive learning tool fostering children’s computational thinking through storytelling, International Journal of Child-Computer Interaction, 20, 9, 10.1016/j.ijcci.2019.01.002
Spiel, 2019, Agency of autistic children in technology research—A critical literature review, ACM Transactions on Computer-Human Interaction, 26, 1, 10.1145/3344919
Syeda, 2017, Visual face scanning and emotion perception analysis between autistic and typically developing children, 844
Symeonidou, 2016, Development of online use of theory of mind during adolescence: An eye-tracking study, Journal of Experimental Child Psychology, 149, 81, 10.1016/j.jecp.2015.11.007
Torii, I., Takami, S., Ohtani, K., & Ishii, N. (2014). Development of communication support application with blinks. In IISA 2014, the 5th international conference on information, intelligence, systems and applications.
Torney, H., Harvey, A., Finlay, D., Magee, J., Funston, R., & Bond, R. R. (2018). Eye-tracking analysis to compute the visual hierarchy of user interfaces on automated external defibrillators. In British HCI conference 2018.
Tragant Mestres, 2019, Young EFL Learners’ processing of multimodal input: Examining learners’ eye movements, System, 80, 212, 10.1016/j.system.2018.12.002
Underwood, 1992, The role of eye movements in reading: some limitations of the eye-mind assumption, 111, 10.1016/S0166-4115(08)61744-6
van den Bosch, 2018, Online processing of causal relations in beginning first and second language readers, Learning and Individual Differences, 61, 59, 10.1016/j.lindif.2017.11.007
van Reijmersdal, 2020, Effects of disclosing influencer marketing in videos: An eye tracking study among children in early adolescence, Journal of Interactive Marketing, 49, 94, 10.1016/j.intmar.2019.09.001
Vazquez-Fernandez, 2016, Face recognition for authentication on mobile devices, Image and Vision Computing, 55, 31, 10.1016/j.imavis.2016.03.018
Vidal, M., Bulling, A., & Gellersen, H. (2013). Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets. In Proceedings of the 2013 ACM international joint conference on Pervasive and ubiquitous computing.
Wang, 2015, Interactive eye tracking for gaze strategy modification, 247
Wass, 2016, The use of eye tracking with infants and children, 50
Yee, 2012, Developing a robotic platform to play with pre-school autistic children in a classroom environment, 81
Zhang, J., Mullikin, M., Li, Y., & Mei, C. (2020). A methodology of eye gazing attention determination for VR training. In 2020 IEEE conference on virtual reality and 3D user interfaces abstracts and workshops.
