Generating accurate 3D gaze vectors using synchronized eye tracking and motion capture
Springer Science and Business Media LLC - Pages 1-14 - 2022
Abstract
Assessing gaze behavior during real-world tasks is difficult; dynamic bodies moving through dynamic worlds make gaze hard to analyze. Current approaches involve laborious coding of pupil positions. In settings where motion capture and mobile eye tracking are used concurrently in naturalistic tasks, it is critical that data collection be simple, efficient, and systematic. One solution is to combine eye tracking with motion capture to generate 3D gaze vectors. When combined with tracked or known object locations, 3D gaze vector generation can be automated. Here we use combined eye and motion capture and explore how linear regression models generate accurate 3D gaze vectors. We compare the spatial accuracy of models derived from four short calibration routines across three tasks: the calibration routines themselves, whose efficacy was assessed directly; a validation task requiring short fixations on task-relevant locations; and a naturalistic object interaction task that bridges the gap between laboratory and “in the wild” studies. Further, we generated and compared models using spherical and Cartesian coordinate systems and using monocular (left or right) or binocular data. All calibration routines performed similarly, with the best performance (i.e., sub-centimeter errors) coming from the naturalistic task trials, when the participant is looking at an object in front of them. We found that spherical coordinate systems generate the most accurate gaze vectors, with no difference in accuracy between monocular and binocular data. Overall, we recommend 1-min calibration routines using binocular pupil data combined with a spherical world coordinate system to produce the highest-quality gaze vectors.
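To make the regression step concrete, the Python sketch below illustrates the general idea under stated assumptions; it is not the authors' implementation. It assumes hypothetical inputs (normalized binocular pupil positions from the eye tracker and head-relative calibration-target positions from motion capture), fits an ordinary-least-squares map from pupil positions to spherical gaze angles (azimuth and elevation), and converts predicted angles back to unit 3D gaze vectors.

import numpy as np

def fit_gaze_model(pupils, targets_xyz):
    """Fit a linear map from binocular pupil positions to spherical gaze angles.

    pupils      : (n, 4) array of [left_x, left_y, right_x, right_y]
    targets_xyz : (n, 3) array of head-relative calibration target positions
    Returns a (5, 2) coefficient matrix (bias row plus four pupil features).
    """
    # Convert Cartesian target directions to spherical angles:
    # azimuth around the vertical axis, elevation above the horizontal plane.
    x, y, z = targets_xyz.T
    azimuth = np.arctan2(y, x)
    elevation = np.arctan2(z, np.hypot(x, y))
    angles = np.column_stack([azimuth, elevation])

    # Ordinary least squares with a bias column.
    X = np.column_stack([np.ones(len(pupils)), pupils])
    coeffs, *_ = np.linalg.lstsq(X, angles, rcond=None)
    return coeffs

def predict_gaze_vector(coeffs, pupils):
    """Map pupil positions to unit 3D gaze vectors via the fitted model."""
    X = np.column_stack([np.ones(len(pupils)), pupils])
    az, el = (X @ coeffs).T
    # Spherical angles back to a Cartesian unit vector.
    return np.column_stack([np.cos(el) * np.cos(az),
                            np.cos(el) * np.sin(az),
                            np.sin(el)])

# Example with random stand-in data (replace with real calibration samples).
rng = np.random.default_rng(0)
pupils = rng.uniform(0, 1, size=(60, 4))     # normalized pupil coordinates
targets = rng.uniform(-1, 1, size=(60, 3))   # head-relative target positions
model = fit_gaze_model(pupils, targets)
gaze = predict_gaze_vector(model, pupils)    # (60, 3) unit gaze vectors

Dropping the two right-eye columns from the pupil array yields the monocular variant the abstract compares against, and swapping the spherical conversion for raw x/y/z regression yields the Cartesian variant.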