Left/right hand segmentation in egocentric videos
References
Alata, 2009, Is there a best color space for color image characterization or representation based on multivariate Gaussian mixture model?, Comput. Vis. Image Understand., 113, 867, 10.1016/j.cviu.2009.03.001
Baraldi, 2015, Gesture recognition using wearable vision sensors to enhance visitors’ museum experiences, IEEE Sensors J., 15, 1, 10.1109/JSEN.2015.2411994
Betancourt, 2014, A sequential classifier for hand detection in the framework of egocentric vision, 600
Betancourt, 2015, A dynamic approach and a new dataset for hand-detection in first person vision
Betancourt, 2015, Towards a unified framework for hand-based methods in first person vision
Betancourt, 2015, Filtering SVM frame-by-frame binary classification in a detection framework
Betancourt, 2015, The evolution of first person vision methods: A survey, IEEE Trans. Circ. Syst. Video Technol., 25, 744, 10.1109/TCSVT.2015.2409731
Buso, 2015, Goal-oriented top-down probabilistic visual attention model for recognition of manipulated objects in egocentric videos, Sig. Process.
Cai, 2015, A scalable approach for understanding the visual structures of hand grasps, IEEE Int. Conf. Robot. Automat., 1360
Cook, 2013, Atypical basic movement kinematics in autism spectrum conditions, Brain, 136, 2816, 10.1093/brain/awt208
Fathi, 2011, Understanding egocentric activities, 407
Fathi, 2012, Learning to recognize daily actions using gaze, 314
Fathi, 2011, Learning to recognize objects in egocentric activities, 3281
Fitzgibbon, 1995, A buyer’s guide to conic fitting, British Mach. Vis. Conf., 513
Goble, 2008, The biological and behavioral basis of upper limb asymmetries in sensorimotor performance
Jones, 2002, Statistical color models with application to skin detection, 81
Knaus, 2016, Handedness in children with autism spectrum disorder, Percept. Motor Skills, 10.1177/0031512516637021
van Laerhoven, 2013, Wearable computing, 12, 125
Lee, 2014, This hand is my hand: a probabilistic approach to hand disambiguation in egocentric video, 1
Li, 2013, Model recommendation with virtual probes for egocentric hand detection, 2624
Li, 2013, Pixel-level hand detection in egocentric videos, 3570
McManus, 2009, The history and geography of human handedness, Lang. Lateral. Psychosis, 37, 10.1017/CBO9780511576744.004
Morerio, 2013, Hand detection in first person vision
Ren, 2009, Egocentric recognition of handled objects: Benchmark and analysis, 49
Rentería, 2012, Cerebral asymmetry: a quantitative, multifactorial, and plastic brain phenotype, Twin Res. Hum. Genet., 15, 401, 10.1017/thg.2012.13
Speth, 2013, Observational skills assessment score: Reliability in measuring amount and quality of use of the affected hand in unilateral cerebral palsy, BMC Neurol., 13, 152, 10.1186/1471-2377-13-152
Swinnen, 2004, Two hands, one brain: cognitive neuroscience of bimanual skill, Trends Cognit. Sci., 8, 18, 10.1016/j.tics.2003.10.017
Turolla, 2013, Virtual reality for the rehabilitation of the upper limb motor function after stroke: A prospective controlled trial, J. Neuroeng. Rehab., 10, 85, 10.1186/1743-0003-10-85
Vincze, 2009, Integrated vision system for the semantic interpretation of activities where a person handles objects, Comput. Vis. Image Understand., 113, 682, 10.1016/j.cviu.2008.10.008
Yang, 2015, Grasp type revisited: a modern perspective on a classical feature for vision, 400
Zhu, 2014, Pixel-level hand detection with shape-aware structured forests, 1
Zhu, 2015, Structured forests for pixel-level hand detection and hand part labelling, Comput. Vis. Image Understand., 141, 95, 10.1016/j.cviu.2015.07.008