Understanding freehand gestures: a study of freehand gestural interaction for immersive VR shopping applications
Abstract
Unlike retail stores, which require the customer to be physically present and active during restricted opening hours, online shops can be more convenient, functional and efficient. However, traditional online shops offer only a narrow bandwidth for product visualization and interaction techniques and lack a compelling shopping context. In this paper, we report a study on eliciting user-defined gestures for shopping tasks in an immersive virtual reality (VR) environment. We make a methodological contribution by providing a modified elicitation procedure that produces more usable freehand gestures than traditional elicitation studies. Using this method, we developed a gesture taxonomy and generated a user-defined gesture set. To validate the usability of the derived gesture set, we conducted a comparative study and answered questions related to the performance, error count, user preference and effort required from end users to use freehand gestures compared with traditional immersive VR interaction techniques, such as the virtual handle controller.
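For readers unfamiliar with gesture elicitation studies, consensus among participants' gesture proposals for a referent (a shopping task such as "add item to cart") is commonly quantified with the agreement rate AR of Vatavu and Wobbrock. The following minimal Python sketch illustrates how AR is computed for one referent; the gesture labels and the example task are hypothetical and are not data from this paper.

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate AR(r) for one referent (Vatavu & Wobbrock, CHI 2015).

    `proposals` is the list of gesture labels elicited from participants
    for a single referent. With P the multiset of proposals and P_i its
    groups of identical proposals:
        AR = |P|/(|P|-1) * sum_i (|P_i|/|P|)^2 - 1/(|P|-1)
    AR ranges from 0 (no agreement) to 1 (all participants agree).
    """
    n = len(proposals)
    if n < 2:
        raise ValueError("AR needs at least two proposals")
    squared_shares = sum((count / n) ** 2 for count in Counter(proposals).values())
    return n / (n - 1) * squared_shares - 1 / (n - 1)

# Hypothetical example: eight participants propose gestures for "add item to cart".
print(agreement_rate(["grab", "grab", "grab", "grab", "pinch", "pinch", "swipe", "tap"]))
# -> 0.25
```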