User gesticulation inside an automated vehicle with external communication can cause confusion in pedestrians and a lower willingness to cross
References
ISO (2018). ISO 23049:2018: Road Vehicles: Ergonomic Aspects of External Visual Communication from Automated Vehicles to Other Road Users. Standard, International Organization for Standardization.
Ackermans, 2020, The effects of explicit intention communication, conspicuous sensors, and pedestrian attitude in interactions with automated vehicles, 1
Unknown author (2020). lme4 convergence warnings: troubleshooting. URL: https://rstudio-pubs-static.s3.amazonaws.com/33653_57fc7b8e5d484c909b615d8633c01d51.html. [Online; accessed 03-SEPTEMBER-2020].
Chai, Z., Nie, T., Becker, J. (2021). The Battle to Embrace the Trend. Springer Singapore, Singapore. pp. 179–249. doi:10.1007/978-981-15-6728-5_7. URL: https://doi.org/10.1007/978-981-15-6728-5_7.
Chang, 2018, A video-based study comparing communication modalities between an autonomous car and a pedestrian, 104
Charisi, 2017, Children’s views on identification and intention communication of self-driving vehicles, 399
Colley, 2020, Evaluating highly automated trucks as signaling lights, 111
Colley, M., Rukzio, E. (2020). A design space for external communication of autonomous vehicles. In Proceedings of the 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Association for Computing Machinery, New York, NY, USA. p. 212–222. doi:10.1145/3409120.3410646. URL: https://doi.org/10.1145/3409120.3410646.
Colley, 2020, Towards inclusive external communication of autonomous vehicles for pedestrians with vision impairments, 1
Colley, 2019, Including people with impairments from the start: External communication of autonomous vehicles, 307
Colley, 2019, For a better (simulated) world: Considerations for VR in external communication research, 442
Colley, 2020, Unveiling the lack of scalability in research on external communication of autonomous vehicles, 1
Cummings, M., Ryan, J. (2014). Point of view: Who is in charge? The promises and pitfalls of driverless cars.
Cunningham, M., Regan, M. A. (2015). Autonomous vehicles: Human factors issues and future research.
Deb, 2020, Comparison of child and adult pedestrian perspectives of external features on autonomous vehicles using virtual reality experiment, 145
Degani, A., Kirlik, A. (1995). Modes in human-automation interaction: Initial observations about a modeling approach. In 1995 IEEE International Conference on Systems, Man and Cybernetics. Intelligent Systems for the 21st Century, IEEE, New York, NY, USA. pp. 3443–3450.
Degani, 1996, Modes in automated cockpits: Problems, data analysis and a modelling framework, 258
Dey, 2020, Taming the eHMI jungle: A classification taxonomy to guide, compare, and assess the design principles of automated vehicles’ external human-machine interfaces, Transportation Research Interdisciplinary Perspectives, 7, 100174, 10.1016/j.trip.2020.100174
Dey, 2018, Interface concepts for intent communication from autonomous vehicles to vulnerable road users, 82
Dey, 2019, Gaze patterns in pedestrian interaction with vehicles: Towards effective design of external human-machine interfaces for automated vehicles, 369
Inside Edition (2018). What Happened When Driver Put His Tesla on Auto Pilot?. URL: https://www.youtube.com/watch?v=kaAUHpeFj1c&t=53s. [Online; accessed 12-JUNE-2020].
Faas, 2020, A longitudinal video study on communicating status and intent for self-driving vehicle – pedestrian interaction, 1
Faas, 2020, External HMI for self-driving vehicles: Which information shall be displayed?, Transportation Research Part F: Traffic Psychology and Behaviour, 68, 171, 10.1016/j.trf.2019.12.009
Fagnant, 2015, Preparing a nation for autonomous vehicles: opportunities, barriers and policy recommendations, Transportation Research Part A: Policy and Practice, 77, 167
Franke, U., Pfeiffer, D., Rabe, C., Knoeppel, C., Enzweiler, M., Stein, F., Herrtwich, R.G. (2013). Making Bertha see.
González, 2007, Eyes on the road, hands on the wheel: Thumb-based interaction techniques for input on steering wheels, 95
Habibovic, 2018, Communicating intent of automated vehicles to pedestrians, Frontiers in Psychology, 9, 1336, 10.3389/fpsyg.2018.01336
Hart, 1988, Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research, volume 52, 139
Hecht, 2018, Lidar for self-driving cars, Optics and Photonics News, 29, 26, 10.1364/OPN.29.1.000026
Hock, 2017, CarVR: Enabling in-car virtual reality entertainment, 4034
Holländer, 2019, Overtrust in external cues of automated vehicles: An experimental investigation, 211
Hou, 2020, Autonomous vehicle-cyclist interaction: Peril and promise, 1
Inners, 2017, Beyond liability: Legal issues of human-machine interaction for automated vehicles, 245
Johnson, 1995, Experimental study of vertical flight path mode awareness, IFAC Proceedings Volumes, 28, 153, 10.1016/S1474-6670(17)45225-6
Joisten, 2020, Displaying vehicle driving mode – effects on pedestrian behavior and perceived safety, 250
Joshi, A., Miller, S.P., Heimdahl, M.P. (2003). Mode confusion analysis of a flight guidance system using formal methods. In Digital Avionics Systems Conference, 2003. DASC’03. The 22nd, IEEE, New York, NY, USA. pp. 2–D.
Körber, M. (2019). Theoretical considerations and development of a questionnaire to measure trust in automation. In S. Bagnara, R. Tartaglia, S. Albolino, T. Alexander, Y. Fujita (Eds.), Proceedings of the 20th Congress of the International Ergonomics Association (IEA 2018), Springer International Publishing, Cham (pp. 13–30).
Kothgassner, O.D., Felnhofer, A., Hauk, N., Kastenhofer, E., Gomm, J., Kryspin-Exner, I. (2013). Technology usage inventory (TUI).
Kurpiers, 2020, Mode awareness and automated driving—what is it and how can it be measured?, Information, 11, 277, 10.3390/info11050277
Lankenau, 2001, Avoiding mode confusion in service-robots, in: Integration of Assistive Technology in the Information Age, 162
Magic Leap (2015). Magic Leap — Original Concept Video. URL: https://www.youtube.com/watch?v=kPMHcanq0xM. [Online; accessed 12-JULY-2020].
Lee, 2014, Mode confusion in driver interfaces for adaptive cruise control systems, 4105
Leveson, N., Pinnel, L.D., Sandys, S.D., Koga, S., Reese, J.D. (1997). Analyzing software specifications for mode confusion potential. In Proceedings of a workshop on human error and system development, Glasgow Accident Analysis Group, Glasgow, Scotland (pp. 132–146).
Mordor Intelligence LLP (2020). Autonomous/driverless car market - growth, trends, and forecast (2020–2025).
Löcken, 2019, How should automated vehicles interact with pedestrians? A comparative analysis of interaction concepts in virtual reality, 262
LUMINEQ (2020). In-glass displays for improved automotive safety. URL: https://www.lumineq.com/applications/automotive. [Online; accessed: 12-SEPTEMBER-2020].
Mahadevan, 2018, Communicating awareness and intent in autonomous vehicle-pedestrian interaction, 1
Matthews, M., Chowdhary, G., Kieson, E. (2017). Intent communication between autonomous vehicles and pedestrians.
Jins Meme (2020). Jins Meme. URL: https://jins-meme.com/en/. [Online; accessed 12-SEPTEMBER-2020].
Mercedes-Benz (2020). MBUX: Mercedes-Benz User Experience. [Online; accessed 12-JULY-2020].
Millard-Ball, 2018, Pedestrians, autonomous vehicles, and cities, Journal of Planning Education and Research, 38, 6, 10.1177/0739456X16675674
Moore, 2019, The case for implicit external human-machine interfaces for autonomous vehicles, 295
Noguchi, 2012, nparLD: An R software package for the nonparametric analysis of longitudinal data in factorial experiments, Journal of Statistical Software, 50, 10.18637/jss.v050.i12
Norman, 1983, Design rules based on analyses of human error, Communications of the ACM, 26, 254, 10.1145/2163.358092
Pakusch, 2018, Unintended effects of autonomous driving: A study on mobility preferences in the future, Sustainability, 10, 10.3390/su10072404
Pfleging, 2016, Investigating user needs for non-driving-related activities during automated driving, 91
Pickering, 2007, A research study of hand gesture recognition technologies and applications for human vehicle interaction, 1
Qian, 2020, Aladdin’s magic carpet: Navigation by in-air static hand gesture in autonomous vehicles, International Journal of Human-Computer Interaction, 1
Ranft, 2016, The role of machine vision for intelligent vehicles, IEEE Transactions on Intelligent Vehicles, 1, 8, 10.1109/TIV.2016.2551553
Rasouli, 2017, Understanding pedestrian behavior in complex traffic scenes, IEEE Transactions on Intelligent Vehicles, 3, 61, 10.1109/TIV.2017.2788193
Rasouli, 2019, Autonomous vehicles that interact with pedestrians: A survey of theory and practice, IEEE Transactions on Intelligent Transportation Systems, 21, 900, 10.1109/TITS.2019.2901817
Reifinger, 2007, Static and dynamic hand-gesture recognition for augmented reality applications, 728
Rettenmaier, 2019, Passing through the bottleneck – the potential of external human-machine interfaces, 1687
Rettenmaier, 2020, How much space is required? effect of distance, content, and color on external human–machine interface size, Information, 11, 346, 10.3390/info11070346
Riener, 2013, Standardization of the in-car gesture interaction space, 14
Rogers, 2019, Exploring interaction fidelity in virtual reality: Object manipulation and whole-body movements, 1
Rosenthal, 1994, Parametric measures of effect size, The Handbook of Research Synthesis, 621, 231
Rothenbücher, 2016, Ghost driver: A field study investigating the interaction between pedestrians and driverless vehicles, 795
Spencer Jr, C.F. (2000). Cockpit automation and mode confusion: The use of auditory inputs for error mitigation. Technical Report. Air Command and Staff College, Maxwell AFB, AL.
SAE International (2014). Taxonomy and definitions for terms related to on-road motor vehicle automated driving systems. Technical Report J3016, SAE International.
TeslaFi (2020). TeslaFi Software Tracker. URL: https://www.teslafi.com/firmware.php. [Online; accessed: 12-SEPTEMBER-2020].
Volkswagen (2020). Gesture control. URL: https://www.volkswagen.co.uk/technology/comfort/gesture-control. [Online; accessed 12-JULY-2020].
Walker, 2019, Feeling-of-safety slider: Measuring pedestrian willingness to cross roads in field interactions with vehicles, 1
Yusof, 2016, The exploration of autonomous vehicle driving styles: Preferred longitudinal, lateral, and vertical accelerations, 245