Autonomous vehicle perception: The technology of today and tomorrow

Jessica Van Brummelen1, Marie O’Brien1, Dominique Gruyer2, Homayoun Najjaran1
1Advanced Control and Intelligent Systems Laboratory, University of British Columbia, Kelowna, British Columbia, Canada
2Laboratoire sur les Interactions Véhicules, Infrastructure, Conducteurs (LIVIC), IFSTTAR-CoSys-LIVIC, 25 allée des Marronniers, 78000 Versailles, France

Abstract

Keywords

