Trajectory estimation and position correction for hopping robot navigation using monocular camera
Abstract
In this paper, a navigation and environment-mapping method is presented for small exploration robots that move by hopping. While previous research on hopping rovers has mostly focused on mobility and mechanical design, the proposed method aims to provide a fully autonomous navigation system using only a monocular camera. The method accurately estimates the hopping distance and reconstructs the 3D environment using Structure from Motion, demonstrating that a monocular system is not only feasible but also accurate and robust. The relative-scale ambiguity of the reconstructed scene and trajectory is resolved using the known gravity and the parabolic-motion constraint. After each hop, the error in landing position is corrected by a modified Iterative Closest Point (ICP) algorithm with non-overlapping part elimination. The environmental point cloud is projected onto a 2D image, which is used to find the most suitable landing position for the next hop via protrusion-based obstacle detection and to navigate the robot toward the goal direction. Both virtual-environment simulations and real experiments confirm the feasibility and highlight the advantages of the presented method.
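To make the scale-recovery step concrete, the following is a minimal sketch (not the paper's implementation) of how known gravity and the parabolic flight constraint can fix the metric scale of an up-to-scale SfM trajectory. It assumes the vertical axis of the reconstruction has already been aligned; the function name and parameters are illustrative.

```python
import numpy as np

def recover_metric_scale(t, h, g=9.81):
    """Recover the metric scale of an up-to-scale SfM trajectory from
    the parabolic flight constraint.

    During free flight the true height satisfies
        z(t) = z0 + v0 * t - 0.5 * g * t**2,
    and the SfM heights are h(t) = z(t) / s for an unknown scale s.
    Fitting h(t) = a*t**2 + b*t + c therefore gives a = -g / (2 * s),
    so s = -g / (2 * a).

    t : array of frame timestamps (s)
    h : array of unscaled camera heights along the aligned vertical axis
    g : local gravitational acceleration (m/s^2)
    """
    a, _, _ = np.polyfit(t, h, 2)   # quadratic coefficient of the arc
    if a >= 0:
        raise ValueError("no concave arc; frames may not be in free flight")
    return -g / (2.0 * a)           # multiply all SfM coordinates by this
```

Multiplying every reconstructed point and camera position by the returned factor yields metric coordinates; for a small body, g must of course be replaced by the local gravity value.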
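The abstract does not detail the paper's modification of ICP with non-overlapping part elimination. As a baseline for comparison only, a stock point-to-point ICP with Open3D might look as follows, where the maximum correspondence distance provides a crude stand-in for discarding non-overlapping regions (parameter values are made up):

```python
import numpy as np
import open3d as o3d

def correct_landing_pose(post_hop, pre_hop, max_corr_dist=0.2):
    """Register the post-hop reconstruction to the pre-hop map and return
    the rigid transform that corrects the estimated landing position.

    post_hop, pre_hop : o3d.geometry.PointCloud in the same metric scale
    max_corr_dist     : correspondences farther than this (m) are ignored,
                        crudely rejecting non-overlapping parts
    """
    result = o3d.pipelines.registration.registration_icp(
        post_hop, pre_hop, max_corr_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation  # 4x4 correction for the landing pose
```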
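For the 2D projection and protrusion-based obstacle detection, one plausible reading is a gridded height map in which cells rising above their local neighbourhood are flagged as obstacles. A sketch under that assumption follows; the cell size, window, and height threshold are hypothetical, and the paper's exact protrusion criterion is not given in the abstract.

```python
import numpy as np
from scipy import ndimage

def protrusion_obstacles(points, cell=0.05, window=5, height_thresh=0.10):
    """Project a metric point cloud (N x 3) onto a 2D height grid and flag
    cells that protrude above their local neighbourhood as obstacles."""
    ij = np.floor((points[:, :2] - points[:, :2].min(axis=0)) / cell).astype(int)
    grid = np.full(ij.max(axis=0) + 1, -np.inf)
    np.maximum.at(grid, (ij[:, 0], ij[:, 1]), points[:, 2])  # per-cell max z
    observed = np.isfinite(grid)
    grid[~observed] = np.nan
    # local terrain level: median height over a window x window neighbourhood
    terrain = ndimage.generic_filter(grid, np.nanmedian, size=window,
                                     mode="nearest")
    return observed & (grid - terrain > height_thresh)
```

Obstacle-free cells at roughly the nominal hop distance in the goal direction could then be ranked to select the next landing target.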