Two-stage multi-sensor fusion positioning system with seamless switching for cooperative mobile robot and manipulator system
Abstract
Mobile platforms that feed parts to machines on production lines, such as fenced industrial robotic manipulators, are generally scheduled to stop during the hand-over. A non-stop mobile robotic part-feeding system can improve production efficiency and flexibility, but it raises several challenging tasks: for example, the industrial robotic manipulator must perceive the position of the mobile robot accurately and robustly before grasping the supplies while the mobile robot is moving. Therefore, based on the relative distance between the two robots, an interaction mode is developed for an integrated robotic system consisting of a fixed robotic manipulator and a mobile robot. To position the mobile robot accurately and robustly in an indoor environment, two different positioning approaches are utilised. The first fuses ultrasonic sensors with an inertial measurement unit (IMU) through an extended Kalman filter (EKF); in addition, an outlier rejection mechanism is implemented to discard outliers in the ultrasonic measurements. The second detects an ArUco marker with a visual sensor. Finally, a positioning switching strategy based on the state of the visual sensor allows the robotic manipulator to reposition the mobile robot seamlessly. In static experiments, the EKF-based approach fusing the IMU with the ultrasonic sensors achieves high accuracy (a root mean square error of 0.04 m) and high precision (a standard deviation of 0.0033 m) while maintaining a high update frequency of 181.9 Hz. Dynamic experiments demonstrate that the proposed positioning system suppresses positioning drift over time in comparison with a wheel-encoder-based positioning method.
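The EKF fusion with innovation gating described above can be illustrated by a minimal sketch. It assumes a hypothetical 1-D constant-acceleration model in which IMU acceleration drives the prediction step and an ultrasonic range corrects the position; all class names and noise parameters (`q`, `r`, `gate`) are illustrative, not taken from the paper, and for this linear model the EKF reduces to a standard Kalman filter:

```python
class UltrasonicImuEkf:
    """Sketch of IMU+ultrasonic fusion with chi-square outlier gating."""

    def __init__(self, q=0.01, r=0.05, gate=9.0):
        self.p, self.v = 0.0, 0.0            # position (m), velocity (m/s)
        self.P = [[1.0, 0.0], [0.0, 1.0]]    # state covariance
        self.q, self.r, self.gate = q, r, gate

    def predict(self, accel, dt):
        """Propagate the state with one IMU acceleration sample."""
        self.p += self.v * dt + 0.5 * accel * dt * dt
        self.v += accel * dt
        P = self.P
        # P = F P F^T + Q with F = [[1, dt], [0, 1]]
        p00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + self.q
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + self.q
        self.P = [[p00, p01], [p10, p11]]

    def update(self, z):
        """Fuse an ultrasonic range z; return False if gated as an outlier."""
        y = z - self.p                       # innovation (H = [1, 0])
        S = self.P[0][0] + self.r            # innovation covariance
        if y * y / S > self.gate:            # chi-square gate, 1 dof
            return False                     # outlier: skip the correction
        k0 = self.P[0][0] / S                # Kalman gain K = P H^T / S
        k1 = self.P[1][0] / S
        self.p += k0 * y
        self.v += k1 * y
        P = self.P                           # P = (I - K H) P
        self.P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
                  [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
        return True
```

Under these assumptions, a range reading far outside the predicted uncertainty (e.g. a multipath echo) fails the gate and is discarded rather than corrupting the estimate, which is the role the paper's outlier rejection mechanism plays.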
The two-stage repositioning strategy enables the robotic manipulator to identify the position of the mobile robot robustly, even when the visual sensor is occluded.
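The seamless switching idea can be sketched as a simple source selector: use the ArUco-based visual estimate when the marker is tracked, and fall back to the ultrasonic+IMU EKF estimate when the marker is occluded or out of view. The class and label names below are hypothetical, not the paper's implementation:

```python
class PositionSwitcher:
    """Select the active positioning source from the visual sensor state."""

    def select(self, marker_visible, vision_pos, ekf_pos):
        # Stage with marker tracked: prefer the higher-accuracy
        # ArUco/vision estimate.
        if marker_visible and vision_pos is not None:
            return vision_pos, "vision"
        # Marker occluded or lost: fall back to the ultrasonic+IMU
        # EKF estimate so positioning continues without interruption.
        return ekf_pos, "ultrasonic+imu"
```

Because both sources estimate the same quantity, the consumer of `select()` sees an uninterrupted position stream regardless of which sensor is currently trusted, which is the essence of the seamless switching the abstract describes.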
Keywords