
Hopping Rover Navigation Method for Rugged Environments

Published: December 23, 2019
License

Copyright (c) 2019 by the authors


This work is licensed under a Creative Commons Attribution 4.0 International License.

How To Cite
Kovacs, G., Kunii, Y., & Hashimoto, H. (2019). Hopping rover navigation method for rugged environments. Recent Innovations in Mechatronics, 6(1), 1-6. https://doi.org/10.17667/riim.2019.1/9
Abstract

This paper presents a navigation method for space exploration robots that use hopping motion in environments with large elevation differences. A monocular camera system reconstructs the flight trajectory and the environment around the robot using Structure from Motion while the robot travels. The resulting environmental point cloud is projected to 2D to create a variable-resolution image, and image processing is used to find the most suitable position for the next landing based on surface normals, with the help of gradient maps and error estimation. The method is evaluated in a simulation environment against the previously used protrusion-based method, showing that the proposed system extends the robot's operation to terrains with large elevation differences while still successfully avoiding obstacles and dangerous areas.
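The core of the pipeline described above (projecting the reconstructed point cloud onto a 2D map, then scoring candidate landing cells by surface normals derived from gradients) can be illustrated with a short sketch. The Python snippet below is not the authors' implementation: it uses a fixed-resolution grid in place of the paper's variable-resolution image, approximates the normal tilt by the elevation gradient, and omits the error-estimation step. All names and parameters (project_to_elevation_map, landing_scores, cell_size, max_slope_deg) are illustrative assumptions.

import numpy as np

def project_to_elevation_map(points, cell_size=0.05):
    """Project a 3D point cloud (N x 3 array of x, y, z) onto a 2D grid.

    Hypothetical stand-in for the paper's variable-resolution image:
    each fixed-size cell keeps the highest z value that falls into it.
    """
    xy_min = points[:, :2].min(axis=0)
    idx = np.floor((points[:, :2] - xy_min) / cell_size).astype(int)
    height = np.full(idx.max(axis=0) + 1, np.nan)
    for (ix, iy), z in zip(idx, points[:, 2]):
        if np.isnan(height[ix, iy]) or z > height[ix, iy]:
            height[ix, iy] = z
    return height

def landing_scores(height, cell_size=0.05, max_slope_deg=15.0):
    """Score each cell by flatness, using gradients as a proxy for normals.

    The tilt of the local surface normal follows from the elevation
    gradients; steep or unobserved cells score 0 (unsafe to land on).
    """
    gx, gy = np.gradient(height, cell_size)          # slope along each grid axis
    slope = np.degrees(np.arctan(np.hypot(gx, gy)))  # normal tilt from vertical
    return np.where(np.isnan(slope), 0.0,
                    np.clip(1.0 - slope / max_slope_deg, 0.0, 1.0))

# Example: pick the safest cell of a reconstructed terrain patch.
# points = ...  # N x 3 cloud from the Structure from Motion stage
# scores = landing_scores(project_to_elevation_map(points))
# target_cell = np.unravel_index(np.argmax(scores), scores.shape)

Taking the argmax of the score map yields the grid cell proposed as the next landing target; in the paper, the selection additionally accounts for the estimated trajectory error rather than slope alone.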
