Obstacle Detection with Differences of Normals in Unorganized Point Clouds for Mobile Robotics

Edison P. Velasco-Sánchez1, Alejandro Olivas1, Luis F. Recalde2, Bryan S. Guevara3, Francisco A. Candelas1
1 Group of Automation, Robotics and Computer Vision (AUROVA), University of Alicante
2 Worcester Polytechnic Institute, Robotics Engineering
3 Instituto de Automática (INAUT), Universidad Nacional de San Juan-CONICET, San Juan J5400, Argentina.

Video Presentation

Abstract

In mobile robotics, there is a growing need for algorithms that accurately identify, in real time, the environment in which a robot operates, especially when that environment is unstructured. Identifying a safe navigation path is therefore critical to the safety and smooth operation of autonomous robots. In this paper, we present an algorithm for mobile robotics that identifies potential obstacles in an unstructured environment using the Difference of Normals in the point cloud generated by a 3D LiDAR sensor. The aim of our algorithm is to detect obstacles from point clouds with a fast, low-complexity approach designed specifically for autonomous driving applications. Our method has been shown to identify obstacles in real time, and it differs from other state-of-the-art approaches by accurately distinguishing obstacles such as potholes and sidewalks from sloping and uneven terrain. The algorithm has been successfully tested on an Ackermann-steering robot equipped with a 128-layer Ouster OS1 LiDAR sensor, with a processing time of 52 ms.
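To illustrate the core idea, the sketch below computes a Difference of Normals on an unorganized point cloud: each point gets a surface normal estimated at a small and a large support radius, and points where the two normals disagree are kept as obstacle candidates. This is a minimal NumPy illustration only, not the paper's implementation; the function names, radii, and threshold are hypothetical, and the authors' system works on dense Ouster OS1 scans with far more efficient neighborhood search.

```python
import numpy as np

def estimate_normal(cloud, center, radius):
    """PCA surface normal of the neighborhood of `center` (illustrative)."""
    nbrs = cloud[np.linalg.norm(cloud - center, axis=1) < radius]
    if len(nbrs) < 3:
        return np.zeros(3)
    # The eigenvector of the smallest covariance eigenvalue is the plane normal.
    eigvals, eigvecs = np.linalg.eigh(np.cov(nbrs.T))
    n = eigvecs[:, 0]
    return n if n[2] >= 0 else -n  # orient consistently (z-up)

def don_filter(cloud, r_small, r_large, threshold):
    """Keep points whose Difference-of-Normals magnitude exceeds `threshold`."""
    keep = np.array([
        np.linalg.norm(0.5 * (estimate_normal(cloud, p, r_small)
                              - estimate_normal(cloud, p, r_large))) > threshold
        for p in cloud
    ])
    return cloud[keep]
```

On flat or gently sloping ground the small- and large-scale normals agree, so the DoN magnitude is near zero and those points are discarded; near a curb, pothole edge, or vertical obstacle the two scales disagree and the point survives the filter.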

Pipeline of the obstacle filtering method using the difference of normals in point clouds.

BibTeX


        @inproceedings{velasco2023obstacle,
          title={Obstacle Detection with Differences of Normals in Unorganized Point Clouds for Mobile Robotics},
          author={Velasco-S{\'a}nchez, Edison and Olivas, Alejandro and Recalde, Luis F and Guevara, Bryan S and Candelas, Francisco A},
          booktitle={2023 IEEE Seventh Ecuador Technical Chapters Meeting (ECTM)},
          pages={1--5},
          year={2023},
          organization={IEEE}
        }