Autonomous Obstacle Crossing Strategies for the Hybrid Wheeled/Legged Robot CENTAURO
Author
De Luca, Alessio <1996>
Date
2020-10-27
Data available
2020-11-05
Abstract
The advancement of humanoid robots over the last decades has introduced the possibility of letting robots perform tasks originally designed for humans. Applications include surgery, health care, and situations that are dangerous for a human being, such as working in half-destroyed areas.
Here we present the development of the modules used to perceive the environment and to overcome obstacles on the terrain by stepping on or stepping over them. The ability to use the perception system to perform a crossing task autonomously was novel for the CENTAURO robot.
We start by analyzing perception, an essential capability a robot needs in order to understand the elements around it, so that it can navigate while avoiding obstacles and interacting with the objects in the scene.
In this project we worked with a 3D LiDAR to obtain a representation of the environment as a point cloud. The developed perception module uses the Point Cloud Library (PCL) to process the points representing the scene, since PCL provides state-of-the-art algorithms for filtering, segmentation, and normal estimation on point clouds. In particular, vertical and horizontal planar segmentation is carried out to obtain the locations of the walls and the floor. This information is used to navigate the room towards an obstacle and stop before colliding with it.
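As an illustration of this planar segmentation step, the minimal PCL sketch below downsamples the LiDAR cloud and fits a single plane with RANSAC, constrained to be perpendicular to a chosen axis (the vertical axis for the floor, a horizontal axis for a wall). The function name, voxel leaf size, and angular and distance thresholds are assumptions for illustration, not the exact pipeline or parameters used in the thesis.

```cpp
#include <cmath>
#include <Eigen/Core>
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/segmentation/sac_segmentation.h>
#include <pcl/sample_consensus/model_types.h>
#include <pcl/sample_consensus/method_types.h>
#include <pcl/ModelCoefficients.h>
#include <pcl/PointIndices.h>

// Extract the dominant plane whose normal is close to `axis`
// (UnitZ for the floor, UnitX/UnitY for a wall). Thresholds are illustrative.
bool extractPlane(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr& cloud,
                  const Eigen::Vector3f& axis,
                  pcl::PointIndices& inliers,
                  pcl::ModelCoefficients& coefficients)
{
  // Downsample the LiDAR cloud to keep segmentation fast.
  pcl::PointCloud<pcl::PointXYZ>::Ptr filtered(new pcl::PointCloud<pcl::PointXYZ>);
  pcl::VoxelGrid<pcl::PointXYZ> voxel;
  voxel.setInputCloud(cloud);
  voxel.setLeafSize(0.02f, 0.02f, 0.02f);  // 2 cm voxels (assumed value)
  voxel.filter(*filtered);

  // RANSAC plane fit, constrained to planes perpendicular to `axis`.
  pcl::SACSegmentation<pcl::PointXYZ> seg;
  seg.setOptimizeCoefficients(true);
  seg.setModelType(pcl::SACMODEL_PERPENDICULAR_PLANE);
  seg.setMethodType(pcl::SAC_RANSAC);
  seg.setAxis(axis);
  seg.setEpsAngle(10.0 * M_PI / 180.0);  // allow ~10 deg of tilt (assumed)
  seg.setDistanceThreshold(0.02);        // 2 cm inlier band (assumed)
  seg.setInputCloud(filtered);
  seg.segment(inliers, coefficients);

  return !inliers.indices.empty();
}
```

In this scheme the floor and wall planes would be extracted with separate calls, passing the vertical axis for horizontal segmentation and a horizontal axis for vertical segmentation.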
The second module developed in this project is related to the motion control that allows the robot to overcome the perceived obstacle.
Based on the dimensions of the obstacle, assumed to have a rectangular shape, we perform a simple feasibility analysis to understand how it can be crossed: by stepping over it or stepping onto it. We then proceed with the stepping procedure, executing the implemented methods, which adapt the length of the steps to the depth and height of the obstacle being faced.
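A minimal sketch of such a feasibility check is given below, assuming a rectangular obstacle described by its height and depth and hypothetical kinematic limits for the legs; the struct names, thresholds, and decision order are illustrative placeholders, not CENTAURO's actual limits or the thesis's exact rule.

```cpp
// Hypothetical decision rule for choosing the crossing strategy.
// All dimensions are in metres; the limits are placeholders.
enum class CrossingMode { StepOver, StepOn, Infeasible };

struct Obstacle {
  double height;  // vertical extent of the rectangular obstacle
  double depth;   // extent along the direction of travel
};

struct LegLimits {
  double max_foot_clearance;   // highest the swing foot can be lifted
  double max_step_length;      // longest feasible single step
  double max_platform_height;  // highest surface the robot can step onto
};

CrossingMode selectCrossingMode(const Obstacle& obs, const LegLimits& lim) {
  // Step over: the foot must clear the obstacle top and land beyond it
  // within a single step, so both height and depth are bounded.
  if (obs.height < lim.max_foot_clearance && obs.depth < lim.max_step_length)
    return CrossingMode::StepOver;
  // Step on: the top surface must be reachable; the step length is then
  // adapted to the obstacle depth by the stepping procedure.
  if (obs.height < lim.max_platform_height)
    return CrossingMode::StepOn;
  return CrossingMode::Infeasible;
}
```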
Both modules were tested first in the simulation environment, Gazebo, and then on the real robot CENTAURO.
Type
info:eu-repo/semantics/masterThesis
Collections
- Laurea Magistrale [4954]