ROS 2-Based Autonomous Navigation for Agricultural Robots: Sensor Integration and Processing for Traversable Terrain Segmentation
Author
Sani, Ettore <1999>
Date
2023-10-26
Data available
2023-11-02
Abstract
The field of autonomous robotics has made significant advancements in indoor environments, but the transition to outdoor settings introduces new challenges that require domain-specific solutions.
This thesis addresses a challenge encountered during outdoor robot navigation in agricultural environments: the presence of traversable obstacles such as tall grass and weeds.
While these are safe for the robot to drive over, LiDAR sensors may mistakenly classify them as obstacles, forcing the robot to take longer paths around them or even to abort navigation altogether.
Our methodology was implemented and validated on the Clearpath Husky platform, equipped with a comprehensive sensor suite including odometry, IMU, LiDAR, GPS, and an RGB-D camera.
The integration of this sensor setup with the ROS 2 framework, including essential components such as the robot_localization package and the Nav2 navigation stack, enables the robot to execute precise GPS waypoint navigation.
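A setup like the one described, fusing wheel odometry, IMU, and GPS for waypoint navigation, is commonly configured through robot_localization's dual-EKF and navsat_transform arrangement. The fragment below is a minimal sketch of that pattern; all parameter values and topic names are illustrative assumptions, not taken from the thesis:

```yaml
# Sketch of a robot_localization configuration for GPS waypoint navigation.
# Topic names and values are assumptions for illustration only.
ekf_filter_node_odom:            # local EKF: wheel odometry + IMU
  ros__parameters:
    frequency: 30.0
    two_d_mode: true
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom
    odom0: /husky_velocity_controller/odom
    imu0: /imu/data

ekf_filter_node_map:             # global EKF: additionally fuses GPS
  ros__parameters:
    frequency: 30.0
    two_d_mode: true
    world_frame: map
    odom0: /husky_velocity_controller/odom
    imu0: /imu/data
    odom1: /odometry/gps         # output of navsat_transform_node

navsat_transform_node:           # converts GPS fixes into map-frame odometry
  ros__parameters:
    yaw_offset: 0.0
    use_odometry_yaw: true
```

The local EKF provides a smooth, continuous odom frame for control, while the global EKF incorporates the GPS-derived pose so Nav2 can plan toward geographic waypoints.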
This work presents an innovative solution to this challenge, enhancing outdoor robot navigation in agricultural settings.
We integrated real-time computer vision techniques with the ROS 2 Navigation framework, creating a custom pipeline that significantly improves the robot's perception and navigational capabilities.
Notably, we leveraged the YOLOv8 neural network, trained on a custom dataset comprising over 1000 manually labeled images, to enable the robot to discern traversable areas from obstacles in real time.
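Once a segmentation model such as YOLOv8-seg labels each pixel as traversable or not, the result must be turned into something the navigation stack can consume. A minimal sketch of that step, mapping a per-pixel class mask to costmap-style cell costs, is shown below; the class IDs and helper names are illustrative assumptions, not the thesis's actual pipeline:

```python
# Sketch: convert a 2-D grid of predicted class IDs (e.g. from a
# segmentation network) into binary traversability costs that a
# costmap layer could consume. Class IDs are illustrative.
TRAVERSABLE_CLASSES = {0, 1}  # e.g. 0 = bare ground, 1 = tall grass/weeds
OBSTACLE_COST = 254           # "lethal obstacle" in the Nav2 costmap convention
FREE_COST = 0

def mask_to_costs(class_mask):
    """Map each cell's class ID to a costmap value."""
    return [
        [FREE_COST if cls in TRAVERSABLE_CLASSES else OBSTACLE_COST
         for cls in row]
        for row in class_mask
    ]

def traversable_fraction(class_mask):
    """Share of cells the robot may drive over."""
    cells = [cls for row in class_mask for cls in row]
    return sum(cls in TRAVERSABLE_CLASSES for cls in cells) / len(cells)

if __name__ == "__main__":
    mask = [
        [0, 0, 2],   # 2 = e.g. tree trunk -> obstacle
        [1, 1, 0],
    ]
    print(mask_to_costs(mask))        # [[0, 0, 254], [0, 0, 0]]
    print(traversable_fraction(mask)) # 0.8333...
```

In practice this decision would be projected from image space into the robot's ground plane before updating the costmap, so that grass and weeds no longer block the planner.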
This research contributes to the growing field of outdoor robotics and extends the applicability of autonomous robots in agriculture and similar domains.
Our approach is open-source and adaptable, offering a valuable resource for the robotics community to address real-world challenges in outdoor navigation.
Type
info:eu-repo/semantics/masterThesis
Collections
- Laurea Magistrale [4811]