Navigation
A navigation system was developed to make the robot more adaptable, since using a real map allows greater precision than a system based on unitary movements.
The implementation was done using the navigation stack provided by ROS. To get the setup running, several requirements must be met: the sensor sources, the transform configuration, the odometry information, the base controller, the map, the robot model, and the navigation stack itself.
Sensor sources
The sensor sources are the sensors the robot uses to gather information about its surroundings; in this case, a 2D LiDAR. An LD06 LiDAR was used initially, but due to noise in its readings it was replaced with a YDLIDAR Tmini Pro. Each LiDAR has its own ROS package, which reads the sensor and publishes the data to a topic. The packages used were:
Each package has its own launch file, which launches the node that publishes the data to the /scan topic. The launch files used were:
Another sensor used was the IMU, which provides the orientation of the robot. The IMU used was the BNO055, together with the bno055 package. The launch file used to launch the node that publishes the data to the /imu/data topic was:
Map
The map was created using the Hector SLAM package, a SLAM algorithm that can run without odometry, together with the 2D LiDAR. To launch the mapping process, the following command was used:
This command launches the mapping process and the RViz visualization. Mapping is done by driving the robot around the environment so the LiDAR can observe it.
Planning
The planning is done using the navigation stack provided by ROS. Once set up, the navigation stack receives information from the different sources, as well as goals from the algorithm. Upon receiving a goal, it plans a path to the goal and calculates the velocity commands to send to the robot. The planner used was dwa_local_planner, a local planner that uses the Dynamic Window Approach to calculate velocity commands from the sensor sources, the odometry information, the map, and the robot model.
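To illustrate the idea behind the Dynamic Window Approach, the sketch below samples candidate (v, w) commands, forward-simulates each one, and keeps the pair whose endpoint lands closest to the goal. This is a simplified sketch, not the dwa_local_planner implementation: the window bounds, simulation horizon, and goal-distance-only cost are assumptions (the real planner also scores obstacle clearance and path adherence).

```python
import math

def simulate(x, y, yaw, v, w, dt=0.1, steps=10):
    """Forward-simulate a constant (v, w) command; return the end pose."""
    for _ in range(steps):
        x += v * math.cos(yaw) * dt
        y += v * math.sin(yaw) * dt
        yaw += w * dt
    return x, y, yaw

def best_command(pose, goal, v_window, w_window):
    """Pick the (v, w) pair whose simulated endpoint is closest to the goal."""
    best, best_cost = None, float("inf")
    for v in v_window:
        for w in w_window:
            gx, gy, _ = simulate(*pose, v, w)
            cost = math.hypot(goal[0] - gx, goal[1] - gy)
            if cost < best_cost:
                best, best_cost = (v, w), cost
    return best
```

In the real planner the window itself is recomputed every cycle from the robot's current velocity and acceleration limits, which is what makes the approach "dynamic".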
The velocity commands were initially published to the /cmd_vel topic, which the base controller uses to move the robot. However, in order to make use of the limit switches, the planner's velocity commands were published to another topic instead, and another node was created to listen both to those commands and to the velocity commands published by the algorithm (recovery situations), republishing to the /cmd_vel topic with priority given to the recovery situations. This was done by the custom mux_cmd_vel node.
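The priority logic of such a mux can be sketched as follows. This is a hypothetical reconstruction, not the actual mux_cmd_vel code: the class name, recovery timeout, and callback shape are assumptions. In the real node these methods would be rospy subscriber callbacks receiving geometry_msgs/Twist messages and republishing the selected one to /cmd_vel.

```python
import time

class CmdVelMux:
    """Forward planner commands unless a recovery command arrived recently."""

    def __init__(self, recovery_timeout=1.0):
        self.recovery_timeout = recovery_timeout  # seconds recovery keeps priority
        self.last_recovery_time = None

    def on_recovery_cmd(self, cmd, now=None):
        # Recovery commands always pass through and claim priority.
        self.last_recovery_time = now if now is not None else time.time()
        return cmd

    def on_planner_cmd(self, cmd, now=None):
        # Drop planner commands while a recovery maneuver is still active.
        now = now if now is not None else time.time()
        if (self.last_recovery_time is not None
                and now - self.last_recovery_time < self.recovery_timeout):
            return None
        return cmd
```

Passing `now` explicitly keeps the logic testable; the real callbacks would simply use the current ROS time.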
Sending goals
The goals are sent to the navigation stack using the move_base package. In this case the goals were limited to those sent by the algorithm: 90 degree turns and 30 cm movements forward or backward. In order to send accurate goals, a custom transform was used, which represents the ideal position of the robot at any given moment and compensates for inaccuracies in the robot's translational and rotational movement.
This transform was calculated using the IMU yaw data and the localization_grid data, and was published by a transform broadcaster.
- IMU data: Stored when the robot is initialized, and used to calculate the angle of each cardinal direction, which are then used to update the transform.
- Localization grid: Used to get the distance from the robot to the center of the current tile. Used to update the transform to send goals from the center of the tile.
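A minimal sketch of how the ideal pose could be derived from these two inputs is shown below. The tile size, function names, and grid-to-metres convention are assumptions; the real node would publish the resulting pose with a tf TransformBroadcaster.

```python
import math

TILE_SIZE = 0.30  # assumed 30 cm tiles, matching the fixed goal distance

def cardinal_yaws(yaw_zero):
    """Yaw of each cardinal direction, derived from the IMU yaw stored at startup."""
    return [(yaw_zero + i * math.pi / 2.0) % (2.0 * math.pi) for i in range(4)]

def ideal_pose(tile_row, tile_col, heading_index, yaw_zero):
    """Pose at the centre of the current tile, facing one cardinal direction.

    tile_row/tile_col come from the localization grid; heading_index selects
    one of the four cardinal yaws.
    """
    x = tile_col * TILE_SIZE
    y = tile_row * TILE_SIZE
    yaw = cardinal_yaws(yaw_zero)[heading_index]
    return x, y, yaw
```

Because the pose is always a tile centre facing a cardinal direction, goals sent relative to it are unaffected by the drift accumulated during earlier movements.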
Problems
Multiple problems were encountered during the development of the navigation system. The main one was noise in the LiDAR readings, which caused the robot to generate incorrect maps, making navigation impossible. This led to trying different LiDARs and implementing filters for the LiDAR readings. The filters included a custom script that reduced the laser scan readings by a given factor, allowing faster processing and a consistent number of points per scan, and the laser_filters package, which discarded readings that were too close to or too far from the robot.
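The decimation and range-limiting steps can be sketched as pure functions; the names and limits here are assumptions. In the real system the decimation ran in a custom node inside a sensor_msgs/LaserScan subscriber callback, while the range limits were handled by laser_filters.

```python
def reduce_scan(ranges, factor):
    """Keep every `factor`-th reading so each scan has a consistent size."""
    return ranges[::factor]

def clip_scan(ranges, min_range, max_range, invalid=float("inf")):
    """Mark readings outside [min_range, max_range] as invalid, mirroring
    what a laser_filters range filter does."""
    return [r if min_range <= r <= max_range else invalid for r in ranges]
```

Decimating before filtering keeps the per-scan workload bounded regardless of the LiDAR's native angular resolution.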