A mobile robot for autonomous exploration and obstacle avoidance in urban search and rescue environments, using the Hector software stack. The robot uses only one external sensor (a lidar) and requires no odometry source (such as wheel encoders). However, it can only be used on flat surfaces (unless an IMU is incorporated into the system).
To download all the relevant code used in the implementation of this mobile robot platform, first clone the GitHub repository:
git clone https://github.com/eteskeredzic/ETFmobileRobot
All the relevant code is in the folder labeled src, which is divided into subfolders.
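The workspace can be built with the standard catkin tools. Assuming a stock ROS installation is already sourced and the repository root is the workspace (a sketch, not taken from the repository itself), the following should be enough:
cd ETFmobileRobot
catkin_make
source devel/setup.bash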
After building the workspace, you can run all the relevant parts of the system. The main folder of the workspace contains several Bash scripts for automatic launch. To launch the robot, simply run the script 0.sh from a terminal. Alternatively, if you want to start the system manually, run the following commands (each in a separate terminal):
roslaunch rplidar_ros rplidar.launch
roslaunch hector_slam tutorial.launch
roslaunch nav.launch
rosrun hector_exploration_controller simple_exploration_controller
rosrun MotorController newController.py
Don’t forget to run source devel/setup.bash in each of these terminals first.
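For reference, an all-in-one launch script equivalent to the manual steps could look like the sketch below (the actual 0.sh in the repository may differ; the sleep delays are an assumption to give each component time to come up):
#!/bin/bash
# Start every component of the system, in dependency order
source devel/setup.bash
roslaunch rplidar_ros rplidar.launch &        # lidar driver
sleep 5                                       # let the driver come up
roslaunch hector_slam tutorial.launch &       # SLAM (mapping + pose estimation)
sleep 5
roslaunch nav.launch &                        # navigation
rosrun hector_exploration_controller simple_exploration_controller &  # autonomous exploration
rosrun MotorController newController.py       # motor controller (foreground)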
To launch the system for manual control of the robot, use the command:
rosrun key_teleop key_teleop.py
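Note that key_teleop publishes its velocity commands on the key_vel topic by default, so if the motor controller listens on a different topic (for example /cmd_vel, which is an assumption here, as the controller's input topic is not documented above), you can remap it at startup:
rosrun key_teleop key_teleop.py key_vel:=/cmd_vel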
Multiple parameters are available for modification, the most important being:
exploration_plan_generation_timer_ and cmd_vel_generator_timer_ (the timers that control how often a new exploration plan and new velocity commands are generated)
tolerance_trans (allowed error for linear movement) and tolerance_rot (allowed error for angular movement)
max_vel_lin (maximum linear velocity) and max_vel_th (maximum angular velocity)
min_vel_lin (minimum linear velocity) and min_vel_th (minimum angular velocity)
map_size, which should be kept at values no greater than 1024
The map update parameters should be set up in accordance with the specifications of the lidar sensor you are using.
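As an illustration, the follower limits can be overridden at startup through ROS private-parameter remapping, and map_size can be preset on the parameter server before launching the SLAM stack. This is only a sketch: the node names, parameter namespaces, and values below are assumptions that must be adapted to your setup.
rosparam set /hector_mapping/map_size 1024
rosrun hector_exploration_controller simple_exploration_controller \
    _tolerance_trans:=0.1 _tolerance_rot:=0.2 \
    _max_vel_lin:=0.3 _max_vel_th:=0.6 \
    _min_vel_lin:=0.05 _min_vel_th:=0.1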
Figure: First floor of the Faculty of Electrical Engineering in Sarajevo.
We also wrote a paper on this platform: Low cost UGV platform for autonomous 2D navigation and map-building based on a single sensory input.
I would like to thank my good friends and colleagues Adnan Arnautovic, Amar Bico, Damad Butkovic, and Kenan Karahodzic for their help in constructing, testing, and debugging this mobile robot platform.