WORK IN PROGRESS Last Edited: 18/7/2020
The objective of this note is to document my steps in getting a Turtlebot2 with a Kobuki base localizing on a static map, so that I can reference it at a future date. The static map was created using the slam_toolbox package and laser scans from a RPLidar A2 mounted on top of a Turtlebot2. I use the AMCL package for localization, as I haven't been successful in getting the localization mode to work with the slam_toolbox package at the time of this writing (11/7/2020). I've placed all necessary launch files and robot descriptions in the following repo.
Constructing a Turtlebot2 that can run untethered is not cheap, as you can see from the hardware list.
| Component | Item | Price |
|---|---|---|
| Robot | Turtlebot2 w/ Kobuki base | $1049.00 |
| Power | Portable Charger RAVPower 27000mAh 85W (100W Max) | $129.00 |
- Install the necessary packages into your ROS workspace, as the turtlebot packages supporting Turtlebot2 have not been released for Melodic and are a work in progress (22/6/2020).
```shell
sudo apt install ros-melodic-laptop-battery-monitor
sudo apt install ros-melodic-openni2-camera
sudo apt install ros-melodic-openni2-launch
sudo apt install ros-melodic-gmapping
sudo apt install ros-melodic-yocs-msgs
sudo apt install ros-melodic-yujin-ocs
mkdir -p ros_ws/src
cd ros_ws/src
git clone https://github.com/turtlebot/turtlebot.git
cd turtlebot
git clone https://github.com/turtlebot/turtlebot_interactions.git
git clone https://github.com/turtlebot/turtlebot_apps
git clone https://github.com/turtlebot/turtlebot_msgs
cd ~/ros_ws/src
git clone --single-branch --branch melodic https://github.com/yujinrobot/kobuki.git
# To work with this guide, clone the turtlebot2_lidar repo.
git clone https://github.com/surfertas/turtlebot2_lidar.git
```
- If you take a look at the `turtlebot2_lidar` repo, you can see that I have created a URDF representation of the lidar device and placed the file in the `urdf/sensors` directory; in this case the file name is `rplidar_a2.urdf.xacro`. The specifications of the RPLidar A2 were taken from the device data sheet. Add the device to the robot description. You can use Onshape to design a mount, or feel free to use mine for the RPLidar A2. The Turtlebot data sheet comes in handy as well.
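For reference, here is a minimal xacro sketch of what such a sensor description can look like. The macro name, frame names, offsets, and dimensions below are illustrative placeholders, not the values from my repo or the data sheet:

```xml
<?xml version="1.0"?>
<robot xmlns:xacro="http://ros.org/wiki/xacro">
  <!-- Hypothetical sketch: mounts a lidar link on a parent link.
       Replace the origin offset and cylinder size with real values. -->
  <xacro:macro name="sensor_rplidar_a2" params="parent">
    <joint name="laser_joint" type="fixed">
      <origin xyz="0 0 0.4" rpy="0 0 0"/>  <!-- placeholder mount height -->
      <parent link="${parent}"/>
      <child link="laser"/>
    </joint>
    <link name="laser">
      <visual>
        <geometry>
          <cylinder radius="0.038" length="0.04"/>  <!-- placeholder dimensions -->
        </geometry>
      </visual>
    </link>
  </xacro:macro>
</robot>
```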
Steps to create a map
- Place your turtlebot on the power dock. It is recommended that the dock is fixed to a specific location free from obstruction. Boot up the NUC powered by the portable charger.
- You will need to ssh in from 4 panels: 1) bringup, 2) slam_toolbox, 3) rviz, 4) teleop. For panel 3), add the `-X` argument when using ssh to allow for X forwarding, which is needed when using RVIZ from your remote PC.
- Source the necessary environment from your working directory: run `source /opt/ros/melodic/setup.bash && source devel/setup.bash`.
- Next, on panel 1), check which port the lidar is connected to by executing `ls -l /dev | grep ttyUSB`. In my case, the lidar needed to be connected to `ttyUSB0`, as `ttyUSB1` was being used by the connection to the Turtlebot. This parameter is set as an environment variable when you run the environment script. Once the NUC is powered on, unplug any USB cables and make sure that the lidar is connected first, so that `ttyUSB0` is assigned to the lidar.
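As an alternative to relying on plug-in order, a udev rule can pin a stable symlink to the lidar. This is a sketch assuming the RPLidar A2 uses the common CP210x USB-serial chip (vendor `10c4`, product `ea60`); verify the IDs with `lsusb` on your own device:

```
# /etc/udev/rules.d/99-rplidar.rules (hypothetical file)
SUBSYSTEM=="tty", ATTRS{idVendor}=="10c4", ATTRS{idProduct}=="ea60", MODE="0666", SYMLINK+="rplidar"
```

After `sudo udevadm control --reload-rules && sudo udevadm trigger`, the lidar shows up as `/dev/rplidar` regardless of which `ttyUSB` number it is assigned, and the `chmod` step below becomes unnecessary.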
- Change permissions of the port by executing `sudo chmod 666 /dev/ttyUSB0`.
- Run `roslaunch turtlebot2_lidar bringup_minimal.launch` in panel 1). At this point you should see the lidar start to spin.
- On panel 2), run `roslaunch turtlebot2_lidar slam_toolbox_lidar.launch`. I had to add a `static_transform_publisher` to the launch file to publish a transform for the `laser` frame. Note that the `base_laser_link` was defined in the URDF.
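The static transform can be published from the launch file with something like the following. The frame names and the 0.4 m z-offset are placeholders; substitute the values that match your URDF:

```xml
<!-- static_transform_publisher args: x y z yaw pitch roll parent_frame child_frame period_ms -->
<node pkg="tf" type="static_transform_publisher" name="base_to_laser_tf"
      args="0 0 0.4 0 0 0 base_link laser 100"/>
```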
- On panel 3), start up RVIZ by running `roslaunch turtlebot_rviz_launchers view_navigation.launch`, which will kick off RVIZ on your remote laptop. You should see a top-down visual of the Turtlebot with scans picking up some nearby obstacles. Open the slam_toolbox panel in RVIZ by selecting from the top left menu: Panels -> Add New Panel -> slam_toolbox -> SlamToolboxPlugin.
- Finally, on panel 4), run `roslaunch turtlebot_teleop ps3_teleop.launch`. Make sure that the PS3 controller has been synced with the NUC. Steps to sync can be found here if you are having trouble.
- You are ready to create a map! Teleop around and continue until you are satisfied with the generated map.
- To save the map, use the SlamToolboxPlugin panel to input a name for Save Map and also for Serialize Map (to be used with slam_toolbox localization mode). Note that the name doesn't include the file type, just the prefix (e.g. if you want files like `<path_to_directory>/map`). If no path is specified, the files will be placed in `~/.ros`.
- Now you should have the necessary files needed for localization.
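Save Map writes a `map.pgm`/`map.yaml` pair (consumed by `map_server` for AMCL), while Serialize Map writes `map.posegraph`/`map.data` (consumed by slam_toolbox's localization mode). The yaml looks roughly like this; the resolution and origin values here are illustrative, not from my map:

```yaml
image: map.pgm                          # occupancy grid image
resolution: 0.050000                    # meters per pixel
origin: [-10.000000, -10.000000, 0.000000]  # x, y, yaw of lower-left pixel
negate: 0
occupied_thresh: 0.65
free_thresh: 0.196
```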
Steps to localize using the newly created map and the AMCL package
- Open a terminal panel and ssh into the NUC. Go to the root of the workspace directory and source the ROS environment: `source /opt/ros/melodic/setup.bash && source devel/setup.bash`.
- Set the environment variable in `turtlebot_env.sh` with the path to the `.yaml` file that was created in the previous map creation process. Execute the bash file to update the necessary environment variables.
- Run `bringup_minimal.launch` to boot up the physical turtlebot. Make sure that `TURTLEBOT_SERIAL_PORT` is set to `/dev/ttyUSB1`, or whatever port the turtlebot is connected to.
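A minimal sketch of what `turtlebot_env.sh` can contain. The map path is a placeholder, and while `TURTLEBOT_MAP_FILE` and `TURTLEBOT_SERIAL_PORT` are the standard turtlebot environment variables, double-check the names your launch files actually read:

```shell
#!/usr/bin/env bash
# Hypothetical turtlebot_env.sh: point the launch files at the saved map
# and at the serial port the Kobuki base is connected to.
export TURTLEBOT_MAP_FILE="$HOME/maps/map.yaml"  # placeholder path to your map yaml
export TURTLEBOT_SERIAL_PORT="/dev/ttyUSB1"      # port the Kobuki base is on
```

Run it with `source turtlebot_env.sh` rather than `./turtlebot_env.sh` so the exports persist in the current shell.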
- Launch RVIZ, following the same steps used when creating the map.
- Before launching the AMCL launch file, set the argument defaults for `initial_pose_a` with the position data obtained from the starting position in the mapping exercise (I used the dock in my case).
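In the launch file this amounts to something like the following. The pose values are placeholders for whatever the dock's pose was in the map frame; `initial_pose_x`, `initial_pose_y`, and `initial_pose_a` are the arguments the stock turtlebot `amcl.launch.xml` accepts:

```xml
<arg name="initial_pose_x" default="0.0"/>  <!-- placeholder: dock x in map frame -->
<arg name="initial_pose_y" default="0.0"/>  <!-- placeholder: dock y in map frame -->
<arg name="initial_pose_a" default="0.0"/>  <!-- placeholder: dock yaw in radians -->
```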
- Run `rplidar_a2_amcl.launch`. A couple of things to point out here. First, I am using the default `amcl.launch.xml` file, which hasn't been tuned for the RPLidar A2 specifically. Secondly, I've added the `static_transform_publisher` to define the transform for `laser` here as well, in addition to defining a parameter named `frame_id` with value set to `map`, which is passed to the `map_server`. Without these 2 steps I was getting transform errors. Use `rosrun rqt_tf_tree rqt_tf_tree` to see if the correct transforms are being published. In this case the expected tree links `map` -> `odom` -> `base_footprint` and on down to `laser`.
- If the laser scan isn't lining up with the map as you would expect, use 2D Pose Estimate to publish a pose estimate. Repeat until the scan matches as expected.
- Once the scans match up, you are ready to navigate. In RVIZ, use 2D Nav Goal to set a target goal and watch your turtlebot go.
Localization with slam_toolbox (INCOMPLETE)
- Repeat steps 1.-4. of localization with AMCL package.
- We need to modify the params yaml file that is passed to the `slam_toolbox` package. The original config dir is located at `slam_toolbox/config`; I moved the config directory under `turtlebot2_lidar` for easy modification.
- Change the `mode` parameter to localization from mapping. Further, you can set `map_start_at_dock`. Note that for `map_file_name` you don't include the file type (e.g. `/map.posegraph` would just be `/map`).
TODO: debug `Failed to compute laser pose, aborting continue mapping ("laser" passed to lookupTransform argument source_frame does not exist.)`. Tried publishing the transform via a different launch file, but it still doesn't work. Removing the `/` from `/laser` in the `static_transform_publisher` made it so `laser` was recognized, but now I'm getting a different issue: `Failed to compute laser pose, aborting continue mapping (Lookup would require extrapolation at time 1593875399.670327337, but only time 1593875399.833934289 is in the buffer, when looking up transform from frame [laser] to frame [base_footprint])`.