2D SLAM with ROS and Kinect

Introduction

So far we have not really tapped into the power of ROS. Teleoperation and dead reckoning were just a way to 'warm up' and get familiar with the basics of ROS. This is about to change as we get ready to leverage ROS's implementation of SLAM (Simultaneous Localization and Mapping). The associated ROS package is gmapping, which wraps the open source GMapping implementation available on OpenSLAM.org. In order to use gmapping we need a source of odometry information (Ardros already provides this) and a source of depth information. Typically laser scanners are used to get the depth information. They are accurate and fast but also quite expensive and hence not a very attractive choice for hobby robotics. Instead of a laser scanner I am using the Microsoft Kinect sensor, which provides RGB-D data (color video plus depth information).

Hooking up the Kinect Sensor

The following assumes that you have the desktop-full package of ROS Diamondback installed. In addition we need to install the openni_kinect package as described at http://www.ros.org/wiki/openni_kinect. Once it is in place we can verify that the depth sensor information comes in. After hooking up the Kinect sensor, type the following command in a terminal:

roslaunch openni_camera openni_node.launch

Then in a different terminal:

rosrun rviz rviz

In rviz, set the Fixed Frame parameter under Global Options to /openni_camera. Then add a PointCloud2 display type and select /camera/rgb/points as its topic. You should now see a color-coded depth image similar to the one below:

[Figure: color-coded Kinect depth point cloud rendered in rviz]

Additional information about how the Kinect sensor is represented in ROS can be found here: http://www.ros.org/wiki/openni_camera
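Independent of rviz, a quick sanity check on the command line confirms that the point cloud topic is actually publishing and at what rate:

rostopic hz /camera/rgb/points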

Faking a Laser Scanner

To use the depth image for SLAM we convert the point cloud to a faked laser scan signal by cutting a horizontal slice out of the image and using the nearest distance (closest depth) in each column. Fortunately we don't need to code this ourselves: the pointcloud_to_laserscan package, which is part of the turtlebot stack, already covers this. The turtlebot stack does not come with desktop-full and needs to be installed separately.
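The conversion is simple enough to sketch in a few lines. The following Python fragment is only an illustration of the idea, not the actual nodelet code (which is C++ and lives in the turtlebot stack); the function name, the slice boundaries, and the bin count are made up for this example:

import numpy as np

def cloud_slice_to_scan(points, min_z=-0.10, max_z=0.10,
                        angle_min=-0.5, angle_max=0.5, num_bins=640):
    # points: N x 3 array of (x, y, z) coordinates in a frame where
    # x points forward, y points left, and z points up
    ranges = np.full(num_bins, np.inf)
    bin_width = (angle_max - angle_min) / num_bins
    for x, y, z in points:
        if not (min_z <= z <= max_z):
            continue  # point lies outside the horizontal slice
        angle = np.arctan2(y, x)
        if not (angle_min <= angle < angle_max):
            continue  # outside the (narrow) Kinect field of view
        i = int((angle - angle_min) / bin_width)
        # keep the nearest distance seen in this angular column
        ranges[i] = min(ranges[i], np.hypot(x, y))
    return ranges  # one range per angular bin, like sensor_msgs/LaserScan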

The remainder of this blog post requires revision 70 of my Ardros package. For details about how to get the Ardros source code please refer to my blog entry Ardros – Dead Reckoning, but make sure that you use the newer revision.

We are now in a position to look at the laser scan output in rviz:

  • Close all open terminals
  • Open a new terminal and execute the following
roscd ardros
roslaunch ./launch/kinect_laser.launch
  • Open a new terminal and execute
rosrun rviz rviz
  • In rviz
    • Set the Fixed Frame parameter under Global Options to /openni_depth_frame
    • Add the display type ‘Laser Scan’ and select the topic /scan for it.

You should now see the laser scan data as derived from the Kinect's point cloud. For the screenshot below I added a grid with a cell size of 0.1 m to the rviz display and also included a tf (transform) display showing only the /openni_camera frame. For clarification I added the x, y, z labels after taking the screenshot.

[Figure: laser scan derived from the Kinect point cloud, shown in rviz with a 0.1 m grid and the /openni_camera tf frame]
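Independent of rviz you can inspect the scan messages directly, e.g. to check the angle range and the number of range readings per scan:

rostopic echo -n 1 /scan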

The launch file 'kinect_laser.launch' warrants a closer look:

<launch>
  <!-- kinect and frame ids -->
  <include file="$(find openni_camera)/launch/openni_node.launch"/>

  <!-- openni_manager -->
  <node pkg="nodelet" type="nodelet" name="openni_manager" output="screen" respawn="true" args="manager"/>

  <!-- throttling -->
  <node pkg="nodelet" type="nodelet" name="pointcloud_throttle" args="load pointcloud_to_laserscan/CloudThrottle openni_manager">
    <param name="max_rate" value="2"/>
    <remap from="cloud_in" to="/camera/depth/points"/>
    <remap from="cloud_out" to="cloud_throttled"/>
  </node>

  <!-- fake laser -->
  <node pkg="nodelet" type="nodelet" name="kinect_laser" args="load pointcloud_to_laserscan/CloudToScan openni_manager">
    <param name="output_frame_id" value="/openni_depth_frame"/>
    <remap from="cloud" to="cloud_throttled"/>
  </node>
</launch>

The first include statement brings in the openni_node launch file, which starts the Kinect camera (openni_camera) and the tf transformations that link the various camera frames (depth, rgb, etc.) together.

Then a node of type nodelet, named openni_manager, is spun up. It acts as the container for other nodelets. Nodelets are like nodes, but instead of running in their own process they are loaded into a parent node of type nodelet. Since multiple nodelets execute in the same process they can share data very efficiently, which is relevant when large amounts of data (like video frames) are processed. The launch file causes two nodelets to be loaded: one a) throttles the RGB-D frame rate down to 2 Hz and the other b) cuts a slice out of each RGB-D frame, calculates the nearest distance column by column, and assembles the result into a laser scan data structure published on the topic /scan.

Both nodelets are part of the turtlebot stack. CloudToScan supports a couple of parameters that define the size and location of the slice that is used as the basis for the laser scan data. I use the default values. For details please refer to the source code.
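If the default slice does not fit your robot, it can be adjusted via parameters on the CloudToScan nodelet. The snippet below illustrates the idea with made-up values; the parameter names (min_height, max_height) are taken from the Diamondback-era turtlebot sources, so verify them against the version you actually have installed:

  <!-- fake laser with an explicitly chosen slice (illustrative values only) -->
  <node pkg="nodelet" type="nodelet" name="kinect_laser" args="load pointcloud_to_laserscan/CloudToScan openni_manager">
    <param name="output_frame_id" value="/openni_depth_frame"/>
    <param name="min_height" value="-0.15"/> <!-- lower edge of the slice in meters -->
    <param name="max_height" value="0.15"/>  <!-- upper edge of the slice in meters -->
    <remap from="cloud" to="cloud_throttled"/>
  </node>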

Putting it all Together

Finally we are ready to get SLAM working. As inputs the SLAM algorithm requires odometry information and laser scan data. We provide both by using the launch file 'ardros_configuration.launch':

<launch>
  <node pkg="tf" type="static_transform_publisher" name="base_to_kinect_broadcaster" args="-0.115 0 0.226 0 0 0 base_link openni_camera 100" />

  <!-- we use the Kinect sensor as a fake laser scanner -->
  <include file="$(find ardros)/launch/kinect_laser.launch"/>

  <node name="arduino" pkg="ardros" type="arduino.py">
    <rosparam file="$(find ardros)/info/ardros.yaml" command="load" />
  </node>
</launch>

The content is best explained from bottom to top: we launch the arduino node, which represents the robot, and also the Kinect-based laser scanner as discussed above. In addition we need to publish a tf transform that relates the odometry frame base_link to the openni_camera frame. This transformation makes it possible to transform the laser scan data into the base_link frame. In the photo below you can see where the Kinect sensor is located on the robot.

[Photo: the Kinect sensor mounted on the robot]
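A quick way to verify that this transform is actually being published is tf_echo (or render the whole tf tree with 'rosrun tf view_frames'):

rosrun tf tf_echo base_link openni_camera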

Finally we need to launch slam_gmapping. All of this is wrapped up in the slam.launch file:

<launch>
	<include file="$(find ardros)/launch/ardros_configuration.launch"/>

	<node pkg="gmapping" type="slam_gmapping" name="slam_gmapping" output="screen">
		<!--remap from="scan" to="base_scan"/-->
		<!--param name="odom_frame" value="odom"/-->
		<param name="map_update_interval" value="2.0"/>
		<param name="maxUrange" value="6.0"/>
		<param name="sigma" value="0.05"/>
		<param name="kernelSize" value="1"/>
		<param name="lstep" value="0.05"/>
		<param name="astep" value="0.05"/>
		<param name="iterations" value="5"/>
		<param name="lsigma" value="0.075"/>
		<param name="ogain" value="3.0"/>
		<param name="lskip" value="0"/>
		<param name="srr" value="0.01"/>
		<param name="srt" value="0.02"/>
		<param name="str" value="0.01"/>
		<param name="stt" value="0.02"/>
		<param name="linearUpdate" value="0.25"/>     <!-- param name="linearUpdate" value="0.5"/-->
		<param name="angularUpdate" value="0.262"/>   <!--param name="angularUpdate" value="0.436"/-->
		<param name="temporalUpdate" value="-1.0"/>
		<param name="resampleThreshold" value="0.5"/>
		<param name="particles" value="300"/>
		<param name="xmin" value="-50.0"/>
		<param name="ymin" value="-50.0"/>
		<param name="xmax" value="50.0"/>
		<param name="ymax" value="50.0"/>
		<param name="delta" value="0.05"/>
		<param name="llsamplerange" value="0.01"/>
		<param name="llsamplestep" value="0.01"/>
		<param name="lasamplerange" value="0.005"/>
		<param name="lasamplestep" value="0.005"/>
	</node>
</launch>

The various gmapping parameters are explained at http://www.ros.org/wiki/gmapping. There is plenty of room for fine tuning, but the values listed above work fairly well. I mainly use the default values, but I specifically chose a relatively high particle count to increase the accuracy, which otherwise suffers from the fact that the Kinect sensor has a much smaller horizontal field of view than a real laser scanner. So let's run it:

  • Close all open terminals
  • Open a new terminal and execute the following
roscd ardros
roslaunch ./launch/slam.launch
  • Open a new terminal and execute
roscd ardros
rosrun rviz rviz -d ./rviz/slam.vcg

We can now drive the robot around for a while and watch as the map is assembled. When the map is completed it can be saved:

  • Open a new terminal and execute
roscd ardros
rosrun map_server map_saver -f ./maps/map
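map_saver writes two files: the occupancy grid itself as an image (map.pgm) and a metadata file (map.yaml). Given the delta, xmin and ymin values in slam.launch above, the metadata should look roughly like the following sketch (the thresholds are map_server defaults; your values may differ):

image: map.pgm
resolution: 0.050000
origin: [-50.000000, -50.000000, 0.000000]
negate: 0
occupied_thresh: 0.65
free_thresh: 0.196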

As an example, here is a snapshot of a sample run and a screen recording that shows how it evolved.

[Figure: map snapshot from a sample SLAM run]

33 thoughts on “2D SLAM with ROS and Kinect”

  1. Dear Rainer,

    I downloaded and built your sources. I commented out the following lines since I don't have an Arduino:

    Then I ran "roslaunch ./launch/slam.launch" and got this warning:

    [ WARN] [1302487709.525039769]: MessageFilter [target=/odom ]: Dropped 100.00% of messages so far. Please turn the [ros.gmapping.message_notifier] rosconsole logger to DEBUG for more information.
    [ WARN] [1302487769.536334306]: MessageFilter [target=/odom ]: Dropped 100.00% of messages so far. Please turn the [ros.gmapping.message_notifier] rosconsole logger to DEBUG for more information.

    In rviz I get the following errors. In Laser Scan: "For frame [/openni_depth_frame]: No transform to fixed frame [/map]. TF error: [Could not find a connection between '/map' and '/openni_depth_frame' because they are not part of the same tree. Tf has two or more unconnected trees.]" And in Axes: "For frame [/base_link]: No transform to fixed frame [/map]. TF error: [Could not find a connection between '/map' and '/base_link' because they are not part of the same tree. Tf has two or more unconnected trees.]"

    Could you give any advice how to fix it?

    Regards,
    Alex

  2. Hi Alex,

    Your message got stuck in the spam filter. I found it by accident today. It is difficult for me to give you a conclusive answer since I don’t know what you replaced the arduino node and controller with. The transformation between the base_link and the map frame is established via the odom messages (including transformations) that are published by the arduino node.

    Regards,
    Rainer

  3. Dear Rainer,
    During this SLAM demo, how was the robot driven? Did it drive automatically via the SLAM algorithm, or did you control it remotely through ROS?

    Regards,
    yangyang

  4. Hi yangyang,

    While running the SLAM algorithm the robot needs to be manually driven around. For this I either simply push the robot by hand or drive it via the game controller. Once the robot has seen enough of its surroundings you can complete the SLAM process and save the map. From this point onwards the robot can drive autonomously to a specified point in the map.

    Regards,
    Rainer

  5. Dear Rainer,
    Thank you so much for your reply.
    In your reply you said, "For this I either simply push the robot manually or I drive it via the game controller." But if you push your robot manually, does it still send odometry information to ROS?
    I don't have a game controller, so if I push my robot manually during the SLAM demo it will not send odometry information to ROS. I think I need to buy a joystick to drive my robot during the SLAM demo. Am I right?

    Regards,
    yangyang

  6. Hi yangyang,

    In my case I do get odometry information when I manually push the robot. The motors are not powered but the wheels turn and the wheel encoder signals are retrieved. Essentially everything is hooked up; I just disconnect the power from the motors. It is convenient to have a game controller, but don't buy one just for the purpose of doing SLAM.

  7. Hi, Dr. Rainer

    I have done almost everything in this tutorial, but when I get to the last part, which is this:

    roscd ardros
    rosrun rviz rviz -d ./rviz/slam.vcg

    With my Kinect plugged in and all, I am getting this message:

    File “/opt/ros/electric/stacks/ros/ardros/nodes/arduino.py”, line 45, in
    from ardros.srv import *
    ImportError: No module named ardros.srv
    [arduino-11] process has died [pid 16431, exit code 1].
    log files: /root/.ros/log/972dace0-4e9a-11e1-b079-00242bb766af/arduino-11*.log
    [ INFO] [1328296174.533666662]: [/openni_node1] Number devices connected: 1
    [ INFO] [1328296174.535273570]: [/openni_node1] 1. device on bus 001:42 is a Xbox NUI Camera (2ae) from Microsoft (45e) with serial id ‘A00365A25924043A’
    [ INFO] [1328296174.558966173]: [/openni_node1] searching for device with index = 1
    [ INFO] [1328296174.711290279]: [/openni_node1] Opened ‘Xbox NUI Camera’ on bus 1:42 with serial number ‘A00365A25924043A’
    [ INFO] [1328296174.815792670]: rgb_frame_id = ‘/openni_rgb_optical_frame’
    [ INFO] [1328296174.828459226]: depth_frame_id = ‘/openni_depth_optical_frame’
    [ WARN] [1328296186.150941419]: MessageFilter [target=/odom ]: Dropped 100.00% of messages so far. Please turn the [ros.gmapping.message_notifier] rosconsole logger to DEBUG for more information.

    what does it mean? could you help me correct it?

    Thanks

  8. Hi Hector,

    It looks like you are only missing the package build step. It generates the source code for the service definition. This is explained in Creating a ROS msg and srv. To build the ardros package please run the following in a terminal:

    rosmake ardros

    After the completion of the build you should see the generated python wrapper files in /ardros/src/ardros/srv.

    Regards,
    Rainer

  9. Dr. Rainer, now I'm getting this on the console. Also, I wanted to know whether the default configuration in rviz for grid, fixed frame, laser scan and axes is the one I should use, or whether I must change the topics of those.

    This is the error:
    raise SerialException(“could not open port %s: %s” % (self._port, msg))
    serial.serialutil.SerialException: could not open port /dev/ttyUSB0: [Errno 2] No such file or directory: ‘/dev/ttyUSB0′
    [arduino-11] process has died [pid 1789, exit code 1].
    log files: /root/.ros/log/65bd5346-51aa-11e1-b079-00242bb766af/arduino-11*.log
    [ INFO] [1328632825.160461453]: [/openni_node1] Number devices connected: 1
    [ INFO] [1328632825.163598235]: [/openni_node1] 1. device on bus 001:07 is a Xbox NUI Camera (2ae) from Microsoft (45e) with serial id ‘A00365A25924043A’
    [ INFO] [1328632825.184956188]: [/openni_node1] searching for device with index = 1
    [ INFO] [1328632825.345177327]: [/openni_node1] Opened ‘Xbox NUI Camera’ on bus 1:7 with serial number ‘A00365A25924043A’
    [ INFO] [1328632825.434232842]: rgb_frame_id = ‘/openni_rgb_optical_frame’
    [ INFO] [1328632825.453714435]: depth_frame_id = ‘/openni_depth_optical_frame’
    [ WARN] [1328632831.885187166]: MessageFilter [target=/odom ]: Dropped 100.00% of messages so far. Please turn the [ros.gmapping.message_notifier] rosconsole logger to DEBUG for more information.
    [ WARN] [1328632891.888258715]: MessageFilter [target=/odom ]: Dropped 100.00% of messages so far. Please turn the [ros.gmapping.message_notifier] rosconsole logger to DEBUG for more information.
    [ WARN] [1328632951.888695502]: MessageFilter [target=/odom ]: Dropped 100.00% of messages so far. Please turn the [ros.gmapping.message_notifier] rosconsole logger to DEBUG for more information.

  10. Dr. Rainer,
    I've solved the problem stated above; however, in rviz I am still having problems. The Axes display shows an error stating:
    For frame [/base_link]: No transform to fixed frame [/map]. TF error: [Could not find a connection between '/map' and '/base_link' because they are not part of the same tree. Tf has two or more unconnected trees.]
    Also, for Laser Scan it shows:
    Showing [0] points from [0] messages
    Topic No messages received
    Transform[sender =/openni_manager]
    For frame [/openni_depth_frame]: No transform to fixed frame [/map]. TF error: [Could not find a connection between '/map' and '/openni_depth_frame' because they are not part of the same tree. Tf has two or more unconnected trees.]

    Thanks

  11. Hi Hector,

    I just published a new post Ardros – Transform Between base_link and the Kinect Sensor which might be helpful for you. Please have a look at the image that illustrates how the various coordinate systems (frames) are connected and then analyze the tree on your system. I suspect that the odometry frame is not hooked up in your case.

    Rainer

  12. As you said, the connection between odom and base_link is missing. How do I create a connection between those?

  13. I created a connection by adding this node in ardros_configuration.launch:

    and in rviz I no longer get any errors, but it doesn't produce a map =(

  14. In rviz it registers the first scan, but when I move the robot left or right it doesn't update the map, nor does it detect that it has been moved.

  15. Hi Dr Hessmer, thank you for taking the effort to create this blog. It helps me understand ROS pretty quickly.

    By the way, is it possible to post the rxgraph with all topics on? Thank you so much.

    I am having some trouble with "rosrun map_server map_saver -f ./maps/map"; it keeps on waiting for the map. =(

    (Btw, I am trying to do this using an NXT and a Kinect instead of an Arduino.)

    Thank you so much once again, you are very kind-hearted.

  16. Hi,

    I just wanted to know: when we specify maxUrange as 6, is it in cm or m? If it is in m, isn't the max range of the Kinect about 5 m?

  17. Hi,
    When I try to run roslaunch ./launch/slam.launch, I get this error:

    Traceback (most recent call last):
    File “/home/alex/drh-robotics-ros/ros/ardros/nodes/arduino.py”, line 46, in
    from ardros.msg import *
    ImportError: No module named msg

    I have already run rosmake ardros, but I'm still getting this error; and though rviz opens, there are several errors on these items:
    Laser Scan:
    For frame [/openni_depth_frame]: No transform to fixed frame [/map]
    Axes:
    For frame [/base_link]: No transform to fixed frame [/map]

    Thank you very much for this post it’s very useful for my Dissertation

  18. Hi Dr. Rainer Hessmer,

    This blog is really very useful for getting SLAM to work with the Kinect and the point cloud to laser scan transformation. However, I have a query about setting up the arduino node in the slam.launch file: I am working on a turtlebot with a Roomba base and have already started the node corresponding to the Roomba. So in the launch file can I just omit the node that represents the robot and keep just the static_transform_publisher and the faked laser part? Could you please let me know if I am missing something here?

    Thanks,
    Devasena

  19. Hi Dr Hessmer,
    I am doing a project related to 3D mapping on a quadrotor. I am using hector_mapping to create a 2D map and OctoMap to create a 3D map. But my 2D SLAM is not very good; I do not know whether it is because my 2D laser scanner (Hokuyo URG-04LX) has a very low scan rate of 10 Hz.
    I am now trying to use a Kinect converted to a fake laser in place of my real laser to see the result, but I have some problems with the frame transforms :)
    My previous tf tree in hector_mapping is:
    map → base_footprint → base_stabilized → base_link → laser_link
    map → scanmatcher_frame
    How should I add the Kinect frames to this tree?

  20. Hi Dr Rainer Hessmer,

    I am doing a project related to 2D mapping and navigation with the Hokuyo laser. I am having a lot of trouble making it work. Do you have beginner code for it or maybe any advice? If you do, I will appreciate your help.

    Thanks,
    Andres

  21. Hello Dr. Hessmer,

    I'm running Ubuntu 12.04, ROS Hydro and OpenNI, and now I'm trying desperately to connect my Kinect to rviz.
    My aim is to realize a SLAM algorithm on my Eddie robot; therefore I'm using this guide.

    The first steps are easy; installing ROS and OpenNI doesn't cause any problems. But after the install process, neither rviz nor OpenNI is able to locate the Kinect. rviz says: "No tf data. Actual error: Fixed Frame [openni_camera] does not exist." OpenNI says: "no device connected" (after using roslaunch openni_launch openni.launch to start the framework).

    Ubuntu itself connects to the Kinect: lsusb displays the connection, so I think the drivers that come with OpenNI are actually working.

    What am I doing wrong?
    Thanks for your help!!!

    Robin

  22. Hello Dr.Hessmer,

    I'm using Ubuntu 12.04 with Groovy and 12.10 with Hydro.

    I tried your work first on Hydro with Ubuntu 12.10, but unfortunately I ran into many problems, as I use freenect instead of OpenNI and your packages do not support Hydro (sorry if I'm wrong, I am new to ROS).

    Then I tried the second setup, Ubuntu 12.04 with Groovy. I could follow all of your steps, although I needed to modify or add the openni_... .yaml files and edit kinect_laser.launch due to changes in the openni packages. But when I launch slam.launch I repeatedly get some errors, including fatal ones, and when rviz opens (launched with slam.vcg) all I get is a black screen.

    I hope that you can help me make it work like yours. My goal is to run your work on my Hydro ROS, but if that can't be done I hope that it can at least work on Groovy.

    I hope there is a way to make this 2D SLAM work rather than a black screen in rviz or a fatal error.

    Sorry for asking so many things. Thank you!

    Davine

  23. Hello Dr Rainer;

    I need a bit of your guidance. I am working on navigation using the Microsoft Kinect One, and I would like some guidance on how to control the robot motors using Kinect images.
