Using the ROS Navigation Stack

In this post I cover how we can leverage the ROS navigation stack to let the robot autonomously drive from a given location in a map to a defined goal. I will skim over many of the details since the associated tutorials on the ROS wiki do a great job in describing how to set up the navigation stack.

With SLAM working on the Ardros robot (see my previous post) we already have much of the required setup covered. In particular we have:

  • A robot that publishes odometry information and accepts drive commands
  • (Faked) laser scanner sensor data derived from the Kinect sensor
  • Transform configurations covering the transformations between base_link, the odometry frame, and the laser scanner frame (depth frame of the Kinect sensor)
  • A map generated via SLAM

From a high-level perspective, navigation brings in these additions:

  • Localization of the robot in a given map using Monte Carlo Localization (MCL). For more information about MCL (including C# source code) please see my blog post Monte Carlo Localization for Robots.
  • Global and local path planning. Amongst other parameters, path planning most notably requires a map, information about the size (footprint) of the robot, and information about how fast the robot can drive and accelerate (see the sketch following this list).
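
To give a sense of the planner-related settings, here is a minimal sketch of a base_local_planner_params.yaml for a differential drive robot. The parameter names are the standard ones used by the base_local_planner package; the values are only illustrative and not the exact numbers used for Ardros:

TrajectoryPlannerROS:
  max_vel_x: 0.3              # maximum forward velocity in m/s (illustrative value)
  min_vel_x: 0.1              # minimum forward velocity in m/s (illustrative value)
  max_vel_theta: 1.0          # maximum rotational velocity in rad/s (illustrative value)
  min_in_place_vel_theta: 0.4 # minimum in-place rotational velocity (illustrative value)
  acc_lim_x: 1.0              # forward acceleration limit in m/s^2 (illustrative value)
  acc_lim_theta: 2.0          # rotational acceleration limit in rad/s^2 (illustrative value)
  holonomic_robot: false      # Ardros is a differential drive robot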

Before reading on, I highly recommend that you at least read through the ROS tutorial ‘Setup and Configuration of the Navigation Stack on a Robot’.

Okay, let’s dive into the Ardros-specific files. As described in my previous posts, the source code is available from here: http://code.google.com/p/drh-robotics-ros/. This post is based on revision 79 of the source code.

The whole navigation machinery is started via the launch file navigation.launch. The launch file itself is very simple:

<launch>
	<include file="$(find ardros)/launch/ardros_configuration.launch"/>
	<include file="$(find ardros)/launch/move_base.launch"/>
</launch>

It brings in two other launch files, the first being ardros_configuration.launch, which we already know from the previous post. It handles the Ardros-specific aspects, including odometry, drive commands, and the use of the Kinect sensor as a fake laser scanner.

The second include file, move_base.launch, is responsible for bringing up the navigation specific aspects:

<launch>
  <!-- Run the map server -->
  <node name="map_server" pkg="map_server" type="map_server" args="$(find ardros)/maps/map.pgm 0.05"/>

  <!--- Run AMCL -->
  <include file="$(find ardros)/launch/amcl_diff.launch" />

  <node pkg="move_base" type="move_base" respawn="false" name="move_base" output="screen">
    <rosparam file="$(find ardros)/info/costmap_common_params.yaml" command="load" ns="global_costmap" />
    <rosparam file="$(find ardros)/info/costmap_common_params.yaml" command="load" ns="local_costmap" />
    <rosparam file="$(find ardros)/info/local_costmap_params.yaml" command="load" />
    <rosparam file="$(find ardros)/info/global_costmap_params.yaml" command="load" />
    <rosparam file="$(find ardros)/info/base_local_planner_params.yaml" command="load" />
  </node>
</launch>

The first relevant line brings up the map server and specifies the map it should serve; the path to the map needs to be adjusted accordingly. Secondly, it imports the launch file for AMCL (adaptive, or KLD-sampling, Monte Carlo localization) configured for a robot with a differential drive (see odom_model_type in amcl_diff.launch). AMCL is responsible for localizing the robot on the map. AMCL uses numerous parameters, many of which are specified in the launch file. I largely use the default values, but I increased the particle counts. I expect that going forward I will fine-tune more of these parameters.
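
For orientation, the core of an amcl_diff.launch file might look like the excerpt below. The node and parameter names are the standard ones of the amcl package; the particle counts are illustrative values, not necessarily the ones checked into the repository:

<launch>
  <node pkg="amcl" type="amcl" name="amcl">
    <!-- odometry model for a differential drive robot -->
    <param name="odom_model_type" value="diff"/>
    <!-- increased particle counts (illustrative values) -->
    <param name="min_particles" value="500"/>
    <param name="max_particles" value="5000"/>
  </node>
</launch>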

It is worth noting that Ardros is quite similar to Willow Garage’s turtlebot in its use of the Kinect sensor in conjunction with the navigation stack. I started to use the node pointcloud_to_laserscan before it became part of the turtlebot stack and hence had to come up with my own launch files and associated parameters. In the future I should be able to borrow quite a bit from turtlebot, at a minimum better parameters for AMCL.

Finally, the last part of the move_base.launch file brings up the move_base node, which is the centerpiece of the ROS navigation stack (see the diagram and detailed explanation at http://www.ros.org/wiki/navigation/Tutorials/RobotSetup). The content of the referenced yaml files should be pretty clear after reading through the navigation stack tutorial. I will only briefly touch on the robot’s footprint, which is defined in the file costmap_common_params.yaml:

# for details see: http://www.ros.org/wiki/navigation/Tutorials/RobotSetup

obstacle_range: 2.5
raytrace_range: 3.0
footprint: [[0.305, 0.278], [0.04, 0.193], [-0.04, 0.193], [-0.282, 0.178], [-0.282, -0.178], [-0.04, -0.193], [0.04, -0.193], [0.305, -0.278]]
#footprint: [[0.075, 0.178], [0.04, 0.193], [-0.04, 0.193], [-0.282, 0.178], [-0.282, -0.178], [-0.04, -0.193], [0.04, -0.193], [0.075, -0.178]]
#robot_radius: ir_of_robot
inflation_radius: 0.6

observation_sources: laser_scan_sensor

laser_scan_sensor: {sensor_frame: openni_depth_frame, data_type: LaserScan, topic: scan, marking: true, clearing: true}

The specified points define a polygon that extends beyond the front of the robot; i.e., I pretend the robot is bigger than it actually is. The reason for this is that I want to make sure that the path planner does not generate paths that drive the robot closer to an obstacle than it can actually see. The minimum distance that the Kinect sensor can measure is about 0.6 m (2 ft). To mitigate this short-range ‘blindness’ I:

  1. Mount the Kinect towards the back of the robot
  2. Pretend that the robot footprint extends further beyond the front wheels than it actually does.
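
The remaining configuration files referenced in move_base.launch are not reproduced here. For orientation, a local_costmap_params.yaml along the lines of the navigation stack tutorial could look like the sketch below; the frame names follow the common odom / base_link convention, while the frequencies and window size are illustrative assumptions rather than the values checked into the repository:

local_costmap:
  global_frame: odom          # the odometry frame
  robot_base_frame: base_link
  update_frequency: 5.0       # illustrative value
  publish_frequency: 2.0      # illustrative value
  static_map: false           # the local costmap is not based on the static map
  rolling_window: true        # keep the costmap centered on the robot
  width: 4.0                  # size of the rolling window in meters (illustrative value)
  height: 4.0
  resolution: 0.05            # matches the resolution passed to map_server above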

Now we are ready to run the navigation stack. Close all terminals and then run the following command in a new terminal:

roslaunch `rospack find ardros`/launch/navigation.launch

In a second terminal bring up rviz:

rosrun rviz rviz -d `rospack find ardros`/rviz/navigation.vcg

You will probably see a number of error messages in the first terminal since the robot has not been localized in the map yet. To localize the robot, set the starting pose by clicking the ‘2D Pose Estimate’ button in rviz, then clicking on the robot’s current location in the map and, while still holding the left mouse button, dragging the mouse in the direction the robot faces. After specifying the goal in the same way via the ‘2D Nav Goal’ button, you should see the planned path and the robot should start driving towards the goal.
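
Instead of clicking in rviz, a navigation goal can also be sent from the command line by publishing to move_base’s simple goal topic. The topic and message type below are the standard move_base ones; the coordinates are placeholder values for a pose in the map frame (assuming the map frame is called ‘map’, which is the map_server default):

rostopic pub /move_base_simple/goal geometry_msgs/PoseStamped \
  '{ header: { frame_id: "map" }, pose: { position: { x: 1.0, y: 0.5, z: 0.0 }, orientation: { w: 1.0 } } }'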

Optionally you can monitor the status of the robot driving to the specified goal by running the following command in a third terminal:

rostopic echo /move_base/status

The video below is a recording of a test run showing the output in rviz while Ardros navigates a room. The video shows how the initial pose and the goal are selected. Initially the probabilistic cloud of poses used by AMCL is very wide; as the robot drives around, the cloud condenses because the robot’s position, determined from the odometry and laser scan information, becomes better defined. The video also shows the calculated path that the robot tries to follow to get to the specified goal.

17 thoughts on “Using the ROS Navigation Stack”

  1. Dr,

    Thanks for the posts. I’ve been planning the migration of my custom Arduino to ROS bit by bit for a while.

    I’m using python and a beagle board xM with a (Seeduino) Arduino Mega.

    Having a working example, especially for the various configuration files, nodes, etc., is extremely helpful.

    thanks
    Dave

  2. Dear Dr. Hessmer,

    This is a very useful example/guide to use for learning ROS. I just started working with ROS not too long ago, and needless to say, finding your blog is like finding a diamond in the rough. Thank you so much for sharing.

    cheers,
    Drew

  3. Dear Dr. Hessmer,

    Could you please explain how you get the fake laser scanner data from the depth data? I went through your code but didn’t understand how you did it.

    Thanks

  4. Dear Dr. Hessmer,

    Thanks for sharing your work.
    I’ve built a map and I’m trying to do some navigation, but one file is missing from the ardros path: /rviz/navigation.vcg.
    Where can I find this file ?

    Regards,
    Cyril

  5. Wow, time flies. Let me answer the questions in reverse order:

    Cyril, I added the file navigation.vcg to Subversion. You can access it from here: http://code.google.com/p/drh-robotics-ros/source/browse/trunk/ros/ardros/rviz/navigation.vcg

    Thomas, the translation from depth data to fake laser data is configured in the launch file /launch/kinect_laser.launch. The launch file causes the nodes pointcloud_to_laserscan/CloudThrottle and pointcloud_to_laserscan/CloudToScan to be loaded. Both nodes are part of the turtlebot stack (see http://www.ros.org/wiki/turtlebot).
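
    For readers looking for a starting point, a stripped-down kinect_laser.launch built around those two nodelets could look roughly like the sketch below; the exact node names, remappings, and parameter values in the Ardros repository may differ:

    <launch>
      <!-- nodelet manager hosting the pointcloud_to_laserscan nodelets -->
      <node pkg="nodelet" type="nodelet" name="kinect_laser_manager" args="manager"/>

      <!-- throttle the Kinect point cloud down to a lower rate -->
      <node pkg="nodelet" type="nodelet" name="cloud_throttle"
            args="load pointcloud_to_laserscan/CloudThrottle kinect_laser_manager">
        <param name="max_rate" value="2"/>
        <remap from="cloud_in" to="/camera/depth/points"/>
        <remap from="cloud_out" to="cloud_throttled"/>
      </node>

      <!-- collapse the throttled point cloud into a fake laser scan -->
      <node pkg="nodelet" type="nodelet" name="cloud_to_scan"
            args="load pointcloud_to_laserscan/CloudToScan kinect_laser_manager">
        <param name="output_frame_id" value="/openni_depth_frame"/>
        <remap from="cloud" to="cloud_throttled"/>
      </node>
    </launch>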

    Regards,
    Rainer

  6. Dear Rainer,
    Thank you so much for your last reply about SLAM. Now my robot can build a good map following your “2d SLAM with ROS and Kinect” tutorial. But there are some problems when I am using the ROS Navigation Stack: my robot drives well when the navigation path is a straight line on the map, but it often turns in circles nonstop when the path is not a straight line.

    Regards,
    yangyang

  7. Dear Dr. Hessmer,

    Could you please tell me which kind of algorithm is used for the path planning in the Arduino system? I can’t find it in the code.

    Thanks,
    Liang

  8. Dear Dr. Hessmer,

    Following your tutorials I have been able to make a map using the Kinect and turtlebot. But now, when I am trying navigation with amcl using rviz, I am getting this timeout message and my robot doesn’t go to the goal.

    [ WARN] [1356024461.311514700]: Costmap2DROS transform timeout. Current time: 1356024461.3112, global_pose stamp: 1356024460.8922, tolerance: 0.3000
    [ WARN] [1356024461.311840090]: Could not get robot pose, cancelling reconfiguration

    What should I look at for improvement?
    Regards

  9. Hi Dr., my name is Cris and I’m a Chilean master’s student who is using ROS. Right now I’m stuck and cannot continue with my thesis due to some problems with the gmapping and navigation stacks. I have been unable to make progress for several weeks, and I found that you understand a lot about the subject. I would really appreciate it if you could help me. Right now I can correctly create the map with gmapping, but when I use the navigation stack the robot does not use this map and simply goes to the goal directly. Also, when I look for obstacles, none appear in rviz, so I think that’s the problem. Unfortunately the help on the Internet has not been of help, and that’s why I send you this mail.
    Kind regards

    Cris

  10. Hi,
    First of all thanks for this excellent content in your blog.
    I am using USARSim/ROS (http://sourceforge.net/projects/usarsimros/). Currently I am able to move the robot in the environment constructed in the USARSim robotic simulator and generate a map of the environment through ROS.
    The package structure is given below:

    root
      fuerte_workspace
        sandbox
          usarsim (provided in the above link)
            usarsim_inf
              info (added by me)
                costmap_common_params.yaml
                local_costmap_params.yaml
                global_costmap_params.yaml
                base_local_planner_params.yaml
    Now I try to run the move_base.launch file which I have created inside the move_base package. The move_base file looks like below

    But I got “ResourceNotFound: usarsim_inf”. I have included the entire text that comes after
    “root@ubuntu:/opt/ros/fuerte/stacks/navigation/move_base# roslaunch move_base move_base.launch”:

    … logging to /root/.ros/log/be7a7d22-825b-11e3-8c4e-3c970e099562/roslaunch-ubuntu-3048.log
    Checking log directory for disk usage. This may take awhile.
    Press Ctrl-C to interrupt
    Done checking log file disk usage. Usage is <1GB.

    WARNING: ignoring defunct tag
    Traceback (most recent call last):
      File "/opt/ros/fuerte/lib/python2.6/dist-packages/roslaunch/__init__.py", line 257, in main
        p.start()
      File "/opt/ros/fuerte/lib/python2.6/dist-packages/roslaunch/parent.py", line 257, in start
        self._start_infrastructure()
      File "/opt/ros/fuerte/lib/python2.6/dist-packages/roslaunch/parent.py", line 206, in _start_infrastructure
        self._load_config()
      File "/opt/ros/fuerte/lib/python2.6/dist-packages/roslaunch/parent.py", line 121, in _load_config
        self.config = roslaunch.config.load_config_default(self.roslaunch_files, self.port, verbose=self.verbose)
      File "/opt/ros/fuerte/lib/python2.6/dist-packages/roslaunch/config.py", line 421, in load_config_default
        loader.load(f, config, verbose=verbose)
      File "/opt/ros/fuerte/lib/python2.6/dist-packages/roslaunch/xmlloader.py", line 694, in load
        self._load_launch(launch, ros_config, is_core=core, filename=filename, argv=argv, verbose=verbose)
      File "/opt/ros/fuerte/lib/python2.6/dist-packages/roslaunch/xmlloader.py", line 666, in _load_launch
        self._recurse_load(ros_config, launch.childNodes, self.root_context, None, is_core, verbose)
      File "/opt/ros/fuerte/lib/python2.6/dist-packages/roslaunch/xmlloader.py", line 601, in _recurse_load
        n = self._node_tag(tag, context.child(''), ros_config, default_machine, verbose=verbose)
      File "/opt/ros/fuerte/lib/python2.6/dist-packages/roslaunch/xmlloader.py", line 95, in call
        return f(*args, **kwds)
      File "/opt/ros/fuerte/lib/python2.6/dist-packages/roslaunch/xmlloader.py", line 347, in _node_tag
        self.opt_attrs(tag, context, ('machine', 'args', 'output', 'respawn', 'cwd', 'launch-prefix', 'required'))
      File "/opt/ros/fuerte/lib/python2.6/dist-packages/roslaunch/xmlloader.py", line 178, in opt_attrs
        return [self.resolve_args(tag_value(tag,a), context) for a in attrs]
      File "/opt/ros/fuerte/lib/python2.6/dist-packages/roslaunch/xmlloader.py", line 159, in resolve_args
        return substitution_args.resolve_args(args, context=context.resolve_dict, resolve_anon=self.resolve_anon)
      File "/opt/ros/fuerte/lib/python2.6/dist-packages/roslaunch/substitution_args.py", line 203, in resolve_args
        resolved = _find(resolved, a, args, context)
      File "/opt/ros/fuerte/lib/python2.6/dist-packages/roslaunch/substitution_args.py", line 139, in _find
        return resolved[0:idx-len(arg)] + r.get_path(args[0]) + resolved[idx:]
      File "/usr/lib/pymodules/python2.6/rospkg/rospack.py", line 167, in get_path
        raise ResourceNotFound(name, ros_paths=self._ros_paths)
    ResourceNotFound: usarsim_inf
    ROS path [0]=/opt/ros/fuerte/share/ros
    ROS path [1]=/opt/ros/fuerte/share
    ROS path [2]=/opt/ros/fuerte/stacks

    Thank you for your time.
