From Tom Moore via ros-users@
I am pleased to announce the release of a new ROS package, robot_localization. The package estimates the state (3D pose and velocity) of a mobile robot through sensor fusion. Its features include:
* Fusion of an arbitrary number of sensors: the nodes do not restrict the number of input sources. If, for example, your robot has multiple IMUs or multiple sources of odometry information, the nodes within robot_localization can support all of them.
* Support for multiple ROS message types: all nodes in robot_localization can take in Odometry, Imu, PoseWithCovarianceStamped, or TwistWithCovarianceStamped messages.
* Per-sensor input customization: if a given sensor message contains data that you don't want to include in your state estimate, robot_localization's nodes allow you to exclude that data on a per-sensor basis.
* Continuous estimation: each node in robot_localization begins estimating the robot's state as soon as it receives a single measurement. If there is a holiday in the sensor data (i.e., a long period in which no data is received), the filter will continue to estimate the robot's state via a 3D motion model.
robot_localization currently contains only one node, ekf_localization, which, as the name implies, employs an extended Kalman filter. New nodes, such as an unscented Kalman filter node, will be added as they become available.
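As a rough illustration of the per-sensor configuration described above, the following minimal launch-file sketch fuses one wheel-odometry topic and one IMU topic. The topic names are placeholders, the boolean configuration vectors reflect an assumed typical setup, and the exact executable name, parameter names, and vector ordering should be checked against the package's wiki page:

<launch>
  <!-- Node/executable name follows this announcement; check the wiki
       for the exact name in your release. -->
  <node pkg="robot_localization" type="ekf_localization" name="ekf_localization">

    <!-- Filter output rate (Hz), and how long to wait before predicting
         through a gap in the sensor data using the motion model alone. -->
    <param name="frequency" value="30"/>
    <param name="sensor_timeout" value="0.1"/>

    <!-- First odometry source (placeholder topic name). The *_config vector
         selects which variables to fuse from this sensor, in the order
         x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw; the
         exact length and ordering depend on the package version. -->
    <param name="odom0" value="/wheel_odometry"/>
    <rosparam param="odom0_config">
      [true,  true,  false,
       false, false, true,
       true,  false, false,
       false, false, true]
    </rosparam>

    <!-- First IMU source (placeholder topic name): fuse orientation and
         angular velocity only. -->
    <param name="imu0" value="/imu/data"/>
    <rosparam param="imu0_config">
      [false, false, false,
       true,  true,  true,
       false, false, false,
       true,  true,  true]
    </rosparam>

  </node>
</launch>

Additional inputs follow the same numbered pattern (odom1, imu1, and so on), each with its own configuration vector.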
robot_localization is currently available for ROS Groovy, Hydro, and Indigo. More information is available on the package's wiki page at http://wiki.ros.org/robot_localization.
Development of this node was funded by Charles River Analytics, Inc.