RTAB-Map (Real-Time Appearance-Based Mapping) is an RGB-D Graph-Based SLAM approach based on an incremental appearance-based loop closure detector. The loop closure detector uses a bag-of-words approach to determine how likely a new image comes from a previous location rather than a new one. When a loop closure hypothesis is accepted, a new constraint is added to the map's graph, and a graph optimizer then minimizes the errors in the map. A memory management approach limits the number of locations used for loop closure detection and graph optimization, so that real-time constraints on large-scale environments are always respected. RTAB-Map can be used alone with a hand-held Kinect or stereo camera for 6DoF RGB-D mapping, or on a robot equipped with a laser rangefinder for 3DoF mapping.
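The bag-of-words idea above can be sketched as follows. This is a minimal illustration, not RTAB-Map's actual implementation: the class name, tf-idf scoring, and acceptance threshold are assumptions, and RTAB-Map's real detector additionally runs a Bayesian filter over hypotheses and applies the memory management described above, both omitted here. Images are assumed to be pre-quantized into lists of visual-word IDs (in practice, local feature descriptors matched against a vocabulary).

```python
import math
from collections import Counter

class BagOfWordsLoopDetector:
    """Illustrative sketch of appearance-based loop closure detection.

    Hypothetical names and thresholds; not RTAB-Map's API. Each location
    is stored as a bag of visual-word IDs, and new images are compared
    against memory with a cosine similarity over tf-idf weights.
    """

    def __init__(self, accept_threshold=0.3):
        self.locations = []        # one Counter of word IDs per stored location
        self.doc_freq = Counter()  # how many locations contain each word
        self.accept_threshold = accept_threshold

    def _tfidf(self, words):
        # Weight each word by term frequency times inverse document frequency,
        # then L2-normalize so the dot product below is a cosine similarity.
        n = max(len(self.locations), 1)
        counts = Counter(words)
        vec = {w: tf * math.log((1 + n) / (1 + self.doc_freq[w]))
               for w, tf in counts.items()}
        norm = math.sqrt(sum(v * v for v in vec.values())) or 1.0
        return {w: v / norm for w, v in vec.items()}

    def detect_and_add(self, words):
        """Score the new image against all stored locations, then memorize it.

        Returns (best_location_index, score) if a loop closure hypothesis
        is accepted, else (None, best_score).
        """
        query = self._tfidf(words)
        best_idx, best_score = None, 0.0
        for i, loc_counts in enumerate(self.locations):
            ref = self._tfidf(loc_counts.elements())
            score = sum(query.get(w, 0.0) * v for w, v in ref.items())
            if score > best_score:
                best_idx, best_score = i, score
        # Add the new image as a location in memory.
        self.locations.append(Counter(words))
        for w in set(words):
            self.doc_freq[w] += 1
        if best_score >= self.accept_threshold:
            return best_idx, best_score
        return None, best_score
```

Revisiting a place yields a high similarity to the stored location, which in the full system would become a new loop closure constraint in the graph before optimization.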
- M. Labbé and F. Michaud, “Online Global Loop Closure Detection for Large-Scale Multi-Session Graph-Based SLAM,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, 2014. (IEEE Xplore)
- Results shown in this paper can be reproduced with the Multi-session mapping tutorial.
Loop closure detection
- M. Labbé and F. Michaud, “Appearance-Based Loop Closure Detection for Online Large-Scale and Long-Term Operation,” in IEEE Transactions on Robotics, vol. 29, no. 3, pp. 734-745, 2013. (IEEE Xplore)
- M. Labbé and F. Michaud, “Memory management for real-time appearance-based loop closure detection,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, 2011, pp. 1271–1276. (IEEE Xplore)
- Visit RTAB-Map’s page on IntRoLab for detailed information on the loop closure detection approach and related datasets.
- Installation instructions.
- For ROS users, take a look at the rtabmap page on the ROS wiki for a package overview. See also SetupOnYourRobot to learn how to integrate RTAB-Map on your robot.
- Visit the wiki.
- Ask a question on RTAB-Map Forum.
- Post an issue on GitHub
- For the loop closure detection approach, visit RTAB-Map's page on the IntRoLab website.
- Visit rtabmap_ros wiki page for nodes documentation, demos and tutorials on ROS.
- Ask a question on answers.ros.org with the rtabmap or rtabmap_ros tag.
- If OpenCV is built without the nonfree module, RTAB-Map can be used under the permissive BSD License.
- If OpenCV is built with the nonfree module, RTAB-Map is free for research use only, because it depends on SURF and SIFT features, which are not free for commercial use.
- SURF noncommercial notice: http://www.vision.ee.ethz.ch/~surf/download.html
- SIFT patent: http://www.cs.ubc.ca/~lowe/keypoints/
- Mathieu Labbé
- RTAB-Map’s page at IntRoLab
- Similar projects: Find-Object
- If you find this project useful, you can help me keep it updated by buying me a cup of coffee with the link below :P. Receiving new sensors to test, and even to support in RTAB-Map for quick SLAM demonstrations (e.g., stereo cameras, RGB-D cameras, 2D/3D LiDARs), is also appreciated. Thanks to Stereolabs for the ZED, to Walt (with a Tango coupon discount) and Google for the Google Tango Development Kits, and to all contributors (for donations, reporting bugs, helping me fix bugs, or making pull requests).
New tutorial: Multi-Session Mapping with RTAB-Map Tango
- Version 0.11.14 : Visit the release page for more info!
- Tango app also updated:
Application example: See how RTAB-Map is helping nuclear dismantling with Areva’s MANUELA project (Mobile Apparatus for Nuclear Expertise and Localisation Assistance):
Version 0.11.11: Visit the release page for more info!
- Version 0.11.8: Visit the release page for more info!
- Version 0.11.7: Visit the release page for more info!
I’m pleased to announce that RTAB-Map is now on Project Tango. The app is available on Google Play Store.
- Version 0.10.10: Visit the release page for more info!
New example to speed up RTAB-Map’s odometry, see this page:
At IROS 2014 in Chicago, a team using RTAB-Map for SLAM won the Kinect navigation contest held during the conference. See their press release for more details: Winning the IROS2014 Microsoft Kinect Challenge. I also added the wiki page IROS2014KinectChallenge showing in detail the RTAB-Map part used in their solution.
Here is a comparison between reality and what can be shown in RViz (you can reproduce this demo here):
Added the Setup on your robot wiki page to learn how to integrate RTAB-Map on your ROS robot. Multiple sensor configurations are shown; the optimal configuration is a 2D laser, a Kinect-like sensor, and odometry.
Onboard mapping | Remote mapping
I’m glad to announce that my paper submitted to IROS 2014 was accepted! This paper explains in detail how RGB-D mapping with RTAB-Map is done. Results shown in this paper can be reproduced with the Multi-session mapping tutorial:
Appearance-based loop closure detection
More loop closure detection videos here.