SLAM on the Turtlebot3 Waffle Pi
My goal in this project was to acquire a strong understanding of Simultaneous Localization and Mapping (SLAM) by implementing an Extended Kalman Filter (EKF) from scratch and using it on the Turtlebot3 Waffle Pi robot for navigation. SLAM is a fundamental aspect of autonomy in robotics. Just as we cannot have a postal system without zip codes or a computer network without Internet Protocol addressing, we cannot have autonomous robots without SLAM. The two key questions that inspired the development of SLAM are:
- Where is the robot in the world?
- Where are the stationary/moving items in the world?
The answers to the questions above are the components of the robot's state. We wish to know this state at all times in order to accomplish tasks like planning and navigation. By definition, the state of a robot is the collection of all aspects of the robot and its environment that impact the future. Due to uncertainty in the dynamics and interaction of a robot and its environment, the state is represented with a probability density function (pdf). The challenge is then to model the propagation of this pdf as the robot moves and interacts with its environment. The proper representation and modeling of the dynamics of the pdf has been solved in a probabilistic framework using the Bayes Filter. The Kalman Filter, the Extended Kalman Filter, and Particle Filters are all unique implementations of the Bayes Filter.
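The predict/update recursion of the Bayes Filter mentioned above can be sketched for a discrete state space. This is a minimal illustration, not code from the project; the function and variable names are my own:

```python
import numpy as np

def bayes_filter_step(belief, u, z, transition, likelihood):
    """One predict/update cycle of the discrete Bayes filter.

    belief:     prior pdf over N discrete states, shape (N,)
    transition: callable u -> p(x' | x, u) matrix, shape (N, N)
    likelihood: callable z -> p(z | x') vector, shape (N,)
    """
    # Predict: propagate the belief through the motion model.
    predicted = transition(u) @ belief
    # Update: weight by the measurement likelihood and renormalize.
    posterior = likelihood(z) * predicted
    return posterior / posterior.sum()
```

The Kalman and Extended Kalman Filters perform exactly this cycle, but represent the pdf compactly as a Gaussian (mean and covariance) instead of a discrete table.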
Localization is one aspect of state estimation. Localization is the problem of determining the robot's position and orientation at all times while it moves in its environment. For a mobile robot in a 2D environment, knowing the (x, y) coordinates as well as the heading with respect to a global frame is enough information to know where the robot is in a map or world.
I implement the EKF for a simple form of Localization known as position tracking, in which the initial pose (position and orientation) of the robot is known. The robot moves in a static environment and there is some noise in the robot motion. The mobile robot takes measurements from either its odometry readings or the Inertial Measurement Unit (IMU) to correct for motion noise. I obtained similar results using either the odometry readings or the IMU sensor information.
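The predict and correct steps of the EKF used for position tracking can be sketched generically as follows. This is an illustrative outline, not the project's exact code; `g`, `G`, `h`, `H`, `R`, and `Q` stand for the motion model, measurement model, their Jacobians, and the noise covariances:

```python
import numpy as np

def ekf_predict(mu, Sigma, g, G, R):
    """EKF prediction: push the Gaussian belief through the
    (linearized) motion model g with Jacobian G and noise R."""
    mu_bar = g(mu)
    Sigma_bar = G @ Sigma @ G.T + R
    return mu_bar, Sigma_bar

def ekf_update(mu_bar, Sigma_bar, z, h, H, Q):
    """EKF correction: fuse measurement z using the (linearized)
    measurement model h with Jacobian H and noise Q."""
    S = H @ Sigma_bar @ H.T + Q                 # innovation covariance
    K = Sigma_bar @ H.T @ np.linalg.inv(S)      # Kalman gain
    mu = mu_bar + K @ (z - h(mu_bar))           # innovation-weighted correction
    Sigma = (np.eye(len(mu)) - K @ H) @ Sigma_bar
    return mu, Sigma
```

For position tracking, `mu` is the pose (x, y, theta), the prediction comes from the velocity commands, and the correction comes from odometry or IMU measurements.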
Below is a short clip showing how the robot's estimated x and y positions match the robot's true pose.
Simultaneous Localization and Mapping
Simultaneous Localization and Mapping (SLAM) is another aspect of state estimation and is more complex than Localization. SLAM is concerned with concurrently determining the robot's pose as well as the location of objects in the world (the map). As humans, we solve the SLAM problem constantly without thinking: we repeatedly move and observe our surroundings while building a mental map of the world. As such, SLAM consists of two main operations:
- Motion: I implement the velocity motion model
- Measurement: I utilize lidar range measurements to obtain the bearing and range to detected objects
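The two operations above can be sketched in a few lines. The velocity motion model below is the standard closed-form version (circular-arc motion under constant linear velocity `v` and angular velocity `w`), and the measurement function converts a known landmark position into the expected lidar range and bearing; the names are illustrative rather than taken from the project:

```python
import numpy as np

def velocity_motion(pose, v, w, dt):
    """Velocity motion model: advance pose (x, y, theta) under
    linear velocity v and angular velocity w for dt seconds."""
    x, y, theta = pose
    if abs(w) < 1e-6:                       # straight-line limit (w ~ 0)
        return np.array([x + v * dt * np.cos(theta),
                         y + v * dt * np.sin(theta),
                         theta])
    r = v / w                               # turning radius of the arc
    return np.array([x - r * np.sin(theta) + r * np.sin(theta + w * dt),
                     y + r * np.cos(theta) - r * np.cos(theta + w * dt),
                     theta + w * dt])

def range_bearing(pose, landmark):
    """Expected range and bearing to a landmark from the given pose,
    as a lidar-based detector would report them."""
    dx, dy = landmark[0] - pose[0], landmark[1] - pose[1]
    return np.array([np.hypot(dx, dy),
                     np.arctan2(dy, dx) - pose[2]])
```

In EKF SLAM, the state vector additionally stacks the landmark positions alongside the robot pose, so each range-bearing measurement corrects both the pose estimate and the map.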
Spurious Landmarks, Particle Filters and Beyond
While the SLAM problem has been solved in a probabilistic framework, it still poses challenges when implemented from scratch. I will continue to work on making the EKF SLAM implementation more robust to handling many more features. In addition, the robot's pose estimate diverges from the ground truth much more than in the EKF Localization implementation.
Particle filters are an exciting and widely used variant of the Bayes Filter for state estimation. I will continue to work on robust implementations of the particle filter in future quarters.
For more information on how the EKF works, take a look at the README.
- Lidar data usage
- Probabilistic State Estimation Algorithms
- GitHub Use