A compilation of concepts I want to remember...


# Autonomous Intelligent Systems #3: Robot Mapping

10 Nov 2019 » ais, ekf

Continuing the self-study exercise of working through the Robot Mapping course offered by the Autonomous Intelligent Systems lab at the University of Freiburg.

As with past related exercises, I am completing the programming tasks in Python rather than MATLAB.

GitHub

Before attempting the problem set (sheet), complete the slides and recording on the following topic.

1. EKF SLAM [slides][recording]

### Sheet 4

#### Exercise 1: Implement the prediction step of the EKF SLAM algorithm.

This was relatively straightforward, as the task just requires updating the relevant indices of the state with the motion model and the corresponding blocks of the covariance with the Jacobian of the motion model. The implementation can be found in prediction_step.py.
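A minimal sketch of that prediction step, assuming an odometry motion model with components `r1`, `t`, `r2` and a state layout `[x, y, theta, m1x, m1y, ...]` (the noise values and dictionary keys are my assumptions, not the assignment's):

```python
import numpy as np

def prediction_step(mu, sigma, u):
    """EKF SLAM prediction with an odometry motion model (sketch).

    mu:    (3 + 2N,) state vector [x, y, theta, m1x, m1y, ...]
    sigma: (3 + 2N, 3 + 2N) covariance
    u:     odometry reading {'r1': rot1, 't': trans, 'r2': rot2}
    """
    theta = mu[2]
    mu = mu.copy()
    # Apply the noise-free motion model to the robot pose only;
    # the landmark entries of the state are untouched.
    mu[0] += u['t'] * np.cos(theta + u['r1'])
    mu[1] += u['t'] * np.sin(theta + u['r1'])
    mu[2] += u['r1'] + u['r2']

    # Jacobian of the motion model w.r.t. the robot pose (3x3 block).
    Gx = np.eye(3)
    Gx[0, 2] = -u['t'] * np.sin(theta + u['r1'])
    Gx[1, 2] =  u['t'] * np.cos(theta + u['r1'])

    # Additive motion noise on the pose (assumed values).
    R3 = np.diag([0.1, 0.1, 0.01])

    # Only the covariance blocks involving the robot pose change.
    sigma = sigma.copy()
    sigma[:3, :3] = Gx @ sigma[:3, :3] @ Gx.T + R3
    sigma[:3, 3:] = Gx @ sigma[:3, 3:]
    sigma[3:, :3] = sigma[:3, 3:].T
    return mu, sigma
```

Updating only the pose-related blocks, rather than multiplying the full covariance by a padded Jacobian, keeps the step linear in the number of landmarks.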

#### Exercise 2: Implement the correction step.

Run main.py, which should generate an image for each time step. The figures will be saved to the /plots directory. Then, from the /plots directory, run the following command to generate a video from the images.

\$ ffmpeg -r 10 -start_number 0 -i 'odom_%d.png' -b 500000  odom.mp4


The output should show the robot in motion, along with a visualization of the landmarks the robot is sensing at each time step.

A couple of things to note:

1. The Jacobian of the observation shown on pg. 39 may be incorrect. Specifically, the partial derivative of atan2(δy, δx) − μ_{t,θ} with respect to θ is −1 based on my derivation, as opposed to the −q shown on pg. 39. I am in the process of double-checking with the author.

2. The assignment suggested computing the Kalman gain only after the stacked Jacobian of the observation had been fully constructed for all observations at that time step. I found that this led to unstable estimates; computing the Kalman gain after each individual observation was more stable.
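On the first note, here is the derivation as I read it, under the standard range-bearing observation model (my notation, which may not match the slides' conventions):

```latex
\hat{z} =
\begin{pmatrix}
\sqrt{q} \\
\operatorname{atan2}(\delta_y, \delta_x) - \mu_{t,\theta}
\end{pmatrix},
\qquad
\delta_x = \bar{\mu}_{j,x} - \mu_{t,x}, \quad
\delta_y = \bar{\mu}_{j,y} - \mu_{t,y}, \quad
q = \delta_x^2 + \delta_y^2 .
\]
Since $\delta_x$ and $\delta_y$ do not depend on the heading,
\[
\frac{\partial}{\partial \mu_{t,\theta}}
\bigl( \operatorname{atan2}(\delta_y, \delta_x) - \mu_{t,\theta} \bigr) = -1 .
```

One possible reconciliation: in the Probabilistic Robotics convention, a common factor 1/q is pulled out of the low-dimensional Jacobian, so that entry is written as −q inside the matrix; after the 1/q scaling it is again −1.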

TODO: Implement the visualization of the probability ellipses.
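The per-observation update from the second note above can be sketched as follows. This is a hypothetical sketch, not the assignment's reference solution; the state layout `[x, y, theta, m1x, m1y, ...]` and the measurement noise `Q` are my assumptions, and the landmark is assumed to be already initialized.

```python
import numpy as np

def correct_single(mu, sigma, z, j):
    """Fold in one range-bearing observation z = (r, phi) of landmark j,
    computing the Kalman gain per observation instead of stacking the
    Jacobians for all observations at this time step.
    """
    lx, ly = mu[3 + 2*j], mu[4 + 2*j]
    dx, dy = lx - mu[0], ly - mu[1]
    q = dx**2 + dy**2
    sq = np.sqrt(q)

    # Expected measurement and innovation (normalize the bearing).
    z_hat = np.array([sq, np.arctan2(dy, dx) - mu[2]])
    nu = z - z_hat
    nu[1] = np.arctan2(np.sin(nu[1]), np.cos(nu[1]))

    # 2 x n Jacobian: nonzero only in the pose and landmark-j columns.
    n = mu.size
    H = np.zeros((2, n))
    H[:, :3] = np.array([[-sq*dx, -sq*dy, 0.0],
                         [dy,     -dx,    -q ]]) / q
    H[:, 3 + 2*j:5 + 2*j] = np.array([[sq*dx, sq*dy],
                                      [-dy,   dx   ]]) / q

    Q = np.diag([0.1, 0.01])             # assumed measurement noise
    S = H @ sigma @ H.T + Q              # innovation covariance
    K = sigma @ H.T @ np.linalg.inv(S)   # gain for this observation only
    mu = mu + K @ nu
    sigma = (np.eye(n) - K @ H) @ sigma
    return mu, sigma
```

Because each observation is folded in immediately, later observations in the same time step are linearized around the already-corrected state, which is one plausible reason this variant behaved more stably.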