Build Your Own Autonomous Drone Part 3: Using SLAM in GPS-Denied Environments for Position Estimation

In this tutorial we will talk about how to get accurate position estimates from SLAM (Simultaneous Localization and Mapping) and control your drone in a GPS-denied environment.

GPS can provide the drone with an accurate position estimate, but there are cases where the GPS signal is unavailable or unstable, such as under a bridge, inside a room, or in a densely built-up city. To let the drone fly in these environments we need another source of position estimation, such as SLAM. SLAM uses a camera as its primary sensor, which can be a monocular, stereo, or RGB-D camera. Each has its own advantages and disadvantages: a monocular camera is cheap and small, but it cannot estimate depth without the help of other sensors; a stereo camera can estimate depth, but it is computationally intensive and requires good calibration; an RGB-D camera can estimate depth directly, but it has a narrow range, is noisy, and is prone to errors caused by direct sunlight.
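For intuition on why a stereo camera can recover depth while a monocular one cannot: depth follows directly from the disparity between the two views, given the focal length and the baseline between the cameras. A minimal sketch (the numbers below are illustrative, not from any specific camera):

```python
# Stereo depth from disparity: Z = f * B / d
# f: focal length in pixels, B: baseline between the two cameras in meters,
# d: horizontal pixel disparity of the same point in the left vs right image.

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return depth in meters; a larger disparity means the point is closer."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (point visible in both views)")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: f = 376 px, B = 0.1 m, disparity = 10 px
print(stereo_depth(376.0, 0.1, 10.0))  # -> 3.76 (meters)
```

A monocular camera has no baseline (B = 0), which is exactly why it needs other sensors to recover scale.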

This tutorial has been validated on a real drone, but the performance is not yet stable and there is plenty of room for improvement. Therefore, I recommend proceeding with extreme caution and not testing on a real drone until you are ready. In this tutorial we will test everything in Gazebo; the setup procedure is essentially the same as for a real drone.

This tutorial is organized into the following parts:

  1. Environment update and setup;

  2. Building SLAM and its usage;

  3. Test it in Gazebo.

1. Environment Update and Setup

If you have finished the previous tutorials, you have already set up ROS, MAVROS, Gazebo and the PX4 firmware. Now, change directory to where you put GAAS, or clone a clean copy by:

git clone

Or update GAAS by:

git pull origin master

Copy ROS launch files to PX4 firmware so you can launch the simulation environment in a new terminal:

cp -r (GAAS_PATH)/simulator/launch/* (PX4_FIRMWARE_PATH)/Firmware/launch

We have updated the Gazebo simulation models for the drone; you will need to remove the old ones from "(PX4_FIRMWARE_PATH)/Firmware/Tools/sitl_gazebo/models" and copy the new ones by:

cp -r (GAAS_PATH)/simulator/models/* (PX4_FIRMWARE_PATH)/Firmware/Tools/sitl_gazebo/models/

a. Build SLAM

SLAM used by GAAS is based on the following project:

Firstly, install the following dependencies:

Mavros msgs: sudo apt install ros-kinetic-mavros-msgs
Pangolin (for visualization):
Eigen3: sudo apt-get install libeigen3-dev
g2o: sudo apt-get install libcxsparse-dev libqt4-dev libcholmod3.0.6 libsuitesparse-dev qt4-qmake
glog (for logging): sudo apt-get install libgoogle-glog-dev
ROS Octomap: sudo apt-get install ros-kinetic-octomap-*

Next, install PCL, DBoW3, g2o and OpenCV by:

# DBoW3
# g2o
# OpenCV, a version higher than 3.4.5 is recommended; you'll need opencv_contrib

Then, change directory to SLAM/ygz_slam_ros and build it by:

cd (GAAS_PATH)/software/SLAM/ygz_slam_ros
# it might take a while

b. Download QGroundControl

We will be using QGroundControl to set up and visualize drone parameters and states. To download it, visit:

Select your OS platform and configure it according to the description stated in the link.

If you have successfully built ygz_slam_ros and downloaded QGroundControl, you can continue to the next step.

2. SLAM and Its Usage

In ygz_slam_ros/examples you will find the config file used by SLAM, called simulationCamera.yaml. The config file includes the left and right image topic names, the camera intrinsics, and other parameters. To give you a glimpse of it:

# Camera Parameters. Adjust them!
# camera and imu topics
Left: /gi/simulation/left/image_raw
Right: /gi/simulation/right/image_raw
# not used for now
Imu: /mavros/imu/data
# Camera calibration and distortion parameters (OpenCV)
# if running in pure stereo vision mode
PureVisionMode: true
# do we need visualization?
UseViewer: false
# display mappoints?
displayMapPoints: false
Camera.fx: 376.0
Camera.fy: 376.0
Camera.cx: 376.0
Camera.cy: 240.0
Camera.k1: 0.0
Camera.k2: 0.0
Camera.p1: 0.0
Camera.p2: 0.0

Feel free to modify the camera topic names if you'd like to test on your own device. Note that the IMU is not used for now; IMU fusion is still a work in progress.
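When you do edit the config, a quick sanity check helps catch typos before launching SLAM. A minimal sketch that parses `key: value` lines from a file like simulationCamera.yaml (hand-rolled parsing to stay dependency-free; the real file is ordinary YAML and could equally be read with PyYAML):

```python
# Minimal "key: value" parser for a config like simulationCamera.yaml,
# plus a sanity check on the camera intrinsics.

def parse_config(text: str) -> dict:
    cfg = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, _, value = line.partition(":")
        cfg[key.strip()] = value.strip()
    return cfg

def check_intrinsics(cfg: dict) -> None:
    # fx and fy must be present and positive for a usable pinhole model
    for k in ("Camera.fx", "Camera.fy"):
        assert float(cfg[k]) > 0, f"{k} must be positive"

sample = """
# camera and imu topics
Left: /gi/simulation/left/image_raw
Right: /gi/simulation/right/image_raw
Camera.fx: 376.0
Camera.fy: 376.0
"""

cfg = parse_config(sample)
check_intrinsics(cfg)
print(cfg["Left"])  # -> /gi/simulation/left/image_raw
```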

3. Test It in Gazebo

Now, let's cut to the chase and start testing in Gazebo.

Firstly, let's launch a simulation environment by:

roslaunch px4 slam.launch

A Gazebo window will pop up with the drone centered in an empty world. As I mentioned in previous tutorials, remember to check the MAVROS connection status by:

rostopic echo /mavros/state

Make sure it reports "connected: True".

If Gazebo is not publishing images, try: sudo apt install ros-kinetic-gazebo-control

Before continuing, if you take a look at slam.launch you will find the following argument:

<arg name="vehicle" default="iris_stereo_gray_no_gps"/>

You can see I am using the "iris_stereo_gray_no_gps" drone model for the simulation; in this model I have disabled the GPS plugin, so there is no GPS signal in the simulation. If you have downloaded QGroundControl, you can start it in a terminal:

./QGroundControl.AppImage # if you downloaded AppImage
Fig 1, QGroundControl

QGroundControl will automatically connect to the Gazebo simulation, and in the window you will see "No GPS Lock for Vehicle", which means GPS is deactivated and SLAM will be the only source of external position estimation.

Now let's start SLAM. In a terminal, start SLAM by:

./bin/EurocStereoVIO_ros ./examples/simulationCamera.yaml

A window will appear in which you can visualize the left camera image in real time. If there are colored dots on the image, SLAM has initialized; otherwise, you need to update the Gazebo drone models as described in the first part.

Fig 2, SLAM initialized

To enable vision as the source of external position estimation, click the gear button at the top left, select "Parameters", type "vision" into the search box and tick only "vision position fusion". Click "Save" to save the modification.

Fig 3, QGroundControl, select vision position fusion

We will use SLAM only for horizontal position estimation, neglecting SLAM yaw and using the barometer for the drone's height estimation.
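Conceptually, the fused external estimate takes x and y from SLAM and z from the barometer. A toy illustration of that split (the real fusion happens inside PX4's estimator once "vision position fusion" is ticked; this is not code you need to run):

```python
# Toy illustration: horizontal position from SLAM, height from the barometer.
# PX4's estimator does the real fusion onboard; this only shows which axis
# comes from which source in our setup.

def fuse_position(slam_xyz, baro_alt_m):
    """slam_xyz: (x, y, z) from SLAM; baro_alt_m: barometric altitude."""
    x, y, _slam_z = slam_xyz  # SLAM z (and yaw) are ignored in our setup
    return (x, y, baro_alt_m)

print(fuse_position((1.2, -0.5, 0.9), 1.0))  # -> (1.2, -0.5, 1.0)
```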

Next, hit "Clear", select "Tools" and choose "Reboot Vehicle" for the modifications to take effect.

Strictly speaking, you don't need to reboot the vehicle for parameter changes to take effect in SITL, but it is required in field tests; I included this step for completeness and safety.

In a terminal, type:

rostopic echo /mavros/vision_pose/pose

You will see the current SLAM position estimate, which looks like:

header: 
  seq: 1489
  stamp: 
    secs: 3453
    nsecs: 368000000
  frame_id: ''
pose: 
  position: 
    x: 0.000944685346145
    y: -0.00012923774903
    z: 0.000286279467091
  orientation: 
    x: 0.0
    y: 0.0
    z: 0.0
    w: 0.0

Now, change directory to (GAAS_PATH)/demo/tutorial_3 and take off the drone:


In another terminal, command the drone to fly a square of size 3x3 meters:


You will find the drone achieves good position estimation and is able to use SLAM to accomplish a complex task.
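Under the hood, the flight script essentially publishes position setpoints for the corners of the square. A hedged sketch of the waypoint list such a script might use (corner order and hover height here are illustrative choices, not the actual GAAS demo script):

```python
# Generate the corner waypoints of a 3x3 m square at a fixed height,
# starting and ending at the takeoff point. A demo script would publish
# setpoints like these over MAVROS, one by one.

def square_waypoints(side_m: float, height_m: float):
    corners = [(0.0, 0.0), (side_m, 0.0), (side_m, side_m), (0.0, side_m)]
    # close the loop by returning to the starting corner
    return [(x, y, height_m) for x, y in corners + [corners[0]]]

for wp in square_waypoints(3.0, 2.0):
    print(wp)  # five setpoints: four corners plus the return to start
```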

Fig 4, SLAM position history

After completing all the tasks, the drone will land on the ground. You can visualize the flight path in the above window.

Although the procedure above has been validated on a real drone, the performance is not stable, and a fair amount of parameter tuning and SLAM improvement is required before good performance can be achieved. Some problems we found while testing:

  1. SLAM occasionally crashes with segmentation faults, causing the drone to switch to ALTITUDE mode mid-flight;

  2. There are cases where the drone gradually drifts away even though SLAM is running and the relevant parameters are properly set;

  3. Poor performance under large and sudden movements.

If you'd like to test it on a real drone, typical steps are:

  1. start SLAM;

  2. start MAVROS and make sure it is connected to the Drone;

  3. set relative parameters with QGroundControl;

  4. check that /mavros/local_position/pose is publishing meaningful position information. This topic fuses the position output from /mavros/vision_pose/pose, so if its output is, for example, 1e-20, the SLAM output is not being delivered successfully and you have to check the MAVROS state, the corresponding parameters, and SLAM.

  5. take off the drone either with your RC transmitter, setting the drone to "POSITION" mode, or with the provided Python scripts.
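Step 4 above can be automated with a small check on the numbers coming out of /mavros/local_position/pose. A sketch of the kind of test you might run on the received coordinates (the noise threshold is an illustrative choice):

```python
# Detect a degenerate pose output (e.g. values around 1e-20), which means
# the SLAM estimate is not actually reaching PX4's estimator.

def pose_is_meaningful(x: float, y: float, z: float, eps: float = 1e-9) -> bool:
    """A real estimate should have at least one coordinate above noise level."""
    return any(abs(v) > eps for v in (x, y, z))

print(pose_is_meaningful(1e-20, 1e-20, 1e-20))      # -> False: fusion not working
print(pose_is_meaningful(0.0009, -0.0001, 0.0003))  # -> True: looks alive
```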

Do not test on a real drone before you have tested everything in Gazebo.

To sum up, in this tutorial we have shown how to build SLAM, how to use QGroundControl with the Gazebo simulation, and how to carry out a task without GPS information.