Autonomous Drone Software E03: Using SLAM In GPS Denied Environment For Position Estimation

In this tutorial we will discuss how to obtain accurate position estimates from SLAM (Simultaneous Localization and Mapping) and control your drone in a GPS-denied environment.

Give us a Star 🌟 on GitHub if you find this tutorial useful.

Help us understand what you want to learn in upcoming tutorials by filling out this three-question survey.

---------------------

We suggest developers install the GAAS mirror image to use the project:

GAAS v0.7 Release Mirror - x64

Should you have any questions or suggestions about the tutorials or GAAS itself, let us know and join our Slack.

GPS can provide the drone with accurate position estimates, but there are cases where the GPS signal is unavailable or unstable, such as under a bridge, inside a room, or in a dense urban environment. To enable the drone to fly in these environments, we need to provide another source of position estimation, such as SLAM. SLAM uses a camera as its primary sensor, which can be a monocular, stereo, or RGB-D camera. Each has its advantages and disadvantages: a monocular camera is cheap and small but cannot estimate the depth of the environment without the help of other sensors; a stereo camera can estimate depth but is computationally intensive and requires good calibration; an RGB-D camera can also estimate depth but has a narrow range, suffers from noise, and is prone to errors caused by direct sunlight.

This tutorial has been validated on a real drone, but the performance is not stable and there is plenty of room for improvement. Therefore, I recommend proceeding with extreme caution and not testing it on a real drone until you are ready. In this tutorial we will test it in Gazebo; the setup procedure is essentially the same as on a real drone.

This tutorial is organized into the following parts:

  1. Environment update and setup;

  2. Building SLAM and its usage;

  3. Test it in Gazebo.

1. Environment Update and Setup

If you have finished the previous tutorials, you have already set up ROS, MAVROS, Gazebo, and the PX4 firmware. Now, change directory to where you put GAAS, or clone a clean copy by:

git clone https://github.com/generalized-intelligence/GAAS

Or update GAAS by:

cd (GAAS_PATH)
git pull origin master

Copy the ROS launch files to the PX4 firmware so that you can launch the simulation environment in a new terminal:

cp -r (GAAS_PATH)/simulator/launch/* (PX4_FIRMWARE_PATH)/Firmware/launch

We have updated the Gazebo simulation models for the drones; remove the old models from "(PX4_FIRMWARE_PATH)/Firmware/Tools/sitl_gazebo/models" and recopy them by:

cp -r (GAAS_PATH)/simulator/models/* (PX4_FIRMWARE_PATH)/Firmware/Tools/sitl_gazebo/models/

a. Build SLAM

The SLAM used by GAAS is based on the following project:

https://github.com/gaoxiang12/ygz-stereo-inertial

Firstly, install the following dependencies:

Mavros msgs: sudo apt install ros-kinetic-mavros-msgs
Pangolin (for visualization): https://github.com/stevenlovegrove/Pangolin
Eigen3: sudo apt-get install libeigen3-dev
g2o: sudo apt-get install libcxsparse-dev libqt4-dev libcholmod3.0.6 libsuitesparse-dev qt4-qmake
glog (for logging): sudo apt-get install libgoogle-glog-dev
ROS Octomap: sudo apt-get install ros-kinetic-octomap-*

Replace ros-kinetic- with ros-melodic- if you are using Ubuntu 18.04 LTS with ROS Melodic.

You may install these dependencies (except Pangolin) with the following commands:

sudo apt-get install libgoogle-glog-dev
sudo apt-get install ros-kinetic-octomap-*
sudo apt-get install libglew-dev
sudo apt-get install libxkbcommon-x11-dev
sudo apt-get install libqt4-dev libsuitesparse-dev qt4-qmake
#For Ubuntu16.04
sudo apt-get install libqglviewer-dev libcholmod3.0.6 
#For Ubuntu18.04
sudo apt-get install libqglviewer-headers libcholmod3 

You need to compile and install Pangolin from source.

Next, install PCL, DBoW3, g2o, and OpenCV from the following repositories:

# PCL
https://github.com/PointCloudLibrary/pcl

# DBoW3
https://github.com/rmsalinas/DBow3

# g2o
https://github.com/RainerKuemmerle/g2o

# OpenCV, version 3.4.5 or higher recommended
https://github.com/opencv/opencv

To make sure the OpenCV build includes the opencv_cvv module, first clone opencv_contrib:

https://github.com/opencv/opencv_contrib

Enter the opencv directory and check out version 3.4.5 or higher; note that you should also check out opencv_contrib to the same version:

git checkout 3.4.5

Then install cmake-gui and use it to configure the build:

sudo apt-get install cmake-qt-gui
cmake-gui

Set where to build the binaries and click the Configure button.

Set OPENCV_EXTRA_MODULES_PATH to the path of your opencv_contrib clone, and enable WITH_QT and ENABLE_CXX11.

Click Configure again; you will now see the option BUILD_opencv_cvv. Enable it and click the Generate button to generate the Makefile, then build with that Makefile.

Note that clicking the Generate button only generates the Makefile in the build path you set; you still need to run the build manually: make && sudo make install

Since ROS ships with an older version of OpenCV, to prevent CMake from selecting the wrong OpenCV version you need to edit the SLAM's CMake configuration:

cd (GAAS_PATH)/software/SLAM/ygz_slam_ros
gedit CMakeLists.txt

At line 53:

find_package(OpenCV REQUIRED)

Add your version number after OpenCV (keeping the space); replace 3.4.5 if you have installed a higher version:

find_package(OpenCV 3.4.5 REQUIRED)

Then, change directory to SLAM/ygz_slam_ros and build it by:

cd (GAAS_PATH)/software/SLAM/ygz_slam_ros

#it might take a while
sh generate.sh 

On Ubuntu 18.04 LTS you may see some warning messages during the build; you can safely ignore them.

b. Download QGroundControl

We will be using QGroundControl to set up and visualize drone parameters and states. To download it, visit:

https://docs.qgroundcontrol.com/en/getting_started/download_and_install.html

Select your OS platform and install it according to the instructions in the link.

If you have successfully built ygz_slam_ros and downloaded QGroundControl, you can continue to the next step.

2. SLAM and Its Usage

In ygz_slam_ros/examples, you will find the config file used by SLAM, called simulationCamera.yaml. It includes the left and right image topic names, the camera intrinsics, and other parameters that will be used. To give you a glimpse of it:

%YAML:1.0

#--------------------------------------------------------------------------------------------
# Camera Parameters. Adjust them!
#--------------------------------------------------------------------------------------------


# camera and imu topics
Left: /gi/simulation/left/image_raw
Right: /gi/simulation/right/image_raw

# not used for now
Imu: /mavros/imu/data

# Camera calibration and distortion parameters (OpenCV) 
# if running in pure stereo vision mode
PureVisionMode: true

# do we need visualization?
UseViewer: false

# display mappoints?
displayMapPoints: false

Camera.fx: 376.0
Camera.fy: 376.0
Camera.cx: 376.0
Camera.cy: 240.0

Camera.k1: 0.0
Camera.k2: 0.0
Camera.p1: 0.0
Camera.p2: 0.0

Feel free to modify the camera topic names if you'd like to test it on your own device. Note that the IMU is not used for now; IMU fusion is still a work in progress.
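
If you do change the topic names, it is worth confirming that your camera driver (or the simulator) is actually publishing on them before starting SLAM. Below is a minimal sketch, assuming the default topic names shown above; adjust them to match your own config:

#!/usr/bin/env python
# Minimal sketch: confirm the stereo topics from simulationCamera.yaml are publishing.
# The topic names below are the defaults shown above; change them if you edited the config.
import rospy
from sensor_msgs.msg import Image

LEFT_TOPIC = "/gi/simulation/left/image_raw"
RIGHT_TOPIC = "/gi/simulation/right/image_raw"

if __name__ == "__main__":
    rospy.init_node("check_stereo_topics")
    for topic in (LEFT_TOPIC, RIGHT_TOPIC):
        try:
            msg = rospy.wait_for_message(topic, Image, timeout=5.0)
            rospy.loginfo("%s OK: received a %dx%d image", topic, msg.width, msg.height)
        except rospy.ROSException:
            rospy.logwarn("no image on %s within 5 s; check the camera driver and topic name", topic)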

3. Test It in Gazebo

Now, let's cut to the chase and test it in Gazebo.

First, launch the simulation environment by:

roslaunch px4 slam.launch

A Gazebo window will pop up with the drone at the center of an empty world. As mentioned in previous tutorials, remember to check the MAVROS connection status by:

rostopic echo /mavros/state

Make sure "connected: True".

If Gazebo is not publishing images, try: sudo apt install ros-kinetic-gazebo-control

Before continuing, if you take a look at slam.launch you will find the following argument:

<arg name="vehicle" default="iris_stereo_gray_no_gps"/>

As you can see, I am using the "iris_stereo_gray_no_gps" drone model for the simulation; in this model the GPS plugin is disabled, so there is no GPS signal in the simulation. If you have downloaded QGroundControl, start it in a terminal:

./QGroundControl.AppImage # if you downloaded AppImage

QGroundControl will automatically connect to the Gazebo simulation, and in the window you will see "No GPS Lock for Vehicle", which means GPS is deactivated and SLAM will be the only source of external position estimation.

While running Gazebo, if everything works properly apart from the error message above, the error message itself DOES NOT affect Gazebo's performance.

If you happen to know how to resolve this error message, please submit a PR on GitHub to help us make it go away.

Now let's start SLAM. In a terminal, run:

./bin/EurocStereoVIO_ros ./examples/simulationCamera.yaml

or alternatively run:

sh run.sh

A window will appear in which you can visualize the left camera image in real time. If you see dots of different colors on the image, SLAM has initialized; otherwise, you'll need to update the Gazebo drone models as described in the first part.

To enable vision as the source of external position estimation, click the gear button at the top left, select "Parameters", type "vision" into the search box, and tick "vision position fusion" only. Click "Save" to save the modification.

We will be using SLAM for horizontal position estimation only, ignoring the SLAM yaw and using the barometer for height estimation.

Next, hit "Clear" , select "Tools" and choose "reboot vehicle" to make modifications take effect.

You don't actually need to reboot the vehicle for parameter changes to take effect in SITL, but it is required in a field test; I include this step for completeness and safety.
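
If you would rather set these parameters from a script than through QGroundControl, MAVROS exposes a parameter service. The sketch below is an assumption based on a typical PX4 EKF2 setup, where the QGroundControl checkboxes map to EKF2_AID_MASK (value 8 = vision position fusion only) and EKF2_HGT_MODE (0 = barometric height); verify both names and values against your firmware version before relying on them, especially on a real drone:

#!/usr/bin/env python
# Minimal sketch: set EKF2 parameters through the MAVROS parameter service.
# Parameter names/values are assumptions for a typical PX4 EKF2 setup
# (EKF2_AID_MASK = 8 -> vision position fusion only, EKF2_HGT_MODE = 0 -> barometer);
# verify them in QGroundControl for your firmware version.
import rospy
from mavros_msgs.srv import ParamSet
from mavros_msgs.msg import ParamValue

def set_int_param(name, value):
    rospy.wait_for_service("/mavros/param/set")
    set_srv = rospy.ServiceProxy("/mavros/param/set", ParamSet)
    resp = set_srv(param_id=name, value=ParamValue(integer=value, real=0.0))
    rospy.loginfo("set %s = %d -> success: %s", name, value, resp.success)

if __name__ == "__main__":
    rospy.init_node("set_vision_fusion_params")
    set_int_param("EKF2_AID_MASK", 8)   # use vision position fusion only
    set_int_param("EKF2_HGT_MODE", 0)   # barometric height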

In a terminal, type:

rostopic echo /mavros/vision_pose/pose

You will see that the current SLAM position estimate looks like this:

---
header: 
  seq: 1489
  stamp: 
    secs: 3453
    nsecs: 368000000
  frame_id: ''
pose: 
  position: 
    x: 0.000944685346145
    y: -0.00012923774903
    z: 0.000286279467091
  orientation: 
    x: 0.0
    y: 0.0
    z: 0.0
    w: 0.0
---
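
Besides eyeballing rostopic echo, you can confirm that SLAM estimates keep arriving at a reasonable rate (rostopic hz /mavros/vision_pose/pose does the same job from the shell). A minimal sketch:

#!/usr/bin/env python
# Minimal sketch: measure how often SLAM pose estimates arrive on /mavros/vision_pose/pose.
import rospy
from geometry_msgs.msg import PoseStamped

class RateMonitor(object):
    def __init__(self):
        self.count = 0
        rospy.Subscriber("/mavros/vision_pose/pose", PoseStamped, self.callback)

    def callback(self, msg):
        self.count += 1

if __name__ == "__main__":
    rospy.init_node("vision_pose_rate")
    monitor = RateMonitor()
    rate = rospy.Rate(0.2)  # report every 5 seconds
    while not rospy.is_shutdown():
        rate.sleep()
        rospy.loginfo("received %d vision poses in the last 5 s (~%.1f Hz)",
                      monitor.count, monitor.count / 5.0)
        monitor.count = 0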

Now, change directory to (GAAS_PATH)/demo/tutorial_3 and take off the drone:

python px4_mavros_run.py

In another terminal, command the drone to fly a square of size 3x3 meters:

python square.py

You will find that the drone achieves good position estimation and is able to use SLAM to successfully carry out a more complex task.

After completing all the tasks, the drone will land. You can visualize the flight path in the window above.
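
For reference, the sketch below shows roughly how such a square trajectory can be commanded through standard MAVROS interfaces. It is not the repository's square.py: the waypoints, altitude, and timing are illustrative assumptions, and it simply follows the usual OFFBOARD workflow (stream setpoints, switch to OFFBOARD, arm, then walk through the corners). Only run something like this in simulation.

#!/usr/bin/env python
# Simplified sketch of an OFFBOARD square flight via MAVROS setpoints.
# NOT the repository's square.py; waypoints, altitude, and timing are illustrative only.
import rospy
from geometry_msgs.msg import PoseStamped
from mavros_msgs.srv import CommandBool, SetMode

def make_setpoint(x, y, z):
    sp = PoseStamped()
    sp.header.frame_id = "map"
    sp.pose.position.x, sp.pose.position.y, sp.pose.position.z = x, y, z
    sp.pose.orientation.w = 1.0
    return sp

if __name__ == "__main__":
    rospy.init_node("square_offboard_sketch")
    sp_pub = rospy.Publisher("/mavros/setpoint_position/local", PoseStamped, queue_size=10)
    rospy.wait_for_service("/mavros/cmd/arming")
    rospy.wait_for_service("/mavros/set_mode")
    arm = rospy.ServiceProxy("/mavros/cmd/arming", CommandBool)
    set_mode = rospy.ServiceProxy("/mavros/set_mode", SetMode)

    rate = rospy.Rate(20)
    takeoff = make_setpoint(0, 0, 2)

    # PX4 requires a stream of setpoints before it will accept OFFBOARD mode.
    for _ in range(100):
        takeoff.header.stamp = rospy.Time.now()
        sp_pub.publish(takeoff)
        rate.sleep()

    set_mode(custom_mode="OFFBOARD")
    arm(True)

    # Corners of a 3 x 3 m square at 2 m altitude, holding each for roughly 10 s.
    waypoints = [(0, 0, 2), (3, 0, 2), (3, 3, 2), (0, 3, 2), (0, 0, 2)]
    for x, y, z in waypoints:
        target = make_setpoint(x, y, z)
        for _ in range(200):
            if rospy.is_shutdown():
                break
            target.header.stamp = rospy.Time.now()
            sp_pub.publish(target)
            rate.sleep()

    set_mode(custom_mode="AUTO.LAND")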

Although the procedure above has been validated on a real drone, the performance is not stable, and a fair amount of parameter tuning and SLAM improvement is needed before good performance can be achieved. Some problems we found while testing:

  1. SLAM occasionally crashes with a segmentation fault, which causes the drone to fall back to ALTITUDE mode while flying;

  2. In some cases the drone gradually drifts away even though SLAM is running and the relevant parameters are properly set;

  3. Poor performance during large and sudden movements.

If you'd like to test it on a real drone, typical steps are:

  1. start SLAM;

  2. start MAVROS and make sure it is connected to the drone;

  3. set the relevant parameters with QGroundControl;

  4. check that /mavros/local_position/pose is publishing meaningful position information. This topic fuses the position output from /mavros/vision_pose/pose, so if its output is, for example, on the order of 1e-20, the SLAM output is not being fused successfully and you need to check the state of MAVROS, the corresponding parameters, and the SLAM itself (see the sketch after this list);

  5. take off the drone, either with your RC controller after switching to "POSITION" mode, or by using the provided Python scripts.
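
The sketch mentioned in step 4 could look like the following: it subscribes to both topics and prints them side by side, so you can see at a glance whether the EKF output in /mavros/local_position/pose actually follows the SLAM estimate instead of sitting near zero:

#!/usr/bin/env python
# Minimal sketch for step 4: compare the fused EKF pose with the raw SLAM pose.
import rospy
from geometry_msgs.msg import PoseStamped

latest = {"vision": None, "local": None}

def make_cb(key):
    def cb(msg):
        latest[key] = msg.pose.position
    return cb

if __name__ == "__main__":
    rospy.init_node("check_vision_fusion")
    rospy.Subscriber("/mavros/vision_pose/pose", PoseStamped, make_cb("vision"))
    rospy.Subscriber("/mavros/local_position/pose", PoseStamped, make_cb("local"))
    rate = rospy.Rate(1)
    while not rospy.is_shutdown():
        rate.sleep()
        v, l = latest["vision"], latest["local"]
        if v is None or l is None:
            rospy.logwarn("waiting for both topics (vision: %s, local: %s)",
                          v is not None, l is not None)
            continue
        rospy.loginfo("vision (%.2f, %.2f, %.2f) | local (%.2f, %.2f, %.2f)",
                      v.x, v.y, v.z, l.x, l.y, l.z)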

Do not test on a real drone before you have tested everything in Gazebo.

To sum up, in this tutorial we have shown how to build the SLAM, how to use QGroundControl with the Gazebo simulation, and how to carry out a task without GPS information.

If you find any bugs or need more help, please let us know either by opening an issue or through our Facebook group.

Give us a Star 🌟 on GitHub if you find this tutorial useful.

Support us on Patreon.

Help us understand what you want to learn in upcoming tutorials by filling out this three-question survey.
