Autonomous Drone Software E04: Depth Estimation, Octomap and Path Planning

In this tutorial we will walk through the process of using a stereo camera and Octomap for environment perception, and A* for path finding in an unknown environment.

Give us a Star 🌟 on GitHub if you find this tutorial useful.

Help us understand what you want to learn in the upcoming tutorials by filling out this three-question survey.

---------------------

We suggest developers install the GAAS mirror to use the project:

GAAS v0.7 Release Mirror - x64

Should you have any questions or suggestions about the tutorials or GAAS itself, let us know and join our Slack.

In the previous tutorials, we talked about how to control your drone with Python and how to enable your drone to fly in GPS-denied environments, which is a big step towards an autonomous drone. Now, we will talk about how to enable your drone to fly from point A to point B in an unknown environment, with the help of only a pair of stereo cameras, in the Gazebo simulator. This tutorial is organized as follows:

  1. Obstacle distance estimation with Stereo Camera;

  2. Octomap as a way to represent the environment;

  3. A* path finding in 3D space;

  4. Simple path pruning;

Again, I have to stress that you will need some extra work before going to a field test, and the algorithms mentioned here are far from optimal.

Before continuing, remember to update your local GAAS and copy the related files to the corresponding folders:

cd (GAAS_PATH)
git pull origin master
cp -r (GAAS_PATH)/simulator/launch/* (PX4_FIRMWARE_PATH)/Firmware/launch
cp -r (GAAS_PATH)/simulator/worlds/* ~/catkin_ws/src/Firmware/Tools/sitl_gazebo/worlds/

1. Distance Estimation with Stereo Camera

I mentioned in the last tutorial that it is not possible to measure absolute scale with a single camera without the help of other sensors. RGB-D cameras are sensitive to sunlight, prone to noise, and expensive, so we will be using a stereo camera as the sensor for distance estimation (for the mathematical background, check Reading Materials [1]).

A C++ package based on StereoBM and a WLS disparity filter can be found in GAAS/software/Obstacle_Map. We will not go into the details, but the workflow can be summarized as follows (a minimal Python sketch of the same steps follows the list):

  1. disparity map generation with OpenCV's StereoBM method;

  2. reprojection of the disparity map to 3D points in the camera frame;

  3. transformation of the 3D points from the camera frame to the drone body frame.
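
The package the tutorial actually runs is the C++ one above; purely for illustration, here is a minimal Python sketch of steps 1 and 2 (the WLS filtering step is omitted for brevity). The image paths and the calibration numbers fx, cx, cy and baseline are placeholders; take them from your own stereo calibration.

import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # placeholder image paths
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Step 1: disparity map with StereoBM (parameters mirror config.yaml below).
stereo = cv2.StereoBM_create(numDisparities=80, blockSize=9)
stereo.setMinDisparity(3)
stereo.setPreFilterCap(7)
stereo.setUniquenessRatio(10)
stereo.setSpeckleWindowSize(7)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Step 2: reproject the disparity map to 3D points in the camera frame.
fx, cx, cy, baseline = 376.0, 376.0, 240.0, 0.12   # placeholder calibration values
Q = np.float32([[1, 0, 0, -cx],
                [0, 1, 0, -cy],
                [0, 0, 0,  fx],
                [0, 0, -1.0 / baseline, 0]])
points_cam = cv2.reprojectImageTo3D(disparity, Q)
valid = disparity > stereo.getMinDisparity()       # mask out unmatched pixels
points_cam = points_cam[valid]
# Step 3, the transform into the drone body frame, is sketched further below.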

Before building, install the dependency:

sudo apt-get install libpopt-dev

To build it:

cd (GAAS_PATH)/software/Obstacle_Map
sh generate.sh

After building it, in order to visualize how it works, let's first launch the simulation by:

roslaunch px4 path_planning.launch

A Gazebo window will pop up and you will see a 3x3 meter brick wall in front of the drone. This is the obstacle you want to avoid while flying to the target, which is defined as 70 meters in front of the drone.

While running Gazebo, you may see an Err message in the console. If everything else works properly, the Err message itself DOES NOT affect the performance of Gazebo.

If you happen to know how to resolve this Err message, please submit a PR on GitHub to help us make it go away.

Next, you can use:

sh run.sh

to generate the disparity map, and then use:

./bin/match_pointcloud_and_slam_pose2

to convert the point cloud from the camera frame to the drone body frame. You can visualize the result in RVIZ by selecting "add"->"by topic"->"/cloud_in".
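
At its core, this conversion is a rigid-body transform using the camera's extrinsic calibration (the actual binary also folds in the drone's pose). A minimal sketch of the transform itself, where R_bc and t_bc are placeholders for your own camera-to-body calibration:

import numpy as np

R_bc = np.eye(3)                   # camera-to-body rotation (placeholder)
t_bc = np.array([0.1, 0.0, 0.0])   # camera position in the body frame, meters (placeholder)

def camera_to_body(points_cam):
    """Transform an (N, 3) array of camera-frame points into the body frame."""
    return points_cam @ R_bc.T + t_bc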

You will see a dense point cloud appear, although it is not a perfect square like the brick wall itself. You can modify the parameters found in config.yaml to tune the performance; some of them are:

SADWindowSize: 9             #Matched block size. It must be an odd number >=1
MinDisparity: 3              #Minimum possible disparity value.
NumDisparities: 80           #Number of disparity levels to search. It must be divisible by 16.
PreFilterCap: 7              #Truncation value for the prefiltered image pixels.
UniquenessRatio: 10          #Margin in percent by which the best match must beat the second best.
SpeckleWindowSize: 7         #Maximum size of smooth disparity regions treated as noise speckles.

Next, we can move on to Octomap.

2. Octomap

Octomap is an octree-based package for modeling the environment; it can drastically decrease the number of points you need to store, which helps both memory efficiency and real-time performance. We will use Octomap to represent the environment. For more information, check Reading Materials [2] and [3].
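
For intuition only, here is a toy Python sketch of the occupancy idea behind Octomap: each cell stores a log-odds occupancy value that is pushed up by a "hit" and down by a "miss", then clamped. The sensor-model and clamping constants below are illustrative choices, and the real package does this per octree leaf in C++.

import math
from collections import defaultdict

L_HIT = math.log(0.7 / 0.3)    # log-odds increment for a hit
L_MISS = math.log(0.4 / 0.6)   # log-odds decrement for a miss
L_MIN, L_MAX = -2.0, 3.5       # clamping bounds
RES = 0.1                      # cell size in meters, matching the launch file below

log_odds = defaultdict(float)  # sparse map: cell key -> log-odds value

def key(p):
    """Discretize a 3D point into a cell index."""
    return tuple(int(math.floor(c / RES)) for c in p)

def integrate(p, hit):
    """Update the cell containing p with a hit or miss observation."""
    k = key(p)
    log_odds[k] = min(L_MAX, max(L_MIN, log_odds[k] + (L_HIT if hit else L_MISS)))

def occupied(p, p_occ=0.5):
    """Is the cell containing p occupied with probability above p_occ?"""
    return log_odds[key(p)] > math.log(p_occ / (1.0 - p_occ))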

To use Octomap:

sudo apt-get install ros-kinetic-octomap-*

Before launching Octomap, use rosed to set the resolution to 0.1, set the max range to 10, remap from="cloud_in" to="cloud_in", and change the frame_id to "map":

rosed octomap_server octomap_mapping.launch

or if you feel like using gedit:

roscd octomap_server && cd launch
sudo gedit octomap_mapping.launch

so your launch file looks like:

<!--
  Example launch file for octomap_server mapping:
  Listens to incoming PointCloud2 data and incrementally builds an octomap.
  The data is sent out in different representations.

  Copy this file into your workspace and adjust as needed, see
  www.ros.org/wiki/octomap_server for details
-->
<launch>
        <node pkg="octomap_server" type="octomap_server_node" name="octomap_server">
                <param name="resolution" value="0.1" />

                <!-- fixed map frame (set to 'map' if SLAM or localization running!) -->
                <param name="frame_id" type="string" value="/map" />

                <!-- maximum range to integrate (speedup!) -->
                <param name="sensor_model/max_range" value="10" />

                <!-- data source to integrate (PointCloud2) -->
                <!-- <remap from="cloud_in" to="/narrow_stereo/points_filtered2" /> -->
                <remap from="cloud_in" to="cloud_in" />

        </node>
</launch>

Next, start the point cloud generation described in the last section and launch Octomap by:

roslaunch octomap_server octomap_mapping.launch

In RVIZ, choose to visualize the following topic:

/occupied_cells_vis_array

and the corresponding Octomap representation of the environment will be shown.

You can tune parameters such as the Octomap resolution (the size of each cell) and the max range by:

rosed octomap_server octomap_mapping.launch

Next, we will talk about how to find a path with Navigator.

3. Navigator for Path Finding

Navigator provides a 3D A* algorithm for path finding. A* performs poorly in large 3D spaces, but it is a good starting point. For more information, check Reading Material [4].
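
To make the idea concrete, here is a minimal 3D A* sketch on a voxel grid. It is not Navigator's actual implementation: occupied is a placeholder occupancy query (for example, against an Octomap), and a 6-connected neighborhood keeps it short.

import heapq

def astar_3d(start, goal, occupied):
    """A* over integer (x, y, z) cells; returns a list of cells or None."""
    nbrs = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
    h = lambda p: sum(abs(a - b) for a, b in zip(p, goal))  # Manhattan heuristic
    open_set = [(h(start), start)]
    came_from, g = {}, {start: 0}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:                       # reconstruct the path backwards
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        for dx, dy, dz in nbrs:
            nxt = (cur[0] + dx, cur[1] + dy, cur[2] + dz)
            if occupied(nxt):
                continue
            new_g = g[cur] + 1
            if new_g < g.get(nxt, float("inf")):
                g[nxt] = new_g
                came_from[nxt] = cur
                heapq.heappush(open_set, (new_g + h(nxt), nxt))
    return None                               # goal unreachable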

In this part, you will need to keep the programs from the first two parts up and running, and don't forget to add the corresponding pose and path topics in RVIZ so you can visualize the drone's current pose and the path found.

Navigator is located at:

(GAAS_PATH)/software/Navigator

To use it, you will need to specify a target position in Navigator.py; the current target position is set to 70 meters in front of the drone's starting point. After setting the target, first take off the drone using:

python px4_mavros_run.py

Next, start path finding by:

python Navigator.py

A red flight path will appear and the drone will fly according to the flight path.

4. Path Pruning

Another thing we need to take a look at is path pruning. The path A* generates is a series of waypoints with an incremental step size, and the raw path is usually too complex, meaning there are many extra and unnecessary waypoints. We implemented two path pruning methods (a sketch follows the list):

  1. collinearity check: for a path from A to B, if consecutive waypoints are collinear, remove the intermediate ones and keep only the first and the last;

  2. Bresenham3D obstacle detection: for a path from A to B, create lines between A and A+1, A and A+2, and so on, and check whether each line intersects an obstacle; if it does, keep the last collision-free point and start over from there.
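
For illustration, here is a minimal Python sketch of both pruning ideas on a list of (x, y, z) waypoints. occupied is again a placeholder occupancy query, and line_is_free samples points along the segment instead of running a true Bresenham3D traversal; the actual implementation lives in software/Navigator/path_optimization/path_pruning.py.

import numpy as np

def prune_collinear(path):
    """Keep only the waypoints where the path changes direction."""
    out = [path[0]]
    for a, b, c in zip(path, path[1:], path[2:]):
        v1, v2 = np.subtract(b, a), np.subtract(c, b)
        if np.linalg.norm(np.cross(v1, v2)) > 1e-9:  # direction changed: keep b
            out.append(b)
    out.append(path[-1])
    return out

def line_is_free(a, b, occupied, steps=20):
    """Sample along segment a-b and test each sample for occupancy."""
    for t in np.linspace(0.0, 1.0, steps):
        if occupied(tuple(np.add(a, t * np.subtract(b, a)))):
            return False
    return True

def prune_line_of_sight(path, occupied):
    """Greedily jump from each kept point to the farthest visible waypoint."""
    out, i = [path[0]], 0
    while i < len(path) - 1:
        j = len(path) - 1
        while j > i + 1 and not line_is_free(path[i], path[j], occupied):
            j -= 1
        out.append(path[j])
        i = j
    return out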

You can visualize the result by:

python (GAAS_PATH)/software/Navigator/path_optimization/path_pruning.py

In the image, the red cube represents an obstacle. The bottom red path is the raw path, consisting of many redundant waypoints; the green line is the path generated by the collinearity check, where only corner points are preserved; the blue line is generated by the Bresenham3D obstacle check, where points are preserved according to a preset obstacle distance.

After path pruning, a more concise path is generated and the drone is able to fly from A to B more efficiently.

If you find any bugs or need more help, please let us know either by opening an issue or through our Facebook group.

Give us a Star 🌟 on GitHub if you find this tutorial useful.

Support us on Patreon.

Help us understand what you want to learn in the upcoming tutorials by filling out this three-question survey.

Reading Materials
