Autonomous Drone Software E06: Basic Object Tracking

In today's tutorial, I will briefly go over how to use the GAAS-Object-Tracking module to track an object with a UAV.

Help us understand what you want to learn in upcoming tutorials by filling out this three-question survey.

Join our Slack.

---------------------

We suggest that developers install the GAAS mirror to use the project.

In the previous tutorials, we covered how to set up a simulation, how to control a UAV with Python, and how to estimate depth with stereo cameras. Today we will use the GAAS-Object-Tracking module to track an object with a UAV. Please note that this implementation is very basic; if you would like to use it in a real operating environment, you will likely need a more sophisticated one.

In order to better maintain the repo, we have uploaded the tracking algorithms here. You may use the following command to clone the tracking module to your local environment:

git clone https://github.com/generalized-intelligence/GAAS-Object-Tracking.git

The GAAS tracking module provides four algorithms: TLD, KCF, GOTURN, and pysot-based SiamRPN. The first two are traditional computer vision algorithms, while the latter two are based on neural networks. In this tutorial, KCF is our algorithm of choice.
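If KCF is new to you, the following minimal sketch shows the initialize-then-update workflow shared by KCF-style trackers, using OpenCV's own KCF tracker (cv2.TrackerKCF_create from opencv-contrib). This is only an illustration; it is not the GAAS C++ implementation used later in this tutorial.

# Illustration only: OpenCV's KCF tracker, not the GAAS C++ module.
import cv2

cap = cv2.VideoCapture(0)                 # any video source works for this demo
ok, frame = cap.read()

# Select the target once; the tracker then follows it frame by frame.
init_box = cv2.selectROI("select target", frame, False)
tracker = cv2.TrackerKCF_create()
tracker.init(frame, init_box)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, box = tracker.update(frame)    # new bounding box for the current frame
    if found:
        x, y, w, h = [int(v) for v in box]
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("KCF", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break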

Please note that GAAS-Object-Tracking is a module of the GAAS project. It does not work as stand-alone tracking software.

Update Environment

First, update GAAS:

git pull origin master

Next, copy all files required by Gazebo simulations to the directory of PX4:

cp -r (GAAS_PATH)/simulator/launch/* ~/catkin_ws/src/Firmware/launch/
cp -r (GAAS_PATH)/simulator/models/* ~/catkin_ws/src/Firmware/Tools/sitl_gazebo/models/
cp -r (GAAS_PATH)/simulator/worlds/* ~/catkin_ws/src/Firmware/Tools/sitl_gazebo/worlds/
cp -r (GAAS_PATH)/simulator/posix-config/* ~/catkin_ws/src/Firmware/posix-configs/SITL/init/ekf2/
cp -r (GAAS_PATH)/simulator/urdf ~/catkin_ws/src/Firmware/Tools/sitl_gazebo/

Launch the Simulation

We would like to use a UAV to track a moving ground vehicle in a Gazebo simulation.

First, we need to initialize the simulation:

roslaunch px4 car_tracking.launch

The above command initializes the simulation, which contains a movable ground vehicle and a UAV. The ground vehicle is your tracking target. It subscribes to the /cmd_vel topic, so we can control it either from code (a sketch appears at the end of this section) or with a keyboard teleoperation plug-in. The plug-in can be installed with:

sudo apt install ros-kinetic-teleop-twist-keyboard

After installing the plug-in, launch it:

rosrun teleop_twist_keyboard teleop_twist_keyboard.py

After launching the plug-in, its key bindings will be printed automatically.

You may press the corresponding keys to control the ground vehicle.

The ground vehicle in this tutorial is extremely basic. If you would like to build a more advanced ground vehicle, you may use the husky_gazebo package, which is also controlled through /cmd_vel.
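For reference, here is a minimal sketch of the code-based option mentioned above: driving the ground vehicle by publishing geometry_msgs/Twist messages to /cmd_vel. The speed values are arbitrary example numbers.

# Minimal sketch: drive the ground vehicle by publishing to /cmd_vel.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('ground_vehicle_driver')
pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
rate = rospy.Rate(10)        # publish at 10 Hz

cmd = Twist()
cmd.linear.x = 0.5           # forward speed in m/s (example value)
cmd.angular.z = 0.2          # yaw rate in rad/s (example value)

while not rospy.is_shutdown():
    pub.publish(cmd)
    rate.sleep()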

Build the Tracking Module

We are using the KCF algorithm in this tutorial, which is implemented in C++, so we need to build it:

cd (GAAS-Object-Tracking_PATH)/KCF
mkdir build
cd build
cmake ..
make -j6

If you encounter "fatal error: ros_kcf/InitRect.h: No such file or directory...", increase the number of threads used for the build, for example by changing make -j1 to make -j6.

Testing

1. Initialize the Simulation Environment

First, launch the simulation environment:

roslaunch px4 car_tracking.launch

Let the UAV take off and hover at a height of 3 m:

cd (GAAS_PATH)/demo/tutorial_6/6_object_tracking/
python px4_mavros_run.py

Open another terminal and command the UAV to fly to a position directly above the ground vehicle:

python init_drone.py

You may also revisit Autonomous Drone Software E01 and build your own command file to fly the UAV to a position above the ground vehicle; a sketch of that approach follows.
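If you script this step yourself, a sketch along the lines of the E01 commander interface might look like the following. The Commander class comes from the E01 tutorial code; the target coordinates below are placeholder values that you should replace with your ground vehicle's actual position.

# Sketch: fly the UAV to a point above the ground vehicle using the E01-style commander.
import time
from commander import Commander    # commander.py from Autonomous Drone Software E01

con = Commander()
time.sleep(2)

# Placeholder coordinates: x, y in meters toward the vehicle, keeping roughly 3 m of height.
con.move(2.0, 0.0, 3.0)
time.sleep(5)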

Open rviz to make sure the ground vehicle is within the camera's field of view.

2. Run the Tracking Algorithm

Open a new terminal, run the tracking algorithm with:

cd (GAAS-Object-Tracking_PATH)/KCF
./bin/startRosService

Note: there is no output at this step. The algorithm is waiting to receive the initial coordinates of the ground vehicle (in the camera image) through a ROS Service.

GAAS provides a GUI tool to quickly set the initial coordinates. Use the following command to launch it:

cd (GAAS_PATH)/demo/tutorial_6/6_object_tracking/
python set_init.py

In order to keep the environment variables lean, let's manually modify set_init.py so that the ROS Service dependencies are added to the Python path inside the script instead.

First, use an editor to open set_init.py:

gedit (GAAS_PATH)/demo/tutorial_6/6_object_tracking/set_init.py

You should see the following imports:

import rospy
import cv2
from sensor_msgs.msg import Image
from cv_bridge import CvBridge, CvBridgeError
import numpy as np

import sys
# Make the generated ROS Service module visible to Python
sys.path.append('(GAAS-Object-Tracking_PATH)/KCF/build/devel/lib/python2.7/dist-packages')
from ros_kcf.srv import InitRect    # Import this ROS Service dependency
from std_msgs.msg import Int32MultiArray

Save the change and close the editor, then run set_init.py. In the pop-up window, draw a rectangle that encloses the ground vehicle and press "s" to send the initial coordinates to the tracking algorithm through the ROS Service.

Once you have completed the above steps, you should see output in the terminal where the tracking algorithm is running. You may now press Ctrl+C to close set_init.py.

By default, the tracking algorithm subscribes to the /gi/simulation/left/image_raw topic. If you would like to change it, edit the config file:

gedit (GAAS-Object-Tracking_PATH)/KCF/config.yaml

The GUI tool is very basic. We will continue to improve it, and any contributions along the way are much appreciated.
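If you would rather skip the GUI, the same hand-off can in principle be done by calling the service directly from Python. Note that the service name ('/init_rect') and the request layout below are assumptions made for illustration; check the ros_kcf srv definition in GAAS-Object-Tracking for the actual names before relying on this.

# Sketch: send the initial bounding box to the tracker via the InitRect service.
# NOTE: the service name and request layout are assumptions; verify them against
# the ros_kcf/srv definition in GAAS-Object-Tracking.
import sys
import rospy
from std_msgs.msg import Int32MultiArray

sys.path.append('(GAAS-Object-Tracking_PATH)/KCF/build/devel/lib/python2.7/dist-packages')
from ros_kcf.srv import InitRect

rospy.init_node('send_init_rect')
rospy.wait_for_service('/init_rect')                  # assumed service name
init_rect = rospy.ServiceProxy('/init_rect', InitRect)

rect = Int32MultiArray()
rect.data = [300, 200, 360, 260]                      # example pixel corners [x1, y1, x2, y2]
init_rect(rect)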

3. Start Tracking

First, let's set the parameters. You may set the UAV to fly at a fixed height, or you may keep a fixed distance between the UAV and the ground vehicle based on depth estimation with the stereo cameras (the software supports StereoBM, whose parameters can be tuned). To keep things simple, in this tutorial we will set the UAV to fly at a fixed height. If you would like the UAV to fly at a different height, change the corresponding parameter.
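To make the behaviour of track_and_move.py easier to reason about, here is a simplified sketch of the underlying idea rather than the actual GAAS code: the tracked bounding box is compared with the image centre, the offset is turned into horizontal velocity commands, and height control is left to the hover script. The topic names, camera resolution, axis mapping (a downward-facing camera is assumed), and gain are all illustrative assumptions.

# Simplified sketch of the track-and-move idea (not the actual track_and_move.py).
import rospy
from std_msgs.msg import Int32MultiArray
from geometry_msgs.msg import Twist

IMG_W, IMG_H = 640, 480      # camera resolution; adjust to your simulator settings
K_XY = 0.002                 # proportional gain, pixels -> m/s (tuning value)

def rect_callback(msg):
    # msg.data is assumed to hold the tracked box corners [x1, y1, x2, y2]
    x1, y1, x2, y2 = msg.data
    cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0

    cmd = Twist()
    # With a downward-facing camera, image rows map to forward/back and columns to left/right.
    cmd.linear.x = K_XY * (IMG_H / 2.0 - cy)
    cmd.linear.y = K_XY * (IMG_W / 2.0 - cx)
    cmd.linear.z = 0.0       # height is held by the take-off/hover script, not by this sketch
    vel_pub.publish(cmd)

rospy.init_node('simple_tracking_follower')
vel_pub = rospy.Publisher('/mavros/setpoint_velocity/cmd_vel_unstamped', Twist, queue_size=1)
rospy.Subscriber('/kcf_tracker/tracked_rect', Int32MultiArray, rect_callback)   # assumed topic name
rospy.spin()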

Launch the algorithm:

python track_and_move.py

Launch the ground vehicle controller plug-in:

rosrun teleop_twist_keyboard teleop_twist_keyboard.py

Control the ground vehicle as you wish.

While tracking, do not stop the tracking algorithm. If you would like to start over, re-launch the tracking algorithm and run set_init.py again to reset the initial coordinates.

In this tutorial, both the UAV and the ground vehicle face the positive X direction by default. If you would like to change the default heading, please modify the code. If you would like to use a global frame of reference, please see the code from Autonomous Drone Software E01.

We will continue working on improving the tracking and depth estimation. This tutorial aims to provide a basic implementation of ground vehicle tracking.

Support us on Patreon.

Help us better understand what you want to learn in upcoming tutorials by filling out this three-question survey.
