In the last tutorial we walked through how to control your drone with MAVROS in OFFBOARD mode inside the Gazebo simulation environment. In this tutorial, we will walk you through the process of building a 3D model using the same API as last time, together with a free website for building models.
This tutorial may seem off topic, with little to do with autonomous drones, but building 3D models with drones has been adopted by many companies; you can check this link:
and by building a 3D model, we gain the chance to accomplish more and more complicated tasks with simple APIs.
The steps required to build such a model are:
launch the Gazebo simulation environment;
take off the drone and start recording a rosbag;
command the drone to fly a circle around a typical building;
land the drone and process the images from the rosbag;
upload the images to the provided website.
Or, if you wish to build a model of a real building or environment:
take off the drone with your controller, or use the provided Python API if you have stable SSH access to your companion computer (take extreme care);
start recording a rosbag;
command the drone to fly around the building, making sure the building or environment is thoroughly covered by the camera;
land the drone and process the images from the rosbag;
upload the images to the provided website.
We will walk you through each of the above steps in the Gazebo simulation environment, and we will provide a website that you can use to build a model. After this tutorial, you will be able to build something like this all by yourself!
Before starting, remember to update GAAS:
# change directory to where you put GAAS and pull from the master branch to update
cd (GAAS_PATH)
git pull origin master
Next, copy the files required by the Gazebo simulation to where you put the PX4 Firmware:
cp -r (GAAS_PATH)/simulator/launch/* ~/catkin_ws/src/Firmware/launch/
cp -r (GAAS_PATH)/simulator/models/* ~/catkin_ws/src/Firmware/Tools/sitl_gazebo/models/
cp -r (GAAS_PATH)/simulator/worlds/* ~/catkin_ws/src/Firmware/Tools/sitl_gazebo/worlds/
cp -r (GAAS_PATH)/simulator/posix-config/* ~/catkin_ws/src/Firmware/posix-configs/SITL/init/ekf2/
After that completes, you can continue to the next step.
If you followed the last tutorial and the previous step, you will have everything needed to launch the simulation. To launch it, run in a terminal:
roslaunch px4 sfm.launch
This will launch MAVROS as well as the Gazebo simulation, and your drone will be centered at the origin:
Before you continue, remember to check the MAVROS connection status:
# make sure "connected" is "True"
rostopic echo /mavros/state
Once the simulation has started, sensor data such as IMU, barometer, camera, and GPS will be published via MAVROS; the list of available topics can be checked with `rostopic list`.
And some of the topics published by MAVROS are:
The only topic we will be using for reconstructing the 3D model is the left camera image:
You can view the images published on this topic using RViz:
You will see a window which looks like this:
In the bottom left, click 'Add', then 'By topic', and select '/gi/simulation/left/image_raw'; you will see the current left camera image in real time. We will be building a 3D model in this Gazebo environment.
A rosbag is like a box that wraps up data published over ROS/MAVROS. To build a 3D model, we need a series of images of a building or an environment taken from different angles; to capture and store these images, we can use a rosbag.
Since we will only be using the left camera, we only need to subscribe to the left camera topic. To do this, run in a terminal:
rosbag record /gi/simulation/left/image_raw -O sfm.bag
Next, take off the drone:
The drone will take off to a height of 2 meters. Now we can start commanding the drone to move around the house taking pictures; while doing this, we need to keep the drone headed towards the house so that we capture detailed information about it:
The drone will fly around the house at a height of 2 meters, and it will finally return to the origin and land on the ground.
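The tutorial's flight script handles this pattern for you, but the underlying geometry is easy to sketch. Below is a minimal, self-contained illustration (a hypothetical helper, not the actual GAAS API) of generating evenly spaced circular waypoints whose yaw always points back at the building:

```python
import math

def circle_waypoints(cx, cy, radius, altitude, n_points):
    """Generate waypoints on a circle around (cx, cy), with the drone's
    yaw at each point facing the circle centre (i.e. the building)."""
    waypoints = []
    for i in range(n_points):
        theta = 2 * math.pi * i / n_points
        x = cx + radius * math.cos(theta)
        y = cy + radius * math.sin(theta)
        # Yaw pointing from the waypoint back towards the centre.
        yaw = math.atan2(cy - y, cx - x)
        waypoints.append((x, y, altitude, yaw))
    return waypoints

# 8 waypoints on a 5 m circle around a building at the origin, at 2 m altitude.
for x, y, z, yaw in circle_waypoints(0.0, 0.0, 5.0, 2.0, 8):
    print(f"x={x:+.2f}  y={y:+.2f}  z={z:.1f}  yaw={math.degrees(yaw):+.1f} deg")
```

Each tuple could then be sent to whatever position-and-yaw command your flight script exposes.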
While recording the bag, we should keep the following points in mind in order to build a good model of the building:
try to take images of the building so that each image partially overlaps with the previous one;
avoid sudden yaw change while taking images;
try to cover the entire building, sometimes multiple circles are required to cover taller buildings.
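To get a feel for how many waypoints the first point implies, here is a back-of-envelope sketch. It assumes a simplified pinhole model where the image footprint at the orbit radius is about `2*R*tan(hfov/2)` wide and the baseline between consecutive shots is about `R*dtheta`, so the radius cancels out; the function name and numbers are illustrative, not from the tutorial:

```python
import math

def min_waypoints_for_overlap(hfov_deg, overlap):
    """Minimum number of evenly spaced waypoints on an orbit so that
    consecutive images overlap by at least `overlap` (a fraction 0..1),
    under the simplified footprint model described above."""
    hfov = math.radians(hfov_deg)
    max_step = 2 * (1 - overlap) * math.tan(hfov / 2)  # radians per shot
    return math.ceil(2 * math.pi / max_step)

# e.g. a camera with a 90-degree horizontal FOV and 70 % overlap:
print(min_waypoints_for_overlap(90.0, 0.7))  # -> 11
```

The takeaway is that demanding more overlap grows the waypoint count quickly, which is why recording continuously from a rosbag (rather than stopping at discrete waypoints) is convenient.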
You have probably noticed that we commanded the drone to fly a circle rather than a square, even though a square seems easier since far fewer waypoints would be needed. The reason is that a square usually means a big yaw change (90 degrees at each corner) while taking pictures, which is exactly what we need to avoid according to the second point above.
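To put a number on this: if the drone always faces the building, an orbit of n evenly spaced waypoints changes the heading by 360/n degrees per step, and a square is simply the n = 4 case. A small sketch (hypothetical helper, for illustration only):

```python
def yaw_step_deg(n_waypoints):
    """Heading change between consecutive waypoints when orbiting a
    building on a regular n-sided path with the camera always facing
    the centre."""
    return 360.0 / n_waypoints

# A square path snaps 90 degrees at each corner, while a 20-waypoint
# approximation of a circle only turns 18 degrees per step.
print(yaw_step_deg(4))   # -> 90.0
print(yaw_step_deg(20))  # -> 18.0
```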
Once the drone has landed, stop the rosbag recording with "Ctrl + C".
Now you will have a rosbag named 'sfm.bag'. Next, let's process this bag into a series of images before starting to build the model.
To build a 3D model, we need images rather than a rosbag. To turn the rosbag into a series of images, you can use the provided Python script in the tutorial_2 folder:
python bag2image.py --bag (PATH-TO-YOUR-BAG) --output_path (IMAGE-OUTPUT-FOLDER) --image_topic /gi/simulation/left/image_raw
After this step, all the images in the bag will be extracted to your local folder.
We provide a website to build the model; all you need to do is upload the images. Due to limited computation power, it might take a while before you get the model.
The following is a real-world use case made with the website.
After uploading your images to the website above (note that you cannot upload an image folder), you will need to wait for a while. Once the model has been generated successfully, you will receive an email, and you can fetch the model from the website.