Vision Based Autonomous Control of a Quadcopter
Jeff Deeds, Zach Engstrom, Caleb Gill, and Zack Woods
Advisors: Drs. Yufeng Lu, Jing Wang, and In Soo Ahn
Department of Electrical and Computer Engineering
Bradley University
May 2, 2017
Abstract
A useful application of Unmanned Aerial Vehicles (UAVs) is to sense and locate persons
during an emergency. Imagine using UAVs to locate a hiker lost in the mountains or to spot
people isolated in a dangerous area. The real-time video feed and accurate location information
from a fleet of UAVs are invaluable for decision makers to efficiently distribute resources and
dispatch rescue workers.
In this project, an autonomous vision-based control system for a quadcopter is designed
to execute a mission plan. The system includes a Raspberry Pi 3, an
onboard embedded system that communicates with the Pixhawk, an onboard autopilot. The
Raspberry Pi and the Pixhawk are physically connected and communicate through serial
communication via DroneKit-Python, a Python Application Programming Interface (API) which
utilizes the MAVLink protocol. The Raspberry Pi executes a video processing algorithm to
locate a target, in this case an AprilTag, a type of two-dimensional bar code. The mission plan of
the quadcopter is to autonomously take off to a specified altitude, fly to a waypoint with a
predefined GPS coordinate, locate the target displayed by an AprilTag, position itself over the
target, and land near the target. If the target is not detected or the UAV cannot position itself
within tolerance, the UAV returns to the takeoff point.
Acknowledgement
Our sincere gratitude goes to Drs. Yufeng Lu, Jing Wang, and In Soo Ahn, for their
guidance throughout the capstone project. Their extensive expertise helped us explore and
overcome the challenges of cooperative experiments on a quadcopter platform. We would also like
to thank Bradley University and the Department of Electrical and Computer Engineering for the
rigorous and enriching program that gave us the opportunity and knowledge to pursue our
passions. Finally, we would like to thank our families for always believing in us and giving us
their unconditional support and encouragement.
This project is partially supported by Air Force Research Laboratory Grant FA8750-15-1-0143
(Dr. Jing Wang, 2014-2017).
Table of Contents
Abstract . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Acknowledgement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
Table of Contents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
Table of Figures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1. Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.1 Motivation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.2 Objective. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2. Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.1 Hardware and Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.2 AprilTag. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.2.1 Grayscale and Blur. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.2.2 Local Gradient. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
2.2.3 Edge Detection. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.2.4 Edge Thinning. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.2.5 Line Segments. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
2.2.6 Lines Connected. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
2.2.7 Quad Detection. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
2.2.8 Quads Decoded. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
2.2.9 Refine Detections. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
2.3 MAVLink. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
2.4 DroneKit-Python. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
3. System Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
3.1 Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
3.2 AprilTag Detector. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
3.3 Control Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
3.4 MAVLink Command Encoder . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
3.5 Interprocess Communication . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
4. Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
4.1 AprilTag Detector . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
4.2 Simpler Mission . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
4.3 Interprocess Communication. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
4.4 Objective Mission. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
5. Conclusion and Future Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
Appendix. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
References. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
Table of Figures
Figure 1: System Connections. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Figure 2: AprilTag Families. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Figure 3: AprilTag Sample. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Figure 4: Gaussian Convolution Matrix. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
Figure 5: Gradient Calculation Matrix. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
Figure 6: Gradient Magnitude Equation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
Figure 7: Gradient Direction Equation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Figure 8: Gradient Magnitude Image and Gradient Direction Image. . . . . . . . . . . . . . . . . . . . . 14
Figure 9: Weighted Edge Detection Image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
Figure 10: Cluster Grouping Formulas. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
Figure 11: Least-Squares Linear Fit and Line Segment Image. . . . . . . . . . . . . . . . . . . . . . . . . . 17
Figure 12: Quads Detection Image with Payload Acquisition. . . . . . . . . . . . . . . . . . . . . . . . . . 18
Figure 13: MAVLink Packet Diagram. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
Figure 14: System Block Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
Figure 15: April Tag Detector Block Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
Figure 16: Control Algorithm Block Diagram. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
Figure 17: Control Algorithm Flow Chart . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Figure 18: MAVLink Command Encoder Block Diagram. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Figure 19: Unix Pipeline Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Figure 20: Family 36h11 Showing Failed Detections While Skewed. . . . . . . . . . . . . . . . . . . . . 29
Figure 21: Family 25h9 Showing Reliable Detections While Significantly Skewed . . . . . . . . . 29
Figure 22: Examples of AprilTag Detection Delay. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
Figure 23: A GPS Plot of Quadcopter Movement. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
Figure 24: Example of Named Pipe Communication . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
Figure 25: An Example of a Successful Mission . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
1. Introduction
1.1 Motivation
The list of applications for Unmanned Aerial Vehicles (UAVs), and specifically
autonomous UAVs, is ever growing. The use of an autonomous UAV can free up personnel in
demanding situations. Groups of UAVs working together can accomplish tedious or dangerous jobs
quickly and efficiently with the mere stroke of a key. The basis behind most of these UAVs is the ability
to make decisions based on visual information and record or transmit important data to a human
operator.
In 2016, a team at Queensland University of Technology completed a project that used a
quadcopter to autonomously detect a target on the ground and land near it [1]. In that project they
used a Raspberry Pi 2 and an Iris quadcopter to determine whether a red panel was visible on the
ground while the quadcopter was en route to its objective location. If it was seen, the quadcopter
would adjust its flight path and land near the red panel. If it was not seen, the quadcopter would
continue to fly to its objective location. Our project was based on the same concept, but we
added more complexities.
Our project requires a quadcopter equipped with a GPS module, remote control, and an
onboard autopilot. We installed an onboard embedded system which allowed the quadcopter to
move autonomously. The commands used are take off, GPS location move, local velocity move,
and land. All of these commands are used to accomplish the different objectives of the mission.
In addition, a camera combined with the onboard embedded system runs a detection algorithm to
locate a specified target and output location data needed to center on the target. Using all of these
components will give the UAV the ability to fly in three dimensional space, locate a specified
target, and land near it.
1.2 Objective
The objective of this project is to have a quadcopter autonomously complete a mission
that is representative of a controlled search and rescue mission. For this we separated the mission
into five simple tasks, which can be seen in the list below.
1. Take off to a desired altitude.
2. Fly to a predefined GPS coordinate.
3. Search for and detect the target on the ground.
4. Center over the target.
5. Land near the target.
A main limitation of our project is that the target must be within view of the onboard camera
while the quadcopter hovers at its predefined GPS coordinate. This is because our project does not incorporate search
algorithms and does not have the necessary equipment, such as Sonar or Lidar, to detect and
avoid obstacles. Therefore, our mission must be run in an open and clear environment, such as a
field, so any excessive movements of the quadcopter would not cause harm. Even though the
objective mission of this project is very controlled, we still believe the tasks are representative of
a standard search and rescue mission. We also believe that our project will create the framework
for future groups to build upon, so the quadcopter could perform more complex search and
rescue tasks in a less controlled environment.
2. Background
2.1 Hardware and Environment
The main components used in this project are the 3D Robotics Iris+ Quadcopter with
Pixhawk PX4 autopilot, the Raspberry Pi 3, and the 8-megapixel Pi Camera V2. The camera is
connected to the Raspberry Pi 3 through the Camera Serial Interface. The Raspberry Pi 3 is
connected to the Pixhawk through the GPIO pins 1,3,4,5 on the Pi to the Telem 2 port on the
Pixhawk. These connections can be seen in Figure 1.
Figure 1: System Connections [2]
The Raspberry Pi 3 is loaded with the Linux distribution Raspbian Jessie v. 2016-9-23. It
is set up to connect to a development WiFi network on boot with a static IPv4 address. This will
allow a user to connect via Secure Shell (SSH) and access the terminal on the device in order to
execute the desired applications while the device is mounted inside the quadcopter. Git version
control software is also installed so that changes to the software can easily be published or pulled.
Quadcopter: 3D Robotics Iris+ [3]
● Max Payload: 0.8 lbs
○ Payload used: 0.17 lbs
● Controller Range: 300 meters
● Battery Capacity: 5100mAh
○ Flight Time (Current Load): 10 minutes
● Peripherals
○ GPS Module: uBlox with integrated magnetometer
○ Camera: 8-megapixel Pi Camera V2
○ Telemetry/Radio Control Frequency: 433 and 915 MHz
Onboard Autopilot: PX4 (Pixhawk) [4]
● CPU: 168 MHz Cortex M4F
● RAM: 256 KB
● Flash Memory: 2 MB
● Sensors:
○ 3D Accelerometer
○ Gyroscope
○ Magnetometer
○ Barometer
● Ports:
○ 5x UARTs
○ CAN
○ I2C
○ SPI
○ ADC
Embedded System: Raspberry Pi 3 [5]
● SoC: Broadcom BCM2837
● CPU: 4× ARM Cortex-A53, 1.2GHz
● GPU: Broadcom VideoCore IV
● RAM: 1GB LPDDR2 (900 MHz)
● Networking: 10/100 Ethernet, 2.4GHz 802.11n wireless
● Bluetooth: Bluetooth 4.1 Classic, Bluetooth Low Energy
● Storage: microSD
● GPIO: 40-pin header, populated
● Ports:
○ HDMI
○ 3.5mm analogue audio-video jack
○ 4× USB 2.0
○ Ethernet
○ Camera Serial Interface (CSI)
○ Display Serial Interface (DSI)
The C++ application comprises the AprilTag library and its integration with the
camera and control subsystems. The application depends on the Eigen3 and OpenCV packages
for AprilTag handling, and on the Video4Linux 2 package for connecting to the camera device.
The C++ project uses a CMake build system with G++, part of the GNU Compiler Collection
(GCC). The project is contained in the project-master directory of the code repository.
CMake utilizes CMakeLists.txt files to specify build parameters and include dependencies
and libraries. Instructions for building or modifying the project are detailed in README.md
located in project-master. Dependencies for the C++ applications can be installed with the
command sudo apt-get install subversion cmake libopencv-dev
libeigen3-dev libv4l-dev.
2.2 AprilTag
Figure 2: AprilTag Families [6]
AprilTag is a visual fiducial system [7]. The system is based on Augmented Reality
Tags (ARTags), but has a better detection rate and has gained popularity in robotic visual
detection. AprilTags are classified into families, such as those seen in Figure 2, with the first
number giving the number of data bits in the payload and the second number giving the minimum
Hamming distance between any two tags within the family. The mechanism of AprilTag detection in
a captured video frame is summarized into nine steps. Figure 3 contains an image of an AprilTag
that will be used to visually demonstrate some of the image processing that occurs.
Figure 3: AprilTag Sample Image [7]
2.2.1 Grayscale and Blur
AprilTag detection begins by converting the image or frame into floating point grayscale
and applying a Gaussian blur. Floating point grayscale is a grayscale that takes all pixel values
and sets them to a value between 0.0 and 1.0 [7]. This can be done in a variety of ways
depending on the application. Usually the three colors red, green, and blue are given different
weights when calculating the grayscale value because the human eye is more sensitive toward
the green end of the spectrum, but in this case the targets are black and white, so a simpler
grayscale with equal weights for all colors is used. This means that for each pixel the three
color values are added together, divided by 3, and then converted to floating point by dividing
by the color resolution of the image. For example, 16-bit color would be divided by 15 because
the grayscale would only be 4 bits, and 32-bit color would be divided by 1023 since the grayscale
would be 10 bits in length. Applying the Gaussian blur is done by using a Gaussian distribution to make a
convolution matrix as shown in Figure 4.
Figure 4: Gaussian Convolution Matrix [7]
Each value is the weight that pixel will give to the center pixel's final value. This is done
to smooth out the picture, which helps eliminate noise when the local gradient is calculated.
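As a rough sketch of the two steps above, the snippet below averages the three channels with equal weight and applies a small normalized convolution kernel to the interior of an image; the kernel values here are illustrative, not the exact ones used by the AprilTag library.

```python
def to_grayscale(pixel, max_value=255):
    """Equal-weight grayscale: average the three channels, scale to [0.0, 1.0]."""
    r, g, b = pixel
    return (r + g + b) / 3.0 / max_value

def convolve3x3(image, kernel):
    """Apply a 3x3 convolution kernel to the interior pixels of a 2D image."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            acc = 0.0
            for ky in range(3):
                for kx in range(3):
                    acc += kernel[ky][kx] * image[y + ky - 1][x + kx - 1]
            out[y][x] = acc
    return out

# A small normalized Gaussian-like kernel (illustrative values).
GAUSSIAN = [[1/16, 2/16, 1/16],
            [2/16, 4/16, 2/16],
            [1/16, 2/16, 1/16]]
```

Each output pixel becomes a weighted average of its neighborhood, which is the smoothing effect described above.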
2.2.2 Local Gradient
The local gradient is a calculation of the magnitude and direction of intensity change at
each pixel. This is done by taking the matrices in Figure 5 and multiplying them by the pixel
values. The center of each matrix is the pixel being calculated.
Figure 5: Gradient Calculation Matrices [7]
This is referred to as the image derivative. The final gradient magnitude is then calculated
by the equation in Figure 6 and the direction is calculated by the equation in Figure 7.
Figure 6: Gradient Magnitude Equation [7]
Figure 7: Gradient Direction Equation [7]
This gives each pixel a magnitude of change and the direction in which it is changing. An
example is shown in Figure 8, with magnitude on the left and direction on the right.
Figure 8: (Left) Gradient Magnitude Image (Right) Gradient Direction Image [7]
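The gradient at a single interior pixel can be sketched with the central-difference kernels of Figure 5, with magnitude and direction following the equations in Figures 6 and 7 (atan2 is used for the direction; this is an illustration, not the library's implementation):

```python
import math

def local_gradient(image, x, y):
    """Image derivative at interior pixel (x, y) using [-1, 0, 1] kernels,
    with gradient magnitude and direction as in Figures 6 and 7."""
    ix = image[y][x + 1] - image[y][x - 1]   # horizontal derivative
    iy = image[y + 1][x] - image[y - 1][x]   # vertical derivative
    magnitude = math.sqrt(ix * ix + iy * iy)
    direction = math.atan2(iy, ix)           # radians
    return magnitude, direction
```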
2.2.3 Edge Detection
To prepare for edge detection, the pixels are filtered by a magnitude threshold; the
authors use a threshold value of 0.8 [7]. If the magnitude of a pixel does not exceed the
threshold, it is ignored and set to a value of 0. The remaining pixels are then grouped by
direction, which gives the areas of higher change in the image. This leaves a set of outlines for
every change that met the threshold value.
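This filtering step can be sketched as a simple pass over the gradient magnitude image, using the 0.8 threshold cited above:

```python
def threshold_magnitudes(magnitudes, threshold=0.8):
    """Zero out pixels whose gradient magnitude does not exceed the threshold."""
    return [[m if m > threshold else 0.0 for m in row] for row in magnitudes]
```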
2.2.4 Edge Thinning
After these areas of change have been detected they need to be refined into single pixel
edges. This is done by creating “clusters” [7]. The clusters are found by grouping same direction
pixels together along the gradient edge. Then each cluster is given a weight (color) based on its
gradient. This gives a defined set of edges for the frame. An example of this is shown in Figure 9.
Figure 9: Weighted Edge Detection Image [7]
2.2.5 Line Segments
Clusters are grouped together to form line segments by testing whether they belong
together. This is done using the formulas shown in Figure 10.
Figure 10: Cluster Grouping Formulas [7]
In simple terms, the test checks whether the variation between clusters is as low as the
variation within each cluster. This allows an edge with a very slight curve to still be
detected, which is very important because the tag might not be perfectly flat. The clusters
are then fit together into line segments using a least-squares procedure, a form of linear
fitting. Finally, the direction of the gradient is marked with a notch to show the direction
of change. This will make it possible to detect boxes in subsequent steps. Examples are shown in
Figure 11, with the least-squares procedure illustrated on the left and the result on the trial
picture on the right. The red dots are the start and end points of each line.
Figure 11: (Left) Least-Squares Linear Fit [8], (Right) Line Segment Image [7]
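The least-squares fit can be sketched as an ordinary unweighted linear regression over a cluster's pixel coordinates. This is a simplification: the actual library weights points by gradient magnitude and must also handle near-vertical edges, which the slope form below cannot.

```python
def fit_line(points):
    """Least-squares fit of y = slope * x + intercept to a list of (x, y) points.
    Assumes the points are not all at the same x (non-vertical edge)."""
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept
```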
2.2.6 Lines Connected
The resulting line segments are then connected. The gap allowed between the end point of
one line and the start point of the next is up to the line's length plus 5 pixels [7]. This does
result in more false positives, but it eliminates false negatives, and the false positives can be
accounted for in the eighth step. This is done for each red dot in the frame and should join
segments that lie on the same "side" of a figure. If the segments on the same side could not be
grouped together, it would not be possible to look for loops of 4 in step 7.
2.2.7 Quad Detection
In step 7, the line segments are searched for loops of length 4. This is where all of the
processing comes together. The lines must first all be connected. Next, the "notches" must follow
a clockwise rotation. The resulting sets of line segments are called quads. An example is shown
in Figure 12, with the two AprilTags detected from the original image plus two additional
detections that are false positives.
Figure 12: Quads Detection Image with Payload Acquisition [7]
The false positives were between the edge of the paper and the edge of the AprilTag, and
within the payload of the larger tag. This shows the errors that can occur with the more liberal
line segment connection algorithm. The benefits far outweigh the negatives, though. If this
algorithm were not used, the tags would have to be within 30 degrees of flat rather than the 80
degrees of skew that can be handled currently.
2.2.8 Quads Decoded
False positives are eliminated by checking each “quad” for AprilTag data. The internal
area of a quad is referred to as the payload field [7]. The data consists of variations between the
white and black within this payload field. Each variation of white and black has a different
AprilTag code depending on which family it is a part of and whether a designated target family
has been preselected. If a target family has been selected, AprilTags that are not in the selected
family will not be used for the data calculation. If there is no data present, the quad is thrown out
as a false positive, which can occur for true false positives and for any tags not in a selected
family.
2.2.9 Refine Detections
The quads that are left need to be checked for accuracy. This is done first by eliminating
any quads with data that are within other quads with data. This could be the result of some of the
AprilTags having another AprilTag code within their own code structure. Then a comparison is
done to all tags within the designated family using a Hamming distance calculation [7]. Hamming
distance is the number of bits that need to be corrected in order to match a tag in the library. The
same tag could give multiple results, so the code with the lowest Hamming distance is the one that is saved.
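The matching step can be sketched as follows, comparing a decoded payload against a family's codebook and keeping the closest match; the family codes and the correction limit here are illustrative values, not a real AprilTag family:

```python
def hamming_distance(a, b):
    """Number of bits that differ between two tag codes."""
    return bin(a ^ b).count("1")

def best_match(decoded, family_codes, max_distance=2):
    """Return the (code, distance) pair with the lowest Hamming distance,
    or None if no family code is within the correction limit."""
    best = min(((code, hamming_distance(decoded, code)) for code in family_codes),
               key=lambda pair: pair[1])
    return best if best[1] <= max_distance else None
```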
A relative distance, yaw, pitch, and roll can be calculated from the detected tag, given the
object ID.
2.3 MAVLink
Micro Aerial Vehicle Link (MAVLink) is a serial communication protocol used in PX4
autopilot devices. It consists of a 6-byte header, a 0-to-255-byte payload, and a 2-byte checksum,
as shown in Figure 13 [9]. The header contains information such as the message length, a sequence
number, the System ID, the Component ID, and the Message ID. MAVLink messages can be sent at baud
rates of 9600, 57600, and 115200 bits per second; we utilize 57600. To maintain the connection
between devices, a heartbeat message must be sent once every second. Other messages can be
sent via MAVLink to command the autopilot to execute specific behaviors.
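To illustrate the framing only, the sketch below packs a MAVLink-1-style frame with a 6-byte header, a payload, and a 2-byte checksum. The checksum here is a placeholder sum rather than the protocol's actual X.25 CRC, so this is a shape illustration, not a compatible encoder:

```python
import struct

MAVLINK_V1_STX = 0xFE  # MAVLink 1 start-of-frame marker

def frame_message(seq, system_id, component_id, message_id, payload):
    """Pack a MAVLink-1-style frame: 6-byte header, payload, 2-byte checksum.
    NOTE: real MAVLink uses an X.25 CRC seeded per message type; the simple
    sum below is a placeholder for illustration only."""
    header = struct.pack("<BBBBBB", MAVLINK_V1_STX, len(payload), seq,
                         system_id, component_id, message_id)
    checksum = (sum(header[1:]) + sum(payload)) & 0xFFFF
    return header + payload + struct.pack("<H", checksum)
```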
Figure 13: MAVLink Packet Diagram [9]
2.4 DroneKit-Python
DroneKit-Python is a Python Application Programming Interface (API). The API utilizes
MAVLink to communicate with the onboard autopilot via serial connection [10]. The API
provides classes and methods to:
● Connect to a vehicle (or multiple vehicles) from a script
● Get and set vehicle state/telemetry and parameter information.
● Receive asynchronous notification of state changes.
● Guide a UAV to specified position (GUIDED mode).
● Send arbitrary custom messages to control UAV movement and other hardware
(GUIDED mode).
● Create and manage waypoint missions (AUTO mode).
● Override RC channel settings
3. System Design
3.1 Overview
The overall system has a rather simple design, with the onboard system at its center and
the peripherals outside the black box. Inside the box, the Raspberry Pi 3 receives position
data from the Pixhawk and sends movement commands to the Pixhawk to be carried out.
Outside the onboard system, the camera feeds visual information to the Pi 3 for its target
detection algorithm, and the mission command antenna allows the human operator to send
changes to mission parameters. The safety switch toggles the motor lock for safety and must be
toggled before the mission begins, while the status LED flashes the current status in red, yellow,
blue, or green. Override commands can be sent via the remote control to take manual control in
case of emergency. The four motors allow for three-dimensional movement, and the GPS tracks
that movement. All of this can be seen in Figure 14.
Figure 14: System Block Diagram
The overall application software comprises a C++ application containing the
AprilTag Detector and a Python script containing the Control Algorithm and MAVLink
Command Encoder. The two applications communicate with each other through a client-server
model of Interprocess Communication (IPC), wherein the C++ application is the server and the
Python script is the client. To launch the overall system, the user must first run the Python
executable before the C++ executable to ensure functional IPC.
The IPC is integrated through the use of Linux named pipes. Two pipes are created by the
Python client application in the /tmp/ directory of the embedded device. One is for
communication from the Python client to the C++ server and the other for communication from
the C++ server to the Python client.
FifoComm.cpp/hpp are the C++ files for handling IPC communication. They contain
the sendMessage() and receiveMessage() methods for passing character strings through
either pipe and are located within project-master.
3.2 AprilTag Detector
Figure 15: AprilTag Detector Block Diagram
The AprilTag Interface is a C++ class contained within the
AprilTagInterface.cpp/.hpp source files in the project-master directory. It handles
the complete integration of the AprilTag C++ library and exposes two methods for developers:
processFrame() and getFrame(). When called, processFrame() captures a single image
frame from the video device, uses the AprilTag library functionality to extract tag information
from the image, and stores it in a private member variable. This member variable can be
accessed with getFrame(), which returns a struct containing the relevant tag information,
including tag ID, x-position, y-position, z-position, distance, and time of extraction. Figure 15
shows the input/output information of the subsystem.
3.3 Control Algorithm
Figure 16: Control Algorithm Block Diagram
The Control Algorithm is implemented in the Python script quad.py within the
project-master/python-scripts directory in the repository. The C++ executable quad
should also be running in order for this to function properly. The script utilizes the
DroneKit-Python API to perform a specific set of tasks. Figure 16 shows the input/output
information of the subsystem and Figure 17 shows the general flow of the software. A more
technical description of the tasks is as follows:
1. Establish a MAVLink connection to the onboard autopilot through the /dev/ttyS0
serial device. This device corresponds to the General Purpose Input-Output (GPIO) pins
on the embedded device that are connected to the autopilot.
2. Arm the quadcopter and take off to a target altitude using the arm_and_takeoff()
function as defined in the source file.
3. Move to a target location as specified using the goto() function as defined in the source
file. This is expressed in terms of a two-dimensional distance vector relative to the
starting position of the quadcopter.
4. Request tag information from the Visual Tracking application. The application will
process a frame and respond with the tag information. Continue once response is
received.
5. If no tag is detected, return to step 4. If too many detection attempts have been made, the
quadcopter will return to its starting location on mission failure.
6. If the position data of the detected tag is within the specified tolerance, the quadcopter
will land to complete the mission. Otherwise, adjustments will be made and the algorithm
will continue to the next step.
7. Move to the relative location of the detected AprilTag using a series of velocity-based move
commands with the move_offset_velocity() function as defined in the source file.
This is expressed in terms of a direction, speed, and time. Each axis is adjusted using
separate commands. Return to step 4 to continue.
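The loop in steps 4 through 7 can be sketched as follows; get_tag, move_offset_velocity, land, and return_home are hypothetical stand-ins for the DroneKit-backed functions in quad.py, so the names and signatures are assumptions for illustration only:

```python
def centering_loop(get_tag, move_offset_velocity, land, return_home,
                   tolerance=0.5, max_attempts=10):
    """Repeatedly request tag data, nudge toward the tag, and land once the
    tag is within tolerance; abort to the starting point after too many tries.
    get_tag() returns an (x, y) offset in meters, or None if no tag is seen."""
    for _ in range(max_attempts):
        tag = get_tag()                     # step 4: request tag information
        if tag is None:
            continue                        # step 5: no detection, try again
        x, y = tag
        if abs(x) <= tolerance and abs(y) <= tolerance:
            land()                          # step 6: centered, land
            return True
        move_offset_velocity(x, y)          # step 7: adjust, then re-check
    return_home()                           # step 5: too many attempts, abort
    return False
```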
Figure 17: Control Algorithm Flow Chart
3.4 MAVLink Command Encoder
Figure 18: MAVLink Command Encoder Block Diagram
MAVLink Command Encoding is handled by the DroneKit-Python API. Using the
exposed API methods, it can interpret desired MAVLink commands, package them into
messages, and send them over the specified serial device. Figure 18 shows the input/output
information of the subsystem.
3.5 Interprocess Communication
Figure 19: Unix Pipeline Diagram
Since the Vision Detection and Control Algorithm subsystems run in concurrent
processes and are written in different programming languages, a form of Interprocess
Communication (IPC) was utilized to allow these subsystems to communicate with each other.
This is done through the use of Unix Pipelines in a Client-Server model, as illustrated in Figure
19. The Control Algorithm is the Client and will send request strings through the pipeline. The
Vision Detection, which is the Server, will listen on the pipeline for these requests and handle
them appropriately. The pipelines are created at the start of the Client application’s execution.
There are two requests implemented in this project: CHECK and GET. When the CHECK message
is received by the server, it will simply respond with the string ACK, acknowledging the receipt
of the message. This is a way of verifying that the IPC is functional and is used at the beginning
of the mission as a part of the initialization process. GET is a message that is sent when the
Control Algorithm is in need of AprilTag detection information. When received, the Vision
Detection subsystem will capture and process a single frame for AprilTag data and send the
information back as a response in the form of a comma separated list containing the Tag ID,
Hamming Distance, position relative to the camera measured in meters, and the time it took to
process the data. The exact format of this message can be found in the findTag() function in
quad.py and guided_test.py. If no detection is made, an empty string is returned. This
comma-separated list is then parsed and utilized by the Control Algorithm.
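A minimal sketch of the CHECK/ACK exchange over two named pipes, with both endpoints written in Python for brevity (in the actual system the server side is the C++ FifoComm code); the pipe paths here are illustrative rather than the /tmp/ paths used in the project:

```python
import os
import tempfile
import threading

def serve_once(req_path, resp_path):
    """Server: read one request from req_path; reply ACK to a CHECK request."""
    with open(req_path, "r") as req:
        message = req.read().strip()
    with open(resp_path, "w") as resp:
        resp.write("ACK" if message == "CHECK" else "ERR")

def check_connection(req_path, resp_path):
    """Client: send CHECK and return True if the server answers ACK."""
    with open(req_path, "w") as req:
        req.write("CHECK")
    with open(resp_path, "r") as resp:
        return resp.read().strip() == "ACK"

def demo():
    """Create the two pipes, run the server in a thread, do one exchange."""
    tmp = tempfile.mkdtemp()
    req_path = os.path.join(tmp, "to_server")    # client -> server
    resp_path = os.path.join(tmp, "to_client")   # server -> client
    os.mkfifo(req_path)
    os.mkfifo(resp_path)
    server = threading.Thread(target=serve_once, args=(req_path, resp_path))
    server.start()
    ok = check_connection(req_path, resp_path)
    server.join()
    return ok
```

Opening a FIFO blocks until the other end is opened, which is why the server must already be listening before the client sends its request, mirroring the launch-order requirement described in Section 3.1.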
4. Results
4.1 AprilTag Detector
There are two parameters that we can control in the AprilTag system: the resolution at
which the video feed is recorded and the AprilTag family used. Three different families were
tested to find the most reliable and most efficient. The 36h11 family was not reliable at
calculated distances of around 5 to 8 meters at 320x240 resolution and was more difficult to
detect at skews over 30 degrees. This can be seen in Figure 20. The 16h5 family was very
reliable at the necessary detection distances, even at extreme skews, but false and incorrect
detections were reported due to its very simple patterns. The 25h9 family was selected as our
target family because it retained an acceptable reliable detection distance of 5 to 8 meters and a
skew of 70 degrees without reporting false or incorrect detections. This can be seen in Figure 21.
Figure 20: Family 36h11 Showing Failed Detections While Skewed
Figure 21: Family 25h9 Showing Reliable Detections While Significantly Skewed
Figure 22: Examples of AprilTag Detection Delay (Left: 320x240, Right: 640x480)
Two different video capture resolutions were tested: 320x240 and 640x480. At 320x240,
the processing delay was well under 1 second, with reliable detection out to a maximum distance
of approximately 5 meters. At 640x480, the delay grew to just under 2 seconds, but detection
extended to a much larger distance of roughly 10 meters. These delays are illustrated in Figure 22.
The larger 640x480 resolution was ultimately selected after detection issues were encountered at
320x240 during debugging.
4.2 Simpler Mission
The ability to autonomously move the quadcopter was demonstrated with the
guided_test.py Python script. This script uses the goto() function to arm the throttle, take
off to a height of 5 meters, and navigate to a position 8 meters North and 5 meters West of
the starting position. Once this is complete, the quadcopter is set to Land mode and lands
at its current location, adjusting for any drift that may occur. The execution is
illustrated in Figure 23.
Figure 23: Quadcopter Trajectory during the Simpler Mission
(Red dot = Launch Location)
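The goto(8, -5) call relies on the get_location_metres() helper (listed in the Appendix) to turn meter offsets into GPS coordinates. A rough, self-contained sketch of that conversion is shown below; offset_location and the example launch coordinates are illustrative, not from the project code.

```python
import math

def offset_location(lat, lon, d_north, d_east):
    # Small-offset approximation on a spherical Earth, as in get_location_metres():
    # convert meter offsets to radians, then radians to decimal degrees.
    earth_radius = 6378137.0
    d_lat = d_north / earth_radius
    d_lon = d_east / (earth_radius * math.cos(math.radians(lat)))
    return lat + math.degrees(d_lat), lon + math.degrees(d_lon)

# 8 m North and 5 m West (i.e. -5 m East) of a hypothetical launch point
new_lat, new_lon = offset_location(40.6988, -89.6160, 8, -5)
```

The approximation degrades over long distances and near the poles, which is acceptable here because mission offsets are only a few meters.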
4.3 Interprocess Communication
There are multiple executables and scripts for debugging IPC. The C++ executable fifo,
defined in fifo.cpp, provides a simple test of IPC functionality: it listens for CHECK
on the incoming pipe and responds with ACK on the outgoing pipe. Likewise,
pipe_example.py implements the Python side of this logic, sending CHECK and listening for
ACK on the corresponding pipes. The receipt of GET is also tested in this example; however, no
tag information is processed. Instead, SEND is returned as a response to verify that GET has been
received. The results of running the two executables concurrently demonstrate functional IPC
and are illustrated in Figure 24.
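A minimal self-contained sketch of the same CHECK/ACK handshake is shown below, with the server run in a thread so both ends fit in one script. The pipe paths match those used in the project scripts; everything else here is illustrative rather than project code.

```python
import errno
import os
import threading

FIFO_S2C = "/tmp/server_to_client_fifo"   # server -> client (responses)
FIFO_C2S = "/tmp/client_to_server_fifo"   # client -> server (requests)

# Create the named pipes if they do not already exist
for path in (FIFO_S2C, FIFO_C2S):
    try:
        os.mkfifo(path)
    except OSError as oe:
        if oe.errno != errno.EEXIST:
            raise

def server():
    # Listen for one request and acknowledge it, as fifo.cpp does.
    with open(FIFO_C2S) as rx:
        msg = rx.read().strip()
    with open(FIFO_S2C, "w") as tx:
        tx.write("ACK\n" if msg == "CHECK" else "NONE\n")

t = threading.Thread(target=server)
t.start()

with open(FIFO_C2S, "w") as tx:   # client sends the CHECK request
    tx.write("CHECK\n")
with open(FIFO_S2C) as rx:        # ...and waits for the acknowledgement
    reply = rx.read().strip()
t.join()

# Clean up the pipes created above
for path in (FIFO_S2C, FIFO_C2S):
    os.remove(path)

print(reply)
```

Opening a FIFO blocks until both a reader and a writer are present, which is why the server must already be running when the client starts, exactly as in the full mission.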
Figure 24: Example of Named Pipe Communication
4.4 Objective Mission
The full mission executes the entirety of the Control Algorithm as specified in Section
3.3. To run it, first start the Vision Detection process by executing the C++ application
quad, then execute the quad.py script to start the Control Algorithm and carry out the mission.
The quadcopter takes off to an altitude of 5 meters, flies to a location 8 meters
North and 5 meters West of its starting position, centers itself over the AprilTag within a
tolerance of 0.25 meters, and then lands near the target. An example of the quadcopter
successfully completing the objective mission can be seen in Figure 25.
Figure 25: An Example of a Successful Objective Mission Attempt
5. Conclusion and Future Work
In this project, a vision based autonomous control system for a quadcopter was designed.
The quadcopter successfully completed the mission plan autonomously under the guidance of
vision input. Each subsystem was also individually tested and validated, so the system should be
portable to any quadcopter platform that uses the Pixhawk. That said, to improve performance
and reliability we recommend obtaining a new quadcopter platform with a higher thrust-to-weight
ratio. This would open up the possibility of more complex missions, because the quadcopter
could fly longer, carry a heavier payload, and resist wind more effectively.
Now that the groundwork has been laid, we believe other groups could take the
foundation built in this project and extend it to complete more complex tasks. Examples of
project extensions are listed below.
● Implementation of search algorithms to cover a larger viewing area.
● Implementation of obstacle avoidance algorithms to operate in more complex
environments (extra sensors would be required).
● Implementation of a multiagent configuration for swarm intelligence to more efficiently
search for a target.
Appendix
AprilTagInterface.cpp
/*
* AprilTagInterface.cpp
*
* Created on: Dec 22, 2016
* Author: test
*/
#include "AprilTagInterface.hpp"
// utility function to provide current system time (used below in
// determining frame rate at which images are being processed)
double tic() {
    struct timeval t;
    gettimeofday(&t, NULL);
    return ((double)t.tv_sec + ((double)t.tv_usec)/1000000.);
}
AprilTagInterface::AprilTagInterface() :
    m_tagDetector(NULL),
    m_tagCodes(AprilTags::tagCodes36h11),
    m_timing(true),
    m_width(640), m_height(480),
    m_tagSize(0.166),
    m_fx(600), m_fy(600),
    m_px(m_width/2), m_py(m_height/2),
#if EXPOSURE_CONTROL==1
    m_exposure(-1), m_gain(-1), m_brightness(-1),
#endif
    m_deviceId(0)
{
    m_tagDetector = new AprilTags::TagDetector(m_tagCodes);

#if EXPOSURE_CONTROL==1
    // manually setting camera exposure settings; OpenCV/v4l1 doesn't
    // support exposure control; so here we manually use v4l2 before
    // opening the device via OpenCV; confirmed to work with Logitech
    // C270; try exposure=20, gain=100, brightness=150
    std::string video_str = "/dev/video0";
    video_str[10] = '0' + m_deviceId;
    int device = v4l2_open(video_str.c_str(), O_RDWR | O_NONBLOCK);
    if (m_exposure >= 0) {
        // not sure why, but v4l2_set_control() does not work for
        // V4L2_CID_EXPOSURE_AUTO...
        struct v4l2_control c;
        c.id = V4L2_CID_EXPOSURE_AUTO;
        c.value = 1; // 1=manual, 3=auto; V4L2_EXPOSURE_AUTO fails...
        if (v4l2_ioctl(device, VIDIOC_S_CTRL, &c) != 0) {
            std::cout
    // detect April tags (requires a gray scale image)
    cv::cvtColor(image, image_gray, CV_BGR2GRAY);
    double t0;
    if (m_timing) {
        t0 = tic();
    }
    std::vector<AprilTags::TagDetection> detections = m_tagDetector->extractTags(image_gray);
    if (m_timing) {
        double dt = tic()-t0;
        std::cout
AprilTagInterface.hpp
/*
* AprilTagInterface.hpp
*
* Created on: Dec 22, 2016
* Author: test
*/
#ifndef APRILTAGINTERFACE_HPP_
#define APRILTAGINTERFACE_HPP_

#define EXPOSURE_CONTROL 1
#define ORIENTATION 1
#include #include #include
#include #include #include
#if EXPOSURE_CONTROL==1 //exposure control
#include #include #include #include #endif
// OpenCV library for easy access to USB camera and drawing of images
// on screen
#include "opencv2/opencv.hpp"
#include "TagDetector.h"
#include "Tag16h5.h"
#include "Tag25h7.h"
#include "Tag25h9.h"
#include "Tag36h9.h"
#include "Tag36h11.h"

struct detectionInfo {
    int id;
    double hamming;
    double distance;
    double pos_x;
    double pos_y;
    double pos_z;
    double time;
};

typedef std::vector<detectionInfo> tagFrame;
class AprilTagInterface {
AprilTags::TagDetector* m_tagDetector;
AprilTags::TagCodes m_tagCodes;
bool m_timing; // print timing information for each tag extraction call
    int m_width;      // image size in pixels
    int m_height;
    double m_tagSize; // April tag side length in meters of square black frame
    double m_fx;      // camera focal length in pixels
    double m_fy;
    double m_px;      // camera principal point
    double m_py;
int m_deviceId; // camera id (in case of multiple cameras)
#if EXPOSURE_CONTROL==1
    int m_exposure;
    int m_gain;
    int m_brightness;
#endif
std::list m_imgNames;
cv::VideoCapture m_cap;
tagFrame m_frame;
public:
    AprilTagInterface();
    virtual ~AprilTagInterface();
    void processFrame();
    inline tagFrame getFrame() { return m_frame; };
};
#endif /* APRILTAGINTERFACE_HPP_ */
DetectionTest.cpp
#include "AprilTagInterface.hpp"
using namespace std;
int main() {
AprilTagInterface april;
while (true) {
april.processFrame();
tagFrame frame = april.getFrame();
if (!frame.empty()) {
for( tagFrame::iterator it = frame.begin(); it != frame.end(); it++) {
cout
}
FifoComm.cpp
#include "FifoComm.hpp"
void sendMessage(std::string str) {
std::cout
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
© Copyright 2015-2016, 3D Robotics.
guided_set_speed_yaw.py: (Copter Only)
This example shows how to move/direct Copter and send commands in GUIDED mode using DroneKit Python.
Example documentation: http://python.dronekit.io/examples/guided-set-speed-yaw-demo.html
"""
from dronekit import connect, VehicleMode, LocationGlobal, LocationGlobalRelative
from pymavlink import mavutil  # Needed for command message definitions
import time
import math
import os
import errno
import re

connection_string = "/dev/ttyS0"
FIFO_1 = "/tmp/server_to_client_fifo"
FIFO_2 = "/tmp/client_to_server_fifo"

try:
    os.mkfifo(FIFO_1)
    os.mkfifo(FIFO_2)
except OSError as oe:
    if oe.errno != errno.EEXIST:
        raise
# Connect to the Vehicle
print 'Connecting to vehicle on: %s' % connection_string
vehicle = connect(connection_string, wait_ready=True, baud=57600)

# Create and open log file
global outFile
outFile = open('location.txt', 'w+')

def print_loc():
    currentLocationStr = str(vehicle.location.global_relative_frame.lat) + " " + \
                         str(vehicle.location.global_relative_frame.lon) + " " + \
                         str(vehicle.location.global_relative_frame.alt) + "\n"
    outFile.write(currentLocationStr)
    return
def arm_and_takeoff(aTargetAltitude):
    """
    Arms vehicle and fly to aTargetAltitude.
    """
    print "Basic pre-arm checks"
    # Don't let the user try to arm until autopilot is ready
    while not vehicle.is_armable:
        print " Waiting for vehicle to initialise..."
        time.sleep(1)

    print "Arming motors"
    # Copter should arm in GUIDED mode
    vehicle.mode = VehicleMode("GUIDED")
    vehicle.armed = True

    while not vehicle.armed:
        print " Waiting for arming..."
        time.sleep(1)

    print "Taking off!"
    vehicle.simple_takeoff(aTargetAltitude)  # Take off to target altitude

    # Wait until the vehicle reaches a safe height before processing the goto (otherwise the
    # command after Vehicle.simple_takeoff will execute immediately).
    while True:
        print " Altitude: ", vehicle.location.global_relative_frame.alt
        print_loc()
        if vehicle.location.global_relative_frame.alt >= aTargetAltitude*0.95:  # Trigger just below target alt.
            print "Reached target altitude"
            break
        time.sleep(0.25)
"""
Functions to make it easy to convert between the different frames-of-reference. In particular these
make it easy to navigate in terms of "metres from the current position" when using commands that take
absolute positions in decimal degrees.
The methods are approximations only, and may be less accurate over longer distances, and when close
to the Earth's poles.
Specifically, it provides:
* get_location_metres - Get LocationGlobal (decimal degrees) at distance (m) North & East of a given
LocationGlobal.
* get_distance_metres - Get the distance between two LocationGlobal objects in metres
* get_bearing - Get the bearing in degrees to a LocationGlobal
"""
def get_location_metres(original_location, dNorth, dEast):
    """
    Returns a LocationGlobal object containing the latitude/longitude `dNorth` and `dEast` metres
    from the specified `original_location`. The returned LocationGlobal has the same `alt` value
    as `original_location`.

    The function is useful when you want to move the vehicle around specifying locations relative
    to the current vehicle position.

    The algorithm is relatively accurate over small distances (10m within 1km) except close to the
    poles.

    For more information see:
    http://gis.stackexchange.com/questions/2951/algorithm-for-offsetting-a-latitude-longitude-by-some-amount-of-meters
    """
    earth_radius = 6378137.0  # Radius of "spherical" earth
    # Coordinate offsets in radians
    dLat = dNorth/earth_radius
    dLon = dEast/(earth_radius*math.cos(math.pi*original_location.lat/180))

    # New position in decimal degrees
    newlat = original_location.lat + (dLat * 180/math.pi)
    newlon = original_location.lon + (dLon * 180/math.pi)
    if type(original_location) is LocationGlobal:
        targetlocation = LocationGlobal(newlat, newlon, original_location.alt)
    elif type(original_location) is LocationGlobalRelative:
        targetlocation = LocationGlobalRelative(newlat, newlon, original_location.alt)
    else:
        raise Exception("Invalid Location object passed")

    return targetlocation
def get_distance_metres(aLocation1, aLocation2):
    """
    Returns the ground distance in metres between two LocationGlobal objects.

    This method is an approximation, and will not be accurate over large distances and close to the
    earth's poles. It comes from the ArduPilot test code:
    https://github.com/diydrones/ardupilot/blob/master/Tools/autotest/common.py
    """
    dlat = aLocation2.lat - aLocation1.lat
    dlong = aLocation2.lon - aLocation1.lon
    return math.sqrt((dlat*dlat) + (dlong*dlong)) * 1.113195e5
def get_bearing(aLocation1, aLocation2):
    """
    Returns the bearing between the two LocationGlobal objects passed as parameters.

    This method is an approximation, and may not be accurate over large distances and close to the
    earth's poles. It comes from the ArduPilot test code:
    https://github.com/diydrones/ardupilot/blob/master/Tools/autotest/common.py
    """
    off_x = aLocation2.lon - aLocation1.lon
    off_y = aLocation2.lat - aLocation1.lat
    bearing = 90.00 + math.atan2(-off_y, off_x) * 57.2957795
    if bearing < 0:
        bearing += 360.00
    return bearing
def goto(dNorth, dEast, gotoFunction=vehicle.simple_goto):
    """
    Moves the vehicle to a position dNorth metres North and dEast metres East of the current
    position.

    The method takes a function pointer argument with a single `dronekit.lib.LocationGlobal`
    parameter for the target position. This allows it to be called with different position-setting
    commands. By default it uses the standard method: dronekit.lib.Vehicle.simple_goto().

    The method reports the distance to target every two seconds.
    """
    currentLocation = vehicle.location.global_relative_frame
    targetLocation = get_location_metres(currentLocation, dNorth, dEast)
    targetDistance = get_distance_metres(currentLocation, targetLocation)
    gotoFunction(targetLocation)

    #print "DEBUG: targetLocation: %s" % targetLocation
    #print "DEBUG: targetLocation: %s" % targetDistance

    while vehicle.mode.name == "GUIDED":  # Stop action if we are no longer in guided mode.
        #print "DEBUG: mode: %s" % vehicle.mode.name
        print_loc()
        remainingDistance = get_distance_metres(vehicle.location.global_relative_frame, targetLocation)
        print "Distance to target: ", remainingDistance
        if remainingDistance
            print "Getting tag data."
            if ((time.clock() - startTime) >= TIMEOUT):
                print "IPC timeout."
                return ""
            dataStr = fifo.read()
            if len(dataStr) == 0:
                break
            else:
                dataStr.replace('\n', '')
                if ((dataStr != "ACK\n")):
                    sys.exit("IPC Failure. Exiting.")
def getTagInfo():
    # TAG FORMAT IS LIST AS SUCH:
    # [int id,double hamming,double distance,double pos_x,double pos_y,double pos_z,double time]
    TIMEOUT = 3  # seconds
    os.system("echo GET > " + FIFO_2)
    startTime = time.clock()
    with open(FIFO_1) as fifo:
        print("FIFO opened")
        while True:
            print "Getting tag data."
            if ((time.clock() - startTime) >= TIMEOUT):
                print "IPC timeout."
                return ""
            dataStr = fifo.read()
            if len(dataStr) == 0:
                break
            else:
                dataStr.replace('\n', '')
                if ((dataStr != "NONE\n") and (dataStr.find(',') != -1)):
                    print "FIFO got: " + dataStr
                    return dataStr.split(",")
                else:
                    print "FIFO got: " + dataStr
    print "No Detection"
    return ""
def findTag(TARGET_ID):
    findAttempts = 0
    MAX_FIND_ATTEMPTS = 5
    while (True):
        # TAG FORMAT IS LIST AS SUCH:
        # [int id,double hamming,double distance,double pos_x,double pos_y,double pos_z,double time]
        info = getTagInfo()
        if (info):
            id = int(info[0])
            pos_x = float(info[3])  # Up/Down (Away from camera)
            pos_y = float(info[4])  # Forward/Back?
            pos_z = float(info[5])  # Left/Right?
            if (id == TARGET_ID):
                print "Found tag " + str(id) + " at " + str(pos_x) + "," + str(pos_y) + "," + str(pos_z)
                return [pos_x, pos_y, pos_z]
        else:
            findAttempts += 1
            if (findAttempts > MAX_FIND_ATTEMPTS):
                print "Could not find tag."
                return []
            else:
                print "Reattempting image find."
def centerQuad(tagInfo):
    pos_x = tagInfo[0]
    pos_y = tagInfo[1]  # Left(+)/Right(-)
    pos_z = tagInfo[2]  # Forward(+)/Backward(-)

    centeredFB = False
    centeredLR = False

    if (pos_y > 0.5):
        print "Centering L/R... (Left of target)"
        send_offset_velocity(0, -0.5, 0, abs(pos_y)*2)  # Move Right
        send_offset_velocity(0, 0, 0, 1)
    elif (pos_y < -0.5):
        print "Centering L/R... (Right of target)"
        send_offset_velocity(0, 0.5, 0, abs(pos_y)*2)  # Move Left
        send_offset_velocity(0, 0, 0, 1)
    else:
        print "Centered to L/R!"
        centeredLR = True

    if (pos_z > 0.5):
        print "Centering F/B... (Forward of target)"
        send_offset_velocity(-0.5, 0, 0, abs(pos_z)*2)  # Move Backward
        send_offset_velocity(0, 0, 0, 1)
    elif (pos_z < -0.5):
        print "Centering F/B... (Behind target)"
        send_offset_velocity(0.5, 0, 0, abs(pos_z)*2)  # Move Forward
        send_offset_velocity(0, 0, 0, 1)
    else:
        print "Centered to F/B!"
        centeredFB = True

    return centeredFB and centeredLR
# HERE WE DO STUFF
checkFifo()

print("Set groundspeed to 5m/s.")
vehicle.groundspeed = 5

# Arm and take off to altitude of 5 meters
arm_and_takeoff(5)

print("Position North 8 West 5")
goto(8, -5)

positionAttempts = 0
MAX_POSITION_ATTEMPTS = 5
TARGET_ID = 0

tagInfo = findTag(TARGET_ID)
while (True):
    tagInfo = findTag(TARGET_ID)
    if (tagInfo):
        print_loc()
        atTag = centerQuad(tagInfo)
        if (atTag):
            print "Successfully centered over target. Landing..."
            break
        else:
            positionAttempts += 1
            if (positionAttempts > MAX_POSITION_ATTEMPTS):
                print "Could not center over tag. Landing..."
                break
    else:
        print "Lost tag. Landing..."
        break

print("Setting LAND mode...")
vehicle.mode = VehicleMode("LAND")
for i in range(80):
    print_loc()
    time.sleep(0.25)

# Close vehicle object before exiting script
print "Close vehicle object"
vehicle.close()
outFile.close()
print("Completed")
pipe_example.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
© Copyright 2015-2016, 3D Robotics.
guided_set_speed_yaw.py: (Copter Only)
This example shows how to move/direct Copter and send commands in GUIDED mode using DroneKit Python.
Example documentation: http://python.dronekit.io/examples/guided-set-speed-yaw-demo.html
"""
from dronekit import connect, VehicleMode, LocationGlobal, LocationGlobalRelative from pymavlink import mavutil # Needed for command message definitions import time import math import os
import errno import re
connection_string = "/dev/ttyS0" FIFO_1 = "/tmp/server_to_client_fifo" FIFO_2 = "/tmp/client_to_server_fifo"
try: os.mkfifo(FIFO_1)
os.mkfifo(FIFO_2)
except OSError as oe: if oe.errno != errno.EEXIST: raise
# Connect to the Vehicle
print 'Connecting to vehicle on: %s' % connection_string vehicle = connect(connection_string, wait_ready=True, baud=57600) # Create and open log file
global outFile outFile = open('location.txt', 'w+')
def print_loc(): currentLocationStr = str(vehicle.location.global_relative_frame.lat) + " " + \ str(vehicle.location.global_relative_frame.lon) + " " + \ str(vehicle.location.global_relative_frame.alt) + "\n" outFile.write(currentLocationStr);
return
def arm_and_takeoff(aTargetAltitude): """ Arms vehicle and fly to aTargetAltitude.
"""
print "Basic pre-arm checks" # Don't let the user try to arm until autopilot is ready while not vehicle.is_armable: print " Waiting for vehicle to initialise..." time.sleep(1)
print "Arming motors" # Copter should arm in GUIDED mode vehicle.mode = VehicleMode("GUIDED") vehicle.armed = True
while not vehicle.armed: print " Waiting for arming..." time.sleep(1)
print "Taking off!" vehicle.simple_takeoff(aTargetAltitude) # Take off to target altitude
# Wait until the vehicle reaches a safe height before processing the goto (otherwise the command # after Vehicle.simple_takeoff will execute immediately). while True:
print " Altitude: ", vehicle.location.global_relative_frame.alt print_loc();
if vehicle.location.global_relative_frame.alt>=aTargetAltitude*0.95: #Trigger just below target alt.
print "Reached target altitude" break time.sleep(0.25)
"""
Functions to make it easy to convert between the different frames-of-reference. In particular these
make it easy to navigate in terms of "metres from the current position" when using commands that take
absolute positions in decimal degrees.
The methods are approximations only, and may be less accurate over longer distances, and when close
to the Earth's poles.
Specifically, it provides:
* get_location_metres - Get LocationGlobal (decimal degrees) at distance (m) North & East of a given
LocationGlobal.
* get_distance_metres - Get the distance between two LocationGlobal objects in metres
* get_bearing - Get the bearing in degrees to a LocationGlobal
"""
def get_location_metres(original_location, dNorth, dEast): """ Returns a LocationGlobal object containing the latitude/longitude `dNorth` and `dEast` metres from
the
specified `original_location`. The returned LocationGlobal has the same `alt` value
as `original_location`.
The function is useful when you want to move the vehicle around specifying locations relative to
the current vehicle position.
The algorithm is relatively accurate over small distances (10m within 1km) except close to the
poles.
For more information see:
http://gis.stackexchange.com/questions/2951/algorithm-for-offsetting-a-latitude-longitude-by-some-amount
-of-meters
"""
earth_radius = 6378137.0 #Radius of "spherical" earth #Coordinate offsets in radians dLat = dNorth/earth_radius
dLon = dEast/(earth_radius*math.cos(math.pi*original_location.lat/180))
#New position in decimal degrees newlat = original_location.lat + (dLat * 180/math.pi) newlon = original_location.lon + (dLon * 180/math.pi) if type(original_location) is LocationGlobal: targetlocation=LocationGlobal(newlat, newlon,original_location.alt)
elif type(original_location) is LocationGlobalRelative: targetlocation=LocationGlobalRelative(newlat, newlon,original_location.alt)
else: raise Exception("Invalid Location object passed")
return targetlocation;
def get_distance_metres(aLocation1, aLocation2): """ Returns the ground distance in metres between two LocationGlobal objects.
This method is an approximation, and will not be accurate over large distances and close to the
earth's poles. It comes from the ArduPilot test code:
https://github.com/diydrones/ardupilot/blob/master/Tools/autotest/common.py
"""
dlat = aLocation2.lat - aLocation1.lat
dlong = aLocation2.lon - aLocation1.lon
return math.sqrt((dlat*dlat) + (dlong*dlong)) * 1.113195e5
def get_bearing(aLocation1, aLocation2): """ Returns the bearing between the two LocationGlobal objects passed as parameters.
This method is an approximation, and may not be accurate over large distances and close to the
earth's poles. It comes from the ArduPilot test code:
https://github.com/diydrones/ardupilot/blob/master/Tools/autotest/common.py
"""
off_x = aLocation2.lon - aLocation1.lon
off_y = aLocation2.lat - aLocation1.lat
bearing = 90.00 + math.atan2(-off_y, off_x) * 57.2957795 if bearing < 0: bearing += 360.00 return bearing;
def goto(dNorth, dEast, gotoFunction=vehicle.simple_goto): """ Moves the vehicle to a position dNorth metres North and dEast metres East of the current position.
The method takes a function pointer argument with a single `dronekit.lib.LocationGlobal` parameter
for
the target position. This allows it to be called with different position-setting commands.
By default it uses the standard method: dronekit.lib.Vehicle.simple_goto().
The method reports the distance to target every two seconds.
"""
currentLocation = vehicle.location.global_relative_frame
targetLocation = get_location_metres(currentLocation, dNorth, dEast)
targetDistance = get_distance_metres(currentLocation, targetLocation)
gotoFunction(targetLocation)
#print "DEBUG: targetLocation: %s" % targetLocation #print "DEBUG: targetLocation: %s" % targetDistance
while vehicle.mode.name=="GUIDED": #Stop action if we are no longer in guided mode. #print "DEBUG: mode: %s" % vehicle.mode.name print_loc();
remainingDistance=get_distance_metres(vehicle.location.global_relative_frame, targetLocation)
print "Distance to target: ", remainingDistance
if remainingDistance= TIMEOUT): print "IPC timeout." return "" dataStr = fifo.read()
if len(dataStr) == 0: break else: dataStr.replace('\n', ''); if ((dataStr != "ACK\n")): sys.exit("IPC Failure. Exiting.")
def getTagInfo(): #TAG FORMAT IS LIST AS SUCH: #[int id,double hamming,double distance,double pos_x,double pos_y,double pos_z,double time] TIMEOUT = 3 # seconds os.system("echo GET > " + FIFO_2)
startTime = time.clock();
with open(FIFO_1) as fifo: print("FIFO opened") while True: print "Getting tag data." if ((time.clock() - startTime) >= TIMEOUT): print "IPC timeout." return "" dataStr = fifo.read()
if len(dataStr) == 0: break else: dataStr.replace('\n', ''); if ((dataStr != "NONE\n") and (dataStr.find(',')!=-1)): print "FIFO got: " + dataStr return dataStr.split(",") else: print "FIFO got: " + dataStr print "No Detection" return ""
def findTag(TARGET_ID): findAttempts = 0 MAX_FIND_ATTEMPTS = 5 while (True): #TAG FORMAT IS LIST AS SUCH: #[int id,double hamming,double distance,double pos_x,double pos_y,double pos_z,double time] info = getTagInfo()
if (info): id = int(info[0]) pos_x = float(info[3]) # Up/Down (Away from camera) pos_y = float(info[4]) # Forward/Back? pos_z = float(info[5]) # Left/Right? if (id == TARGET_ID): print "Found tag " + str(id) + " at " + str(pos_x) + "," + str(pos_y) + "," + str(pos_z)
return [pos_x,pos_y,pos_z] else: findAttempts += 1 if(findAttempts > MAX_FIND_ATTEMPTS): print "Could not find tag." return [] else: print "Reattempting image find."
def centerQuad(tagInfo): pos_x = tagInfo[0] pos_y = tagInfo[1] #Left(+)/Right(-) pos_z = tagInfo[2] #Forward(+)/Backward(-)
centeredFB = False centeredLR = False
if (pos_y > 0.5):
print "Centering L/R... (Left of target)" send_offset_velocity(0,-0.5,0,abs(pos_y)*2) #Move Right send_offset_velocity(0,0,0,1) elif (pos_y < -0.5): print "Centering L/R... (Right of target)" send_offset_velocity(0,0.5,0,abs(pos_y)*2) #Move Left send_offset_velocity(0,0,0,1) else: print "Centered to L/R!" centeredLR = True
if (pos_z > 0.5): print "Centering F/B... (Forward of target)" send_offset_velocity(-0.5,0,0,abs(pos_z)*2) #Move Backward send_offset_velocity(0,0,0,1) elif (pos_z < -0.5): print "Centering F/B... (Behind target)" send_offset_velocity(0.5,0,0,abs(pos_z)*2) #Move Forward send_offset_velocity(0,0,0,1) else: print "Centered to F/B!" centeredFB = True
return centeredFB and centeredLR
# HERE WE DO STUFF
checkFifo();
print("Set groundspeed to 5m/s.") vehicle.groundspeed=5
#Arm and take of to altitude of 5 meters
arm_and_takeoff(5)
print("Position North 8 West 5") goto(8, -5)
positionAttempts = 0 MAX_POSITION_ATTEMPTS = 5 TARGET_ID = 0;
tagInfo = findTag(TARGET_ID);
while(True): tagInfo = findTag(TARGET_ID);
if(tagInfo): print_loc();
atTag = centerQuad(tagInfo);
if(atTag): print "Successfully centered over target. Landing..." break else: positionAttempts += 1 if(positionAttempts > MAX_POSITION_ATTEMPTS): print "Cound not center over tag. Landing..." break
else: print "Lost tag. Landing..." break
print("Setting LAND mode...") vehicle.mode = VehicleMode("LAND")
for i in range(80): print_loc();
time.sleep(0.25);
#Close vehicle object before exiting script
print "Close vehicle object" vehicle.close()
outFile.close()
print("Completed")
quad.cpp
#include "AprilTagInterface.hpp"
#include "FifoComm.hpp"
#include
#include

// quad.py needs to be running in order for this to work properly
using namespace std;
int main() {
AprilTagInterface april;
while(true) {
string msg = receiveMessage(); cout
                sendMessage("NONE");
            }
        }
        else if (msg == "CHECK") {
            sendMessage("ACK");
        }
    }
}
quad.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
© Copyright 2015-2016, 3D Robotics.
guided_set_speed_yaw.py: (Copter Only)
This example shows how to move/direct Copter and send commands in GUIDED mode using DroneKit Python.
Example documentation: http://python.dronekit.io/examples/guided-set-speed-yaw-demo.html
"""
from dronekit import connect, VehicleMode, LocationGlobal, LocationGlobalRelative from pymavlink import mavutil # Needed for command message definitions import time import math import os import errno import re
connection_string = "/dev/ttyS0" FIFO_1 = "/tmp/server_to_client_fifo" FIFO_2 = "/tmp/client_to_server_fifo"
try: os.mkfifo(FIFO_1)
os.mkfifo(FIFO_2)
except OSError as oe: if oe.errno != errno.EEXIST: raise
# Connect to the Vehicle
print 'Connecting to vehicle on: %s' % connection_string vehicle = connect(connection_string, wait_ready=True, baud=57600) # Create and open log file
global outFile outFile = open('location.txt', 'w+')
def print_loc(): currentLocationStr = str(vehicle.location.global_relative_frame.lat) + " " + \ str(vehicle.location.global_relative_frame.lon) + " " + \
str(vehicle.location.global_relative_frame.alt) + "\n" outFile.write(currentLocationStr);
return
def arm_and_takeoff(aTargetAltitude): """ Arms vehicle and fly to aTargetAltitude.
"""
print "Basic pre-arm checks" # Don't let the user try to arm until autopilot is ready while not vehicle.is_armable: print " Waiting for vehicle to initialise..." time.sleep(1)
print "Arming motors" # Copter should arm in GUIDED mode vehicle.mode = VehicleMode("GUIDED") vehicle.armed = True
while not vehicle.armed: print " Waiting for arming..." time.sleep(1)
print "Taking off!" vehicle.simple_takeoff(aTargetAltitude) # Take off to target altitude
# Wait until the vehicle reaches a safe height before processing the goto (otherwise the command # after Vehicle.simple_takeoff will execute immediately). while True: print " Altitude: ", vehicle.location.global_relative_frame.alt print_loc();
if vehicle.location.global_relative_frame.alt>=aTargetAltitude*0.95: #Trigger just below target alt.
print "Reached target altitude" break time.sleep(0.25)
"""
Functions to make it easy to convert between the different frames-of-reference. In particular these
make it easy to navigate in terms of "metres from the current position" when using commands that take
absolute positions in decimal degrees.
The methods are approximations only, and may be less accurate over longer distances, and when close
to the Earth's poles.
Specifically, it provides:
* get_location_metres - Get LocationGlobal (decimal degrees) at distance (m) North & East of a given
LocationGlobal.
* get_distance_metres - Get the distance between two LocationGlobal objects in metres
* get_bearing - Get the bearing in degrees to a LocationGlobal
"""
def get_location_metres(original_location, dNorth, dEast): """
Returns a LocationGlobal object containing the latitude/longitude `dNorth` and `dEast` metres from
the
specified `original_location`. The returned LocationGlobal has the same `alt` value
as `original_location`.
The function is useful when you want to move the vehicle around specifying locations relative to
the current vehicle position.
The algorithm is relatively accurate over small distances (10m within 1km) except close to the
poles.
For more information see:
http://gis.stackexchange.com/questions/2951/algorithm-for-offsetting-a-latitude-longitude-by-some-amount
-of-meters
"""
earth_radius = 6378137.0 #Radius of "spherical" earth #Coordinate offsets in radians dLat = dNorth/earth_radius
dLon = dEast/(earth_radius*math.cos(math.pi*original_location.lat/180))
#New position in decimal degrees newlat = original_location.lat + (dLat * 180/math.pi) newlon = original_location.lon + (dLon * 180/math.pi) if type(original_location) is LocationGlobal: targetlocation=LocationGlobal(newlat, newlon,original_location.alt)
elif type(original_location) is LocationGlobalRelative: targetlocation=LocationGlobalRelative(newlat, newlon,original_location.alt)
else: raise Exception("Invalid Location object passed")
return targetlocation;
def get_distance_metres(aLocation1, aLocation2): """ Returns the ground distance in metres between two LocationGlobal objects.
This method is an approximation, and will not be accurate over large distances and close to the
earth's poles. It comes from the ArduPilot test code:
https://github.com/diydrones/ardupilot/blob/master/Tools/autotest/common.py
"""
dlat = aLocation2.lat - aLocation1.lat
dlong = aLocation2.lon - aLocation1.lon
return math.sqrt((dlat*dlat) + (dlong*dlong)) * 1.113195e5
def get_bearing(aLocation1, aLocation2):
    """
    Returns the bearing between the two LocationGlobal objects passed as parameters.
    This method is an approximation, and may not be accurate over large distances and
    close to the earth's poles. It comes from the ArduPilot test code:
    https://github.com/diydrones/ardupilot/blob/master/Tools/autotest/common.py
    """
    off_x = aLocation2.lon - aLocation1.lon
    off_y = aLocation2.lat - aLocation1.lat
    bearing = 90.00 + math.atan2(-off_y, off_x) * 57.2957795
    if bearing < 0:
        bearing += 360.00
    return bearing
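The bearing formula maps atan2's counter-clockwise-from-east angle onto a compass heading (clockwise from north, 57.2957795 being degrees per radian). A standalone check: a target due north gives 0 degrees and due east gives 90:

```python
import math

def compass_bearing(lat1, lon1, lat2, lon2):
    """Same approximation as get_bearing, on raw coordinates."""
    off_x = lon2 - lon1
    off_y = lat2 - lat1
    bearing = 90.0 + math.atan2(-off_y, off_x) * 57.2957795  # rad -> deg
    if bearing < 0:
        bearing += 360.0
    return bearing

north = compass_bearing(0, 0, 1, 0)  # target due north
east = compass_bearing(0, 0, 0, 1)   # target due east
```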
def goto(dNorth, dEast, gotoFunction=vehicle.simple_goto):
    """
    Moves the vehicle to a position dNorth metres North and dEast metres East of the
    current position.
    The method takes a function pointer argument with a single `dronekit.lib.LocationGlobal`
    parameter for the target position. This allows it to be called with different
    position-setting commands. By default it uses the standard method:
    dronekit.lib.Vehicle.simple_goto().
    The method reports the distance to target every two seconds.
    """
    currentLocation = vehicle.location.global_relative_frame
    targetLocation = get_location_metres(currentLocation, dNorth, dEast)
    targetDistance = get_distance_metres(currentLocation, targetLocation)
    gotoFunction(targetLocation)
    #print "DEBUG: targetLocation: %s" % targetLocation
    #print "DEBUG: targetDistance: %s" % targetDistance
    while vehicle.mode.name == "GUIDED":  # Stop action if we are no longer in guided mode.
        #print "DEBUG: mode: %s" % vehicle.mode.name
        print_loc()
        remainingDistance = get_distance_metres(vehicle.location.global_relative_frame, targetLocation)
        print "Distance to target: ", remainingDistance
        if remainingDistance <= targetDistance*0.01:  # Just below target, in case of undershoot.
            print "Reached target"
            break
        time.sleep(2)
def send_offset_velocity(velocity_x, velocity_y, velocity_z, duration):
    # NOTE: the end of goto() and the start of this function were cut off at a page
    # break in the transcript; they are restored here following guided_set_speed_yaw.py,
    # the example this script is based on.
    msg = vehicle.message_factory.set_position_target_local_ned_encode(
        0,       # time_boot_ms (not used)
        0, 0,    # target system, target component
        mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED, # frame
        0b0000111111000111, # type_mask (only speeds enabled)
        0, 0, 0, # x, y, z positions (not used)
        velocity_x, velocity_y, velocity_z, # x, y, z velocity in m/s
        0, 0, 0, # x, y, z acceleration (not supported yet, ignored in GCS_Mavlink)
        0, 0)    # yaw, yaw_rate (not supported yet, ignored in GCS_Mavlink)
    # Send the command once per second for the requested duration.
    for _ in range(0, int(duration)):
        vehicle.send_mavlink(msg)
        time.sleep(1)
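The type_mask in the SET_POSITION_TARGET message tells the autopilot which fields to honour: a set bit means "ignore this field". In the mask used above, only the three velocity bits are clear. A quick decode, assuming the standard MAVLink POSITION_TARGET_TYPEMASK bit layout (bits 0-2 position, 3-5 velocity, 6-8 acceleration):

```python
TYPE_MASK = 0b0000111111000111  # the mask passed to the message factory above

VEL_BITS = (3, 4, 5)  # x, y, z velocity fields

# Bits that are set are ignored by the autopilot; clear bits are used.
ignored = [b for b in range(12) if TYPE_MASK & (1 << b)]
used = [b for b in VEL_BITS if not TYPE_MASK & (1 << b)]
```

So position (bits 0-2) and acceleration (bits 6-8) are masked out, and only the velocity setpoints take effect, which is exactly what a velocity-only command wants.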
def checkFifo():
    TIMEOUT = 3  # seconds
    os.system("echo CHECK > " + FIFO_2)
    startTime = time.clock()
    with open(FIFO_1) as fifo:
        print("FIFO opened")
        while True:
            print "Checking IPC link."
            if ((time.clock() - startTime) >= TIMEOUT):
                print "IPC timeout."
                return ""
            dataStr = fifo.read()
            if len(dataStr) == 0:
                break
            else:
                dataStr = dataStr.replace('\n', '')  # str.replace returns a new string
                if (dataStr != "ACK"):
                    sys.exit("IPC Failure. Exiting.")
def getTagInfo():
    # TAG FORMAT IS LIST AS SUCH:
    # [int id,double hamming,double distance,double pos_x,double pos_y,double pos_z,double time]
    TIMEOUT = 3  # seconds
    os.system("echo GET > " + FIFO_2)
    startTime = time.clock()
    with open(FIFO_1) as fifo:
        print("FIFO opened")
        while True:
            print "Getting tag data."
            if ((time.clock() - startTime) >= TIMEOUT):
                print "IPC timeout."
                return ""
            dataStr = fifo.read()
            if len(dataStr) == 0:
                break
            else:
                dataStr = dataStr.replace('\n', '')  # str.replace returns a new string
                if ((dataStr != "NONE") and (dataStr.find(',') != -1)):
                    print "FIFO got: " + dataStr
                    return dataStr.split(",")
                else:
                    print "FIFO got: " + dataStr
                    print "No Detection"
                    return ""
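getTagInfo hands the record back as a list of strings split on commas, leaving type conversion to the caller. A small standalone parser for the record format documented in the comment above (the field names come from that comment, not from the AprilTag library itself):

```python
def parse_tag_record(data_str):
    """Parse 'id,hamming,distance,pos_x,pos_y,pos_z,time' into typed fields."""
    fields = data_str.strip().split(",")
    if len(fields) != 7:
        return None  # malformed or "NONE" record, like getTagInfo's empty return
    return {
        "id": int(fields[0]),
        "hamming": float(fields[1]),
        "distance": float(fields[2]),
        "pos": (float(fields[3]), float(fields[4]), float(fields[5])),
        "time": float(fields[6]),
    }

rec = parse_tag_record("0,0,1.42,0.10,-0.32,0.05,1234.5\n")
```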
def findTag(TARGET_ID):
    findAttempts = 0
    MAX_FIND_ATTEMPTS = 5
    while (True):
        # TAG FORMAT IS LIST AS SUCH:
        # [int id,double hamming,double distance,double pos_x,double pos_y,double pos_z,double time]
        info = getTagInfo()
        if (info):
            id = int(info[0])
            pos_x = float(info[3])  # Up/Down (Away from camera)
            pos_y = float(info[4])  # Forward/Back?
            pos_z = float(info[5])  # Left/Right?
            if (id == TARGET_ID):
                print "Found tag " + str(id) + " at " + str(pos_x) + "," + str(pos_y) + "," + str(pos_z)
                return [pos_x, pos_y, pos_z]
        else:
            findAttempts += 1
            if (findAttempts > MAX_FIND_ATTEMPTS):
                print "Could not find tag."
                return []
            else:
                print "Reattempting image find."
def centerQuad(tagInfo):
    pos_x = tagInfo[0]
    pos_y = tagInfo[1]  # Left(+)/Right(-)
    pos_z = tagInfo[2]  # Forward(+)/Backward(-)
    centeredFB = False
    centeredLR = False
    if (pos_y > 0.25):
        print "Centering L/R... (Left of target)"
        send_offset_velocity(0, -0.5, 0, abs(pos_y)*2)  # Move Right
        send_offset_velocity(0, 0, 0, 1)
    elif (pos_y < -0.25):
        print "Centering L/R... (Right of target)"
        send_offset_velocity(0, 0.5, 0, abs(pos_y)*2)  # Move Left
        send_offset_velocity(0, 0, 0, 1)
    else:
        print "Centered to L/R!"
        centeredLR = True
    if (pos_z > 0.25):
        print "Centering F/B... (Forward of target)"
        send_offset_velocity(-0.5, 0, 0, abs(pos_z)*2)  # Move Backward
        send_offset_velocity(0, 0, 0, 1)
    elif (pos_z < -0.25):
        print "Centering F/B... (Behind target)"
        send_offset_velocity(0.5, 0, 0, abs(pos_z)*2)  # Move Forward
        send_offset_velocity(0, 0, 0, 1)
    else:
        print "Centered to F/B!"
        centeredFB = True
    return centeredFB and centeredLR
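The centring logic above is a bang-bang controller with a ±0.25 m dead band: outside the band the quad is commanded at a fixed 0.5 m/s, opposite the offset, for a time proportional to the error. The per-axis decision rule can be isolated and checked on its own (a sketch; velocities and signs follow the listing above):

```python
DEADBAND_M = 0.25  # tolerance band around the tag centre
SPEED_MS = 0.5     # fixed correction speed

def axis_command(error_m):
    """Return (velocity_m_s, duration_s) for one axis, or None if centred."""
    if abs(error_m) <= DEADBAND_M:
        return None                     # inside the dead band: no correction
    sign = -1 if error_m > 0 else 1     # move opposite the measured offset
    return (sign * SPEED_MS, abs(error_m) * 2)

cmd_left = axis_command(0.6)   # 0.6 m left of target -> move right for 1.2 s
cmd_done = axis_command(0.1)   # within tolerance -> no command
```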
# HERE WE DO STUFF
checkFifo()
print("Set groundspeed to 5m/s.")
vehicle.groundspeed = 5
# Arm and take off to altitude of 5 meters
arm_and_takeoff(5)
print("Position North 8 West 5")
goto(8, -5)
positionAttempts = 0
MAX_POSITION_ATTEMPTS = 5
TARGET_ID = 0
tagInfo = findTag(TARGET_ID)
while (True):
    tagInfo = findTag(TARGET_ID)
    if (tagInfo):
        print_loc()
        atTag = centerQuad(tagInfo)
        if (atTag):
            print "Successfully centered over target. Landing..."
            break
        else:
            positionAttempts += 1
            if (positionAttempts > MAX_POSITION_ATTEMPTS):
                print "Could not center over tag. Landing..."
                break
    else:
        print "Lost tag. Landing..."
        break
print("Setting LAND mode...")
vehicle.mode = VehicleMode("LAND")
for i in range(80):
    print_loc()
    time.sleep(0.25)
# Close vehicle object before exiting script
print "Close vehicle object"
vehicle.close()
outFile.close()
print("Completed")
quad_pipe_example.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
Based on: guided_set_speed_yaw.py
© Copyright 2015-2016, 3D Robotics.
"""
from dronekit import connect, VehicleMode, LocationGlobal, LocationGlobalRelative
from pymavlink import mavutil  # Needed for command message definitions
import time
import math
import os
import errno
import re

connection_string = "/dev/ttyS0"
FIFO_1 = "/tmp/server_to_client_fifo"
FIFO_2 = "/tmp/client_to_server_fifo"

try:
    os.mkfifo(FIFO_1)
    os.mkfifo(FIFO_2)
except OSError as oe:
    if oe.errno != errno.EEXIST:
        raise
# Connect to the Vehicle
print 'Connecting to vehicle on: %s' % connection_string
vehicle = connect(connection_string, wait_ready=True, baud=57600)

# Create and open log file
global outFile
outFile = open('location.txt', 'w+')

def print_loc():
    currentLocationStr = str(vehicle.location.global_relative_frame.lat) + " " + \
                         str(vehicle.location.global_relative_frame.lon) + " " + \
                         str(vehicle.location.global_relative_frame.alt) + "\n"
    outFile.write(currentLocationStr)
    return
def arm_and_takeoff(aTargetAltitude):
    """
    Arms vehicle and fly to aTargetAltitude.
    """
    print "Basic pre-arm checks"
    # Don't let the user try to arm until autopilot is ready
    while not vehicle.is_armable:
        print " Waiting for vehicle to initialise..."
        time.sleep(1)
    print "Arming motors"
    # Copter should arm in GUIDED mode
    vehicle.mode = VehicleMode("GUIDED")
    vehicle.armed = True
    while not vehicle.armed:
        print " Waiting for arming..."
        time.sleep(1)
    print "Taking off!"
    vehicle.simple_takeoff(aTargetAltitude)  # Take off to target altitude
    # Wait until the vehicle reaches a safe height before processing the goto
    # (otherwise the command after Vehicle.simple_takeoff will execute immediately).
    while True:
        print " Altitude: ", vehicle.location.global_relative_frame.alt
        print_loc()
        if vehicle.location.global_relative_frame.alt >= aTargetAltitude*0.95:  # Trigger just below target alt.
            print "Reached target altitude"
            break
        time.sleep(0.25)
"""
Functions to make it easy to convert between the different frames-of-reference. In particular these
make it easy to navigate in terms of "metres from the current position" when using commands that take
absolute positions in decimal degrees.
The methods are approximations only, and may be less accurate over longer distances, and when close
to the Earth's poles.
Specifically, it provides:
* get_location_metres - Get LocationGlobal (decimal degrees) at distance (m) North & East of a given
LocationGlobal.
* get_distance_metres - Get the distance between two LocationGlobal objects in metres
* get_bearing - Get the bearing in degrees to a LocationGlobal
"""
def get_location_metres(original_location, dNorth, dEast):
    """
    Returns a LocationGlobal object containing the latitude/longitude `dNorth` and `dEast`
    metres from the specified `original_location`. The returned LocationGlobal has the same
    `alt` value as `original_location`.
    The function is useful when you want to move the vehicle around specifying locations
    relative to the current vehicle position.
    The algorithm is relatively accurate over small distances (10m within 1km) except close
    to the poles.
    For more information see:
    http://gis.stackexchange.com/questions/2951/algorithm-for-offsetting-a-latitude-longitude-by-some-amount-of-meters
    """
    earth_radius = 6378137.0  # Radius of "spherical" earth
    # Coordinate offsets in radians
    dLat = dNorth/earth_radius
    dLon = dEast/(earth_radius*math.cos(math.pi*original_location.lat/180))
    # New position in decimal degrees
    newlat = original_location.lat + (dLat * 180/math.pi)
    newlon = original_location.lon + (dLon * 180/math.pi)
    if type(original_location) is LocationGlobal:
        targetlocation = LocationGlobal(newlat, newlon, original_location.alt)
    elif type(original_location) is LocationGlobalRelative:
        targetlocation = LocationGlobalRelative(newlat, newlon, original_location.alt)
    else:
        raise Exception("Invalid Location object passed")
    return targetlocation
def get_distance_metres(aLocation1, aLocation2):
    """
    Returns the ground distance in metres between two LocationGlobal objects.
    This method is an approximation, and will not be accurate over large distances and
    close to the earth's poles. It comes from the ArduPilot test code:
    https://github.com/diydrones/ardupilot/blob/master/Tools/autotest/common.py
    """
    dlat = aLocation2.lat - aLocation1.lat
    dlong = aLocation2.lon - aLocation1.lon
    return math.sqrt((dlat*dlat) + (dlong*dlong)) * 1.113195e5
def get_bearing(aLocation1, aLocation2):
    """
    Returns the bearing between the two LocationGlobal objects passed as parameters.
This method is an approximation, and may not be accurate over large distances and close to