
ESEIAAT

Bachelor's Thesis

Fire prevention with a UAV closed loop system
Report

Degree: Bachelor's Degree in Aerospace Vehicle Engineering

Student: Unzueta Canals, Marc

Director: Lordan Gonzalez, Oriol

CO-Director: Soria Guerrero, Manel

Delivery date: 10/06/2017


Acknowledgments

I think that the following paper is the culmination of many years of study, but most
importantly, the realization of a dream. I've been dreaming of being an aerospace engineer
since I was 10 years old, and finally being able to take an idea, translate it into
a project and make it real has been the greatest satisfaction of all those years as a student.

Obviously, it would not have been possible without the backing of a large group of people
who have stood behind me giving me support all the time, but the key aspect has been their
trust. Knowing and feeling that people believe in you is the greatest motivation I could have.

Thank you to my family for that amazing support on an everyday basis, especially in
the lowest parts of this tough journey. Thank you, friends; you've always understood the
many hours I could not spend with you. Thank you, Marta, for the continuous support in this
amazing and shared journey. Thank you, Oriol Lordan, for trusting me since day one and
for giving me the opportunity to work with you as a great team while enjoying what we were
doing. Thank you, Manel Soria, for sharing all your knowledge with me.

Besides the people in my everyday life, I must thank all my past teachers who, through
their motivation while teaching, encouraged me to love science and technology.

Thank you to the open-source community; your knowledge and your desire to teach
others in an unselfish way have astonished me.

Last but not least, the Goteo crowdfunders, who nearly 6 years ago blindly trusted me,
a 16-year-old student who was passionate about drones and had no technical knowledge
whatsoever. Much of the motivation for this project comes from delivering a real final
project to those who trusted me 6 years ago.

Thank you to you all.

Preface

There is something I must explain: why fire prevention? That's a story that starts 5
years ago. In 2011 I went on a month-long program in Boston, USA, in order to improve my
English. The family that hosted me knew how passionate I was about mathematics, rockets,
space and actually any kind of technology, so they took me to any engineer's paradise:
MIT. Obviously a visit to the MIT campus was mandatory, in particular the aeronautics and
astronautics department. That's where I found out what drones were, as in their corridor a
poster explained how a group of 2 PhD students had developed a drone with 4 Kinect sensors
which was able to create a 3D map of a closed space. That simply blew my mind; I had to do
something similar too. From that day, my new hobby was basically trying to build cheap drones.

During my last year of batxillerat, I decided that my final research project had to be linked
with drones, but it did not make any sense for me to just build a drone and nothing more.
Thanks to Estel Paloma and Victor Robert (my partner in the project and a great friend),
the EyeCopter project was born. The idea was to build a drone at a time when no tutorials or
pre-assembled kit parts were available, while at the same time focusing on fire prevention
capabilities. Moreover, a crowdfunding campaign was launched on Goteo.org in order to have
enough money to run the project, as in those days the parts were extremely expensive for
us. Crowdfunding campaigns expect you to return something to your "investors". Our return
was all the knowledge learned during the prototyping phase: if anyone wanted to build a
drone from scratch once we finished ours, we would advise them on how to do it. A couple of
people requested this, and we were delighted to help them.

Now I can actually justify why fire prevention: I would simply love to deliver this thesis
to those who, 5 years ago, when I was not yet an aerospace engineer, trusted me and helped me
to develop such a project. I would like to prove to them that it actually is possible, and to
remind them of the importance of their small gesture.

Abstract

This project is intended to analyse the possibilities of a real unmanned system. An
unmanned aerial vehicle must be able to perform the whole mission on its own, without
an external input that guides it. Therefore this project focuses on a fire prevention
system that is able to send a drone to a specific point to perform a mission without human
intervention.

Let's summarize the overall process in a nutshell. Many sensors with an internet connection
are spread across a forest. As soon as a sensor detects that there's a possible wildfire
about to start, or that the conditions are susceptible to a wildfire occurring, it sends a
message with its location to the server. The drones near that specific sensor then decide
which of them is closest, and that one automatically takes off. Once the drone reaches the
sensor, it performs a mission that has been automatically programmed by the drone code as a
function of the conditions. Once the mission is complete, the drone goes back to the landing
zone.

The system has been able to perform a mission on its own, with no external inputs, and on
top of that, the decision making has also been done by the computer.

The possibilities of such a system are endless, as we could automate a long list of processes
that nowadays require a human for repetitive tasks.


Contents

List of Figures

1 Aim

2 Scope

3 Requirements

4 Justification

5 Current Solutions
5.1 Wildfire causes
5.2 Wildfire types
5.3 Wildfire prevention systems
5.3.1 Aerial
5.3.2 Space
5.3.3 Terrestrial
5.3.4 Human
5.3.5 State of the art solutions

6 Our solution
6.1 Overview
6.2 System workflow
6.2.1 Sensor and drone placement
6.2.2 Sending the alert
6.2.3 Drone trip
6.2.4 Drone mission
6.2.5 Base return
6.3 Possible applications
6.3.1 Agricultural
6.3.2 Parcel transport

7 Framework Setup
7.1 Hardware
7.1.1 Raspberry Pi 3
7.1.2 Sensors
7.1.2.1 Battery
7.1.2.2 Temperature, humidity and soil moisture sensors
7.1.2.3 Real time clock
7.1.2.4 Processor
7.1.2.5 Solar panel
7.1.2.6 Resistors, capacitors and others
7.1.3 Pixhawk
7.1.4 Quadcopter
7.1.5 PCB Design
7.2 Software
7.2.1 SITL Setup
7.2.2 Raspberry Pi 3
7.2.3 MQTT
7.2.4 Sensors

8 Code Description
8.1 Setup
8.1.1 Loop function
8.2 Functions declaration
8.3 Pixhawk
8.4 Drone Python code
8.5 Data Python code
8.6 MQTT
8.6.1 MQTT Broker
8.6.2 Raspberry Pi 3 MQTT installation
8.6.3 Mac MQTT installation
8.7 Shell script

9 Results
9.0.1 Sensor
9.0.2 SITL
9.0.3 Prototypes

10 Future Improvements

11 Economic and environmental impact
11.1 Economic impact
11.2 Environment

12 Conclusions

13 References


List of Figures
5.1 Fire triangle representation
5.2 Lightning
5.3 Intentional human-caused wildfire
5.4 Human-caused wildfires detected by NASA Earth Observatory
5.5 Surface fire
5.6 Height fire
5.7 Cobra helicopter
5.8 Cobra sensors by FLIR
5.9 Daily fire spread mapped by 1 km Aqua/MODIS
5.10 Terrestrial evapotranspiration map by MODIS and the University of Montana
5.11 Human surveillance system
5.12 ADF volunteers
5.13 Prototype from the Catalan research group
6.1 Sensor and drone placement
6.2 Automatic charging station
6.3 Landing gear with charging capabilities
6.4 Global drone range and sensor distribution
6.5 Global drone range and sensor distribution example
6.6 Data signal sent to MQTT broker and Raspberry Pi 3
6.7 Drone trip
6.8 Wildfires detected with thermal cameras
6.9 Drone thermal inspection
6.10 Autoplanned drone mission
6.11 Drone returning to the landing zone
6.12 Multispectral aerial image
6.13 Multispectral camera
6.14 Amazon Prime Air octocopter
6.15 Amazon Prime Air VTOL system
7.1 Raspberry Pi logo
7.2 Raspberry Pi 3
7.3 Satellite orbits
7.4 Patricia Andrews and Mark Finney wildfire predictive model
7.5 Brand new 18650 battery
7.6 Recycled 18650 battery
7.7 DHT11 temperature and humidity sensor
7.8 Recycled 18650 battery
7.9 ESP8266 Version 1
7.10 3 different ESP8266 V1 models tested
7.11 Development board
7.12 Production board
7.13 Solar panel from eBay
7.14 Received solar panel
7.15 LUAS quadcopter
7.16 LUAS workshop
7.17 Sensors schematic
7.18 PCB 3D render using KiCad tools
7.19 PCB distribution
7.20 Gerber file detail using gerblook.org
7.21 Gerber file detail using gerblook.org 2
7.22 Eurocircuits budget
7.23 Firstpcb.com budget
7.24 Arduino MKRFOX1200
7.25 Sigfox modules
8.1 Waypoint-specific mission created by the Python code
8.2 Real-time view of the drone path
8.3 CloudMQTT console
8.4 Script for running the code on start
9.1 Script to run code on start
9.2 Version 1
9.3 Version 2
9.4 Version 3
9.5 Breadboard Version 2
10.1 ESP32
10.2 Sigfox module


List of acronyms

TFG [Catalan] Treball de Fi de Grau
GPS Global Positioning System
UAV Unmanned Aerial Vehicle
AESA [Spanish] Agencia Estatal de Seguridad Aérea
SITL Software In The Loop
RPi3 Raspberry Pi 3
MIT Massachusetts Institute of Technology
USA United States of America
PhD Philosophiae Doctor
GOES Geostationary Operational Environmental Satellite
AVHRR Advanced Very High Resolution Radiometer
MODIS Moderate-Resolution Imaging Spectroradiometer
EAATS Envisat's Advanced Along-Track Scanning Radiometer
NASA National Aeronautics and Space Administration
CPU Central Processing Unit
RAM Random-Access Memory
GPU Graphics Processing Unit
GPIO General-Purpose Input-Output Connector
FLIR Forward Looking InfraRed
GIS Geographic Information System
ADF [Catalan] Agrupació de Defensa Forestal
UPC [Catalan] Universitat Politècnica de Catalunya
UAB [Catalan] Universitat Autònoma de Barcelona
CVC [Catalan] Centre de Visió per Computador
USB Universal Serial Bus
HDMI High-Definition Multimedia Interface
MQTT Message Queuing Telemetry Transport
LAN Local Area Network
BLE Bluetooth Low Energy
PCB Printed Circuit Board
LiPo Lithium Polymer Battery
DIY Do It Yourself
IoT Internet of Things
ADC Analog-to-Digital Converter
SPI Serial Peripheral Interface
OTA Over The Air
WEP Wired Equivalent Privacy
FTDI Future Technology Devices International
SMD Surface Mount Device
ESC Electronic Speed Controller
UART Universal Asynchronous Receiver-Transmitter
PWM Pulse Width Modulation
PLA Polylactic Acid
GCS Ground Control Station
SD Secure Digital
KB Kilobyte
WPA Wi-Fi Protected Access
I2C Inter-Integrated Circuit
GHz Gigahertz
SSH Secure Shell
SIM Subscriber Identity Module
UDP User Datagram Protocol
FTP File Transfer Protocol
HTTP Hypertext Transfer Protocol
SSL Secure Sockets Layer
RSSI Received Signal Strength Indication
JSON JavaScript Object Notation
M2M Machine to Machine
HTTPS Hypertext Transfer Protocol Secure


Listings
1 Libraries declaration
2 WiFi constants
3 Position coordinates
4 Objects declaration
5 Global variables and constants
6 Setup function
7 WiFi check and reconnection
8 DHT11 sensor reading
9 JSON variables
10 MQTT message publishing
11 MQTT subscription
12 Voltage check function
13 MQTT connection function
14 Sensor delay function
15 Libraries used
16 MQTT variables in the drone code
17 Global variables
18 Parser commands
19 Parse command when running the code
20 Pixhawk and Raspberry Pi connection
21 Commands clean-up
22 Arm and takeoff function
23 MQTT message detection
24 Hover function
25 Landing command
26 Get distance V1
27 Get distance V2
28 Distance until next waypoint
29 Download newly created mission
30 Creates the grid mission
31 Waypoint setting command
32 Main function
33 RTL command
34 Merges the grid mission into the general mission
35 MQTT parameters
36 Write data to a text file from the MQTT messages
37 Install mosquitto packages
38 Configuration file for mosquitto
39 Restart mosquitto
40 Installing mosquitto package
41 Installing mosquitto package 2
42 Python and MQTT link
43 SSH connection to Raspberry Pi 3
44 Creates the launcher.sh file
45 Start script


1 Aim
The intention of this TFG is to create and deploy a network of temperature and humidity sensors
in a forest area, mapping the current conditions in order to prevent any possible ignition from
natural causes. Those sensors will all be connected to a central server that will handle all the
processing and coordinate the drone network, which will also be connected to that central
server.

The main server reads the sensors and decides whether to send a drone to the conflict
zone, along with the mission that the drone will carry out, achieving a closed loop system which
is completely autonomous. This is the most ambitious objective, and the strategy will be the
following:

- Ensure that the sensors send their data to a server through an internet connection.

- Process the data received from the server.

- Code the decision-making algorithm.

- Implement the whole system on a drone in a SITL simulator.

- Implement the whole system in a real scenario with a real drone.


2 Scope
As the thesis has a really large objective, smaller objectives, as described before, must be set.
They will be the following:

- A general forest fire study: what causes them, the main problems, the major risks and the
current protection measures.

- Sensors that allow us to have real-time temperature mapping.

- Sensors connected to the central server.

- Electronics design for all the sensors, and prototyping of all the electronics.

- How to connect all the data coming from the sensors to the Raspberry Pi 3.

- How to send control data from the Raspberry Pi 3 to the Pixhawk.

- A mission algorithm that programs itself.


3 Requirements
In order to keep the project feasible I intend to prototype most of the parts I design, but I
will not need to buy all the material, as I already have most of it. The parts still needed
must satisfy the following:

- Cheap (admittedly an ambiguous term): basically getting the cheapest option on the market
and improving its performance through manual modifications.

- Designing myself all the extra parts needed, such as electronics and 3D models...

- Checking legal regulations during all the tests.

- Using open-source programs as much as possible, in order to prove that this technology is
accessible to everyone.


4 Justification
Since the UAV world started noticing all the advantages that these vehicles could bring,
many applications besides the military ones have started to appear. Some of them are developed
by high-tech companies that invest millions in research and development, and others emerge
from amazing and talented people joining communities to build fantastic open-source
projects.

On top of that, the term UAV has been misinterpreted: it stands for Unmanned Aerial
Vehicle, but most of the vehicles categorized as UAVs are not really unmanned, as a human
always has to actually reprogram the vehicle.

Here is the point of this thesis: trying to create a closed loop system that programs itself and
sets its missions by itself too.

This small introduction is part of the background. I believe that a real UAV might
appear soon, as the number of drones used as platforms to undertake missions is increasing
exponentially, and this thesis will try to seize that opportunity to find out the capabilities
of such a system.

There are also many technical reasons that I have found:

- Large, potentially dangerous forest extensions remain unprotected or unsupervised.

- Real-time temperature and humidity sensing, which could be used for many other
environmental studies.

- Real-time thermal aerial imaging.


5 Current Solutions
We must first understand what causes wildfires, so let's go back to basics.
For a wildfire to occur [1], the fire triangle must be fully satisfied. What is the fire triangle?
A basic rule used to understand the 3 basic elements that are needed for a fire to ignite:

- Fuel: dry wood, dead vegetation or rubbish.

- Heat: a source that helps the fire to ignite.

- Oxygen: essential for the combustion to take place.

Figure 5.1: Fire triangle representation.

Notice that if one of the elements of the fire triangle is taken away, we will be able to
extinguish the fire. Usually this is done by reducing the heat: water is poured onto the fire,
where it absorbs the heat by using that energy for a phase change, turning into steam.

5.1 Wildfire causes

Now that the basic cause of a fire is understood, we must ask ourselves why those 3 elements
come together and cause wildfires. Based on statistics, wildfires usually have the following
causes:

- Natural: both lightning and volcanic eruptions can provide the last element of the fire
triangle that the wildfire needs to ignite.


- Human deliberate fires: most of them are set by pyromaniacs who do so just for
their pleasure. Many others look for a benefit such as urban speculation,
because once the land is completely burnt it can be requalified as land where new
buildings can be built.

- Human negligence: agricultural burns, cigarette ends, forestry work, barbecues...

- Old fire reproduction: an old fire that was thought to be extinguished comes back.

Figure 5.2: Lightning.
Figure 5.3: Intentional human-caused wildfire.

It is important to notice which of the previously mentioned causes are the most common.
According to data provided by NASA Earth Observatory, up to 80% of the wildfires in the USA
are caused by humans [2] [3].

5.2 Wildfire types

It is highly important to take into account which kinds of wildfires can occur, as their
solutions, early symptoms and effects differ greatly from one another, and therefore a
different approach should be taken for each.

- Surface fires: these wildfires spread due to the combustible material found on the ground,
which includes shrubbery, weeds, litter and others. Here, the sensorization of the forests
susceptible to wildfire will require a soil moisture sensor and a thermometer, both placed
on the ground, to sense the ground humidity and temperature.

- Height fires: in forests where the trees have large treetops, the treetops become a
potential hazard, as the fire can spread from one treetop to another with less effort than
along the ground, since the treetops are closer together.


Figure 5.4: Human-caused wildfires detected by NASA Earth Observatory.

Figure 5.5: Surface fire.
Figure 5.6: Height fire.

In this last case, regular aerial images can be more valuable than real-time data: as the
wildfire spreads through the treetops, the data collected with multispectral cameras can be
highly valuable, because nearly 100% of the treetop is visible, providing plenty of data to
decide whether its "health" is correct or not.

Those aerial images can also be acquired by the very system being designed here: our
platform (the drone itself with its special electronics) can hold many sensing systems.

5.3 Wildfire prevention systems

The fire prevention systems in use nowadays can be classified by their platform
technology.

5.3.1 Aerial

These are all the systems and methods that require a flying machine, most likely airplanes and
helicopters, to monitor the state of our forests using on-board sensors and cameras. The most
common technologies in use are infrared scanning, multispectral cameras and thermal cameras.
For example, in the USA a firewatch Cobra helicopter [1] with special capabilities to help
in both the prevention and the fighting of wildfires has been developed with the help of the
thermal imaging company FLIR; its system includes:

- Geo-referenced infrared sensor

- GIS mapping

- Low-light capability

- Tracking of targets or scenes in all images

- Gyro-stabilized gimbal

- Laser illuminator

Figure 5.7: Cobra helicopter.
Figure 5.8: Cobra sensors by FLIR.

The system is used for periodic checks over forests in California.


5.3.2 Space

This approach uses space telescopes and multiple sensors, usually in geostationary orbit.
Satellite-mounted sensors such as Envisat's Advanced Along Track Scanning Radiometer
(EAATS) [4] and the European Remote-Sensing Satellite's Along-Track Scanning Radiometer can
measure the infrared radiation emitted by fires, identifying hot spots greater than 39 °C (102 °F).

Combined with remote-sensing data from satellite sources such as the Geostationary
Operational Environmental Satellite (GOES), the Moderate-Resolution Imaging Spectroradiometer
(MODIS) and the Advanced Very High Resolution Radiometer (AVHRR), this can provide a wider
view and may be sufficient to monitor very large, low-risk areas.

Figure 5.9: Daily fire spread mapped by 1 km Aqua/MODIS.

Low-risk areas are understood to be all the large forest areas that do not present a potential
wildfire hazard and can be monitored with non-specific devices or methods.

However, satellite detection has many offset errors, anywhere from 2 to 3 kilometres (1 to
2 mi) for MODIS and AVHRR data and up to 12 kilometres (7.5 mi) for GOES data [?].
Satellites in geostationary orbits may become disabled, and satellites in polar orbits are often
limited by their short window of observation time. Cloud cover and image resolution may
also limit the effectiveness of satellite imagery.

5.3.3 Terrestrial

Sensors positioned on the ground are now widely used, but they are only capable of
monitoring specific, small areas.


Figure 5.10: Terrestrial evapotranspiration map by MODIS and the University of Montana.

A small, high-risk area that features thick vegetation or a strong human presence, or that is
close to a critical urban area, can be monitored using a local sensor network, which sends all
the data to a central station that processes it and decides where and when to send a specific
crew to analyse the actual status.

Detection systems may include wireless sensor networks [5] that act as automated weather
systems, detecting temperature, humidity and smoke. These may be battery-powered,
solar-powered, or tree-rechargeable: able to recharge their battery systems using the small
electrical currents in plant material.

5.3.4 Human

All across the forests it is common to find large turrets, strategically situated and manned
by forest rangers who survey their whole perimeter in case an unexpected wildfire takes place.
Their function is also to keep an eye on people and their activities. In Catalonia there is
the ADF [6], a volunteer organization that seeks to protect the environment and, most
importantly, to prevent wildfires.


Among all the tasks they perform, we can highlight:

- Information campaigns aimed at the rural and urban population.

- Collective surveillance and forest fire prevention programs.

- Reforestation after disastrous wildfires.

- Monitoring of the execution of measures approved by the government.

- Support in the firefighting process, thanks to their knowledge of the area.

Figure 5.11: Human surveillance system.
Figure 5.12: ADF volunteers.

5.3.5 State of the art solutions

Back in 2005 a research group was already trying to develop an unmanned aerial system
capable of preventing wildfires in Catalonia. It was a group formed by the Universitat
Politècnica de Catalunya (UPC), the Universitat Autònoma de Barcelona (UAB) and the Centre
de Visió per Computador (CVC) [7].

Its main objective was to help firefighters take decisions based on the data received from the
drones. For example, during a wildfire it could provide real-time data, cartographic hotspots
once the fire is controlled, and a map showing which areas are the most difficult for the
ground vehicles to access.


Figure 5.13: Prototype from the Catalan research group.


6 Our solution

6.1 Overview
I'll try to summarize the whole idea as much as possible and make it graphic, so anyone can
rapidly understand what it is all about.

I'm looking for a drone system that is able to control itself almost entirely as soon as the
drone is deployed, so I do not want anyone to interact with it directly or physically. Let's
see some points of what I'm trying to remove from and improve in the current drone flying
workflow:

- Nobody prepares the drone for take-off; it will always be placed on the ground,
at the same place.

- No radio control is needed.

- No specific mission has to be uploaded manually to the drone.

- The drone can be remotely controlled via an internet connection.

- The drone will be capable of taking off completely by itself; no previous order will be
needed.

The whole drone system or platform consists of 3 main parts:

- A quadcopter.

- Several sensors.

- An internet connection, on both the quadcopter and the sensors.

The sensors will act as beacons, in charge of triggering the procedures for the drone
to prepare for the mission, program the mission itself as a function of the sensor data, take
off, carry out the mission and land back at the starting point.

The system I'm trying to design will use open-source tools.

6.2 System workflow


The following images are intended to explain how the system would behave. First of all, the
sensor deployment.


6.2.1 Sensor and drone placement

Each sensor is only capable of measuring the temperature and humidity at a single narrow
point, but for the kind of wildfires we are trying to prevent, a low-density network fits. By
low density I mean one sensor every 900 hectares (a 3 km x 3 km square area).

Figure 6.1: Sensor and drone placement

As seen in the previous picture, each drone would have around 12 sensors to cover within
its actual range, which allows us to cover around 5000 ha. The point is to select very
specific areas that are really susceptible to wildfires or that have great interest due to
their properties.

Therefore a study must be done to understand which areas have a higher risk. In order to
reduce the infrastructure needed, the drones are expected to use the existing surveillance
towers spread across the country as landing zones. At each landing spot the drone would have
a charging station, with special landing skids that allow it to charge itself as soon as it
touches the ground.

On the other hand, critical zones might have multiple drones distributed around the
mountain in order to cover a larger area. The sensor deployment would not really look like
a perfect rectangular grid, as that would not be efficient; instead, inside the critical area,
some vital spots will be covered.


Figure 6.2: Automatic charging station.
Figure 6.3: Landing gear with charging capabilities.

Figure 6.4: Global drone range and sensor distribution

It is vital to have a perfectly studied sensor distribution, to maximize the total area that
the system can protect.


Figure 6.5: Global drone range and sensor distribution example

6.2.2 Sending the alert

Let's focus on how the system would behave in case of an emergency. Suppose a group of
sensors reports high temperature values combined with low humidity values; that would be the
trigger. The data from the sensors is always sent to the MQTT broker, which just handles
the information but does not perform any computation with it: it forwards the information
wherever it is requested. The sensors are the ones that decide whether or not their data
shows a hazard. However, even if the data is fine and there's no risk of a wildfire occurring,
the data is still sent, as it can be used for further studies or for prediction models.

Let's imagine a sensor triggered an alert. The message sent to the MQTT broker would then
consist of its data, its position, and a flag telling the drones that there's a potential
wildfire at its location; a minimal sketch of such a message follows.
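The Python sketch below is purely illustrative: the broker address, topic name and threshold
values are assumptions chosen for the example, not the project's actual configuration (the
real sensor firmware is described in Section 8).

    import json
    import time
    import paho.mqtt.client as mqtt

    TEMP_THRESHOLD = 40.0      # degrees Celsius; illustrative value
    HUMIDITY_THRESHOLD = 20.0  # percent relative humidity; illustrative value
    BROKER = "broker.example.org"  # placeholder broker address

    def build_alert(sensor_id, lat, lon, temp, humidity):
        """Pack one reading into the message described above: the sensor
        data, its position, and a wildfire-risk flag."""
        return {
            "id": sensor_id,
            "lat": lat,
            "lon": lon,
            "temp": temp,
            "humidity": humidity,
            "timestamp": int(time.time()),
            # The sensor itself decides whether its data shows a hazard.
            "fire_risk": temp > TEMP_THRESHOLD and humidity < HUMIDITY_THRESHOLD,
        }

    client = mqtt.Client()
    client.connect(BROKER, 1883)
    # Data is always published, even without risk, for later studies.
    msg = build_alert("sensor-07", 41.564, 2.022, 43.5, 12.0)
    client.publish("sensors/sensor-07/data", json.dumps(msg))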

Remember that the information is not directly processed by the drone itself; it is processed
by the Raspberry Pi on board. It is the Python code that decides whether its drone is the
closest one to the sensor sending the alert message.

As the MQTT broker can forward the message to many other clients, there could also be a
control center that receives all the information in real time, so it can start preparing in
case the data received by the drone highlights a potential wildfire.
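A minimal sketch of that distance-based decision is shown below, assuming each drone knows
its own position and hears the positions of the rest of the fleet over MQTT (the report's own
distance helpers appear later as Listings 26 and 27):

    import math

    def get_distance_m(lat1, lon1, lat2, lon2):
        """Great-circle (haversine) distance between two GPS points, in metres."""
        r = 6371000.0  # mean Earth radius
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
             * math.sin(dl / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    def i_am_closest(my_pos, other_drones, alert_pos):
        """True if this drone is nearer to the alerting sensor than every
        other drone whose position was heard over MQTT."""
        my_d = get_distance_m(*my_pos, *alert_pos)
        return all(my_d <= get_distance_m(*p, *alert_pos) for p in other_drones)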


Figure 6.6: Data signal sent to MQTT broker and Raspberry Pi 3

6.2.3 Drone trip

The drone has decided that it is the closest one and that it has enough battery time to reach
the location, carry out a mission and go back safely to the landing spot. Before taking off,
the drone must pass the pre-takeoff checklist, checking the GPS signal, the battery level, the
compass and the accelerometers; if everything seems correct, it flies to the location sent by
the sensor.
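A minimal pre-flight sketch using DroneKit is shown below. The serial port, baud rate and
thresholds are assumptions, not values taken from this report; the actual drone code is
described in Section 8.

    import time
    from dronekit import connect, VehicleMode, LocationGlobalRelative

    # Assumed telemetry link between the Raspberry Pi and the Pixhawk.
    vehicle = connect("/dev/ttyAMA0", baud=57600, wait_ready=True)

    def preflight_ok(vehicle, min_battery=50, min_fix=3):
        """Mirror the checklist above: GPS fix, battery level and the
        autopilot's own arming checks (compass, accelerometers...)."""
        return (vehicle.gps_0.fix_type >= min_fix
                and vehicle.battery.level is not None
                and vehicle.battery.level >= min_battery
                and vehicle.is_armable)

    if preflight_ok(vehicle):
        vehicle.mode = VehicleMode("GUIDED")
        vehicle.armed = True
        while not vehicle.armed:
            time.sleep(1)
        vehicle.simple_takeoff(20)  # climb to 20 m
        # Fly towards the alerting sensor (coordinates from its message).
        vehicle.simple_goto(LocationGlobalRelative(41.564, 2.022, 20))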


Figure 6.7: Drone trip

6.2.4 Drone mission

One of the main problems in the drone automation process is the mission itself. It takes time
for technicians to draw each mission, upload it and run it. Instead, this system programs the
mission by itself. The Python code draws a grid around the location sent by the sensor, and
the grid size can be predefined beforehand; it would even be possible to adjust the grid size
on the go by sending an MQTT message on a topic.

The main objective of the drone is to gather data that helps the emergency services in the
decision-making process. There are many situations in which a small group of firefighters
could act quickly to neutralize a potential wildfire, but in order to arrive in time they must
know exactly which area is in danger. This is where the drone can assist the most. The sensors
just highlight the area that is in danger, but it is the drone that narrows down the area
where to act, because its ability to fly above the trees enables it to take many thermal
photos of a large area, for example, and have them processed.

Therefore, from the 900 ha area where the sensor detected a possible hazard, the drone could
narrow it down to a 1 ha (100 m x 100 m) area, so the firefighters can quickly act on the
potential wildfire focus instead of wasting time combating the whole wildfire. A simplified
sketch of such a grid generator follows.
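The actual generator is shown later as Listing 30; the sketch below only illustrates the idea
of building a serpentine ("lawnmower") grid of waypoints around the sensor position, with the
grid size as a parameter:

    import math

    def make_grid(lat, lon, rows=4, cols=4, spacing_m=25.0, alt=20.0):
        """Serpentine grid of (lat, lon, alt) waypoints centred on the
        sensor position. Metres are converted to degrees with the usual
        small-area approximation (1 degree of latitude ~ 111320 m)."""
        dlat = spacing_m / 111320.0
        dlon = spacing_m / (111320.0 * math.cos(math.radians(lat)))
        waypoints = []
        for i in range(rows):
            # Alternate sweep direction on every row (lawnmower pattern).
            cols_order = range(cols) if i % 2 == 0 else reversed(range(cols))
            for j in cols_order:
                waypoints.append((lat + (i - rows / 2) * dlat,
                                  lon + (j - cols / 2) * dlon,
                                  alt))
        return waypoints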


Figure 6.8: Wildfires detected with thermal cameras.
Figure 6.9: Drone thermal inspection.

Figure 6.10: Autoplanned drone mission

6.2.5 Base return

Once the drone has finished the mission there are 2 options:

- Wait for orders from the control center. If there's still enough battery, the drone could
live-stream the thermal video in real time to the emergency services, to help them
understand how the fire behaves and its possible evolution.

- Go back to the landing zone and send the remaining information to the control center
(non-critical data that was not sent earlier, since the thermal images are prioritized),
quickly recharge the batteries (less than 30 min if the situation is critical) and assist
the firefighters again from the air.


Figure 6.11: Drone returning to the landing zone

6.3 Possible applications


In this specific case the sensors are basic temperature, humidity and soil moisture detectors
acting as beacons, but they could be replaced by any other kind of sensor, or even integrated
with other systems.

There are many other possibilities where a system that is able to execute missions autonomously
would have a great advantage over the current ones.

6.3.1 Agricultural

Nowadays agriculture is evolving towards a more technological model, with tools such as
automated GPS tractors, multispectral satellite photos [8], transgenic products and more.
All those solutions intend to reduce the costs of the overall process and obtain a better
product.

With drones, solutions are starting to appear so that farmers can know exactly which areas
need more water or nutrients, or are infested by bacteria. The problem here is that a group of
specialized technicians has to actually go to the farm, deploy their drone, pre-configure the
missions and go back to their studio in order to process all the data gathered.

Figure 6.12: Multispectral aerial image.
Figure 6.13: Multispectral camera.

What if, instead, the specialized company just pressed a button and the drone, permanently
stationed on the farm, carried out the mission, landed and sent the data to be processed?
It makes no sense to sell drones to farmers who have no experience in programming and flying
them. Moreover, it is not efficient at all to attend each farmer in person. A service with
no product (drone) makes sense in my opinion, as the value of the service is the final
document explaining to the farmer where to spread the nutrients, water or treatments for
the crop.

On the other hand, the hardware cost is much, much lower than the cost of an employee
personally visiting each farmer every time a study is needed.

6.3.2 Parcel transport

Amazon's Prime Air [9] delivery system, for example, is a great example. Their drones should
not be waiting to be programmed each time a package has to be delivered; instead they will
receive all the information about their new target and, as soon as they detect that the
package is correctly loaded, carry out the delivery process.

Before going deeper into the solution, keep in mind that by drone I will always be
referring to a "commercial" drone: a quadcopter, hexacopter... Current military drones,
for example, are already fully capable of doing this. I want to bring such expensive
technologies closer to every student and every hobbyist, and make everyone understand that
this technology is already available and its options are endless.


Figure 6.14: Amazon Prime Air octocopter.
Figure 6.15: Amazon Prime Air VTOL system.


7 Framework Setup
Once the idea is explained we must move on to the most important question: how does it
work? As the whole idea is rather complex, it has been divided into several subsections to
help understand the technology behind it.

7.1 Hardware
7.1.1 Raspberry Pi 3

The Raspberry Pi 3 can be understood as the main "brain". The Raspberry receives all
the information from the server, as it can handle WiFi, and processes it with its Python
code to decide whether or not to start a mission on the drone it is connected to.

To understand what a Raspberry Pi 3 is, you just need to know the following: it is a
computer. A really, really cheap computer.

The Raspberry Pi Foundation is a British charity that started developing single-board
computers in 2009, in order to "promote the study of computer science and related topics,
especially at school level, and to put the fun back into learning computing." By November
2016 they had sold more than 11 million units [10].

Figure 7.1: Raspberry Pi logo.

The Raspberry Pi 3 has all the hardware we would expect to find in a regular computer:

- CPU (Central Processing Unit): a 1.2 GHz 64-bit quad-core ARMv8 CPU.

- RAM (Random-Access Memory): 1 GB.

- USB (Universal Serial Bus): four USB 2.0 ports.

- Video and audio output: full-size HDMI port and 3.5 mm audio jack.

- Real-time clock.

- Internal WiFi: 802.11n wireless LAN, plus an Ethernet port.

- GPU (Graphics Processing Unit).

- Memory: micro SD card slot.

- Bluetooth: it can handle both BLE (Bluetooth Low Energy) and Bluetooth 4.1.

- GPIO (general-purpose input-output connector): 40 GPIO pins.

The best and most valuable aspect of the Raspberry Pi is its GPIOs. The GPIOs allow other
microcontrollers or sensors to be easily connected to it, in order to interact with them. They
can be used to output data, or to receive and process information from an external sensor.

Figure 7.2: Raspberry Pi 3

Its function in the whole setup is the following:

- Receive all the data via WiFi from the central server.

- Run the Python code.

- Handle the communications with the Pixhawk.

- In the near future (beyond this thesis) it could handle image processing, for example to
achieve high-precision landing.

The core of all the processing is undertaken by the Raspberry Pi, as the Pixhawk cannot
process all the data coming from the server.
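As an illustration of the Raspberry Pi side of this link, a minimal paho-mqtt subscriber is
sketched below; the broker address and topic layout are assumptions rather than the project's
actual configuration:

    import json
    import paho.mqtt.client as mqtt

    def handle_alert(data):
        # Placeholder: hand the alert to the decision-making code
        # (closest-drone check, mission generation...).
        print("wildfire risk reported by", data["id"])

    def on_message(client, userdata, msg):
        data = json.loads(msg.payload)
        if data.get("fire_risk"):
            handle_alert(data)

    client = mqtt.Client()
    client.on_message = on_message
    client.connect("broker.example.org", 1883)  # placeholder broker
    client.subscribe("sensors/+/data")          # one topic per sensor
    client.loop_forever()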


7.1.2 Sensors

As part of the project consists of the sensor network too, the sensors have been prototyped,
designed and deployed. After researching which variables affect wildfires the most, we came
to the conclusion that high temperatures combined with really low humidity values are the
perfect conditions for a wildfire to occur.

We must highlight that with this kind of sensor, human-provoked wildfires (which are the
ones that burn larger areas) will be difficult to detect; however, the sensors can be of great
help in locating the areas that might suffer faster fire propagation, or in detecting where a
fire will be easier to ignite. Beyond the human-provoked wildfires, they are a great tool for
the natural ones, as the conditions of the forest are what end up determining whether the fire
takes place or not.

Therefore, having real-time data on those conditions is a key asset that must be used.

Nowadays much of this data can be acquired through satellites, predictive models or spot
measurements, so why does it really make a difference to have on-ground sensors?

Satellite data is expensive, and it is not really real-time data. For example, the Meteosat
satellite circles the Earth in a geostationary orbit at 35800 km [11], which means it orbits
the Earth once a day and its orbital speed keeps it over the same position at all times: a
geostationary orbit. Moreover, its accuracy varies by many kilometres, so it does not provide
consistent data.

Predictive models [12], such as the example below, do not take into account variations in
the weather during the wildfire, which can change dramatically due to extreme temperature
and humidity contrasts.

In a nutshell, having sensors placed across the forest allows us to monitor its state before
and during wildfires, providing extremely valuable data.

The hardware included in the sensors consists of the following:

- Specific PCB

- Battery

- Temperature sensor


Figure 7.3: Satellite orbits.

Figure 7.4: Patricia Andrews and Mark Finney wildfire predictive model

- Humidity sensor

- Real-time clock

- Processor

- Wireless connection

- Resistors, capacitors and others


7.1.2.1 Battery

In recent years batteries have improved to such a point that electric cars have started to
become really feasible, with Tesla leading the way. But what does Tesla have to do with this?
We are going to use their cars' battery cell, the now famous 18650 cell.

The 18650 has many characteristics that make it the perfect option, for example:

- Proven reliability.

- Low explosion risk, unlike LiPo batteries.

- Low nominal voltage.

- Easy to recharge.

- Cheap.

The best part is its availability: ours, in particular, were taken from an old laptop battery.

Figure 7.5: Brand new 18650 battery.
Figure 7.6: Recycled 18650 battery.

7.1.2.2 Temperature, humidity and soil moisture sensors

- Temperature sensor: the main sensor, with the critical task of running 24/7 at a good
accuracy level. The temperature sensor is the DHT11, well known among hobbyists, which
offers a good temperature range, low measurement error and a great price (a minimal reading
sketch follows this list).

- Humidity sensor: included in the same package as the temperature sensor.


- Soil moisture sensor: during the research I found out that the actual ground of the forest
holds really valuable information that is usually not used: its humidity. Having the humidity
of the ground gives us a more consistent view of the overall state of the forest, because
relying on the air humidity sensor alone could make us think the conditions are fine due to
the current weather, while the forest itself is really dry.
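For reference, reading the DHT11 takes only a few lines. The sketch below uses MicroPython's
dht driver; the pin number is an assumption, and the project's own sensor firmware is written
in the Arduino environment (see Section 8):

    import dht
    from machine import Pin

    sensor = dht.DHT11(Pin(2))          # data pin is an assumption
    sensor.measure()                    # trigger one conversion (max ~1 Hz)
    temperature = sensor.temperature()  # integer degrees Celsius
    humidity = sensor.humidity()        # integer percent relative humidity
    print(temperature, humidity)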

Figure 7.7: DHT11 temperature and humidity sensor.
Figure 7.8: Recycled 18650 battery.

7.1.2.3 Real time clock

When sending data to a server it is important to keep track of it: knowing when each
measurement was taken allows us to map all the values over a certain period of time. That
would be easy if the sensor never turned off, but that is not our case. As the sensor must
stay outdoors for many days, even years, without any kind of maintenance, there may be
moments when the solar panel and the battery cannot keep the sensor working due to low
charge, or we may simply want to turn the whole sensor off to save some energy. We would
then lose the "counting" and the time stamp would start over. Having a real-time clock helps
us maintain the same time stamp whether the system is on or off.
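As an illustration, the MicroPython sketch below reads a DS1307/DS3231-compatible real-time
clock over I2C; the chip choice, the I2C pins and the 0x68 address are assumptions based on
common breakout boards, not details taken from this report:

    from machine import I2C, Pin

    # On recent MicroPython builds, use SoftI2C instead of I2C here.
    i2c = I2C(scl=Pin(5), sda=Pin(4))

    def bcd2dec(b):
        return (b >> 4) * 10 + (b & 0x0F)

    def read_rtc():
        """Read the 7 BCD time registers: sec, min, hour, weekday, day,
        month, year. Masks drop the clock-halt and 12/24 h flag bits."""
        raw = i2c.readfrom_mem(0x68, 0x00, 7)
        return (2000 + bcd2dec(raw[6]),   # year
                bcd2dec(raw[5] & 0x1F),   # month
                bcd2dec(raw[4] & 0x3F),   # day
                bcd2dec(raw[2] & 0x3F),   # hour
                bcd2dec(raw[1] & 0x7F),   # minute
                bcd2dec(raw[0] & 0x7F))   # second

    # Tag every measurement with the RTC time so the time stamps stay
    # consistent even if the sensor powers down between readings.
    print(read_rtc())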

7.1.2.4 Processor

How do we process all the data that we gather from the sensors? A processor is needed to run
the code in charge of collecting all the data from the sensors.

For the whole network to be efficient and functional, it needs to be connected to the internet,
in fact to a server we must set up ourselves. Therefore it was mandatory to find a solution
that includes an internet connection.

There are many basic solutions on the market that offer an internet connection, but many of
them are too basic or too complex, and most are extremely expensive or difficult to find. A
summary of the solutions I found on the market is attached.

Finally, I decided to go with the ESP8266 [13], which is extremely cheap and has WiFi
capabilities, enough memory and a great processor speed. At first I tried to use its basic
version, but its low memory capacity made me reconsider: each time I sent an MQTT message,
it was not able to send the whole package, as the buffer was not large enough to store it
before sending.

Figure 7.9: ESP8266 Version 1.
Figure 7.10: 3 different ESP8266 V1 models tested.

The next step was a newer version, the ESP8266 12E: at a glance, the DIY IoT future! It has
a great built-in processor that exceeds any Arduino on the market in both clock speed and
memory capacity. The main features of the newer version are listed below, followed by a small
WiFi connection sketch:

- Low-power 32-bit MCU running at 80 MHz, overclockable up to 160 MHz.

- 16 GPIO pins.

- 64 KB of instruction RAM, 96 KB of data RAM.

- 10-bit Analog-to-Digital Converter (ADC): great for battery monitoring.

- 802.11 b/g/n protocol (WiFi).

- SPI, I2C.

- 4 MB external SPI flash.

- Over-The-Air (OTA) upload capability.

- Built-in WiFi antenna.

- WEP or WPA/WPA2 authentication.
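Joining an access point from MicroPython takes a handful of lines; the SSID and password are
placeholders, and the project firmware itself uses the Arduino environment:

    import network

    wlan = network.WLAN(network.STA_IF)  # station mode: join an access point
    wlan.active(True)
    if not wlan.isconnected():
        wlan.connect("forest-gateway", "placeholder-password")
        while not wlan.isconnected():
            pass  # busy-wait until the access point accepts us
    print("network config:", wlan.ifconfig())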

One of the main advantages is the community knowledge: even being a new product, the
community has embraced it as a great IoT platform, and there are really good tutorials and
forums where a lot of information can be gathered.

On the other hand, there are 2 versions that can be bought. The development board has a
built-in FTDI USB serial converter that allows us to connect it directly to the computer via
a micro USB port, and it already comes with some pins soldered on. The other option, the
"naked" processor, has no FTDI chip, no voltage regulator and no pins. The latter is 3 times
cheaper but harder to start with; therefore I first bought the development one to prototype
the whole system on a breadboard, and once everything was running correctly I designed the
PCB with the naked option in mind.

Figure 7.11: Development board.
Figure 7.12: Production board.

7.1.2.5 Solar panel

The sensor is placed at the base of the tree if it has the soil moisture probe, or at the top
if it only has the thermometer and the humidity sensor. As mentioned before, we need the
sensor to be fully operative 24/7 with nearly no maintenance, so the batteries must be
recharged somehow. That is where the solar panels come in.


As the scope of the project is not to develop a great sensor, the characteristics I was looking
for in a solar panel were almost purely economic. The solar panel we finally bought has an
output of 5 V, needed for the recharging process, with a power of 6 W.

Figure 7.13: Solar panel from eBay.
Figure 7.14: Received solar panel.

7.1.2.6 Resistors, capacitors and others

All the small components are crucial for the correct operation of the sensors, as each has a
very specific function, but they can basically be understood as follows:

- Capacitors: smooth the signal in order to provide a cleaner power source.

- Resistors: adapt the signal for the downstream components.

- Battery cell charger: in this small circuit the main component is the TP4056 chip, which
manages the charging process to achieve a longer battery lifetime.

- 5 V booster: the batteries sit at 4.2 V when fully charged and can drop to 3.2 V, but the
whole electronics run at 5 V (voltage regulators are still needed for some components, such
as the processor).

Most of the components were bought through RS Components, who offer a wide variety of
SMD electronic components. Banggood has been the other supplier of choice.

7.1.3 Pixhawk

The brain. Most of the components previously described were part of the sensor build, but
probably the most important piece of hardware, besides the Raspberry Pi 3, is the Pixhawk.

The Pixhawk is in charge of controlling the quadcopter. In normal operation it receives the
pilot's commands via a 2.4 GHz radio signal and passes the information to the main processor,
which, combined with the readings of all the on-board sensors, is able to send the best order
to each of the 4 ESCs to stabilize the drone.

But the Pixhawk's capabilities do not stop at receiving basic orders from a radio controller,
processing them and stabilizing the aircraft. The Pixhawk is nearly a full computer, able to
process a lot of data. Its key features are:

- 168 MHz Cortex M4F CPU (256 KB RAM, 2 MB flash, 32-bit).

- 3D accelerometer / gyroscope / magnetometer / barometer.

- 5 UARTs, CAN, I2C, SPI, ADC.

- Failsafe processor with mixing.

- 14 PWM / servo outputs.

- Redundant power supply inputs.

- Integrated backup system for in-flight recovery.

- GPS positioning system.

- 2 telemetry/serial ports.

- Built-in micro SD card slot, for debugging purposes.

In our case, the Pixhawk will receive very specific orders via the telemetry port, to which
the Raspberry Pi 3 running the Python code is connected.

7.1.4 Quadcopter

I used the same quadcopter that ESEIAAT students build on the introductory drone course run by the LUAS (Laboratory of Unmanned Aerial Systems).
The quadcopter uses some basic but really tough and reliable components:


x4 Afro 30A ESC with the Simonk firmware

x4 NTM 2830 800Kv motors

3S 5000 mAh 20C LiPo battery

500 mm glass fiber frame

9x4.7 inch plastic propellers

Figure 7.15: LUAS Quadcopter Figure 7.16: LUAS workshop

7.1.5 PCB Design

When designing electronics, many steps come before the final PCB design, such as simple breadboard circuits and, later on, through-hole PCBs.

It's really important to have a clear view of which components are required on the circuit, as the initial layout makes the difference: since we have a wireless connection (WiFi in our case), the interference due to high current peaks can cause corrupt data transmission.

During the design I went through the breadboard first of all, checking that all components were functioning correctly with the others and that the code written was running properly. The main advantage of breadboards is the ability to swap components, as in the first steps a few components were damaged due to a bad wiring schematic I made.

Once the whole circuit was working correctly I moved to the computer, where I already had the first schematic done for the first setup, to design the PCB itself. In the market there are many Electronic Design Automation (EDA) programs such as Eagle, Altium, Fritzing and KiCad. A few months ago I started learning about electronics design through an online course at Udemy.com, where the virtual teacher was using KiCad, as it is an open-source platform with all the tools needed.

In KiCad, the schematic comes first: a basic diagram with all the components and connections.

Figure 7.17: Sensors schematic

Each section is numbered so it can be briefly explained:

1. 5V Booster circuit: Some sensors work better if their input voltage is 5V, and the charging circuit also needs 5V to charge the batteries efficiently. The booster circuit basically uses an inductor and the ML2623, a switching regulator. The inductor stores some energy by generating a magnetic field and, when the switching chip opens the circuit again, the energy stored by the inductor is discharged into the circuit. Notice there are some capacitors before the load: their function is to provide enough current to the load while the switching chip closes the circuit and the inductor is storing energy again.


2. USB charger and battery protection: LiPo batteries need a specific charging rate in order to achieve a longer lifetime. Therefore the TP4056 is in charge of providing enough current to the batteries during the first charging stage and finally reducing it when the charge gets closer to its final point. Notice that the batteries can also be charged using the solar panel that will be installed on top, so the USB would only be used in case of insufficient solar power.

3. Voltage divider: The ESP8266 12E ADC has a 10-bit resolution and its input voltage range is 0 to 3.3V; as the batteries range from 0 to 4.2V, the voltage must be "converted" so it can be correctly read (see the worked example after this list).

4. I2C connections: The ESP8266 12E only has 1 SPI connection port, which limits the number of sensors we can connect to it. Therefore the I2C protocol is a great solution, as we can connect many components on the same two pins, just changing the bus address of each one in the code.

5. 3.3 V Voltage regulator: The ESP8266 12E needs a steady power input of 3.3V. It's really important to keep it steady under all working circumstances because, for example, during WiFi communication there are many current peaks that could pull down the overall voltage if the regulator cannot support them.

6. Reset button

7. 5V Booster circuit footprint: It was the first time I had designed a boost-up circuit and sent it to manufacture, therefore I prepared a plan B. In case the booster circuit does not work, I would use a basic pre-manufactured one.

8. 18650 Batteries: Tied in parallel to achieve a larger capacity. They have a special frame to make the mounting easier.

9. USB interface: Makes the upload process much, much easier, because without the USB interface the code has to be uploaded through an FTDI connector, which is easy to break and not reliable.

10. Speaker: For debugging purposes.

11. ESP8266 12E: The core of the setup. I finally used the most basic version, just the chip, as I preferred to design all the other components myself, such as the voltage regulator and the USB interface; it also allowed me to choose the GPIOs I wanted to use.
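As a quick worked example of the voltage divider in point 3 (the resistor values here are illustrative assumptions, not necessarily the ones on the board):

V_ADC = V_bat * R2 / (R1 + R2)

With R1 = 100 kOhm and R2 = 330 kOhm, a full battery gives V_ADC = 4.2 * 330/430 = 3.22 V, safely below the 3.3 V input limit, and an empty one gives 3.2 * 330/430 = 2.46 V, so the whole battery range stays readable by the ADC.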


Next, we move to PCBNew, a tool inside KiCad, where the PCB itself is designed. First, each component is matched with its footprint and, in case the latter is not already available, it has to be designed too.

Figure 7.18: PCB 3D render using KiCad tools

A few important tips I learned during the PCB design process:

Do not try to fit components really close together: they have to be soldered by hand (they could be soldered by machines, but that would be too expensive for such a low volume).

Keep wiring simple.

Use just 2 layers to have a cheaper PCB.

Use the first and second layers as a basic heat sink.

Make sure each footprint suits the components you intend to buy.

Export the Gerber files and check them on a dedicated page to make sure everything is as it should be.

Silkscreen is your friend. If you intend to assemble the board yourself, it is going to save you many hours of consulting the schematics in order to know which component goes where.


Figure 7.19: PCB distribution

Once the design process was done, the Gerber files were sent to the manufacturer, in this case in China, through firstpcb.com, in order to have them done as cheaply as possible. Their quality is not astonishing, not because they are made in China but because you basically get what you pay for. On the other hand, there's eurocircuits.com, which offers a great service with great quality from Belgium, so we do not have to worry about customs and taxes.

Figure 7.20: Gerber file detail using gerblook.org Figure 7.21: Gerber file detail using gerblook.org 2

Notice the price difference between the Chinese and Belgian manufacturers. It's nearly 10 times cheaper to manufacture them in China, and the manufacturing time is still 7 days, quite standard.


Figure 7.22: Eurocircuits budget

Figure 7.23: Firstpcb.com budget

7.2 Software
7.2.1 SITL Setup

When working with drones, there's a big problem: how do we test that everything is working correctly without having a special protected cage, or without being able to spend several thousand euros on drone parts due to crashes?


Software In The Loop (SITL) will be helping us. SITL, in a nutshell, can be understood as a real environment in a virtual world for your drone code to run in. SITL runs your code in a simulated world with a simulated drone, therefore you can test all the code you have written as many times as you want without needing a real drone.

SITL installation is done directly on the computer over the terminal. As the whole thesis also aims to help people reproduce everything, I'm going to go through most of the steps.

I have installed everything on a Mac, but I would highly recommend switching to Linux, as the whole installation will be easier.

Let's start. I'm going to sum up an amazing guide written by Daniel D. McKinnon [14].

We must have Python previously installed on our computer, so if that's not the case, install it.

First of all, we download and install all the repositories for DroneKit (the environment to develop code for the Pixhawk and other 3DRobotics platforms).

pip install dronekit

As soon as it's done:

pip install dronekit-sitl

*The pip command comes with Python; that's why we need Python to be installed on our computer.

That's it!! I just want to point out that on Mac computers it can be that simple if Python is previously installed, because most of the guides on the internet suggest installing the SITL through a virtual machine such as Vagrant, which is a tedious process.

To test everything out, just type on the terminal:

dronekit-sitl copter

And the simulated vehicle will boot up, printing its status on that terminal window.


Now the SITL is installed, but it would not be of any help if there is no GCS (Ground Control Station) connected to it. So we have to install the GCS and connect the SITL to it. This link is provided by MAVProxy, which connects the SITL with the GCS, where the information is displayed. Let's install it:

brew tap homebrew/science
brew install wxmac wxpython opencv
sudo pip uninstall python-dateutil
sudo pip install numpy pyparsing
sudo pip install MAVProxy

I won't get into detail of what each command does (it can be consulted in the guide I previously mentioned), but, short and sweet, some details:

brew comes from Homebrew, a package manager for OS X that makes sure all of your packages are linked properly

OpenCV: open-source computer vision

sudo: runs the command as an administrator

Great! Everything is now set up and ready to run. Give it a try by running the following on the terminal:

dronekit-sitl copter --home=41.526210,2.395841,20,121

The --home option sets the starting point for the drone and, right after the dronekit-sitl command, comes the kind of vehicle we are flying, in this case a copter.

On that terminal window, the quadcopter is being run by a virtual environment, simulating that the drone is connected to a flight controller.

To forward all the simulation data produced by the SITL to the GCS, we must use, as mentioned, the MAVProxy command.

Here we are forwarding all the MAVLink messages coming over TCP at the local IP address 127.0.0.1 on port 5760 to the 3 ports given in the udpout options. One of those 3 addresses is where the GCS will connect.
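A typical invocation looks like the following sketch (the three UDP ports are an assumption, except 14550, which the drone code later uses as its default):

mavproxy.py --master tcp:127.0.0.1:5760 --out udpout:127.0.0.1:14550 --out udpout:127.0.0.1:14551 --out udpout:127.0.0.1:14552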

Finally, we can control the drone and see the changes it undergoes in our preferred GCS program.


A quick example:

mode guided    # go into guided mode, so the drone is listening for instructions
arm throttle   # arm the drone
takeoff 10     # take off to 10 meters

7.2.2 Raspberry Pi 3

The Raspberry Pi 3 comes with nothing on it, not even an operating system. In fact there is no drive in it: you need to buy a separate micro SD card, which will become its hard drive.

I decided to go for Raspbian Jessie with PIXEL, the best-known operating system for the Raspi, as there is a lot of documentation for it, as well as regular updates and a great user community willing to help.

The installation process is quite straightforward, as some tools help to make it easier and faster. Download the .img image, uncompress it and burn it to the SD card. Burning the .img image to the SD card properly can be done through the terminal (a sketch follows) or with Etcher, a great new tool for Mac users which does all the hard terminal typing for you.
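For the terminal route on a Mac, a minimal sketch (the disk number N is a placeholder; double-check it with diskutil list, as writing to the wrong disk destroys its contents):

diskutil list                                         # find the SD card, e.g. /dev/disk2
diskutil unmountDisk /dev/diskN                       # unmount it before writing
sudo dd if=raspbian-jessie.img of=/dev/rdiskN bs=1m   # burn the image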

Once the .img file is burned onto the SD card, just plug it into the Raspi and power it up. Bingo! The Raspi is now alive and ready for some coding.

The Raspi environment must also be prepared to function correctly. Let's recall the functions of the Raspi in the whole process:

Provide internet connection to the drone, acting as a gateway between the sensors and the drone itself

Run the Python code in charge of the decision-taking algorithm

Manage incoming MQTT messages

Provide computational power in case image processing is needed through OpenCV

Taking into account the features the Raspberry Pi 3 must fulfil, it's clear that the Python environment and the full DroneKit package must also be installed on it. The main advantage is that Raspbian is really similar to any other Linux distribution, therefore installing new repositories and packages is quite easy.

Type the following on the Raspi terminal:


$ sudo apt-get update   # update the list of packages in the software center
$ sudo apt-get install screen python-wxgtk2.8 python-matplotlib python-opencv
  python-pip python-numpy python-dev libxml2-dev libxslt-dev
$ sudo pip install future
$ sudo pip install pymavlink
$ sudo pip install mavproxy

*Ignore the $ symbol, it is just used to highlight the different lines.

Now all the packages (Python, MAVLink, OpenCV, the pip package manager, MAVProxy) have been installed.

On the Raspi we also need to disable the OS control of the serial port. Attention!! A screen must be connected to the Raspi in order to complete these steps. The previous ones could be done through an SSH connection (whose setup will be explained) [15].

On the terminal, type:

sudo raspi-config

In the utility, select Advanced Options, and then Serial to disable OS use of the serial connection. Reboot the Raspberry Pi when you are done.

Now let's wire up the Pixhawk with the Raspberry Pi 3 through telemetry port 2, following the wiring connections from the image below.

Before testing the connection, the Pixhawk also has to be set up to accept the packets coming from the Raspi over serial port 2 (telem2).

Using the preferred GCS, in this case Mission Planner on Windows, set the following parameters on the advanced window:

SERIAL2_PROTOCOL = 1 (the default) to enable MAVLink on the serial port.

SERIAL2_BAUD = 921 so the Pixhawk can communicate with the RPi at 921600 baud.

LOG_BACKEND_TYPE = 3 if you are using APSync to stream the dataflash log files
to the RPi


Now everything is ready, so we can check the connection between both computers. On the Raspi terminal console type:

sudo -s
mavproxy.py --master=/dev/serial0 --baudrate 921600 --aircraft MyCopter

The --master option sets the port where the Pixhawk is connected on the Raspi. Attention: on some Raspis the UART connection is disabled by default. Enable serial communication by editing the /boot/config.txt file and setting enable_uart=1 [16].

If the connection is successful, a confirmation message will appear on the console.

Some really interesting tools for a better workflow:

SSH connection, in order to control the terminal from my personal laptop without a physical connection to the Raspberry Pi

Pushing files over the SSH connection (see the sketch below)
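A minimal sketch of both (the IP address and file names are placeholders):

ssh pi@192.168.1.42                            # open a remote terminal on the Raspi
scp drone_code.py pi@192.168.1.42:/home/pi/    # push a file over the same SSH channel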

7.2.3 MQTT

What would we do with all the data the sensors are collecting? It would be nearly useless if this valuable data were not used in real time.

Somehow the data had to be sent back to the main server, a webpage or, in this case, back to the drone itself. Analysing which options the market offered, I started seeing numerous problems. First of all, a small portable device would necessarily be in charge of providing the basic connection.

I found out that there are numerous telecommunications companies that offer really cheap, low-volume data plans which are perfectly suitable for the needs of this project. The small SIM cards bought from these companies are inserted into a small WiFi router, which creates an access point for the sensors to connect to. Noticeably, this system is not efficient at all, as each sensor (and many of them are needed) would need its own router, since they are too far from one another to share a central one. But it also has to be highlighted that the internet communication is not within the aim of the project; it is just developed in order to demonstrate the capabilities of the idea.


Nowadays there are many solutions that offer internet connection at a globally lower cost, but they must be developed from scratch. For example, the Sigfox company offers many internet solutions for the IoT world, but they are not as open-source friendly as Arduino or Raspberry Pi are, and you must use their communication protocol and specific processors.

Figure 7.24: Arduino MKRFOX1200 Figure 7.25: Sigfox modules

Once the first step of the communication is solved, having an internet connection that can be used by the sensor processor (which includes a small WiFi module), we have to focus on the communication protocol.

Before deciding whether to use HTTP, HTTPS, UDP, FTP, SSH or any other, let's think about what we need:

Low power consumption

High bandwidth is not needed; small messages will be sent

Easy to implement

Lightweight for the processor

Varying levels of latency, bandwidth constraints and unreliable connections may appear

Now a decision can be taken, and the MQTT protocol seems the perfect candidate. It is easy to use, does not require high computation power and there are many reliable servers on the internet that are free to use. Moreover, it is fully open-source.

But what is MQTT? MQTT stands for Message Queue Telemetry Transport, a perfect protocol for machine-to-machine (M2M) communication [17]. Its architecture is based on a "star", where we have a central node which acts as a server or broker. The broker is in charge of managing the whole network and forwarding each message to its final destination.

The system is quite simple. The clients (each sensor in this case) must subscribe to a specific topic, and there they will be publishing their messages (temperature, humidity...). This way, each sensor can also be subscribed to other topics not related to its own, so it can receive specific data from other channels, achieving a 2-way communication. The main benefit is that, unlike HTTP, we do not need to open a new connection each time a message has to be sent, as the communication channel is always open.
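As a quick illustration of the publish/subscribe pattern, using the mosquitto command-line clients installed later in this document (the broker address and the payload values are just example assumptions):

# terminal 1: a client subscribes to a topic on the broker
mosquitto_sub -h 127.0.0.1 -t "GPIO"

# terminal 2: a sensor publishes to the same topic;
# the broker forwards the message to every subscriber
mosquitto_pub -h 127.0.0.1 -t "GPIO" -m '{"Temperature": 24.0, "Humidity": 60.0}'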

7.2.4 Sensors

Sensors must be kept simple, as the aim of this project is not to design efficient temperature sensors. Therefore I decided that the programming would be done in Arduino, as it is the programming environment that, in my opinion, is the fastest to prototype with.


8 Code Description

8.1 Setup
Right before starting: if you ever come across this project, please check my GitHub (@marcunzueta) for all the updates on the code, as I'll try to keep working on it.

The code on the sensors handles everything from the internet connection to the sensor readings and the battery management. In order to explain how it works, each section is going to be briefly commented so the whole code is covered.

#include "SPI.h"
#include "DHT11.h"
#include "ArduinoJson.h"

// ESP8266 and WiFi
#include "ESP8266WiFi.h"
#include "Adafruit_MQTT.h"
#include "Adafruit_MQTT_Client.h"
Listing 1: Libraries declaration

The libraries include code needed for each specific sensor, and they are usually developed by the sensor manufacturer. For example, the SPI.h library manages the SPI protocol connection and was developed by Arduino. Notice the Adafruit libraries: I found out how to use the ESP8266 12E through their excellent tutorials and open-source hardware schematics.

#define WLAN_SSID "OnePlus3"
#define WLAN_PASS "****"

#define WLAN_SSID2 "MOVISTAR_85C0"
#define WLAN_PASS2 "****"

#define AIO_SERVER "m12.cloudmqtt.com"
#define AIO_SERVERPORT 12461  // use 8883 for SSL
#define AIO_USERNAME "****"
#define AIO_KEY "****"

Listing 2: WiFi constants

The MQTT broker information: the m12.cloudmqtt.com address belongs to CloudMQTT, a service hosted on Amazon's infrastructure [18], which offers a free MQTT broker with a great console where the topics are set.

The variables with **** are just private keys or passwords. Notice I'm not using the SSL port, because it was slowing down the overall computation time a bit, and the data is far from being sensitive or private enough to need the SSL protocol.

// Using the World Geodetic System WGS 84
#define LAT "41.594240"
#define LONG "2.078951"
#define ID "CastellarDelValles"  // Small nickname to quickly identify the sensor
// The location values must be PRE-included in the code for each sensor,
// as a GPS method is not in the scope of this project.
Listing 3: Position coordinates

Each sensor must have a unique position, which will be sent along with the temperature data in order to build a temperature and humidity map. Moreover, the exact position is needed so the drone in charge of the area surveillance can go there.

// WiFi and MQTT communication
WiFiClient client;
Adafruit_MQTT_Client sensor(&client, AIO_SERVER, AIO_SERVERPORT, AIO_USERNAME, AIO_KEY);
Adafruit_MQTT_Publish reader = Adafruit_MQTT_Publish(&sensor, "GPIO");
Adafruit_MQTT_Subscribe onoffbutton = Adafruit_MQTT_Subscribe(&sensor, "GPIO");

// Humidity sensor
DHT11 dht11(D1);

// Needed to send JSON commands
DynamicJsonBuffer jsonBuffer;
JsonObject& root = jsonBuffer.createObject();
JsonObject& root2 = jsonBuffer.createObject();
Listing 4: Objects declaration

The objects are declared in order to use their specific methods later in the code. For example, from Adafruit_MQTT_Client we create the sensor object (any other name is valid), so we can use its functions such as sensor.readSubscription(0) or sensor.connected(). For the DHT11 sensor, its data pin is declared.

In order to have an easier and cleaner communication protocol, the JSON text format is used.


// DHT11 sensor
int SensorPin = 1;
float temp, hum;
int counter = 0;
int amount = 3;
int err;

// Hygrometer sensor
int hydrometer = A0;  // port declaration
int hydrometerval = 0;

// Voltage sensor
float voltage = 0;

// String values
String publicacio;
String cero = "0";
Listing 5: Globals variables and constants

Variable declaration and initialization. It's important to keep in mind which kind of variable is needed, to improve code efficiency. For example, a float takes 32 bits (4 bytes) of information while, on 8-bit Arduino boards, an int takes 16 bits (2 bytes). Furthermore, the uint8_t variable type is just 8 bits (1 byte).

void setup(void) {

  Serial.begin(115200);
  Serial.println("Booting up...");

  // Connect to WiFi access point.
  Serial.println();
  Serial.print("Connecting to ");
  Serial.println(WLAN_SSID);

  WiFi.begin(WLAN_SSID, WLAN_PASS);

  while (WiFi.status() != WL_CONNECTED) {
    delay(100);
    Serial.print(".");
  }

  // Setup MQTT subscription for the onoff feed.
  sensor.subscribe(&onoffbutton);
}

Listing 6: Setup function

The setup function in Arduino is executed just once, every time the processor boots up. Therefore its actions are just setup tasks, as its name shows. Inside the setup, we connect to the WiFi network, set the baudrate (in case communication with the computer is needed) and subscribe to the topic where information is expected to arrive from the central server or the drone itself.

Note that in case the WiFi communication is lost, we must reconnect to it later, in the loop function.

8.1.1 Loop function

void loop(void) {

  // Reconnection if the WiFi is no longer connected
  if (WiFi.status() != WL_CONNECTED)
  {
    Serial.print("Connecting to ");
    Serial.println(WLAN_SSID);
    WiFi.begin(WLAN_SSID, WLAN_PASS);
    while (WiFi.status() != WL_CONNECTED) {
      delay(100);
      Serial.print(".");
    }
  }

  MQTT_connect();

  long rssi = WiFi.RSSI();
  Serial.print("RSSI: ");
  Serial.println(rssi);

Listing 7: WiFi check and reconnection

The loop function is divided into several areas, as it is the largest part of the code. First of all, it checks if the connection is still alive and, if not, reconnects to the same network. The MQTT_connect function connects the sensor to the MQTT broker using the configuration previously stated.


WiFi.RSSI checks the strength of the WiFi signal. Why should we have it here? It can be interesting to check how the WiFi/4G signal varies along the mountain and with the weather, and it could be handy for debugging problems, as the RSSI value will be sent within the sensor data package.

if ((err = dht11.read(hum, temp)) == 0)  // if it returns 0, the reading was correct
{
  Serial.print("Humidity: ");
  Serial.print(hum); Serial.print(" ");
  Serial.print("Temperature: "); Serial.print(" ");
  Serial.print(temp);
}
else
{
  Serial.println();
  Serial.print("Error Num: "); Serial.print(" ");
  Serial.print(err);
  Serial.println(" ");
}

counter++;

Serial.print("Hydrometer: "); Serial.print(" ");
hydrometerval = analogRead(hydrometer);
Serial.print(hydrometerval);

voltage = read_voltage();
Listing 8: DHT11 sensor reading

All the Serial.print calls that appear throughout the code can be removed from the code that is uploaded to the final sensor, as no computer will be connected to it, so the data does not have to be displayed on a serial monitor. They were used to check the code.

The hygrometer sensor sends back an analog signal, from 0 to 1023, which helps to understand how wet or dry the ground is. On the other hand, the read_voltage function is used to check the batteries' charge, making it possible to reduce the communication rate in order to reduce the power consumption.

// Add values to the objects
root["Temperature"] = temp;
root["Humidity"] = hum;
// root["Measurement"] = counter;
root["Hydrometer"] = hydrometerval;
root["Battery Voltage"] = voltage;
root2["Latitude"] = LAT;
root2["Longuitude"] = LONG;
root2["ID"] = ID;
root2["WiFi_RSSI"] = rssi;

// Lastly, you can print the resulting JSON to a String
String output;   // important to clean the string each time; if not,
                 // it keeps writing over it, accumulating the previous messages.
String output2;
root.printTo(output);
root2.printTo(output2);

int str_len = output.length() + 1;  // +1 for the null terminator
char char_array[str_len];
output.toCharArray(char_array, str_len);

int str_len2 = output2.length() + 1;  // +1 for the null terminator
char char_array2[str_len2];
output2.toCharArray(char_array2, str_len2);
Listing 9: JSON variables

All the root lines are used to build the JSON text and prepare it to be sent over the MQTT protocol. A few transformations must be made before sending it, as the reader.publish function needs a char array instead of a String to send the information correctly.

// Publish to the MQTT server
if (!reader.publish(char_array)) {
  Serial.println(F("Failed"));
} else {
  Serial.println(" ");
  Serial.println(F("Sent first array"));
}

if (!reader.publish(char_array2)) {
  Serial.println(F("Failed"));
} else {
  Serial.println(F("Sent second array"));
}

yield();  // let the ESP core handle WiFi stuff

Listing 10: MQTT publishing message

As explained before, the reader.publish sends the message to the MQTT broker so the drone
can receive it, through a subscription function in its code.

Adafruit_MQTT_Subscribe *subscription;

while ((subscription = sensor.readSubscription(0))) {

  if (subscription == &onoffbutton) {
    Serial.print(F("Got: "));
    Serial.println((char *)onoffbutton.lastread);
  }
}

// sensor read delay
sensor_delay(voltage);
Serial.println("\n\n");
}

Listing 11: MQTT subscription

8.2 Functions declaration

float read_voltage() {
  float volt = 4.00;
  volt = (4.2 * analogRead(A0)) / 975;
  Serial.println(volt);
  return volt;
}
Listing 12: Voltage check function

Reads the voltage from the batteries using the analog pin of the ESP8266 12E and converts the analog value to a normal volt value. Note the 975: it was obtained after calibrating the readings from the ESP against a multimeter so they are as accurate as possible.

void MQTT_connect() {
  int8_t ret;
  // If we have reconnected, re-subscribe to the subscription queue
  // Stop if already connected.
  if (sensor.connected()) {
    return;
  }
  Serial.print("Connecting to MQTT... ");
  uint8_t retries = 3;
  while ((ret = sensor.connect()) != 0) {  // connect will return 0 if connected
    Serial.println(sensor.connectErrorString(ret));
    Serial.println("Retrying MQTT connection in 5 seconds...");
    sensor.disconnect();
    delay(5000);  // wait 5 seconds
    retries--;
    if (retries == 0) {
      // basically die and wait for the WDT to reset me
      while (1);
    }
  }
  Serial.println("MQTT Connected!");
}

Listing 13: MQTT connection function

Tries to connect to the broker a maximum of 3 times and, if it is not able to, the board hangs in the while(1) loop until the watchdog timer resets it.

The following function is in charge of adapting the measurement rate to the battery level:
void sensor_delay(float voltage) {

  // Used to reduce power consumption
  if (voltage < 3.00) {
    Serial.println("10 minutes delay");
    delay(600000);  // 10 minutes delay
  }
  else if (voltage < 3.20)
  {
    Serial.println("5 minutes delay");
    delay(300000);  // 5 minutes delay
  } else if (voltage < 3.60)
  {
    Serial.println("1 minute delay");
    delay(60000);  // 1 minute delay
  } else if (voltage < 4.00)
  {
    Serial.println("20 second delay");
    delay(20000);  // 20 second delay
  } else
  {
    Serial.println("5 second delay");
    delay(5000);
  }
}

Listing 14: Sensor delay function

As the board is battery powered and the solar panel might not always be able to recharge the battery, depending on the battery voltage we can decrease the data gathering rate, therefore reducing the power consumption, because the WiFi communication is the process that consumes the most energy.

8.3 Pixhawk

8.4 Drone python code

# Libraries
import paho.mqtt.client as mqtt
from dronekit import connect, VehicleMode, LocationGlobalRelative, Command, LocationGlobal
from pymavlink import mavutil
import time
import argparse
import math
import json
Listing 15: Libraries used

Each of these libraries is mandatory for the code to work. Just like in Arduino, there are functions that are already written and, by just importing the libraries where they are located, they can be used.

# MQTT variables
host = "m12.cloudmqtt.com"
port = 12461
topic2 = "DroneKit"
username = "marc"
passwd = "unzueta"
Listing 16: MQTT variables on drone code


Basic data for the Raspberry Pi 3 to connect to the MQTT broker. The broker will be managing 2 different topics: DroneKit, where it is communicated whether or not a mission must start and, in case it must, where the location information is sent; and GPIO, where all the data from the sensors is sent, so other teams can subscribe to it and get the data, for example in case a study over many days has to be made. The Raspberry will have a second code running in parallel, subscribed to that topic and writing all the incoming data to a .txt file.

hovering_time = 10  # seconds
airspeed = 5  # m/s
first_time = True
lat = 41.526210
lon = 2.395841
size = 20
segx, segy = 5, 5
Listing 17: Global variables

Global variables can be seen by all the functions declared later in the code. The lat and lon variables are the base/landing position for the drone. In case of having a network of drones, that value should therefore be different for each drone. The other variables will be explained with their specific functions. Notice that in Python it's not needed to declare the type of each variable; however, there are cases where it's interesting to have it declared [?].

parser = argparse.ArgumentParser()
parser.add_argument('--connect', default='127.0.0.1:14550',
    help='As default connection we will be connected to 127.0.0.1:14550, '
         'which is the SITL')
parser.add_argument('--topic', default='DroneKit',
    help='Write the topic where you want to be subscribed; remember that it '
         'should be on your MQTT broker server, in our case cloudmqtt.com')
parser.add_argument('--alt', default=20,
    help='The altitude above the ground')
args = parser.parse_args()
Listing 18: Parser commands

The parser lets us introduce specific variable values right when executing the code. For example, in case the topic has to be different each time we run the code, we must do:

sudo python drone_code.py --topic NEWTOPIC


Listing 19: Parse command when running the code


There's also the option to set a default for each value so that, if it does not need to change, we do not need to pass any value at all.

print "The topic where we will be subscribed is:"
print(args.topic)

# Connect to the Vehicle
print "Connecting to vehicle on: %s" % args.connect
vehicle = connect(args.connect, baud=57600, wait_ready=True)

Listing 20: Pixhawk and Raspberry Pi connection

Then the Python code must connect to the Pixhawk or the SITL. When the Raspberry Pi 3 is connected to the Pixhawk via serial, we will need to check that the baudrate is the same on both. In that case args.connect should be the serial device, not the UDP address, so check where your Pixhawk will be sending data on your Raspberry Pi 3.
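For example, when running on the Raspi connected over serial, the call would look like this sketch (the device path matches the MAVProxy test shown earlier; the baudrate is set inside the code):

sudo python drone_code.py --connect /dev/serial0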

cmds = vehicle.commands
cmds.clear()
cmds.upload()

Listing 21: Commands clean up

Cleaning previous commands and missions so that they do not interfere with the new ones.

def arm_and_takeoff(aTargetAltitude):

    print "Basic pre-arm checks"
    # Don't let the user try to arm until autopilot is ready
    while not vehicle.is_armable:
        print "Waiting for vehicle to initialise..."
        time.sleep(1)

    print "Arming motors"
    # Copter should arm in GUIDED mode
    vehicle.mode = VehicleMode("GUIDED")
    vehicle.armed = True

    while not vehicle.armed:
        print "Waiting for arming..."
        time.sleep(1)

    print "Taking off!"
    vehicle.simple_takeoff(aTargetAltitude)  # Take off to target altitude

    # Check that vehicle has reached takeoff altitude
    while True:  # while we are armed; refers to vehicle.armed
        print "Altitude: ", vehicle.location.global_relative_frame.alt, "m"
        print "Battery: %s" % vehicle.battery.voltage, "V"
        print " "
        # Break and return from function just below target altitude.
        if vehicle.location.global_relative_frame.alt >= aTargetAltitude * 0.95:
            print "Reached target altitude"
            print "Take off complete"
            break
        time.sleep(1)
Listing 22: Arm and takeoff function

This first function makes the drone take off to a predefined altitude (height over the ground). But before the drone can take off, a series of checks must be passed: GPS, barometer, accelerometer, radio... Once the checks are passed, the drone is armed and set to GUIDED mode so commands can be sent. In the DroneKit environment, if the radio is on, we can switch to Stabilize mode at any moment in order to take manual control of the drone, just in case something goes wrong. The battery and altitude information is also displayed on the terminal screen; since we can have the GCS connected too, there is no real need for those messages on the terminal, except in case debugging is needed [19].

def on_connect(client, obj, rc):
    print "rc: " + str(rc)
    if rc == 0:
        print "Connection successful"
        print "Connected to the " + host + " server with port " + str(port)
Listing 23: MQTT message detection

The on_connect function is run each time the broker answers our connection request.

def hover(seconds):
    print "Hovering for %s seconds" % seconds
    for x in range(0, seconds):
        left = seconds - x
        print "%s seconds left" % left
        time.sleep(1)
Listing 24: Hover function


A useful hover function, usually used when data has to be gathered from onboard sensors or a photo has to be taken.

def land():
    print "Now let's land"
    vehicle.mode = VehicleMode("LAND")
    while vehicle.mode.name == "LAND":
        print "Altitude: ", vehicle.location.global_relative_frame.alt, "m"
        print "Battery: %s" % vehicle.battery.voltage, "V"
        print " "
        time.sleep(1)  # Same as Arduino's delay
        if vehicle.location.global_relative_frame.alt <= 1:
            print "Reached ground"
            print "Mission ended"
            break

Listing 25: Landing command

Basic land command, just triggers the pre-defined LAND mode, and displays information
about the landing procedure.

def get_distance_metres(aLocation1, aLocation2):
    """
    https://github.com/diydrones/ardupilot/blob/master/Tools/autotest/common.py
    """
    dlat = aLocation2.lat - aLocation1.lat
    dlong = aLocation2.lon - aLocation1.lon
    return math.sqrt((dlat * dlat) + (dlong * dlong)) * 1.113195e5

Listing 26: Get distance V1

def get_location_metres(original_location, dNorth, dEast):
    """
    For more information see:
    http://gis.stackexchange.com/questions/2951/algorithm-for-offsetting-a-latitude-longitude-by-some-amount-of-meters
    """
    earth_radius = 6378137.0  # Radius of "spherical" earth
    # Coordinate offsets in radians
    dLat = dNorth / earth_radius
    dLon = dEast / (earth_radius * math.cos(math.pi * original_location.lat / 180))

    # New position in decimal degrees
    newlat = original_location.lat + (dLat * 180 / math.pi)
    newlon = original_location.lon + (dLon * 180 / math.pi)
    return LocationGlobal(newlat, newlon, original_location.alt)

Listing 27: Get distance V2

def distance_to_current_waypoint():
    nextwaypoint = vehicle.commands.next
    if nextwaypoint == 0:
        return None
    missionitem = vehicle.commands[nextwaypoint - 1]  # commands are zero indexed
    lat = missionitem.x
    lon = missionitem.y
    alt = missionitem.z
    targetWaypointLocation = LocationGlobalRelative(lat, lon, alt)
    distancetopoint = get_distance_metres(vehicle.location.global_frame,
                                          targetWaypointLocation)
    return distancetopoint

Listing 28: Distance until next waypoint

The last 3 functions compute the distance between waypoints, references or objects. Most of them come from the python.dronekit.io examples [20].

def download_mission():
    cmds = vehicle.commands
    cmds.download()
    cmds.wait_ready()  # wait until download is complete.

Listing 29: Download new mission created

Downloads the current mission from the Pixhawk.

def grid(aLocation, aSize, SegmentX, SegmentY):

    cmds = vehicle.commands

    print " Clear any existing commands"
    cmds.clear()

    print " Define/add new commands."

    cmds.add(Command(0, 0, 0, mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT,
                     mavutil.mavlink.MAV_CMD_NAV_TAKEOFF, 0, 0, 0, 0, 0, 0, 0, 0, 10))
    k = 1

    pointlat = range(1000)
    pointlon = range(1000)
    pointalt = range(1000)
    h = 1

    print range(SegmentY/2, 0, -1)
    print range(SegmentX/2, 0, -1)
    print range(1, SegmentX/2+1)
    print range(1, SegmentY/2+1)
    print range(SegmentX/2, 0, -1)
    print range(1, SegmentX/2+1)

    while k < (SegmentY+1)*(SegmentX+1)+10:

        for j in range(SegmentY/2+1, 0, -1):
            # if back_track == False && back_track2 == False:
            h = -h
            for i in range(SegmentX/2+1, 0, -1):
                pointlat[k] = get_location_metres(aLocation, aSize*i*h, aSize*j).lat
                pointlon[k] = get_location_metres(aLocation, aSize*i*h, aSize*j).lon
                pointalt[k] = get_location_metres(aLocation, aSize*i*h, aSize*j).alt
                k = k+1
                print k
                print "1"
            pointlat[k] = get_location_metres(aLocation, 0, aSize*j).lat
            pointlon[k] = get_location_metres(aLocation, 0, aSize*j).lon
            pointalt[k] = get_location_metres(aLocation, 0, aSize*j).alt
            k = k+1
            for i in range(1, SegmentX/2+1):
                pointlat[k] = get_location_metres(aLocation, aSize*i*h, aSize*j).lat
                pointlon[k] = get_location_metres(aLocation, aSize*i*h, aSize*j).lon
                pointalt[k] = get_location_metres(aLocation, aSize*i*h, aSize*j).alt
                k = k+1
                print k
                print "2"

        for j in range(1, SegmentY/2+1):
            h = -h
            for i in range(SegmentX/2+1, 0, -1):
                pointlat[k] = get_location_metres(aLocation, aSize*i*h, -aSize*j).lat
                pointlon[k] = get_location_metres(aLocation, aSize*i*h, -aSize*j).lon
                pointalt[k] = get_location_metres(aLocation, aSize*i*h, -aSize*j).alt
                k = k+1
                print "3"
            pointlat[k] = get_location_metres(aLocation, 0, -aSize*j).lat
            pointlon[k] = get_location_metres(aLocation, 0, -aSize*j).lon
            pointalt[k] = get_location_metres(aLocation, 0, -aSize*j).alt
            k = k+1
            for i in range(1, SegmentX/2+1):
                pointlat[k] = get_location_metres(aLocation, aSize*i*h, -aSize*j).lat
                pointlon[k] = get_location_metres(aLocation, aSize*i*h, -aSize*j).lon
                pointalt[k] = get_location_metres(aLocation, aSize*i*h, -aSize*j).alt
                k = k+1
                print "4"

    print get_location_metres(aLocation, aSize*2, aSize*3).lat
    print pointlat[2]
    for k in range((SegmentY+1)*(SegmentX+1)+10):
        cmds.add(Command(0, 0, 0, mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT,
                         mavutil.mavlink.MAV_CMD_NAV_WAYPOINT, 0, 0, 0, 0, 0, 0,
                         pointlat[k], pointlon[k], pointalt[k]))

    # add dummy waypoint "k+1" at point k
    cmds.add(Command(0, 0, 0, mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT,
                     mavutil.mavlink.MAV_CMD_NAV_WAYPOINT, 0, 0, 0, 0, 0, 0,
                     pointlat[k], pointlon[k], pointalt[k]))

    print " Upload new commands to vehicle"
    cmds.upload()
Listing 30: Creates the grid mission

Creates a basic grid by adding waypoints around a starting one. The size of the grid can be increased or decreased by changing the SegmentX, SegmentY and aSize values.

def gotopoint():
    print "Set default/target airspeed to %s" % airspeed
    vehicle.airspeed = 20  # m/s

    # Using the World Geodetic System WGS 84
    point1 = LocationGlobalRelative(lat, lon, 30)

    # http://python.dronekit.io/automodule.html#dronekit.Vehicle.simple_goto
    # ATTENTION === There is no mechanism for notification when the target location
    # is reached, and if another command arrives before that point, it will be
    # executed immediately.
    vehicle.simple_goto(point1)

    currentLocation = vehicle.location.global_relative_frame
    print currentLocation
    targetDistance = get_distance_metres(currentLocation, point1)
    print "The total distance is: %s" % targetDistance, "m"
    while vehicle.mode.name == "GUIDED":
        # Stop action if we are no longer in guided mode.
        remainingDistance = get_distance_metres(vehicle.location.global_relative_frame,
                                                point1)
        print "Distance to target: ", remainingDistance, "m"
        print "Altitude: ", vehicle.location.global_relative_frame.alt, "m"
        print "Battery: %s" % vehicle.battery.voltage, "V"
        print " "
        if remainingDistance <= targetDistance * 0.01:
            print "Reached target"
            break
        time.sleep(2)
Listing 31: Waypoint setting command

Sends the vehicle to the next location and displays information on the terminal.

# We can define functions, and include them in another separate function
def main(client, obj, msg):  # used to be on_message
    print "Topic: " + msg.topic + "\n" + "Message: " + str(msg.payload)
    print " "
    decoded = json.loads(msg.payload)
    if first_time:
        global first_time, lat, lon
        first_time = False
        print(" ")
        print("Topic: " + msg.topic + "\n" + "Message: " + str(msg.payload))
        print(" ")
        decoded = json.loads(msg.payload)
        print "ID: ", decoded["ID"],
        print "Lat: ", decoded['Latitude'],
        print "Long: ", decoded['Longuitude']
        print
        # Initialize the takeoff sequence to 20m
        print "trying to take off"

        lat = float(decoded['Latitude'])
        lon = float(decoded['Longuitude'])

        arm_and_takeoff(30)

        # Hover for 10 seconds
        hover(hovering_time)
        gotopoint()
        grid_mission()
        return_to_launch()
        first_time = True
Listing 32: Main function

def return_to_launch():
    print "Returning to Launch"
    vehicle.mode = VehicleMode("RTL")
    while VehicleMode("RTL"):
        print "Altitude: ", vehicle.location.global_relative_frame.alt, "m"
        print "Battery: %s" % vehicle.battery.voltage, "V"
        print " "
        time.sleep(1)
        if vehicle.location.global_relative_frame.alt <= 0.5:
            print "Reached ground"
            print "Mission ended"
            break

Listing 33: RTL command

Sets the basic Return to Launch mode, and sends some information back to the terminal.

def grid_mission():
    print 'Create a new mission (for current location)'

    SegmentY = float(segy)
    SegmentX = float(segx)

    grid(vehicle.location.global_frame, 20, 8, 8)

    print "Starting mission"
    # Reset mission set to first (0) waypoint
    vehicle.commands.next = 0

    # Set mode to AUTO to start mission
    vehicle.mode = VehicleMode("AUTO")

    # Monitor mission.
    # Demonstrates getting and setting the command number.
    # Uses distance_to_current_waypoint(), a convenience function for finding the
    # distance to the next waypoint.

    while True:
        nextwaypoint = vehicle.commands.next
        print 'Distance to waypoint (%s): %s' % (nextwaypoint,
                                                 distance_to_current_waypoint())

        if nextwaypoint == (SegmentY+4)*(SegmentX+4)+8:
            print "Exit standard mission when starting to head to the final waypoint"
            break
        time.sleep(1)
Listing 34: Merges the grid mission on to the general mission

client.on_message: Each time we receive a message on the topic we are subscribed to, the client.on_message callback is called.

client.on_connect: The function is executed each time the broker answers our connection request.

client.username_pw_set(username, passwd): Sets the username and password for the communication channel.

client.connect(host, port): Connects to the host we define.

client.subscribe(topic2, 0): Sets the topic where we will be listening for messages.

client.loop_forever(): Blocks, keeping the connection to the MQTT service alive.

# Client object
client = mqtt.Client()

client.on_message = main

client.on_connect = on_connect

# Connect to the broker we have established on cloudmqtt.com
client.username_pw_set(username, passwd)
client.connect(host, port)

# Start subscribe, with QoS level 0
client.subscribe(topic2, 0)

print "Waiting for the mission to start"

client.loop_forever()

Listing 35: MQTT parameters


8.5 Data python code

def on_message(client, obj, msg):

    # msg is a class with members topic, payload, qos, retain.
    print("Topic: " + msg.topic + "\n" + "Message: " + str(msg.payload))
    print("")
    decoded = json.loads(msg.payload)
    print "Temperature: ", decoded["Temperature"], "C"
    print "Humidity: ", decoded['Humidity'], "%"
    print "Hydrometer: ", decoded['Hydrometer']
    print "Measurement: ", decoded['Measurement']
    print "Battery voltage: ", decoded['Battery'], "V"
    print "Time ", time.strftime("%Y-%m-%d %H:%M:%S")
    print
    file_name = 'data_sensors_test1.txt'
    file_temperature = 'temp.txt'
    file_humidity = 'hum.txt'
    file_hydrometer = 'hydro.txt'
    file_battery = 'bat.txt'
    file_measurement = 'measur.txt'
    file_time = 'time.txt'

    with open(file_name, 'a') as x_file:
        # 'a' is the append mode
        # x_file is the object through which we save data
        x_file.write('Temperature: {} \n'.format(decoded['Temperature']))
        x_file.write('Humidity: {} \n'.format(decoded['Humidity']))
        x_file.write('Hydrometer: {} \n'.format(decoded['Hydrometer']))
        x_file.write('Battery: {} \n'.format(decoded['Battery']))
        x_file.write('Measurement: {} \n'.format(decoded['Measurement']))
        time_value = time.strftime("%Y-%m-%d %H:%M:%S")
        x_file.write('Time: {} \n'.format(time_value))
        x_file.write('\n')
    with open(file_temperature, 'a') as x_file:
        x_file.write('{}'.format(decoded['Temperature']))
        x_file.write('\n')
    with open(file_humidity, 'a') as x_file:
        x_file.write('{}'.format(decoded['Humidity']))
        x_file.write('\n')
    with open(file_hydrometer, 'a') as x_file:
        x_file.write('{}'.format(decoded['Hydrometer']))
        x_file.write('\n')
    with open(file_battery, 'a') as x_file:
        x_file.write('{}'.format(decoded['Battery']))
        x_file.write('\n')
    with open(file_measurement, 'a') as x_file:
        x_file.write('{}'.format(decoded['Measurement']))
        x_file.write('\n')
    with open(file_time, 'a') as x_file:
        x_file.write('{}'.format(decoded['Time']))
        x_file.write('\n')
Listing 36: Write data on text file from the MQTT messages

It has a really similar structure to the drone_code.py file; the only difference is the part above, where the incoming JSON packet is also saved to .txt files for further study.

Notice that each value is written to a different file, so it will be easier to graph each of them using Excel, for example.


Figure 8.1: Waypoint-specific mission created by the Python code

Figure 8.2: Real-time view of the drone path

8.6 MQTT
The MQTT broker must be installed both on the Raspberry Pi and on the computer where the tests are done.

8.6.1 MQTT Broker

The MQTT service needs a server, the broker, where all the messages arrive. Amazon developed a great system which enables developers to test their apps or systems directly on their platform through the AWS services, and they offer a free plan with limited functionality. On the control panel we set the username and password, and a URL with a port is given to us, which we must use to connect the sensor to the broker.
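As an illustration, this is roughly how a Python client connects to such a broker using the Paho library installed later in this section; the hostname, port, credentials and topic below are placeholders, not the actual values used in the project:

import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    # rc == 0 means the broker accepted the connection
    print("Connected with result code {}".format(rc))
    client.subscribe("sensors/#")  # placeholder topic filter

client = mqtt.Client()
client.username_pw_set("broker_user", "broker_password")  # from the control panel
client.on_connect = on_connect
client.connect("your-instance.cloudmqtt.com", 15483, 60)  # URL, port, keepalive
client.loop_forever()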

8.6.2 Raspberry Pi 3 MQTT installation

The following commands are from an Adafruit tutorial.


There are several MQTT broker implementations available on the market; ours is the well-known Mosquitto.

sudo apt-get update
sudo apt-get upgrade
sudo apt-get dist-upgrade
sudo apt-get install mosquitto mosquitto-clients python-mosquitto

Listing 37: Install mosquitto packages
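On Raspbian the broker service is started automatically after the installation. Assuming a systemd-based image (Jessie or later), it can be checked with:

sudo systemctl status mosquitto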

Then, we must configure the mosquitto.conf file. Its path should be:

/etc/mosquitto/mosquitto.conf

# Config file for mosquitto
#
# See mosquitto.conf(5) for more information.

user mosquitto
max_queued_messages 200
message_size_limit 0
allow_zero_length_clientid true
allow_duplicate_messages false

listener 1883
autosave_interval 900
autosave_on_changes false
persistence true
persistence_file mosquitto.db

allow_anonymous true
password_file /etc/mosquitto/passwd

Listing 38: Configuration file for mosquitto

Figure 8.3: CloudMQTT console

Finally, restart the broker so that the new configuration is loaded:

sudo systemctl restart mosquitto
Listing 39: Restart mosquitto
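With the broker restarted, a one-line publish from Python is enough to smoke-test it; the topic and payload here are placeholders:

import paho.mqtt.publish as publish

# Sends a single message to the local broker on the default port 1883
publish.single("test/topic", payload="hello broker", hostname="localhost")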

8.6.3 Mac MQTT installation

In order to test the code written for both the Raspberry Pi and the Arduino, it is much easier to send the messages directly from the terminal rather than from the sensor itself, as the messages sent from the terminal can be customized easily.
We will also use Mosquitto on the Mac and, as the macOS package manager Homebrew is already installed, we just type:
brew install mosquitto
Listing 40: Installing mosquitto package

ln -sfv /usr/local/opt/mosquitto/*.plist ~/Library/LaunchAgents
Listing 41: Set up mosquitto to start automatically

Finally, we need to create a link between Python and MQTT, which requires the Eclipse Paho MQTT Python library:

tar xvf org.eclipse.paho.mqtt.python-1.1.tar
cd org.eclipse.paho.mqtt.python-1.1
sudo python setup.py install
Listing 42: Python and MQTT link
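Once everything is installed, a personalized test message can be published straight from the terminal with mosquitto_pub. The host, credentials, topic and JSON values below are only illustrative; they must match whatever the drone code subscribes to:

mosquitto_pub -h your-instance.cloudmqtt.com -p 15483 -u broker_user -P broker_password \
  -t "sensors/sensor1" \
  -m '{"Temperature": 52, "Humidity": 11, "Hydrometer": 200, "Measurement": 3, "Battery": 4.1}'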

8.7 Shell script


We have been talking about making the whole system and process completely autonomous, but so far we have been running all the code on the Raspberry Pi manually from the Mac terminal.

We need the Raspberry Pi to run the Python code automatically once it boots up. Here is where a shell script comes in really handy: a script basically "writes" a series of commands to the terminal, so that we do not have to type them ourselves.

It is a really basic file, which can be created on the Raspberry Pi directly from the Mac terminal.
Connect via SSH to the Raspberry Pi using its IP address:
ssh pi@your_ip_address
Listing 43: SSH connection to Raspberry Pi 3


Enter your password and move with cd to the directory where the code to be run will be stored. A new directory can be created using the mkdir command.

Now, let's create the shell script using the nano text editor:
sudo nano launcher.sh

Listing 44: Creates the launcher.sh file

Enter the script, which is basically the commands we would manually type into the terminal in order to run the Python code:

cd /
cd home/pi/your_directory
# give the Raspberry Pi time to connect to the internet
sleep 30
sudo python python_name.py
cd /

Listing 45: Start script

Figure 8.4: Script for running the code on start

The script first moves to the root directory so it has a standard reference, then moves to the specific directory where the code and the script are saved, and finally executes the code. Notice the sleep 30 command; in this case it is mandatory, as most of the code requires an internet connection. When first trying the script everything seemed to go well, but the code misbehaved because part of it was running before the Raspberry Pi had connected to the network, so the sleep simply gives the Raspberry Pi a few seconds to connect.
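The script still has to be invoked at boot. A common way of doing this on the Raspberry Pi (an assumption here, not necessarily how it was finally done in the project) is a crontab @reboot entry, added with crontab -e:

@reboot sh /home/pi/your_directory/launcher.sh > /home/pi/your_directory/cronlog 2>&1

Redirecting the output to a log file makes it much easier to debug the script when no terminal is attached.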


9 Results
In order to check that all the systems were working correctly, I decided to test each part separately every time I made some kind of improvement. That way I could evaluate the results from each system without needing to finish the whole project to see the final result. Here are the tests and their order:

9.0.1 Sensor

Testing the sensor internet connection: As soon as I had the first breadboard prototype of the sensors, I started to test whether or not it was capable of connecting to a WiFi network. Once it was connected, I tried to send information to it.

Testing the sensor MQTT platform: Using the MQTT client installed on the Mac, I started sending random values to the sensors, to check if they were able to subscribe to the topic they had been assigned.

Sensor MQTT messages publication: Later, I uploaded the final code to the sensors and started heating them up with a hair dryer, simulating that a wildfire was about to start; the sensor had to detect it and send a message to the DroneKit topic with its location.

Battery life and charging capacity: As the sensors are going to run on battery, I wanted to check how long they would last with the current configuration, keeping in mind that it was just a prototype and the main objective was neither to fully develop the sensor nor to make it battery efficient.

9.0.2 SITL

The SITL connection: The drone is actually simulated by the SITL code, so the first stage was installing it; once that was done, we had to check whether we could successfully run an example code against it, both to verify that the installation went well and to make sure we understood how to run code directly on the drone.

Write the first code: Understanding the basics of Python coding within the DroneKit environment took some time, as it was a first for me.

Connecting with UDP to a GCS: We need to forward all the information the SITL is generating to a visual tool, as the terminal does not provide enough information.

Connecting the Raspberry Pi via serial with the Pixhawk: During the real application of the drone system, the Pixhawk and Raspberry Pi are connected via serial. Several tests were made to check which baud rate worked best.

Checking the closed loop for the first time: I wanted to check whether I was able to trigger the drone to take off by heating up the sensor. Nothing else was connected together, just the Raspberry Pi to the Pixhawk, and the sensors to a power source. A sketch of this trigger logic is shown below.
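A minimal sketch of that first closed-loop trigger, reusing the JSON fields from Listing 36; the 50 C threshold and the Location field are illustrative, not the exact values of the project:

import json

FIRE_THRESHOLD_C = 50.0  # illustrative alarm threshold

def on_message(client, userdata, msg):
    decoded = json.loads(msg.payload)
    if float(decoded["Temperature"]) > FIRE_THRESHOLD_C:
        # A possible wildfire: this is where the DroneKit mission gets launched
        print("Alarm! Sensor reports {} C at {}".format(
            decoded["Temperature"], decoded.get("Location", "unknown")))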

Figure 9.1: Script to run code on start

1. Starting the SITL: Simulating the mission before actually flying it comes in really handy to check for bugs in the code. The SITL simulates the hardware, so the code can run in a simulated Pixhawk environment.

2. Forward the connection: All the data that the SITL is simulating can be displayed on a GCS as if the drone were real and sending the data through a 900 MHz link.

3. Run the Python code: As the SITL runs on the same computer, we have to run the Python code on it too. However, we could also run the code on a Raspberry Pi and pass it the UDP address where the SITL is "listening"; a sketch of the whole sequence follows below.
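A minimal sketch of steps 1 and 3, assuming the dronekit and dronekit-sitl packages are installed (for step 2, MAVProxy or the GCS itself consumes the forwarded UDP stream); the printed attribute is just an example:

import dronekit_sitl
from dronekit import connect

# 1. Start the SITL: a simulated copter listening on tcp:127.0.0.1:5760
sitl = dronekit_sitl.start_default()

# 3. Run the Python code against the simulated vehicle
vehicle = connect(sitl.connection_string(), wait_ready=True)
print("Mode: {}".format(vehicle.mode.name))

vehicle.close()
sitl.stop()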


9.0.3 Prototypes

While I was looking for all the hardware to build a proper sensor with an internet connection, I had to test many processors, as mentioned before. The main problem is that most of them were not breadboard friendly, so I ended up soldering them onto some PCB boards I usually have with me.

Version 1: It had an Arduino Nano and the simplest ESP8266 (the 1E module) wired together, but no sensors were included, as I was just testing the internet connection. At first I was testing an HTTP connection protocol, but I had many problems, as I had to open the socket each time I wanted to send information, and for that kind of processor it was too demanding and unstable. I also included a basic voltage regulator built with an LM317.

Figure 9.2: Version 1

Version 2: In this prototype I was testing the LiPo recharging possibilities, so I wired up a prebuilt TP4056 charger to test whether it would be sufficient for the batteries I was planning to use. Also, as the battery voltage is 4.2 V when fully charged, which was not sufficient to power the Arduino, I had to install a 5 V booster, also prebuilt, to make everything easier at the prototyping stage.


Figure 9.3: Version 2

Version 3: Finally, in version 3 I moved to the ESP8266 12E, as I found out it was nearly impossible to send a considerable amount of data in a reliable way with the 1E version. On top of that, the 12E version has a much more powerful chip on board, so the Arduino was no longer needed.

Figure 9.4: Version 3


Once I had checked that everything was working correctly, I tested it with the DHT11 sensor and the solar panel.

Figure 9.5: Breadboard version 2


10 Future Improvements
The project I have developed consists of many subprojects which I have not gone into in depth. In order to have a fully operative system, there are many aspects that need further investigation and development.
The subsystems that require more development are:

Integrate a PCB with the Raspberry Pi and Pixhawk: Having multiple cables connecting different hardware components greatly increases the number of problems that can appear due to incorrect connections and broken cables and connectors, and the overall weight also increases due to the different housings. Moreover, it would be cheaper to manufacture a single simple board.

Integrated 4G modem: As seen during the project, an internet connection is needed on both the sensors and the quadcopter, and a portable 4G router has been used. There are many downsides to the solution used: it is powered by a battery, has a significant weight, and the signal quality is not as good as that of a normal router.

Code: The Python code has endless possibilities, as the coding process is nearly only limited by our imagination. The Raspberry Pi is able to handle even image and video processing, so the landing could be done through image recognition in order to make it more precise and reliable.

Quadcopter: The platform used to carry all the hardware was not designed with it in mind; it is the same drone used during the special course at my university. The efficiency, for example, could be greatly improved by using a larger frame and therefore being able to install larger propellers. Besides that, the quadcopter itself is not the most efficient platform either: new systems such as VTOL (Vertical Take-Off and Landing) aircraft are able to take off and land like a quadcopter, but during cruise flight they become an airplane, achieving longer flight times.

Sensors: The sensors used are the most basic ones found on the market and are usually used for DIY projects with Arduino, so their accuracy is not fully tested. In addition, the system has not been designed to be power efficient, so its continuous working time is still far from acceptable.

Batteries and solar panel: Also on the sensor side, we have used two 18650 batteries in parallel, but a 5 V booster was needed to have enough voltage for the system to work correctly. The 5 V booster has an 85% efficiency, so even before connecting all the other components we were already losing energy. Also, the solar panels were the cheapest ones available, and their power and efficiency could easily be improved with a higher quality product.

Landing zone: The landing zone for each drone has not been designed yet, but a proper housing against bad weather would be needed if the idea is to keep the drones unattended 24 hours a day.

Charging system: The charging process requires someone to plug the batteries into the charger and set up the whole process. An intelligent charging system that is able to "read" the battery characteristics and adjust the charging curve accordingly would be mandatory. There are several systems available on the market that already do this, so it would not be necessary to design one from scratch.

Figure 10.1: ESP32
Figure 10.2: Sigfox module


11 Economic and environmental impact

11.1 Economic impact


The final cost of the system has the main advantage that it depends on the number of sensors; taking into account that the drone-to-sensor ratio is around 1:20, a single network would consist of the following:

x20 Sensors

x1 Quadcopter

x2 4G WiFi modules

x1 Charging spot

It is important to work out the total area we can watch over with this new system and its cost, and compare it with the cost of the current solutions.

11.2 Environment
The main advantage of using the surveillance towers as landing zones is that it helps us reduce the direct impact the whole network might have, as the infrastructure is already there, with electricity and a viable access road.

As for the sensors, the exterior sensor box is intended to blend in with its surroundings by using similar colors. Moreover, it is really small and will be kept away from animals, so they cannot injure themselves.

There is an issue we must really take into account: the batteries. It is known that LiPo batteries can explode and create big flames, which would be a huge problem, as they could ignite the very wildfires we are trying to avoid and prevent. Therefore, the box containing the batteries must be completely fireproof to prevent that from happening. In addition, there is already a discharge protection circuit which prevents the batteries from over-discharging, which could otherwise lead to such an improbable explosion.

Besides the applications already stated, we could also use the whole system for:

Search and rescue missions for missing people.


Animal conservation and poaching prevention.

Pest control.

Monitoring changes in vegetation.

Climate change studies.


12 Conclusions
Finally, the project is done. What have I learned? During the project I came across many aspects I could not carry out as I had planned, either because I did not know how to, or because it was too expensive with the approach I had in mind.

Therefore, one of the greatest lessons has been the ability to overcome such difficulties while keeping on track. Focusing on the technical part of the project, I can humbly say I have been able to end up doing what I had planned, and what I had been dreaming of since I first came across the drone industry.

Although it is a basic prototype system, I have been delighted by the number of problems it can tackle, its overall simplicity and its price compared with current solutions.

Do I think it is feasible? Absolutely yes. We are heading towards a future where more and more systems are getting automated, and drones will not be an exception. Such capabilities allow us to squeeze out their real potential, which from my point of view is reducing the cost of most aerial operations and, on top of that, bringing the sky closer to everyone.

On the other hand, besides its technical solutions, I would love to think that the project can be used as an educational resource. The project can serve as a starting point for a subject on drone coding, as the framework needed to install and run all the programs has been explained in depth.

Finally, open source. I have been amazed by the amount of information that is available on the internet, and especially by the community that makes it possible. That is why I want to make the whole project accessible to everyone; therefore, I will be uploading the code to GitHub and the written project to the university platform.
