
water

Article
Remote Image Capture System to Improve Aerial
Supervision for Precision Irrigation in Agriculture
Antonio Mateo-Aroca 1 , Ginés García-Mateos 2, * , Antonio Ruiz-Canales 3 ,
José María Molina-García-Pardo 4 and José Miguel Molina-Martínez 5
1 Electronic Engineering Department, Technical University of Cartagena, 30203 Cartagena, Spain;
antonio.mateo@upct.es
2 Computer Science and Systems Department, University of Murcia, 30100 Murcia, Spain
3 Engineering Department, Miguel Hernandez University of Elche, 03312 Orihuela, Alicante, Spain;
acanales@umh.es
4 Information Technologies and Communications Department, Technical University of Cartagena,
30203 Cartagena, Spain; josemaria.molina@upct.es
5 Food Engineering and Agricultural Equipment Department, Technical University of Cartagena,
30203 Cartagena, Spain; josem.molina@upct.es
* Correspondence: ginesgm@um.es; Tel.: +34-868-888-530

Received: 25 December 2018; Accepted: 29 January 2019; Published: 1 February 2019 

Abstract: Due to the limitations of drones and satellites in obtaining aerial images of crops in real time, the delay until a flight can take place, the problems caused by adverse weather conditions, and other issues, the use of fixed cameras placed over the regions of interest is essential to obtain closer, periodic and on-demand images. Water management in agriculture is one of the most important applications of
these images. Top view images of a crop can be processed for determining the percentage of green
cover (PGC), and 2D images from different viewing angles can be applied for obtaining 3D models
of the crops. In both cases, the obtained data can be managed for calculating several parameters
such as crop evapotranspiration, water demand, detection of water deficit and indicators about
solute transport of fertilizers in the plant. For this purpose, a remote image capture system has been
developed for an application in lettuce crops. The system consists of several capture nodes and a
local processing base station which includes image processing algorithms to obtain key features
for decision-making in irrigation and harvesting strategies. Placing multiple image capture nodes
allows obtaining different observation zones that are representative of the entire crop. The nodes
have been designed to have autonomous power supply and wireless connection with the base station.
This station carries out irrigation and harvesting decisions using the results of the processing of the
images captured by the nodes and the information of other local sensors. The wireless connection is
made using the ZigBee communication architecture, supported by XBee hardware. The two main
benefits of this choice are its low energy consumption and the long range of the connection.

Keywords: image capture system; lettuce; irrigation management; wireless; ZigBee and XBee

1. Introduction
Developing sustainable systems for water management is one of the main research challenges
in agricultural engineering [1]. Applying new tools and methodologies is necessary to increase the
efficiency of water use by the crop while maintaining production levels [2]. The region of Murcia, Spain, is a highly productive agricultural zone. However, it is a water-deficit area, with annual rainfall between 150 mm and 350 mm; availability and efficiency of water use are critical factors in this region. Thus, the development of new methods for water
management and estimation of irrigation requirements of the crops is particularly important.

Water 2019, 11, 255; doi:10.3390/w11020255 www.mdpi.com/journal/water



Traditionally, irrigation decisions are made by farmers, who estimate the requirements based on historic values of the reference and crop evapotranspiration (ETo and ETc, respectively). At most, the current water consumption of the crop is estimated from information offered by agrometeorological stations near the field. The management of crop growth is an essential task that is done manually by the farmers. Automatic monitoring systems try to replicate this know-how using different parameters of the crops.
Precision agriculture applies geospatial technology (such as remote sensing, GPS and GIS) to
monitor the fields. Currently, high-resolution satellite imagery is very frequently applied to study
changes in crop and soil. However, the prohibitive cost and limited availability of these resources
encourage the development of alternative or complementary systems for agricultural applications.
This way, images captured with drones and, in general, unmanned aerial vehicles (UAVs), have proved
to be a feasible solution because of their high temporal and spatial resolution, reduced operational cost,
and high flexibility in flight programming. UAV ground stations provide user interfaces including
flight control, planning and image acquisition [3,4].
Some authors have defined precision irrigation as “the accurate and precise application of water
to meet the specific requirements of individual management units or plants and minimize the adverse
environmental impact” [5]. To achieve this goal, new technologies have to be developed to create
“an irrigation system that knows what to do, knows how to do it, knows what it has done, and learns
from what it has done” [6]. For this purpose, using images of crop areas has proved to be adequate to
estimate parameters such as the crop coefficient, Kc, and the evapotranspiration, ET [7,8].
The lack of water resources has led to the development of automatic systems to evaluate and
establish criteria to achieve savings in water consumption. The combination of information technology
in sensor networks and in field stations [9,10] is very beneficial for water management, allowing for an
optimal control and monitoring in the use of water [11–13]. One of the specific techniques for managing
the water in the crop and in the soil is the use of images [14]. The images of the green cover of the
crop or the canopy give an idea of the amount of water the crop is losing by evapotranspiration [15].
The percentage of the green cover of the crop (PGC) is related to the crop evaporation model [16]. In this
sense, the top view images of a crop can be processed for determining the PGC, which is then used for
computing the crop coefficient, Kc, and then the crop evapotranspiration, ETc. For example, in the case
of lettuce crops, the effective diameter of the plants can be estimated accurately [8], from which the
root depth, plant height and ETc are computed.
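As an illustration of this kind of processing, the PGC can be estimated from a top-view image by segmenting green pixels and computing their fraction. The sketch below uses a common excess-green heuristic with an illustrative threshold; it is not the exact segmentation algorithm used in [8]:

```python
import numpy as np

def percentage_green_cover(rgb: np.ndarray, threshold: int = 20) -> float:
    """Estimate the percentage of green cover (PGC) of a top-view crop image.

    rgb: H x W x 3 uint8 array. A pixel is counted as vegetation when its
    excess-green index (2G - R - B) exceeds the threshold; both the index
    and the threshold value are illustrative assumptions, not the method
    of the original paper. Returns the vegetation fraction in [0, 1].
    """
    r = rgb[..., 0].astype(np.int32)
    g = rgb[..., 1].astype(np.int32)
    b = rgb[..., 2].astype(np.int32)
    vegetation = (2 * g - r - b) > threshold
    return float(vegetation.mean())
```

The resulting PGC could then be mapped to the crop coefficient, Kc, through a locally calibrated relation, from which ETc = Kc × ETo follows.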
In order to improve the evapotranspiration model of a crop, a 3D model can also be obtained [17].
In this case, the 2D images from several viewing angles can be integrated for obtaining 3D models of
the crop. In both cases (2D and 3D models), the obtained data can be managed for calculating several
parameters, such as water demand, detection of water deficit and indicators about solute transport
of fertilizers in the plant [18]. Moreover, the images of the soil can be processed to obtain several
parameters related to soil water management (such as soil moisture and detection of fertilizers) [19].
Image surveillance systems can be based on static cameras or aerial capture using drones
or satellites. In any case, regular crop sampling is necessary to obtain robust and reliable
systems. Remote sensing has the advantage of covering larger areas over longer time periods [20]. This permits farmers to analyze the change in crop growth over time. Moreover,
remote sensing information combined with measurements obtained in situ is essential to make more
timely and accurate decisions [20]. More recently, the technology related to big data and Internet of
Things (IoT) has also been incorporated into the agricultural engineering domain; an interesting review
of these applications can be found in [21].

1.1. Limitations of UAVs in Precision Agriculture Applications


Despite their great advantages, there are also some recognized technical difficulties related to
drones and UAVs, like the engine power, the short duration of the flights, the maintenance of flying
altitude, the stability in turbulence and conditions of wind, the self-shadowing depending on the
orientation of the sun and the current altitude, the existence of particles in the surrounding area,
and the variations in lighting because of the clouds [3,22]. Reliability is yet another concern for UAV
applications [22]. Payload weight is another important constraint: it is generally limited to around 20–30% of the total weight of the vehicle [23], thus restricting the camera that can be installed.
Accordingly, different low-cost cameras have been analyzed for agricultural applications due to
their low weight [24]. However, these kinds of cameras have other identified problems, such as their limited optical quality, zoom and focus [23]. Issues arising from aviation regulations [22,25] can also be serious obstacles to the use of UAVs in agriculture. For example, as part of the flying permission, insurance is required to cover the risk of damage to buildings, livestock or even people. These kinds of requirements are considered major impediments to their use in practice [26]. Besides, the UAV must always be in view of the operator, who also needs to have a pilot license. Thus, a flying team is necessary to operate the UAV, increasing the total cost.

1.2. Wireless Sensor Networks in Agriculture and Food Industry


The implementation of wireless sensor networks (WSNs) in precision agriculture enables
increasing efficiency, profitability and productivity, and at the same time reducing possible impacts
on the environment and wildlife. WSNs powered by batteries include different types of components:
sensors, processors and radio frequency (RF) modules. The motes or sensor nodes have wireless
connectivity to transmit the obtained data to a coordinator node or base station either directly or
through a gateway. The real-time information obtained in the field provides a useful base for the
managers to define the adequate strategies. This way, the decisions are not taken based on supposed
typical conditions, which could not be realistic, but on real and updated data.
The use of WSN technologies depends on the propagation of radio signals in practical environments, which is made difficult by signal shadowing, multipath propagation and attenuation. In agriculture, radio frequency (RF) links must face problems due to the placement of nodes
for a large coverage and adequate link quality over the crop canopies. WSN must operate in different
conditions like vineyards and orchards, in flat land or abrupt areas, and in all weather conditions,
all of which affect radio performance [27]. In these cases, the link power budget depends on the state
of the crop and the land, as well as other frequent factors such as node spacing and the height of the
antenna [28]. Normally, a received signal level between 10 dB and 20 dB above the sensitivity limit of the receiver is a convenient margin for the link budget [28].
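To make this margin concrete, a simple free-space link budget can be computed. The transmit power, antenna gains and receiver sensitivity used below are illustrative values, not measurements from this work:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB, with distance in km and frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def link_margin_db(ptx_dbm: float, gtx_dbi: float, grx_dbi: float,
                   sensitivity_dbm: float, distance_km: float,
                   freq_mhz: float) -> float:
    """Received level minus receiver sensitivity; 10-20 dB or more is desirable."""
    prx = ptx_dbm + gtx_dbi + grx_dbi - fspl_db(distance_km, freq_mhz)
    return prx - sensitivity_dbm

# Example: an 868 MHz link over 800 m with 2 dBi antennas (assumed figures)
margin = link_margin_db(ptx_dbm=14, gtx_dbi=2, grx_dbi=2,
                        sensitivity_dbm=-106, distance_km=0.8, freq_mhz=868)
```

Under these assumptions the free-space loss is about 89 dB, leaving a comfortable margin; in a real crop field, vegetation and Fresnel-zone losses would reduce it considerably.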
The proportion of messages lost, or packet reception rate, is also vital in WSN development and must be measured for any application. Depending on the operating environment, an important signal loss may occur at some frequencies, especially when radio nodes need line-of-sight for optimum performance, 2.4 GHz being less robust than 900 MHz. Looking for better results, an alternative could be adding intermediate nodes, thus allowing peer-to-peer connection to the base node. Another option is testing other frequencies, such as 916 MHz or 868 MHz, which are used in some commercially available nodes, or developing nodes with higher RF power that are able to reach longer distances. Additionally, connectivity can be improved using directional antennas and optimizing the orientation of the antenna, its shape and configuration [29]. Other authors have focused on the
development of autonomous image acquisition modules, such as the interesting work presented in [30],
where a thermal-RGB color space capture module is described with WiFi and Ethernet connectivity.
The weather is especially important in agricultural applications, since it affects the signal loss due
to atmospheric conditions [28]. Outdoor use should consider the consequences of moisture due to
humidity and precipitation. Some authors have analyzed the propagation of radio signals in potato fields using 433 MHz nodes, finding that the propagation is better in wet weather [31]. Moreover, rain and high humidity produced higher signal strength at the receiver. Nevertheless, experiments using 868 MHz and 916 MHz nodes indicated the opposite results; the gateway only received 60% of
sent messages when the relative humidity during the day was high. This ratio can grow to more than 70% on the driest days [32].
Ambient temperature also affects the performance of the motes. Low temperature has an undesirable effect on the life of the node battery; tests in cooling chambers at diverse conditions proved that the battery has a shorter life at low temperature. For instance, for 2.4 GHz ZigBee nodes, the battery life at 0 °C is reduced by half in relation to its life at 20 °C [33]. Finally, the crop canopy has also been proved to influence the signal: it has been observed that signal strength and attenuation depend on line-of-sight losses when antenna heights are less than the Fresnel zone radius [34].
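For reference, the first Fresnel zone radius at the midpoint of a link can be computed as r = 17.32 · sqrt(d / (4f)), with d in km, f in GHz and r in meters; the 800 m, 868 MHz case below is only an illustrative example, not a configuration from the cited studies:

```python
import math

def fresnel_radius_m(distance_km: float, freq_ghz: float) -> float:
    """First Fresnel zone radius (meters) at the midpoint of a radio link."""
    return 17.32 * math.sqrt(distance_km / (4 * freq_ghz))
```

For an 800 m link at 868 MHz this gives roughly 8.3 m, which shows why antennas mounted only a few meters above a crop canopy still suffer Fresnel-zone obstruction losses.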

1.3. ZigBee Wireless Protocol for Precision Agriculture


The communication among nodes in a WSN allows for the integration of varied sensor types, from the simplest (e.g., pressure, humidity and temperature) to the most complex (e.g., GPS location, micro-radars, tracking and image capture), thus allowing them to monitor a wide area to obtain detailed
information of the crops [35]. For this purpose, ZigBee, Bluetooth and WiFi wireless protocols,
GPRS/3G/4G technology and Long Range Radio (LoRa) protocol are some of the communication
technologies suitable for precision agriculture. Different works have compared these standards
according to different parameters such as the communication range, power consumption, bandwidth
speed, cost, system complexity, and other aspects [36].
Among them, ZigBee has been adopted as the ideal wireless protocol for many agricultural applications due to its low power consumption, the high number of nodes allowed (more than 65,000), its high reliability and its low duty cycle [36–38]. The low duty cycle is important in domains such as water quality management, watering supervision, and control of fertilizers and pesticides, which require frequent updates of the information [39]. For this purpose, the ZigBee wireless protocol has been used to preserve energy by switching between sleep and active states. Therefore, energy use is diminished and the battery life of the motes is extended. Some important aspects to investigate before
applying ZigBee are the effects of signal strength on the spacing of the nodes, the height of the antenna
in the base station and the leaf density. For example, in [40], different tests were performed in palm
orchards to evaluate a model of the signal propagation using the received signal strength meter of the
ZigBee protocol. They found that the wireless channel propagation model has to be adjusted before
deploying the motes in the orchards to achieve robust signal strength. ZigBee is also being used in
smart beehives [41], orange orchards [42], automation in irrigation [43], and greenhouse monitoring
systems [44], among many other applications.
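The energy saving of duty cycling can be quantified with a simple average-current model; the currents and battery capacity below are assumed ZigBee-class figures, not measurements from this system:

```python
def battery_life_hours(capacity_mah: float, active_ma: float,
                       sleep_ma: float, duty_cycle: float) -> float:
    """Battery life under duty cycling: capacity divided by average current.

    duty_cycle is the fraction of time the radio is active (0..1); the rest
    of the time it draws only the sleep current.
    """
    avg_ma = duty_cycle * active_ma + (1 - duty_cycle) * sleep_ma
    return capacity_mah / avg_ma

# Assumed figures: 45 mA active, 2 uA sleep, 1% duty cycle, 2000 mAh battery
life = battery_life_hours(2000, 45.0, 0.002, 0.01)
```

With these assumed figures the battery lasts about 4400 h (roughly half a year), versus only about 44 h if the radio were always active, which illustrates why a low duty cycle is decisive for field deployments.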
This paper describes the design, development and implementation of a remote image capture
system controlled by a central processing coordinator node. The system includes a web-based
application for accessing the information obtained from the capture nodes and measurement stations
installed in the farm. It can be integrated with different watering systems (such as furrow and surface irrigation), types of crop and production scales. It is designed to offer agronomists and
owners updated and detailed information of the state of the crop, the weather conditions and the
watering system, so better decisions can be taken to solve possible problems. The proposed structure
contains multiple image capture nodes to get many observation zones representative of the entire area
of plantation. They have autonomous power supply and wireless connection with the central unit,
which takes the irrigation and harvesting decisions using the results of processing the images and other
local sensors. This research opens the possibility for the development of a completely autonomous
irrigation decision system, which is still an open problem. On the other hand, one of the objectives
of the proposed system is to provide a technological tool that supports the training of new irrigation specialists, giving them as much information as possible about the processes in the farm or field.
This way, agricultural engineering schools could have powerful but inexpensive tools for the education
of their students.
2. Materials and Methods


2.1. Description of the Remote Image Capture System (RICS)
The main objective of the development of the RICS is to create a device that captures images of the region of interest (ROI) of lettuce crops. This RICS is implemented as a remote node that is wirelessly linked with a local coordinator node that processes the captured images. An embedded processor architecture implemented in the local coordinator node, and a cloud server running a web application, take control of the whole system [45]. Besides, the system obtains additional information from different sensors installed in wireless nodes, actuators and connection devices. All of them operate together to gather, process and display to the local and remote users all the information related to the performance of the precision agriculture tasks.

Thanks to the visual monitoring of the crops, periodically capturing images of the ROIs and analyzing them to determine the growth of the plants, it is possible to obtain an estimate of the percentage of vegetation cover, which can be defined as the fraction of land covered by the crop canopy of the plants [46].

Figure 1 presents a global view of the main elements of the proposed system. The remote vision nodes or RICS act as autonomous devices, capturing images with their own control logic and power supply. They are connected to the local control node via XBee. This control node performs image storage, processing and transfer to a web server in the cloud. The web server has Internet access, allowing the clients or remote users to monitor and manage the system.

Figure 1. Global view of the proposed system. The main components are the in-field remote vision nodes, the local control node, the web server and the remote clients.

The RICS nodes are installed in different measurement stations, located in specific zones of the crop. An outline of the installation of the node in the crop is shown in Figure 2. The RICS box that contains all the electronics is located on a mast fixed to the ground at a height of 1.5 m and with an inclination of 30° with respect to the horizontal plane. This configuration defines the visual field of the camera, as can be observed in Figure 3. The fixing hardware of the RICS box allows modifying the inclination to change the visual field and, therefore, the observation area of the crop. The selection of the different sample plots in the crop and the viewing angle determines the portion of the plot that is monitored. They have to be fixed by the human expert considering the specific characteristics of the crop.

Figure 2. Schematic configuration of a remote vision node or RICS box. The placement of the box is configurable and determines the part of the crop that is visible under the area of interest.

Figure 3. Some samples of the images captured by the RICS nodes in a crop of lettuce. Image resolution is 640 × 480 pixels. (a,b,d) are images of romaine lettuce (Lactuca sativa L. var. Longifolia); (c) is an image of iceberg lettuce (Lactuca sativa L. var. Capitata 'Iceberg').
The remote RICS node stations consist of an A20 or A6C camera module (Ai-Thinker Co. Ltd., Shenzhen, Guangdong, China; both can be used in the system, since they are compatible) with GPRS and WiFi communication capabilities, a ZigBee radio device for wireless data communication, and a power unit (composed of a solar panel, a charger and a battery), as shown in Figure 4. Since the RICS node is the main element of the proposed system, the next sections describe in detail the design and implementation of these nodes.

Figure 4. Sample view and diagram of the RICS box. (a) Assembly of the prototype RICS node. (b) The same prototype with the box closed, showing the two photovoltaic modules and the position of the antenna. (c) Diagram of the elements in a stacked mount: XBee 868LP, a battery charger/photovoltaic module controller, the A20 module, and the Li-ion battery at the top of the enclosure.
2.1.1. Communication Module of the RICS Node

The communication module is based on a low-power XBee 868LP ZigBee module (Digi International Inc., Minnetonka, MN, USA). It is configured as a transmission gateway for the images captured by the image module of the A20/A6C subsystem. This device also enables the measuring nodes to be part of the wireless network, by sharing the same connection. To cover extensive distances from the local to the remote nodes, and to reduce the effect of signal attenuation caused by the vegetation, which is higher in fruit crops [47], wireless networks that operate in lower frequency bands should be used [4]. The nodes contain 2 dBi omnidirectional antennas to reach outdoor/line-of-sight distances of up to 800 m, and 12 dBi five-element Yagi antennas to allow for distances up to 1500 m. These maximum distances were tested in the experimentation described in Section 3.

It should be noted that configuring the XBee node in ZigBee AT mode makes it possible to dispense with a microcontroller in the structure of the remote node, since the requests of the coordinator node are transferred transparently by the XBee module and processed by the A20/A6C image capture module. As a disadvantage, the coordinating system must modify the configuration of the local XBee communication module by setting the appropriate link address (i.e., the DL register: destination address) to that of the remote node with which it is desired to establish a communication link. This task does not involve significant delays or drawbacks, since the configuration time is only a few seconds.

2.1.2. Image Capture Module of the RICS Node

There are many commercially available camera modules that are highly compatible with systems based on the Arduino platform (Arduino, Ivrea, Italy). These capture modules have different specifications in terms of image resolution, connection type, output format, connectivity,
etc. To maintain compatibility with the ZigBee communication system, an important feature is to
use a camera where the transmission of image information is in UART (Universal Asynchronous
Receiver-Transmitter) serial format, and controllable with AT-type commands, a system that shares the
Digi XBee communication platform used in this development.
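As a sketch of this AT-style control, the retargeting of the local XBee described in Section 2.1.1 amounts to entering command mode, rewriting the DL register and leaving command mode. The helper below only builds the command byte sequence; the destination address is a made-up example, and the guard times around "+++" and the actual serial I/O (waiting for "OK" after each command) are left to the caller:

```python
def xbee_retarget_sequence(dl_address_hex: str) -> list:
    """Byte strings to point a transparent-mode XBee at a new destination node.

    dl_address_hex: low 32 bits of the destination address, as hexadecimal text.
    The commands (+++, ATDL, ATCN) are standard XBee AT commands; the timing
    protocol around them (1 s guard times, 'OK\r' acknowledgements) is omitted.
    """
    return [
        b"+++",                                             # enter AT command mode
        b"ATDL " + dl_address_hex.encode("ascii") + b"\r",  # set destination low
        b"ATCN\r",                                          # exit command mode
    ]
```

Each command would be written to the serial port and acknowledged before the next one is sent; after ATCN, frames written to the UART are delivered transparently to the newly selected remote node.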
Regarding the size of the camera module, there are solutions for the Arduino and tiny devices, such as ArduCam (ArduCam, China), although it cannot be connected directly to the Arduino and needs several connections. On the other hand, the A20/A6C module has the same camera (OV7670) but is smaller and needs just two connections to the Arduino, RX/TX. Besides, this camera module can be configured with different resolutions (VGA, Quarter-VGA and Quarter-QVGA) and allows configuring different parameters such as flash mode, night vision, image quality, image rotation, exposure, brightness, white balance and contrast. The cost of the module is also an important aspect to consider, since other solutions that were considered offer a smaller size, better resolution and better GPRS, but are much more expensive.
The transmission system and the coding of control instructions in the A20/A6C system make it possible to avoid using additional software to save images from the camera. Otherwise, a coordinating module would have to implement the reception and compression of the images, which would be a hard bottleneck
in the development of the system. Another interesting feature of this A20/A6C module is that it has
WiFi and GPRS communication, which uses the same communication/control AT protocol to allow
connecting the system to a WiFi or GPRS data network for sending information.

2.1.3. Solar Power Supply of the RICS Node


An important constraint of capture nodes is their reduced power capacity. Different
energy-efficient methods have been proposed in existing works to deal with the battery power problem
of the motes. For example, energy harvesting techniques have been introduced as a way to recover
energy. Some of these methods include solar panels, mechanical vibration, wireless power transfer
(WPT), kinetic and wind energy [48]. These mechanisms allow obtaining rechargeable nodes that are
able to operate uninterruptedly for longer periods. Solar energy using photovoltaic systems has been
used in agricultural applications of the WSNs [49]. Solar panels offer an excellent solution to guarantee
the operation of the agricultural monitoring system [50]. Solar cells have been previously used in some
studies to provide long-term power to sensor motes in agronomy. For example, an irrigation system is
developed in [51] based on the ZigBee wireless protocol and powered by rechargeable batteries and
solar panels.
The direct current (DC) power system of the RICS node consists of a pair of 1.5 W photovoltaic
solar panels (in total 3 W), connected to a Sunny Buddy module (SparkFun Electronics, Boulder, CO,
USA), with a maximum power point tracking (MPPT) system which provides an efficient management
of the energy coming from the solar panels, as well as an adequate management of the battery charge
based on the LT3652 chip (Analog Devices Inc., Norwood, MA, USA). The photovoltaic panels use
polycrystalline silicon with a size of 15 cm × 15 cm, and have a maximum power point voltage of
6 V and a maximum power point current of 0.25 A. The battery used in the remote node has been
dimensioned to achieve the appropriate days of autonomy considering the meteorology and the
average rate of cloudy days at the installation place. According to the measurements and calculations
of energy needs, approximately 2000 mAh are required for 7 days of autonomy, i.e., 7 days with
null radiation. The selected battery is of Li-ion technology, since it provides the best energy
performance. The workload considered is the node powered on and in automatic standby mode during
24 h, at a capture rate of 1 image per hour. In the case of intensive use of on-demand image capture,
the energy consumption can increase temporarily. However, in typical use, a rate of 1
image per day should be enough to monitor the crop canopy due to its slow growth rate.
If this happens, the local control node will take into account this increase of energy consumption.
Thus, it will reduce image captures at the following time intervals to allow for recovery of the energy
Water 2019, 11, 255 9 of 21

Water 2019, 11, x FOR PEER REVIEW 9 of 21

stored in the battery in case of solar radiation deficit. The typical consumption pattern of the remote
node in image capture mode at intervals of one hour is depicted in Figure 5.

Figure 5. Power supply current profile for a 1-hour data cycle of the remote node.
The wireless communication modules (XBee 868LP) and the image capture processor in
low-power/sleep mode will be configured to achieve the maximum efficiency in the management
of the energy of the remote nodes. The functional tests performed showed consumption data in the
different operations, with the values presented in Table 1.
Table 1. Measured currents of the remote node in different states.

Measured Current Consumption           | Power on | Idle  | Idle + Cam on         | High Rate Transmission Mode | Sleep
A20/A6C + XBee 868LP + Battery charger | 200 mA   | 95 mA | 140 mA (180 mA peaks) | 200 mA (240 mA peaks)       | 1.2 mA
XBee 868LP                             | 25 mA    | 25 mA | 25 mA                 | 50 mA                       | 1.7 µA
Battery charger                        | —        | —     | —                     | —                           | 85 µA
A20/A6C camera                         | 60 mA    | 60 mA | 115 mA                | 150 mA                      | <1 mA
In relation to the amount of energy available in the batteries of a remote system to capture images,
and from the point of view of the use of drones, essential maintenance tasks and recharges are needed
to maintain their operability. In addition to the necessary infrastructure to perform this process,
it cannot be done automatically without human operators. A problem in this energy supply can cause
an accident and endanger the material and human elements nearby, as well as the structure of the
drone itself. However, the proposed solution can operate continuously for very long periods of time.

2.2. Description of the Base Station Coordinator Node
The base station coordinator node, or local control node, consists of different components:
an Arduino DUE platform (Arduino, Ivrea, Italy) based on the microcontroller Atmel SAM3X8E
(Microchip Technology Inc., Chandler, AZ, USA), an Ethernet/WiFi module for the connectivity with
the SCADA (Supervisory Control And Data Acquisition) web-based application, a ZigBee radio device
XBee 868LP for low-power-consumption wireless data communication, a real-time clock module
RTC DS3231 (Maxim Integrated, San José, CA, USA) to add timestamps to the image data files,
and a microSD card memory module to store the received images in a database. The capacity of the
card is 4 GB (although other cards can be used), allowing the storage of around 52,000 images at an average
size of 80 KB. The assembly of all these elements can be seen in Figure 6.

Figure 6. Sample view and diagram of the coordinator node. (a) Final prototype assembly of the node.
(b) Schematic diagram of the node. (c) Diagram of the elements in a stacked mount; the elements are
stacked to save space and increase the capabilities of the prototype.

The coordinator node can be connected via USB to a PC with a serial terminal application, where
control commands, responses and received data can be monitored to check the system procedure
and analyze failures. Depending on the operating mode, the microprocessor captures an image
either periodically or on demand, analyzes its integrity, processes the information, controls the
communication modules, and receives commands and sends information to the server.
The system is configured so that all the remote nodes perform a measurement every hour.
These measurements are synchronized and locally stored. The frequency of this process can be
configured by the user. These tasks do not demand high computational requirements, but it is
essential to use a robust PC and an uninterruptible power supply to ensure nonstop operation.

2.2.1. Real-Time Clock (RTC) Module of the Base Station
In the development of systems for precision agriculture, it is fundamental to consider the temporal
reference in which the information of the sensors is obtained, in order to establish the fertigation
strategies in an appropriate way. Without such information, precision irrigation would be impossible.
For this purpose, the coordinator module incorporates an RTC module DS3231, which is connected to the
Arduino DUE platform that manages it through communication signals using the I2C bus.
To safeguard the operation of the RTC, it has a backup battery apart from the general power supply of
the control module.

The information received from the RTC module is used to create a timestamp system.
This timestamp is incorporated in two ways: in the properties of the image file, and in the name
assigned to the file. Some problems have to be solved regarding the number of characters in the name of
the file, since it is limited to 8 characters due to the file structure of the hardware of the RTC module.

2.2.2. Control and Verification Console


The coordinator node can be connected to a PC via USB or through a serial communication.
This way, a human operator can verify the correct working of the system and intervene if
necessary. As presented in Figure 7, the PC runs a console application showing the messages that the
coordinator module generates for each task. The messages sent to the remote nodes can correspond
to the tasks either generated from the web application or automatically generated for the periodical
tasks implemented in the processor of the Arduino DUE system.

Figure 7. Sample view of the control and verification console, with check commands received.

The console shown in Figure 7 is a RealTerm serial terminal. This is a terminal program for
engineers, specially designed for capturing, controlling and debugging binary and other complex
data streams. It is a free software development available in the SourceForge (SourceForge Media,
La Jolla, CA, USA) open source community. In our case, it is used as a debugging tool to monitor the
communication between the RICSs and the local control node.
2.2.3. Communication Module XBee 868LP of the Base Station
The module XBee 868LP is an important part of the base station node. It is configured in
coordination mode through the appropriate programming by a sequence of AT commands generated
from the microprocessor, and is responsible for implementing the wireless transmission of the image data.
More specifically, the sequence of steps executed in the request and reception of the images is depicted
in Figure 8.

Figure 8. Steps of the protocol for image request and transmission.
The configuration in AT transparent mode allows obtaining the data stream of the captured image
in JPEG format, avoiding the need for a later compression. In this way, the image file is available in a
readable format immediately after capture and transmission. Additionally, the XBee configuration in
AT mode improves the transmission speed due to the direct transfer of the image data stream.

2.2.4. Embedded Microcontroller SAM3X8E

The embedded microcontroller SAM3X8E is integrated into the Arduino DUE platform, which
provides the I2C and SPI (Serial Peripheral Interface) communication resources for the connection of
the RTC and microSD modules, in addition to several UART ports for serial communication, as in the
case of the connectivity of the XBee 868LP device. The tasks performed by the microcontroller in the
coordinator node are:

• Selecting the RICS node that has to take an image using live, timed or on-demand mode. This
selection is done by the configuration of the XBee module of the local coordinator node with the
adequate address of the remote XBee module.
• Sending the adequate set of AT instructions to the A20/A6C module to get a fully readable
image format without the need of data edition. With this feature, an improvement in the data
transmission speed is achieved.
• Checking the integrity of the received image and, in case of errors in the image data, launching
a new capture to obtain an image without data losses.
• Storing the image in a database, formatted with timestamp data to get a temporally ordered set of
readable images. In this way, the web control and monitoring system can make requests for any
image in the database. The database is stored in a first-level SRAM (Static RAM) memory for
quick access and in removable media storage (microSD).
• Setting the image capture to the time interval mode, configurable by the control node, to get the
information required for irrigation, fertilization or harvesting decisions.
• Processing and transmission of the received image by HTTP POST to the remote server that
links to the web application that contains the SCADA system [45].

The selection of the RICS node by the local coordinator node is done by modifying a record in
its XBee hardware module. The development of the image transmission system makes use of the AT
communication mode of the XBee hardware instead of the API mode. The purpose is to increase the
transmission speed and to simplify the transmission process of the image. In this way, in the
reception of the data stream, the complete image is obtained with an adequate header and end-of-file
format that allows reading it without the need of editing it in hexadecimal format to restore its
correct format (as usually happens in these transmission systems). The final purpose is to directly
transmit the image point-to-point between the coordinator and remote nodes, without the need of
splitting the images into small packages. In addition, through the appropriate command, the
microcontroller can also transfer on demand the image received via USB to a PC, in case the
coordinator system is connected to it. If the API mode were used, the images would have to be split into
blocks of 255 bytes; an image of 35–50 KB would be split into 140–200 packages, thus requiring 120–150 s.

3. Results and Discussion


The system described in Section 2 has been implemented in a prototype model which includes
several remote nodes, the local coordinator node and the web application in the cloud. Figure 9 shows
a global view of an RICS node in the field and a captured sample image. In this case, the application
corresponds to a lettuce crop of variety Little Gem in the field of Cartagena, Spain.

Figure 9. Global view of a remote capture node in the field. (a) Location of the node on a crop of lettuce.
(b) Sample image captured by the system.
The selection of the Arduino DUE platform presents several advantages, in addition to its
processing capabilities and connection buses available. An important aspect is the internal memory size
of the microcontroller, which allows storing up to two images of VGA size (640 × 480) in JPEG format.

3.1. Experiments on the Data Transmission Speed
In order to achieve a reliable wireless link with the minimum data loss, the remote and local
nodes have been tested by transferring images at different speeds. Therefore, the XBee and A20/A6C
modules were configured with the available transfer rates (in bits per second, bps), and tested with the
goal of obtaining the fastest communication speed without data loss. As an example, Figure 10 shows
some of the in-lab images received in these tests. In this case, the files transmitted are VGA images of
640 × 480 pixels.




Figure 10. Some sample images received in the coordinator node with different transmission speeds:
(a) 57,600 bps; (b) 38,400 bps; (c) 28,800 bps; (d) 19,200 bps (image received without errors).
In general, it was observed that higher speed rates produce more errors in the transmission,
being unacceptable for values above 38,400 bps. A bandwidth of 28,800 bps was acceptable in some
cases, but produced errors in other tests, as in Figure 10c. The maximum speed allowed for a robust
transmission was 19,200 bps. In consequence, as the result of this experiment, the remote and local
nodes are set to 19,200 bps to ensure a high reliability in error-free data transmission at the target
distances between local and remote nodes. For this speed, the estimated time to transmit an image
between 35 and 50 KB (which is the typical size at VGA resolution) is 15–22 s; hence, the system
can capture a maximum of 163 images per hour.
3.2. Experiments on Capture Distance

The maximum distance allowed from the remote node to the control node is another important
factor to be analyzed. Therefore, an extensive series of experiments has been done to find the optimal
working distances. Since the system was designed for horticultural crops, such as lettuce, the signal
loss due to intermediate obstacles can be neglected. This means that the line-of-sight of the different
nodes can be maintained, which in general allows for much greater distances.

The experiment consisted in placing the control node at a fixed position of the crop, near the farm
house, and 11 remote capture nodes at different distances from 150 m to 1125 m. For each distance,
between 20 and 50 images were captured and transmitted to the control node. This way, the error rate,
defined as the percentage of images that produced any transmission error, was calculated. Figure 11
contains an aerial view of the crop where the experiments were conducted, with the location of the
RICS nodes and the control node.
Water 2019, 11, x FOR PEER REVIEW 15 of 21

Figure 11. Aerial view of the location of the local control node and the remote capture nodes (source:
DigitalGlobe, European Space Imaging).
The obtained graph of the transmission error as a function of the distance is presented in Figure 12.
In these tests, the images have a resolution of 320 × 240 pixels.

Figure 12. Error rates in the transmission of images from the remote RICS nodes to the local coordinator
node, as a function of the distance between them. The speed is 19,200 bps and the resolution is
320 × 240 pixels.
As is evident from Figure 12, the error progressively increases with distance. The error is 0 for
all the distances below 600 m, meaning that no error occurred in the transmission of these images.
It remains at low values until the distance reaches 750 m. However, the error grows very quickly
from 800 m, being unacceptable for distances greater than 1000 m. Thus, if the local controller node
is situated in the center of the crop, the system can effectively cover an area with a diameter of around
1500 m, which should be enough for most small and average farms. Figure 13 shows some of the
typical errors in the images produced by loss of packages due to the distance.
Water 2019, 11, 255 16 of 21
Figure 13. Some sample images received in the coordinator node at different distances from the remote node: (a) 600 m (image received without errors); (b) 700 m; (c) 950 m; (d) 1125 m.
In the experiments, it has been observed that the loss of data in the transmission affects the final size of the received images. For example, the files received in Figure 13a–d have sizes of 50.5 KB, 44.1 KB, 25.1 KB and 13.0 KB, respectively. This allows detecting errors in the transmission of the images and, consequently, automating the request of a new image to the active remote node. Thereby, the embedded microprocessor of the coordinator node verifies the integrity of the image file by checking the header and end-of-file according to the JPEG image format, as well as the size of the received data stream.
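The integrity check described above can be sketched as follows. This is a minimal illustration, not the coordinator's actual firmware: a JPEG stream must start with the SOI marker (0xFFD8) and end with the EOI marker (0xFFD9), and the received length can be compared against the announced one. The `request_new_image` call in the comment is hypothetical.

```python
def jpeg_is_complete(data: bytes, expected_size=None) -> bool:
    """Verify a received image: JPEG header (SOI marker), end-of-file
    (EOI marker), and optionally the announced size of the data stream."""
    if len(data) < 4:
        return False
    if data[:2] != b"\xff\xd8":    # SOI: start of image
        return False
    if data[-2:] != b"\xff\xd9":   # EOI: end of image
        return False
    if expected_size is not None and len(data) != expected_size:
        return False
    return True

# A coordinator loop would then retry on failure, e.g.:
# if not jpeg_is_complete(buffer, announced_size):
#     request_new_image(active_node)   # hypothetical API call
```

Checking only the two markers and the stream size is cheap enough for an embedded microprocessor, while still catching the truncations produced by packet loss.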
3.3. Experiments on Camera Capture Parameters and Plant/Soil Segmentation

The A20/A6C camera used in the RICS nodes allows for the configuration of the main capture parameters, such as the brightness, contrast and white balance. On the other hand, the viewing angle can also be manually adjusted by modifying the angle of the capture box, as shown in Figure 9a. For this reason, some additional tests have been done to find the optimal setup of the system. The same scene was captured under different configurations of the parameters, searching for the optimal setup. Figure 14 shows an example of some of these tests for different configurations under outdoor natural lighting conditions. Since image quality is a subjective property, the experimentation was done by trial and error by experts in image processing.
Figure 14. Sample images obtained with different capture configurations of the A20/A6C camera, changing the parameters of brightness and contrast: (a) cloudy mode; (b) daylight mode 1; (c) daylight mode 2.
The ultimate objective of the system is to calculate the percentage of green cover (PGC) to estimate the water requirements of the crop. Therefore, the last experiment consists of analyzing the images with the plant/soil segmentation algorithm implemented in the application and presented in [52]. This algorithm uses models of the plant and soil color distributions in different color spaces. The models consist of the normalized histograms of the plant pixels and the soil pixels, selected by the user during the training phase. Given a new image, the algorithm reprojects the histograms on the image, obtaining the probabilities of plant and soil for each pixel. The class with the highest probability is selected pixel-by-pixel. Finally, the morphology operators open and close are applied to remove some false positives and false negatives. The algorithm tests 11 standard color spaces and selects the optimal space and combination of channels [13].

To test the accuracy and robustness of the method, 10 images of the crops were analyzed using different models based on the L*a*b* and I1I2I3 color spaces. Some sample results are presented in Figure 15. Generic color models for plants were applied in both cases.

(d) PGC1 = 54.22; PGC2 = 56.18 (e) PGC1 = 40.00; PGC2 = 41.03 (f) PGC1 = 79.30; PGC2 = 80.06

Figure 15. Plant/soil segmentation of the images obtained by the RICS nodes for a crop of lettuce. (a–c) are captured images; (d–f) are segmented images. For each image, the percentage of green cover is indicated using the L*a*b* (PGC1) and I1I2I3 (PGC2) color models.
The relative error in the obtained PGCs, calculated as the average absolute difference between the PGCs produced by both models, is only 0.945%. This high accuracy allows for a precise estimation of the crop coefficient parameter, Kc, which according to [8] can be obtained within a margin of error around 2%. From this coefficient Kc, the water balance can be computed using the FAO-56 methodology [53], achieving an optimal management of the water resources.
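How the estimated Kc feeds the FAO-56 computation [53] can be sketched with its two basic relations: the crop evapotranspiration ETc = Kc · ETo, and a daily root-zone water balance in which depletion grows with ETc and is reduced by rain and irrigation. This is only an illustration with hypothetical values; a real implementation would follow the full methodology (readily available water, stress coefficients, effective rainfall, etc.).

```python
def crop_et(kc: float, eto: float) -> float:
    """Crop evapotranspiration (mm/day) per FAO-56: ETc = Kc * ETo."""
    return kc * eto

def update_depletion(dr_prev: float, rain: float, irrigation: float,
                     etc_day: float) -> float:
    """One day of a simplified root-zone water balance: the depletion
    grows with the crop evapotranspiration and is reduced by rain and
    irrigation, never dropping below zero (field capacity)."""
    return max(0.0, dr_prev - rain - irrigation + etc_day)

# Illustrative step: Kc estimated from the PGC, ETo from a weather station.
etc_today = crop_et(kc=0.85, eto=5.0)          # 4.25 mm/day
depletion = update_depletion(10.0, 0.0, 0.0, etc_today)
```

In practice, irrigation would be triggered when the accumulated depletion exceeds the readily available water of the crop.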

4. Conclusions
Computer vision is known to be a very powerful tool for water management in agricultural
engineering. However, the use of image capture nodes in the field is limited by the environmental
conditions, the requirement to be used in large areas, and the low availability of external
communication systems. In this sense, ZigBee systems can be adjusted to provide an optimal solution,
given their high level of scalability, their large coverage distances, and their low power consumption.
The main weak point is their low data transmission speed, which in theory can be up to 250 Kbps,
but must be reduced to only 19 Kbps for a reliable link without data loss.
The experiments have successfully proved that it is possible to transfer images of VGA resolution
through the XBee network, with average times of 15–22 s for images of 35–50 Kbytes. This has been
achieved using the AT mode of XBee, which avoids cutting the images into the small information packets required in the API mode. In that case, the image would be split into 255 bytes per packet, consuming a
transmission time of 120–150 s per image.
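These measured times can be cross-checked with a back-of-the-envelope estimate of the raw serial transfer time, assuming standard 8N1 framing (10 bits on the wire per byte of payload); this is only a rough consistency check, not a model of the actual link overhead.

```python
def transfer_time_s(size_kbytes: float, baud: int = 19200,
                    bits_per_byte: int = 10) -> float:
    """Raw serial transfer time: with 8N1 framing, each byte costs
    10 bits (1 start + 8 data + 1 stop) at the given baud rate."""
    return size_kbytes * 1024 * bits_per_byte / baud
```

At 19,200 bps this gives about 18.7 s for a 35 KB image and 26.7 s for 50 KB, the same order of magnitude as the 15–22 s measured with the AT mode, while the per-packet overhead of the 255-byte API frames is consistent with the much longer 120–150 s.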
These results are similar to those observed by other authors, such as the performance comparison
of wireless protocols presented in [36], where the LoRa and ZigBee protocols were found to be the
most adequate technologies for agronomy applications. Moreover, compared to the use of drones
and satellites, these systems based on inexpensive networks of fixed camera nodes can offer a greater
availability of images at a lower cost of installation and operation. By contrast, they do not offer the
possibility of capturing the entire crop. The captured images allow for the computation of the PGC
parameter with an error around 1%, from which the crop coefficient, Kc, and the water balance of the
plants can be estimated.
As future work, there is still a long way to go toward a completely automatic system that estimates
the irrigation needs of the crops with the obtained visual and meteorological information. For example,
an important aspect is to define the pattern of automatic capture through the incorporation of weather
forecasts. The time between captures could be increased or reduced, with the objective of maintaining
the energy in the RICS nodes at optimum levels.

Author Contributions: Conceptualization, A.M.-A., J.M.M.-G.-P., A.R.-C., G.G.-M. and J.M.M.-M.; methodology,
A.M.-A., J.M.M.-G.-P. and J.M.M.-M.; software, A.M.-A. and J.M.M.-G.-P.; validation, A.M.-A., J.M.M.-G.-P.,
A.R.-C., G.G.-M. and J.M.M.-M.; investigation, A.M.-A. and J.M.M.-G.-P.; resources, J.M.M.-M.; writing of the
original draft preparation, A.M.-A. and A.R.-C.; writing of review and editing, G.G.-M.; visualization, A.M.-A.,
J.M.M.-G.-P. and A.R.-C.; supervision, J.M.M.-G.-P. and J.M.M.-M.; project administration, J.M.M.-M.; funding
acquisition, A.R.-C., G.G.-M. and J.M.M.-M.
Funding: This research was funded by the Spanish Ministry of Economy and Competitiveness (MINECO),
as well as the European Regional Development Fund (ERDF) under the projects TIN2015-66972-C5-3-R and
AGL2015-66938-C2-1-R. This article is the result of the activity carried out under the “Research Program for
Groups of Scientific Excellence in the Region of Murcia” of the Seneca Foundation (Agency for Science and
Technology of the Region of Murcia, Spain).
Conflicts of Interest: The authors declare no conflict of interest. The funders had no role in the design of the
study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to
publish the results.

References
1. Sesma, J.; Molina-Martínez, J.M.; Cavas-Martínez, F.; Fernández-Pacheco, D.G. A mobile application to
calculate optimum drip irrigation laterals. Agric. Water Manag. 2015, 151, 13–18. [CrossRef]
2. Levidow, L.; Pimbert, M.; Vanloqueren, G. Agroecological Research: Conforming—Or Transforming the
Dominant Agro-Food Regime? Agroecol. Sustain. Food Syst. 2014, 38, 1127–1155. [CrossRef]
3. Laliberte, A.S.; Rango, A.; Herrick, J. Unmanned aerial vehicles for rangeland mapping and monitoring:
A comparison of two systems. In Proceedings of the American Society for Photogrammetry and Remote
Sensing, Tampa, FL, USA, 7–11 May 2007.
4. Xiang, H.; Tian, L. Method for automatic georeferencing aerial remote sensing (RS) images from an unmanned
aerial vehicle (UAV) platform. Biosyst. Eng. 2011, 108, 104–113. [CrossRef]
5. Raine, S.R.; Meyer, W.S.; Rassam, D.W.; Hutson, J.L.; Cook, F.J. Soil-water and solute movement under
precision irrigation: Knowledge gaps for managing sustainable root zones. Irrig. Sci. 2007, 26, 91–100.
[CrossRef]
6. Adeyemi, O.; Grove, I.; Peets, S.; Norton, T. Advanced monitoring and management systems for improving
sustainability in precision irrigation. Sustainability 2017, 9, 353. [CrossRef]
7. Escarabajal-Henarejos, D.; Molina-Martínez, J.M.; Fernández-Pacheco, D.G.; Cavas-Martínez, F.;
García-Mateos, G. Digital photography applied to irrigation management of Little Gem lettuce.
Agric. Water Manag. 2015, 151, 148–157. [CrossRef]
8. González-Esquiva, J.M.; García-Mateos, G.; Escarabajal-Henarejos, D.; Hernández-Hernández, J.L.;
Ruiz-Canales, A.; Molina-Martínez, J.M. A new model for water balance estimation on lettuce crops using
effective diameter obtained with image analysis. Agric. Water Manag. 2017, 183, 116–122. [CrossRef]
9. Bogena, H.R.; Huisman, J.A.; Oberdörster, C.; Vereecken, H. Evaluation of a low-cost soil water content
sensor for wireless network applications. J. Hydrol. 2007, 344, 32–42. [CrossRef]
10. Díaz, S.E.; Pérez, J.C.; Mateos, A.C.; Marinescu, M.C.; Guerra, B.B. A novel methodology for the monitoring
of the agricultural production process based on wireless sensor networks. Comput. Electron. Agric. 2011, 76,
252–265. [CrossRef]
11. Coates, R.W.; Delwiche, M.J.; Broad, A.; Holler, M. Wireless sensor network with irrigation valve control.
Comput. Electron. Agric. 2013, 96, 13–22. [CrossRef]
12. Goumopoulos, C.; O’Flynn, B.; Kameas, A. Automated zone-specific irrigation with wireless sensor/actuator
network and adaptable decision support. Comput. Electron. Agric. 2014, 105, 20–33. [CrossRef]
13. Hernández-Hernández, J.L.; García-Mateos, G.; González-Esquiva, J.M.; Escarabajal-Henarejos, D.;
Ruiz-Canales, A.; Molina-Martínez, J.M. Optimal color space selection method for plant/soil segmentation
in agriculture. Comput. Electron. Agric. 2016, 122, 124–132. [CrossRef]
14. Segovia-Cardozo, D.A.; Rodriguez-Sinobas, L.; Zubelzu, S. Water use efficiency of corn among the irrigation
districts across the Duero river basin (Spain): Estimation of local crop coefficients by satellite images.
Agric. Water Manag. 2019, 212, 241–251. [CrossRef]
15. Li, L.; Mu, X.; Macfarlane, C.; Song, W.; Chen, J.; Yan, K.; Yan, G. A half-Gaussian fitting method for estimating
fractional vegetation cover of corn crops using unmanned aerial vehicle images. Agric. For. Meteorol. 2018,
262, 379–390. [CrossRef]
16. González-Esquiva, J.M.; Oates, M.J.; García-Mateos, G.; Moros-Valle, B.; Molina-Martínez, J.M.;
Ruiz-Canales, A. Development of a visual monitoring system for water balance estimation of horticultural
crops using low cost cameras. Comput. Electron. Agric. 2017, 141, 15–26. [CrossRef]
17. Mora, M.; Avila, F.; Carrasco-Benavides, M.; Maldonado, G.; Olguín-Cáceres, J.; Fuentes, S. Automated
computation of leaf area index from fruit trees using improved image processing algorithms applied to
canopy cover digital photograpies. Comput. Electron. Agric. 2016, 123, 195–202. [CrossRef]
18. Rosell, J.R.; Sanz, R. A review of methods and applications of the geometric characterization of tree crops in
agricultural activities. Comput. Electron. Agric. 2012, 81, 124–141. [CrossRef]
19. Brogi, C.; Huisman, J.A.; Pätzold, S.; von Hebel, C.; Weihermüller, L.; Kaufmann, M.S.; van der Kruk, J.;
Vereecken, H. Large-scale soil mapping using multi-configuration EMI and supervised image classification.
Geoderma 2019, 335, 133–148. [CrossRef]
20. Schaeffer, B.A.; Schaeffer, K.G.; Keith, D.; Lunetta, R.S.; Conmy, R.; Gould, R.W. Barriers to adopting satellite
remote sensing for water quality management. Int. J. Remote Sens. 2013, 34, 7534–7544. [CrossRef]
21. Tzounis, A.; Katsoulas, N.; Bartzanas, T.; Kittas, C. Internet of Things in agriculture, recent advances and
future challenges. Biosyst. Eng. 2017, 164, 31–48. [CrossRef]
22. Hardin, P.J.; Hardin, T.J. Small-scale remotely piloted vehicles in environmental research. Geogr. Compass
2010, 4, 1297–1311. [CrossRef]
23. Nebiker, S.; Annen, A.; Scherrer, M.; Oesch, D. A light-weight multispectral sensor for micro
UAV—Opportunities for very high resolution airborne remote sensing. Int. Arch. Photogramm. Remote Sens.
Spat. Inf. Sci. 2008, 37, 1193–1199.
24. Lelong, C.C.D.; Burger, P.; Jubelin, G.; Roux, B.; Labbé, S.; Baret, F. Assessment of unmanned aerial vehicles
imagery for quantitative monitoring of wheat crop in small plots. Sensors 2008, 8, 3557–3585. [CrossRef]
[PubMed]
25. Laliberte, A.S.; Rango, A. Image Processing and Classification Procedures for Analysis of Sub-decimeter
Imagery Acquired with an Unmanned Aircraft over Arid Rangelands. GIScience Remote Sens. 2011, 48, 4–23.
[CrossRef]
26. Hardin, P.J.; Jensen, R.R. Small-Scale Unmanned Aerial Vehicles in Environmental Remote Sensing:
Challenges and Opportunities. GIScience Remote Sens. 2011, 48, 99–111. [CrossRef]
27. Andrade-Sanchez, P.; Pierce, F.J.; Elliott, T.V. Performance Assessment of Wireless Sensor Networks in
Agricultural Settings. In 2007 ASAE Annual Meeting; American Society of Agricultural and Biological
Engineers: St. Joseph, MI, USA, 2007; p. 1.
28. Tate, R.F.; Hebel, M.A.; Watson, D.G. WSN link budget analysis for precision agriculture. In Proceedings of
the American Society of Agricultural and Biological Engineers Annual International Meeting, St. Joseph, MI,
USA, 29 June–2 July 2008.
29. Mayer, K.; Ellis, K.; Taylor, K. Cattle health monitoring using wireless sensor networks. In Proceedings of the
Communication and Computer Networks Conference (CCN 2004); ACTA Press: Calgary, AB, Canada, 2004;
pp. 8–10.
30. Osroosh, Y.; Khot, L.R.; Peters, R.T. Economical thermal-RGB imaging system for monitoring agricultural
crops. Comput. Electron. Agric. 2018, 147, 34–43. [CrossRef]
31. Goense, D.; Thelen, J. Wireless sensor networks for precise Phytophthora decision support. In Proceedings
of the 2005 ASAE Annual Meeting. American Society of Agricultural and Biological Engineers, Tampa, FL,
USA, 25–27 September 2005.
32. Haneveld, P.K. Evading Murphy: A Sensor Network Deployment in Precision Agriculture; Delft, The Netherlands,
28 June 2007.
33. Ruiz-Garcia, L.; Barreiro, P.; Robla, J.I. Performance of ZigBee-Based wireless sensor nodes for real-time
monitoring of fruit logistics. J. Food Eng. 2008, 87, 405–415. [CrossRef]
34. Martin, A.; Hebel, M.A.; Ralph, F.; Tate, R.F.; Dennis, G.; Watson, D.G. Results of Wireless Sensor Network
Transceiver Testing for Agricultural Applications. In Proceedings of the 2007 ASAE Annual Meeting,
Minneapolis, MN, USA, 17–20 June 2007.
35. Zhang, N.; Wang, M.; Wang, N. Precision agriculture—A worldwide overview. Comput. Electron. Agric. 2002,
36, 113–132. [CrossRef]
36. Jawad, H.M.; Nordin, R.; Gharghan, S.K.; Jawad, A.M.; Ismail, M. Energy-efficient wireless sensor networks
for precision agriculture: A review. Sensors 2017, 17, 1781. [CrossRef]
37. Malaver, A.; Motta, N.; Corke, P.; Gonzalez, F. Development and integration of a solar powered unmanned
aerial vehicle and a wireless sensor network to monitor greenhouse gases. Sensors 2015, 15, 4072–4096.
[CrossRef]
38. Aquino-Santos, R.; González-Potes, A.; Edwards-Block, A.; Virgen-Ortiz, R.A. Developing a new wireless
sensor network platform and its application in precision agriculture. Sensors 2011, 11, 1192–1211. [CrossRef]
[PubMed]
39. Cancela, J.J.; Fandiño, M.; Rey, B.J.; Martínez, E.M. Automatic irrigation system based on dual crop coefficient,
soil and plant water status for Vitis vinifera (cv Godello and cv Mencía). Agric. Water Manag. 2015, 151,
52–63. [CrossRef]
40. Rao, Y.; Jiang, Z.H.; Lazarovitch, N. Investigating signal propagation and strength distribution characteristics
of wireless sensor networks in date palm orchards. Comput. Electron. Agric. 2016, 124, 107–120. [CrossRef]
41. Edwards-Murphy, F.; Magno, M.; Whelan, P.M.; O’Halloran, J.; Popovici, E.M. B+WSN: Smart beehive with
preliminary decision tree analysis for agriculture and honey bee health monitoring. Comput. Electron. Agric.
2016, 124, 211–219. [CrossRef]
42. Sai, Z.; Fan, Y.; Yuliang, T.; Lei, X.; Yifong, Z. Optimized algorithm of sensor node deployment for intelligent
agricultural monitoring. Comput. Electron. Agric. 2016, 127, 76–86. [CrossRef]
43. Fernández-Pacheco, D.G.; Ferrández-Villena, M.; Molina-Martínez, J.M.; Ruiz-Canales, A. Performance
indicators to assess the implementation of automation in water user associations: A case study in southeast
Spain. Agric. Water Manag. 2015, 151, 87–92. [CrossRef]
44. Aiello, G.; Giovino, I.; Vallone, M.; Catania, P.; Argento, A. A decision support system based on multisensor
data fusion for sustainable greenhouse management. J. Clean. Prod. 2018, 151, 87–92. [CrossRef]
45. González-Esquiva, J.M.; García-Mateos, G.; Hernández-Hernández, J.L.; Ruiz-Canales, A.;
Escarabajal-Henerajos, D.; Molina-Martínez, J.M. Web application for analysis of digital photography in the
estimation of irrigation requirements for lettuce crops. Agric. Water Manag. 2017, 183, 136–145. [CrossRef]
46. Escarabajal-Henarejos, D.; Molina-Martínez, J.M.; Fernández-Pacheco, D.G.; García-Mateos, G. Methodology
for obtaining prediction models of the root depth of lettuce for its application in irrigation automation.
Agric. Water Manag. 2015, 151, 167–173. [CrossRef]
47. Warren, G.; Metternicht, G. Agricultural applications of high-resolution digital multispectral imagery:
Evaluating within-field spatial variability of canola (Brassica napus) in Western Australia. Photogramm. Eng.
Remote Sens. 2005, 71, 595–602. [CrossRef]
48. Sudevalayam, S.; Kulkarni, P. Energy harvesting sensor nodes: Survey and implications. IEEE Commun.
Surv. Tutorials 2011, 43, 443–461. [CrossRef]
49. Akhtar, F.; Rehmani, M.H. Energy replenishment using renewable and traditional energy resources for
sustainable wireless sensor networks: A review. Renew. Sustain. Energy Rev. 2015, 45, 769–784. [CrossRef]
50. Zhang, Z.; Wu, P.; Han, W.; Yu, X. Remote monitoring system for agricultural information based on wireless
sensor network. J. Chin. Inst. Eng. Trans. Chin. Inst. Eng. A 2017, 40, 75–81. [CrossRef]
51. Gutierrez, J.; Villa-Medina, J.F.; Nieto-Garibay, A.; Porta-Gandara, M.A. Automated irrigation system using
a wireless sensor network and GPRS module. IEEE Trans. Instrum. Meas. 2014, 63, 166–176. [CrossRef]
52. Hernández-Hernández, J.L.; Ruiz-Hernández, J.; García-Mateos, G.; González-Esquiva, J.M.;
Ruiz-Canales, A.; Molina-Martínez, J.M. A new portable application for automatic segmentation
of plants in agriculture. Agric. Water Manag. 2017, 183, 146–157. [CrossRef]
53. Allen, R.G.; Pereira, L.S.; Raes, D.; Smith, M. FAO Irrigation and Drainage Paper No. 56. Crop Evapotranspiration
(Guidelines for Computing Crop Water Requirements); FAO: Rome, Italy, 1998.

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).
