
Periodic Report Summary 1 - VINBOT (AUTONOMOUS CLOUD-COMPUTING VINEYARD ROBOT TO OPTIMISE YIELD MANAGEMENT AND WINE QUALITY)


Project Context and Objectives:
VinBot is an all-terrain autonomous mobile robot with a set of sensors capable of capturing and
analyzing vineyard images and 3D data by means of cloud computing applications, to determine
the yield of vineyards and to share this information with the winegrowers. VinBot responds to a
need to boost the quality of European wines by implementing precision viticulture to estimate the
yield (amount of fruit per square meter of vine area).
Winegrowers need accurate yield estimates to plan their yearly canopy management. For wine producer associations (SME-AGs), accurate yield predictions make it possible to organize the production and marketing of their members' wines. At present, most SME winegrower associations have little or no control over yield management throughout their members' vineyards. Typically, winegrowers use inaccurate, visual sample-based surveys to estimate yield. By some estimates, this inefficient yield management causes annual losses of 20% in the market price of their members' wine.
The VinBot project aims to tackle these challenges through the development of an autonomous
mobile robot enhanced with cloud computing applications to automatically acquire, process and
present comprehensive and precise yield information to wine growers and associations in the form
of a web-based map. Through their collective expertise in viticulture, the wine growers and
associations can then coordinate crucial yield management techniques to improve efficiency and
wine quality, per their commercial strategies.
Inaccurate yield estimates result in two main problems for wine producers and their associations: 1) if vineyard yield per hectare is too small or too large, the quantity of wine they expect to market is affected, and 2) when over-ripe and under-ripe grapes are mixed, a lower quality wine than desired is produced. Both of these problems hurt the bottom line of the consortium associations' members, in addition to the per-hectare production restrictions of protected designation of origin (PDO) regions.
VinBot automates the traditional visual yield estimation process. Using a variety of sensors
(detailed below) and computer vision, the VinBot will autonomously capture the leaf-to-fruit ratio
(to indicate yield) at exact positions throughout the entire vineyard. A small onboard computer will
offload data-intensive computer vision algorithms to be processed on external internet servers (a
cloud software service). Using the web-based yield map generated by the VinBot system, SME-AGs
can centralize and coordinate yield management throughout their thousands of members'
vineyards.
Each VinBot can autonomously monitor hundreds of hectares several times a year. A trained, but not necessarily expert, employee sets the initial boundaries of the vineyard using the user-friendly web service on a smartphone, tablet or PC, monitors the robot's movement, and transports it from site to site. Thanks to the robust map-building and path-planning capabilities of the system, the VinBot can navigate intelligently and autonomously through the vineyard, collecting localized leaf-to-fruit ratios throughout it.
In line with this precision viticulture context, the general objective of VinBot is to develop a solution that provides realistic and accurate vigor and yield estimations of grape production from data obtained in the vineyard fields. This solution has five main elements: a data corpus, an autonomous mobile robot, a sensory head, the data processing software, and a series of cloud-based services to process and visualize the data off-board.
The data corpus, hand-gathered by the experts participating in the project, constitutes the ground-truth database used to test the data-analysis methods. The autonomous mobile robot system provides navigation in the field and sensor data capture. The sensor unit encompasses a series of sensors that obtain the data relevant to estimating vigor and yield. The data processing software analyzes the sensor data and builds maps. Finally, the cloud-based web application processes images, creates 3D maps and implements path-planning systems behind a user-friendly PC/smartphone/tablet interface. The VinBot offloads computer-intensive tasks such as image processing, 3D map building and path planning by processing the information gathered in the vineyard on the cloud.
Additionally, a project website has been created at Vinbot.eu to show news and some of the accomplishments of the project, such as images and videos of the robot, participation in wine industry events, validation experiments, and contact information for the members of the consortium.

Project Results:
During the first period, the work performed consisted of the following parts:
Specification of the Vinbot System
In the first months of the project, several aspects of the system were precisely defined. We split this specification into two main parts: the elements concerning the vineyards (structure and varieties) and those related to the actual hardware and software of the robotic system.
Definition of the Structure of the Fields
In the case of vines, we had to decide several key aspects of the vineyards that were taken into consideration in the data-gathering tasks and in the future testing of the system. It was important to define which varieties were under consideration and the training system of the vineyards. For this period we considered the most widely used training system (Vertical Shoot Position on a trellis system) and two varieties, Viosinho (white grapes) and Trincadeira (red grapes). The plantation is located at the ISA facilities in Lisbon (see Fig. 1 of the Annex).
Creation of the Ground Truth Database
In addition, the procedures to obtain the data were defined and put into practice. The types of measurements were established and systematized along the growing period of the vines, as well as the key factors to consider in the measurements. Once the methodology was established, it was implemented starting in the summer season by measuring the weights and numbers of clusters, which are related to the yield. The goal was to establish correlations between certain phyto-physical values and the yield of the plants (see Fig. 2 of the Annex).
The data gathering also continued during winter and spring. The objective during winter was the collection of the vegetative data (number and diameter of shoots and pruning weight) produced by the vines during the 2014 season. These measures are important for the estimation of vigor and vine balance (the ratio between yield and pruning weight). The study in spring 2015 aims to establish the growth patterns of leaves and inflorescences.
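As an illustration of how such ground-truth measurements relate to yield, the short sketch below computes the vine balance ratio and a simple correlation between one phyto-physical value and per-vine yield; the variable names and values are hypothetical examples, not figures from the project database.

    import numpy as np

    # Hypothetical per-vine ground-truth values (illustrative only)
    cluster_count  = np.array([12, 18, 9, 22, 15, 11], dtype=float)  # clusters per vine
    yield_kg       = np.array([1.9, 2.8, 1.4, 3.3, 2.4, 1.7])        # harvest weight per vine (kg)
    pruning_weight = np.array([0.55, 0.60, 0.40, 0.72, 0.58, 0.45])  # winter pruning weight (kg)

    # Vine balance: ratio between yield and pruning weight for each vine
    vine_balance = yield_kg / pruning_weight

    # Pearson correlation between one phyto-physical value and yield
    r = np.corrcoef(cluster_count, yield_kg)[0, 1]

    print("mean vine balance:", vine_balance.mean())
    print("correlation(cluster count, yield):", r)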
Definition of the Robotic and Sensory Systems
The second aspect of the project definition is the specification of the robotic and sensory systems. In this period the robotic system was completely defined in all the aspects required for its subsequent construction. The robotic system has several component parts: the sensory head, the mobile platform, the power system and the communications. We defined the sensory head as a set of cameras and a 3D range finder that allow the capture of the data needed to measure the size of the canopy and the amount of grapes on the vines. Additional sensors measure the GPS position of the robot, and an Inertial Measurement Unit (IMU) measures the inclination of the head, which is used to correct the 3D points obtained with the laser.
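As a minimal sketch of this correction, assuming roll and pitch angles from the IMU and laser points expressed in the head frame (the values below are purely illustrative):

    import numpy as np

    def tilt_correct(points, roll, pitch):
        """Rotate Nx3 laser points by the IMU roll and pitch (radians) so that
        the resulting cloud is levelled with respect to gravity."""
        cr, sr = np.cos(roll), np.sin(roll)
        cp, sp = np.cos(pitch), np.sin(pitch)
        Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
        return points @ (Ry @ Rx).T

    # Purely illustrative scan points (metres) and head inclination (radians)
    scan = np.array([[1.2, 0.1, 0.8], [1.3, -0.2, 1.1]])
    levelled = tilt_correct(scan, roll=0.03, pitch=-0.05)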
The mobile platform has also been defined, not just in its mechanical components but also in the set of sensors that will allow it to navigate autonomously in the fields. Although we used an existing mobile robot as a starting point, several improvements were made to its basic structure in order to adapt it to vineyard terrain and to integrate the sensory head mechanically, electrically, and with respect to the controlling software.
Development of the Computer Vision Algorithms
After obtaining data from the sensors during the 2014 ripening period, we have been developing a series of algorithms to process the data and extract information relevant to the vigor and yield of the vineyards.
The software developed can be divided into several types according to the sensor used to obtain the data. First, using the RGB+NIR cameras we are able to compute the Normalized Difference Vegetation Index (NDVI) of the vines and several other features related to the amount of leaves and grapes. Second, we are able to compute the surface covered by the 3D points obtained with the laser range finder and estimate structural features such as the total surface, volume and height of the plants along a row (e.g. Fig. 3 of the Annex).
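The NDVI itself is a simple per-pixel ratio, (NIR - Red) / (NIR + Red). A minimal sketch, assuming the NIR and red bands are available as arrays of the same size (the synthetic data below is only a placeholder for real camera frames):

    import numpy as np

    def ndvi(nir, red, eps=1e-6):
        """Per-pixel Normalized Difference Vegetation Index, in [-1, 1]."""
        nir = nir.astype(np.float32)
        red = red.astype(np.float32)
        return (nir - red) / (nir + red + eps)

    # Synthetic bands standing in for the RGB+NIR camera frames
    nir_band = np.random.rand(480, 640).astype(np.float32)
    red_band = np.random.rand(480, 640).astype(np.float32)
    ndvi_map = ndvi(nir_band, red_band)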
More specifically, we are interested in measuring appearance features of the image that allow us to compute which parts of the image correspond to a certain class of plant elements, such as leaves or grapes. To this end we have developed techniques to recombine the images obtained from the two cameras, which are mounted at different heights, into a single image. Color corrections were also implemented so that the images remain consistent across a whole data-capture session.
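One common way to recombine two such views is feature-based registration; the sketch below aligns an NIR frame to the RGB frame with ORB features and a RANSAC homography in OpenCV. This illustrates the general technique and is not necessarily the exact method implemented in VinBot.

    import cv2
    import numpy as np

    def align_nir_to_rgb(nir_gray, rgb):
        """Warp the NIR image onto the RGB image plane using ORB features
        and a RANSAC-estimated homography."""
        rgb_gray = cv2.cvtColor(rgb, cv2.COLOR_BGR2GRAY)
        orb = cv2.ORB_create(2000)
        k1, d1 = orb.detectAndCompute(nir_gray, None)
        k2, d2 = orb.detectAndCompute(rgb_gray, None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:200]
        src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        return cv2.warpPerspective(nir_gray, H, (rgb.shape[1], rgb.shape[0]))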
The most important part has been the determination of features related to the amount of leaves and grapes. In the case of leaves, we were able to segment the leaves from the rest of the image using color and NDVI. As a consequence we can estimate the amount of leaves both in the images and within the region that encompasses the whole plant. These measures are related to important viticultural indices like exposed leaf area and canopy porosity.
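A minimal sketch of this kind of segmentation, assuming a registered RGB image and NDVI map and using illustrative thresholds (the real thresholds and features are tuned against the ground-truth data):

    import numpy as np

    def leaf_mask(rgb, ndvi_map, ndvi_thr=0.3, green_margin=10):
        """Boolean mask of leaf-like pixels: vegetated according to NDVI and
        greener than they are red or blue (illustrative thresholds)."""
        r = rgb[..., 0].astype(np.int16)
        g = rgb[..., 1].astype(np.int16)
        b = rgb[..., 2].astype(np.int16)
        greenish = (g > r + green_margin) & (g > b + green_margin)
        return (ndvi_map > ndvi_thr) & greenish

    # Synthetic inputs standing in for a registered RGB image and its NDVI map
    rgb = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
    ndvi_map = np.random.uniform(-1.0, 1.0, (480, 640))
    mask = leaf_mask(rgb, ndvi_map)
    leaf_fraction = mask.mean()   # fraction of leaf pixels in the image or plant region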


On the other hand, we are able to determine the position of the clusters in both white and red varieties and to count the number of pixels, which correlates with the amount of grapes present on the vines. We also use color and other appearance features to train a machine-learning process that classifies regions in the images.
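As a hedged illustration of such a classification step, the sketch below trains a random forest on simple per-pixel color features to separate grape pixels from background; the features, labels and classifier choice are assumptions made for the example, not necessarily those used in VinBot.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def pixel_features(rgb):
        """Simple per-pixel appearance features: R, G, B and two chromaticity ratios."""
        flat = rgb.reshape(-1, 3).astype(np.float32)
        total = flat.sum(axis=1, keepdims=True) + 1e-6
        return np.hstack([flat, flat[:, :1] / total, flat[:, 1:2] / total])

    # Hypothetical labelled patch (1 = grape pixel, 0 = background)
    train_img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
    train_lbl = np.random.randint(0, 2, 64 * 64)

    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    clf.fit(pixel_features(train_img), train_lbl)

    # Classify a new image and count the pixels labelled as grapes
    test_img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
    grape_pixels = int(clf.predict(pixel_features(test_img)).sum())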
Development of the Robotic Field Navigation Algorithms
Many of the project objectives are easier to achieve with expensive sensors; for example, RTK-DGPS systems provide RT2 location estimates with +/- 2 cm accuracy that could be used to navigate inside vineyards, and a similar point applies to many other devices (cameras, laser range finders, etc.). However, an RTK-DGPS base and rover can cost 20K excluding antennas and radios. Therefore, the main research and development effort concerns selecting solutions that can do the task without exceeding the budget; in short, finding a compromise between the price, reliability and simplicity (ease of deployment) of the solution.
For this specific task, ROS and Gazebo provide important support. We have modeled vineyards in five different growing stages, reducing the number of polygons of commercial 3D vineyard models by 90%. The same model is used for both visualization and collision checking in each vineyard, since the simulated laser senses the collision layer. Thus we can simulate a realistic environment and get a first estimate of the data that the sensor suite could provide.
A number of localization algorithms and sensors were tested in simulation and in the lab. In some cases the algorithms were tested using rosbags recorded during data acquisition in the field. Table 1 shows a group of the tested algorithms and the sensors used as input.
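As a small sketch of how such recorded field data can be replayed offline (ROS 1 rosbag Python API; the bag file name and topic names are placeholders):

    import rosbag

    # Placeholder bag file and topics recorded during field data acquisition
    bag = rosbag.Bag("vineyard_run.bag")
    for topic, msg, t in bag.read_messages(topics=["/scan", "/imu/data", "/gps/fix"]):
        # Feed each message to the localization algorithm under test,
        # e.g. a particle filter or an EKF, and log its pose estimate.
        pass
    bag.close()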
In parallel with the testing of these localization algorithms, we researched the idea of combining them with other low-cost sensors and exploiting the specific characteristics of the vineyard arrangement. The vineyards addressed by VinBot are trained in a Vertical Shoot Position (VSP) system (Fig. 2), the most widely used vineyard training system worldwide. This training system creates a structured environment along the rows, characterized by a single continuous canopy formed by a set of vertical shoots and their leaves. Figure 4 of the Annex shows an example of the addressed environment; see also Table 1 in the Annex.
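A minimal sketch of how this row structure could be exploited for low-cost navigation: fit a straight line to the 2D laser returns from the canopy on one side of the robot and use its offset and angle to stay centred in the row. The point set and the least-squares fit below are illustrative assumptions, not the project's final algorithm.

    import numpy as np

    def fit_row_line(points):
        """Least-squares line x = a*y + b through 2D laser points (robot frame:
        x lateral, y forward). Returns the lateral offset b and heading error atan(a)."""
        y = points[:, 1]
        x = points[:, 0]
        A = np.vstack([y, np.ones_like(y)]).T
        a, b = np.linalg.lstsq(A, x, rcond=None)[0]
        return b, np.arctan(a)

    # Hypothetical canopy returns roughly 1 m to the right of the robot
    pts = np.column_stack([1.0 + 0.05 * np.random.randn(50),
                           np.linspace(0.5, 4.0, 50)])
    offset, heading_err = fit_row_line(pts)
    # A simple controller could steer to keep 'offset' at the desired row distance
    # and 'heading_err' near zero while driving along the row.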
Building of the Sensory System
For the benefit of the project, we decided to build the complete sensory head some months before the date specified in the DOW. Following the specification, the head containing all the sensors, inertial unit, CPU, batteries and communications was built during this period and used extensively in the fields at ISA to obtain the data required to construct the ground truth (GT) database. We have also undertaken the integration of the unit with the mobile platform. This integration consists of several parts: mechanical, electrical and software.
Mechanical integration means that the head must be mounted on the robotic platform and remain stable during movements in the field; this was accomplished and tried out in real situations. Electrical integration refers to the power connection between the batteries of the platform and the head, bypassing the head's own set of batteries. The latter allow the head to operate as a stand-alone unit, but can be removed to save weight when the head is mounted on the robot. Finally, software integration means using the same software platform to control the whole system. We employ the Robot Operating System (ROS) to control both the mobile platform and the operation of the sensory head.
Currently the two elements can be used together as a single system or separately. The sensory head is a fully functional unit that captures data and sends it automatically to a cloud-based service that stores and visualizes it.
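A minimal sketch of this capture-and-upload idea, assuming a hypothetical HTTPS endpoint for the cloud service (the URL, field names and example values are placeholders, not the project's actual interface):

    import json
    import requests

    # Placeholder endpoint of the cloud service; the real interface may differ
    UPLOAD_URL = "https://cloud.example.org/vinbot/upload"

    def upload_capture(image_path, gps_fix, row_id):
        """Send one captured image plus its metadata to the cloud service."""
        metadata = {"lat": gps_fix[0], "lon": gps_fix[1], "row": row_id}
        with open(image_path, "rb") as f:
            response = requests.post(
                UPLOAD_URL,
                files={"image": f},
                data={"metadata": json.dumps(metadata)},
                timeout=30,
            )
        response.raise_for_status()
        return response.json()

    # Example call with placeholder values
    # upload_capture("frame_0001.png", gps_fix=(38.707, -9.184), row_id=12)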
Summary
The main results accomplished during the present period are the following: (1) definition of the system; (2) creation of a database with images and sensor measurements of vegetative and reproductive growth obtained over several months in summer, winter and spring; (3) development of a series of algorithms to process the data and extract measurements related to the vines' vigor and yield; (4) development of a series of algorithms and strategies to make the mobile platform navigate autonomously in the vineyard fields; and (5) building of the sensory head and its integration with the mobile platform.
Potential Impact:
The Expected Final Results
At the end of the project we expect the following results to be accomplished. First, we will have obtained a large database of measurements gathered by the sensors on the robot over several periods of the grapevine growing cycle. This information will be used during the timespan of the project to analyze and extract vegetative and reproductive features describing the state of the vines. From a broader point of view, the newly created database can also be used after the end of the project for further analysis and data extraction to improve knowledge of vineyard growing and grape production.
The second final result of the project will be the robotic unit, composed of the autonomous mobile platform and the sensory head. Although these two elements were built from off-the-shelf components, the complete units are not on the market and are an important part of the project results. In addition, the integration of the two parts into a single robotic unit is an important achievement of the project. The integration will be performed at three different levels: mechanical, electrical and with respect to the controlling software.
Another key outcome of the project is the set of algorithms and code developed for processing the data gathered by the robot. This software will be able to extract vegetative and reproductive information and to create maps that present this information in a way that is meaningful to winegrowers. Another important part of the software relates to the navigation and control of the autonomous mobile platform; it will allow the robot to navigate autonomously, following the structure of the vines in the vineyard, avoiding obstacles and setting goals for the accomplishment of its missions.
The final result of the project will be the cloud-based services, which will allow the information-extraction process to be replicated on demand. The cloud services will also allow users belonging to the SME-AGs in the project to store and visualize the data.
Potential Impacts
Despite its market-leading position, the EU wine sector suffers from serious structural shortcomings, the most relevant being the inflated surplus of wine. The survival of European SME vineyards is threatened by current market forces. As most European vineyards are relatively small, the large majority form part of associations like the VinBot SME-AGs. The associations centralize and optimize production, as well as distributing and marketing their members' wine. Cost-effective, centralized, automated yield management is a critical tool to boost quality, productivity, competitiveness and brand recognition in the European wine sector and to allow EU wine producers to embrace precision viticulture technologies.
The VinBot will address the strategic objectives of the European Commission's wine sector reform through improved yield management, by introducing a novel autonomous robot based on computer vision and cloud computing that is economically accessible to SME vineyards. The consortium is confident that the VinBot system will accurately estimate, process and present yield information.
Another advantage of VinBot is that it is an open robotics platform composed of off-the-shelf sensors. As such, other sensors may be added, and the web services can be configured according to the needs of the industry to measure additional vineyard conditions in the future. The proposed technology can work with existing standard sensors that monitor temperature, humidity, diseases and vegetative stress, as well as with other standardized sensors developed in the future.
In the EU robotics sector, diversification is a European priority, moving away from manufacturing towards new sectors. The general strategic behavior of the major EU players is to seek new segments and expand out of the conventional industrial robot market. As the manufacturing of the VinBot technology will be outsourced to one or more European SME robotics companies, the EU robotics sector will be poised to become a global leader in a completely innovative field of robotics technology: agricultural cloud-computing robots.
There will also be an impact on the future of agriculture and viticulture, which will increasingly rely on precision agriculture tools, specifically robotics. The ability to autonomously carry out specific actions, whether crop monitoring, planting or harvesting, has huge implications for the agricultural sector, where traditional methods are no longer enough to sustain the competitiveness of European agriculture. Complementary emerging technologies such as agricultural robotics are increasingly accepted by farmers due to their cost-effectiveness and the automation of work that is dirty, difficult and dangerous.
Environmentally, the VinBot system uses electric energy (lithium-ion batteries) to power the mobile robotic platform, with no CO2 emissions. From a broader point of view, VinBot will compile an online database of visual vineyard characteristics. Although this is beyond the scope of the current project, future research could use this database to study the effects of climate change and other environmental factors on the vineyard (always with the explicit authorization of the parties involved). A key aspect of winegrowing is that it is essential for avoiding desertification, an ongoing source of environmental concern, especially in the Mediterranean basin.
A common fear is that robots create unemployment. Quite the contrary: it has been widely established that the industries where robots are most present tend to generate more and better jobs. In particular, we think this could also happen in rural economies, which have suffered hardest from the global financial crisis. Robots deployed in the field will be accompanied by non-expert personnel who set up the basic functions of the system and the area of work and monitor its proper functioning. In this respect, the VinBot will create direct employment opportunities. Furthermore, through the increased revenue generated by use of the system, the successful implementation of the VinBot will encourage indirect employment by injecting revenue into struggling rural economies. Finally, in line with the EC's objectives to create the world's most competitive knowledge-based economy, the VinBot project will boost employment and the international profile of the EU robotics sector by creating the first agricultural cloud-computing robot.
List of Websites:
www.vinbot.eu
