Demonstrator for cranberry protection (DCP)

Latvia

The main problem to solve in this local scenario is radiation frost, since it is one of the most dangerous phenomena strongly affecting the plants. One solution is to cover the field with the nodes of a Wireless Sensor Network (WSN) early warning system; nevertheless, it can fail, because frost can occur in small patches across the field, directly between the WSN nodes. To lower the forecast failure rate of the WSN-based early warning system, it will additionally be verified by infrared imaging provided by RUAVs. After pre-analysis of the data, this demonstration scenario will deploy RUAVs and/or their components for the configuration and planning of the necessary missions and tasks depending on the context of the obtained data. It will include mechanisms and/or algorithms for delocalizing robot (UAV) tasks/activities and realizing decision procedures. Furthermore, the system will use data fusion for task planning and will take into account data from other heterogeneous sources, such as the ground-placed wireless sensor network (WSN), which allows perception, interpretation and task planning services to be defined in a flexible way.
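A minimal sketch of how the two data sources could be combined into one frost warning is given below. The alarm threshold, node layout and infrared raster shape are illustrative assumptions made for the sketch, not AFarCloud specifications.

```python
# Illustrative sketch: fusing WSN point temperatures with a RUAV infrared
# raster to flag frost patches that may appear between the sensor nodes.
# The threshold and data shapes are placeholder assumptions.
from dataclasses import dataclass
from typing import List

FROST_ALARM_C = 0.5  # assumed alarm threshold for radiation frost, in deg C

@dataclass
class WsnReading:
    node_id: str
    temperature_c: float

def wsn_alarm(readings: List[WsnReading]) -> bool:
    """Early warning from the ground WSN alone."""
    return any(r.temperature_c <= FROST_ALARM_C for r in readings)

def ir_alarm(ir_grid_c: List[List[float]]) -> bool:
    """Supplementary check on the RUAV infrared raster, covering the
    patches located between the WSN nodes."""
    return any(cell <= FROST_ALARM_C for row in ir_grid_c for cell in row)

def frost_warning(readings: List[WsnReading], ir_grid_c: List[List[float]]) -> str:
    if wsn_alarm(readings):
        return "ALARM: frost detected at a WSN node"
    if ir_alarm(ir_grid_c):
        return "ALARM: frost patch detected by RUAV infrared imaging between nodes"
    return "OK: no frost risk detected"

if __name__ == "__main__":
    readings = [WsnReading("node-1", 2.1), WsnReading("node-2", 1.8)]
    ir_grid = [[1.9, 0.3], [2.4, 2.0]]  # one cold cell between the nodes
    print(frost_warning(readings, ir_grid))
```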

DCP, a monitoring, control and actuation subsystem developed by IMCS, aims to act as part of the AFarCloud integrated software environment, able to follow plants at all stages of their life. DCP will show a classification and reasoning system covering the capabilities and measurements of sensors and the provenance of measurements, and will provide smart actuators as a macro-instrument for controlling frost and/or summer overheating. DCP will also implement distributed components for the redistribution, synchronization, configuration and planning of tasks depending on the context. It will include mechanisms for delocalizing robot (UAV) tasks/activities and realizing decision procedures (from a centralized to a local approach). The system will take into account both the behaviour of other UAVs or UGVs, with their possibilities for interaction, and ground-placed wireless sensor networks (WSNs), which allows perception, interpretation and task planning services to be defined in a flexible way.
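The shift from a centralized to a local decision approach could be pictured as in the following sketch, where a vehicle follows the cloud decision when one is available and falls back to an on-board rule otherwise; the rule and interfaces are assumptions for illustration only.

```python
# Illustrative delocalization of a decision procedure: prefer the centralized
# (cloud) decision, fall back to a simple local rule when it is unavailable.
from typing import Callable, Optional

def decide(cloud_decision: Optional[str], local_rule: Callable[[], str]) -> str:
    """Use the centralized decision if present, otherwise the on-board rule."""
    return cloud_decision if cloud_decision is not None else local_rule()

def local_frost_rule(temperature_c: float = -0.5) -> str:
    # On-board fallback: trigger the frost-control actuator below 0 deg C.
    return "activate_frost_control" if temperature_c <= 0.0 else "continue_monitoring"

print(decide(None, local_frost_rule))              # no cloud link -> local decision
print(decide("return_to_base", local_frost_rule))  # centralized decision wins
```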


Cranberry growing farm located in Bīriņi parish, Latvia

DCP will show examples of solutions for sensing-on-the-move (i.e., “measure and fly” and “grab and fly”) mechanisms that avoid the deployment of a large number of sensors in large crop fields. UAVs will be equipped with sensors and grippers able to measure soil and crop conditions (moisture, temperature, etc.) and even to take samples that will be autonomously delivered to the laboratory for analysis. UAVs will incorporate technology solutions for inspection and sensing for the precise actuation of control actions. The actions will follow the specific commitments of the control algorithm: the irrigation or spraying of a specific area, the operation of actuators for precise fertilizing or distribution of nutrients to a plant (or an area), or the change of some climate parameters in a specific portion of a plant nursery.
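A possible description of such a “measure and fly” / “grab and fly” mission is sketched below; the waypoint coordinates, task names and payload fields are assumptions made for illustration and do not represent the actual AFarCloud mission format.

```python
# Illustrative sensing-on-the-move mission description.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Waypoint:
    lat: float
    lon: float
    task: str                      # "measure" or "grab"
    payload: dict = field(default_factory=dict)

@dataclass
class Mission:
    uav_id: str
    waypoints: List[Waypoint]

    def summary(self) -> str:
        measures = sum(1 for w in self.waypoints if w.task == "measure")
        grabs = sum(1 for w in self.waypoints if w.task == "grab")
        return f"{self.uav_id}: {measures} in-flight measurements, {grabs} samples to deliver to the lab"

mission = Mission(
    uav_id="uav-01",
    waypoints=[
        Waypoint(57.23, 24.65, "measure", {"sensors": ["soil_moisture", "temperature"]}),
        Waypoint(57.24, 24.66, "grab",    {"sample": "soil_core", "deliver_to": "laboratory"}),
    ],
)
print(mission.summary())
```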

Technology and innovation: (a) fully automatic or semi-automatic UAVs with integration of image processing, GIS and other location technologies, and state-of-the-art sensors for monitoring purposes, and (b) fully automatic UAVs with an operative mission planning, simulation and testing tool. In addition, data gathered by WSN sensors and UAVs/UGVs (moisture, temperature, wind, light, etc.) will be fused and integrated with data obtained from external sources (for example weather forecasts) to generate aggregated values. The data processing will be achieved by applying data analytics, either integrating the results into decision-making software applications or presenting them in the form required by the final usage.
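The aggregation step could look like the following minimal sketch, assuming a simple mean as the aggregation function; the source names, parameters and forecast blending are illustrative assumptions.

```python
# Illustrative aggregation of heterogeneous measurements (WSN, UAV/UGV) with an
# external weather forecast into one aggregated value per parameter.
from collections import defaultdict
from statistics import mean

measurements = [
    {"source": "wsn-node-3", "parameter": "temperature", "value": 14.2},
    {"source": "uav-01",     "parameter": "temperature", "value": 13.8},
    {"source": "wsn-node-3", "parameter": "moisture",    "value": 0.31},
]
forecast = {"temperature": 15.0}  # external source, e.g. a weather service

def aggregate(measurements, forecast):
    grouped = defaultdict(list)
    for m in measurements:
        grouped[m["parameter"]].append(m["value"])
    aggregated = {}
    for parameter, values in grouped.items():
        if parameter in forecast:            # blend measured values with the forecast
            values = values + [forecast[parameter]]
        aggregated[parameter] = round(mean(values), 2)
    return aggregated

print(aggregate(measurements, forecast))     # {'temperature': 14.33, 'moisture': 0.31}
```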

Technologies used with DCP: (a) contactless charging of UAVs; (b) accurate positioning; (c) a methodology for precision farming automation; (d) intervention tools for UAVs; and (e) autonomous detection and control. DCP will integrate a set of semi-autonomous and autonomous vehicles and machines. It will demonstrate: (1) a minimum need for specialized education of the farmer; (2) a heterogeneous set of vehicles and machines with respect to function and brand; (3) decision support (and monitoring) in real time, based on previous years and other user experiences; and (4) vehicles and machines able to communicate with each other to share information and make real-time plans to address the mission goals defined by the farmer.
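One simple way to picture point (4) is a capability-based split of a farmer-defined goal across a mixed fleet, as in the sketch below; the capability model and the greedy assignment are assumptions for illustration, not the project's planning algorithm.

```python
# Illustrative real-time split of mission tasks among a heterogeneous fleet
# (different functions and brands).
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Vehicle:
    vehicle_id: str
    brand: str
    capabilities: List[str]   # e.g. ["imaging"], ["spraying"]
    busy: bool = False

def assign(task: str, fleet: List[Vehicle]) -> Optional[Vehicle]:
    """Give the task to the first idle vehicle that advertises the capability."""
    for v in fleet:
        if not v.busy and task in v.capabilities:
            v.busy = True
            return v
    return None

fleet = [
    Vehicle("uav-01", "brand-A", ["imaging"]),
    Vehicle("ugv-02", "brand-B", ["spraying", "fertilizing"]),
]
for task in ["imaging", "fertilizing", "spraying"]:
    v = assign(task, fleet)
    print(task, "->", v.vehicle_id if v else "no free vehicle")
```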


Measurement of health status through dairy robotics and gas monitoring

In Latvia the demonstrator will deploy end-to-end solutions and/or their components, based on the output of other work packages, for the dairy-cattle farms developed by AFarCloud, for testing and validation. Because the monitoring of milk parameters, milk yield per cow, cattle health, illness prevention and actuation, feeding, etc. is already handled by proprietary (Lely) dairy robotics in Robežnieki, the demonstrator will develop and employ interfaces for aggregating data into AFarCloud through appropriate gateways, feeding the big-data analytical algorithms of the decision support system for dairy-cattle farms hosted as AFarCloud cloud services for milk-related information processing and exchange. The targeted solution will be integrated with other relevant services and data sources provided by AFarCloud partners within the scope of the “Livestock Management” main demonstrator use cases for testing and validation. The deployment will also cover the interfacing with external proprietary transportation management systems dedicated to large-scale cropping purposes, using ISOBUS where it is available. This demonstrator functionality will be integrated with the cropping scenarios as a holistic, seamless, functional system.
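A gateway interface of this kind could translate proprietary milking-robot records into a generic observation before forwarding them to the cloud services, as in the sketch below. The field names on both sides are hypothetical; neither the actual Lely export format nor the AFarCloud data model is defined here.

```python
# Illustrative gateway adapter: proprietary milking-robot record -> generic
# milk-yield observation.
import json
from datetime import datetime, timezone

def to_observation(robot_record: dict) -> dict:
    """Translate one proprietary record into a generic milk-yield observation."""
    return {
        "entity": f"cow-{robot_record['animal_number']}",
        "observedProperty": "milk_yield_kg",
        "value": robot_record["yield_kg"],
        "timestamp": robot_record.get(
            "milking_time", datetime.now(timezone.utc).isoformat()
        ),
        "provenance": "dairy-robot-gateway",
    }

robot_record = {"animal_number": 1042, "yield_kg": 12.7, "milking_time": "2019-05-06T06:32:00Z"}
print(json.dumps(to_observation(robot_record), indent=2))
```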

Despite the current high level of automation of milk production at the Robežnieki farm, the control of ammonia (NH3), methane (CH4) and sulphur-containing organic and inorganic gases (sulfhydryl, SH-) remains a problem. The anaerobic decomposition of organic wastes and manure stored as a slurry or in anaerobic lagoons produces methane, carbon dioxide, ammonia and hydrogen sulphide. Hydrogen sulphide released from actual or stored manure into confined spaces can reach lethal concentrations. Within the framework of the demonstrator the problem will be researched, and assumptions about control possibilities will be tested and assessed.
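A basic gas-monitoring check of the kind that could support such an assessment is sketched below. The alarm thresholds are placeholders for illustration only, not regulatory or project-approved limits.

```python
# Illustrative gas-monitoring check for the barn and manure storage.
ALARM_PPM = {"NH3": 25.0, "CH4": 10000.0, "H2S": 10.0}  # placeholder limits, not approved values

def check_gases(sample_ppm: dict) -> list:
    """Return warnings for gases that exceed the assumed limits."""
    warnings = []
    for gas, limit in ALARM_PPM.items():
        value = sample_ppm.get(gas)
        if value is not None and value >= limit:
            warnings.append(f"{gas} at {value} ppm exceeds the assumed limit of {limit} ppm")
    return warnings

sample = {"NH3": 31.0, "CH4": 1200.0, "H2S": 4.2}
for warning in check_gases(sample):
    print(warning)
```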


Silage / Cereal monitoring and control

For this local scenario a milk production farm located in Latvia, with around 300 cows and harvested areas of grass and maize, is available as a testing facility. The goal of the scenario is to improve the quality of the produced milk by improving the quality of the feed supplied to the animals. The direct correlation between milk quality and feed quality is highly critical to detect and is related to the following issues: (1) the proper time and amount of fertilization of both grass and fodder maize; (2) the proper time of harvesting of grass (the aim is to harvest grass three times per vegetation period at the latitude of Latvia); (3) detecting the harvesting time of maize, which must fall at a precise moment after the first autumn frosts; and (4) controlling the tamping level of harvested maize and grass before fermentation for proper silage.

The objectives of the demonstrator are: (1) to detect the proper and precise moments of (a) fertilization of grass and maize and (b) harvesting of both maize and grass; (2) to detect the occurrence, and the risk of occurrence, of different kinds of pests (illnesses of the plants, invasions of insects, weeds, etc.); and (3) to control the tamping level of harvested maize and grass before fermentation (achievable only with partner support). The demonstrator will show the potential of UAVs to play a major role in increasing the efficiency and quality of the harvested feed (grass and fodder maize) through regular, everyday field coverage and improved data quality. Automation will be achieved through the use of a set of collaborating UAVs, supported by ground-located WSN measuring nodes and automated UAV mission planning. The demonstration will also show an experimental sample of a classification and reasoning system with capabilities for data gathering from heterogeneous sensor sources, data transport, and fusion with a multilayer matrix of linked data to feed the intelligence of the AFarCloud-based decision support system.
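For objective (1)(b), the maize case could be driven by detecting the first autumn frost from daily minimum temperatures, as in the following minimal sketch; the 0 °C threshold and the five-day delay between frost and harvest are assumptions for illustration, not agronomic recommendations.

```python
# Illustrative detection of the first autumn frost from daily minimum
# temperatures (WSN or weather-service data) to schedule the maize harvest.
from datetime import date, timedelta
from typing import Dict, Optional

FROST_C = 0.0            # assumed frost threshold
HARVEST_DELAY_DAYS = 5   # assumed delay between first frost and harvest

def first_frost(daily_min_c: Dict[date, float]) -> Optional[date]:
    for day in sorted(daily_min_c):
        if daily_min_c[day] <= FROST_C:
            return day
    return None

def maize_harvest_date(daily_min_c: Dict[date, float]) -> Optional[date]:
    frost = first_frost(daily_min_c)
    return frost + timedelta(days=HARVEST_DELAY_DAYS) if frost else None

daily_min = {
    date(2019, 9, 28): 3.1,
    date(2019, 9, 29): 1.2,
    date(2019, 9, 30): -0.8,   # first autumn frost
}
print(maize_harvest_date(daily_min))   # 2019-10-05
```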

The demonstration scenario will deploy components for the redistribution, synchronization, configuration and planning of tasks depending on the context. It will include mechanisms for delocalizing robot (UAV) tasks/activities and realizing decision procedures (from a centralized to a local approach). UAVs will incorporate technology solutions, for instance a GNSS-based precision positioning system with RTK (Real Time Kinematics) correction, for precise inspection, sensing and deployment of control actions. The actions will follow the specific commitments of the control algorithm for precise fertilization of a specific area at a specific time.
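The gating of such an action on an RTK-corrected position fix and a time window could be sketched as follows. The correction offsets, positional tolerance and time window are assumptions for the sketch; in practice the corrections come from an RTK base station or network.

```python
# Illustrative gating of a precision fertilization action on an RTK-corrected
# GNSS fix and a time window.
from datetime import datetime, timezone

TOLERANCE_DEG = 0.00005   # assumed positional tolerance (~5 m in latitude)

def rtk_correct(raw_fix, correction):
    """Apply a (simplified) base-station correction to a raw GNSS fix."""
    return (raw_fix[0] + correction[0], raw_fix[1] + correction[1])

def should_actuate(raw_fix, correction, target, window, now):
    lat, lon = rtk_correct(raw_fix, correction)
    in_area = abs(lat - target[0]) <= TOLERANCE_DEG and abs(lon - target[1]) <= TOLERANCE_DEG
    in_time = window[0] <= now <= window[1]
    return in_area and in_time

now = datetime(2019, 6, 1, 10, 15, tzinfo=timezone.utc)
window = (datetime(2019, 6, 1, 10, 0, tzinfo=timezone.utc),
          datetime(2019, 6, 1, 11, 0, tzinfo=timezone.utc))
print(should_actuate((57.230010, 24.650020), (-0.000008, -0.000015),
                     (57.230000, 24.650000), window, now))   # True
```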