Hierarchical classification of robotic grippers applied for agricultural object manipulations

An overview of robotic grippers used for weed control and harvesting is presented. A classification is compiled of the grippers installed on robotic agricultural tools for the manipulation of fruits, weeds and other objects. Twenty-two types of grippers are distinguished according to six selected criteria: drive type, presence of a drive in the gripper, number of fingers, type of gripper movement, type of mechanism, and type of sensors. In this classification, we mainly consider the characteristics of the gripper, which is installed at the end of the manipulator and is responsible for physical contact with the object; the main attention is therefore paid to problems requiring direct grasping of objects by agrobots. The joint interaction of a group of heterogeneous ground and aerial robots performing a target agrarian task in an autonomous mission will also be investigated.


Introduction
Precision technologies have been developed mostly for traditional crops since the advent of precision agriculture in the early 1990s. With these technologies, the productivity of experimental agricultural plots has increased significantly. The concept of precision agriculture involves the assessment of in-field spatial variability of factors such as fertility, soil type and characteristics, and water content, and the subsequent management of each crop production input in a more precise and site-specific manner according to that variability.
With the development of science and technology, various sensors and sensor systems are increasingly being used in precision farming to solve a number of important tasks, such as monitoring of land and detection of fruits, weeds, diseases, etc. For example, over the past decade, much effort has been made to develop sensing technologies for site-specific crop management in Florida, USA. Citrus production in the USA has been seriously affected by the emergence of exotic diseases such as citrus greening or Huanglongbing (HLB), citrus canker, and citrus black spot. It has been estimated that citrus greening alone has cost about $3.63 billion in lost revenues [1]. In managing plant disease, detection is one of the most important steps. Detecting disease at an early stage, especially at the asymptomatic stage, could be the most cost-effective method of disease control. Currently, human scouting is the most commonly used technique for disease detection in the majority of crops. However, human scouting is costly, time-consuming, and limited to human senses. In recent years, progress in the area of low-cost, low-altitude satellites and unmanned aerial systems has provided an opportunity for continuous monitoring of plants. Among the sensing systems developed, the following have great potential to be adopted by growers: immature green citrus detection, a debris detection and cleaning system (currently in the process of being patented), a blueberry yield mapping system, the HLB detection systems developed using optical properties and starch accumulation, silage yield mapping, grain insect detection using NIR, soil P sensing systems, a citrus debris detection system for mechanical harvesting, and detection of dropped citrus fruit on the ground.
The main phases of the agricultural cycle are soil preparation, planting, production, and harvesting. Soil preparation refers to the mechanical manipulation of soil that alters its structure and strength, in order to provide and maintain optimum soil conditions for the germination, growth, and development of plants, thus realising their productive capacity. In outdoor conditions, the planting process consists of dropping the seeds or placing them into the soil; in the case of intensive crops, automated machines perform this task. Production includes the period between the transplant, or the appearance of the first true leaves, and the last harvesting in the case of annual crops. It is composed of tasks such as pesticide spraying, pruning, precision fertirrigation, plague recognition, weed removal, harvesting, and crop removal. Harvesting is the task that requires the most inputs and resources [2].
The advent of agricultural robots has the potential to raise the quality of fresh produce, lower production costs, reduce the drudgery of manual labour and, in some parts of the world, compensate for the lack of workers in certain agricultural sectors. Agricultural robots are generally designed to execute a 'main task', usually a specific agricultural task such as planting, weeding, pruning, picking, harvesting, packing or handling [3]. The nature of these tasks requires the use of mobile robots combined with manipulator robots carrying different end-effectors (spraying nozzles, irrigation drippers, vacuums, harvesting cutters, suction pads, etc.). End-effectors may consist of a gripper or a tool. Grippers are subsystems of handling mechanisms which provide temporary contact with the object to be grasped [4].
In Section 2, we analyze existing robotic solutions for the most complex and time-consuming stages of the agricultural cycle: weed control, disease monitoring, and harvesting. In Section 3, we present a classification of the grippers used for manipulating agrarian objects.
2 Analysis of existing robotics in the field of agriculture

Weed control and disease monitoring
Different abiotic and biotic stresses affect the potential yield of crops. About 40% of world food production is lost to diseases, insects and weeds. To achieve high yields in agricultural crop systems, the control of biotic stress is highly relevant. Monitoring diseases and pests during the growing and harvesting stages is essential to fulfilling the plant's production potential, detecting and preventing disease spread, and avoiding significant yield losses. Pest, disease and weed control is a frequently performed and often time-consuming task, which sometimes exposes the human operator to the danger of contamination with hazardous chemicals. Development of systems for weed control, including weed detection and removal, has been one of the major fields of research in agricultural robotics in the last few decades. To date, some complete weed-control agricultural robotic systems (ARS) have been tested under field conditions. The main stages of weed control include: guidance, weed detection and identification, precision in-row weed removal, and mapping. Four types of weed-removal mechanisms are suitable for selective in-row weed control by ARS: mechanical, thermal, chemical, and electrical means [5].
Knowledge of the precise position of crop plants is a prerequisite for effective mechanical weed control in robotic weeding applications, particularly in crops such as sugar beet that are sensitive to mechanical stress. In [6], the potential of using knowledge about the crop seed pattern is investigated based on simulated output from a perception system. Cereals such as barley and wheat are placed in rows with no clear structure, whilst maize, sugar beet and other high-value crops are placed in rows with a clearly defined intra-row spacing between crop plants. In robotic weeding applications, plant recognition is often based on machine vision using either spectral properties or plant morphology/shape information; both approaches are vulnerable to variations in plant appearance. Previous studies have shown that a priori information about the location of plants can be used to recognize seeded crops. An experimental model for evaluating context-based crop recognition was built by simulating the positions of both crop and weed plants in an artificial field. A list of all plant positions was then used as input to a context-based plant recognizer, which localized crop plants based on the known sowing geometry. The tests showed that recognition reliability can be described by the positive predictive value (PPV), which is limited by the seeding-pattern uncertainty and the weed density. In organic crop management the use of conventional pesticides is prohibited, making mechanical weed control a major challenge and priority on most organic farms. Agricultural workers who perform manual weeding are exposed to several musculoskeletal disorder (MSD) risk factors, particularly prolonged trunk flexion. Hand weeding (and thinning for lettuce) operations in the organic production of these crops represent ~95% of their total weed control costs. Currently, some commercial machines for intra-row weeding are available to farmers, such as the finger weeder, the torsion weeder, the weed blower, and flame weeders, alongside current state-of-the-art intelligent systems. In [7], the design and performance assessment of an automatic intra-row mechanical weeding corobot with automatic hoe positioning, based only on a low-cost (below US$100) odometry sensing technique, is presented. In place of machine vision or RTK-GPS sensors, a human partner provides the visual crop detection capability. To reduce human exposure to MSD risk factors, the human rides in a seated position on the hoeing platform; a pair of small (~7 cm wide) hoes was used to control weeds in the intra-row zone and a standard cultivator was used to control inter-row weeds. The following results were obtained in tests of the collaborative robot on a tomato field: the average speed of movement was 1.2 km/h, and the time for processing weeds in the ripening zone decreased by 57.5% compared to manual processing. Based on the cost of broccoli production in 2012, an average farm labour cost of US$12.33 h-1 for hand hoeing can be estimated. The corobot's estimated labour savings of 13.8 h ha-1 thus suggest US$170.15 ha-1 in labour savings for weed control.
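The limit that seeding-pattern uncertainty and weed density place on the PPV, as reported in [6], can be illustrated with a minimal simulation. The sketch below is ours, not the model of [6]: a one-dimensional row where crop plants scatter around a nominal sowing grid and weeds scatter uniformly, every plant within a tolerance of the grid is declared a crop, and PPV = TP / (TP + FP). All names and parameter values are illustrative assumptions.

```python
import random

def simulate_row(n_seeds, spacing, pattern_sigma, weed_density,
                 row_length, seed=0):
    """Simulate plant positions along one crop row (illustrative model).

    Crop plants sit near a nominal grid with Gaussian positional
    uncertainty `pattern_sigma`; weeds (`weed_density` per metre) are
    scattered uniformly along the row.
    """
    rng = random.Random(seed)
    crops = [i * spacing + rng.gauss(0.0, pattern_sigma)
             for i in range(n_seeds)]
    weeds = [rng.uniform(0.0, row_length)
             for _ in range(int(weed_density * row_length))]
    return crops, weeds

def context_ppv(crops, weeds, spacing, tolerance):
    """Declare every plant within `tolerance` of the nominal sowing grid
    a crop and return the positive predictive value TP / (TP + FP)."""
    def near_grid(x):
        return abs(x - round(x / spacing) * spacing) <= tolerance
    tp = sum(near_grid(x) for x in crops)   # crops correctly accepted
    fp = sum(near_grid(x) for x in weeds)   # weeds wrongly accepted
    return tp / (tp + fp) if tp + fp else 0.0

crops, weeds = simulate_row(n_seeds=50, spacing=0.20, pattern_sigma=0.01,
                            weed_density=20.0, row_length=10.0)
ppv = context_ppv(crops, weeds, spacing=0.20, tolerance=0.03)
```

Increasing `pattern_sigma` (seeding-pattern uncertainty) or `weed_density` lowers the PPV, matching the qualitative limit reported in [6].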
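The labour-saving figure quoted from [7] follows from a single multiplication of the values reported above, which also shows that the hand-hoeing rate is a per-hour cost:

```python
# Reproducing the labour-saving estimate quoted from [7]
# (illustrative arithmetic, values as reported in the text).
labour_rate_usd_per_h = 12.33   # hand-hoeing labour cost, US$ per hour
hours_saved_per_ha = 13.8       # corobot labour savings, h per hectare
savings_usd_per_ha = labour_rate_usd_per_h * hours_saved_per_ha
# 12.33 * 13.8 = 170.154, i.e. about US$170.15 per hectare
```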
In recent decades, research into weed detection technologies, data integration from heterogeneous sensors, and selective crop treatment with herbicides or other means has improved significantly. Of particular note are the improved navigation capabilities of unmanned vehicles and agricultural implements obtained by combining information from global navigation satellite system sensors and local measurement devices. Nevertheless, many of these studies focused exclusively on specialized tasks in which only individual components of robotic systems were studied, tested and evaluated in a narrow context. The idea of using groups of heterogeneous robotic systems for agricultural applications was discussed several years ago, but the first real attempts to use them were carried out only recently [8].
In tests of the field resistance of potato to late blight, crop scientists rate disease severity exclusively through visual examination of infections on the leaves. However, this visual assessment is generally time-consuming and quite subjective. In [9], a new estimation technique for disease severity in a field using RGB imagery from an unmanned aerial vehicle (UAV) is proposed. Potato late blight is one of the most serious diseases affecting potato production in Japan: it rapidly destroys leaves, consequently leading to yield losses and tuber quality deterioration. The results of studies based on the analysis of UAV images confirmed the feasibility of a high-throughput phenotyping system for assessing disease resistance. For field monitoring, UAVs are widely used for aerial photography owing to their low cost, the high resolution of their images compared to satellite images (due to the low flight altitude), and their ability to move in any direction, hover, and maintain a stable position in flight. In [9], images of the potato field were taken by a UAV (HiSystems GmbH Mikrokopter, Germany) with four counter-rotating propeller pairs and eight brushless motors. The test field of 36 rows was laid out along the long side of a 53.8 m x 27.0 m area, with 75 cm between rows. An area of 42.2 m x 22.5 m in the centre of the field was partitioned into 360 experimental plots. The staged image processing included geometric image correction, parametric processing, and threshold-based assessment of disease severity using the developed algorithm. The 2012 tests showed a root mean square error of the image-estimated disease severity of 14.7% and a determination coefficient of 0.77. In 2013, the results remained at approximately the same level, 17.1% and 0.73 respectively, which confirms the promise of UAV-derived image processing for assessing plant diseases and resistance to late blight.
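The threshold-based severity assessment can be sketched as follows. The colour rule and threshold here are our illustrative assumptions, not the calibrated algorithm of [9]: severity is taken as the fraction of canopy pixels whose green dominance falls below a threshold, since blighted tissue turns brown.

```python
def disease_severity(pixels, green_ratio_threshold=0.4):
    """Estimate late-blight severity as the fraction of canopy pixels
    whose green share of total intensity falls below a threshold.

    `pixels` is a list of (r, g, b) tuples already masked to the canopy;
    the threshold value is illustrative, not the one used in [9].
    """
    def is_diseased(r, g, b):
        total = r + g + b
        return total > 0 and g / total < green_ratio_threshold
    diseased = sum(is_diseased(r, g, b) for r, g, b in pixels)
    return diseased / len(pixels) if pixels else 0.0

# 80 green (healthy) pixels and 20 brownish (blighted) pixels
canopy = [(40, 120, 30)] * 80 + [(110, 70, 40)] * 20
severity = disease_severity(canopy)  # → 0.2
```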

Harvesting
Currently, labour for harvesting constitutes more than 50% of total production costs and about 71% of the total human labour required for sweet cherry production. With growing competition, the possibility of reducing production costs through the mechanization of harvesting is becoming increasingly relevant. The variability of the physical and geometrical characteristics of fruits, and of the methods of their collection, does not allow universal robotic systems to be created; research therefore currently focuses on designing mechanisms for handling the fruits of individual crops. Fruit detection accuracy is critical for obtaining high harvesting efficiency because sweet cherry is characterised by many small fruit. Automated mechanical shakers may be more practical than robotic harvesting for crops like sweet cherry: one advantage of the mechanical shaking method is that not every fruit needs to be detected, as long as concentrated areas of fruit on branches are detected. For automatically harvesting cherries using mechanical shakers, a machine vision system needs to be capable of detecting and localising fruit as well as branches. In [10], a machine-vision system was developed to segment and detect cherry tree branches in full foliage, when only intermittent segments of branches were visible. The specific objectives were to: 1) segment branch pixels in cherry tree canopy images captured in the presence of leaves and fruit in an orchard environment, and 2) detect individual branches using the segmented branch regions, estimate their location and orientation in 2D images, and assess the detection accuracy. The branch detection method included four main stages: image acquisition, image preprocessing, pixel-based image classification, and individual branch detection. A Bayesian classifier was used to classify image pixels into four classes: branch, cherry, leaf and background. The algorithm achieved 89.6% accuracy in identifying branch pixels. The morphological properties of the segmented branch sections were used to filter out noise and to group together segments of the same branch within a specified neighbourhood. A curve-fitting method was then used to fit an equation through the detected branch segments to connect them. The overall accuracy in detecting individual branches was 89.2%.
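A per-pixel Bayesian classifier of the kind used in [10] can be sketched with independent per-channel Gaussians. The class statistics below are invented for illustration; in [10] they would be estimated from labelled training pixels.

```python
import math

# Per-class RGB statistics: ((mean_r, mean_g, mean_b), (std_r, std_g, std_b)).
# These numbers are illustrative placeholders, not values from [10].
CLASS_STATS = {
    "branch":     ((105.0, 85.0, 60.0),   (20.0, 20.0, 20.0)),
    "cherry":     ((170.0, 40.0, 50.0),   (25.0, 20.0, 20.0)),
    "leaf":       ((60.0, 130.0, 55.0),   (20.0, 25.0, 20.0)),
    "background": ((200.0, 200.0, 210.0), (30.0, 30.0, 30.0)),
}

def log_likelihood(pixel, means, stds):
    """Log-likelihood of an RGB pixel under independent channel Gaussians
    (constant terms common to all classes are dropped)."""
    return sum(-0.5 * ((x - m) / s) ** 2 - math.log(s)
               for x, m, s in zip(pixel, means, stds))

def classify_pixel(pixel, priors=None):
    """Maximum a posteriori class for one pixel (uniform priors by default)."""
    priors = priors or {c: 1.0 / len(CLASS_STATS) for c in CLASS_STATS}
    return max(CLASS_STATS,
               key=lambda c: log_likelihood(pixel, *CLASS_STATS[c])
                             + math.log(priors[c]))

label = classify_pixel((100, 80, 62))  # a brownish pixel near the branch mean
```

In [10], the labels produced by such a classifier feed the subsequent morphological filtering and curve-fitting stages.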
In [11], a method for detecting tomatoes based on images obtained from UAVs in remote sensing is considered. Tomatoes grow in clusters and are generally partially hidden by their leaves and stalks. At the image acquisition stage, a quadcopter with a take-off weight of 2.1 kg, including a 0.1 kg payload, was used. Based on spectral-spatial segmentation and classification, tomatoes were detected against the general background of images obtained from different heights at different resolutions (293 x 415 and 1080 x 1920 pixels) with an accuracy of 92%.
Fruit harvesting is one of the most difficult agricultural tasks for autonomous robotic systems. This task consists of several stages and involves physical contact with the fruit at high location and orientation accuracy, real-time decision-making, detachment of the fruit from the plant without damaging either, and temporary storage of the fruit under safe conditions. The need for robotic harvesters is associated with a shortage of labour in agriculture and fierce competition, which requires lower production costs. For example, the costs of harvesting citrus fruits in the US currently exceed their cost of production and approach four times the Brazilian harvesting cost. According to economic studies, harvesting costs must be reduced by 50% to maintain global competitiveness [12].
In [13], a vision-based estimation and control system for robotic fruit harvesting was developed, in which a cooperative visual servo controller regulates a robot end-effector to the target fruit location. A Lyapunov-based stability analysis is presented to guarantee uniform global exponential stability of the closed-loop system. The robotic system includes a manipulator, a fixed camera, and a closed-loop camera-in-hand (CiH). The fixed camera provides a global view of the tree canopy, while the CiH, due to its proximity, provides high-resolution fruit images. For improved dexterity and accuracy, a seven-DOF kinematically redundant, electric-motor-driven manipulator was selected for the task. To control the robot's gripper during harvesting, a rotation controller for orienting the robot end-effector and a translation controller for regulating the CiH to the target fruit position were developed. The depth estimation results show that the estimation error depends on the distance between the fruit and the camera: the error is small (< 2%) for camera distances greater than 450 mm. The accuracy of the controller was observed to be about 15 mm, making the system suitable for harvesting medium and large varieties of citrus fruit but possibly limiting operation for small varieties such as Page and blood oranges.
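The exponential convergence guaranteed by the Lyapunov analysis in [13] can be illustrated with a toy discrete-time proportional regulator; this is a deliberate simplification for illustration, not the cooperative controller of [13]:

```python
def regulate(error, gain, dt, steps):
    """Discrete-time proportional regulation of a 3-D position error.

    Commanding the velocity v = -gain * e and integrating with step dt
    scales the error by (1 - gain*dt) each step, i.e. the error norm
    decays exponentially for 0 < gain*dt < 1.
    """
    history = [list(error)]
    for _ in range(steps):
        error = [e + dt * (-gain * e) for e in error]  # e_dot = -gain * e
        history.append(list(error))
    return history

# Initial end-effector position error of a few centimetres (metres)
traj = regulate(error=[0.10, -0.05, 0.02], gain=2.0, dt=0.05, steps=50)
final = traj[-1]  # error shrinks by a factor of 0.9 per step
```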
Planning the movement of the robot in the field is one of the important and not yet completely solved tasks of autonomous harvesting. An analysis of this problem was carried out in [14], taking into account the specifics of sweet pepper production. The manipulator and end-effector need to avoid obstacles such as stems and neighbouring fruit in order to successfully approach a target fruit and prevent damage to the plant or nearby fruit. Testing of the sweet-pepper harvesting robot was carried out in a greenhouse with a pre-known number of fruits and known stem locations. During preliminary testing in the greenhouse, it was found that the end-effector could successfully detach a fruit if a collision-free azimuth angle was selected for the end-effector, whereas the skew and elevation angles of the fruit hardly influenced detachment success. A sensitivity analysis revealed that reducing the end-effector dimensions and widening the stem spacing significantly improve the goal-configuration success.

Classification of grippers for robotic agricultural manipulators
Based on the analysis of recent publications, a classification of grippers (Fig. 1) used in robotic agricultural vehicles for the manipulation of fruits, weeds, and other objects was compiled. Range, accuracy, speed and other grasping characteristics also depend on the parameters of the manipulator. In this classification, we mainly consider the characteristics of the gripper, which is installed at the end of the manipulator and is responsible for physical contact with the object. It should be noted that the figure covers tasks requiring direct grasping of objects by an agrobot; tasks such as directional spraying of weeds or pruning of branches and leaves, in which manipulators also participate but the affected objects are usually not subsequently grasped, are also relevant but are not fully reflected here. In addition, there are examples where, after cutting, objects fall into a pipe channel and are delivered to a container on a robotic platform; for example, in [14] the manipulator is equipped with a knife and a pipe channel into which the cut sweet pepper fruits fall under gravity.
The right part of Figure 1 shows 22 types of grippers according to 6 selected criteria: drive type, presence of a drive in the gripper, number of fingers, type of gripper movement, type of mechanism, and type of sensors. Figure 2 shows examples of existing research agricultural robots equipped with combined grippers of different types according to the proposed classification.

Conclusion
The performance of monotonous, physically heavy operations in agricultural production exposes workers to the risk of musculoskeletal disorders and, in some cases, contamination with chemical preparations. Therefore, the use of robotic means for physical contact with and manipulation of objects in agricultural production is an urgent task, ensuring a reduction in production costs and improving the quality of the operations performed.
Operations to remove weeds, prune branches, and harvest fruits require accurate three-dimensional determination of the position of all objects involved in the interaction. The task is complicated by the fact that image analysis is performed against a complex background containing similar objects (orchard trees, rows of plants) and overlapping objects (leaves, branches, fruits). Unmanned aerial vehicles are already being actively used to compile the soil and crop maps needed for the operation of ground-based robotic means.
The variability of the physical and geometrical characteristics of fruits, and of the methods of their collection, does not allow universal robotic systems to be created; research therefore currently focuses on designing mechanisms for handling the fruits of individual crops. The proposed classification lists the main types of grippers used to collect fruits and vegetables.
Further work will be devoted to studying the physical interaction of agrobots with handled objects differing in weight, density, geometry, surface roughness and other parameters. When developing the gripper and the manipulator design, experience in developing software and hardware for anthropomorphic robots will be used [20,21]. The joint interaction of a group of heterogeneous ground and aerial robots performing a target agrarian task in an autonomous mission will also be investigated [22,23].

Examples of grippers for manipulation in agrarian tasks:
Three-finger gripper with a video camera for citrus harvesting [13]
Vacuum gripper with a video camera for harvesting tomatoes [15]
Six-finger pneumatic gripper with a video camera for harvesting eggplants [16]
Two-finger gripper with pressure and collision sensors for harvesting apples [17]
Two-fingered permanent gripper with a cutting knife and a video camera for removing tomato leaves [18]