A Tactile-enabled Grasping Method for Robotic Fruit Harvesting

10/18/2021
by Hongyu Zhou, et al.
Monash University

In robotic crop harvesting, intrusion of foreign objects into the gripper workspace occurs frequently and cannot be ignored, yet it is rarely addressed. This paper presents a novel intelligent robotic grasping method capable of handling obstacle interference, the first of its kind in the literature. The proposed method combines deep learning algorithms with low-cost tactile sensing hardware on a multi-DoF soft robotic gripper. In experimental validations, the proposed method demonstrated promising performance in distinguishing various grasping scenarios, and the 4-finger, independently controlled gripper showed outstanding adaptability across picking scenarios. The overall performance of this work indicates great potential for addressing the challenges of robotic fruit harvesting.


I Introduction

Robotic harvesting is one of the most challenging tasks in the agriculture industry. Due to the complexity of the orchard environment, performing successful robotic harvesting is significantly difficult [18]. In the past two decades, significant efforts have been made in this field [7, 9, 11], but few systems have proved efficient and reliable under in-field conditions. One major challenge in developing a successful fruit harvesting system is to equip robots with powerful visual and tactile sensing, so that they can reach and pick fruits like a human. Fruits on trees are most likely surrounded by stiff obstacles such as branches, trellis wires, or sprinkler lines; these obstacles not only reduce the success rate of harvesting but can also damage the robots and fruits. Recently, enormous efforts have been made in vision sensing, which enables robots to reach fruits by using visual sensors such as cameras or range sensors [6, 3]. However, visual sensing hardly works once the robotic arm is very close to the fruit. At this stage, tactile sensing is required to feed back whether the robot can pick the fruit successfully. Moreover, another issue in robotic harvesting is that the damage rate of fruits picked by robots often exceeds an acceptable value, leading to substantial loss of fruit yield and quality. Introducing tactile sensing into the picking procedure is also promising for reducing the damage rate [20].

A number of pilot studies have applied tactile sensing to robotic grippers to provide a self-sensing capability [5]. Donlon et al. [2] developed a high-resolution tactile-sensing finger for grasping purposes; inspired by GelSight sensing, it outputs images containing object information such as shape and texture upon contact. Wang et al. [14] proposed a flexible tactile sensor array with 3×3 sensing units, each with a five-electrode pattern design; this array provides three-axis contact force perception while grasping different objects. Zhu et al. [21] designed a fully flexible tactile pressure sensor using flexible graphene and silver composites as the sensing element and stretchable electrodes, respectively. The sensor has relatively high sensitivity, a wide sensing range, and considerable repeatability, and it performed well after being integrated with robotic fingers to distinguish a cylinder from a tennis ball during grasping. Yang et al. [16] proposed a novel design of optoelectronically innervated tactile fingers that collect tactile data such as normal force and torque, which were utilized to adjust the grasping configuration. Integrating vision and tactile sensing for robotic grasping also shows its advantages: for example, Guo et al. [4] combined visual and tactile sensing, where the former detects the grasp rectangle from visual images and the latter assesses the stability of the grasp.

Tactile sensing has been widely explored in many robotic grasping tasks, but only limited work focuses on robotic harvesting. To detach target fruits from the plant, various grasping techniques with tactile sensing capability have been implemented in robotic harvesting applications. Zhang et al. [17] proposed a two-finger gripper with a 4×6-element tactile sensor array installed on the inside of the fingers. The sensors visualize the force exerted on the surface of the fruit, which can be combined with the finger displacement to estimate the hardness of the target; this hardness sensing assists the picking manipulation. Cortés et al. [1] proposed a novel non-destructive sensing approach for mango ripeness classification by combining tactile sensors with near-infrared reflectance spectroscopy. This ripeness detection helps to pick fruit selectively during the robotic harvesting process.

The above review shows the importance of developing a gripper that can adjust its action based on tactile feedback. In this research, we present a novel design of a multi-DoF fin-ray gripper with embedded tactile sensors, which can adjust its grasping state based on tactile feedback. To classify the grasping status using the stress distribution on each finger, a robust perception algorithm and a deep-learning network are developed and validated. Lastly, we comprehensively evaluate the proposed system under a controlled experimental setup and in real harvesting actions. The contributions of this work are as follows:

  • We proposed a tactile-enabled dexterous soft robotic gripper to address the obstacle interference challenge in robotic fruit harvesting.

  • We proposed a classification of grasping patterns during robotic harvesting in orchard environments, which can be perceived by the tactile sensing array and processed by both traditional and deep-learning-based algorithms.

  • We validated the tactile sensing patterns and applied them to control the dexterous soft robotic gripper to achieve various grasping patterns under different grasping scenarios.

The rest of this paper is organised as follows: the design and fabrication of the gripper are presented in Section II, theoretical modelling and tactile sensing are covered in Section III, the experiments and results are presented in Section IV, and the conclusion follows in Section V.

II Gripper Design and Fabrication

II-A Mechanical Design

The main objectives of the gripper design for robotic fruit harvesting were: i) a low fruit damage rate; ii) a high harvest success rate; iii) self-protection ability in a semi-structured environment; iv) low cost. A low damage rate demands gentle contact, while a high success rate requires both sufficient grasping force and the capability to deal with obstacle interference. For self-protection, the gripper has to be capable of sensing external forces during operation.

With the above four objectives, a hybrid gripper with four flexible fingers, each actuated by an air cylinder, was designed and prototyped, as shown in Figure 1. The four actuators provide the essential grasping force, while the flexible fingers achieve gentle contact. Specifically, the finger skeleton utilises the fin-ray effect for its outstanding shape adaptability, and the silicone skin cast on the surface of each finger increases friction for a stable grasp while distributing the pressure to mitigate potential damage from force concentration. Piezoresistive tactile sensing arrays (RX-M0404S) are integrated to give the gripper a degree of environment-sensing ability.

Fig. 1: Mechanical design of the proposed multi-DoF soft gripper

To provide the adaptive gripper with a self-protection function as well as the dexterity needed to deal with obstacle interference, the four fingers are designed to act independently. Whenever an obstacle is detected by one finger, the central control system can react by either adjusting the approach posture or releasing that finger and letting the remaining fingers complete the picking process.

II-B Finite Element Modeling

The grasping motion of the proposed soft gripper/finger can be simulated to predict the deformation of the soft fingers [19, 12, 13, 8] and provide a foundation for tactile sensing integration. To perform the simulation, the hyperelastic material used (TPU: NinjaFlex) was first characterised by uniaxial tensile tests (ISO 37 standard). Dumbbell samples were printed and tested in two patterns, cross and longitudinal (Figure 2a), and the average stress-strain relation was measured with an Instron Universal Tester E3000 (Figure 2b and c). The averaged experimental data were fitted to three hyperelastic models: Yeoh, Ogden, and Mooney-Rivlin. Among these, the Ogden model provided the best fit, with parameter values (μ1, α1) = (0.03829, 4.1352), (μ2, α2) = (24.4601, 0.2123), (μ3, α3) = (24.4613, 0.2122).

Fig. 2: (a) Dumbbell test sample with longitudinal and cross pattern, (b) Instron tensile test machine, (c) Stress-strain curve of NinjaFlex
Fig. 3: Simulation of the soft fin-ray finger's displacement when contacting (a) the apple model, (b) both the apple and branch models; stress distribution of the inner layer when contacting (c) the apple model, (d) both the apple and branch models
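For readers who want to reproduce a fit of this kind, the sketch below evaluates an incompressible third-order Ogden model for uniaxial nominal stress and fits it with SciPy. It is a minimal sketch, not the authors' pipeline: the pairing of the reported values into (μi, αi) couples and the synthetic placeholder data are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def ogden_uniaxial(lam, m1, a1, m2, a2, m3, a3):
    """Nominal uniaxial stress of an incompressible 3rd-order Ogden model:
    P(lam) = sum_i mu_i * (lam**(a_i - 1) - lam**(-(a_i / 2 + 1)))."""
    stress = np.zeros_like(lam)
    for mu, alpha in ((m1, a1), (m2, a2), (m3, a3)):
        stress += mu * (lam ** (alpha - 1.0) - lam ** (-(alpha / 2.0 + 1.0)))
    return stress

# Placeholder data: a curve synthesised from the parameters reported above
# (assumed here to pair as (mu_i, alpha_i)); replace with averaged test data.
lam = np.linspace(1.0, 2.0, 50)                 # stretch = 1 + engineering strain
stress = ogden_uniaxial(lam, 0.03829, 4.1352, 24.4601, 0.2123, 24.4613, 0.2122)

popt, _ = curve_fit(ogden_uniaxial, lam, stress,
                    p0=[0.1, 2.0, 1.0, 0.5, 1.0, 0.5], maxfev=50000)
print(popt)  # fitted (mu_1, a_1, mu_2, a_2, mu_3, a_3)
```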

The fitted material model was imported into Abaqus (Dassault Systèmes, MA). The soft finger is meshed with 4-node tetrahedral elements. For contact between the gripper and the target, tangential behavior is modeled using the penalty method, and normal behavior is modeled as 'hard' contact. The lateral end of the soft finger is pinned, and a displacement is applied on the medial side to simulate the actuation of the fin-ray finger. Static analyses with nonlinear deformation enabled were performed for grasping an apple model alone and an apple model with branch interference. The final contracted states at maximum displacement are shown in Figures 3a and b, where the soft finger contacts the target fruit alone or both the fruit and the branch. The stress on the inner side, where the tactile sensors can be attached, is recorded in Figures 3c and d. Notably, when contacting the apple, or both the apple and the branch, a significant stress concentration occurs on the bottom plane of the soft finger; the stress peaks at the contact area and also rises at the connection between the bottom plane and the hinge connector. These simulation results verify the feasibility of integrating tactile sensors on the inner side of the soft finger to provide real-time stress feedback upon contact, and they also provide guidance on how and where the tactile sensors should be mounted to achieve better sensor readings.

II-C Electronic Design

Piezoresistive effect-based sensors are applied due to their robustness and energy efficiency [15]. Specifically, 24 piezoresistive sensing arrays (RX-M0404S) are embedded on the gripper, with six arrays on each finger, as shown in Figure 4a and b. Each sensor array has 4×4 taxels within a 14 mm × 14 mm area, and each taxel outputs a resistance response to external force, with a detectable force range from 0.2 N to 20 N.

Fig. 4: Layout of the tactile sensor and its integration on the fin-ray finger

To process data from the tactile sensors, a signal isolation circuit [10] is used to overcome potential crosstalk between taxels. The measured resistance value is then uploaded to the Cypress PSoC controller via a selected channel of a multiplexer. The electrical schematic and the data processing circuit are shown in Figure 5a and b, respectively.
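As a rough illustration of how the 384 taxels might be assembled into one time frame, consider the sketch below; `read_taxel` is a hypothetical stand-in for the PSoC's multiplexed ADC read, and the array-to-finger channel layout is assumed rather than taken from the schematic.

```python
import numpy as np

N_FINGERS, ARRAYS_PER_FINGER, TAXELS = 4, 6, 16   # 24 arrays x (4x4) taxels = 384

def read_taxel(array_idx: int, taxel_idx: int) -> float:
    """Hypothetical stand-in for one multiplexed read: on hardware this would
    select a MUX channel and sample the isolation circuit's output voltage."""
    return 0.0

def read_frame() -> np.ndarray:
    """Scan all 24 arrays into one 24x16 frame (one 60 ms time step)."""
    frame = np.empty((N_FINGERS * ARRAYS_PER_FINGER, TAXELS))
    for a in range(frame.shape[0]):
        for t in range(TAXELS):
            frame[a, t] = read_taxel(a, t)
    return frame
```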

Fig. 5: Electrical circuit design of the data processing unit

III Sensing Algorithm

We consider four grasping statuses in this work: null grasp, good grasp, branch interference grasp, and finger obstructed grasp, as shown in Figure 6. Specifically, a null grasp means no fruit is retrieved in the gripper; a good grasp means all fingers stably hold the fruit; a branch interference grasp means a branch is caught between one or more fingers and the fruit; and a finger obstructed grasp indicates that one or more fingers are blocked by branches or other obstacles such as trellis wires or trellis support beams.

Fig. 6: Four grasping patterns defined during robotic harvesting

III-A State-estimation Algorithm

To distinguish the status of the gripper during the fruit picking process, three assumptions are established. First, the pressure values detected by the tactile sensing arrays of the four fingers in a null grasp are assumed to be much smaller than those in a good grasp. Second, in an obstructed grasp, one or two fingers will be stopped by the obstacle when approaching the target fruit, so the obstructed finger or fingers will output a change significantly earlier than the other fingers. Last and most important, for the branch interference category, according to the theoretical modelling analysis, a branch grasped between a finger and the target fruit triggers a force concentration near the contact area. We assume such force concentration leads to two outcomes: the pressure value in the force concentration area tends to be higher than in adjacent areas, and the pressure value changes much more sharply within a given time frame. Based on these assumptions, the moving variance of the pressure value on each taxel was taken as an indicator of the rate of pressure change.

To analyse the moving variance of the pressure values during the grasping process, the feedback of all 384 taxels was arranged into a 24×16×t array, where t is the number of time frames, as shown in Figure 7. The interval of each time frame was set to 60 ms. To monitor the changes of each element, the 24×16×t array is reshaped into a 384×t matrix (M) after data normalization. A moving average over every four data points was then applied to the normalized data to filter out noise, after which a moving variance was calculated for each matrix element; the maximum moving variance among the taxels of each of the four fingers was then selected for comparison.

Fig. 7: Time series datasets
$$\sigma_i^2(t)=\frac{1}{w}\sum_{k=t-w+1}^{t}\left(M_{i,k}-\bar{M}_{i,t}\right)^2 \qquad (1)$$

where

$$\bar{M}_{i,t}=\frac{1}{w}\sum_{k=t-w+1}^{t}M_{i,k} \qquad (2)$$

Here $M_{i,k}$ is the normalized reading of taxel $i$ at time frame $k$, and $w = 4$ is the window size.

Various conventional statistical analysis methods were tested, including the moving average, moving variance, Fast Fourier Transform, and power spectral density.
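A compact sketch of the indicator built from Eqs. (1)-(2) is given below, assuming the taxel data have already been stacked into a 24×16×t NumPy array; the min-max normalization and the finger-wise taxel layout are assumptions beyond what the text specifies.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def max_moving_variance(data: np.ndarray, win: int = 4) -> np.ndarray:
    """data: (24, 16, t) taxel readings sampled every 60 ms.
    Returns, for each of the four fingers, the maximum moving variance
    over that finger's taxels at the latest time step."""
    t = data.shape[-1]
    m = data.reshape(384, t)                            # flatten to 384 x t
    m = (m - m.min()) / (np.ptp(m) + 1e-9)              # assumed min-max normalization
    # 4-point moving average to filter out noise
    kernel = np.ones(win) / win
    smooth = np.apply_along_axis(lambda x: np.convolve(x, kernel, "valid"), 1, m)
    # moving variance over the same window, per taxel (Eqs. 1-2)
    windows = sliding_window_view(smooth, win, axis=1)
    movvar = windows.var(axis=-1)                       # (384, frames)
    # assumed layout: taxels 0-95 belong to finger 1, 96-191 to finger 2, ...
    per_finger = movvar.reshape(4, 96, -1)
    return per_finger[..., -1].max(axis=1)              # max per finger, last frame
```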

III-B State-estimation Network

The tactile sensors embedded in the fin-ray fingers continuously feed back a stress distribution matrix. To simplify the signal preprocessing required by the conventional analysis, a CNN model, Deep-touch, is also designed to predict the grasping status. Deep-touch comprises three networks: a local finger network that predicts the status of a single finger, a global network that extracts features from the global stress distribution of all four fingers, and a fully-connected network that combines the local and global features to predict the current grasping status (Figure 8).

Fig. 8: Deep-touch CNN network for grasping pattern classification

The local finger network applies a ResNet-18 model with the pooling layer kernel changed to 2×1; each local finger network generates a 1×64 feature vector. The global network also uses a ResNet-18 model, taking the stress distribution of all four fingers as input and producing a 1×256 feature vector. The feature vectors from the four fingers and the global network are concatenated into a 1×512 feature vector, which is then fed into the fully-connected network to predict the current status of the gripper.
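A skeleton of this three-branch architecture in PyTorch might look like the following; the input resolution, the single-channel adaptation of `conv1`, and the head width are assumptions, and the paper's 2×1 pooling modification is omitted for brevity.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

class DeepTouch(nn.Module):
    """Sketch of Deep-touch: four local ResNet-18 branches (64-d each),
    one global ResNet-18 over the stacked maps (256-d), and a
    fully-connected head over the 512-d concatenation."""
    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.local = nn.ModuleList()
        for _ in range(4):                       # one branch per finger
            m = resnet18(weights=None)
            m.conv1 = nn.Conv2d(1, 64, 7, stride=2, padding=3, bias=False)
            m.fc = nn.Linear(m.fc.in_features, 64)
            self.local.append(m)
        g = resnet18(weights=None)               # global branch, 4-channel input
        g.conv1 = nn.Conv2d(4, 64, 7, stride=2, padding=3, bias=False)
        g.fc = nn.Linear(g.fc.in_features, 256)
        self.globalnet = g
        self.head = nn.Sequential(
            nn.Linear(4 * 64 + 256, 128), nn.ReLU(), nn.Linear(128, num_classes))

    def forward(self, x):                        # x: (B, 4, H, W), one map per finger
        feats = [self.local[i](x[:, i:i + 1]) for i in range(4)]
        feats.append(self.globalnet(x))
        return self.head(torch.cat(feats, dim=1))

# Example: 4x4 tactile maps assumed upsampled to 64x64 before the network.
logits = DeepTouch()(torch.randn(2, 4, 64, 64))  # -> (2, 4) class scores
```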

IV Experiments and Results

IV-A Experiment on Tactile-enabled Grasping

IV-A1 Experiment Setup

To validate the proposed method, the sensor-integrated gripper was fixed on a desk and grasp tests were conducted under the four scenarios. Twelve apples of different varieties were used as target objects, and each apple was grasped multiple times in different orientations. To make the branch settings realistic, the branches were also placed in different orientations and positions.

Fig. 9: Lab experiment setup with (a) branch grasping, (b) obstacle obstructed grasping

IV-A2 Experiment on Conventional Methods

In total, 200 grasp tests were conducted to validate the proposed conventional method, as shown in Figure 9: 96 grasps with branch interference, 48 good grasps, 30 null grasps, and 26 finger obstructed grasps.

Across all tests, the proposed moving variance method achieved an accuracy of 96.6 percent in detecting the 30 null grasps, 88.5 percent in detecting the 26 obstructed grasps, and 75.0 percent in determining which finger had grasped a branch among the 96 branch interference grasps. The detailed results are shown in Table I.

Figure 10 shows the moving variance over the entire grasp process (approach-grasp-hold-release). The amplitude of the maximum moving variance on the four fingers in a null grasp is significantly smaller than in the other scenarios, while in the obstructed grasp scenario the variance changes on the four fingers show apparent asynchronicity, indicating that the deformation of one finger occurs remarkably earlier than that of the others.

Fig. 10: Voltage and variance change during grasping process among different scenarios

The moving variance changes for branch interference grasps, with a branch grasped by each of the different fingers, are presented in Figure 11a, b, c, and d, respectively; the finger with a branch caught in it outputs a much higher moving variance during the grasp phase.

Fig. 11: Variance change when one finger grasped a branch

In terms of differentiating a good grasp from a branch interference grasp, various conventional statistical analysis methods were tried, including the moving average, moving variance, Fast Fourier Transform, and power spectral density; each of these conventional features only works under certain conditions, and none covers all cases. This indicates that some hidden features distinguishing good grasps from branch interference grasps were yet to be fully extracted. Deep learning methods are therefore employed to bridge the gap.

IV-A3 Experiment on Deep-learning Methods

To evaluate the performance of the trained Deep-touch classification network, the same dataset as for the conventional method was used. The stress distribution images of the four fingers and their merged result were input into the classification network, and the predicted labels were compared with the ground truth labels to verify the accuracy of the Deep-touch network. The classification accuracy for each grasping status is also summarised in Table I. The Deep-touch network achieves the same classification accuracy on the null grasp status as the conventional algorithm, since the relevant features are obvious enough for both approaches to extract: the null grasp shows a much smaller voltage change, and thus a much less dense colour visualization in the stress distribution image. For the other grasp statuses, the deep-learning-based algorithm shows superior classification accuracy, owing to the network's combination of global and local branches, which supports richer feature extraction. The Deep-touch network demonstrated a minimum accuracy improvement of 3.8% across these three statuses, with an average improvement of 15.13%. The deep-learning-based network shows particular superiority in differentiating good grasps from branch interference grasps, and its overall 89.4% accuracy in distinguishing the four scenarios is quite promising.

Scenario                    Category         Grasps   Conventional   Deep learning
Null grasp                  Without leaves     15        96.6%           96.6%
                            With leaves        15
Finger obstructed grasp     N/A                26        88.5%           92.3%
Good grasp                  N/A                48        52.1%           85.4%
Branch interference grasp   Finger 1           24        75.0%           83.3%
                            Finger 2           24
                            Finger 3           24
                            Finger 4           24

TABLE I: Detection accuracy of the proposed conventional method and the Deep-touch network
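For reference, the improvement figures quoted above follow directly from the per-class accuracies in Table I:

$$\min(92.3-88.5,\; 85.4-52.1,\; 83.3-75.0)=3.8\%,\qquad \frac{3.8+33.3+8.3}{3}=\frac{45.4}{3}\approx 15.13\%.$$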

IV-B Experiment on Harvesting System

A robotic system was used to demonstrate tactile-enabled grasping in the lab environment. The system includes four subsystems: a vehicle, a 6-DoF universal robotic arm, the proposed gripper, and a vision system, as shown in Figure 12. The robot has two vision blocks, global and local. The global vision block includes a DJI Livox Mid-70 LiDAR and an RGB camera; the colour image is used to perform 2D fruit recognition and segmentation (Figure 13a) and is calibrated and fused with the point cloud from the LiDAR, as shown in Figure 13b, c, and d. The local vision block has a depth camera on the end-effector for second-stage processing. A geometry-aware detection network applied in both blocks localises each apple and predicts its reach angle. OctoMap is used to model the occupied space from the vision point cloud, and the MoveIt! framework is used for arm planning and execution.

Fig. 12: Monash robotic fruit retrieving system
Fig. 13: (a) RGB image captured by the global camera with apple detection and segmentation, colored point clouds of apple tree model from RGB-LiDAR fusion (b) left view, (c) centre view, (d) right view

The experiments were conducted in the lab environment, with seven apples hung on an artificial apple tree to simulate different occlusion conditions. The targets were detected and localised by the global and local vision systems, and the proposed multi-DoF gripper was then manipulated to approach each one. The gripper was actuated to grasp upon reaching the position in front of the target fruit, at which point the tactile sensors captured and transmitted the contact force information. The deep-learning-based classification network then classified the grasp into one of the four patterns, and the multi-DoF gripper adjusted its mode accordingly. For example, it opens and re-attempts the grasp when a null or obstructed pattern is detected, and it detaches the grasped fruit once a good grasp is confirmed. When a branch interference grasp is detected (Figure 14a), the traditional algorithm is used afterward to locate the finger subject to branch interference; that finger is then pneumatically opened to release the grasped branch, as shown in Figure 14b, and the detaching motion continues to remove the target fruit from the tree.

Fig. 14: Adjusted grasping pattern of the multi-DoF gripper
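The pattern-dependent reactions described above amount to a small state machine; the sketch below is illustrative only, and the gripper and classifier calls (`classify`, `open_finger`, `detach`, and so on) are hypothetical names rather than the system's actual API.

```python
from enum import Enum

class Grasp(Enum):
    NULL = 0
    GOOD = 1
    BRANCH_INTERFERENCE = 2
    OBSTRUCTED = 3

def picking_step(gripper, classifier, locator, max_retries: int = 3) -> bool:
    """One tactile-guided grasp attempt; returns True when a fruit is picked."""
    for _ in range(max_retries):
        gripper.close()                                        # actuate all 4 fingers
        status = classifier.classify(gripper.tactile_frames()) # Deep-touch output
        if status == Grasp.GOOD:
            gripper.detach()                                   # pull the fruit off the tree
            return True
        if status == Grasp.BRANCH_INTERFERENCE:
            # conventional algorithm locates the interfering finger
            finger = locator.interfering_finger(gripper.tactile_frames())
            gripper.open_finger(finger)                        # release the branch
            gripper.detach()                                   # remaining fingers finish
            return True
        gripper.open()                                         # NULL or OBSTRUCTED: retry
    return False
```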

V Conclusion

This work developed an intelligent robotic grasping method based on sensing algorithms and a novel soft gripper prototype, which can distinguish various grasping scenarios during the robotic harvesting process and adjust its grasping action through the multi-DoF mechanism. The proposed sensing-based grasping method is the first of its kind in the literature to handle the branch interference challenge, and it can be applied to broader fields wherever foreign objects intrude into the gripper workspace. In addition, deep learning algorithms were developed to achieve accurate grasping status detection despite the noise introduced by the silicone skin of the gripper fingers; compared with the conventional method, the proposed network achieved an average 15.13% increase in classification accuracy. A robotic harvesting system integrating the core components was built, in which the gripper approaches and contacts the target fruit based on a robust vision and manipulation system. The proposed tactile-enabled grasping method was validated through experiments: the grasp patterns can be identified and used to control the soft gripper to adjust its actions.

Acknowledgment

We gratefully acknowledge the financial support from Australian Research Council (ARC ITRH IH150100006). We would like to thank Mr. Cooper Gerwing, Mr. Charles Troeung, Dr. Wesley Au, Dr. Shao Liu and Dr. Godfrey Keung in the Laboratory of Motion Generation Analysis at Monash University for their assistance on this work.

References

  • [1] V. Cortés, C. Blanes, J. Blasco, C. Ortiz, N. Aleixos, M. Mellado, S. Cubero, and P. Talens (2017) Integration of simultaneous tactile sensing and visible and near-infrared reflectance spectroscopy in a robot gripper for mango quality assessment. Biosystems Engineering 162, pp. 112–123. Cited by: §I.
  • [2] E. Donlon, S. Dong, M. Liu, J. Li, E. Adelson, and A. Rodriguez (2018) Gelslim: a high-resolution, compact, robust, and calibrated tactile-sensing finger. In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 1927–1934. Cited by: §I.
  • [3] D. Font, T. Pallejà, M. Tresanchez, D. Runcan, J. Moreno, D. Martínez, M. Teixidó, and J. Palacín (2014) A proposal for automatic fruit harvesting by combining a low cost stereovision camera and a robotic arm. Sensors 14 (7), pp. 11557–11579. Cited by: §I.
  • [4] D. Guo, F. Sun, B. Fang, C. Yang, and N. Xi (2017) Robotic grasping using visual and tactile sensing. Information Sciences 417, pp. 274–286. Cited by: §I.
  • [5] T. Jin, Z. Sun, L. Li, Q. Zhang, M. Zhu, Z. Zhang, G. Yuan, T. Chen, Y. Tian, X. Hou, et al. (2020) Triboelectric nanogenerator sensors for soft robotics aiming at digital twin applications. Nature communications 11 (1), pp. 1–12. Cited by: §I.
  • [6] H. Kang and C. Chen (2020) Fast implementation of real-time fruit detection in apple orchards using deep learning. Computers and Electronics in Agriculture 168, pp. 105108. Cited by: §I.
  • [7] H. Kang, H. Zhou, X. Wang, and C. Chen (2020) Real-time fruit recognition and grasping estimation for robotic apple harvesting. Sensors 20 (19), pp. 5670. Cited by: §I.
  • [8] F. Largilliere, V. Verona, E. Coevoet, M. Sanz-Lopez, J. Dequidt, and C. Duriez (2015) Real-time control of soft-robots using asynchronous finite element modeling. In 2015 IEEE International Conference on Robotics and Automation (ICRA), pp. 2550–2555. Cited by: §II-B.
  • [9] G. Lin, L. Zhu, J. Li, X. Zou, and Y. Tang (2021) Collision-free path planning for a guava-harvesting robot based on recurrent deep reinforcement learning. Computers and Electronics in Agriculture 188, pp. 106350. Cited by: §I.
  • [10] J. M. Romano, K. Hsiao, G. Niemeyer, S. Chitta, and K. J. Kuchenbecker (2011) Human-inspired robotic grasp control with tactile sensing. IEEE Transactions on Robotics 27 (6), pp. 1067–1079. Cited by: §II-C.
  • [11] Y. Tang, M. Chen, C. Wang, L. Luo, J. Li, G. Lian, and X. Zou (2020) Recognition and localization methods for vision-based fruit picking robots: a review. Frontiers in Plant Science 11, pp. 510. Cited by: §I.
  • [12] X. Wang, A. Khara, and C. Chen (2020) A soft pneumatic bistable reinforced actuator bioinspired by venus flytrap with enhanced grasping capability. Bioinspiration & Biomimetics 15 (5), pp. 056017. Cited by: §II-B.
  • [13] X. Wang, H. Zhou, H. Kang, W. Au, and C. Chen (2021) Bio-inspired soft bistable actuator with dual actuations. Smart Materials and Structures. Cited by: §II-B.
  • [14] Y. Wang, J. Chen, and D. Mei (2019) Flexible tactile sensor array for slippage and grooved surface recognition in sliding movement. Micromachines 10 (9), pp. 579. Cited by: §I.
  • [15] F. Xu, X. Li, Y. Shi, L. Li, W. Wang, L. He, and R. Liu (2018) Recent developments for flexible pressure sensors: a review. Micromachines 9 (11), pp. 580. Cited by: §II-C.
  • [16] L. Yang, X. Han, W. Guo, F. Wan, J. Pan, and C. Song (2021) Learning-based optoelectronically innervated tactile finger for rigid-soft interactive grasping. IEEE Robotics and Automation Letters 6 (2), pp. 3817–3824. Cited by: §I.
  • [17] Z. Zhang, J. Zhou, Z. Yan, K. Wang, J. Mao, and Z. Jiang (2021) Hardness recognition of fruits and vegetables based on tactile array information of manipulator. Computers and Electronics in Agriculture 181, pp. 105959. Cited by: §I.
  • [18] Y. Zhao, L. Gong, Y. Huang, and C. Liu (2016) A review of key techniques of vision-based control for harvesting robot. Computers and Electronics in Agriculture 127, pp. 311–323. Cited by: §I.
  • [19] G. Zheng, O. Goury, M. Thieffry, A. Kruszewski, and C. Duriez (2019) Controllability pre-verification of silicone soft robots based on finite-element method. In 2019 International Conference on Robotics and Automation (ICRA), pp. 7395–7400. Cited by: §II-B.
  • [20] H. Zhou, X. Wang, W. Au, H. Kang, and C. Chen (2021) Intelligent robots for fruit harvesting: recent developments and future challenges. Cited by: §I.
  • [21] L. Zhu, Y. Wang, D. Mei, and C. Jiang (2020) Development of fully flexible tactile pressure sensor with bilayer interlaced bumps for robotic grasping applications. Micromachines 11 (8), pp. 770. Cited by: §I.