This work studies tactile manipulation of cables and ropes. In particular we look at the task of picking one end of a cable with a gripper and following it to the other end with a second gripper, as shown in Fig. 1. One can think of this task as similar to contour following, but with the additional complexity that cables are deformable and are not rigidly attached to the environment. This skill is important to many tasks that involve manipulation of deformable objects, such as folding clothes, tying knots, or handling cable harnesses.
Cable following is challenging—and interesting—because it requires controlling grasp force (to enable smooth sliding of the cable), as well as controlling grasp pose (to prevent the cable from falling from the gripper fingers). A key motivation is that neither the configuration of the cable in the gripper, nor the forces on the cable in the grasp, are directly observable with vision sensors, but can be observed with tactile sensors.
We approach cable manipulation by dividing the behavior into two goals: 1) Cable grip control, which monitors the gripping force to maintain friction forces within a useful range; and 2) Cable pose control, which regulates the configuration of the cable to be centered and aligned with the fingers. To accomplish this task, we build and integrate a system with the following modules:
Tactile perception. We estimate in real-time the pose of the cable in the gripper, the friction force pulling from the gripper, and the quality of the tactile imprints (Sec. IV-B).
Cable grip controller. The gripper regulates the gripping force by combining a PD controller and a leaky integrator that modulates the friction force on the cable, and provides grip forces to yield tactile imprints of sufficient quality for perception (Sec. IV-C).
Cable pose controller. The robot controls the cable configuration on the gripper fingers with an LQR controller, based on a learned linear model of the dynamics of the sliding cable (Sec. IV-C).
We evaluate the complete system on the task of cable following for various cables, sliding at different velocities, and benchmark against several baseline algorithms. The results in Sec. VI show that the system succeeds at following different cables from one end to the other with very few regrasps.
II Related Work
In this section, we review related work on cable manipulation using either vision or tactile feedback, as well as classical contour following. We also review other control applications of vision-based tactile sensors.
II-A Cable Manipulation & Contour Following
Cable/rope manipulation Manipulating wire-like objects has attracted attention in the robotics community [2, 3], with tasks including tying knots, surgical suturing, and wire reshaping [6, 7]. Kudoh et al. proposed a method for in-air knotting by a dual-arm, multi-finger robot, in which the knotting task is represented as a sequence of hand motions and supporting-arm motions executed without sensor feedback. Nair et al. developed a learning-based robot system to reproduce rope configurations demonstrated by a human. Li et al. proposed a vision-based controller for USB wire grasping and manipulation that allows the robot to automatically grasp and align USB wires into grooves. Palli and Pirozzi implemented a cable insertion task with tactile feedback. All of these works rely on stable grasps of the cable and do not exploit cable sliding.
Contour following Cable following is related to contour following, which has been widely studied and is relevant, for example, to surface cleaning, inspection, and polishing. Baeten and De Schutter proposed a hybrid vision/force control approach for planar contour following, where vision measures the contour and watches for corners. Lepora et al. developed a planar contour following controller that uses TacTip sensors to detect the orientation and position of an edge. Most contour following work does not involve grasping the object, since the contour is part of the rigid environment, which simplifies the dynamics of the task. The closest work to ours is Hellman et al., who proposed a reinforcement learning approach to close a plastic ziplock bag with BioTac sensors, which requires both grasping and following the edge of the bag. In contrast to our approach, they use discrete actions and a constant grasping force.
II-B Vision-Based Tactile Sensors for Control
Vision-based tactile sensors [1, 15, 13, 16, 17] use a camera to capture the deformation of their contact surface as high-resolution images, which can encode information about local geometry, forces, and slip. These rich tactile signals make it possible to monitor and control contact interactions.
Geometry information Li et al. implemented a USB insertion task from random grasping poses based on object pose estimation with the GelSight sensor. Izatt et al. used the 3D point cloud from a GelSight sensor in a Kalman filter to better register the position of a screwdriver in a peg-in-hole task. Calandra et al. and Hogan et al. demonstrated that tactile images from GelSight and GelSlim sensors can be used to evaluate the quality of a grasp and choose a suitable regrasp. Pestell et al. used features extracted from TacTip images to stabilize the grasp of a Shadow Modular Grasper. Tian et al. proposed a tactile-based model predictive control method to reposition an object.
Force information Force information in most vision-based tactile sensors comes from tracking the motion of markers on the sensor surface. Yamaguchi et al.  designed a FingerVision tactile sensor and demonstrated that the extracted force information is useful to control the grasp of a knife cutting vegetables. Dong et al.  proposed an incipient slip detection algorithm with the GelSlim sensor and used it to screw a cap on a bottle with online adaptive grip force. Song et al.  demonstrated the use of GelSight and FingerVision for skewering food.
Contact information Dong and Rodriguez  used a learned model to estimate misalignments when packing objects in a box with GelSlim, to handle external contacts.
III Cable Following Task
Task Description The goal of the cable following task is to grip the head of a cable with a robot gripper using appropriate force, and then control the gripper to follow the cable contour all the way to its tail. During the process, the head of the cable is held firmly by a second, fixed gripper. The moving gripper is allowed to regrasp the cable by bringing it back to the fixed gripper, resulting in two-hand coordination with one hand fixed. The method should generalize to cables with different properties (shape, stiffness, surface roughness).
Robot System In order to tackle this task, the following four hardware elements are necessary:
tactile sensor to measure the grasped cable position and orientation in real time
tactile sensor to measure the amount of friction force during sliding in real time
fast reactive gripper to modulate the grasping force according to the measured friction
fast reactive robot arm to follow the measured cable orientation and keep the measured cable position in the center of the gripper.
The key idea of our method is to use tactile control to monitor and manipulate the position of, and forces on, the cable between the fingers. The concept is illustrated in Fig. 2. We divide the tactile controller into two parts:
Cable Grip Control so the cable alternates between firmly grasped and sliding smoothly,
Cable Pose Control so the cable remains centered and aligned with the fingers when pulling and sliding.
In this section, we describe the implementation of the tactile controller by introducing the reactive gripper, the tactile perception system, the modeling of the cable, and the two controllers.
IV-A Reactive Gripper
Most commercial robotic grippers do not offer sufficient bandwidth for real-time feedback control. To that end, we design a parallel gripper with two fingers (each carrying a revised GelSight sensor), compliant parallel-guiding mechanisms, and slider-crank linkages actuated by a servo motor (Dynamixel XM430-W210-T, Robotis), as shown in Fig. 3.
Mechanism design Parallelogram mechanisms are widely used to yield lateral displacement, and slider-crank mechanisms are broadly employed to actuate them in parallel grippers. We use both here to facilitate parallel grasping. To make the actuation mechanism compact, we use a tendon-driven system.
One end of a string (tendon) is tied to a motor disk fixed on the servo motor, which is installed in a motor case. The other end of the string is tied to the slider, as shown in Fig. 3a. We place a compression spring between the slider and the motor case with pre-tension, forming a slider-string-spring system. One end of the crank is connected to the slider and the other is coupled with the rocker of the parallelogram mechanism. The finger is attached to the coupler of the parallelogram mechanism. The string drives the slider down, actuating the parallelogram mechanism via the crank linkage and producing the desired lateral displacement of the finger. Two fingers assembled symmetrically around the slider yield a parallel gripper.
Mechanism dimensions The design guidelines are as follows: 1) the maximum opening of the gripper is targeted at 100 mm, i.e., 50 mm of displacement per finger; 2) the parallelogram mechanism should fit the size of the GelSight fingertips; 3) the overall size of the gripper should be minimized. With these constraints, we found candidate dimensions of 15 mm, 50 mm, 30 mm, 20 mm, 23.25 mm, and 100 mm, with initial angles of 127° and 307°; refer to Fig. 3 for the definition of each variable.
Compliant joint design The rigid parallelogram mechanism in Fig. 3b contains 28 pieces, making its assembly tedious and time-consuming. We redesign the linkage with flexural joints to simplify the mechanism. That process yields the compliant parallel-guiding mechanism in Fig. 3c, equivalent to the rigid parallelogram: it reduces the 28 pieces to a single part while offering the same kinematic functionality and keeping the prototype compact at the rest position.
IV-B Tactile Perception
Figure 4 illustrates the process used to extract the cable pose, friction force, and grasp quality from tactile images.
Cable pose estimation First, we compute depth images from the GelSight images with a Fast Poisson solver. Then, we extract the contact region by thresholding the depth image. Finally, we apply Principal Component Analysis (PCA) to the contact region to obtain the principal axis of the cable's imprint on the sensor.
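The pose-estimation step above (threshold the depth image, then PCA for the principal axis) can be sketched in a few lines of numpy; the threshold value here is an illustrative placeholder, not a value from the paper:

```python
import numpy as np

def estimate_cable_pose(depth, contact_thresh=0.2):
    """Estimate the cable-imprint pose from a GelSight depth image.

    Returns the centroid (x, y) of the contact region and the angle
    (radians) of its principal axis, obtained via PCA on the pixels
    above the depth threshold.
    """
    ys, xs = np.nonzero(depth > contact_thresh)   # contact-region pixels
    pts = np.stack([xs, ys], axis=1).astype(float)
    centroid = pts.mean(axis=0)
    cov = np.cov((pts - centroid).T)              # 2x2 covariance of the region
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]        # principal-axis direction
    angle = np.arctan2(major[1], major[0])
    return centroid, angle
```

On a synthetic horizontal imprint, the recovered angle is aligned with the x axis and the centroid sits at the band's center.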
Cable friction force estimation We use blob detection to locate the centers of the black markers in the tactile images. Then we use a matching algorithm to associate marker positions between frames, with a regularization term that maintains the smoothness of the marker displacement flow. We compute the mean of the marker displacement field, D, which is approximately proportional to the friction force.
Cable grasp quality We evaluate the grasp quality, S, by whether the area of the contact region exceeds a threshold. A tactile imprint of poor quality (small contact region) yields noisy and uncertain pose estimates. As shown in Fig. 5, increasing the grasping force increases the grasp quality.
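A minimal sketch of the friction-force proxy D and the area-based quality signal S described above, assuming the marker matching step has already produced index-aligned point arrays; the threshold values are placeholders, not the paper's:

```python
import numpy as np

def mean_marker_displacement(ref_pts, cur_pts):
    """Mean displacement D of the tracked markers (friction proxy).

    ref_pts, cur_pts: (N, 2) arrays of marker centers, already
    associated frame-to-frame (the matching step with its smoothness
    regularizer is assumed to have run upstream).
    """
    disp = np.asarray(cur_pts, float) - np.asarray(ref_pts, float)
    return disp.mean(axis=0)

def grasp_quality(depth, contact_thresh=0.2, min_area_px=150):
    """Binary quality signal S: is the contact region large enough?"""
    area = int(np.count_nonzero(depth > contact_thresh))
    return area >= min_area_px
```

For example, a uniform (2, -1)-pixel marker shift yields D = (2, -1), and a small contact patch returns S = False.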
IV-C Tactile Controllers
Cable Grip Controller The goal of the grip controller is to modulate the grasping force such that 1) the friction force stays within a range suitable for cable sliding (too small and the cable falls; too large and the cable gets stuck), and 2) the tactile signal quality remains sufficient for perception. We employ a combination of a PD controller and a leaky integrator. The PD controller uses the mean marker displacement D (approximately the friction force) as feedback and regulates it to a predefined target value D_ref. With error e = D_ref − D, the PD controller is expressed as

u_PD = K_p e + K_d ė,

where K_p and K_d are the coefficients of the proportional and derivative terms, and D is the measured mean marker displacement. The leaky integrator raises the output of the PD controller while the signal quality S is poor:

I_{t+1} = (1 − λ) I_t + (1 − S_t),

where λ is the leakage at each time step and S is the signal-quality indicator.
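A sketch of this grip controller, combining the PD term on D with a leaky integrator that charges while the quality signal is poor; the gains, target value, and leak rate below are illustrative, not the paper's:

```python
class CableGripController:
    """PD loop on the mean marker displacement D (friction proxy)
    plus a leaky integrator that raises the command while the
    tactile signal quality S is poor. All parameters are placeholder
    values chosen for illustration.
    """

    def __init__(self, d_ref=1.5, kp=0.8, kd=0.1, ki=0.05, leak=0.02):
        self.d_ref, self.kp, self.kd = d_ref, kp, kd
        self.ki, self.leak = ki, leak
        self.prev_err = 0.0
        self.integ = 0.0          # leaky-integrator state

    def update(self, d_meas, quality_ok, dt=1.0 / 60.0):
        """Return the grip-force command for one 60 Hz control tick."""
        err = self.d_ref - d_meas
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        # Leak toward zero every step; charge up while quality is poor.
        self.integ *= (1.0 - self.leak)
        if not quality_ok:
            self.integ += self.ki
        return self.kp * err + self.kd * deriv + self.integ
```

While quality stays poor the command ramps up; once quality recovers, the integrator leaks back toward zero and the PD term dominates.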
Model of Cable-Gripper Dynamics We model the cable-gripper dynamics as a planar pulling problem. As shown in Fig. 6, the region of the cable in contact with the tactile sensor (blue rectangle on the right) is represented as a 2D object. We parameterize its position and orientation in the sensor frame with y and θ. We further define φ as the angle between the center of the cable on the moving gripper and the fixed gripper (blue rectangle on the left). These three parameters define the state of the cable-gripper system. Finally, we define the control input u as the direction of pulling, expressed relative to the angle φ (red arrow).
Since the deformable gel surface has complex friction dynamics, we use a data-driven method to fit a linear dynamic model rather than deriving one from first principles. With state x = (y, θ, φ) and control input u, the linear dynamic model is

x_{t+1} = A x_t + B u_t,

where A and B are the linear coefficients of the model.
To efficiently collect data, we use a simple proportional (P) controller, supplemented with uniform noise, as the base controller for the data collection process. The P controller commands the velocity of the robot TCP along the y axis as v_y = k y + ε, while the velocity along the x axis is held constant; here k is the coefficient of the proportional term and ε is random noise sampled from a uniform distribution. The intuition behind this baseline controller is that when the robot (sensor) moves in the positive y direction, the cable is pulled in the opposite direction and travels back to the center of the sensor.
We collect approximately 2000 data points with a single cable by running several trajectories with different initial cable configurations. We use 80% of the data for the linear regression of the matrices A and B and validate the result on the remaining 20%.
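The regression step can be sketched as a single least-squares solve for the stacked matrix [A | B]; here synthetic data generated from made-up ground-truth matrices stands in for the logged trajectories:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth dynamics used only to synthesize data; the real data
# would come from the noisy P-controller runs described above.
A_true = np.array([[0.95, 0.10, 0.00],
                   [0.00, 0.90, 0.05],
                   [0.02, 0.00, 0.98]])
B_true = np.array([[0.10], [0.20], [0.05]])

X = rng.normal(size=(2000, 3))              # states x_t = (y, theta, phi)
U = rng.uniform(-1.0, 1.0, size=(2000, 1))  # pulling-direction inputs u_t
Y = X @ A_true.T + U @ B_true.T             # next states x_{t+1}

# 80/20 split, then one least-squares solve for the stacked [A | B].
n_train = int(0.8 * len(X))
Z = np.hstack([X, U])                       # regressors [x_t, u_t]
theta, *_ = np.linalg.lstsq(Z[:n_train], Y[:n_train], rcond=None)
A_fit, B_fit = theta.T[:, :3], theta.T[:, 3:]

# Held-out validation error on the remaining 20%.
val_err = np.abs(Z[n_train:] @ theta - Y[n_train:]).max()
```

On noiseless synthetic data the fit recovers A and B to numerical precision; with real sensor data the held-out error instead quantifies how well a linear model captures the sliding dynamics.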
Cable Pose Controller The goal of the cable pose controller is to keep the cable position at the center of the GelSight sensor (y = 0), the orientation of the cable parallel to the x axis (θ = 0), and the inclination of the pulled cable also parallel to the x axis (φ = 0). The nominal trajectory of the cable pose controller (x_ref, u_ref) is then constant and equal to (0, 0); that is, we regulate the system around zero.
We formulate an LQR controller with the A and B matrices from the linear model and solve for the optimal feedback gain K, which in turn gives the optimal control input u = −K x. We choose the LQR cost weights so that regulating y and θ (making sure the cable does not fall) is more important than regulating φ (keeping the cable straight).
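Given fitted A and B matrices, the LQR gain can be computed by iterating the discrete Riccati recursion to convergence. The dynamics and cost weights below are placeholders that only mirror the stated priority of y and θ over φ; a library routine such as scipy's solve_discrete_are would serve equally well:

```python
import numpy as np

def dlqr_gain(A, B, Q, R, iters=500):
    """Infinite-horizon discrete LQR gain K (u = -K x) obtained by
    iterating the Riccati recursion P <- Q + A'P(A - BK)."""
    P = Q.copy()
    for _ in range(iters):
        BtP = B.T @ P
        K = np.linalg.solve(R + BtP @ B, BtP @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

# Illustrative model and weights: y and theta (cable falling) are
# penalized more than phi (cable straightness); none of these numbers
# come from the paper.
A = np.array([[0.95, 0.10, 0.00],
              [0.00, 0.90, 0.05],
              [0.02, 0.00, 0.98]])
B = np.array([[0.10], [0.20], [0.05]])
Q = np.diag([1.0, 1.0, 0.1])
R = np.array([[0.1]])
K = dlqr_gain(A, B, Q, R)
closed_loop = A - B @ K   # closed-loop dynamics under u = -K x
```

A quick sanity check is that the closed-loop matrix A − BK has spectral radius below 1, i.e., the regulated cable pose converges to zero.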
V-A Experimental Setup
The experimental setup in Fig. 7 includes a 6-DOF UR5 robot arm, two reactive grippers (as described in Section IV-A), and two pairs of revised fingertip GelSight sensors attached to the gripper fingers. One gripper is fixed on the table and the other is attached to the robot. The control loop frequencies of the UR5 and the gripper are 125 Hz and 60 Hz, respectively.
We use five different cables/ropes to test the controllers (Fig. 9 bottom): Thin USB cable with nylon surface; thin nylon rope; thin USB cable with rubber surface; thick nylon rope; and thick HDMI cable with rubber surface.
V-B Cable Following Experiments
Experimental process The head of the cable is initially grasped firmly by the fixed gripper, which is secured at a known position. The moving gripper picks up the cable and follows it along its length until reaching the tail end. During this process, the grasping force is modulated by the cable grip controller and the pose of the cable is regulated by the cable pose controller. The moving gripper regrasps the cable, by feeding the held portion back to the fixed gripper, when it detects that it is about to lose control of the cable or when the robot reaches its workspace bounds. During a regrasp, the robot adjusts the position of the moving gripper according to the position of the cable detected by the fixed gripper.
Metrics We use three metrics to evaluate performance:
Ratio of cable followed to total cable length.
Distance traveled per regrasp, normalized by the maximum workspace of the moving gripper.
Velocity, normalized by the maximum velocity along the following direction.
Note that all of these metrics have a maximum, ideal value of 1.
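For concreteness, the three metrics can be computed as follows (argument names are illustrative, not from the paper):

```python
def evaluation_metrics(followed_len, total_len,
                       distance_traveled, n_regrasps, workspace_len,
                       mean_speed, max_speed):
    """The three normalized metrics; each has an ideal value of 1."""
    return {
        # Fraction of the cable's length successfully followed.
        "followed_ratio": followed_len / total_len,
        # Distance covered per regrasp, normalized by the workspace.
        "dist_per_regrasp": distance_traveled
            / (max(n_regrasps, 1) * workspace_len),
        # Achieved speed relative to the commanded maximum.
        "norm_velocity": mean_speed / max_speed,
    }
```

For example, a run that follows the full cable (ratio 1.0), covers 0.9 m in 2 regrasps with a 0.45 m workspace (1.0), at 0.02 m/s against a 0.025 m/s maximum (0.8), scores (1.0, 1.0, 0.8).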
Controller comparison We compare the proposed LQR controller with three baselines: 1) moving the robot along the nominal direction without any feedback (open-loop controller); 2) the open-loop controller with emergency regrasps triggered before losing the cable; and 3) the proportional (P) controller used to collect data. Since the initial configuration of the cable affects the results dramatically, we keep the configuration as similar as possible across the control experiments and average the results over 5 trials of each experiment.
Generalization We conduct control experiments with the LQR robot controller and the PD gripper controller to test performance across 1) different velocities (0.025, 0.045, and 0.065 m/s) and 2) the 5 different cables. We again run 5 trials per experiment and average the results.
VI Experimental Results
In this section we detail the results of the cable following experiments with different robot controllers, velocities, and cables, according to the evaluation metrics. See Fig. 9 for a summary.
VI-A Controller Evaluation
We compare four different robot controllers: open-loop, open-loop with emergency regrasps, the P controller, and the LQR controller. The top row of Fig. 9 shows that the open-loop controller follows only 36% of the total length; the gripper loses the cable easily where it curves. The simple addition of emergency regrasps is sufficient for the open-loop controller to finish the task, which indicates that timely detection of when the cable is about to fall from the gripper is important for this task. This controller, however, still requires many regrasps and is slower than the P and LQR controllers. The results show that the LQR controller requires the fewest regrasps of all the controllers. The LQR controller does not show much improvement in the velocity metric, in part because the robot travels farther while correcting for cable deviations.
Figure 8 shows snapshots of the experimental process using the LQR controller. Note that this controller always tries to move the gripper back to the center of the trajectory once the cable is in the nominal configuration, since φ (the angle between the center of the cable in hand and the fixed end) is also part of the state. This can be observed in the middle row of Fig. 8, where the cable is straight and close to the middle of the GelSight sensor, and the controller output points toward the center of the trajectory. This behavior is an advantage of the LQR controller over the P controller.
VI-B Generalization to Different Velocities
The model of the cable-gripper dynamics is fit with data collected at a velocity of 0.025 m/s. We also test the LQR controller at 0.045 and 0.065 m/s. The results in the second row of Fig. 9 show that performance does not degrade, except that more regrasps are required per unit of distance traveled: moving faster, the system has less time to react to sudden pose changes and therefore tends to trigger regrasps more often. Although the number of regrasps increases with velocity, the total time is still shorter due to the faster motion.
VI-C Generalization to Different Cables
We also test the system with the LQR controller on the 5 cables, each with different physical properties (diameter, material, stiffness). In these experiments, the system generalizes well and follows 98.2% of the total length of the cables.
Comparing performance across cables shows that cable 2 (thin, light nylon rope) requires the most regrasps: its in-hand pose is difficult to adjust because the rope is very light, so the un-followed part of the cable tends to move with the gripper. The best-performing cable is cable 3 (thin, stiff rubber USB cable), which remains locally straight most of the time.
VII Conclusions and Discussion
In this paper, we present a perception and control framework to tackle the cable following task. The main contributions of the work are:
Tactile Perception We use the GelSight sensor to estimate the pose of the cable and the approximate friction force during sliding, and to provide the tactile signal quality, all in real time. These tactile features are relevant to many other tactile control tasks.
Gripper Design A reactive gripper with compliant parallel-guiding and slider-crank mechanisms enables grasping-force modulation at 60 Hz.
Cable Grip Controller A PD controller and leaky integrator allow the gripper to maintain adequate friction and good tactile signal quality. The controller trades off gripping force to yield good tactile imprints and smooth sliding.
Cable Pose Controller A data-fitted linear model describes the cable-gripper dynamics by projecting the task onto a planar pulling problem, and an LQR controller regulates the cable to the center of the finger during sliding. The successful implementation of this model-based controller in the cable following task, and its adaptation to different cables and following velocities, demonstrates that simple models and controllers can manipulate deformable objects when paired with good tactile feedback.
There are several aspects of the system that can be improved: 1) the frequency of the tactile signal (30 Hz) and of the gripper control loop (60 Hz) can potentially be raised to 90 Hz and 200 Hz, respectively; 2) we observe that it is difficult to pull the cable back once it falls on the edge of the finger, because of the convex surface of the GelSight sensor, so the finger-sensor shape could be optimized to improve performance; 3) it would be interesting to explore other models and controllers: model-based reinforcement learning with a more complex function approximator could be a good fit for capturing the cable-gripper dynamics more accurately. The perception and control frameworks proposed here may enable more complex robotic tasks.
-  W. Yuan, S. Dong, and E. Adelson, “Gelsight: High-resolution robot tactile sensors for estimating geometry and force,” Sensors, vol. 17, no. 12, p. 2762, 2017.
-  J. R. White, P. E. Satterlee Jr, K. L. Walker, and H. W. Harvey, “Remotely controlled and/or powered mobile robot with cable management arrangement,” US Patent 4,736,826, Apr. 1988.
-  J. E. Hopcroft, J. K. Kearney, and D. B. Krafft, “A case study of flexible object manipulation,” The International Journal of Robotics Research, vol. 10, no. 1, pp. 41–50, 1991.
-  T. Morita, J. Takamatsu, K. Ogawara, H. Kimura, and K. Ikeuchi, “Knot planning from observation,” in 2003 IEEE International Conference on Robotics and Automation (Cat. No. 03CH37422), vol. 3. IEEE, 2003, pp. 3887–3892.
-  H. Mayer, F. Gomez, D. Wierstra, I. Nagy, A. Knoll, and J. Schmidhuber, “A system for robotic heart surgery that learns to tie knots using recurrent neural networks,” Advanced Robotics, vol. 22, no. 13-14, pp. 1521–1537, 2008.
-  J. Zhu, B. Navarro, P. Fraisse, A. Crosnier, and A. Cherubini, “Dual-arm robotic manipulation of flexible cables,” in 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018, pp. 479–484.
-  A. Nair, D. Chen, P. Agrawal, P. Isola, P. Abbeel, J. Malik, and S. Levine, “Combining self-supervised learning and imitation for vision-based rope manipulation,” in 2017 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2017, pp. 2146–2153.
-  S. Kudoh, T. Gomi, R. Katano, T. Tomizawa, and T. Suehiro, “In-air knotting of rope by a dual-arm multi-finger robot,” in 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2015, pp. 6202–6207.
-  X. Li, X. Su, Y. Gao, and Y.-H. Liu, “Vision-based robotic grasping and manipulation of usb wires,” in 2018 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2018, pp. 1–6.
-  G. Palli and S. Pirozzi, “A tactile-based wire manipulation system for manufacturing applications,” Robotics, vol. 8, no. 2, p. 46, 2019.
-  N. Chen, H. Zhang, and R. Rink, “Edge tracking using tactile servo,” in Proceedings 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems. Human Robot Interaction and Cooperative Robots, vol. 2. IEEE, 1995, pp. 84–89.
-  J. Baeten and J. De Schutter, “Hybrid vision/force control at corners in planar robotic-contour following,” IEEE/ASME Transactions on mechatronics, vol. 7, no. 2, pp. 143–151, 2002.
-  B. Ward-Cherrier, N. Pestell, L. Cramphorn, B. Winstone, M. E. Giannaccini, J. Rossiter, and N. F. Lepora, “The tactip family: Soft optical tactile sensors with 3d-printed biomimetic morphologies,” Soft robotics, vol. 5, no. 2, pp. 216–227, 2018.
-  R. B. Hellman, C. Tekin, M. van der Schaar, and V. J. Santos, “Functional contour-following via haptic perception and reinforcement learning,” IEEE transactions on haptics, vol. 11, no. 1, pp. 61–72, 2017.
-  E. Donlon, S. Dong, M. Liu, J. Li, E. Adelson, and A. Rodriguez, “Gelslim: A high-resolution, compact, robust, and calibrated tactile-sensing finger,” in 2018 IEEE/RSJ IROS. IEEE, 2018, pp. 1927–1934.
-  A. Yamaguchi and C. G. Atkeson, “Combining finger vision and optical tactile sensing: Reducing and handling errors while cutting vegetables,” in 2016 IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids). IEEE, 2016, pp. 1045–1051.
-  Y. Zhang, Z. Kan, Y. A. Tse, Y. Yang, and M. Y. Wang, “Fingervision tactile sensor design and slip detection using convolutional lstm network,” arXiv preprint arXiv:1810.02653, 2018.
-  R. Li, R. Platt, W. Yuan, A. ten Pas, N. Roscup, M. A. Srinivasan, and E. Adelson, “Localization and manipulation of small parts using gelsight tactile sensing,” in Intelligent Robots and Systems (IROS 2014), 2014 IEEE/RSJ International Conference on. IEEE, 2014, pp. 3988–3993.
-  G. Izatt, G. Mirano, E. Adelson, and R. Tedrake, “Tracking objects with point clouds from vision and touch,” in 2017 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2017, pp. 4000–4007.
-  R. Calandra, A. Owens, D. Jayaraman, J. Lin, W. Yuan, J. Malik, E. H. Adelson, and S. Levine, “More than a feeling: Learning to grasp and regrasp using vision and touch,” IEEE Robotics and Automation Letters, vol. 3, no. 4, pp. 3300–3307, 2018.
-  F. R. Hogan, M. Bauza, O. Canal, E. Donlon, and A. Rodriguez, “Tactile regrasp: Grasp adjustments via simulated tactile transformations,” in 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018, pp. 2963–2970.
-  N. Pestell, L. Cramphorn, F. Papadopoulos, and N. F. Lepora, “A sense of touch for the shadow modular grasper,” IEEE Robotics and Automation Letters, vol. 4, no. 2, pp. 2220–2226, 2019.
-  S. Tian, F. Ebert, D. Jayaraman, M. Mudigonda, C. Finn, R. Calandra, and S. Levine, “Manipulation by feel: Touch-based control with deep predictive models,” arXiv preprint arXiv:1903.04128, 2019.
-  S. Dong, D. Ma, E. Donlon, and A. Rodriguez, “Maintaining grasps within slipping bound by monitoring incipient slip,” in IEEE ICRA, 2018.
-  H. Song, T. Bhattacharjee, and S. S. Srinivasa, “Sensing shear forces during food manipulation: Resolving the trade-off between range and sensitivity.”
-  S. Dong and A. Rodriguez, “Tactile-based insertion for dense box-packing,” in 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2019.
-  D. B. Modesitt, “Micro-gripper assembly,” US Patent 5,046,773, Sept. 1991.
-  L. L. Howell, Compliant mechanisms. John Wiley & Sons, 2001.