Tactile perception plays a crucial role in modern robotics, opening new frontiers in human-robot interaction and significantly increasing the environmental awareness of autonomous robots. In addition to visual estimation, humans and animals actively use tactile sensing in their skin and muscles to maintain balance and perform various agile motions [1, 2]. In the field of legged robot locomotion, however, most attention has been directed toward visual feedback systems, for instance, the laser range finder applied to surface adaptation by Plagemann et al. [3], the stereo-vision system proposed by Sabe et al. [4], or the infrared (IR) camera combined with ultrasound sensors proposed by Chen et al. [5].
Several works estimate the surface for legged robot locomotion through evaluation of joint position [6]. Camurri et al. [7] developed the Pronto state estimator for legged robots, which can integrate pose corrections from RGB camera, LIDAR, and odometry feedback. Sarkisov et al. [8] introduced a novel landing gear that allows surface profile estimation based on foot-pad IMU orientation and joint angles. Zhang et al. [9] explored vision-based estimation of tactile patterns by designing a robotic skin with a painted inner surface and installing a camera inside the robot leg. Smith et al. [10]
suggested coupling data from foot contact sensors and an Inertial Measurement Unit (IMU) to teach a quadruped robot locomotion skills via Reinforcement Learning. A hybrid tactile sensor-based system, proposed by Luneckas et al. [11], is used in hexapod walking robots to overcome obstacles. The sensor combines a limit switch and flexible polypropylene material connected to the foot by silicone, allowing the robot to detect solid ground obstacles. Legged robots currently use direct feedback from the environment, such as sonar, vision, LIDAR, and force feedback from joint actuators. Tactile sensors have recently been applied to expand the awareness of collaborative robots of their environment through feedback from a skin-like surface. In the case of a legged robot, such sensors may be placed beneath the robot's feet to estimate the properties of the surface. Adding tactile sensing to the robot's feet can be beneficial for walking on challenging terrain, in the same way that haptic sensing plays an important role in animal locomotion in nature.
In this paper, we present the Touch Sensitive Foot (TSF), which recognizes the texture of the surface the robot walks on with the help of a trained CNN model. This research opens an efficient way to achieve environmental awareness for autonomous robots: the robot gait can be predetermined, and the robot can walk on unknown terrain.
II Related Works
The concept of haptic perception in robotic systems has been extensively applied in prototyping manipulators, mobile robots, underwater robots [12], and drones. Several approaches to surface estimation have been proposed, including force and tactile sensors integrated into the joints [13], surface, and inner structures [14] of robotic limbs. For example, Tsetserukou et al. [15] introduced a whole-sensitive robotic arm with optical torque sensors embedded in its joints. Contact force detection and control for a robotic arm by joint torque sensors were investigated by Dong et al. [16]. The proposed methods allow robots to efficiently estimate contact with surfaces; however, joint sensors cannot measure fine details of the surface texture.
To estimate the distribution of forces during contact with the environment, a higher number of sensors should be embedded in robotic limbs. A pressure-sensitive skin that can be adapted to complex geometries was introduced by Fritzsche et al. [17] for safe human-robot interaction. This concept was further explored by Cheng et al. [18], who presented a humanoid robot with a sensor array on its surface. The system was extended by Guadarrama-Olvera et al. [19], who placed a low-resolution robot skin on the soles of a bipedal robot to reconstruct the supporting polygon and the pressure footprint online. A two-layer design of artificial robotic skin was suggested by Klimaszewski et al. [20], which allows measuring the location, value, and direction of the pressure from an external force. Liu et al. [21] developed a large-scale artificial sensitive skin for robots based on electrical impedance tomography. Dilibal et al. [22] produced soft sensors for robotic grippers by a screen-printing process with flexible materials and ionic liquids.
The aforementioned approaches make it possible to cover large areas with sensors; however, most of them lack the resolution necessary for placement on the feet of a quadruped robot. A few sensors have been developed to provide high-resolution tactile data to robot limbs. For example, a small thumb-sized vision-based sensor developed by Sun et al. [23] provides data with a spatial resolution of 0.4 mm and can be applied to dexterous manipulation. To obtain a high-resolution tilt estimate for a mobile charging robot, Okunevich et al. [24] proposed evaluating tactile patterns. The developed vision-tactile perception system allows precise positioning of the charger end-effector and ensures a reliable connection between the electrodes of the two robots.
This paper presents a novel perception system for quadruped robots with CNN-based texture recognition from the data of a high-resolution tactile sensor array embedded in the sole of a robotic foot. We evaluated system performance on eight 3D-printed surface samples. The proposed approach aims to improve the navigation and environmental awareness of quadruped robots through special patterns placed on the surface, and potentially to support adaptive robot locomotion.
III DogTouch System Overview
All components of the system can be divided into three main modules, as shown in Fig. 2: the Touch Sensitive Foot (TSF) with a tactile sensor array, an ESP32 microcontroller, and a CNN model running on an NVIDIA Jetson Nano to classify textures.
The system works as follows: the ESP32 reads the tactile sensor array to detect whether or not the sole has touched the ground. When contact occurs, the ESP32 obtains the data matrix from the tactile sensor array and sends it to the CNN model running on the Jetson Nano computer. The CNN model has been trained to recognize the surface textures. Once the robot is aware of the texture type, it can localize itself (provided that the patterns are laid out in a known configuration on the floor or pavement) and optimize its gait to avoid slippage.
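The contact-triggered readout described above can be sketched as follows. This is a minimal illustration, not the actual firmware: the threshold value and the `send_to_classifier` stub are assumptions.

```python
import numpy as np

CONTACT_THRESHOLD = 50.0  # assumed total-pressure threshold indicating ground contact


def has_contact(frame: np.ndarray) -> bool:
    """Decide whether the sole touches the ground from one 10x10 pressure frame."""
    return float(frame.sum()) > CONTACT_THRESHOLD


def send_to_classifier(frame: np.ndarray) -> None:
    """Stub for forwarding the data matrix to the CNN running on the Jetson Nano."""
    print("forwarding frame of shape", frame.shape)


def process_frame(frame: np.ndarray) -> bool:
    """Forward the frame only when contact is detected; return True if it was sent."""
    if has_contact(frame):
        send_to_classifier(frame)
        return True
    return False


# In the air: near-zero readings, nothing is forwarded. On the ground: forwarded.
air = np.zeros((10, 10))
ground = np.full((10, 10), 1.0)  # 100 taxels x 1.0 = 100 > threshold
print(process_frame(air), process_frame(ground))  # False True
```

In the real system the same gate keeps the classifier idle during the swing phase of the leg, so only frames taken under load reach the network.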
III-A Leg Design of Quadruped Robot
We have developed a unique customizable leg for a quadruped robot. The leg was designed to decrease inertia, which is critical for stable and efficient locomotion. 3D-printed and carbon fiber parts were used to fabricate the robotic legs, giving them not only a lightweight structure but also high strength. Each leg has 3 degrees of freedom: a hip joint, an upper leg joint, and a lower leg joint. The joints are actuated by RDS5160 SSG high-torque digital servo motors with 7 N·m maximum torque; each servo motor is driven at 8.4 V with a 2.5 A maximum current. The TSF was 3D-printed from TPU (thermoplastic polyurethane), a material flexible yet strong enough for walking on harsh terrain. The tactile sensor (see III-B) was installed in the sole as shown in Fig. 3.
Once the foot touches the ground, the tactile sensor data is used to recognize surface texture with the CNN model.
III-B Embedded Tactile Sensor Array
The TSF relies on the high-resolution tactile sensor array proposed by Yem et al. [25]. The sensor is integrated into the soft sole of the robotic leg to provide high-resolution perception of the surface texture, allowing the quadruped robot to collect detailed data on the textured surface. It senses a maximum contact area of 5.8 cm² with a resolution of 100 points per frame at a sensing frequency of 120 frames per second. The sensors allow the system to precisely detect the pressure on small surface protrusions, with a force detection range from 1 N to 9 N.
III-C CNN Model for Tactile Perception
The model receives the tactile sensor data as a three-dimensional matrix with the shape of 1×10×10 (a single channel of 10×10 pressure readings). The resolution of the data is relatively low in comparison with high-resolution camera frames or point cloud datasets. Therefore, our architecture does not include max-pooling layers or strided convolutions, in order to preserve the information. Such additions to the neural network architecture could be considered in the future with a higher number of tactile sensors or larger areas covered by tactile arrays. After the convolutional layers, the data, with the shape of 128×10×10, was flattened to a one-dimensional vector with 12800 elements. Finally, after three linear layers with output dimensions of 256, 128, and 8 (the number of texture types), the model outputs the predictions for each class for all inputs in the batch.
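Under the shapes above (10×10 frames, 128 feature channels before flattening to 12800 elements), the forward pass can be sketched in plain NumPy. The 3×3 kernel size, the 64-channel intermediate layer, and the random weights are illustrative assumptions, not the trained model:

```python
import numpy as np


def conv2d_same(x, w):
    """'Same'-padded 3x3 convolution + ReLU: (C_in,H,W) x (C_out,C_in,3,3) -> (C_out,H,W).

    No stride, no pooling, so the 10x10 spatial resolution is preserved.
    """
    c_out, c_in, k, _ = w.shape
    _, h, wid = x.shape
    xp = np.pad(x, ((0, 0), (1, 1), (1, 1)))
    out = np.zeros((c_out, h, wid))
    for i in range(h):
        for j in range(wid):
            patch = xp[:, i:i + 3, j:j + 3]  # (C_in, 3, 3) window
            out[:, i, j] = np.tensordot(w, patch, axes=([1, 2, 3], [0, 1, 2]))
    return np.maximum(out, 0.0)


rng = np.random.default_rng(0)
frame = rng.random((1, 10, 10))                  # one tactile frame, shape 1x10x10
w1 = rng.standard_normal((64, 1, 3, 3)) * 0.1    # assumed intermediate channel count
w2 = rng.standard_normal((128, 64, 3, 3)) * 0.1  # 128 channels before flattening

feat = conv2d_same(conv2d_same(frame, w1), w2)   # (128, 10, 10)
vec = feat.reshape(-1)                           # 128 * 10 * 10 = 12800 elements

fc1 = rng.standard_normal((256, 12800)) * 0.01   # linear layers: 256 -> 128 -> 8
fc2 = rng.standard_normal((128, 256)) * 0.1
fc3 = rng.standard_normal((8, 128)) * 0.1
logits = fc3 @ np.maximum(fc2 @ np.maximum(fc1 @ vec, 0.0), 0.0)
print(vec.size, logits.shape)  # 12800 (8,)
```

Because no layer reduces the spatial dimensions, the flattened size is exactly 128 × 10 × 10 = 12800, matching the first linear layer's input.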
IV Texture Recognition Experiment
The experiment was conducted with eight different textured patterns shown in Fig. 5.
The following textured patterns were selected: diagonal lines of 1 mm width with a 5 mm interval (Fig. 5a), dots of 1 mm diameter with a 1 mm interval (Fig. 5b), vertical lines of 1 mm width with a 5 mm interval (Fig. 5c), dots of 3 mm diameter with a 1 mm interval (Fig. 5d), dots of 5 mm diameter with a 1 mm interval (Fig. 5e), a grid with a 5 mm interval (Fig. 5f), dots of 1 mm diameter with a 5 mm interval (Fig. 5g), and cylinders of 3 mm diameter with a 1 mm interval (Fig. 5h). A sample of each ground surface pattern was 3D-printed from PLA with a size of 50×50 mm. The selected patterns vary in the profile and resolution of their textures. In this research, we hypothesized that the proposed sensor array would reveal noticeable differences in the sensor readings for a change as small as 2 mm in the size of a texture element.
IV-A Dataset Collection
For training and validation of the CNN-based classification model, we collected a dataset of 800 data arrays from the tactile sensor (100 data arrays for each of the 8 textured patterns). The dataset was divided into a training part (90%) and a validation part (10%). The TSF stepped on each textured plate 100 times, each time at a different angle. Whenever contact occurred between the robotic leg and the pattern, the system recorded the array into the dataset.
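The 90/10 split described above can be sketched as follows; the frames here are random placeholders standing in for the recorded tactile arrays:

```python
import numpy as np

rng = np.random.default_rng(42)

# Placeholder dataset: 100 frames for each of the 8 patterns (800 in total).
frames = rng.random((800, 10, 10))
labels = np.repeat(np.arange(8), 100)  # class id per frame

# Shuffle, then take 90% for training and 10% for validation.
order = rng.permutation(800)
split = int(0.9 * 800)  # 720 training samples, 80 validation samples
train_idx, val_idx = order[:split], order[split:]

x_train, y_train = frames[train_idx], labels[train_idx]
x_val, y_val = frames[val_idx], labels[val_idx]
print(len(x_train), len(x_val))  # 720 80
```

Shuffling before splitting keeps all eight patterns represented in both parts, since the raw recordings are grouped by pattern.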
IV-B Experimental Results
The results of the CNN training are shown in Fig. 6.
The training was conducted on a computer with an NVIDIA Tesla V100 GPU. The validation accuracy of our trained CNN-based model is 74.37%; after 12 epochs of training, the accuracy no longer changes. The learning time was 12.6 for the CNN-based model. From the experiment, we conclude that predictions for the larger spheres coincide with those for the cylindrical pattern. Line patterns demonstrated a high prediction rate of 90%.
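Per-pattern results such as the sphere/cylinder confusion noted above can be read off a confusion matrix of validation predictions. A minimal sketch with made-up labels (three classes instead of eight, purely for illustration):

```python
import numpy as np


def confusion_matrix(y_true, y_pred, n_classes):
    """Rows index the true class, columns the predicted class."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm


# Toy example: class 2 is always mistaken for class 1 (cf. spheres vs cylinders).
y_true = np.array([0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 0, 1, 1, 1, 1])

cm = confusion_matrix(y_true, y_pred, n_classes=3)
per_class_acc = cm.diagonal() / cm.sum(axis=1)   # recall per texture class
overall_acc = cm.diagonal().sum() / cm.sum()     # overall validation accuracy
print(per_class_acc, round(float(overall_acc), 4))  # [1. 1. 0.] 0.6667
```

The off-diagonal entry cm[2, 1] makes the systematic confusion between two specific classes visible, which a single overall accuracy number hides.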
V Conclusions and Future Work
A novel quadruped robot, DogTouch, was developed to leverage tactile sensing in the robotic leg for surface detection. The proposed CNN-driven tactile perception system, using data from tactile pressure sensors, recognizes different textured patterns under the foot of the quadruped robot in 74.37% of cases on average. The highest prediction rate of 90% was achieved for line texture patterns. The neural network thus has sufficient accuracy for texture recognition. Each sample introduced to the network is part of a particular surface type with a certain texture. Cases where a sample contains several types of texture were not taken into account, although they are possible during a continuous walk over an area with a mixture of textures. With some modifications to the neural network presented in the paper (e.g., separating samples with texture mixes into additional classes, or segmenting textures by type and adding dropout layers to reduce overfitting), higher texture prediction accuracy could be achieved in the future.
The proposed DogTouch technology can potentially improve the robustness of legged robot navigation considerably, regardless of lighting conditions. Leveraging the sense of touch, such robots can navigate in unknown environments by reading the information encoded in tactile paving surfaces. Additionally, robots will be able to adapt their gait to the detected type of surface to avoid slippage.
-  Albertsen, I. M., Temprado, J. J., & Berton, E. (2010). “Effect of haptic supplementation on postural stabilization: A comparison of fixed and mobile support conditions,” Human movement science, 29(6), 999–1010. https://doi.org/10.1016/j.humov.2010.07.013
-  Martinelli, A. R., Coelho, D. B., Magalhães, F. H., Kohn, A. F., & Teixeira, L. A. (2015). “Light touch modulates balance recovery following perturbation: from fast response to stance restabilization,” Experimental brain research, 233(5), 1399–1408. https://doi.org/10.1007/s00221-015-4214-z
-  C. Plagemann, S. Mischke, S. Prentice, K. Kersting, N. Roy and W. Burgard, “Learning predictive terrain models for legged robot locomotion,” 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2008, pp. 3545-3552, doi: 10.1109/IROS.2008.4651026.
-  K. Sabe, M. Fukuchi, J.-S. Gutmann, T. Ohashi, K. Kawamoto and T. Yoshigahara, “Obstacle avoidance and path planning for humanoid robots using stereo vision,” IEEE International Conference on Robotics and Automation, 2004. Proceedings. ICRA ’04. 2004, 2004, pp. 592-597 Vol.1, doi: 10.1109/ROBOT.2004.1307213.
-  Chen, W., Ren, G., Wang, J. et al. “An adaptive locomotion controller for a hexapod robot: CPG, kinematics and force feedback,” Sci. China Inf. Sci. 57, 1–18 (2014). https://doi.org/10.1007/s11432-014-5148-y
-  “Virtual Sensors for Walking Robots,” In: Quadrupedal Locomotion. Springer, London. https://doi.org/10.1007/1-84628-307-8_8
-  M. Camurri, M. Ramezani, S. Nobili, and M. Fallon, “Pronto: A Multi-Sensor State Estimator for Legged Robots in Real-World Scenarios,” Frontiers in Robotics and AI, vol. 7, Jun. 2020, doi: 10.3389/frobt.2020.00068.
-  Y. S. Sarkisov, G. A. Yashin, E. V. Tsykunov and D. Tsetserukou, “DroneGear: A Novel Robotic Landing Gear With Embedded Optical Torque Sensors for Safe Multicopter Landing on an Uneven Surface,” in IEEE Robotics and Automation Letters, vol. 3, no. 3, pp. 1912-1917, July 2018, doi: 10.1109/LRA.2018.2806080.
-  G. Zhang, Y. Du, Y. Zhang and M. Y. Wang, “A Tactile Sensing Foot for Single Robot Leg Stabilization,” 2021 IEEE International Conference on Robotics and Automation (ICRA), 2021, pp. 14076-14082, doi: 10.1109/ICRA48506.2021.9560967.
-  Smith, L., Kew, J.C., Peng, X.B., Ha, S., Tan, J., & Levine, S. (2021). “Legged Robots that Keep on Learning: Fine-Tuning Locomotion Policies in the Real World,” ArXiv, abs/2110.05457.
-  Luneckas, M., Luneckas, T., Udris, D. et al. “A hybrid tactile sensor-based obstacle overcoming method for hexapod walking robots,” Intel Serv Robotics 14, 9–24 (2021). https://doi.org/10.1007/s11370-020-00340-9
-  Subad, Rafsan A.S.I., Liam B. Cross, and Kihan Park. 2021. “Soft Robotic Hands and Tactile Sensors for Underwater Robotics,” Applied Mechanics 2, no. 2: 356-382. https://doi.org/10.3390/applmech202002
-  Grossard, Mathieu, Javier Martin, and Benoît Huard. 2015. “Force-Sensing Actuator with a Compliant Flexure-Type Joint for a Robotic Manipulator,” Actuators 4, no. 4: 281-300. https://doi.org/10.3390/act4040281
-  “Muscle sensor controls robot arm,” Nature 517, 124 (2015). https://doi.org/10.1038/517124b
-  D. Tsetserukou, R. Tadakuma, H. Kajimoto, N. Kawakami and S. Tachi, “Intelligent Variable Joint Impedance Control and Development of a New Whole-Sensitive Anthropomorphic Robot Arm,” 2007 International Symposium on Computational Intelligence in Robotics and Automation, 2007, pp. 338-343, doi: 10.1109/CIRA.2007.382885
-  Dong, Y., Ren, T., Hu, K. et al. “Contact force detection and control for robotic polishing based on joint torque sensors,” Int J Adv Manuf Technol 107, 2745–2756 (2020). https://doi.org/10.1007/s00170-020-05162-8
-  M. Fritzsche, N. Elkmann and E. Schulenburg, “Tactile sensing: A key technology for safe physical human robot interaction,” 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 2011, pp. 139-140, doi: 10.1145/1957656.1957700.
-  G. Cheng, E. Dean-Leon, F. Bergner, J. Rogelio Guadarrama Olvera, Q. Leboutet and P. Mittendorfer, “A Comprehensive Realization of Robot Skin: Sensors, Sensing, Control, and Applications,” in Proceedings of the IEEE, vol. 107, no. 10, pp. 2034-2051, Oct. 2019, doi: 10.1109/JPROC.2019.2933348.
-  J. Rogelio Guadarrama-Olvera, F. Bergner, E. Dean and G. Cheng, “Enhancing Biped Locomotion on Unknown Terrain Using Tactile Feedback,” 2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids), 2018, pp. 1-9, doi: 10.1109/HUMANOIDS.2018.8625024.
-  J. Klimaszewski, D. Janczak, and P. Piorun. “Tactile Robotic Skin with Pressure Direction Detection,” Sensors. 2019; 19(21):4697. https://doi.org/10.3390/s19214697
-  Liu, K., Wu, Y., Wang, S., Wang, H., Chen, H., Chen, B. and Yao, J. (2020), “Artificial Sensitive Skin for Robotics Based on Electrical Impedance Tomography,” Adv. Intell. Syst., 2: 1900161. https://doi.org/10.1002/aisy.201900161
-  Dilibal, S., Sahin, H., Danquah, J.O. et al. “Additively Manufactured Custom Soft Gripper with Embedded Soft Force Sensors for an Industrial Robot,” Int. J. Precis. Eng. Manuf. 22, 709–718 (2021). https://doi.org/10.1007/s12541-021-00479-0
-  Sun, H., Kuchenbecker, K.J. & Martius, G. “A soft thumb-sized vision-based sensor with accurate all-round force perception,” Nat Mach Intell 4, 135–145 (2022). https://doi.org/10.1038/s42256-021-00439-3
-  I. Okunevich, D. Trinitatova, P. Kopanev and D. Tsetserukou, “MobileCharger: an Autonomous Mobile Robot with Inverted Delta Actuator for Robust and Safe Robot Charging,” 2021 26th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), 2021, pp. 1-8, doi: 10.1109/ETFA45728.2021.9613366.
-  V. Yem and H. Kajimoto, “Wearable tactile device using mechanical and electrical stimulation for fingertip interaction with virtual world,” 2017 IEEE Virtual Reality (VR), 2017, pp. 99-104, doi: 10.1109/VR.2017.7892236.