Towards vision-based robotic skins: a data-driven, multi-camera tactile sensor

10/31/2019
by Camill Trueeb, et al.

This paper describes the design of a multi-camera optical tactile sensor that provides information about the contact force distribution applied to its soft surface. This information is contained in the motion of spherical particles spread within the surface, which deforms when subjected to force. The small embedded cameras capture images of the different particle patterns, which are then mapped to the three-dimensional contact force distribution through a machine learning architecture. The design proposed in this paper exhibits a larger contact surface and a thinner structure than most existing camera-based tactile sensors, without the use of additional reflecting components such as mirrors. A modular implementation of the learning architecture is discussed that facilitates scalability to larger surfaces such as robotic skins.
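A minimal sketch of the modular idea described above, under assumed details not taken from the paper: each embedded camera's particle image is processed by a small convolutional network whose weights are shared across cameras, and the resulting per-camera force patches are tiled into one force map over the surface. All layer sizes, names, and the PyTorch framework choice are illustrative assumptions, not the authors' architecture.

```python
# Hypothetical sketch of a modular multi-camera force-estimation model.
# Assumptions (not from the paper): grayscale 64x64 camera images, a shared
# per-camera CNN, and a 3-channel (Fx, Fy, Fz) force map per surface bin.
import torch
import torch.nn as nn

class PerCameraForceNet(nn.Module):
    """Maps one camera's particle-pattern image to a 3-channel force-map patch."""
    def __init__(self, out_resolution: int = 20):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(out_resolution),
            nn.Conv2d(32, 3, kernel_size=1),  # 3 force components per bin
        )

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        return self.encoder(image)

class MultiCameraSkin(nn.Module):
    """Applies one shared per-camera model and tiles the patches into a full map."""
    def __init__(self, cameras_per_row: int = 2, cameras_per_col: int = 2):
        super().__init__()
        self.rows, self.cols = cameras_per_row, cameras_per_col
        self.camera_net = PerCameraForceNet()  # weights shared by every camera

    def forward(self, images: torch.Tensor) -> torch.Tensor:
        # images: (batch, num_cameras, 1, H, W), num_cameras = rows * cols
        b, n, c, h, w = images.shape
        patches = self.camera_net(images.view(b * n, c, h, w))
        patches = patches.view(b, self.rows, self.cols, 3, *patches.shape[-2:])
        # Stitch the per-camera patches into one force map over the whole surface
        return patches.permute(0, 3, 1, 4, 2, 5).reshape(
            b, 3, self.rows * patches.shape[-2], self.cols * patches.shape[-1])

# Example: four cameras, each producing a 64x64 grayscale particle image
skin = MultiCameraSkin(cameras_per_row=2, cameras_per_col=2)
force_map = skin(torch.randn(1, 4, 1, 64, 64))
print(force_map.shape)  # torch.Size([1, 3, 40, 40])
```

Because the per-camera weights are shared, adding cameras to cover a larger surface only enlarges the tiling step, which is one way the modularity discussed in the abstract could support scaling toward robotic skins.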
