Publications
Deep Visuo-Tactile Learning: Estimation of Tactile Properties from Images (Extended Abstract)
IJCAI 2020
By: Kuniyuki Takahashi, Jethro Tan
Estimating tactile properties such as slipperiness or roughness from vision is important for interacting effectively with the environment. These tactile properties help humans, as well as robots, decide which actions to take and how to perform them. We therefore propose a model that estimates the degree of tactile properties (e.g., the level of slipperiness or roughness) from visual perception alone. Our method extends an encoder-decoder network in which the latent variables are visual and tactile features. In contrast to previous work, our method does not require manual labeling, but only RGB images and the corresponding tactile sensor data. All our data are collected with a webcam and a tactile sensor mounted on the end-effector of a robot that strokes the material surfaces. We show that our model generalizes to materials not included in the training data.
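The sketch below illustrates the kind of encoder-decoder described in the abstract: a latent vector split into visual and tactile parts, trained to reconstruct the image input while predicting the paired tactile signal, so no manual labels are needed. It is a minimal, hypothetical illustration, not the authors' released code; the use of PyTorch, the layer sizes, the feature dimensions, and the loss terms are all assumptions for demonstration.

```python
# Minimal sketch (assumed architecture, not the paper's actual implementation)
# of an encoder-decoder whose latent variables hold visual and tactile features.
import torch
import torch.nn as nn

class VisuoTactileAE(nn.Module):
    def __init__(self, image_dim=1024, tactile_dim=32, latent_dim=16):
        super().__init__()
        # Encoder: image features -> latent vector split into a
        # "visual" half and a "tactile" half (dimensions are illustrative).
        self.encoder = nn.Sequential(
            nn.Linear(image_dim, 256), nn.ReLU(),
            nn.Linear(256, 2 * latent_dim),
        )
        # Decoder: full latent vector -> reconstructed image features.
        self.decoder = nn.Sequential(
            nn.Linear(2 * latent_dim, 256), nn.ReLU(),
            nn.Linear(256, image_dim),
        )
        # Head mapping the tactile part of the latent to the tactile
        # sensor reading recorded while the robot strokes the surface.
        self.tactile_head = nn.Linear(latent_dim, tactile_dim)

    def forward(self, image_feat):
        z = self.encoder(image_feat)
        z_visual, z_tactile = z.chunk(2, dim=-1)
        recon = self.decoder(torch.cat([z_visual, z_tactile], dim=-1))
        tactile_pred = self.tactile_head(z_tactile)
        return recon, tactile_pred, z_tactile

# Training pairs each RGB image with its tactile signal; no manual labels.
model = VisuoTactileAE()
image_feat = torch.randn(8, 1024)   # stand-in for image features
tactile = torch.randn(8, 32)        # stand-in for tactile sensor data
recon, tactile_pred, _ = model(image_feat)
loss = nn.functional.mse_loss(recon, image_feat) + \
       nn.functional.mse_loss(tactile_pred, tactile)
loss.backward()
```

At test time, only the encoder is needed: an image is mapped to its latent tactile features, from which the degree of a tactile property can be read off without touching the material.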