Paper: TactileGCN: A graph convolutional network for predicting grasp stability with tactile sensors

Author: A. Garcia-Garcia, B.S. Zapata-Impata, S. Orts-Escolano, P. Gil, J. García
Conference: International Joint Conference on Neural Networks (IJCNN), 14-19 July 2019
Abstract: Tactile sensors provide useful contact data during interaction with an object, which can be used to learn to accurately determine the stability of a grasp. Most works in the literature represent tactile readings as plain feature vectors or matrix-like tactile images and use them to train machine learning models. In this work, we explore an alternative way of exploiting tactile information to predict grasp stability by leveraging graph-like representations of tactile data, which preserve the actual spatial arrangement of the sensor's taxels and their locality. In our experiments, we trained a Graph Neural Network to classify grasps as stable or slippery. To train such a network and prove its predictive capabilities for the problem at hand, we captured a novel dataset of ~5,000 three-fingered grasps across 41 objects for training and 1,000 grasps with 10 unknown objects for testing. Our experiments show that this novel approach can be effectively used to predict grasp stability.
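The core idea of representing taxels as graph nodes and applying graph convolutions can be sketched as follows. This is a minimal illustration, not the paper's implementation: the 4x4 taxel grid, the 4-neighbour edge rule, and the feature/weight dimensions are all assumptions chosen for clarity, and the standard Kipf-and-Welling normalised propagation rule stands in for whatever GCN variant the authors use.

```python
import numpy as np

def grid_adjacency(rows, cols):
    """4-neighbour adjacency for taxels laid out on a grid (assumed layout)."""
    n = rows * cols
    A = np.zeros((n, n))
    for r in range(rows):
        for c in range(cols):
            i = r * cols + c
            if r + 1 < rows:                  # edge to taxel below
                j = (r + 1) * cols + c
                A[i, j] = A[j, i] = 1
            if c + 1 < cols:                  # edge to taxel to the right
                j = r * cols + (c + 1)
                A[i, j] = A[j, i] = 1
    return A

def gcn_layer(A, X, W):
    """One graph-convolution step: X' = D^-1/2 (A + I) D^-1/2 X W."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W

rng = np.random.default_rng(0)
A = grid_adjacency(4, 4)                      # 16 taxels, hypothetical sensor
X = rng.normal(size=(16, 1))                  # one pressure reading per taxel
W = rng.normal(size=(1, 8))                   # weights (random here, learned in practice)
H = gcn_layer(A, X, W)                        # node embeddings, shape (16, 8)
```

In a full model, node embeddings like `H` would be pooled into a graph-level vector and passed to a binary classifier (stable vs. slippery), with one such graph per fingertip sensor.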