Paper: Fast geometry-based computation of grasping points on three-dimensional point clouds

Title: Fast geometry-based computation of grasping points on three-dimensional point clouds
Authors: Brayan S. Zapata-Impata, Pablo Gil, Jorge Pomares, Fernando Torres
Journal: International Journal of Advanced Robotic Systems, January-February 2019: 1–18, https://doi.org/10.1177/1729881419831846
Abstract: Industrial and service robots deal with the complex task of grasping objects that have different shapes and which are seen from diverse points of view. In order to autonomously perform grasps, the robot must calculate where to place its robotic hand to ensure that the grasp is stable. We propose a method to find the best pair of grasping points given a three-dimensional point cloud with the partial view of an unknown object. We use a set of straightforward geometric rules to explore the cloud and propose grasping points on the surface of the object. We then adapt the pair of contacts to a multi-fingered hand used in experimentation. We prove that, after performing 500 grasps of different objects, our approach is fast, taking an average of 17.5 ms to propose contacts, while attaining a grasp success rate of 85.5%. Moreover, the method is sufficiently flexible and stable to work with objects in changing environments, such as those confronted by industrial or service robots.
Download paper
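
As an illustration of the kind of geometry-based contact search described in the abstract, here is a minimal sketch (not the authors' implementation): it randomly samples point pairs on a partial cloud and ranks them by how well their surface normals oppose each other along the line joining them, under an assumed maximum gripper opening. The scoring function, the sampling strategy and the `max_width` parameter are assumptions made purely for illustration.

```python
# Minimal, hypothetical sketch of a geometry-based antipodal grasp search on a
# partial point cloud. It is NOT the paper's algorithm, only an illustration of
# the general idea: rank point pairs by how well their surface normals oppose
# each other along the line that joins them, subject to a gripper opening limit.
import numpy as np


def antipodal_score(p1, n1, p2, n2):
    """Higher is better: normals anti-parallel and aligned with the axis p1->p2."""
    axis = p2 - p1
    dist = np.linalg.norm(axis)
    if dist < 1e-6:
        return -np.inf, dist
    axis /= dist
    # n1 should point roughly along +axis, n2 roughly along -axis (outward normals).
    alignment = np.dot(n1, axis) * np.dot(n2, -axis)
    opposition = -np.dot(n1, n2)          # 1.0 when normals are exactly opposed
    return alignment + opposition, dist


def propose_grasp(points, normals, max_width=0.08, n_samples=2000, rng=None):
    """Return the indices (i, j) of the best-scoring contact pair, or None."""
    rng = np.random.default_rng() if rng is None else rng
    idx = rng.integers(0, len(points), size=(n_samples, 2))
    best, best_pair = -np.inf, None
    for i, j in idx:
        score, dist = antipodal_score(points[i], normals[i], points[j], normals[j])
        if dist <= max_width and score > best:
            best, best_pair = score, (i, j)
    return best_pair


if __name__ == "__main__":
    # Toy example: a noisy sphere of 3 cm radius with outward normals.
    pts = np.random.randn(5000, 3)
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)
    nrm = pts.copy()
    pts *= 0.03
    print("contact pair indices:", propose_grasp(pts, nrm, max_width=0.08))
```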

6ème journée mobilité innovante – Cooperative robotics for material flow: conveying, transfer, manipulation

This year, LabEx IMobS3, the ViaMéca competitiveness cluster, the I-SITE CAP 20-25 and the regional cluster Coboteam chose to devote their event to innovative robotics solutions for material flow.

On the user side, productive environments are particularly targeted: industry (manufacturing, food, pharmaceuticals…) as well as agriculture, construction and mining. Logistics, order preparation and distribution are also concerned, as is more specific intralogistics (hospitals, services, etc.). Users will report on successful deployments and integrations, and their feedback will motivate new initiatives. Technological (technical barriers, innovation), economic (return on investment, Robotics as a Service…) and human (acceptance, change management…) aspects can all be addressed. Solution and technology providers, design offices, engineers and integrators, and laboratories will showcase their innovative know-how with respect to the state of the art.

Our colleagues Juan Antonio Corrales and Miguel Aranda presented several works on collaborative robotics with industrial applications, carried out within the framework of the COMMANDIA project, at the “6ème journée mobilité innovante”.

Date: February 7, 2019
Location: Cézeaux Campus, Clermont-Ferrand, France.

Paper: Learning Spatio-Temporal Tactile Features with a ConvLSTM for the Direction of Slip Detection

Title: Learning Spatio-Temporal Tactile Features with a ConvLSTM for the Direction of Slip Detection
Authors: Brayan S. Zapata-Impata, Pablo Gil, Fernando Torres
Journal: Sensors 2019, 19(3), 523; https://doi.org/10.3390/s19030523
Abstract: Robotic manipulators have to constantly deal with the complex task of detecting whether a grasp is stable or, in contrast, whether the grasped object is slipping. Recognising the type of slippage—translational, rotational—and its direction is more challenging than detecting only stability, but is simultaneously of greater use as regards correcting the aforementioned grasping issues. In this work, we propose a learning methodology for detecting the direction of a slip (seven categories) using spatio-temporal tactile features learnt from one tactile sensor. Tactile readings are, therefore, pre-processed and fed to a ConvLSTM that learns to detect these directions with just 50 ms of data. We have extensively evaluated the performance of the system and have achieved relatively high results at the detection of the direction of slip on unseen objects with familiar properties (82.56% accuracy).
Paper download
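
For readers who want a concrete picture of the kind of model named in the abstract, the following is a minimal, hypothetical ConvLSTM sketch in Keras for 7-class slip-direction classification from short sequences of tactile images. The sequence length, tactile image size, filter counts and optimiser are illustrative assumptions, not the configuration reported in the paper.

```python
# Minimal, hypothetical ConvLSTM sketch for 7-class slip-direction detection.
# Input shapes, filter counts and the optimiser are illustrative assumptions,
# not the configuration reported in the paper.
import tensorflow as tf

NUM_CLASSES = 7                 # slip-direction categories, per the abstract
SEQ_LEN = 5                     # assumed number of tactile frames covering ~50 ms
TACTILE_H, TACTILE_W = 12, 11   # assumed size of the rasterised tactile image


def build_model():
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(SEQ_LEN, TACTILE_H, TACTILE_W, 1)),
        # Learns spatio-temporal features over the sequence of tactile images.
        tf.keras.layers.ConvLSTM2D(32, kernel_size=(3, 3), padding="same",
                                   return_sequences=False),
        tf.keras.layers.BatchNormalization(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dropout(0.5),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


if __name__ == "__main__":
    build_model().summary()
```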

Project COMMANDIA in European Cooperation Day

On 21 September 2018, the municipality of Alicante organised an event for citizens and local actors to show what “European territorial cooperation” really means. To do so, Interreg funds were presented along with real European projects promoting cooperation in different fields. José Francisco Gómez (INESCOP) presented the COMMANDIA project during the event. European Cooperation Day is celebrated all over Europe and beyond on 21 September every year, promoting the achievements of cooperation among regions.

INESCOP in Futurmoda

INESCOP and RED 21 presented a new scanner at Futurmoda: under the concept of “Industrial digitalization and product conceptualization”, they presented a scanner for the computer-aided design of footwear soles at the Futurmoda fair, held at IFA (Alicante) on 17 and 18 October.
INESCOP has developed a set of CAD/CAM tools to meet the needs of the footwear sector in the design and manufacture of soles, increasing productivity and efficiency while reducing errors in the transfer of information between the different agents involved in the sole manufacturing process.

MOMAD 2018

MOMAD, an acronym for fashion in Madrid (MOda en MADrid), is a fair that usually takes place in September at the Madrid fair pavilion, Ifema. MOMAD is the largest fashion showcase in the Iberian Peninsula for the presentation of new collections, new brand and retail concepts, lifestyle and trends.

Stand of INESCOP at MOMAD

Under the umbrella brand ShoesRoom by MOMAD, footwear companies exhibit at the MOMAD Fashion and Accessories show. ShoesRoom also takes place in winter as a separate event, so that companies can present both Fall/Winter and Spring/Summer collections.

Paper: Non-Matrix Tactile Sensors: How Can Be Exploited Their Local Connectivity For Predicting Grasp Stability?

Title: Non-Matrix Tactile Sensors: How Can Be Exploited Their Local Connectivity For Predicting Grasp Stability?
Authors: Brayan S. Zapata-Impata, Pablo Gil, Fernando Torres
Publication: arXiv.org – arXiv:1809.05551
Abstract: Tactile sensors supply useful information during the interaction with an object that can be used for assessing the stability of a grasp. Most of the previous works on this topic processed tactile readings as signals by calculating hand-picked features. Some of them have processed these readings as images calculating characteristics on matrix-like sensors. In this work, we explore how non-matrix sensors (sensors with taxels not arranged exactly in a matrix) can be processed as tactile images as well. In addition, we prove that they can be used for predicting grasp stability by training a Convolutional Neural Network (CNN) with them. We captured over 2500 real three-fingered grasps on 41 everyday objects to train a CNN that exploited the local connectivity inherent on the non-matrix tactile sensors, achieving 94.2% F1-score on predicting stability.
Paper download
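
To illustrate the general idea of treating a non-matrix tactile sensor as an image, below is a hypothetical sketch: it interpolates pressures from irregularly placed taxels onto a regular grid with `scipy.interpolate.griddata` and feeds the resulting tactile image to a small CNN that outputs a stability probability. The taxel layout, grid size and network architecture are assumptions made for illustration, not the pipeline used in the paper.

```python
# Hypothetical sketch: rasterise readings from a non-matrix tactile sensor
# (taxels at irregular 2-D locations) into a tactile image, then classify
# grasp stability with a small CNN. Taxel layout, grid size and architecture
# are illustrative assumptions, not the paper's exact pipeline.
import numpy as np
from scipy.interpolate import griddata
import tensorflow as tf

# Assumed taxel coordinates (normalised to [0, 1]); a real sensor provides these.
TAXEL_XY = np.random.default_rng(0).uniform(0, 1, size=(24, 2))
GRID = 16  # side length of the interpolated tactile image


def to_tactile_image(pressures, taxel_xy=TAXEL_XY, grid=GRID):
    """Interpolate irregular taxel pressures onto a regular grid image."""
    xs, ys = np.meshgrid(np.linspace(0, 1, grid), np.linspace(0, 1, grid))
    img = griddata(taxel_xy, pressures, (xs, ys), method="linear", fill_value=0.0)
    return img.astype(np.float32)


def build_cnn(grid=GRID):
    """Small CNN that predicts grasp stability (stable vs. slipping)."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(grid, grid, 1)),
        tf.keras.layers.Conv2D(16, 3, activation="relu", padding="same"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])


if __name__ == "__main__":
    img = to_tactile_image(np.random.rand(24))
    model = build_cnn()
    prob_stable = model(img[None, ..., None]).numpy()[0, 0]
    print(f"predicted stability (untrained): {prob_stable:.2f}")
```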