Paper: Generation of tactile data from 3D vision and target robotic grasps

Title: Generation of tactile data from 3D vision and target robotic grasps
Author: B.S. Zapata-Impata, P. Gil, Y. Mezouar, F. Torres
Journal: IEEE Transactions on Haptics, July 2020


Abstract: Tactile perception is a rich source of information for robotic grasping: it allows a robot to identify a grasped object and assess the stability of a grasp, among other things. However, the tactile sensor must come into contact with the target object in order to produce readings. As a result, tactile data can only be obtained through real contact. We propose to overcome this restriction with a method that models the behaviour of a tactile sensor using 3D vision and grasp information as a stimulus. Our system regresses the quantified tactile response that would be experienced if the grasp were performed on the object. We experiment with 16 items and 4 tactile data modalities to show that our proposal learns this task with low error.
Paper at IEEE
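
To make the idea concrete, here is a minimal sketch of the regression task the paper describes: predicting a tactile sensor's taxel values from visual features of the object plus a grasp encoding. This is not the authors' architecture; the PyTorch MLP, the feature dimensions and the taxel count below are all illustrative assumptions.

```python
# Illustrative sketch (not the paper's architecture): an MLP regressing a
# tactile sensor's taxel values from an assumed point-cloud descriptor of
# the object plus a 6-DoF grasp encoding. All sizes are made-up examples.
import torch
import torch.nn as nn

N_CLOUD_FEATURES = 256   # assumed global descriptor of the object's point cloud
N_GRASP_PARAMS = 6       # assumed grasp encoding: position (3) + orientation (3)
N_TAXELS = 24            # assumed number of taxels on the tactile sensor

model = nn.Sequential(
    nn.Linear(N_CLOUD_FEATURES + N_GRASP_PARAMS, 128),
    nn.ReLU(),
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, N_TAXELS),  # predicted tactile response for this grasp
)

loss_fn = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step on a single (features, grasp, tactile reading) triple;
# random tensors stand in for real data.
cloud_feat = torch.randn(1, N_CLOUD_FEATURES)
grasp = torch.randn(1, N_GRASP_PARAMS)
target = torch.randn(1, N_TAXELS)

optimizer.zero_grad()
pred = model(torch.cat([cloud_feat, grasp], dim=1))
loss = loss_fn(pred, target)
loss.backward()
optimizer.step()
```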

Paper: Monocular visual shape tracking and servoing for isometrically deforming objects

Title: Monocular visual shape tracking and servoing for isometrically deforming objects
Author: Miguel Aranda, Juan Antonio Corrales Ramon, Youcef Mezouar, Adrien Bartoli, Erol Özgür
Conference: 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), October 25-29, 2020, Las Vegas, NV, USA (Virtual).

Abstract: We address the monocular visual shape servoing problem. This pushes the challenging visual servoing problem one step further from rigid object manipulation towards deformable object manipulation. Explicitly, it implies deforming the object towards a desired shape in 3D space by robots using monocular 2D vision. We specifically concentrate on a scheme capable of controlling large isometric deformations. Two important open subproblems arise for implementing such a scheme. (P1) Since it is concerned with large deformations, perception requires tracking the deformable object’s 3D shape from monocular 2D images which is a severely underconstrained problem. (P2) Since rigid robots have fewer degrees of freedom than a deformable object, the shape control becomes underactuated. We propose a template-based shape servoing scheme in which we solve these two problems. The template allows us to both infer the object’s shape using an improved Shape-from-Template algorithm and steer the object’s deformation by means of the robots’ movements. We validate the scheme via simulations and real experiments.

Paper download
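
The control structure implied by the abstract can be sketched as follows: perception (Shape-from-Template) is abstracted behind a stub, and a known Jacobian J mapping robot velocities to mesh-vertex velocities is assumed. The least-squares inversion reflects the underactuation in (P2). This is an illustrative skeleton under those assumptions, not the paper's algorithm.

```python
# Minimal sketch of an underactuated shape-servoing loop. The Jacobian J
# (robot DoFs -> stacked vertex velocities) and all dimensions are assumed.
import numpy as np

def shape_from_template(image):
    """Stub for the perception step: infer N mesh vertices (N x 3)."""
    return np.zeros((100, 3))

def servo_step(current_shape, desired_shape, J, gain=0.5):
    # Stack vertex errors into one vector; least-squares inversion handles
    # the underactuation (fewer robot DoFs than shape DoFs).
    error = (desired_shape - current_shape).reshape(-1)
    velocity, *_ = np.linalg.lstsq(J, gain * error, rcond=None)
    return velocity  # commanded robot velocities

# One iteration with placeholders: 100 vertices, two 6-DoF robot grippers.
J = np.random.randn(300, 12)
desired = np.random.randn(100, 3)
current = shape_from_template(image=None)
v = servo_step(current, desired, J)
```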

Conference – Digital transformation in footwear

Industry in general, and the footwear industry in particular, is undergoing a digital transformation in every sense. With the arrival of the new industrial revolution, known as Industry 4.0, the concepts and technologies that will change how products are manufactured and sold have been identified.

INESCOP participated in the online conference “Transformación digital en calzado” (“Digital Transformation in Footwear”), held on October 6th, 2020. The conference covered the challenges and opportunities that digitalisation offers, from the initial stages of product definition, through manufacturing and production management, to the possibilities for reaching the end consumer thanks to new cloud-based technologies and online sales.

Workshop ROMADO at IROS 2020

We are preparing a workshop on Robotic Manipulation of Deformable Objects (ROMADO) at IROS 2020. Note that IROS 2020 will not be held in person due to the COVID-19 pandemic; instead, the conference contents, including this workshop, will be provided on the IROS On-Demand platform.

This year, access to all the conference contents is free; check the details on the IROS 2020 website.

More details about the workshop, including the list of invited speakers and contributed papers, can be found on the workshop website. In the meantime, take a look at this introduction video of the ROMADO workshop:

Paper: RGB-D tracking and optimal perception of deformable objects

Title: RGB-D tracking and optimal perception of deformable objects
Author: Ignacio Cuiral-Zueco, Gonzalo López-Nicolás
Journal: IEEE Access, vol. 8, pp. 136884-136897, 2020.


Abstract: Addressing the perception problem of texture-less objects that undergo large deformations and movements, this article presents a novel RGB-D learning-free deformable object tracker in combination with a camera position optimisation system for optimal deformable object perception. The approach is based on the discretisation of the object’s visible area through the generation of a supervoxel graph that allows weighting new supervoxel candidates between object states over time. Once a deformation state of the object is determined, supervoxels of its associated graph serve as input for the camera position optimisation problem. Satisfactory results have been obtained in real time with a variety of objects that present different deformation characteristics.
Download paper
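
A rough sketch of the supervoxel-candidate weighting idea: match each supervoxel of the previous object state to the best candidate in the new frame using a weighted distance in position and colour. The feature layout and weights below are assumptions for illustration, not the paper's exact formulation.

```python
# Illustrative candidate weighting between object states over time.
# Each supervoxel is summarised as [x, y, z, r, g, b]; weights are assumed.
import numpy as np

def match_supervoxels(prev, candidates, w_pos=1.0, w_col=0.5):
    """prev, candidates: (N, 6) arrays of supervoxel features."""
    d_pos = np.linalg.norm(prev[:, None, :3] - candidates[None, :, :3], axis=2)
    d_col = np.linalg.norm(prev[:, None, 3:] - candidates[None, :, 3:], axis=2)
    cost = w_pos * d_pos + w_col * d_col   # (N_prev, N_cand) weight matrix
    return cost.argmin(axis=1)             # best candidate per supervoxel

prev = np.random.rand(50, 6)               # stand-in for the previous graph
cand = np.random.rand(60, 6)               # stand-in for new-frame candidates
assignment = match_supervoxels(prev, cand)
```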

PhD Thesis: Robotic manipulation based on visual and tactile perception

Brayan Stiven Zapata Impata (University of Alicante, Linkedin) is going to defend his PhD thesis on the topics of COMMANDIA under the supervision of Professor Pablo Gil. The defence is planned for September 17th, 2020 in Alicante, Spain.

Author: Brayan Stiven Zapata Impata
Supervisor: Pablo Gil Vazquez
Dissertation date: September 17th, 2020.
Title: Robotic manipulation based on visual and tactile perception.
Abstract: In this thesis, we provide solutions for various challenges in robotic manipulation. Applying visual perception, a robotic assistant could find grasps on unknown objects. With the use of tactile perception, the robot could predict whether the grasp is stable and even identify in which direction the grasped object might be slipping. As a result, the robot could trigger strategies for keeping the object stable. Finally, by integrating our tactile data generation system with the rest of the modules, the assistive robot could feel a grasp before actually executing it, so fewer objects would be dropped due to visually stable grasps that are actually slippery.
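
As a toy illustration of the slip-detection idea mentioned above, one could frame slip-direction identification as classification of short tactile sequences. The data shapes, labels and classifier below are hypothetical, not the methods used in the thesis.

```python
# Hypothetical sketch: classify the direction in which a grasped object is
# slipping from a short sequence of tactile frames. All shapes are assumed.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

SEQ_LEN, TAXELS = 10, 24
DIRECTIONS = ["stable", "up", "down", "left", "right"]

# Stand-in dataset: 200 labelled tactile sequences, flattened per sample.
X = np.random.rand(200, SEQ_LEN * TAXELS)
y = np.random.randint(len(DIRECTIONS), size=200)

clf = RandomForestClassifier(n_estimators=100).fit(X, y)
new_sequence = np.random.rand(1, SEQ_LEN * TAXELS)
print(DIRECTIONS[clf.predict(new_sequence)[0]])
```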

Paper: Robotic workcell for sole grasping in footwear manufacturing

Title: Robotic workcell for sole grasping in footwear manufacturing
Author: Guillermo Oliver, Pablo Gil, Fernando Torres
Conference: 25th Int. Conf. on Emerging Technologies and Factory Automation (ETFA), Vienna (Austria), 8-11 September 2020.

 
Abstract: The goal of this paper is to present a robotic workcell to automate several tasks of the cementing process in footwear manufacturing. Our cell’s main applications are sole digitization of a wide variety of footwear, glue dispensing and sole grasping from conveyor belts. This cell is made up of a manipulator arm endowed with a gripper, a conveyor belt and a 3D scanner. We have integrated all the elements into a ROS simulation environment facilitating control and communication among them, also providing flexibility to support future extensions. We propose a novel method to grasp soles of different shape, size and material, exploiting the particular characteristics of these objects. Our method relies on object contour extraction using concave hulls. We evaluate it on point clouds of 16 digitized real soles in three different scenarios: concave hull, k-NNs extension and PCA correction. While we have tested this workcell in a simulated environment, the presented system’s performance is scheduled to be tested on a real setup at INESCOP facilities in the upcoming months.

Paper at IEEE
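
The concave-hull idea at the heart of the grasping method can be sketched with a standard Delaunay-based alpha shape on the 2D projection of a sole's point cloud. The alpha value and the projection step are assumptions, and the paper's k-NN extension and PCA correction stages are omitted.

```python
# Sketch of concave-hull contour extraction via an alpha shape: keep
# Delaunay triangles with a small circumradius, then return the edges that
# belong to exactly one kept triangle (the boundary).
import numpy as np
from scipy.spatial import Delaunay

def concave_hull_edges(points2d, alpha=2.0):
    """Return boundary edges (index pairs) of the alpha shape of 2D points."""
    tri = Delaunay(points2d)
    edge_count = {}
    for ia, ib, ic in tri.simplices:
        a, b, c = points2d[ia], points2d[ib], points2d[ic]
        la = np.linalg.norm(b - c)
        lb = np.linalg.norm(a - c)
        lc = np.linalg.norm(a - b)
        area = 0.5 * abs((b[0] - a[0]) * (c[1] - a[1])
                         - (b[1] - a[1]) * (c[0] - a[0]))
        if area == 0:
            continue
        r = (la * lb * lc) / (4.0 * area)   # circumradius of the triangle
        if r < 1.0 / alpha:                 # keep only "small" triangles
            for e in [(ia, ib), (ib, ic), (ic, ia)]:
                e = tuple(sorted(e))
                edge_count[e] = edge_count.get(e, 0) + 1
    return [e for e, n in edge_count.items() if n == 1]

cloud2d = np.random.rand(500, 2)   # stand-in for a projected sole scan
contour = concave_hull_edges(cloud2d)
```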

Paper: Blind Manipulation of Deformable Objects Based on Force Sensing and Finite Element Modeling

Title: Blind Manipulation of Deformable Objects Based on Force Sensing and Finite Element Modeling
Author: Jose Sanchez, Kamal Mohy El Dine, Juan Antonio Corrales, Belhassen-Chedli Bouzgarrou and Youcef Mezouar
Journal: Frontiers in Robotics and AI, 7:73, 9 June 2020. doi: 10.3389/frobt.2020.00073
Abstract: In this paper, we present a novel pipeline to simultaneously estimate and manipulate the deformation of an object using only force sensing and an FEM model. The pipeline is composed of a sensor model, a deformation model and a pose controller. The sensor model computes the contact forces that are used as input to the deformation model which updates the volumetric mesh of a manipulated object. The controller then deforms the object such that a given pose on the mesh reaches a desired pose. The proposed approach is thoroughly evaluated in real experiments using a robot manipulator and a force-torque sensor to show its accuracy in estimating and manipulating deformations without the use of vision sensors.
Download paper
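
The deformation-model step of the pipeline can be pictured with a minimal linear-FEM sketch: given contact forces from the sensor model, solve K u = f for nodal displacements and update the mesh. The stiffness matrix below is a random positive-definite stand-in; a real implementation would assemble K from the object's volumetric mesh and material parameters.

```python
# Minimal linear-FEM sketch: solve K u = f for free nodal displacements
# under an assumed contact force and assumed clamped boundary nodes.
import numpy as np

n_nodes = 20
dof = 3 * n_nodes                      # x, y, z per node

# Stand-in stiffness matrix (symmetric positive definite), not a real mesh.
A = np.random.randn(dof, dof)
K = A @ A.T + dof * np.eye(dof)

f = np.zeros(dof)
f[0:3] = [0.0, 0.0, -5.0]              # assumed contact force on node 0 (N)

fixed = np.arange(dof - 9, dof)        # assumed clamped nodes (boundary cond.)
free = np.setdiff1d(np.arange(dof), fixed)

u = np.zeros(dof)
u[free] = np.linalg.solve(K[np.ix_(free, free)], f[free])
mesh_update = u.reshape(n_nodes, 3)    # displacements applied to the mesh
```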