Paper: Robotic motion coordination based on a geometric deformation measure

Title: Robotic motion coordination based on a geometric deformation measure

Authors: Miguel Aranda, Jose Sanchez, Juan Antonio Corrales Ramon and Youcef Mezouar

Journal: IEEE Systems Journal, doi: 10.1109/JSYST.2021.3107779

Abstract: This article describes a novel approach to achieve motion coordination in a multirobot system based on the concept of deformation. Our main novel contribution is to link these two elements (namely, coordination and deformation). In particular, the core idea of our approach is that the robots’ motions minimize a global measure of the deformation of their positions relative to a prescribed shape. Based on this idea, we propose a linear shape controller that also incorporates a term modeling an affine deformation. We show that the affine term is particularly useful when the deformation to be controlled is large. We also propose controls for the other variables (centroid, rotation, size) that define the geometric configuration of the team. Importantly, these additional controls are completely decoupled from the shape control. The overall approach is simple and robust, and it creates closely coordinated robot motions. Being based on deformation, it is useful in several scenarios involving manipulation tasks: e.g., handling of a highly deformable object, control of an object’s shape, or regulation of the shape formed by the fingertips of a robotic hand. We present simulation and experimental results to validate the proposed approach.
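For illustration only (not the paper's controller): a minimal Python sketch of a deformation-minimizing shape controller of the kind the abstract describes, moving the robots toward the closest rotated and scaled copy of the prescribed shape. The function name, gain, and the specific Procrustes-style deformation measure are assumptions; the paper's method additionally includes an affine term and separate centroid/rotation/size controls.

```python
import numpy as np

def shape_control_step(P, C, gain=0.5):
    """One illustrative step of a deformation-minimizing shape controller.
    P: current robot positions (N x 2). C: prescribed shape (N x 2).
    Moves each robot toward the closest rotated/scaled copy of C, i.e. it
    descends a Procrustes-like deformation measure. Hypothetical sketch."""
    Pc = P - P.mean(axis=0)                      # centroid is handled by a decoupled control
    Cc = C - C.mean(axis=0)
    M = Cc.T @ Pc                                # 2 x 2 cross-covariance
    U, S, Vt = np.linalg.svd(M)
    D = np.diag([1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # keep a proper rotation
    R = Vt.T @ D @ U.T                           # rotation best aligning C with P
    s = np.trace(R @ M) / np.trace(Cc.T @ Cc)    # optimal scale
    closest = s * Cc @ R.T                       # closest undeformed configuration
    return P + gain * (closest - Pc)             # linear, gradient-like shape correction
```

With gain = 1 the team jumps directly to the closest undeformed configuration; smaller gains produce gradual, coordinated motions in the spirit of the abstract.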

Download paper

Paper: Collision-free Transport of 2D Deformable Objects

Title: Collision-free Transport of 2D Deformable Objects

Authors: Rafael Herguedas, Gonzalo Lopez-Nicolas, Carlos Sagues

Conference: International Conference on Control, Automation, and Systems (ICCAS 2021), Jeju, Korea, October 12-15, 2021

Abstract: We propose a novel system to transport 2D cloth-like deformable objects with mobile manipulators and without collisions along a known path. First, a new deformation model that allows for real-time shape prediction, based on the deformable bounding box paradigm, is presented. The transport task is next defined as an optimization problem, which includes a set of linear and nonlinear constraints. These constraints limit the object’s deformations and rotations and avoid obstacles, respectively. Simulation results are reported to demonstrate the validity of our method.
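A toy-scale sketch of the optimization framing described above, for a single waypoint: the object is abstracted as a deformable bounding box, and SciPy's SLSQP solver enforces illustrative limits on stretch and rotation plus a nonlinear obstacle-clearance constraint. All variables, bounds, and the obstacle geometry are invented for the example and do not reproduce the paper's formulation.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative only: one waypoint of the transport task; the object is a
# deformable bounding box of nominal half-sizes (ax, ay).
# Decision variables: box centre (x, y), orientation theta, stretch s along x.
obstacle, r_obs = np.array([1.0, 0.5]), 0.4
nominal = np.array([1.2, 0.6, 0.0, 1.0])        # desired (x, y, theta, s) on the path
ax, ay = 0.3, 0.2

def cost(z):
    return np.sum((z - nominal) ** 2)            # stay close to the reference path

def clearance(z):                                # nonlinear obstacle constraint (>= 0)
    x, y, _, s = z
    radius = np.hypot(s * ax, ay)                # coarse circumscribed radius of the box
    return np.hypot(x - obstacle[0], y - obstacle[1]) - (radius + r_obs)

constraints = [
    {"type": "ineq", "fun": clearance},
    {"type": "ineq", "fun": lambda z: 0.2 - abs(z[3] - 1.0)},       # limit stretch (deformation)
    {"type": "ineq", "fun": lambda z: np.deg2rad(30) - abs(z[2])},  # limit rotation
]
sol = minimize(cost, x0=nominal, method="SLSQP", constraints=constraints)
print(sol.x)
```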

Download paper

Paper: Multi-scale Laplacian-based FMM for shape control

Title: Multi-scale Laplacian-based FMM for shape control

Authors: Ignacio Cuiral-Zueco and Gonzalo Lopez-Nicolas

Conference: 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). September 27 – October 1, 2021. Prague, Czech Republic

Abstract: Shape control has become a prominent research field as it enables the automation of tasks in many applications. Overall, deforming an object to a desired target shape using a few grippers is a major challenge. The limited information about the object dynamics, the need to combine small and large deformations in order to achieve certain target shapes, and the non-linear nature of most deformable objects are factors that significantly hamper shape control performance. In this paper, we propose a shape control method for multi-robot manipulation of large-strain deformable objects. Our approach is based on multi-scale Laplacian descriptors that feed an FMM (Fast Marching Method) for elastic shape contour matching. The FMM’s resulting path and the Laplacian operator are used to define a control strategy for the robot grippers. Simulation experiments carried out with an ARAP (As Rigid As Possible) deformation model provide satisfactory results.
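As a rough illustration of the descriptor side of such a pipeline, the sketch below computes multi-scale Laplacian responses along a closed 2D contour by smoothing it at several scales and taking a discrete Laplacian. The exact descriptor, the scales, and the FMM matching step in the paper differ; names and parameters here are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def multiscale_laplacian_descriptor(contour, scales=(1, 2, 4, 8)):
    """Illustrative multi-scale Laplacian descriptor for a closed 2D contour.
    contour: (N, 2) array of ordered points. Returns an (N, len(scales)) array.
    Generic sketch; not the descriptor definition used in the paper."""
    feats = []
    for s in scales:
        smooth = gaussian_filter1d(contour, sigma=s, axis=0, mode="wrap")
        # discrete Laplacian along the contour: point minus mean of its neighbours
        lap = smooth - 0.5 * (np.roll(smooth, 1, axis=0) + np.roll(smooth, -1, axis=0))
        feats.append(np.linalg.norm(lap, axis=1))   # scale response per point
    return np.stack(feats, axis=1)

# A matching cost between two contours could then be built from pairwise
# distances between these descriptors, over which a marching/shortest-path
# correspondence is computed.
```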

Download paper

Paper: Enclosing a moving target with an optimally rotated and scaled multiagent pattern

Title: Enclosing a moving target with an optimally rotated and scaled multiagent pattern

Authors: M. Aranda, Y. Mezouar, G. López-Nicolás, C. Sagüés

Journal: International Journal of Control, vol. 94, no. 3, pp. 601-611, 2021

Abstract: We propose a novel control method to enclose a moving target in a two-dimensional setting with a team of agents forming a prescribed geometric pattern. The approach optimises a measure of the overall agent motion costs, via the minimisation of a suitably defined cost function encapsulating the pattern rotation and scaling. We propose two control laws which use global information and make the agents exponentially converge to the prescribed formation with an optimal scale that remains constant, while the team’s centroid tracks the target. One control law results in a multiagent pattern that keeps a constant orientation in the workspace; for the other, the pattern rotates with constant speed. These behaviors, whose optimality and steadiness are very relevant for the task addressed, occur independently of the target’s velocity. Moreover, the methodology does not require distance measurements, common coordinate references, or communications. We also present formal guarantees of collision avoidance for the proposed approach. Illustrative simulation examples are provided.
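Below is a small, assumption-laden sketch of the closed-form 2D alignment that such a cost function suggests: given agent positions relative to the target and the prescribed pattern, it returns the rotation angle and scale minimizing the summed squared mismatch. The paper's actual cost function and control laws are more elaborate; names and conventions here are illustrative.

```python
import numpy as np

def optimal_rotation_and_scale(offsets, pattern):
    """Closed-form 2D similarity minimizing sum ||q_i - s R(a) c_i||^2, where
    q_i (rows of offsets) are agent positions relative to the target and c_i
    (rows of pattern) the prescribed pattern. Illustrative sketch only."""
    A = np.sum(offsets * pattern)                                   # sum of dot products
    B = np.sum(offsets[:, 1] * pattern[:, 0] - offsets[:, 0] * pattern[:, 1])  # sum of cross terms
    alpha = np.arctan2(B, A)                                        # optimal rotation angle
    s = np.hypot(A, B) / np.sum(pattern ** 2)                       # optimal scale
    return alpha, s
```

A control law in this spirit would then steer agent i toward target + s R(alpha) c_i, so that the centroid tracks the target while the pattern keeps its optimal scale.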

Download paper

Paper: Distributed Linear Control of Multirobot Formations Organized in Triads

Title: Distributed Linear Control of Multirobot Formations Organized in Triads

Authors: M. Aranda, G. López-Nicolás and Y. Mezouar

Journal: IEEE Robotics and Automation Letters, vol. 6, no. 4, pp. 310-317, Oct. 2021

Abstract: This letter addresses the problem of controlling multiple robots to form a prescribed team shape in two-dimensional space. We consider a team organization in interlaced triads (i.e., groups of three robots). For each triad we define a measure of geometric deformation relative to its prescribed shape. Our main contribution is a novel distributed control law, defined as the gradient descent on the sum of these triangular deformation measures. We show that this geometrically motivated control law is linear, and bears analogies with existing formulations. Moreover, in comparison with these formulations our controller is simpler and more flexible to design, converges to the globally optimal shape by construction, and allows analysis of the team size dynamics. We illustrate the proposed approach in simulation.
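Purely as an illustration of "gradient descent on a sum of per-triad deformation measures", here is a hypothetical Python step in which each robot accumulates corrections from the triads it belongs to. The deformation measure used (a rotation/translation-invariant Procrustes-style distance per triangle) and all names are assumptions, not the letter's exact definitions.

```python
import numpy as np

def triad_gradient_step(P, triads, triangles, gain=0.3):
    """Illustrative distributed step. P: positions (N x 2); triads: list of
    index triples, e.g. [(0, 1, 2), (1, 2, 3)]; triangles: prescribed triangle
    shapes, one (3 x 2) array per triad. Each robot only needs the positions
    of the robots sharing a triad with it."""
    V = np.zeros_like(P)
    for idx, C in zip(triads, triangles):
        idx = list(idx)
        Q = P[idx] - P[idx].mean(axis=0)             # triad positions, centred
        Cc = C - C.mean(axis=0)                      # prescribed triangle, centred
        U, _, Vt = np.linalg.svd(Cc.T @ Q)
        D = np.diag([1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # avoid reflections
        R = Vt.T @ D @ U.T                           # best rotation for this triad
        V[idx] += Cc @ R.T - Q                       # per-triad deformation correction
    return P + gain * V
```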

Download paper

Paper: Towards footwear manufacturing 4.0: shoe sole robotic grasping in assembling operations

Title: Towards footwear manufacturing 4.0: shoe sole robotic grasping in assembling operations
Authors: Guillermo Oliver, Pablo Gil, Jose F. Gomez, Fernando Torres
Journal: The International Journal of Advanced Manufacturing Technology, 2021

Abstract: In this paper, we present a robotic workcell for the automation of footwear manufacturing tasks such as sole digitization, glue dispensing, and sole manipulation from different places within the factory plant. We aim to make progress towards shoe industry 4.0. To achieve it, we have implemented a novel sole grasping method, compatible with soles of different shapes, sizes, and materials, by exploiting the particular characteristics of these objects. Our proposal is able to work well with low-density point clouds from a single RGBD camera and also with dense point clouds obtained from a laser scanner digitizer. The method computes antipodal grasping points from visual data in both cases and does not require prior recognition of the sole. It relies on sole contour extraction using concave hulls and measuring the curvature on contour areas. Our method was tested both in a simulated environment and in real manufacturing conditions at INESCOP facilities, processing 20 soles with different sizes and characteristics. Grasps were performed in two different configurations, achieving an average of 97.5% successful real grasps for soles without a heel made of materials with low or medium flexibility. In both cases, the grasping method was tested without carrying out tactile control throughout the task.
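The contour-plus-curvature idea can be illustrated with a toy sketch (not the paper's algorithm): given an ordered 2D contour of the sole, estimate per-point curvature from the turning of the tangent and pick the pair of points whose normals are most nearly opposite on low-curvature segments. All thresholds and scoring terms are invented for the example.

```python
import numpy as np

def antipodal_grasp_from_contour(contour):
    """Toy illustration: pick two points of an ordered closed contour (N x 2)
    whose normals are nearly antiparallel and that lie on low-curvature (flat)
    segments. O(n^2) brute force, fine for small contours. Generic sketch."""
    d = np.roll(contour, -1, axis=0) - np.roll(contour, 1, axis=0)     # central differences
    tangent = d / np.linalg.norm(d, axis=1, keepdims=True)
    normal = np.stack([tangent[:, 1], -tangent[:, 0]], axis=1)         # sign depends on orientation
    angle = np.unwrap(np.arctan2(tangent[:, 1], tangent[:, 0]))
    curvature = np.abs(np.gradient(angle))                             # turning rate per point
    best, best_score = None, np.inf
    n = len(contour)
    for i in range(n):
        for j in range(i + 1, n):
            opposition = 1.0 + normal[i] @ normal[j]    # 0 when normals are antiparallel
            score = opposition + curvature[i] + curvature[j]
            if score < best_score:
                best, best_score = (i, j), score
    return best
```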

Paper at Springer

Paper: 3D reconstruction of deformable objects from RGB-D cameras: an omnidirectional inward-facing multi-camera system

Title: 3D reconstruction of deformable objects from RGB-D cameras: an omnidirectional inward-facing multi-camera system
Authors: Eva Curto, Helder Araujo
Conference: 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISAPP’2021)

Abstract: This paper describes a system made up of several inward-facing cameras able to reconstruct deformable objects through synchronous acquisition of RGB-D data. The configuration of the camera system allows the acquisition of 3D omnidirectional images of the objects. The paper describes the structure of the system as well as an approach for the extrinsic calibration, which allows the estimation of the coordinate transformations between the cameras. Reconstruction results are also presented.
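For the extrinsic-calibration step, a standard building block is estimating the rigid transform between two cameras from 3D correspondences (e.g., calibration-target points seen by both). The sketch below uses the usual Kabsch/SVD solution; it is a generic illustration, not the calibration procedure of the paper.

```python
import numpy as np

def rigid_transform(A, B):
    """Estimate R, t such that B ≈ (R @ A.T).T + t from corresponding 3D
    points A, B (both N x 3). Standard Kabsch/SVD solution."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)                                     # 3 x 3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # keep a proper rotation
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    return R, t
```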
Download paper

Paper: Intel RealSense SR305, D415 and L515: Experimental evaluation and comparison of depth estimation

Title: Intel RealSense SR305, D415 and L515: Experimental evaluation and comparison of depth estimation
Authors: Francisco Lourenco, Helder Araujo
Conference: 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISAPP’2021)

Abstract: In the last few years, Intel has launched several low-cost RGB-D cameras. Three of these cameras are the SR305, the D415 and the L515. These three cameras are based on different operating principles. The SR305 is based on structured light projection, the D415 is based on stereo vision that also uses the projection of random dots, and the L515 is based on LIDAR. In addition, they all provide RGB images. In this paper we perform an experimental analysis and comparison of the depth estimation by the three cameras.
Download paper

Paper: RGB-D Sensing of Challenging Deformable Objects

Title: RGB-D Sensing of Challenging Deformable Objects

Authors: Ignacio Cuiral-Zueco and Gonzalo Lopez-Nicolas

Workshop: Workshop on Managing deformation: A step towards higher robot autonomy (MaDef), 25 October – 25 December, 2020

Abstract: The problem of deformable object tracking is prominent in recent robot shape-manipulation research. Additionally, texture-less objects that undergo large deformations and movements lead to difficult scenarios. Three RGB-D sequences of different challenging scenarios are processed in order to evaluate the robustness and versatility of a deformable object tracking method. Everyday objects with different complex characteristics are manipulated and tracked. The tracking system, pushed out of its comfort zone, performs satisfactorily.

Webpage

Paper: Experimental multi-camera setup for perception of dynamic objects

Title: Experimental multi-camera setup for perception of dynamic objects

Authors: Rafael Herguedas, Gonzalo Lopez-Nicolas and Carlos Sagues

Workshop: Robotic Manipulation of Deformable Objects (ROMADO), 25 October – 25 December, 2020

Abstract: Currently, perception and manipulation of dynamic objects represent an open research problem. In this paper, we show a proof of concept of a multi-camera robotic setup which is intended to perform coverage of dynamic objects. The system includes a set of RGB-D cameras, which are positioned and oriented to cover the object’s contour as required in terms of visibility. An algorithm from a previous study allows us to minimize the number of cameras and configure them so that collisions and occlusions are avoided. We test the validity of the platform with the Robot Operating System (ROS) in simulations with the software Gazebo and in real experiments with Intel RealSense modules.
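The camera-minimization idea can be illustrated with a generic greedy set-cover sketch (the cited algorithm is different): given a boolean visibility matrix between candidate camera poses and contour points, poses are selected one by one to cover the remaining points. Names and the input format are assumptions.

```python
import numpy as np

def greedy_camera_selection(coverage):
    """Greedy set cover over a boolean matrix coverage[k, i] = True if camera
    pose k sees contour point i. Returns the indices of the selected poses.
    Generic illustration of the coverage idea."""
    uncovered = np.ones(coverage.shape[1], dtype=bool)
    selected = []
    while uncovered.any():
        gains = (coverage & uncovered).sum(axis=1)   # new points each pose would add
        k = int(np.argmax(gains))
        if gains[k] == 0:                            # remaining points are not visible at all
            break
        selected.append(k)
        uncovered &= ~coverage[k]
    return selected
```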

Download paper