Papers at “Jornada de Jóvenes Investigadores del I3A”

Published in “Jornada de Jóvenes Investigadores del I3A”, vol. 7 (Proceedings of the VIII Jornada de Jóvenes Investigadores del I3A – June 6, 2019).

  • R. Herguedas, G. López-Nicolás, C. Sagüés. Minimal multi-camera system for perception of deformable shapes. (Link)
  • J. Martínez-Cesteros, G. López-Nicolás. Automatic image dataset generation for footwear detection. (Link)
  • E. Hernández-Murillo, R. Aragüés, G. López-Nicolás. Volumetric object reconstruction in multi-camera scenarios. (Link)

I3A Young Researchers Conference

We presented three works of the COMMANDIA project at the “Jornada de Jóvenes Investigadores del I3A”, which was held on June 6, 2019 in Zaragoza, Spain:

  • R. Herguedas, G. López-Nicolás, C. Sagüés. Minimal multi-camera system for perception of deformable shapes. (Link)
  • J. Martínez-Cesteros, G. López-Nicolás. Automatic image dataset generation for footwear detection. (Link)
  • E. Hernández-Murillo, R. Aragüés, G. López-Nicolás. Volumetric object reconstruction in multi-camera scenarios. (Link)

The journal “Jornada de Jóvenes Investigadores del I3A” (ISSN: 2341-4790) collects the proceedings of the conferences that have been held annually since 2012 at the Instituto Universitario de Investigación en Ingeniería de Aragón (I3A) of the University of Zaragoza. These conferences are the meeting point for researchers starting their research careers at the I3A.

Mobile manipulator at COMMANDIA

Let us introduce our mobile robotic platform Campero, a mobile manipulator prototype developed in the framework of the COMMANDIA project. The main goal is the definition, design and implementation of integrated functionalities in robotic platforms that extend the capabilities of robotic systems for the manipulation of deformable objects in the context of industrial production. With this platform, we will provide a laboratory prototype of a multi-sensor, multi-robot system with manipulation and ground locomotion capabilities, increasing precision in complex autonomous manipulation tasks involving deformable objects.

Paper: Deformation-Based Shape Control with a Multirobot System

Title: Deformation-based shape control with a multirobot system
Authors: Miguel Aranda, Juan Antonio Corrales and Youcef Mezouar
Conference: IEEE International Conference on Robotics and Automation (ICRA), May 20-24, 2019, Montreal, Canada.
Abstract: We present a novel method to control the relative positions of the members of a robotic team. The application scenario we consider is the cooperative manipulation of a deformable object in 2D space. A typical goal in this kind of scenario is to minimize the deformation of the object with respect to a desired state. Our contribution, then, is to use a global measure of deformation directly in the feedback loop. In particular, the robot motions are based on the descent along the gradient of a metric that expresses the difference between the team’s current configuration and its desired shape. Crucially, the resulting multirobot controller has a simple expression and is inexpensive to compute, and the approach lends itself to analysis of both the transient and asymptotic dynamics of the system. This analysis reveals a number of properties that are interesting for a manipulation task: fundamental geometric parameters of the team (size, orientation, centroid, and distances between robots) can be suitably steered or bounded. We describe different policies within the proposed deformation-based control framework that produce useful team behaviors. We illustrate the methodology with computer simulations.
Download paper

Download video
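The core idea of the controller, descending the gradient of a global deformation metric, can be sketched in a few lines. The following is a simplified illustration, not the paper's method: it handles only the translation component of the alignment (the paper's metric also accounts for rotation, scale and further team parameters), and all positions and gains are invented for the example.

```python
def centroid(pts):
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def shape_error(positions, desired):
    # translation-invariant deformation metric: squared distance between
    # the centered current configuration and the centered desired shape
    pc, dc = centroid(positions), centroid(desired)
    return sum((px - pc[0] - (dx - dc[0])) ** 2 + (py - pc[1] - (dy - dc[1])) ** 2
               for (px, py), (dx, dy) in zip(positions, desired))

def gradient_step(positions, desired, gain=0.2):
    # each robot descends the gradient of the global metric; because both
    # configurations are centered, the gradient for robot i reduces to its
    # centered position error, so the controller is cheap to compute
    pc, dc = centroid(positions), centroid(desired)
    return [(px - gain * ((px - pc[0]) - (dx - dc[0])),
             py - gain * ((py - pc[1]) - (dy - dc[1])))
            for (px, py), (dx, dy) in zip(positions, desired)]

desired = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]   # unit square
positions = [(0.3, -0.2), (1.4, 0.1), (0.9, 1.3), (-0.2, 0.8)]
for _ in range(100):
    positions = gradient_step(positions, desired)
print(round(shape_error(positions, desired), 6))  # -> 0.0
```

Each step contracts the centered error by a factor (1 - gain), so the team converges to the desired shape up to a translation.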

IEEE International Conference on Robotics and Automation 2019

Miguel Aranda presented at ICRA 2019 the work entitled “Deformation-Based Shape Control with a Multirobot System”, coauthored by Juan Antonio Corrales and Youcef Mezouar. The conference was held on May 20-24, 2019 in Montreal, Canada.


Download paper

Paper: Framework for Fast Experimental Testing of Autonomous Navigation Algorithms

Title: Framework for Fast Experimental Testing of Autonomous Navigation Algorithms
Authors: M. Á. Muñoz-Bañón, I. del Pino, F. A. Candelas, F. Torres.
Journal: Applied Sciences. 2019; 9(10):1997. doi:10.3390/app9101997
Abstract: Research in mobile robotics requires fully operative autonomous systems to test and compare algorithms in real-world conditions. However, the implementation of such systems remains a highly time-consuming process. In this work, we present a robot operating system (ROS)-based navigation framework that allows the generation of new autonomous navigation applications in a fast and simple way. Our framework provides a powerful basic structure based on abstraction levels that eases the implementation of minimal solutions with all the functionalities required for a whole autonomous system. This approach helps to keep the focus on any sub-problem of interest (e.g. localization or control) while still permitting experimental tests in the context of a complete application. To show the validity of the proposed framework, we implement an autonomous navigation system for a ground robot using a localization module that fuses global navigation satellite system (GNSS) positioning and Monte Carlo localization by means of a Kalman filter. Experimental tests are performed in two different outdoor environments, over more than twenty kilometers. All the developed software is available in a GitHub repository.
Download paper
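The localization module described in the abstract fuses an absolute GNSS fix with a relative motion estimate through a Kalman filter. The following is a minimal 1-D sketch of that fusion idea, with invented noise parameters; the actual framework operates in full 2-D pose space and fuses GNSS with Monte Carlo localization rather than raw odometry.

```python
import random

def kalman_fuse(x, p, u, q, z, r):
    # predict with the relative increment u (process noise variance q),
    # then correct with the absolute measurement z (noise variance r)
    x_pred, p_pred = x + u, p + q
    k = p_pred / (p_pred + r)               # Kalman gain
    return x_pred + k * (z - x_pred), (1.0 - k) * p_pred

random.seed(0)
x, p = 0.0, 1.0                             # initial estimate and variance
truth = 0.0
for _ in range(50):
    truth += 1.0                            # robot advances 1 m per step
    odom = 1.0 + random.gauss(0.0, 0.05)    # drifting relative estimate
    gnss = truth + random.gauss(0.0, 0.5)   # noisy absolute GNSS fix
    x, p = kalman_fuse(x, p, odom, 0.05 ** 2, gnss, 0.5 ** 2)
```

The relative source keeps the estimate smooth between fixes, while the absolute source bounds the accumulated drift; the gain k weights each according to its variance.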

Paper: Clasificación de objetos usando percepción bimodal de palpación única en acciones de agarre robótico

Title: Clasificación de objetos usando percepción bimodal de palpación única en acciones de agarre robótico
Authors: Edison Velasco, Brayan S. Zapata-Impata, Pablo Gil, Fernando Torres
Journal: Revista Iberoamericana de Automática e Informática Industrial, April 2019. ISSN 1697-7920. doi: 10.4995/riai.2019.10923
Abstract: This work presents a method for classifying objects grasped with a multi-fingered robotic hand by combining proprioceptive and tactile data in a hybrid descriptor. The proprioceptive data are obtained from the joint positions of the hand, and the tactile data from the contact registered by pressure cells on the phalanges. The proposed approach identifies the object by extracting the contact geometry from the hand pose and an estimate of the object's stiffness and flexibility from the tactile sensors. The method shows that using bimodal data of different natures together with supervised learning techniques improves the recognition rate. In the experiments, more than 3000 grasps of up to 7 different household objects were carried out, obtaining correct classifications of 95% (F1 score) without the need for multiple palpations of the object. In addition, the generalization of the method was verified by training our system on certain objects and classifying new ones without any prior knowledge of them.
Download paper
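The hybrid-descriptor idea, concatenating proprioceptive (joint position) and tactile (contact pressure) readings into a single feature vector for a supervised classifier, can be illustrated with a toy sketch. The object classes, sensor dimensions, synthetic data and the 1-nearest-neighbour classifier below are all illustrative choices, not the paper's setup.

```python
import random
random.seed(1)

def hybrid_descriptor(joint_angles, tactile_pressures):
    # concatenate hand-pose and contact-pressure readings into one vector
    return list(joint_angles) + list(tactile_pressures)

def nearest_neighbour(train, query):
    # train: list of (descriptor, label); classify by closest descriptor
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(train, key=lambda s: dist(s[0], query))[1]

def sample(label):
    # hypothetical objects: a rigid bottle (wide grip, high pressure)
    # vs. a soft sponge (closed grip, low pressure)
    if label == "bottle":
        joints = [random.gauss(0.4, 0.05) for _ in range(4)]
        tactile = [random.gauss(0.9, 0.05) for _ in range(3)]
    else:
        joints = [random.gauss(0.8, 0.05) for _ in range(4)]
        tactile = [random.gauss(0.2, 0.05) for _ in range(3)]
    return hybrid_descriptor(joints, tactile), label

train = [sample(l) for l in ("bottle", "sponge") * 20]
tests = [sample(l) for l in ("bottle", "sponge") * 10]
correct = sum(nearest_neighbour(train, d) == l for d, l in tests)
print(correct, "of", len(tests), "test grasps classified correctly")
```

The point of the bimodal descriptor is that grip geometry and contact stiffness separate classes that either modality alone may confuse: two objects of similar shape can differ in rigidity, and vice versa.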

Paper: 3DCNN Performance in Hand Gesture Recognition Applied to Robot Arm Interaction

Title: 3DCNN Performance in Hand Gesture Recognition Applied to Robot Arm Interaction
Authors: Castro-Vargas, J., Zapata-Impata, B., Gil, P., Garcia-Rodriguez, J. and Torres, F.
Conference: Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods (ICPRAM 2019), Volume 1, pages 802-806. ISBN 978-989-758-351-3. DOI: 10.5220/0007570208020806
Abstract: In the past, methods for hand sign recognition have been successfully tested in Human Robot Interaction (HRI) using traditional methodologies based on static image features and machine learning. However, the recognition of gestures in video sequences is still an open problem, because current detection methods achieve low scores when the background is undefined or the scenario is unstructured. In recent years, deep learning techniques have been applied to approach a solution to this problem. In this paper, we present a study in which we analyse the performance of a 3DCNN architecture for hand gesture recognition in an unstructured scenario. The system yields a score of 73% in both accuracy and F1. The aim of the work is the implementation of a system for commanding robots with gestures recorded by video in real scenarios.
Download paper
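What distinguishes a 3DCNN from an image CNN is that its kernels also slide along the time axis, so a single filter can respond to motion across frames rather than to appearance in one frame. A toy illustration, with a hand-written 3-D convolution and a two-frame temporal-difference kernel whose values are invented for the example:

```python
def conv3d(clip, kernel):
    # clip: T x H x W nested lists (frames of a video); the kernel slides
    # over time, height and width simultaneously ("valid" convolution)
    T, H, W = len(clip), len(clip[0]), len(clip[0][0])
    t, h, w = len(kernel), len(kernel[0]), len(kernel[0][0])
    out = []
    for i in range(T - t + 1):
        frame = []
        for j in range(H - h + 1):
            row = []
            for k in range(W - w + 1):
                row.append(sum(clip[i + a][j + b][k + c] * kernel[a][b][c]
                               for a in range(t)
                               for b in range(h)
                               for c in range(w)))
            frame.append(row)
        out.append(frame)
    return out

# 2x1x1 temporal-difference kernel: responds only where pixels change
kernel = [[[-1.0]], [[1.0]]]
static = [[[1.0] * 4] * 4] * 3    # 3 identical frames: no motion
moving = [[[float(x == t) for x in range(4)] for _ in range(4)]
          for t in range(3)]      # a bright pixel sweeps left to right
print(max(abs(v) for f in conv3d(static, kernel) for r in f for v in r))  # -> 0.0
print(max(abs(v) for f in conv3d(moving, kernel) for r in f for v in r))  # -> 1.0
```

The static clip produces zero response everywhere, while the moving one activates the filter: this temporal sensitivity is what a learned 3-D kernel exploits to discriminate gestures in video.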