Tactile control based on Gaussian images and its application in bi-manual manipulation of deformable objects

Please use this identifier to cite or link to this item: http://hdl.handle.net/10045/66233
Item information
Title: Tactile control based on Gaussian images and its application in bi-manual manipulation of deformable objects
Author(s): Delgado Rodríguez, Ángel | Corrales Ramón, Juan Antonio | Mezouar, Youcef | Lequievre, Laurent | Jara Bravo, Carlos Alberto | Torres, Fernando
Research group(s): Automática, Robótica y Visión Artificial
Center, Department or Service: Universidad de Alicante. Departamento de Física, Ingeniería de Sistemas y Teoría de la Señal
Keywords: Tactile servoing | Tactile images | Deformable object | Grasping | In-hand manipulation
Knowledge area(s): Ingeniería de Sistemas y Automática
Publication date: Aug 2017
Publisher: Elsevier
Citation: Robotics and Autonomous Systems. 2017, 94: 148-161. doi:10.1016/j.robot.2017.04.017
Abstract: In-hand robot manipulation of deformable objects is an open and key issue for the next generation of robots. Developing an adaptable and agile framework for tasks in which a robot grasps and manipulates different kinds of deformable objects is a main goal in the literature. Many research works control manipulation tasks using a model of the manipulated object. Although these techniques model deformations precisely, they are time consuming, and using them in real environments is almost impossible because of the large number of objects the robot may encounter. In this paper, we propose a model-independent framework to control the movements of the fingers of the hands while the robot executes manipulation tasks with deformable objects. The technique is based on tactile images, which serve as a common interface for different tactile sensors, and uses tactile-servo control to stabilize the grasping points, avoid sliding, and adapt the contact configuration with respect to the position and magnitude of the applied force. Tactile images are obtained using a combination of dynamic Gaussians, which allows a common representation for tactile data given by sensors with different technologies and resolutions. The framework was tested on different manipulation tasks in which the objects are deformed, without using a model of them.
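The abstract's core idea, rendering discrete taxel readings as a sensor-independent tactile image by summing Gaussians centred at the contact points, can be sketched as follows. This is a minimal illustration of the general approach, not the paper's formulation; the function name, parameters, and the fixed `sigma` are assumptions for the sketch.

```python
import numpy as np

def tactile_image(taxel_xy, pressures, res=32, sigma=0.08):
    """Render taxel readings as a fixed-resolution tactile image.

    Each taxel contributes a 2D Gaussian centred at its normalised
    (x, y) position, weighted by its measured pressure, so sensor pads
    with different layouts and resolutions map onto one common image.
    Illustrative sketch only; names and parameters are not the paper's.
    """
    ys, xs = np.mgrid[0:res, 0:res] / (res - 1)  # normalised pixel grid in [0, 1]
    img = np.zeros((res, res))
    for (tx, ty), p in zip(taxel_xy, pressures):
        img += p * np.exp(-((xs - tx) ** 2 + (ys - ty) ** 2) / (2 * sigma ** 2))
    return img

# Example: a hypothetical 4x4 taxel pad with only the corner taxel pressed
grid = [(i / 3, j / 3) for j in range(4) for i in range(4)]
press = [1.0 if (i, j) == (0, 0) else 0.0 for j in range(4) for i in range(4)]
img = tactile_image(grid, press)
```

Because every sensor is resampled onto the same `res`-by-`res` grid, the downstream tactile-servo controller can work on images without knowing the originating sensor's layout.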
Sponsor(s): Research supported by the Spanish Ministry of Economy, European FEDER funds, Valencia Regional Government and University of Alicante through the projects DPI2015-68087-R, PROMETEO/2013/085 and GRE 15-05. This work has also been supported by the French Government Research Program Investissements d'avenir, through the RobotEx Equipment of Excellence (ANR-10-EQPX-44) and the IMobS3 Laboratory of Excellence (ANR-10-LABX-16-01).
URI: http://hdl.handle.net/10045/66233
ISSN: 0921-8890 (Print) | 1872-793X (Online)
DOI: 10.1016/j.robot.2017.04.017
Language: eng
Type: info:eu-repo/semantics/article
Rights: © 2017 Elsevier B.V.
Peer reviewed: yes
Publisher's version: http://dx.doi.org/10.1016/j.robot.2017.04.017
Appears in collections: INV - AUROVA - Artículos de Revistas

Files in this item:
File | Description | Size | Format
2017_Delgado_etal_Robot&AutSys_final.pdf | Final version (restricted access) | 6.83 MB | Adobe PDF
2017_Delgado_etal_Robot&AutSys_accepted.pdf | 24-month embargo (open access: 12 May 2019) | 2.28 MB | Adobe PDF


All documents in RUA are protected by copyright. Some rights reserved.