Domestic waste detection and grasping points for robotic picking up

Title: Domestic waste detection and grasping points for robotic picking up
Authors: Gea, Víctor de | Puente Méndez, Santiago T. | Gil, Pablo
Research Group/s: Automática, Robótica y Visión Artificial
Center, Department or Service: Universidad de Alicante. Departamento de Física, Ingeniería de Sistemas y Teoría de la Señal | Universidad de Alicante. Instituto Universitario de Investigación Informática
Keywords: Robotic Grasping | Object Detection | 3D Points Grasping | Domestic waste | Mask-RCNN | Geograsp | Deep learning
Knowledge Area: Ingeniería de Sistemas y Automática
Issue Date: 31-May-2021
Abstract: This paper presents an AI system applied to the detection and robotic grasping of domestic waste. The experimental setup is based on a parameter study to train a deep-learning network based on Mask-RCNN to perform waste detection in indoor and outdoor environments, using five different classes and generating a new waste dataset. Initially, the AI system obtains the RGBD data of the environment, followed by the detection of objects using the neural network. Later, the 3D object shape is computed using the network result and the depth channel. Finally, the shape is used to compute grasping points for a robot arm with a two-finger gripper. The objective is to classify the waste into groups to improve a recycling strategy.
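The pipeline summarized in the abstract (detection mask + depth channel → 3D object shape → grasp computation) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the back-projection uses assumed pinhole intrinsics (`fx`, `fy`, `cx`, `cy`), and `grasp_points` is a toy stand-in for GeoGrasp that simply picks contact candidates at the extremes of the cloud's principal axis.

```python
import numpy as np

def mask_to_points(mask, depth, fx, fy, cx, cy):
    """Back-project the depth pixels selected by a detection mask
    into 3D camera coordinates (pinhole model, assumed intrinsics)."""
    vs, us = np.nonzero(mask)          # pixel rows/cols inside the mask
    zs = depth[vs, us]
    valid = zs > 0                     # discard missing depth readings
    us, vs, zs = us[valid], vs[valid], zs[valid]
    xs = (us - cx) * zs / fx
    ys = (vs - cy) * zs / fy
    return np.stack([xs, ys, zs], axis=1)

def grasp_points(points):
    """Toy placeholder for GeoGrasp: return the pair of extreme points
    along the point cloud's principal axis as two-finger contacts."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    proj = centered @ vt[0]            # projection on the principal axis
    return points[np.argmin(proj)], points[np.argmax(proj)]
```

In the paper's actual system, the mask would come from the trained Mask-RCNN and the grasp from GeoGrasp on the segmented cloud; the sketch only fixes the data flow between those stages.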
Sponsor: This research was funded by the Spanish Government through the project RTI2018-094279-B-I00. In addition, computer facilities were provided by the Valencian Government and FEDER through the IDIFEFER/2020/003.
Language: eng
Type: info:eu-repo/semantics/conferenceObject
Rights: © The authors
Peer Review: no
Appears in Collections:INV - AUROVA - Comunicaciones a Congresos Internacionales

Files in This Item:
File: 2105.06825.pdf
Size: 440,72 kB
Format: Adobe PDF

Items in RUA are protected by copyright, with all rights reserved, unless otherwise indicated.