UnrealROX: an extremely photorealistic virtual reality environment for robotics simulations and synthetic data generation
Please use this identifier to cite or link to this item:
http://hdl.handle.net/10045/106347
Title: | UnrealROX: an extremely photorealistic virtual reality environment for robotics simulations and synthetic data generation |
---|---|
Author(s): | Martínez González, Pablo | Oprea, Sergiu | Garcia-Garcia, Alberto | Jover-Álvarez, Álvaro | Orts-Escolano, Sergio | Garcia-Rodriguez, Jose |
Research group(s) or GITE: | Arquitecturas Inteligentes Aplicadas (AIA) | Robótica y Visión Tridimensional (RoViT) |
Center, Department or Service: | Universidad de Alicante. Departamento de Tecnología Informática y Computación | Universidad de Alicante. Departamento de Ciencia de la Computación e Inteligencia Artificial |
Keywords: | Robotics | Synthetic data | Grasping |
Knowledge area(s): | Computer Architecture and Technology | Computer Science and Artificial Intelligence |
Publication date: | Jun-2020 |
Publisher: | Springer Nature |
Bibliographic citation: | Virtual Reality. 2020, 24: 271-288. doi:10.1007/s10055-019-00399-5 |
Abstract: | Data-driven algorithms have surpassed traditional techniques in almost every aspect of robotic vision. Such algorithms need vast amounts of quality data to work properly after training. Gathering and annotating that sheer amount of data in the real world is a time-consuming and error-prone task, which limits both the scale and the quality of the resulting datasets. Synthetic data generation has become increasingly popular since it is faster to generate and can be annotated automatically. However, most current datasets and environments lack realism, interactions, and details from the real world. UnrealROX is an environment built on Unreal Engine 4 that aims to reduce that reality gap by leveraging hyperrealistic indoor scenes explored by robot agents that also interact with objects in a visually realistic manner in the simulated world. Photorealistic scenes and robots are rendered by Unreal Engine into a virtual reality headset that captures gaze, so that a human operator can move the robot and use controllers for the robotic hands; scene information is dumped on a per-frame basis so that it can be reproduced offline to generate raw data and ground-truth annotations. This virtual reality environment enables robotic vision researchers to generate realistic and visually plausible data with full ground truth for a wide variety of problems such as class and instance semantic segmentation, object detection, depth estimation, visual grasping, and navigation. |
Sponsor(s): | This work has been funded by the Spanish Government grant TIN2016-76515-R for the COMBAHO project, supported with FEDER funds. It has also been supported by three Spanish national grants for Ph.D. studies (FPU15/04516, FPU17/00166, and ACIF/2018/197), by the University of Alicante project GRE16-19, and by the Valencian Government project GV/2018/022. |
URI: | http://hdl.handle.net/10045/106347 |
ISSN: | 1359-4338 (Print) | 1434-9957 (Online) |
DOI: | 10.1007/s10055-019-00399-5 |
Language: | eng |
Type: | info:eu-repo/semantics/article |
Rights: | © Springer-Verlag London Ltd., part of Springer Nature 2019 |
Peer reviewed: | yes |
Publisher's version: | https://doi.org/10.1007/s10055-019-00399-5 |
Appears in collections: | INV - AIA - Artículos de Revistas | INV - RoViT - Artículos de Revistas |
Files in this item:
File | Description | Size | Format |
---|---|---|---|---|
Martinez-Gonzalez_etal_2020_VirtualReality_final.pdf | Final version (restricted access) | 3.2 MB | Adobe PDF |
Martinez-Gonzalez_etal_2020_VirtualReality_preprint.pdf | Preprint (open access) | 7.24 MB | Adobe PDF |
All documents in RUA are protected by copyright. Some rights reserved.
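The abstract describes dumping scene information on a per-frame basis so a recorded session can be replayed offline to generate raw data and ground-truth annotations. A minimal sketch of that idea follows; the record format, field names, and helpers (`dump_frame`, `replay`) are hypothetical illustrations, not the actual UnrealROX implementation.

```python
import json

def dump_frame(path, frame_idx, camera_pose, object_poses):
    """Append one frame's scene state as a JSON line.

    camera_pose: hypothetical 6-DoF pose, e.g. [x, y, z, pitch, yaw, roll].
    object_poses: dict mapping object name -> 6-DoF pose.
    """
    record = {
        "frame": frame_idx,
        "camera": camera_pose,
        "objects": object_poses,
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

def replay(path):
    """Read the dumped frames back in order for offline reproduction."""
    with open(path) as f:
        return [json.loads(line) for line in f]
```

Recording only poses per frame keeps the live capture lightweight; the expensive rendering of raw images and annotations can then happen offline by re-posing the scene from each record.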