Portable 3D laser-camera calibration system with color fusion for SLAM

Please use this identifier to cite or link to this item: http://hdl.handle.net/10045/34241
Item information
Title: Portable 3D laser-camera calibration system with color fusion for SLAM
Author(s): Navarrete, Javier | Viejo Hernando, Diego | Cazorla, Miguel
Research group(s): Robótica y Visión Tridimensional (RoViT)
Center, Department or Service: Universidad de Alicante. Departamento de Ciencia de la Computación e Inteligencia Artificial | Universidad de Alicante. Instituto Universitario de Investigación Informática
Keywords: 2D-3D calibration | RGB-D information | Color fusion | SLAM
Knowledge area(s): Computer Science and Artificial Intelligence
Publication date: 1 March 2013
Publisher: AUSMT
Bibliographic citation: International Journal of Automation and Smart Technology (AUSMT). 2013, 3(1): 29-35. doi:10.5875/ausmt.v3i1.163
Abstract: Nowadays, RGB-D sensors have attracted a great deal of research in computer vision and robotics. Sensors of this kind, such as the Kinect, provide 3D data together with color information. However, their working range is limited to less than 10 meters, making them unsuitable for some robotics applications, such as outdoor mapping. In these environments, 3D lasers, with working ranges of 20-80 meters, are a better choice, but 3D lasers do not usually provide color information. A simple 2D camera can be used to add color to the point cloud, but a calibration process between camera and laser is required. In this paper we present a portable calibration system that calibrates any conventional camera with a 3D laser in order to assign color information to the 3D points obtained, so that laser precision and color information can be exploited simultaneously. Unlike other techniques that rely on a three-dimensional body of known dimensions during calibration, this system is highly portable because it uses small catadioptrics that can be placed easily in the environment. We use our calibration system in a 3D mapping pipeline, including Simultaneous Localization and Mapping (SLAM), to obtain a colored 3D map that can be used in different tasks. We also show that an additional problem arises: the information captured by 2D cameras changes with lighting conditions, so when 3D point clouds from two different views are merged, several points in a given neighborhood may carry different color information. A new color fusion method is presented that produces correctly colored maps. The system is tested by applying it to 3D reconstruction.
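The paper's own calibration and fusion procedures are detailed in the article itself; purely as an illustration of the colorization step the abstract describes (assigning camera color to laser points once calibration is known), here is a minimal sketch assuming a standard pinhole model. The function name, the intrinsic matrix K, and the laser-to-camera extrinsics R, t are hypothetical placeholders, not the authors' actual implementation.

```python
import numpy as np

def colorize_points(points, image, K, R, t):
    """Assign RGB colors from a calibrated 2D camera to 3D laser points.

    points : (N, 3) array of 3D points in the laser frame
    image  : (H, W, 3) RGB image from the calibrated camera
    K      : (3, 3) camera intrinsic matrix
    R, t   : rotation (3, 3) and translation (3,) from laser to camera frame
    """
    # Transform points from the laser frame into the camera frame.
    cam = points @ R.T + t
    # Only points in front of the camera can be projected.
    in_front = cam[:, 2] > 0
    # Pinhole projection: divide by depth to get pixel coordinates.
    proj = cam @ K.T
    uv = proj[:, :2] / proj[:, 2:3]
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    # Keep only projections that land inside the image bounds.
    h, w = image.shape[:2]
    valid = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    # Sample the image color at each valid projected pixel.
    colors = np.zeros((points.shape[0], 3), dtype=image.dtype)
    colors[valid] = image[v[valid], u[valid]]
    return colors, valid
```

When clouds from several views are merged, points colorized from different images can disagree under changing lighting; the paper's color fusion method addresses exactly that case.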
Sponsor(s): This work has been supported by project DPI2009-07144 from the Ministerio de Investigación, Ciencia e Innovación (Spain) and GRE10-35 from the Universidad de Alicante (Spain).
URI: http://hdl.handle.net/10045/34241
ISSN: 2223-9766
DOI: 10.5875/ausmt.v3i1.163
Language: English
Type: info:eu-repo/semantics/article
Rights: Copyright © 2013 International Journal of Automation and Smart Technology
Peer reviewed: yes
Publisher's version: http://dx.doi.org/10.5875/ausmt.v3i1.163
Appears in collections: INV - RoViT - Artículos de Revistas

Files in this item:
File | Description | Size | Format
ausmt.pdf | Preprint | 1.94 MB | Adobe PDF
2013_Navarrete_etal_AUSMT.pdf | Final version | 1.23 MB | Adobe PDF


All documents in RUA are protected by copyright. Some rights reserved.