Portable 3D laser-camera calibration system with color fusion for SLAM

Please use this identifier to cite or link to this item: http://hdl.handle.net/10045/34242
Item information
Title: Portable 3D laser-camera calibration system with color fusion for SLAM
Author(s): Navarrete, Javier | Viejo Hernando, Diego | Cazorla, Miguel
Research group(s): Robótica y Visión Tridimensional (RoViT)
Center, Department or Service: Universidad de Alicante. Departamento de Ciencia de la Computación e Inteligencia Artificial | Universidad de Alicante. Instituto Universitario de Investigación Informática
Keywords: 2D-3D calibration | RGB-D information | Color fusion | SLAM
Knowledge area(s): Computer Science and Artificial Intelligence
Publication date: 29-Aug-2012
Abstract: Nowadays, RGB-D sensors have attracted a great deal of research in computer vision and robotics. These kinds of sensors, like the Kinect, provide 3D data together with color information. However, their working range is limited to less than 10 meters, making them useless in some robotics applications, such as outdoor mapping. In these environments, 3D lasers, with working ranges of 20-80 meters, are a better choice. But 3D lasers do not usually provide color information. A simple 2D camera can be used to add color information to the point cloud, but a calibration process between camera and laser must be carried out first. In this paper we present a portable calibration system to calibrate any traditional camera with a 3D laser in order to assign color information to the 3D points obtained. Thus, we can exploit the laser's precision and simultaneously make use of color information. Unlike other techniques that use a three-dimensional body of known dimensions in the calibration process, this system is highly portable because it relies on small catadioptrics that can be placed easily in the environment. We use our calibration system in a 3D mapping pipeline, including Simultaneous Localization and Mapping (SLAM), in order to obtain a 3D colored map that can be used in different tasks. We show that an additional problem arises: 2D camera information changes with lighting conditions, so when we merge 3D point clouds from two different views, several points in a given neighborhood can have different color information. A new method for color fusion is presented, yielding correctly colored maps. The system is tested by applying it to 3D reconstruction.
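The core idea described in the abstract — once camera and laser are calibrated, each 3D laser point is projected into the 2D image to pick up its color — can be sketched as a standard pinhole projection. This is a minimal illustrative sketch, not the authors' actual calibration procedure; the intrinsic matrix `K` and the extrinsic rotation `R` and translation `t` are assumed to come from the calibration step, and the function name `colorize_points` is hypothetical.

```python
import numpy as np

def colorize_points(points, image, K, R, t):
    """Assign RGB colors to 3D laser points by projecting them into a
    calibrated 2D camera image (pinhole model).

    points: (N, 3) array of 3D points in the laser frame.
    image:  (H, W, 3) color image.
    K:      (3, 3) camera intrinsic matrix.
    R, t:   laser-to-camera extrinsics from the calibration step.
    Returns an (N, 3) color array; points that do not project into the
    image keep a zero (black) placeholder color.
    """
    # Transform points from the laser frame to the camera frame.
    cam = (R @ points.T + t.reshape(3, 1)).T
    # Keep only points in front of the camera (positive depth).
    valid = cam[:, 2] > 0
    # Perspective projection: homogeneous pixel coords, then divide by depth.
    uvw = (K @ cam[valid].T).T
    uv = np.round(uvw[:, :2] / uvw[:, 2:3]).astype(int)
    # Discard projections that fall outside the image bounds.
    h, w = image.shape[:2]
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    colors = np.zeros((len(points), 3), dtype=image.dtype)
    idx = np.flatnonzero(valid)[inside]
    colors[idx] = image[uv[inside, 1], uv[inside, 0]]  # row = v, col = u
    return colors
```

When several views are merged, neighboring points colored from images taken under different lighting can disagree; the color fusion method mentioned in the abstract resolves such conflicts so the final SLAM map is consistently colored.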
Description: Paper submitted to the 43rd International Symposium on Robotics (ISR2012), Taipei, Taiwan, Aug. 29-31, 2012.
Sponsor(s): This work has been supported by project DPI2009-07144 from Ministerio de Investigación, Ciencia e Innovación (Spain) and GRE10-35 from Universidad de Alicante (Spain).
URI: http://hdl.handle.net/10045/34242
Language: eng
Type: info:eu-repo/semantics/conferenceObject
Rights: Creative Commons Attribution 4.0 License
Peer reviewed: yes
Appears in collections: INV - RoViT - Comunicaciones a Congresos, Conferencias, etc.

Files in this item:
File | Description | Size | Format
ISR 2012 Final Paper Template.pdf | | 2,98 MB | Adobe PDF


This item is licensed under a Creative Commons License