Constrained self-organizing feature map to preserve feature extraction topology

Please use this identifier to cite or link to this item: http://hdl.handle.net/10045/55746
Item information
Title: Constrained self-organizing feature map to preserve feature extraction topology
Author(s): Azorin-Lopez, Jorge | Saval-Calvo, Marcelo | Fuster-Guilló, Andrés | Garcia-Rodriguez, Jose | Mora, Higinio
Research group(s) or GITE: Informática Industrial y Redes de Computadores
Center, Department or Service: Universidad de Alicante. Departamento de Tecnología Informática y Computación
Keywords: Self-organizing feature map | Topology preservation | Human behaviour analysis
Knowledge area(s): Arquitectura y Tecnología de Computadores
Publication date: 28-May-2016
Publisher: Springer London
Bibliographic citation: Neural Computing and Applications. 2017, 28(Suppl. 1): 439-459. doi:10.1007/s00521-016-2346-0
Abstract: In many classification problems, it is necessary to consider the specific location within an n-dimensional space from which features have been calculated. For example, considering the location of features extracted from specific areas of a two-dimensional space, such as an image, could improve the understanding of a scene for a video surveillance system. In the same way, identical features extracted from different locations could mean different actions for a 3D HCI system. In this paper, we present a self-organizing feature map able to preserve the topology of the locations of the n-dimensional space from which the feature vectors have been extracted. The main contribution is to implicitly preserve the topology of the original space, since considering the locations of the extracted features and their topology could ease the solution to certain problems. Specifically, the paper proposes the n-dimensional constrained self-organizing map preserving the input topology (nD-SOM-PINT). Features from adjacent areas of the n-dimensional space used to extract the feature vectors are explicitly mapped to adjacent areas of the nD-SOM-PINT, constraining the neural network structure and learning. As a case study, the neural network has been instantiated to represent and classify trajectory features extracted from a sequence of images at a high level of semantic understanding. Experiments have been thoroughly carried out using the CAVIAR datasets (Corridor, Frontal and Inria), taking into account the global behaviour of an individual, in order to validate the ability to preserve the topology of the two-dimensional space and obtain high-performance trajectory classification, in contrast to approaches that do not consider the location of features. Moreover, a brief example has been included to validate the nD-SOM-PINT proposal in a domain other than individual trajectories. Results confirm the high accuracy of the nD-SOM-PINT, outperforming previous methods aimed at classifying the same datasets.
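To make the constrained-learning idea in the abstract concrete, the following is a minimal, hypothetical Python sketch, not the paper's exact nD-SOM-PINT formulation: it assumes a 2D grid of neurons aligned with the cells of the space from which features are extracted, and restricts the best-matching-unit (BMU) search to a small window around a feature's source cell, so that features from adjacent input areas land on adjacent neurons by construction. The class name, the windowed-BMU constraint and all parameters are illustrative assumptions.

import numpy as np

class ConstrainedSOM:
    """Illustrative location-constrained SOM (assumed, simplified variant)."""

    def __init__(self, grid_h, grid_w, dim, seed=0):
        # One neuron per cell of the 2D space the features come from.
        rng = np.random.default_rng(seed)
        self.weights = rng.normal(size=(grid_h, grid_w, dim))
        self.grid_h, self.grid_w = grid_h, grid_w

    def _bmu(self, x, cell, window=1):
        # Constrain the BMU search to neurons within `window` grid cells
        # of the location `cell` the feature vector was extracted from.
        r0, r1 = max(cell[0] - window, 0), min(cell[0] + window + 1, self.grid_h)
        c0, c1 = max(cell[1] - window, 0), min(cell[1] + window + 1, self.grid_w)
        sub = self.weights[r0:r1, c0:c1]
        d = np.linalg.norm(sub - x, axis=-1)
        i, j = np.unravel_index(np.argmin(d), d.shape)
        return r0 + i, c0 + j

    def train_step(self, x, cell, lr=0.1, sigma=1.0):
        # Standard SOM update, but centred on the location-constrained BMU.
        bi, bj = self._bmu(x, cell)
        rows, cols = np.mgrid[0:self.grid_h, 0:self.grid_w]
        dist2 = (rows - bi) ** 2 + (cols - bj) ** 2
        h = np.exp(-dist2 / (2 * sigma ** 2))  # Gaussian neighbourhood
        self.weights += lr * h[..., None] * (x - self.weights)

Under these assumptions, calling train_step(x, cell=(2, 3)) nudges the neurons around grid position (2, 3) toward the feature vector x, keeping the map's layout aligned with the input regions the features came from.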
Sponsor(s): This study was supported in part by the University of Alicante, the Valencian Government and the Spanish Government under grants GRE11-01, GV/2013/005 and DPI2013-40534-R.
URI: http://hdl.handle.net/10045/55746
ISSN: 0941-0643 (Print) | 1433-3058 (Online)
DOI: 10.1007/s00521-016-2346-0
Language: eng
Type: info:eu-repo/semantics/article
Rights: © The Natural Computing Applications Forum 2016. The final publication is available at Springer via http://dx.doi.org/10.1007/s00521-016-2346-0
Peer reviewed: yes
Publisher's version: http://dx.doi.org/10.1007/s00521-016-2346-0
Appears in collections: INV - I2RC - Artículos de Revistas | INV - AIA - Artículos de Revistas

Files in this item:
File | Description | Size | Format
2016_Azorin_etal_NeuralComput&Applic_final.pdf | Final version (restricted access) | 6.88 MB | Adobe PDF
2016_Azorin_etal_NeuralComput&Applic_preprint.pdf | Preprint (open access) | 2.79 MB | Adobe PDF


All documents in RUA are protected by copyright. Some rights reserved.