Constrained self-organizing feature map to preserve feature extraction topology
Please use this identifier to cite or link to this item
http://hdl.handle.net/10045/55746
Title: Constrained self-organizing feature map to preserve feature extraction topology
Authors: Azorin-Lopez, Jorge; Saval-Calvo, Marcelo; Fuster-Guilló, Andrés; Garcia-Rodriguez, Jose; Mora, Higinio
Research group or GITE: Informática Industrial y Redes de Computadores
Center, department or service: Universidad de Alicante. Departamento de Tecnología Informática y Computación
Keywords: Self-organizing feature map; Topology preservation; Human behaviour analysis
Knowledge areas: Arquitectura y Tecnología de Computadores
Publication date: 28-May-2016
Publisher: Springer London
Bibliographic citation: Neural Computing and Applications. 2017, 28(Suppl. 1): 439-459. doi:10.1007/s00521-016-2346-0
Abstract: In many classification problems, it is necessary to consider the specific location in an n-dimensional space from which features have been calculated. For example, considering the location of features extracted from specific areas of a two-dimensional space, such as an image, could improve the understanding of a scene for a video surveillance system. In the same way, the same features extracted from different locations could mean different actions for a 3D HCI system. In this paper, we present a self-organizing feature map able to preserve the topology of the locations in an n-dimensional space from which the feature vectors have been extracted. The main contribution is to implicitly preserve the topology of the original space, because considering the locations of the extracted features and their topology could ease the solution to certain problems. Specifically, the paper proposes the n-dimensional constrained self-organizing map preserving the input topology (nD-SOM-PINT). Features in adjacent areas of the n-dimensional space used to extract the feature vectors are explicitly in adjacent areas of the nD-SOM-PINT, constraining the neural network structure and learning. As a case study, the neural network has been instantiated to represent and classify features, as trajectories extracted from a sequence of images, at a high level of semantic understanding. Experiments have been thoroughly carried out using the CAVIAR datasets (Corridor, Frontal and Inria), taking into account the global behaviour of an individual, in order to validate the ability to preserve the topology of the two-dimensional space and obtain high-performance trajectory classification, in contrast to not considering the location of features. Moreover, a brief example has been included to validate the nD-SOM-PINT proposal in a domain other than individual trajectories. Results confirm the high accuracy of the nD-SOM-PINT, outperforming previous methods aimed at classifying the same datasets.
Sponsors: This study was supported in part by the University of Alicante, Valencian Government and Spanish Government under grants GRE11-01, GV/2013/005 and DPI2013-40534-R.
URI: http://hdl.handle.net/10045/55746
ISSN: 0941-0643 (Print); 1433-3058 (Online)
DOI: 10.1007/s00521-016-2346-0
Language: eng
Type: info:eu-repo/semantics/article
Rights: © The Natural Computing Applications Forum 2016. The final publication is available at Springer via http://dx.doi.org/10.1007/s00521-016-2346-0
Peer reviewed: yes
Publisher version: http://dx.doi.org/10.1007/s00521-016-2346-0
Appears in collections: INV - I2RC - Artículos de Revistas; INV - AIA - Artículos de Revistas
Files in this item:

File | Description | Size | Format
---|---|---|---
2016_Azorin_etal_NeuralComput&Applic_final.pdf | Final version (restricted access) | 6.88 MB | Adobe PDF
2016_Azorin_etal_NeuralComput&Applic_preprint.pdf | Preprint (open access) | 2.79 MB | Adobe PDF
All documents deposited in RUA are protected by copyright. Some rights reserved.
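To illustrate the constraint described in the abstract, the following is a minimal sketch of a self-organizing map whose best-matching unit is fixed by the spatial location a feature vector was extracted from, so adjacent regions of the input space always map to adjacent map units. This is an assumption-laden toy, not the authors' exact nD-SOM-PINT algorithm; the grid size, region mapping and learning parameters are hypothetical.

```python
import numpy as np

# Toy sketch of a location-constrained SOM (NOT the published nD-SOM-PINT):
# one map unit per spatial region of a 100x100 image, so the map's grid
# topology mirrors the image topology by construction.

rng = np.random.default_rng(0)

GRID = 4                                  # 4x4 map, one unit per image region
DIM = 3                                   # feature-vector dimensionality
weights = rng.random((GRID, GRID, DIM))   # map unit weight vectors

def region_of(loc, img_size=100):
    """Map an (x, y) image location to the grid cell that owns it."""
    return (min(int(loc[0] * GRID / img_size), GRID - 1),
            min(int(loc[1] * GRID / img_size), GRID - 1))

def train_step(x, loc, lr=0.1, radius=1):
    """Pull units near the cell owning `loc` toward feature vector x.

    Unlike a standard SOM, the winner is chosen from the location of the
    feature, not by weight-space distance, which preserves input topology.
    """
    bi, bj = region_of(loc)
    for i in range(GRID):
        for j in range(GRID):
            d = max(abs(i - bi), abs(j - bj))   # Chebyshev grid distance
            if d <= radius:
                h = np.exp(-d ** 2 / 2.0)       # neighbourhood weighting
                weights[i, j] += lr * h * (x - weights[i, j])

# One toy update: a feature observed at image location (10, 90)
x = np.array([1.0, 0.0, 0.5])
train_step(x, (10, 90))
```

After training, an unseen feature vector can be classified by the unit owning its extraction location, so two identical feature vectors seen in different image regions land on different units, which is the behaviour the abstract motivates for trajectory analysis.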