Adaptive Human Action Recognition With an Evolving Bag of Key Poses

Please use this identifier to cite or link to this item: http://hdl.handle.net/10045/37068
Item information
Title: Adaptive Human Action Recognition With an Evolving Bag of Key Poses
Authors: Chaaraoui, Alexandros Andre | Flórez-Revuelta, Francisco
Research Group/s: Informática Industrial y Redes de Computadores
Center, Department or Service: Universidad de Alicante. Departamento de Tecnología Informática y Computación
Keywords: Evolutionary computing and genetic algorithms | Feature evaluation and selection | Human computer interaction | Vision and Scene Understanding
Knowledge Area: Arquitectura y Tecnología de Computadores
Issue Date: 7-Apr-2014
Publisher: IEEE
Citation: IEEE Transactions on Autonomous Mental Development. 2014. doi:10.1109/TAMD.2014.2315676
Abstract: Vision-based human action recognition enables the detection and understanding of meaningful human motion, which makes advanced human-computer interaction possible, among other applications. In dynamic environments, adaptive methods are required to support changing scenario characteristics. Specifically, in human-robot interaction, smooth interaction between humans and robots is only possible if robots are able to evolve and adapt to the changing nature of the scenarios. In this paper, an adaptive vision-based human action recognition method is proposed. By means of an evolutionary optimisation method, adaptive and incremental learning of human actions is supported. Through an evolving bag of key poses, which models the learnt actions over time, the current learning memory is developed to recognise an increasing number of actions or actors. The evolutionary method selects the optimal subset of training instances, features and parameter values for each learning phase, and handles the evolution of the model. The experiments show that our proposal successfully adapts to new actions or actors by rearranging the learnt model. Stable and accurate results have been obtained on four publicly available RGB and RGB-D datasets, demonstrating the method's robustness and applicability. (An illustrative sketch of the instance-selection step follows the item record below.)
Sponsor: This work has been partially supported by the European Commission under project “caring4U - A study on people activity in private spaces: towards a multisensor network that meets privacy requirements” (PIEF-GA-2010-274649) and by the Spanish Ministry of Science and Innovation under project “Sistema de visión para la monitorización de la actividad de la vida diaria en el hogar” (TIN2010-20510-C04-02). Alexandros Andre Chaaraoui acknowledges financial support by the Conselleria d’Educació, Formació i Ocupació of the Generalitat Valenciana (fellowship ACIF/2011/160).
URI: http://hdl.handle.net/10045/37068
ISSN: 1943-0604 (Print) | 1943-0612 (Online)
DOI: 10.1109/TAMD.2014.2315676
Language: eng
Type: info:eu-repo/semantics/article
Rights: © Copyright 2014 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/ republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
Peer Review: yes
Publisher version: http://dx.doi.org/10.1109/TAMD.2014.2315676
Appears in Collections:INV - I2RC - Artículos de Revistas
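
Illustrative sketch of the evolutionary instance selection described in the abstract: the code below is a minimal, hedged example of what selecting an optimal subset of training instances for a bag-of-key-poses model could look like, assuming per-frame pose descriptors, per-class k-means key poses, a nearest-key-pose voting classifier, and a plain generational genetic algorithm over binary selection masks. Every function name, data shape and hyperparameter here (build_key_poses, classify, evolve, pop_size, p_mut, the toy data) is an assumption made for illustration; this is not the authors' implementation.

# Hedged, illustrative sketch only; see the note above.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)


def build_key_poses(sequences, labels, mask, k=4):
    """Cluster the frames of the selected sequences, per class, into at most k key poses."""
    centres, centre_labels = [], []
    for c in np.unique(labels):
        frames = [s for s, y, m in zip(sequences, labels, mask) if y == c and m]
        if not frames:                      # a class lost all its instances: invalid model
            return None, None
        frames = np.vstack(frames)
        k_c = min(k, len(frames))
        km = KMeans(n_clusters=k_c, n_init=5, random_state=0).fit(frames)
        centres.append(km.cluster_centers_)
        centre_labels.extend([c] * k_c)
    return np.vstack(centres), np.array(centre_labels)


def classify(sequence, centres, centre_labels):
    """Assign each frame to its nearest key pose and majority-vote over the pose labels."""
    dists = np.linalg.norm(sequence[:, None, :] - centres[None, :, :], axis=2)
    votes = centre_labels[dists.argmin(axis=1)]
    return np.bincount(votes).argmax()


def fitness(mask, train_seqs, train_labels, val_seqs, val_labels):
    """Accuracy on the validation split for the model built from the selected instances."""
    centres, centre_labels = build_key_poses(train_seqs, train_labels, mask)
    if centres is None:
        return 0.0
    preds = [classify(s, centres, centre_labels) for s in val_seqs]
    return float(np.mean(np.array(preds) == val_labels))


def evolve(train_seqs, train_labels, val_seqs, val_labels,
           pop_size=12, generations=15, p_mut=0.1):
    """Simple generational GA over binary instance-selection masks."""
    n = len(train_seqs)
    pop = rng.integers(0, 2, size=(pop_size, n)).astype(bool)
    for _ in range(generations):
        scores = np.array([fitness(m, train_seqs, train_labels, val_seqs, val_labels)
                           for m in pop])
        elite = pop[np.argsort(scores)[::-1][: pop_size // 2]]    # keep the better half
        children = []
        while len(children) < pop_size - len(elite):
            a = elite[rng.integers(len(elite))]
            b = elite[rng.integers(len(elite))]
            cut = rng.integers(1, n)                              # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child ^= rng.random(n) < p_mut                        # bit-flip mutation
            children.append(child)
        pop = np.vstack([elite, np.array(children)])
    scores = np.array([fitness(m, train_seqs, train_labels, val_seqs, val_labels)
                       for m in pop])
    return pop[scores.argmax()], scores.max()


if __name__ == "__main__":
    # Toy data: 3 action classes, each sequence is 20 frames of a 10-D pose descriptor.
    def make_split(per_class):
        seqs, labels = [], []
        for c in range(3):
            for _ in range(per_class):
                seqs.append(rng.normal(loc=c, scale=0.5, size=(20, 10)))
                labels.append(c)
        return seqs, np.array(labels)

    train_seqs, train_labels = make_split(8)
    val_seqs, val_labels = make_split(4)
    best_mask, best_acc = evolve(train_seqs, train_labels, val_seqs, val_labels)
    print(f"selected {best_mask.sum()}/{len(best_mask)} training sequences, "
          f"validation accuracy {best_acc:.2f}")

In the paper's setting, as the abstract indicates, the chromosome would additionally encode feature and parameter choices, and the optimisation would be rerun incrementally as new actions or actors are introduced, which is what the evolving bag of key poses supports.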

Files in This Item:
File | Description | Size | Format
2014_Chaaraoui_Florez_IEEE-TAMD.pdf | Revised version (open access) | 7.41 MB | Adobe PDF


Items in RUA are protected by copyright, with all rights reserved, unless otherwise indicated.