A discussion on the validation tests employed to compare human action recognition methods using the MSR Action3D dataset

Please use this identifier to cite or link to this item: http://hdl.handle.net/10045/39889
Title: A discussion on the validation tests employed to compare human action recognition methods using the MSR Action3D dataset
Authors: Padilla López, José Ramón | Chaaraoui, Alexandros Andre | Flórez-Revuelta, Francisco
Research Group/s: Informática Industrial y Redes de Computadores | Domótica y Ambientes Inteligentes
Center, Department or Service: Universidad de Alicante. Departamento de Tecnología Informática y Computación
Keywords: Human action recognition | RGB-D Devices | MSR Action3D | Validation | Kinect | Depth sensors
Knowledge Area: Arquitectura y Tecnología de Computadores
Date Created: Jun-2014
Issue Date: 29-Jul-2014
Abstract: This paper aims to determine the best human action recognition method based on features extracted from RGB-D devices, such as the Microsoft Kinect. A review has been performed of all the papers that reference MSR Action3D, the most widely used dataset that includes depth information acquired from an RGB-D device. We found that the validation method differs from work to work, so a direct comparison among works cannot be made. Nevertheless, almost all the works present their results in comparison with others without taking this issue into account. Therefore, we present separate rankings according to the validation methodology used, in order to clarify the existing confusion.
URI: http://arxiv.org/abs/1407.7390 | http://hdl.handle.net/10045/39889
Language: eng
Type: info:eu-repo/semantics/workingPaper
Rights: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License
Peer Review: no
Appears in Collections:INV - DAI - Informes Técnicos

Files in This Item:
File: discussion-validation-tests-20140728.pdf | Size: 205,96 kB | Format: Adobe PDF


This item is licensed under a Creative Commons License.