An sEMG-Controlled 3D Game for Rehabilitation Therapies: Real-Time Time Hand Gesture Recognition Using Deep Learning Techniques
Please use this identifier to cite or link this item:
http://hdl.handle.net/10045/110347
Title: An sEMG-Controlled 3D Game for Rehabilitation Therapies: Real-Time Time Hand Gesture Recognition Using Deep Learning Techniques
Author(s): Nasri, Nadia; Orts-Escolano, Sergio; Cazorla, Miguel
Research group(s): Robótica y Visión Tridimensional (RoViT)
Center, Department or Service: Universidad de Alicante. Departamento de Ciencia de la Computación e Inteligencia Artificial
Keywords: Electromyography sensor; Deep learning; Hand gesture recognition; Virtual reality; Rehabilitation
Field(s) of knowledge: Ciencia de la Computación e Inteligencia Artificial
Publication date: 12-Nov-2020
Publisher: MDPI
Bibliographic citation: Nasri N, Orts-Escolano S, Cazorla M. An sEMG-Controlled 3D Game for Rehabilitation Therapies: Real-Time Time Hand Gesture Recognition Using Deep Learning Techniques. Sensors. 2020; 20(22):6451. https://doi.org/10.3390/s20226451
Abstract: In recent years, advances in Artificial Intelligence (AI) have played an important role in human well-being, in particular by enabling novel forms of human-computer interaction for people with a disability. In this paper, we propose an sEMG-controlled 3D game that leverages a deep learning-based architecture for real-time gesture recognition. The 3D game experience developed in the study focuses on rehabilitation exercises, allowing individuals with certain disabilities to use low-cost sEMG sensors to control the game experience. For this purpose, we acquired a novel dataset of seven gestures using the Myo armband device, which we used to train the proposed deep learning model. The captured signals served as input to a Conv-GRU architecture that classifies the gestures. Further, we ran a live system with the participation of different individuals and analyzed the neural network's classification of hand gestures. Finally, we evaluated our system over 20 rounds with new participants and analyzed the results in a user study.
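The abstract describes a Conv-GRU pipeline: windowed multi-channel sEMG signals pass through a convolutional feature extractor, then a GRU summarizes the sequence, and a final layer classifies one of seven gestures. The paper's exact layer sizes and window length are not given in this record, so the sketch below is a minimal, hypothetical NumPy forward pass; the dimensions (8 Myo electrode channels, a 50-sample window, 16 convolutional filters, a 32-unit GRU) are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_relu(x, w, b):
    """Valid 1D convolution over time. x: (T, C_in), w: (K, C_in, C_out)."""
    T = x.shape[0]
    K, _, C_out = w.shape
    out = np.zeros((T - K + 1, C_out))
    for t in range(T - K + 1):
        out[t] = np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)  # ReLU activation

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_gru(f_in, h, rng):
    """Weights for the update (z), reset (r), and candidate (h) gates."""
    p = {}
    for g in ("z", "r", "h"):
        p["W" + g] = rng.normal(0.0, 0.1, (f_in, h))
        p["U" + g] = rng.normal(0.0, 0.1, (h, h))
        p["b" + g] = np.zeros(h)
    return p

def gru_last_hidden(x, p):
    """Run a GRU over the feature sequence x: (T, F); return final hidden state."""
    h = np.zeros(p["Wz"].shape[1])
    for xt in x:
        z = sigmoid(xt @ p["Wz"] + h @ p["Uz"] + p["bz"])          # update gate
        r = sigmoid(xt @ p["Wr"] + h @ p["Ur"] + p["br"])          # reset gate
        h_cand = np.tanh(xt @ p["Wh"] + (r * h) @ p["Uh"] + p["bh"])
        h = (1.0 - z) * h + z * h_cand
    return h

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Illustrative dimensions: 8 sEMG channels, 50-sample window, 7 gesture classes.
T, C_IN, C_CONV, H, N_CLASSES, K = 50, 8, 16, 32, 7, 5

w_conv = rng.normal(0.0, 0.1, (K, C_IN, C_CONV))
b_conv = np.zeros(C_CONV)
gru = init_gru(C_CONV, H, rng)
W_out = rng.normal(0.0, 0.1, (H, N_CLASSES))
b_out = np.zeros(N_CLASSES)

window = rng.normal(size=(T, C_IN))          # one windowed sEMG segment (stand-in data)
feats = conv1d_relu(window, w_conv, b_conv)  # (46, 16) feature sequence
h_last = gru_last_hidden(feats, gru)         # (32,) sequence summary
probs = softmax(h_last @ W_out + b_out)      # probability over the 7 gestures
pred = int(np.argmax(probs))
```

In the live system described in the abstract, a prediction like `pred` would be mapped to a game control action each time a new window of sensor data arrives.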
Sponsor(s): This work was supported by the Spanish Government grant PID2019-104818RB-I00, with FEDER funds.
URI: http://hdl.handle.net/10045/110347
ISSN: 1424-8220
DOI: 10.3390/s20226451
Language: eng
Type: info:eu-repo/semantics/article
Rights: © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Peer reviewed: yes
Publisher's version: https://doi.org/10.3390/s20226451
Appears in collections: INV - RoViT - Artículos de Revistas
Files in this item:

| File | Description | Size | Format |
|---|---|---|---|
| Nasri_etal_2020_Sensors.pdf | | 1.54 MB | Adobe PDF |
This item is licensed under a Creative Commons License.