The full text of this work is not available in the Repositorio Académico UPC due to restrictions imposed by the publisher. / In Peruvian Sign Language (PSL), recognition of static gestures has been proposed previously. However, to hold a conversation in sign language, it is also necessary to employ dynamic gestures. We propose a method to extract a feature vector for dynamic gestures of PSL. We collect a dataset of 288 video sequences of words expressed as dynamic gestures and define a workflow to process the hand keypoints, obtaining a feature vector for each video sequence with the support of a video summarization technique. We test the method with 9 neural networks, achieving average accuracies ranging from 80% to 90% under 10-fold cross-validation.
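The abstract's pipeline (select key frames via video summarization, flatten hand keypoints into one fixed-length vector per sequence, evaluate with 10-fold cross-validation) can be sketched roughly as below. This is a minimal illustration, not the authors' implementation: uniform frame sampling stands in for their unspecified summarization technique, and the assumption of 21 (x, y) keypoints per hand is hypothetical.

```python
import random

def summarize(frames, k=8):
    """Pick k key frames by uniform sampling — a hypothetical stand-in
    for the paper's video summarization step."""
    if len(frames) <= k:
        return frames
    step = len(frames) / k
    return [frames[int(i * step)] for i in range(k)]

def feature_vector(frames, k=8):
    """Flatten the (x, y) hand keypoints of the selected key frames
    into one fixed-length feature vector per video sequence."""
    vec = []
    for frame in summarize(frames, k):
        for (x, y) in frame:   # frame = list of hand keypoints
            vec.extend([x, y])
    return vec

def ten_fold_indices(n, seed=0):
    """Shuffle sequence indices and split them into 10 folds, so each
    fold serves once as the test set during cross-validation."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::10] for i in range(10)]
```

With 288 sequences, `ten_fold_indices(288)` yields 10 disjoint folds; each classifier would be trained on 9 folds and tested on the remaining one, and the reported accuracy averaged over the 10 runs.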
Identifier | oai:union.ndltd.org:PERUUPC/oai:repositorioacademico.upc.edu.pe:10757/656630 |
Date | 01 September 2020 |
Creators | Neyra-Gutierrez, Andre, Shiguihara-Juarez, Pedro |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Source Sets | Universidad Peruana de Ciencias Aplicadas (UPC) |
Language | English |
Detected Language | English |
Type | info:eu-repo/semantics/article |
Format | application/html |
Source | Universidad Peruana de Ciencias Aplicadas (UPC), Repositorio Académico - UPC, Proceedings of the 2020 IEEE 27th International Conference on Electronics, Electrical Engineering and Computing, INTERCON 2020 |
Rights | info:eu-repo/semantics/embargoedAccess |
Relation | https://ieeexplore.ieee.org/abstract/document/9220243 |