
Feature Extraction with Video Summarization of Dynamic Gestures for Peruvian Sign Language Recognition

The full text of this work is not available in the Repositorio Académico UPC due to restrictions imposed by the publisher. / In Peruvian Sign Language (PSL), recognition of static gestures has been addressed in earlier work. However, holding a conversation in sign language also requires dynamic gestures. We propose a method to extract a feature vector for dynamic gestures of PSL. We collect a dataset of 288 video sequences of words involving dynamic gestures and define a workflow that processes hand keypoints, obtaining a feature vector for each video sequence with the support of a video summarization technique. We test the method with 9 neural networks, achieving an average accuracy ranging from 80% to 90% under 10-fold cross-validation.
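
A minimal sketch of the workflow described above, assuming per-frame hand keypoints (21 (x, y) landmarks per hand, as produced by a detector such as MediaPipe Hands) and uniform keyframe sampling as a stand-in for the video summarization step; the synthetic data, the MLPClassifier, and every parameter value below are illustrative assumptions rather than the authors' implementation.

import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neural_network import MLPClassifier

N_KEYPOINTS = 21       # assumed: 21 (x, y) hand landmarks per frame
N_SUMMARY_FRAMES = 8   # assumed: keyframes kept per video by the summarization step

def summarize_video(keypoints: np.ndarray) -> np.ndarray:
    """Reduce a (frames, 21, 2) keypoint sequence to a fixed-length feature
    vector by keeping N_SUMMARY_FRAMES uniformly spaced frames (a simple
    placeholder for the paper's video summarization technique)."""
    idx = np.linspace(0, len(keypoints) - 1, N_SUMMARY_FRAMES).astype(int)
    return keypoints[idx].reshape(-1)          # shape: (N_SUMMARY_FRAMES * 21 * 2,)

def build_dataset(videos, labels):
    """videos: list of (frames, 21, 2) keypoint arrays, one per video sequence."""
    X = np.stack([summarize_video(v) for v in videos])
    return X, np.asarray(labels)

if __name__ == "__main__":
    # Synthetic stand-in for the 288-video PSL dataset mentioned in the abstract.
    rng = np.random.default_rng(0)
    videos = [rng.random((int(rng.integers(30, 90)), N_KEYPOINTS, 2)) for _ in range(288)]
    labels = np.arange(288) % 10               # assumed: 10 word classes, roughly balanced

    X, y = build_dataset(videos, labels)
    clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    scores = cross_val_score(clf, X, y, cv=cv)
    print(f"10-fold CV accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")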

Identifier: oai:union.ndltd.org:PERUUPC/oai:repositorioacademico.upc.edu.pe:10757/656630
Date: 01 September 2020
Creators: Neyra-Gutierrez, Andre; Shiguihara-Juarez, Pedro
Publisher: Institute of Electrical and Electronics Engineers Inc.
Source Sets: Universidad Peruana de Ciencias Aplicadas (UPC)
Language: English
Detected Language: English
Type: info:eu-repo/semantics/article
Format: application/html
Source: Universidad Peruana de Ciencias Aplicadas (UPC), Repositorio Académico - UPC, Proceedings of the 2020 IEEE 27th International Conference on Electronics, Electrical Engineering and Computing, INTERCON 2020
Rights: info:eu-repo/semantics/embargoedAccess
Relation: https://ieeexplore.ieee.org/abstract/document/9220243
