The full text of this work is not available in the UPC Academic Repository due to restrictions imposed by the publisher. / Peruvian Sign Language (PSL) recognition is approached as a classification problem. Previous work has employed 2D features derived from hand positions to tackle this problem. In this paper, we propose a method to construct a dataset consisting of 3D spatial positions of static gestures from the PSL alphabet, using the HTC Vive device and a well-known technique for extracting 21 hand keypoints to obtain a feature vector. A dataset of 35,400 gesture instances for PSL was constructed, and a novel data-extraction procedure was described. To validate the appropriateness of this dataset, four baseline classifiers were compared on the Peruvian Sign Language Recognition (PSLR) task, achieving an average F1 score of 99.32% in the best case. / Peer reviewed
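The abstract describes encoding each static gesture as the 3D positions of 21 hand keypoints (a 63-dimensional feature vector) and comparing baseline classifiers by F1 measure. A minimal, self-contained sketch of that pipeline is shown below; the nearest-centroid classifier, the synthetic two-class data, and all names here are illustrative assumptions, not the paper's actual classifiers or dataset.

```python
import random

random.seed(0)

def make_gesture(center):
    # Hypothetical gesture sample: 21 keypoints x (x, y, z) = 63 features,
    # jittered around a per-class center value.
    return [center + random.gauss(0, 0.05) for _ in range(63)]

# Two toy gesture classes standing in for letters of the PSL alphabet.
train = [(make_gesture(0.0), 0) for _ in range(50)] + \
        [(make_gesture(1.0), 1) for _ in range(50)]
test  = [(make_gesture(0.0), 0) for _ in range(20)] + \
        [(make_gesture(1.0), 1) for _ in range(20)]

# Nearest-centroid baseline: represent each class by its mean feature vector.
def centroid(vectors):
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(63)]

centroids = {c: centroid([x for x, y in train if y == c]) for c in (0, 1)}

def predict(x):
    # Assign the class whose centroid is closest in squared Euclidean distance.
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(x, centroids[c]))
    return min(centroids, key=dist)

def macro_f1(pairs):
    # Per-class F1 averaged over classes (macro average).
    f1s = []
    for c in (0, 1):
        tp = sum(1 for x, y in pairs if predict(x) == c and y == c)
        fp = sum(1 for x, y in pairs if predict(x) == c and y != c)
        fn = sum(1 for x, y in pairs if predict(x) != c and y == c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)

score = macro_f1(test)
```

On this well-separated toy data the baseline scores near-perfect F1; the paper's 99.32% average F1 refers to its own 35,400-instance dataset and its four (unspecified here) baseline classifiers.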
Identifier | oai:union.ndltd.org:PERUUPC/oai:repositorioacademico.upc.edu.pe:10757/656634 |
Date | 21 October 2020 |
Creators | Nurena-Jara, Roberto, Ramos-Carrion, Cristopher, Shiguihara-Juarez, Pedro |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Source Sets | Universidad Peruana de Ciencias Aplicadas (UPC) |
Language | English |
Detected Language | English |
Type | info:eu-repo/semantics/article |
Format | application/html |
Source | Proceedings of the 2020 IEEE Engineering International Research Conference, EIRCON 2020 |
Rights | info:eu-repo/semantics/embargoedAccess |
Relation | https://ieeexplore.ieee.org/document/9254019 |