
Data collection of 3D spatial features of gestures from static peruvian sign language alphabet for sign language recognition

The full text of this work is not available in the Repositorio Académico UPC due to restrictions imposed by the publisher. / Peruvian Sign Language (PSL) recognition is approached as a classification problem. Previous work has employed 2D features derived from hand positions to tackle this problem. In this paper, we propose a method to construct a dataset of 3D spatial positions of static gestures from the PSL alphabet, using the HTC Vive device and a well-known technique to extract 21 keypoints from the hand, yielding a feature vector. A dataset of 35,400 gesture instances for PSL was constructed, and a novel data-extraction procedure was described. To validate the dataset, four baseline classifiers were compared on the Peruvian Sign Language Recognition (PSLR) task, achieving an average F1 score of 99.32% in the best case. / Peer reviewed
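The pipeline the abstract describes (21 hand keypoints with 3D coordinates flattened into a feature vector, then fed to a baseline classifier) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the toy data, and the nearest-centroid classifier (a simple stand-in for the four unnamed baselines) are all assumptions.

```python
import math

def to_feature_vector(keypoints):
    """Flatten 21 hypothetical (x, y, z) hand keypoints into a
    63-dimensional feature vector, as the abstract suggests."""
    assert len(keypoints) == 21
    return [coord for point in keypoints for coord in point]

def nearest_centroid(sample, centroids):
    """Classify a feature vector by Euclidean distance to per-class
    mean vectors (a stand-in for the paper's baseline classifiers)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Toy example with two made-up gesture classes "A" and "B".
centroids = {
    "A": [0.1] * 63,
    "B": [0.9] * 63,
}
sample = to_feature_vector([(0.1, 0.1, 0.1)] * 21)
print(nearest_centroid(sample, centroids))  # → A
```

In practice the per-class centroids would be the mean feature vectors computed over the labeled training instances of each alphabet gesture.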

Identifier: oai:union.ndltd.org:PERUUPC/oai:repositorioacademico.upc.edu.pe:10757/656634
Date: 21 October 2020
Creators: Nurena-Jara, Roberto; Ramos-Carrion, Cristopher; Shiguihara-Juarez, Pedro
Publisher: Institute of Electrical and Electronics Engineers Inc.
Source Sets: Universidad Peruana de Ciencias Aplicadas (UPC)
Language: English
Detected Language: English
Type: info:eu-repo/semantics/article
Format: application/html
Source: Proceedings of the 2020 IEEE Engineering International Research Conference, EIRCON 2020
Rights: info:eu-repo/semantics/embargoedAccess
Relation: https://ieeexplore.ieee.org/document/9254019
