
You Only Gesture Once (YouGo): American Sign Language Translation using YOLOv3

The study focused on creating and proposing a model that could accurately and precisely detect an American Sign Language (ASL) gesture for a letter of the English alphabet using the You Only Look Once (YOLOv3) algorithm. The training dataset was custom-built for this study and divided into three distinct clusters based on the uniqueness of each ASL sign. A model for each cluster was trained with the Darknet framework. The fully trained models were then tested on images and videos, and the Average Precision for each letter within a cluster and the Mean Average Precision for each cluster were recorded. In addition, a Word Builder script was created that combined the trained models of all three clusters into a comprehensive system that forms words when supplied with images of English-alphabet letters as depicted in ASL.
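The word-building step described above can be illustrated with a minimal sketch: each cluster's YOLOv3 model detects letters in an image, and the single highest-confidence detection across all clusters is appended to the word. The `Detection` class and the `ClusterModel` callable interface below are hypothetical placeholders standing in for the thesis's actual detector code, which is not reproduced here.

```python
# Hedged sketch of a Word Builder, assuming each trained cluster model can be
# wrapped as a callable that returns letter detections for one image path.
from dataclasses import dataclass
from typing import Callable, List, Sequence


@dataclass
class Detection:
    letter: str        # predicted ASL letter, e.g. "A"
    confidence: float  # detection score reported by the YOLOv3 model


# Hypothetical interface: a trained cluster model applied to one image path.
ClusterModel = Callable[[str], List[Detection]]


def build_word(image_paths: Sequence[str],
               cluster_models: Sequence[ClusterModel],
               threshold: float = 0.5) -> str:
    """Run every cluster model on each letter image and keep the single
    best detection above the confidence threshold."""
    letters = []
    for path in image_paths:
        detections = [d for model in cluster_models for d in model(path)]
        if not detections:
            continue  # no cluster recognized a sign in this image
        best = max(detections, key=lambda d: d.confidence)
        if best.confidence >= threshold:
            letters.append(best.letter)
    return "".join(letters)
```

With three cluster models wrapped this way, `build_word(["c.jpg", "a.jpg", "t.jpg"], models)` would return "CAT" when each image is recognized by exactly one cluster, which mirrors the comprehensive system the abstract describes.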

DOI 10.25394/pgs.12221963.v1
Identifier oai:union.ndltd.org:purdue.edu/oai:figshare.com:article/12221963
Date 01 May 2020
Creators Mehul Nanda (8786558)
Source Sets Purdue University
Detected Language English
Type Text, Thesis
Rights CC BY 4.0
Relation https://figshare.com/articles/You_Only_Gesture_Once_YouGo_American_Sign_Language_Translation_using_YOLOv3/12221963
