
Transition-Based Dependency Parsing with Neural Networks

Dependency parsing is an important component of contemporary speech and language processing systems. Current dependency parsers typically rely on a multi-class perceptron as the machine learning component, classifying on the basis of millions of sparse indicator features, which makes these systems expensive to develop and maintain and prone to error. This thesis explores whether replacing the multi-class perceptron with an artificial neural network (ANN) can alleviate this problem without hurting performance in terms of accuracy and efficiency. A simple transition-based dependency parser using an ANN as the classifier is written in Python 3, and the same program with the classifier replaced by a multi-class perceptron is used as a baseline. The results show that the ANN parser achieves a slightly higher unlabeled attachment score using only the most basic atomic features, eliminating the need for complex feature engineering. However, it parses about three times more slowly, and the training time required for the ANN is significantly longer.
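
To make the setup concrete, below is a minimal sketch (not the thesis code) of an arc-standard transition-based parser whose next transition is chosen by a small feed-forward network operating on basic atomic features (word identities of the top two stack items and the first two buffer items). All names, dimensions, and the toy vocabulary are illustrative assumptions; an actual parser would train the network weights on treebank data rather than use random ones.

```python
# Sketch of a transition-based parser with a neural transition classifier.
# Hypothetical example: class names, feature choices, and sizes are assumptions.
import numpy as np

SHIFT, LEFT_ARC, RIGHT_ARC = 0, 1, 2


class NeuralTransitionClassifier:
    """One-hidden-layer network mapping atomic feature embeddings to transition scores."""

    def __init__(self, vocab_size, embed_dim=8, hidden_dim=16, n_feats=4, seed=0):
        rng = np.random.default_rng(seed)
        self.E = rng.normal(scale=0.1, size=(vocab_size, embed_dim))      # word embeddings
        self.W1 = rng.normal(scale=0.1, size=(n_feats * embed_dim, hidden_dim))
        self.W2 = rng.normal(scale=0.1, size=(hidden_dim, 3))             # 3 transitions

    def predict(self, feature_ids):
        x = self.E[feature_ids].reshape(-1)   # concatenate the feature embeddings
        h = np.tanh(x @ self.W1)              # hidden layer
        return int(np.argmax(h @ self.W2))    # highest-scoring transition


def atomic_features(stack, buffer, words, vocab):
    """Word ids of the top two stack items and first two buffer items (<PAD> if absent)."""
    pad = vocab["<PAD>"]
    slots = [stack[-1] if len(stack) >= 1 else None,
             stack[-2] if len(stack) >= 2 else None,
             buffer[0] if len(buffer) >= 1 else None,
             buffer[1] if len(buffer) >= 2 else None]
    return np.array([pad if i is None else vocab.get(words[i], pad) for i in slots])


def parse(words, clf, vocab):
    """Arc-standard parsing loop: returns the head index of each word (-1 = root)."""
    stack, buffer = [], list(range(len(words)))
    heads = [-1] * len(words)
    while buffer or len(stack) > 1:
        action = clf.predict(atomic_features(stack, buffer, words, vocab))
        if action == SHIFT and buffer:
            stack.append(buffer.pop(0))
        elif action == LEFT_ARC and len(stack) >= 2:
            dep = stack.pop(-2)            # second item becomes dependent of the top
            heads[dep] = stack[-1]
        elif action == RIGHT_ARC and len(stack) >= 2:
            dep = stack.pop()              # top becomes dependent of the second item
            heads[dep] = stack[-1]
        elif buffer:                       # predicted move illegal: fall back to SHIFT
            stack.append(buffer.pop(0))
        else:                              # nothing else possible: pop and leave as root
            stack.pop()
    return heads


if __name__ == "__main__":
    sentence = ["the", "dog", "barked"]
    vocab = {"<PAD>": 0, "the": 1, "dog": 2, "barked": 3}
    clf = NeuralTransitionClassifier(vocab_size=len(vocab))
    # With untrained random weights the heads are arbitrary; a trained classifier
    # would produce the actual dependency tree, e.g. [1, 2, -1] for this sentence.
    print(parse(sentence, clf, vocab))
```

The perceptron baseline described in the abstract would keep the same parsing loop and simply swap the classifier for a multi-class perceptron scoring sparse indicator features instead of dense embeddings.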

Identifier oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:liu-138596
Date January 2017
Creators Gylling, Joakim
Publisher Linköpings universitet, Artificiell intelligens och integrerade datorsystem
Source Sets DiVA Archive at Upsalla University
Language English
Detected Language English
Type Student thesis, info:eu-repo/semantics/bachelorThesis, text
Format application/pdf
Rights info:eu-repo/semantics/openAccess