Explanation and Downscalability of Google's Dependency Parser Parsey McParseface

Using the data collected during the hyperparameter tuning of Google's dependency parser Parsey McParseface, feedforward neural networks and the correlations between their hyperparameters during training are explained and analysed in depth.

1 Introduction to Neural Networks 4
1.1 History of AI 4
1.2 The role of Neural Networks in AI Research 6
1.2.1 Artificial Intelligence 6
1.2.2 Machine Learning 6
1.2.3 Neural Network 8
1.3 Structure of Neural Networks 8
1.3.1 Biology Analogy of Artificial Neural Networks 9
1.3.2 Architecture of Artificial Neural Networks 9
1.3.3 Biological Model of Nodes – Neurons 11
1.3.4 Structure of Artificial Neurons 12
1.4 Training a Neural Network 21
1.4.1 Data 21
1.4.2 Hyperparameters 22
1.4.3 Training process 26
1.4.4 Overfitting 27
2 Natural Language Processing (NLP) 29
2.1 Data Preparation 29
2.1.1 Text Preprocessing 29
2.1.2 Part-of-Speech Tagging 30
2.2 Dependency Parsing 31
2.2.1 Dependency Grammar 31
2.2.2 Dependency Parsing Rule-Based & Data-Driven Approach 33
2.2.3 Syntactic Parser 33
2.3 Parsey McParseface 34
2.3.1 SyntaxNet 34
2.3.2 Corpus 34
2.3.3 Architecture 34
2.3.4 Improvements to the Feed Forward Neural Network 38
3 Training of Parsey’s Cousins 41
3.1 Training a Model 41
3.1.1 Building the Framework 41
3.1.2 Corpus 41
3.1.3 Training Process 43
3.1.4 Settings for the Training 44
3.2 Results and Analysis 46
3.2.1 Results from Google’s Models 46
3.2.2 Effect of Hyperparameters 47
4 Conclusion 63
5 Bibliography 65
6 Appendix 74

Identifier oai:union.ndltd.org:DRESDEN/oai:qucosa:de:qucosa:82880
Date 10 January 2023
Creators Endreß, Hannes
Contributors Universität Leipzig
Source Sets Hochschulschriftenserver (HSSS) der SLUB Dresden
Language English, German
Detected Language English
Type info:eu-repo/semantics/publishedVersion, doc-type:masterThesis, info:eu-repo/semantics/masterThesis, doc-type:Text
Rights info:eu-repo/semantics/openAccess