1. Study of Cantonese gwai2: diachrony and synchrony. January 2004.
Chan Shuen-ti Roy.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2004. Includes bibliographical references (leaves 164-172). Abstracts in English and Chinese.

Contents:
List of Figures --- p.8
List of Tables --- p.9
Chapter 1 --- Cantonese gwai2 in focus --- p.10
Chapter 1.1 --- Background --- p.10
Chapter 1.2 --- A short description --- p.11
Chapter 1.3 --- Goal of research --- p.15
Chapter 1.4 --- Organization --- p.17
Chapter 1.5 --- On terminologies --- p.17
Chapter 1.6 --- On styles --- p.17
Chapter 1.6.1 --- Glossing --- p.17
Chapter 1.6.2 --- Notations --- p.18
Chapter 1.6.3 --- Abbreviations --- p.19
Chapter 2 --- Synchronic description of gwai2 --- p.20
Chapter 2.1 --- Data in existing works --- p.20
Chapter 2.1.1 --- Gwai2 in words --- p.21
Chapter 2.1.2 --- Gwai2 with degree modifiers --- p.25
Chapter 2.1.3 --- Gwai2 with verbal particles --- p.27
Chapter 2.1.4 --- Gwai2 with aspect markers --- p.28
Chapter 2.1.5 --- Gwai2 as negator --- p.31
Chapter 2.2 --- Puzzles in previous works --- p.36
Chapter 2.2.1 --- On terminologies --- p.36
Chapter 2.2.2 --- Intensifier --- p.38
Chapter 2.3 --- Summary --- p.39
Chapter 3 --- Collecting new data: Methodology --- p.40
Chapter 3.1 --- Introduction --- p.40
Chapter 3.2 --- Judgment data and the linguists' intuition --- p.40
Chapter 3.2.1 --- Intuition vs. judgment --- p.41
Chapter 3.2.2 --- Grammaticality vs. acceptability --- p.41
Chapter 3.2.3 --- Factors affecting grammaticality judgment --- p.43
Chapter 3.2.4 --- The usual practice --- p.45
Chapter 3.3 --- Approaches taken in this thesis --- p.46
Chapter 3.3.1 --- Interview --- p.48
Chapter 3.3.2 --- Questionnaire --- p.48
Chapter 3.3.3 --- Corpus study --- p.48
Chapter 3.3.4 --- Modeling grammaticality judgment: Controlled experiment --- p.49
Chapter 3.4 --- Conclusion --- p.56
Chapter 4 --- Further synchronic evidence of gwai2 --- p.57
Chapter 4.1 --- Corpus statistics --- p.57
Chapter 4.2 --- Intuitive data --- p.58
Chapter 4.2.1 --- Gwai2 and word length --- p.58
Chapter 4.2.2 --- Gwai2 and verb types --- p.61
Chapter 4.2.3 --- Gwai2 and swear words --- p.63
Chapter 4.3 --- Questionnaire data --- p.63
Chapter 4.3.1 --- Gwai2 and adjectives --- p.64
Chapter 4.3.2 --- Gwai2, sei2 and degree modifier hou2 --- p.65
Chapter 4.3.3 --- Gwai2, sei2, hou2 and degree adverb gam3 --- p.67
Chapter 4.4 --- Experimental data --- p.67
Chapter 4.4.1 --- Participants --- p.68
Chapter 4.4.2 --- Gwai2 and aspect markers --- p.68
Chapter 4.4.3 --- Gwai2gam3 vs. gam3gwai2 --- p.69
Chapter 4.5 --- Conclusion --- p.72
Chapter 5 --- Diachronic perspective of gwai2 --- p.73
Chapter 5.1 --- Introduction to grammaticalization --- p.74
Chapter 5.1.1 --- Motivation of grammaticalization --- p.74
Chapter 5.1.2 --- Mechanisms in grammaticalization --- p.75
Chapter 5.1.3 --- Interaction of reanalysis and analogy --- p.77
Chapter 5.2 --- Grammaticalization of gwai2 --- p.79
Chapter 5.2.1 --- First stage: gwai2 as a lexical morpheme --- p.79
Chapter 5.2.2 --- Second stage: From lexical to functional --- p.81
Chapter 5.2.3 --- Third stage: Emergence of adverbial hou2gwai2 --- p.88
Chapter 5.2.4 --- Fourth stage: "Infix" and adjective negator --- p.94
Chapter 5.2.5 --- Fifth stage: gwai2 in verbal compounds --- p.96
Chapter 5.2.6 --- Independent development: gwai2 and devil negation --- p.99
Chapter 5.2.7 --- Grammaticalization and grammaticality judgment --- p.103
Chapter 5.3 --- Conclusion --- p.104
Chapter 6 --- Formal properties of gwai2 --- p.106
Chapter 6.1 --- Introduction --- p.106
Chapter 6.2 --- Interaction of gwai2 with the Cantonese aspectual system --- p.107
Chapter 6.3 --- Syntactic category of gwai2 --- p.109
Chapter 6.4 --- Interpretation of gwai2 --- p.111
Chapter 6.4.1 --- Gwai2 as an intensifier --- p.112
Chapter 6.4.2 --- Gwai2 as a modal operator --- p.115
Chapter 6.5 --- The syntax of gwai2 --- p.120
Chapter 6.5.1 --- Theoretical apparatus --- p.120
Chapter 6.5.2 --- Gwai2 in adjectives --- p.121
Chapter 6.5.3 --- Gwai2 in resultative verb compounds --- p.125
Chapter 6.5.4 --- Gwai2 between verb and aspect marker --- p.137
Chapter 6.6 --- Conclusion --- p.145
Chapter 7 --- Issues unresolved --- p.146
Chapter 8 --- Summary --- p.149
Appendix A --- Experiment materials --- p.153
Appendix A.1 --- Gwai2 and aspect markers --- p.153
Appendix A.2 --- Gwai2gam3 vs. gam3gwai2 --- p.156
Appendix B --- Screen-shots of WebExp Experimental Software --- p.160
References --- p.164

2. An efficient decoding method for continuous speech recognition based on a tree-structured lexicon = 基於樹狀詞彙表示方法的有效率連續語音識別系統. January 2001.
Choi Wing Nin.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2001. Includes bibliographical references. Text in English; abstracts in English and Chinese.

Contents:
Chapter 1 --- Introduction --- p.1
Chapter 1.1 --- Development of search algorithms for Chinese LVCSR --- p.3
Chapter 1.2 --- Objectives of the thesis --- p.4
Chapter 1.3 --- Thesis outline --- p.5
Reference --- p.7
Chapter 2 --- Fundamentals of Continuous Speech Recognition --- p.9
Chapter 2.1 --- The Bayesian decision rule --- p.9
Chapter 2.2 --- Acoustic front-end processor --- p.11
Chapter 2.3 --- Phonological constraint --- p.12
Chapter 2.3.1 --- Characteristics of Cantonese --- p.12
Chapter 2.3.2 --- Homophones and homographs --- p.13
Chapter 2.4 --- Acoustic modeling --- p.13
Chapter 2.5 --- Statistical language model --- p.15
Chapter 2.5.1 --- Word-based language model --- p.15
Chapter 2.5.2 --- Class-based language model --- p.16
Chapter 2.6 --- Search algorithms --- p.17
Chapter 2.6.1 --- Time-synchronous Viterbi search --- p.18
Chapter 2.6.2 --- Time-asynchronous stack decoding --- p.18
Chapter 2.6.3 --- One-pass versus multi-pass search strategies --- p.19
Chapter 2.7 --- Summary --- p.20
Reference --- p.21
Chapter 3 --- Search Space Organization --- p.23
Chapter 3.1 --- Lexicon representation --- p.24
Chapter 3.1.1 --- Linear lexicon --- p.25
Chapter 3.1.2 --- Tree lexicon --- p.27
Chapter 3.2 --- Factorization of language model --- p.31
Chapter 3.3 --- Lexical tree incorporated with context-dependent acoustic models --- p.36
Chapter 3.4 --- Summary --- p.39
Reference --- p.40
Chapter 4 --- One-Pass Dynamic Programming Based Search Algorithm --- p.42
Chapter 4.1 --- Token Passing Algorithm --- p.43
Chapter 4.2 --- Techniques for speeding up the search --- p.48
Chapter 4.2.1 --- Different layers of beam in the search hierarchy --- p.48
Chapter 4.2.2 --- Efficient recombination of tokens --- p.51
Chapter 4.2.3 --- Fast likelihood computation methods for continuous mixture densities --- p.52
Chapter 4.2.4 --- Lexical tree with class-based language model --- p.54
Chapter 4.3 --- Experimental results and discussions --- p.57
Chapter 4.3.1 --- The Hong Kong stock inquiry task --- p.57
Chapter 4.3.2 --- General domain continuous speech recognition --- p.59
Reference --- p.62
Chapter 5 --- Extension of the One-Pass Search --- p.64
Chapter 5.1 --- Overview of the extended framework --- p.65
Chapter 5.2 --- Word lattice construction by modified word-conditioned search --- p.66
Chapter 5.2.1 --- Exact N-best algorithm --- p.66
Chapter 5.2.2 --- Word-pair approximation --- p.67
Chapter 5.2.3 --- Word lattice algorithm --- p.68
Chapter 5.3 --- Computation of heuristic score --- p.70
Chapter 5.4 --- Backward A* heuristic search --- p.72
Chapter 5.4.1 --- Recovering the missing piece --- p.74
Chapter 5.4.2 --- Generation of N-best list --- p.74
Chapter 5.5 --- Experimental results --- p.75
Chapter 5.5.1 --- Simple back-tracking vs. A* heuristic search --- p.75
Chapter 5.5.2 --- N-best list evaluation using class bigram re-scoring --- p.76
Chapter 5.5.3 --- N-best list evaluation using class trigram re-scoring --- p.77
Chapter 5.6 --- Summary --- p.78
Reference --- p.79
Chapter 6 --- Conclusions and Suggestions for Future Development --- p.80
Chapter 6.1 --- Conclusions --- p.80
Chapter 6.2 --- Suggestions for future development --- p.82
Chapter 6.2.1 --- Incorporation of tone information --- p.82
Chapter 6.2.2 --- Fast match strategy for acoustic models --- p.82
Reference --- p.84
Appendix --- Cantonese Initials and Finals --- p.85
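Note on the record above: the tree lexicon (Chapter 3.1.2) and language-model factorization (Chapter 3.2) it lists are standard decoding ideas, in which words sharing a phone prefix share nodes of a prefix tree and a look-ahead language-model score is attached to each node so the search can prune before the word identity is known. The Python sketch below illustrates only that general idea; the lexicon, phone symbols, and probabilities are invented for the example, and the thesis's actual factorization scheme may differ.

    from math import log

    # Toy phone-level lexicon with unigram probabilities (all values invented).
    LEXICON = {
        "si5gaa3":   (["s", "i", "g", "aa"], 0.5),        # hypothetical word A
        "si5coeng4": (["s", "i", "c", "oe", "ng"], 0.3),  # hypothetical word B
        "gu2piu3":   (["g", "u", "p", "iu"], 0.2),        # hypothetical word C
    }

    class Node:
        """One node of the prefix (tree-structured) lexicon."""
        def __init__(self):
            self.children = {}     # phone symbol -> Node
            self.words = []        # words whose phone string ends here
            self.lookahead = None  # best unigram log-prob reachable below this node

    def build_tree(lexicon):
        """Share common phone prefixes across words (tree lexicon)."""
        root = Node()
        for word, (phones, _prob) in lexicon.items():
            node = root
            for ph in phones:
                node = node.children.setdefault(ph, Node())
            node.words.append(word)
        return root

    def add_lookahead(node, lexicon):
        """Attach to each node the best unigram log-probability of any word
        reachable from it, a common way of factorizing the language model
        over a tree lexicon so pruning can use LM scores early."""
        scores = [log(lexicon[w][1]) for w in node.words]
        for child in node.children.values():
            scores.append(add_lookahead(child, lexicon))
        node.lookahead = max(scores)
        return node.lookahead

    if __name__ == "__main__":
        root = build_tree(LEXICON)
        add_lookahead(root, LEXICON)
        # The "s" branch is shared by two words; its look-ahead is the better one.
        print(round(root.children["s"].lookahead, 3))  # log(0.5) ~ -0.693
        print(round(root.children["g"].lookahead, 3))  # log(0.2) ~ -1.609

Sharing prefixes keeps the number of active hypotheses small in a time-synchronous beam search, and the look-ahead lets language-model pruning act at tree arcs rather than only at word ends.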