21 |
Distortion constraints in statistical machine translation / Olteanu, Marian G., January 2007 (has links)
Thesis (Ph.D.)--University of Texas at Dallas, 2007. / Includes vita. Includes bibliographical references (leaves 147-152)
|
22 |
Graph-based data selection for statistical machine translation / Wang, Yi Ming, January 2017 (has links)
University of Macau / Faculty of Science and Technology / Department of Computer and Information Science
|
23 |
Etude et mise au point de critères d'évaluation technico-économique d'un procédé de traduction automatique [Study and development of technical and economic evaluation criteria for a machine translation process] / Van Slype, Georges, 03 1900 (has links)
Doctorate in political science / info:eu-repo/semantics/nonPublished
|
24 |
Machine translation for Chinese medical literature. January 1997 (has links)
Li Hoi-Ying. / Thesis (M.Phil.)--Chinese University of Hong Kong, 1997. / Includes bibliographical references (leaves 117-120).
Contents:
Abstract --- p.i
Acknowledgement --- p.iii
1 Introduction --- p.1
2 Background --- p.8
  2.1 Strategies in Machine Translation Systems --- p.9
    2.1.1 Direct MT Strategy --- p.10
    2.1.2 Transfer MT Strategy --- p.11
    2.1.3 Interlingua MT Strategy --- p.13
    2.1.4 AI Approach --- p.15
    2.1.5 Statistical Approach --- p.15
  2.2 Grammars --- p.16
  2.3 Sublanguages --- p.19
  2.4 Human Interaction --- p.21
  2.5 Evaluation for Performance --- p.23
  2.6 Machine Translation between Chinese and English --- p.25
  2.7 Problems and Issues in MTCML --- p.29
    2.7.1 Linguistic Characteristics of the Corpus --- p.29
    2.7.2 Strategies for Problems in MTCML --- p.31
3 Segmentation --- p.34
  3.1 Strategies for Segmentation --- p.34
  3.2 Segmentation Algorithm in MTCML --- p.36
4 Tagging --- p.40
  4.1 Objective --- p.40
  4.2 Approach --- p.41
    4.2.1 Category and Sub-category --- p.41
    4.2.2 Tools --- p.45
5 Analysis --- p.48
  5.1 Linguistic Study of the Corpus --- p.48
    5.1.1 Imperative Sentences --- p.49
    5.1.2 Elliptical Sentences --- p.50
    5.1.3 Inverted Sentences --- p.52
    5.1.4 Voice and Tense --- p.53
    5.1.5 Vocabulary --- p.54
  5.2 Pattern Extraction --- p.54
  5.3 Pattern Reduction --- p.56
    5.3.1 Case Study --- p.56
    5.3.2 Syntactic Rules --- p.61
  5.4 Disambiguation --- p.62
    5.4.1 Category Ambiguity --- p.63
    5.4.2 Structural Ambiguity --- p.65
6 Transfer --- p.68
  6.1 Principle of Transfer --- p.68
  6.2 Extraction of Templates --- p.71
    6.2.1 Similarity Comparison --- p.72
    6.2.2 Algorithm --- p.74
  6.3 Classification of Templates --- p.76
    6.3.1 Classification --- p.76
    6.3.2 A Class-based Filter --- p.79
  6.4 Transfer Rule-base --- p.80
    6.4.1 Transfer Rules --- p.81
    6.4.2 Rule Matching --- p.84
  6.5 Chapter Summary --- p.85
7 Generation --- p.87
  7.1 Sentence Generation --- p.87
  7.2 Disambiguation of Homographs --- p.90
  7.3 Sentence Polishing --- p.91
8 System Implementation --- p.95
  8.1 Corpus --- p.95
  8.2 Dictionaries and Lexicons --- p.97
  8.3 Reduction Rules --- p.100
  8.4 Transfer Rules --- p.102
  8.5 Efficiency of the System --- p.104
  8.6 Case Study --- p.105
    8.6.1 Sample Result and Assessment --- p.105
    8.6.2 Results of Segmentation and Tagging --- p.107
    8.6.3 Results of Analysis --- p.108
    8.6.4 Results of Transfer --- p.110
    8.6.5 Results of Generation --- p.110
9 Conclusion --- p.112
Bibliography --- p.117
A Programmer's Guide --- p.121
B Translation Instances --- p.125
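The record above lists a chapter on Chinese word segmentation but does not reproduce the algorithm itself. A common baseline for this task, and a plausible starting point for such a system, is forward maximum matching. The sketch below is a generic illustration only, not the thesis's actual MTCML algorithm; the toy lexicon and `max_len` limit are assumptions.

```python
def max_match(text, lexicon, max_len=4):
    """Greedy forward maximum matching: at each position take the longest
    dictionary word; fall back to a single character if nothing matches."""
    tokens, i = [], 0
    while i < len(text):
        # Try the longest candidate first, shrinking the window each time.
        for n in range(min(max_len, len(text) - i), 0, -1):
            word = text[i:i + n]
            if n == 1 or word in lexicon:
                tokens.append(word)
                i += n
                break
    return tokens

# Toy example with Latin characters standing in for Chinese ones:
print(max_match("abcd", {"ab", "cd"}))  # ['ab', 'cd']
```

Greedy matching is fast but cannot resolve genuine ambiguities, which is why segmenters typically combine it with statistical disambiguation.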
|
25 |
A model and an hypothesis for language structure / January 1960 (has links)
Victor H. Yngve. / Cover title. "Reprint from Proceedings of the American Philosophical Society, vol.104, no.5." / Includes bibliographical references.
|
26 |
A critical evaluation of two on-line machine translation systems : Google & Systran / Wang, Shuang, January 2010 (has links)
University of Macau / Faculty of Social Sciences and Humanities / Department of English
|
27 |
Assessing online translation systems using the BLEU score : Google Language Tools & SYSTRANBox / Law, Mei In, January 2011 (has links)
University of Macau / Faculty of Social Sciences and Humanities / Department of English
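The BLEU score named in the title above combines modified (clipped) n-gram precisions with a brevity penalty. The following is a minimal sentence-level sketch with uniform weights up to 4-grams; it is a textbook illustration, not the exact smoothing used by any particular evaluation toolkit.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU: geometric mean of clipped n-gram precisions,
    scaled by a brevity penalty for short candidates."""
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        # Clipping: each candidate n-gram is credited at most as often
        # as it occurs in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(1, sum(cand_counts.values()))
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0  # unsmoothed BLEU is zero if any precision is zero
    bp = 1.0 if len(candidate) >= len(reference) \
        else math.exp(1 - len(reference) / len(candidate))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

ref = "the cat sat on the mat".split()
print(bleu(ref, ref))  # 1.0 for an exact match
```

In practice corpus-level BLEU aggregates counts over all sentences before taking the geometric mean, which is what tools evaluating Google Translate or SYSTRAN output would report.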
|
28 |
A prototype system for machine translation from English to South African Sign Language using synchronous tree adjoining grammars / Welgemoed, Johan, 03 1900 (has links)
Thesis (MSc)--University of Stellenbosch, 2007. / ENGLISH ABSTRACT: Machine translation, especially machine translation for sign languages, remains an active research
area. Sign language machine translation presents unique challenges to the whole machine translation
process. In this thesis a prototype machine translation system is presented. This system is
designed to translate English text into a gloss-based representation of South African Sign Language
(SASL).
In order to perform the machine translation, a transfer based approach was taken. English
text is parsed into an intermediate representation. Translation rules are then applied to this
intermediate representation to transform it into an equivalent intermediate representation for the
SASL glosses. For both these intermediate representations, a tree adjoining grammar (TAG)
formalism is used. As part of the prototype machine translation system, a TAG parser was
implemented.
The translation rules used by the system were derived from a SASL phrase book. This phrase
book was also used to create a small gloss-based SASL TAG grammar. Lastly, tools for editing
TAG trees were added to the prototype system. / AFRIKAANSE OPSOMMING: Masjienvertaling, veral masjienvertaling vir gebaretale, bly ’n aktiewe navorsingsgebied. Masjienvertaling
vir gebaretale bied unieke uitdagings tot die hele masjienvertalingproses. In hierdie tesis
bied ons ’n prototipe masjienvertalingstelsel aan. Hierdie stelsel is ontwerp om Engelse teks te
vertaal na ’n glos gebaseerde voorstelling van Suid-Afrikaanse Gebaretaal (SAG).
Ons vertalingstelsel maak gebruik van ’n oorplasingsbenadering tot masjienvertaling. Engelse
teks word ontleed na ’n intermediêre vorm. Vertalingreëls word toegepas op hierdie intermediêre
vorm om dit te transformeer na ’n ekwivalente intermediêre vorm vir die SAG glosse. Vir beide
hierdie intermediêre vorms word boomkoppelingsgrammatikas (BKGs) gebruik. As deel van die
prototipe masjienvertalingstelsel, is ’n BKG sintaksontleder geïmplementeer.
Die vertalingreëls wat gebruik word deur die stelsel, is afgelei vanaf ’n SAG fraseboek. Hierdie
fraseboek was ook gebruik om ’n klein BKG vir SAG glosse te ontwikkel. Laastens was addisionele
nutsfasiliteite, vir die redigering van BKG bome, ontwikkel.
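The transfer step described in the abstract above (rules rewriting one intermediate representation into another, then reading off SASL glosses) can be illustrated on toy trees. Everything below is hypothetical: the nested-tuple tree encoding, the single reordering rule, and the gloss order are illustrative stand-ins, not the thesis's actual TAG grammar or rule base.

```python
# Hypothetical transfer step: English parse tree -> SASL-style gloss list.

def transfer(node, rules):
    """Recursively rewrite a (label, children...) tree with transfer rules,
    then read off the leaves as a gloss sequence."""
    if isinstance(node, str):
        return [node.upper()]  # sign-language glosses are conventionally upper-case
    label, *children = node
    # Apply the reordering rule for this node label, if one exists.
    children = rules.get(label, lambda cs: cs)(children)
    return [g for child in children for g in transfer(child, rules)]

# Illustrative rule: place the verb phrase's object before the verb.
rules = {"VP": lambda cs: cs[::-1]}

tree = ("S", ("NP", "john"), ("VP", ("V", "reads"), ("NP", "books")))
print(transfer(tree, rules))  # ['JOHN', 'BOOKS', 'READS']
```

A real transfer system, as in the thesis, operates on TAG derivation trees and pairs elementary trees across the two grammars rather than reordering flat child lists.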
|
29 |
A report on a C-E technical translation project using Google Translate / Mai, Guan Hui, Jennifer, January 2018 (has links)
University of Macau / Faculty of Arts and Humanities / Department of English
|
30 |
A Family of Latent Variable Convex Relaxations for IBM Model 2 / Simion, Andrei, January 2015 (has links)
Introduced in 1993, the IBM translation models were the first generation of statistical machine translation systems. Among the IBM Models, only IBM Model 1 is a convex optimization problem, meaning that we can initialize all its probabilistic parameters to uniform values and subsequently converge to a good solution via Expectation Maximization (EM). In this thesis we discuss a mechanism for generating an infinite supply of nontrivial convex relaxations for IBM Model 2, and detail an Exponentiated Subgradient algorithm to solve them. We also detail some interesting relaxations that admit an easy EM algorithm that does not require the tuning of a learning rate. Based on the geometric mean of two variables, this last set of convex models can be seamlessly integrated into the open-source GIZA++ word-alignment library. Finally, we also show other applications of the method, including a more powerful strictly convex IBM Model 1, and a convex HMM surrogate that improves on the performance of the previous convex IBM Model 2 variants.
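As background to the abstract above: IBM Model 1's convexity is what makes uniform initialization followed by EM reliable. The toy sketch below implements the standard EM updates for Model 1's lexical translation probabilities t(f|e); it is textbook background, not the thesis's convex-relaxation or subgradient algorithm for Model 2, and the miniature corpus is invented for illustration.

```python
from collections import defaultdict

def ibm1_em(pairs, iterations=10):
    """EM for IBM Model 1 lexical translation probabilities t(f|e).
    Because Model 1's likelihood is convex in t, a uniform start
    converges toward a global optimum."""
    f_vocab = {f for _, fs in pairs for f in fs}
    t = defaultdict(lambda: 1.0 / len(f_vocab))  # uniform initialization
    for _ in range(iterations):
        count = defaultdict(float)   # expected co-occurrence counts
        total = defaultdict(float)   # expected counts per source word e
        for es, fs in pairs:
            for f in fs:
                z = sum(t[(f, e)] for e in es)  # normalizer over alignments of f
                for e in es:
                    c = t[(f, e)] / z           # posterior that f aligns to e
                    count[(f, e)] += c
                    total[e] += c
        for (f, e), c in count.items():         # M-step: renormalize
            t[(f, e)] = c / total[e]
    return t

pairs = [(["the", "house"], ["la", "maison"]),
         (["the", "book"], ["le", "livre"]),
         (["a", "book"], ["un", "livre"])]
t = ibm1_em(pairs)
# After a few iterations, t[("livre", "book")] dominates t[("le", "book")].
```

Model 2 adds non-convex alignment-position parameters on top of these lexical ones, which is precisely the part the thesis replaces with convex surrogates.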
|