1

Toward a More Inclusive Construct of Native Chinese Speaker L2 Written Error Gravity

Holland, Steven K. 18 March 2013
The purpose of this study is to determine two types of error gravity in a corpus of texts written by native Chinese learners of English (ELLs)—one that enriches the traditional construct of gravity found in error gravity research by including error frequency, or how often an error occurs in a text relative to others, as an intervening variable, and one that applies the new error gravity data in a practical way to help establish salient grammatical focal points for written corrective feedback (WCF). Previous error gravity research has suggested that the amount of irritation caused by an error is determined by the extent to which an utterance departs from "native-like" speech. However, because these studies often neglect the role of frequency in determining gravity—relying on isolated sentences, pre-determined errors, and manipulated texts to define it—a more complete view of error gravity is needed. Forty-eight native English speakers without ESL teaching experience and 10 experienced ESL teachers evaluated a set of 18 timed, 30-minute essays written by high-intermediate to advanced native Chinese ELLs. Errors were identified, verified, tagged, and classified by the level of irritation they produced. Results show that the most serious errors included count/non-count (C/NC), insert verb (INSERT V), omit verb (OMIT V), and subject-verb agreement (SV). The most frequent error type was word choice (WC), followed by singular/plural (S/PL), awkward (AWK), and word form (WF). When irritation and frequency were combined, singular/plural (S/PL), word form (WF), word choice (WC), and awkward (AWK) errors were found to be the most critical. These findings support Burt and Kiparsky's (1972) global/local error distinction, in which global errors, or those lexical, grammatical, and syntactic errors that affect the overall organization or meaning of the sentence (Burt, 1975), are deemed more grievous than local ones, which affect only "single elements (constituents)" (Burt, 1975, p. 57). Implications are discussed in terms of future research and possible uses in the Dynamic Written Corrective Feedback classroom.
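The abstract does not specify how the irritation ratings and frequency counts were combined, so the following is a purely hypothetical sketch of one way such a combined "criticality" ranking could be computed. The weighting scheme, function name, and the example values are invented for illustration; the real ratings and counts would come from the rater and corpus data.

def rank_error_types(irritation, counts):
    """Rank error tags by mean irritation weighted by relative frequency.

    irritation: mean irritation rating per error tag (e.g. on a 1-5 scale)
    counts:     number of occurrences of each tag in the corpus
    """
    total = sum(counts.values())
    criticality = {tag: irritation[tag] * counts[tag] / total for tag in irritation}
    return sorted(criticality.items(), key=lambda kv: kv[1], reverse=True)

# Placeholder values only, not the study's data:
example_irritation = {"WC": 3.2, "S/PL": 3.0, "WF": 2.9, "AWK": 2.8, "SV": 3.8}
example_counts = {"WC": 210, "S/PL": 180, "WF": 140, "AWK": 150, "SV": 60}
print(rank_error_types(example_irritation, example_counts))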
2

Efficient and Reliable Simulation of Quantum Molecular Dynamics

Kormann, Katharina January 2012
The time-dependent Schrödinger equation (TDSE) models the quantum nature of molecular processes.  Numerical simulations based on the TDSE help in understanding and predicting the outcome of chemical reactions. This thesis is dedicated to the derivation and analysis of efficient and reliable simulation tools for the TDSE, with a particular focus on models for the interaction of molecules with time-dependent electromagnetic fields. Various time propagators are compared for this setting and an efficient fourth-order commutator-free Magnus-Lanczos propagator is derived. For the Lanczos method, several communication-reducing variants are studied for an implementation on clusters of multi-core processors. Global error estimation for the Magnus propagator is devised using a posteriori error estimation theory. In doing so, the self-adjointness of the linear Schrödinger equation is exploited to avoid solving an adjoint equation. Efficiency and effectiveness of the estimate are demonstrated for both bounded and unbounded states. The temporal approximation is combined with adaptive spectral elements in space. Lagrange elements based on Gauss-Lobatto nodes are employed to avoid nondiagonal mass matrices and ill-conditioning at high order. A matrix-free implementation for the evaluation of the spectral element operators is presented. The framework uses hybrid parallelism and enables significant computational speed-up as well as the solution of larger problems compared to traditional implementations relying on sparse matrices. As an alternative to grid-based methods, radial basis functions in a Galerkin setting are proposed and analyzed. It is found that considerably higher accuracy can be obtained with the same number of basis functions compared to the Fourier method. Another direction of research presented in this thesis is a new algorithm for quantum optimal control: The field is optimized in the frequency domain where the dimensionality of the optimization problem can drastically be reduced. In this way, it becomes feasible to use a quasi-Newton method to solve the problem. / eSSENCE
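The Lanczos method mentioned above approximates the action of a matrix exponential on the wave function within a small Krylov subspace; the commutator-free Magnus propagator then applies such exponentials to fixed linear combinations of the Hamiltonian evaluated at quadrature nodes inside each time step. The following is a minimal sketch of a single Lanczos exponential step for a Hermitian Hamiltonian stored as a dense NumPy array; it is an illustration under these simplifying assumptions, not the thesis's matrix-free, parallel implementation.

import numpy as np
from scipy.linalg import expm

def lanczos_expm_step(H, psi, dt, m=20):
    """Approximate exp(-1j*dt*H) @ psi in an m-dimensional Krylov subspace."""
    n = psi.size
    V = np.zeros((n, m), dtype=complex)    # Lanczos basis vectors
    alpha = np.zeros(m)                    # diagonal of the tridiagonal projection T
    beta = np.zeros(m - 1)                 # off-diagonal of T
    V[:, 0] = psi / np.linalg.norm(psi)
    for j in range(m):
        w = H @ V[:, j]
        if j > 0:
            w -= beta[j - 1] * V[:, j - 1]
        alpha[j] = np.real(np.vdot(V[:, j], w))   # Hermitian H: real diagonal
        w -= alpha[j] * V[:, j]
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            if beta[j] < 1e-12:                   # invariant subspace found early
                m = j + 1
                break
            V[:, j + 1] = w / beta[j]
    T = np.diag(alpha[:m]) + np.diag(beta[:m - 1], 1) + np.diag(beta[:m - 1], -1)
    # Exponentiate only the small tridiagonal projection.
    e1 = np.zeros(m)
    e1[0] = 1.0
    y = expm(-1j * dt * T) @ e1
    return np.linalg.norm(psi) * (V[:, :m] @ y)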
3

Methods for the analysis of extragalactic MUSE deep fields: hyperspectral unmixing and data fusion; detection of extended sources with large-scale inference

Bacher, Raphael 08 November 2017
This work takes place in the context of the study of the hyperspectral deep fields produced by the European 3D spectrograph MUSE. These data make it possible to probe the distant Universe and to study the physical and chemical properties of the first galactic and extragalactic structures. The first problem addressed in this thesis is the estimation of a spectral signature for each galactic source. Because MUSE is a ground-based instrument, atmospheric turbulence strongly degrades its spatial resolving power, which produces spectral mixing for a large number of sources. To overcome this limitation, data-fusion approaches are proposed, based on a linear mixing model and on complementary data from the Hubble Space Telescope, allowing the spectral separation of the sources in the field. The second goal of this thesis is the detection of the Circum-Galactic Medium (CGM). The CGM, a gaseous medium extending around certain galaxies, is characterized by a spatially diffuse signature of low spectral intensity. A detection method for this signature is developed using hypothesis testing, based on a max-test strategy over a dictionary, with the test statistics learned from the data. This method is then extended to take the spatial structure of the sources into account, improving the detection power while maintaining global error control. The developed codes are integrated into the software library of the MUSE consortium so that they can be used by the whole community. Moreover, although this work is tailored to MUSE data, it can be extended to other applications in source separation and in the detection of faint, extended sources.
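As an illustration of the max-test strategy described above, the sketch below computes, for one continuum-subtracted spectrum, the maximum matched-filter score over a dictionary of line-profile templates. The function name, array shapes, and the white-noise normalization are assumptions introduced for this example; in the thesis the null distribution of the max statistic is learned from source-free regions of the data rather than assumed Gaussian, which is how the detection threshold and global error control are obtained.

import numpy as np

def max_test_statistic(residual_spectrum, dictionary, noise_std):
    """Maximum matched-filter score of one spectrum over a template dictionary.

    residual_spectrum: (n_lambda,) spectrum after continuum/background subtraction
    dictionary:        (n_templates, n_lambda) candidate line-profile templates
    noise_std:         per-channel noise standard deviation (assumed white here)
    """
    # Normalize templates so each matched filter has unit noise variance under the null.
    d = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
    scores = d @ residual_spectrum / noise_std
    return scores.max()

# In practice this statistic would be computed for every spaxel of the cube and
# compared to a threshold calibrated on the empirical null distribution.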
