111

Computer simulations of ribosome reactions

Trobro, Stefan January 2008 (has links)
Peptide bond formation and translational termination on the ribosome have been simulated by molecular dynamics, free energy perturbation, and empirical valence bond (MD/FEP/EVB) methods, together with automated docking. Recent X-ray crystallographic data are used here to calculate the entire free energy surface for the system, complete with substrates, ribosomal groups, solvent molecules, and ions. A reaction mechanism for peptide bond formation emerges that is found to be catalyzed by the ribosome, in agreement with kinetic data and activation entropy measurements. The results show a water-mediated network of hydrogen bonds capable of reducing the reorganization energy during peptidyl transfer. The predicted hydrogen bonds and the structure of the active site were later confirmed by new X-ray structures with proper transition-state analogs.

Termination on the ribosome is triggered by binding of a release factor (RF) protein, followed by rapid release of the nascent peptide. The structure of the RF bound to the ribosomal peptidyl transfer center (PTC) has not been resolved in atomic detail, nor is the mechanism by which the hydrolysis proceeds known. Using automated docking of a heptapeptide RF fragment containing the highly conserved GGQ motif, we identified a conformation capable of catalyzing peptide hydrolysis. The MD/FEP/EVB calculations also reproduce the slow spontaneous release when RF is absent, and rationalize available mutational data. The network of hydrogen bonds, the active site structure, and the reaction mechanism are found to be very similar for peptidyl transfer and termination.

New structural data placing a ribosomal protein (L27) in the PTC motivated additional MD/FEP/EVB simulations to determine the effect of this protein on peptidyl transfer. The simulations predict that the protein's N terminus interacts with the A-site substrate in a way that promotes binding. The catalytic effect of L27 in the ribosome, however, is shown to be marginal, and it therefore seems valid to view the PTC as a ribozyme. Simulations with the model substrate puromycin (Pmn) predict that protonation of the N terminus can reduce the rate of peptidyl transfer, which could explain the different pH-rate profiles measured for Pmn compared with other substrates.
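
As a point of reference for the FEP step named above (not material from the thesis): free energy differences are commonly estimated from sampled energy gaps via Zwanzig's exponential averaging. The Python sketch below is a minimal, self-contained illustration; the samples and temperature are invented.

```python
import numpy as np

KB = 0.0019872041  # Boltzmann constant in kcal/(mol*K)

def fep_zwanzig(delta_u: np.ndarray, temperature: float) -> float:
    """Estimate the free energy difference A(B) - A(A) from samples of
    U_B - U_A collected on the ensemble of state A (Zwanzig, 1954):
        dA = -kT * ln < exp(-dU / kT) >_A
    """
    beta = 1.0 / (KB * temperature)
    # log-sum-exp for numerical stability of the exponential average
    log_terms = -beta * delta_u
    log_mean = np.logaddexp.reduce(log_terms) - np.log(len(delta_u))
    return -log_mean / beta

# Hypothetical energy-gap samples (kcal/mol) at 300 K
rng = np.random.default_rng(0)
samples = rng.normal(loc=2.0, scale=0.5, size=10_000)
print(f"dA ~ {fep_zwanzig(samples, 300.0):.3f} kcal/mol")
```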
112

Inference for the Quantiles of ARCH Processes/Inférence pour les Quantiles d'un Processus ARCH

Taniai, Hiroyuki 23 June 2009 (has links)
This work consists of three parts devoted to different aspects of quantile ARCH (AutoRegressive Conditionally Heteroskedastic) models. In these models, conditional heteroskedasticity is to be understood in a very broad sense: it potentially affects each conditional quantile differently (and hence the conditional distribution itself), and not only, as in classical ARCH models, the conditional scale. The first part studies Value-at-Risk (VaR) problems in financial series modeled in this way. Traditional approaches exhibit a questionable feature, which we point out and correct using residual distributions. We believe the foundations of this new approach are more solid, and that it accounts for the fact that the behavior of the residual empirical processes (REP) of ARCH processes, unlike that of the REP of ARMA processes, still depends on some of the model parameters. The second part deepens the general study of the residual empirical processes (REP) of ARCH processes from the perspective of quantile regression (QR) in the sense of Koenker and Bassett (Econometrica, 1978). The Bahadur representation of the QR estimators, from which the asymptotic tightness of the REP follows, is established. Finally, in the third part, we bring out the semiparametric nature of quantile ARCH models and the invariance, under the action of certain groups of transformations, of the submodels obtained by fixing parameter values. This group structure permits the construction of invariant inference methods which, in the spirit of the results of Hallin and Werker (Bernoulli, 2003), preserve semiparametric optimality. These methods are based on residual ranks and signs. In particular, we develop R-estimators for the models under study and investigate their performance.
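
For readers unfamiliar with quantile regression in the Koenker and Bassett sense, the sketch below fits the 5% conditional quantile of a simulated return series on the lagged absolute return, a toy quantile-ARCH-style specification. It illustrates the general idea only and is not the estimator developed in the thesis; the data and specification are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate a toy ARCH-like return series: volatility driven by |r_{t-1}|
rng = np.random.default_rng(42)
n = 2000
r = np.zeros(n)
for t in range(1, n):
    sigma_t = 0.5 + 0.4 * abs(r[t - 1])   # conditional scale
    r[t] = sigma_t * rng.standard_normal()

df = pd.DataFrame({"r": r[1:], "abs_lag": np.abs(r[:-1])})

# Koenker-Bassett quantile regression for the 5% conditional quantile,
# i.e. a one-step Value-at-Risk style specification
fit = smf.quantreg("r ~ abs_lag", df).fit(q=0.05)
print(fit.params)   # intercept and slope of the 5% conditional quantile
```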
113

Estimation-based Metaheuristics for Stochastic Combinatorial Optimization: Case Studies in Stochastic Routing Problems

Balaprakash, Prasanna 26 January 2010 (has links)
Stochastic combinatorial optimization problems are combinatorial optimization problems where part of the problem data are probabilistic. The focus of this thesis is on stochastic routing problems, a class of stochastic combinatorial optimization problems that arise in distribution management. Stochastic routing problems involve finding the best solution to distribute goods across a logistic network. In the problems we tackle, we consider a setting in which the cost of a solution is described by a random variable; the goal is to find the solution that minimizes the expected cost. Solving such stochastic routing problems is a challenging task because of two main factors. First, the number of possible solutions grows exponentially with the instance size. Second, computing the expected cost of a solution is computationally very expensive.

To tackle stochastic routing problems, stochastic local search algorithms such as iterative improvement algorithms and metaheuristics are quite promising because they offer effective strategies to tackle the combinatorial nature of these problems. However, a crucial factor that determines the success of these algorithms in stochastic settings is the trade-off between the computation time needed to search for high quality solutions in a large search space and the computation time spent in computing the expected cost of solutions obtained during the search.

To compute the expected cost of solutions in stochastic routing problems, two classes of approaches have been proposed in the literature: analytical computation and empirical estimation. The former exactly computes the expected cost using closed-form expressions; the latter estimates the expected cost through Monte Carlo simulation.

Many previously proposed metaheuristics for stochastic routing problems use the analytical computation approach. However, in a large number of practical stochastic routing problems, due to the presence of complex constraints, the use of the analytical computation approach is difficult, time consuming, or even impossible. Even for the prototypical stochastic routing problems that we consider in this thesis, the adoption of the analytical computation approach is computationally expensive. Notwithstanding the fact that the empirical estimation approach can address the issues posed by the analytical computation approach, its adoption in metaheuristics to tackle stochastic routing problems has never been thoroughly investigated.

In this thesis, we study two classical stochastic routing problems: the probabilistic traveling salesman problem (PTSP) and the vehicle routing problem with stochastic demands and customers (VRPSDC). The goal of the thesis is to design, implement, and analyze effective metaheuristics that use the empirical estimation approach to tackle these two problems. The main results of this thesis are: 1) the empirical estimation approach is a viable alternative to the widely adopted analytical computation approach for the PTSP and the VRPSDC; 2) a principled adoption of the empirical estimation approach in metaheuristics results in high-performing algorithms for tackling the PTSP and the VRPSDC. The estimation-based metaheuristics developed in this thesis for these two problems define the new state of the art.
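
The empirical estimation approach described above can be illustrated concretely on the PTSP: each customer requires a visit with some probability, the a-priori tour simply skips absent customers, and the expected cost is approximated by averaging over sampled realizations. The sketch below is a minimal illustration with invented coordinates and probabilities, not code from the thesis.

```python
import math
import random

def tour_cost(tour, coords):
    """Length of a closed tour through the given customer indices."""
    if len(tour) < 2:
        return 0.0
    return sum(
        math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
        for i in range(len(tour))
    )

def estimate_expected_cost(tour, coords, presence_prob, n_samples=1000, seed=0):
    """Monte Carlo estimate of the expected cost of an a-priori PTSP tour:
    sample which customers are present, skip the absent ones, average."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        realized = [c for c in tour if rng.random() < presence_prob[c]]
        total += tour_cost(realized, coords)
    return total / n_samples

# Toy instance: 8 customers on random points, each present with prob. 0.7
rng = random.Random(1)
coords = [(rng.uniform(0, 100), rng.uniform(0, 100)) for _ in range(8)]
prob = [0.7] * 8
apriori_tour = list(range(8))
print(f"estimated expected cost: {estimate_expected_cost(apriori_tour, coords, prob):.1f}")
```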
115

Att förmedla trygghet : En studie om distriktssköterskor och deras relation till patienter / To mediate safety : A study about district nurses and their relation to patients

Avdagić, Mesud January 2009 (has links)
Background: One of the main demands on Swedish and global health care in general is to meet the patient's need for safety. Under general health care law, this also falls within district nurses' field of responsibility. Although there are numerous studies describing the concept of safety and its different forms, no research could be found exploring how safety is, or is supposed to be, mediated by district nurses in a Swedish context. Research on this is therefore needed. Aim: The aim of this qualitative study was to explore how district nurses mediate safety to their patients. Method: Qualitative data were collected from seven district nurses by means of semi-structured interviews. Thereafter, a concept analysis was carried out. Results: Responses revealed that district nurses consider themselves to mediate safety in a variety of ways. Five major categories emerged: (1) complaisance; (2) competence; (3) patient participation; (4) same caregiver; (5) personal characteristics. Conclusion: District nurses mediate safety through a combination of general attitudes and concrete acts. Some preconditions are bound to each district nurse's individual ability to show complaisance, his or her competence, and his or her ability to involve patients in treatment and care. Others, less pronounced, are bound to the district nurse's ability to create continuity in contact with patients and to his or her personal characteristics.
116

Understanding Programmers' Working Context by Mining Interaction Histories

Zou, Lijie January 2013 (has links)
Understanding how software developers do their work is an important first step toward improving their productivity. Previous research has generally focused either on laboratory experiments or on coarsely grained industrial case studies; studies that seek a fine-grained understanding of industrial programmers working within a realistic context remain limited. In this work, we propose to use interaction histories (finely detailed records of developers' interactions with their IDE) as our main source of information for understanding programmers' work habits. We develop techniques to capture, mine, and analyze interaction histories, and we present two industrial case studies to show how this approach can help us understand industrial programmers' work at a detailed level: we explore how the basic characteristics of software maintenance task structures can be better understood, how latent dependence between program artifacts can be detected at interaction time, and how patterns of interaction coupling can be identified. We also examine the link between programmer interactions and some of the contextual factors of software development, such as the nature of the task being performed, the design of the software system, and the expertise of the developers. In particular, we explore how task boundaries can be automatically detected from interaction histories, how system design and developer expertise may affect interaction coupling, and whether newcomer and expert developers differ in their interaction history patterns. These findings can help us to better reason about the multidimensional nature of software development, to detect potential problems concerning task, design, expertise, and other contextual factors, and to build smarter tools that exploit the inherent patterns within programmer interactions and provide improved support for task-aware and expertise-aware software development.
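
One analysis mentioned above, detecting task boundaries in an interaction stream, can be approximated with a simple time-gap heuristic: a pause much longer than the typical inter-event gap is a candidate boundary. The sketch below is hypothetical and illustrative only; it is not the detection technique developed in the thesis, and the event format and threshold are invented.

```python
from datetime import datetime, timedelta

def detect_task_boundaries(events, gap_factor=5.0):
    """Given IDE interaction events as (timestamp, description) pairs sorted
    by time, flag a task boundary wherever the gap between consecutive
    events exceeds gap_factor times the median gap."""
    gaps = [
        (events[i + 1][0] - events[i][0]).total_seconds()
        for i in range(len(events) - 1)
    ]
    if not gaps:
        return []
    median_gap = sorted(gaps)[len(gaps) // 2]
    threshold = gap_factor * max(median_gap, 1.0)
    return [i + 1 for i, g in enumerate(gaps) if g > threshold]

# Toy interaction history: edits clustered into two work sessions
t0 = datetime(2013, 3, 1, 9, 0)
events = [(t0 + timedelta(seconds=s), "edit") for s in (0, 30, 55, 90)]
events += [(t0 + timedelta(hours=2), "edit"), (t0 + timedelta(hours=2, seconds=20), "edit")]
print(detect_task_boundaries(events))  # index where the second session starts -> [4]
```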
117

Have Federal Sanctions Helped Failing Schools? The Impact of No Child Left Behind in Texas

Hayhurst, Ernest W 01 January 2013 (has links)
This paper assesses the effectiveness of the No Child Left Behind Act (NCLB) in the state of Texas. In order to do this, we examine how the performance of students from failing schools responds to sanctions imposed by the NCLB accountability system. Additionally, we explore achievement gap trends between white and minority students who attend these failing schools. By taking advantage of campus and year fixed effects, as well as controlling for student demographic characteristics, we find that the sanctions employed by NCLB have had a statistically significant positive impact on academic achievement gains for all students. However, our results also indicate that these sanctions have effectively widened the achievement gaps between the white and minority students they affect. Given that the federal government spends upwards of 14 billion dollars per year to fund NCLB, this paper offers new insight into an economically important issue that is relevant to all citizens of the United States.
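
The identification strategy sketched in the abstract, campus and year fixed effects with demographic controls, corresponds to a two-way fixed effects regression. The sketch below shows a generic version of such a specification on invented data with hypothetical column names; it is not the paper's actual model or dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical campus-by-year panel of test scores
rng = np.random.default_rng(7)
n = 600
df = pd.DataFrame({
    "score": rng.normal(50, 10, n),
    "sanctioned": rng.integers(0, 2, n),       # school under NCLB sanctions?
    "pct_minority": rng.uniform(0, 100, n),    # demographic control
    "campus": rng.integers(0, 30, n),
    "year": rng.integers(2003, 2011, n),
})

# Two-way fixed effects: campus and year dummies absorb time-invariant
# campus differences and statewide year-to-year shocks
model = smf.ols("score ~ sanctioned + pct_minority + C(campus) + C(year)", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["campus"]})
print(result.params["sanctioned"])   # estimated effect of sanctions
```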
118

Open stope hangingwall design based on general and detailed data collection in unfavourable hangingwall conditions

Capes, Geoffrey William 16 April 2009
This thesis presents new methods to improve open stope hangingwall (HW) design, based on knowledge gained from site visits, observations, and data collection at underground mines in Canada, Australia, and Kazakhstan. The data for analysis were collected during two months of research at the Hudson Bay Mining and Smelting Ltd. Callinan Mine in Flin Flon, Manitoba, several trips to the Cameco Rabbit Lake mine in northern Saskatchewan, and three years of research and employment at the Xstrata Zinc George Fisher mine near Mount Isa, Queensland, Australia. Other sites visited, where substantial stope stability knowledge was accessed, include the Inco Thompson mines in northern Manitoba; the BHP Cannington mine, Xstrata Zinc Lead Mine, and Xstrata Copper Enterprise Mine in Queensland, Australia; and the Kazzinc Maleevskiy Mine in north-eastern Kazakhstan. An improved understanding of the stability and design of open stope HWs was developed based on: 1) three years of data collection from various rock masses and mining geometries to develop new sets of design lines for an existing HW stability assessment method; 2) the consideration of various scales of domains to examine HW rock mass behaviour and the development of a new HW stability assessment method; 3) the investigation of the HW failure mechanism using analytical and numerical methods; 4) an examination of the effects of stress, undercutting, faulting, and time on stope HW stability through the presentation of observations and case histories; and 5) innovative stope design techniques to manage predicted stope HW instability. An observational approach was used in formulating the new stope design methodology. To improve mine performance by reducing and/or controlling dilution of the ore by non-economic HW rock, the individual stope design methodology included creating vertical HWs, leaving ore skins or chocks where appropriate, and rock mass management. The work contributed to a reduction in annual dilution from 14.4% (2003) to 6.3% (2005), an increase in zinc grade from 7.4% to 8.7%, and an increase in production from 2.1 to 2.6 Mt (Capes et al., 2006).
119

Empirical Likelihood Confidence Intervals for ROC Curves with Missing Data

An, Yueheng 25 April 2011 (has links)
The receiver operating characteristic (ROC) curve is widely used to evaluate the diagnostic performance of a test, in other words, the accuracy of a test in discriminating normal cases from diseased cases. In biomedical studies we often encounter missing data, to which regular inference procedures cannot be applied directly. In this thesis, random hot deck imputation is used to obtain a 'complete' sample. Empirical likelihood (EL) confidence intervals are then constructed for ROC curves. The empirical log-likelihood ratio statistic is derived, and its asymptotic distribution is proved to be a weighted chi-square distribution. The results of a simulation study show that the EL confidence intervals perform well in terms of coverage probability and average length for various sample sizes and response rates.
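
As an illustration of the pipeline the abstract outlines, the sketch below applies random hot deck imputation (each missing score is replaced by a value drawn at random from the observed scores in the same group) and then computes the empirical ROC curve on the completed sample. It is a simplified illustration on simulated data, not the thesis's empirical likelihood construction.

```python
import numpy as np

def random_hot_deck(scores: np.ndarray, seed: int = 0) -> np.ndarray:
    """Replace NaNs by values drawn uniformly from the observed donors."""
    rng = np.random.default_rng(seed)
    out = scores.copy()
    missing = np.isnan(out)
    donors = out[~missing]
    out[missing] = rng.choice(donors, size=missing.sum(), replace=True)
    return out

def empirical_roc(diseased, healthy, grid):
    """ROC(t) = 1 - F_D(F_H^{-1}(1 - t)), estimated empirically:
    sensitivity at each false positive rate t in grid."""
    thresholds = np.quantile(healthy, 1 - grid)
    return np.array([(diseased > c).mean() for c in thresholds])

rng = np.random.default_rng(1)
healthy = rng.normal(0.0, 1.0, 200)
diseased = rng.normal(1.2, 1.0, 200)
# Knock out 20% of each group's scores to mimic missingness, then impute
for arr in (healthy, diseased):
    arr[rng.random(arr.size) < 0.2] = np.nan
healthy = random_hot_deck(healthy, seed=2)
diseased = random_hot_deck(diseased, seed=3)

t = np.linspace(0.05, 0.95, 19)
print(empirical_roc(diseased, healthy, t))  # sensitivity at each FPR t
```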
120

Constructions, Semantic Compatibility, and Coercion: An Empirical Usage-based Approach

Yoon, Soyeon 24 July 2013 (has links)
This study investigates the nature of semantic compatibility between constructions and the lexical items that occur in them, together with the related concept of coercion, from a usage-based approach to language, in which linguistic knowledge (grammar) is grounded in language use. The study shows that semantic compatibility between linguistic elements is a gradient phenomenon, and that speakers' knowledge about the degree of semantic compatibility is intimately correlated with language use. To show this, I investigate two constructions of English: the sentential complement construction and the ditransitive construction. I observe speakers' knowledge of the semantic compatibility between the constructions and lexical items and compare it with empirical data obtained from linguistic corpora and from experiments on sentence processing and acceptability judgments. My findings show that the relative semantic compatibility of a lexical item and a construction is significantly correlated with the frequency of their co-occurrence, with processing effort, and with speakers' acceptability judgments for the co-occurrence. The empirical data show that a lexical item and a construction that are less than fully compatible can in fact be used together when the incompatibility is resolved. The resolution of semantic incompatibility between a lexical item and its host construction has been called coercion. Coercion has been invoked as a theoretical concept without being examined in depth, particularly without regard to language use. By correlating degree of semantic compatibility with empirical data on language use, this study highlights that coercion is an actual psychological process that occurs during the composition of linguistic elements. Moreover, by examining in detail how the semantics of a lexical item and a construction interact to reconcile the incompatibility, this study reveals that coercion is semantic integration involving not only the dynamic interaction of linguistic components but also non-linguistic context. Investigating semantic compatibility and coercion in detail with empirical data reveals the processes by which speakers compose linguistic elements into larger units. It also supports the usage-based model's assumption that grammar and usage are not independent, and ultimately sheds light on the dynamic aspect of our linguistic system.
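
The study's core quantitative move, relating graded compatibility judgments to usage data, can be sketched as a rank correlation between acceptability ratings and log co-occurrence frequencies. The numbers below are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical verb-construction pairs: mean acceptability rating (1-7)
# and corpus co-occurrence frequency of the verb in the construction
ratings = np.array([6.8, 6.5, 5.9, 4.2, 3.1, 2.4, 1.8])
frequencies = np.array([950, 720, 310, 55, 12, 4, 1])

# Gradient compatibility should track usage: rank-correlate ratings
# with log frequency (the log damps the heavy-tailed frequency scale)
rho, p_value = spearmanr(ratings, np.log1p(frequencies))
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```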
