211

The validation of a performance-based assessment battery

Wilson, Irene Rose 01 January 2002 (has links)
Legislative pressures are being brought to bear on South African employers to demonstrate that occupational assessment is scientifically valid and culture-fair. The development of valid and reliable performance-based assessment tools will enable employers to meet these requirements. The general aim of this research was to validate a performance-based assessment battery for the placement of sales representatives. A literature survey examined alternative assessment measures and methods of performance measurement, leading to the conclusion that combining the work sample as a predictor measure with the managerial rating of performance as a criterion measure offers the sales manager a practical and cost-effective assessment process. The empirical study involved 54 salespeople working for the Commercial division of an oil marketing company, selling products and services to the commercial and industrial market. The empirical study found a significant correlation between the performance of sales representatives on the performance-based assessment battery for the entry level of the career ladder and their behaviour in the field as measured by the managerial performance rating instrument. The limitations of the sample, however, prevent the results from being generalised to other organisations.
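A validation of this kind reduces to correlating predictor scores (the work-sample battery) with criterion scores (the managerial ratings). The following is a minimal sketch of how such a validity coefficient might be computed; the data and variable names are invented for illustration and are not from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical scores for ten sales representatives:
# work-sample battery (predictor) and managerial rating (criterion).
battery = np.array([72, 65, 80, 55, 90, 60, 75, 68, 83, 58])
rating = np.array([3.8, 3.2, 4.1, 2.9, 4.5, 3.0, 3.9, 3.4, 4.2, 2.8])

# The validity coefficient is the Pearson correlation between the two.
r, p_value = stats.pearsonr(battery, rating)
print(f"validity coefficient r = {r:.2f}, p = {p_value:.4f}")
```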
212

Attentiveness: Reactivity at Scale

Hartman, Gregory S. 01 December 2010 (has links)
Clients of reactive systems often change their priorities. For example, a human user of an email viewer may attempt to display a message while a large attachment is downloading. To the user, an email viewer that delayed display of the message would exhibit a failure similar to priority inversion in real-time systems. We propose a new quality attribute, attentiveness, that provides a unified way to model the forms of redirection offered by application-level reactive systems to accommodate the changing priorities of their clients, which may be either humans or system components. Modeling attentiveness as a quality attribute provides system designers with a single conceptual framework for policy and architectural decisions to address trade-offs among criteria such as responsiveness, overall performance, behavioral predictability, and state consistency.
213

Exploiting Parallelism in GPUs

Hechtman, Blake Alan January 2014 (has links)
Heterogeneous processors with accelerators provide an opportunity to improve performance within a given power budget. Many of these heterogeneous processors contain Graphics Processing Units (GPUs) that can perform graphics and embarrassingly parallel computation orders of magnitude faster than a CPU while using less energy. Beyond these obvious applications for GPUs, a larger variety of applications can benefit from a GPU's large computation and memory bandwidth. However, many of these applications are irregular and, as a result, require synchronization and scheduling that are commonly believed to perform poorly on GPUs. The basic building block of synchronization and scheduling is memory consistency, which is, therefore, the first place to look for improving performance on irregular applications. In this thesis, we approach the programmability of irregular applications on GPUs by thinking across traditional boundaries of the compute stack. We think about architecture, microarchitecture and runtime systems from the programmer's perspective. To this end, we study architectural memory consistency on future GPUs with cache coherence. In addition, we design a GPU memory system microarchitecture that can support fine-grain and coarse-grain synchronization without sacrificing throughput. Finally, we develop a task runtime that embraces the GPU microarchitecture to perform well on the fork/join parallelism desired by many programmers. Overall, this thesis contributes non-intuitive solutions to improve the performance and programmability of irregular applications from the programmer's perspective. / Dissertation
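The fork/join pattern that the task runtime targets can be illustrated in miniature on a CPU. This is a hedged sketch of the programming pattern itself, not of the thesis's GPU runtime; the function and the depth cap are invented for the example.

```python
from concurrent.futures import ThreadPoolExecutor

def psum(pool, xs, depth=0):
    """Fork/join array sum: split the list, fork one half as a task,
    compute the other half inline, then join the results."""
    if depth >= 3 or len(xs) <= 4:
        return sum(xs)  # sequential leaf: small input or deep enough
    mid = len(xs) // 2
    left = pool.submit(psum, pool, xs[:mid], depth + 1)  # fork
    right = psum(pool, xs[mid:], depth + 1)              # inline branch
    return left.result() + right                         # join

# The depth cap keeps the number of simultaneously blocked tasks well
# below the pool size, avoiding the classic fork/join deadlock on a
# bounded thread pool.
with ThreadPoolExecutor(max_workers=8) as pool:
    print(psum(pool, list(range(1000))))  # 499500
```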
214

A Model for Managing Data Integrity

Mallur, Vikram 22 September 2011 (has links)
Consistent, accurate and timely data are essential to the functioning of a modern organization. Managing the integrity of an organization’s data assets in a systematic manner is a challenging task in the face of continuous update, transformation and processing to support business operations. Classic approaches to constraint-based integrity focus on logical consistency within a database and reject any transaction that violates consistency, but leave unresolved how to fix or manage violations. More ad hoc approaches focus on the accuracy of the data and attempt to clean data assets after the fact, using queries to flag records with potential violations and manual effort to repair them. Neither approach satisfactorily addresses the problem from an organizational point of view. In this thesis, we provide a conceptual model of constraint-based integrity management (CBIM) that flexibly combines both approaches in a systematic manner to provide improved integrity management. We perform a gap analysis that examines the criteria that are desirable for efficient management of data integrity. Our approach involves creating a Data Integrity Zone and an On Deck Zone in the database for separating the clean data from data that violates integrity constraints. We provide tool support for specifying constraints in a tabular form and generating triggers that flag violations of dependencies. We validate this by performing case studies on two systems used to manage healthcare data: PAL-IS and iMED-Learn. Our case studies show that using views to implement the zones does not cause any significant increase in the running time of a process.
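The two-zone idea can be sketched with any relational database: rows that satisfy the integrity constraints live in the Data Integrity Zone, while rows that violate a constraint are routed to an On Deck Zone instead of being rejected outright. Below is a minimal illustration using SQLite; the table, the constraint, and the routing function are invented for the example (the thesis's tooling generates triggers and views rather than application code).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE patients_integrity (id INTEGER PRIMARY KEY,
                                     age INTEGER, mrn TEXT UNIQUE);
    CREATE TABLE patients_on_deck   (id INTEGER PRIMARY KEY,
                                     age INTEGER, mrn TEXT,
                                     violation TEXT);
""")

def insert_patient(pid, age, mrn):
    """Route a row to the Integrity Zone if it satisfies the
    constraint 0 <= age <= 130, else flag it in the On Deck Zone."""
    if age is not None and 0 <= age <= 130:
        conn.execute("INSERT INTO patients_integrity VALUES (?, ?, ?)",
                     (pid, age, mrn))
    else:
        conn.execute("INSERT INTO patients_on_deck VALUES (?, ?, ?, ?)",
                     (pid, age, mrn, "age out of range"))

insert_patient(1, 42, "A100")   # clean row -> Data Integrity Zone
insert_patient(2, 217, "A101")  # violates constraint -> On Deck Zone
print(conn.execute("SELECT * FROM patients_on_deck").fetchall())
```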
215

The moderating effects of generalized threat on the relationship between authoritarianism, intolerance of ambiguity, and information processing

Rockwood, Stacy Anne. January 2009 (has links)
Honors Project--Smith College, Northampton, Mass., 2009. / Includes bibliographical references (p. 30-32).
216

Régression linéaire bayésienne sur données fonctionnelles / Functional Bayesian linear regression

Grollemund, Paul-Marie 22 November 2017 (has links)
The linear regression model is a fundamental statistical tool. When a covariate is a curve, we face a high-dimensional problem. In this situation, sparse models lead to successful inference, for instance by projecting the functional covariate onto lower-dimensional spaces. In this thesis, we propose a Bayesian approach, named Bliss, to fit the functional linear regression model. The Bliss model assumes, through its prior distribution, that the coefficient function is a step function. From the posterior distribution, we define several estimators to be used depending on the context: an estimator of the support and two estimators of the coefficient function, a smooth one and a piecewise-constant one. As an illustration, we consider the problem of predicting the black Périgord truffle yield from a functional covariate describing rainfall over the truffle's life cycle; the Bliss method identifies two time periods that are important for truffle development. Another strength of the Bayesian paradigm is that the prior distribution can incorporate expert knowledge, for instance the judgment of truffle growers and biologists about truffle growth. To this end, we propose two variants of the Bliss method to take such opinions into account. The first collects expert opinion indirectly by asking the experts to construct pseudo-data; the prior then corresponds to the posterior distribution given these pseudo-data, with a weighting scheme that moderates the influence of each expert and their correlations. The second collects expert opinion explicitly about which time periods most influence the yield and whether their effect is positive or negative; the prior is then built by penalizing coefficient functions that contradict these judgments. Finally, the thesis analyzes the asymptotic behavior of the Bliss method. The validity of the approach is justified by an asymptotic study of the posterior distribution: we construct a set of assumptions specific to the Bliss model in order to give an efficient proof of a Wald-type theorem. The main difficulty is that the Bliss model is misspecified, since the true coefficient function is surely not a step function. We show that the posterior distribution concentrates around the step function obtained as the Kullback-Leibler projection of the true coefficient function onto a set of step functions, and we characterize this step function in terms of the design and the true coefficient function.
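For concreteness, here is a standard formulation of a functional linear model with a step-function coefficient, written as a sketch of the kind of model the abstract describes (the notation is illustrative; the exact Bliss specification may differ).

```latex
% y_i: scalar response; x_i(t): functional covariate on [0, T];
% beta(t): coefficient function. The functional linear model is
y_i = \mu + \int_0^T \beta(t)\, x_i(t)\, \mathrm{d}t + \varepsilon_i,
\qquad \varepsilon_i \sim \mathcal{N}(0, \sigma^2).

% The step-function prior takes beta to be piecewise constant over
% K intervals I_k with heights b_k:
\beta(t) = \sum_{k=1}^{K} b_k\, \mathbf{1}\{t \in I_k\}.
```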
217

Optimal and adaptive designs for multi-regional clinical trials with regional consistency requirement

Teng, Zhaoyang 08 April 2016 (has links)
To shorten the time for drug development and regulatory approval, a growing number of clinical trials are being conducted in multiple regions simultaneously. One of the challenges of multi-regional clinical trials (MRCT) is how to utilize the data obtained from other regions within the entire trial to help make local approval decisions. In addition to global efficacy, evidence of consistency in treatment effects between the local region and the entire trial is usually required for regional approval. In recent years, a number of statistical models and consistency criteria have been proposed, and the sample size requirement for the region of interest has also been studied. However, no specific regional requirement has been broadly accepted; sample size planning that accounts for the regional requirements of all regions of interest is not well developed; and how to apply adaptive designs to MRCT has not been studied. In this dissertation, we make a number of contributions. First, we propose a unified regional requirement for the consistency assessment of MRCT, which generalizes the requirements proposed by Ko et al. (2010), Chen et al. (2012) and Tsong et al. (2012); we make recommendations for choosing the values of the parameters defining the proposed requirement, and determine the sample size increase needed to preserve power. Second, we propose two optimal designs for MRCT, a minimal total sample size design and a maximal utility design, which provide more effective sample size allocation to ensure a given overall power and the assurance probabilities of all regions of interest. We also introduce the factors that should be considered in designing an MRCT and analyze how each factor affects sample size planning. Third, we propose an unblinded region-level adaptive design to perform sample size re-estimation and re-allocation at interim based on the observed values in each region. We can determine not only whether to stop the whole MRCT based on the conditional power, but also whether to stop any individual region based on its conditional success rate at interim. The simulation results support that the proposed adaptive design outperforms the classical design in terms of overall power and the success rate of each region.
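As a concrete illustration of the kind of consistency assessment discussed above, the sketch below estimates, by simulation, the probability that the observed regional treatment effect is at least a fraction pi of the overall effect — a criterion in the spirit of the Japanese MHLW guidance, which the unified requirement proposed here generalizes. All parameter values and names are illustrative assumptions, not from the dissertation.

```python
import numpy as np

def consistency_probability(delta, sigma, n_total, region_frac,
                            pi=0.5, n_sims=100_000, seed=0):
    """Estimate P(regional effect >= pi * overall effect) by simulation.

    Assumes a two-arm trial with a normally distributed endpoint,
    true treatment effect `delta`, common SD `sigma`, `n_total`
    subjects per arm overall, and `region_frac` of them in the region.
    """
    rng = np.random.default_rng(seed)
    n_region = int(region_frac * n_total)
    n_rest = n_total - n_region

    # Standard error of each effect estimate (difference of two arm means).
    se_region = sigma * np.sqrt(2.0 / n_region)
    se_rest = sigma * np.sqrt(2.0 / n_rest)

    d_region = rng.normal(delta, se_region, n_sims)
    d_rest = rng.normal(delta, se_rest, n_sims)
    # The overall estimate is the sample-size-weighted average of the two.
    d_all = region_frac * d_region + (1.0 - region_frac) * d_rest

    return np.mean(d_region >= pi * d_all)

# Example: 15% of subjects in the local region, moderate effect size.
print(consistency_probability(delta=0.3, sigma=1.0,
                              n_total=500, region_frac=0.15))
```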
218

Identités, appartenances, transmission : un parcours philosophique avec l’ethnopsychiatrie / Identity, belonging, transmission : a philosophical itinerary together with ethnopsychiatry

Courbin, Lauriane 05 July 2012 (has links)
This research gives the conceptual account of an encounter: the encounter of philosophy with ethnopsychiatry, a clinical practice which offers psychotherapeutic treatment to migrant patients. The encounter took place at the centre Georges Devereux, whose team welcomed the author as an observer and participant for two years. Encountering the other is never indifferent: something happens, to the one who encounters, to her thought, and to her personal and philosophical heritage. The main question is therefore this: what effect does the encounter with ethnopsychiatry and the worlds it deals with have on philosophical thought and on the researcher who represents it here? This question makes a double demand: describing the changes that notions like "identity", "relation" or "person", but also "community", "culture", "belief" and "transmission" undergo in the encounter, while showing that such an account is not the equivalent of an introspective inquiry and its subjective report. The main change consists in the transition from a general way of thinking to another regime of thought, called "generic". The point is to characterize this generic way of thinking insofar as clinical practice seems to require it, particularly given the multiplicity that crossing between worlds imposes. Indeed, in a therapeutic context that demands change, the words, acts and gestures used have to take on a consistency of their own in order to be effective. This work attempts to draw the main traits of this clinical consistency and of the plane of thought that goes with it: immanence, novelty and a speculative character.
219

Modeling Recurrent Gap Times Through Conditional GEE

Liu, Hai Yan 16 August 2018 (has links)
We present a theoretical approach to the statistical analysis of the dependence of the gap time length between consecutive recurrent events on a set of explanatory random variables, in the presence of right censoring. The dependence is expressed through regression-like and overdispersion parameters, estimated via estimating functions and equations. The mean and variance of the length of each gap time, conditioned on the observed history of prior events and other covariates, are known functions of parameters and covariates, and are part of the estimating functions. Under certain conditions on censoring, we construct normalized estimating functions that are asymptotically unbiased and contain only observed data. We then use modern mathematical techniques to prove the existence, consistency and asymptotic normality of a sequence of estimators of the parameters. Simulations support our theoretical results.
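A generic form of the estimating equations underlying a conditional GEE analysis of gap times, shown as a hedged sketch (the notation is illustrative; the thesis's exact formulation may differ).

```latex
% T_{ij}: j-th gap time of subject i; H_{ij}: observed history of
% prior events; x_{ij}: covariates. The conditional mean and variance
% are modeled as
\mathbb{E}[T_{ij} \mid H_{ij}, x_{ij}] = \mu_{ij}(\beta), \qquad
\operatorname{Var}[T_{ij} \mid H_{ij}, x_{ij}] = \phi\, v\bigl(\mu_{ij}(\beta)\bigr),

% and beta solves the estimating equation
U_n(\beta) \;=\; \sum_{i=1}^{n} \sum_{j}
\frac{\partial \mu_{ij}}{\partial \beta}\,
\frac{T_{ij} - \mu_{ij}(\beta)}{\phi\, v\bigl(\mu_{ij}(\beta)\bigr)} \;=\; 0 .
```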
220

Essays in game theory and bankruptcy

Aslan, Ercan January 2016 (has links)
In Chapter 1 I study iterative strategy elimination mechanisms for normal form games. The literature is mostly clustered around the order of elimination. Conventional elimination also requires stricter knowledge assumptions when the elimination is iterative. I define an elimination process which requires weaker rationality. I establish some preliminary results suggesting that my mechanism is order independent whenever iterative elimination of weakly dominated strategies (IEWDS) is so. I also specify conditions under which the "undercutting problem" occurs. A comparison between my mechanism and other elimination mechanisms in the literature (Iterated Weak Strategy Elimination, Iterated Strict Strategy Elimination, Generalized Strategy Eliminability Criterion, RBEU, the Dekel-Fudenberg Procedure, the Asheim-Dufwenberg Procedure) is also studied to some extent. In Chapter 2 I study the axiomatic characterization of a well-known bankruptcy rule: Proportional Division (PROP). The rule allocates shares proportional to agents' claims and hence, according to many authors, is intuitive. I give supporting evidence for this opinion by first defining a new type of consistency requirement, union-consistency, and showing that PROP is the only rule that satisfies anonymity, continuity and union-consistency. Note that anonymity and continuity are very general requirements, satisfied by almost all the rules that have been studied in this literature. Thus, I prove that we can single out a unique rule among them by requiring only union-consistency. Then, I define a bankruptcy operator, a mapping from the set of bankruptcy rules to itself, and give some intuition about it. I prove that any rule converges to PROP under this operator as the claims increase. I show some nice characteristics of the operator, some of which are related to PROP, and I also give a definition of continuity for an operator. In Chapter 3 I investigate risk-averse investors' behaviour towards a risky firm. In order to find Pareto optimal allocations in a joint venture, I employ a 2-stage game, the first stage of which involves a social planner committing to an ex-post bankruptcy rule. A bankruptcy rule is a set of suggestions for solving each possible bankruptcy problem; a bankruptcy problem occurs when there is not enough endowment to allocate to the agents, each of whom has a claim on it. I adopt the game-theoretic approach posed in Kıbrıs and Kıbrıs (2013) and extend it further. That paper compares four renowned bankruptcy rules, whereas mine does not restrict attention to any particular rule but rather aims to find a Pareto optimal (PO) one. I start with the 2-agent case in order to give the reader some insight and then generalise the results to an arbitrary number of investors. I find socially desirable (PO) allocations and show that the same can be achieved through financial markets with the help of some well-known results.
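The proportional rule discussed in Chapter 2 has a simple closed form: each claimant receives a share of the endowment proportional to her claim. A minimal sketch follows; the function name and input checks are my own.

```python
def proportional_division(claims, endowment):
    """Proportional bankruptcy rule (PROP): award the endowment in
    proportion to claims. Assumes a genuine bankruptcy problem,
    i.e. 0 <= endowment <= sum(claims)."""
    total = sum(claims)
    if not 0 <= endowment <= total:
        raise ValueError("not a bankruptcy problem")
    return [c * endowment / total for c in claims]

# Example: an estate of 100 with claims 60, 90, 150 yields awards
# 20, 30, 50 -- each claimant recovers a third of her claim.
print(proportional_division([60, 90, 150], 100))
```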
