  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

Statistical issues in Mendelian randomization : use of genetic instrumental variables for assessing causal associations

Burgess, Stephen January 2012 (has links)
Mendelian randomization is an epidemiological method for using genetic variation to estimate the causal effect of the change in a modifiable phenotype on an outcome from observational data. A genetic variant satisfying the assumptions of an instrumental variable for the phenotype of interest can be used to divide a population into subgroups which differ systematically only in the phenotype. This gives a causal estimate which is asymptotically free of bias from confounding and reverse causation. However, the variance of the causal estimate is large compared to traditional regression methods, requiring large amounts of data and necessitating methods for efficient data synthesis. Additionally, if the association between the genetic variant and the phenotype is not strong, then the causal estimates will be biased due to the “weak instrument” in finite samples in the direction of the observational association. This bias may convince a researcher that an observed association is causal. If the causal parameter estimated is an odds ratio, then the parameter of association will differ depending on whether viewed as a population-averaged causal effect or a personal causal effect conditional on covariates.

We introduce a Bayesian framework for instrumental variable analysis, which is less susceptible to weak instrument bias than traditional two-stage methods, has correct coverage with weak instruments, and is able to efficiently combine gene–phenotype–outcome data from multiple heterogeneous sources. Methods for imputing missing genetic data are developed, allowing multiple genetic variants to be used without reduction in sample size. We focus on the question of a binary outcome, illustrating how the collapsing of the odds ratio over heterogeneous strata in the population means that the two-stage and the Bayesian methods estimate a population-averaged marginal causal effect similar to that estimated by a randomized trial, but which typically differs from the conditional effect estimated by standard regression methods. We show how these methods can be adjusted to give an estimate closer to the conditional effect.

We apply the methods and techniques discussed to data on the causal effect of C-reactive protein on fibrinogen and coronary heart disease, concluding with an overall estimate of causal association based on the totality of available data from 42 studies.
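When a single genetic variant is used as the instrument, the two-stage estimate described above reduces to the ratio (Wald) estimator. The sketch below uses simulated data to contrast that estimator with naive regression under confounding; it is a minimal illustration, not the Bayesian framework or data-synthesis methods developed in the thesis.

```python
# Minimal sketch of the ratio (Wald) instrumental-variable estimator used in
# Mendelian randomization, on simulated data; illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 50_000
u = rng.normal(size=n)                       # unobserved confounder
g = rng.binomial(2, 0.3, size=n)             # genetic variant coded 0/1/2
x = 0.5 * g + u + rng.normal(size=n)         # modifiable phenotype
y = 0.2 * x + u + rng.normal(size=n)         # outcome; true causal effect = 0.2

def slope(a, b):
    """OLS slope from regressing b on a."""
    return np.cov(a, b)[0, 1] / np.var(a, ddof=1)

print("naive OLS of y on x:", round(slope(x, y), 3))               # inflated by confounding
print("Wald ratio estimate:", round(slope(g, y) / slope(g, x), 3)) # close to 0.2
```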
42

INTEGRATIVE OMICS REVEALS INSIGHTS INTO HUMAN LIVER DEVELOPMENT, DISEASE ETIOLOGY, AND PRECISION MEDICINE

Zhipeng Liu (8126406) 20 December 2019 (has links)
Transcriptomic regulation of the human liver is a tightly controlled and highly dynamic process. Genetic and environmental exposures to this process play pivotal roles in the development of multiple liver disorders. Despite the accumulating knowledge gained through large-scale genomics studies of developed adult livers, the factors contributing to interindividual variability in pediatric livers remain largely uninvestigated. In the first two chapters of the present study, we addressed this question through an integrative analysis of both genetic variation and transcriptome-wide RNA expression profiles in a pediatric human liver cohort spanning developmental stages from embryonic to adulthood. Our systematic analysis revealed a transcriptome-wide transition from stem-cell-like to liver-specific profiles during the course of human liver development. Moreover, for the first time, we observed different genetic control of hepatic gene expression at different developmental stages. Motivated by the critical roles of genetic variation and development in regulating hepatic gene expression, we constructed robust predictive models to impute liver gene expression from easily available genotype and demographic information. These models are promising for improving both PK/PD modeling and disease diagnosis in pediatric patients. In the last two chapters of the study, we analyzed genomics data in a more liver-disease-related context. Specifically, in the third chapter, we identified macrophage migration inhibitory factor (MIF) and its related pathways as potential targets underlying human liver fibrosis through an integrative omics analysis. In the last chapter, utilizing the largest-to-date publicly available GWAS summary data, we dissected the causal relationships among three important and clinically related metabolic diseases: non-alcoholic fatty liver disease (NAFLD), type 2 diabetes (T2D), and obesity. Our analysis suggested new disease subtypes and provided insights into precision treatment and prevention for these three complex diseases. Taken together, through integrative analysis of multiple levels of genomics information, we improved the current understanding of human liver development and the pathogenesis of liver disorders, and provided implications for precision medicine.
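As a rough illustration of the expression-imputation idea described above, the sketch below fits a penalized linear model that predicts a gene's expression from simulated SNP dosages plus age. The data, feature set, and the choice of an elastic-net penalty are assumptions made for illustration and may differ from the models built in the thesis.

```python
# Minimal sketch of imputing liver gene expression from genotype and a
# demographic covariate; simulated data, illustrative model choice.
import numpy as np
from sklearn.linear_model import ElasticNetCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n, n_snps = 300, 200
snps = rng.binomial(2, 0.3, size=(n, n_snps)).astype(float)  # SNP dosages 0/1/2
age = rng.uniform(0, 18, size=n)                             # pediatric ages (years)
true_effects = np.zeros(n_snps)
true_effects[:5] = 0.8                                       # a few causal eQTLs
expr = snps @ true_effects + 0.05 * age + rng.normal(size=n)  # simulated expression

X = np.column_stack([snps, age])
X_tr, X_te, y_tr, y_te = train_test_split(X, expr, random_state=0)
model = ElasticNetCV(cv=5).fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 2))
```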
43

Preferences for Randomization in Social Choice:

Letsou, Christina January 2020 (has links)
Thesis advisor: Uzi Segal / This dissertation consists of three chapters analyzing preferences for randomization in social choice problems. The first two chapters are related and belong to the fields of distributive justice and social choice; they concern the allocation of an indivisible good in problems where efficiency is at odds with equality. The last chapter addresses a social choice problem from an individual's perspective using decision-theoretic analysis. In this dissertation I demonstrate why randomization may be an attractive policy in social choice problems and how individuals may have preferences over the precise method of randomization.

The first chapter is titled "Live and Let Die." This paper discusses how to allocate an indivisible good by social lottery when agents have asymmetric claims. Intuition suggests that there may exist agents who should receive zero probability in the optimal social lottery; in such a case, I say that these agents have weak claims to the good. The paper uses a running example of allocating an indivisible medical treatment to individuals with different survival rates and reactions to the treatment in order to provide conditions for consistency of weak claims. To this end, I develop two related assumptions on a social planner's preferences over lotteries. The first -- survival rate scaling -- states that if an individual has a weak claim, then his claim is also weak when survival rates increase proportionally. The second -- independence of weak claims -- states that if an individual has a weak claim, then his removal does not affect others' probabilities of receiving the treatment. These assumptions imply that a compatible social welfare function must exhibit constant elasticity of substitution (CES), which results in potentially-degenerate weighted lotteries (an illustrative numerical sketch of this corner-solution behavior appears after this abstract).

The second chapter is titled "Why is Six Afraid of Seven? Bringing the 'Numbers' to Economics." It discusses the numbers problem: the question of whether the number of people involved should be used to determine whether to help certain people or certain others. I discuss the main solutions that have been proposed: flipping a coin, saving the greater number, and proportionally weighted lotteries. Using the economic tools of social choice, I then show how the model of the previous chapter, "Live and Let Die," can be extended to address numbers problems and compare the implications of prominent social welfare functions. I argue that potentially-degenerate weighted lotteries can assuage the main concerns discussed in the literature, and I show that both the Nash product social welfare function and CES social welfare functions are compatible with this solution. Finally, I discuss a related problem known as "probability cases," in which individuals differ in survival chances rather than in the numbers at risk. When the model is extended to allow for asymmetries in both survival chances and group sizes, CES results in potentially-degenerate weighted lotteries whereas the Nash product does not.

The third chapter is titled "All Probabilities are Equal, but Some Probabilities are More Equal than Others," joint work with Professor Uzi Segal of the Economics Department at Boston College and Professor Shlomo Naeh of the Departments of Talmud and Jewish Thought at The Hebrew University of Jerusalem. In this chapter we compare preferences for different procedures of selecting people randomly. A common procedure is to have people draw balls from an urn in turn. Modern and ancient stories (for example, by Graham Greene and in the Talmud) suggest that such a lottery may not be viewed by the individuals as "fair." We compare this procedure with several alternatives that give all individuals an equal chance of being selected but have different structures, and we analyze these procedures as multi-stage lotteries. In line with previous literature, our analysis is based on the observation that multi-stage lotteries are not considered indifferent to their probabilistic one-stage representations. As such, we use a non-expected utility model to understand the preferences of risk-averse individuals over these procedures and show that they may not be indifferent between them. / Thesis (PhD) — Boston College, 2020. / Submitted to: Boston College. Graduate School of Arts and Sciences. / Discipline: Economics.
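The sketch referenced above is purely illustrative: a planner maximizes a CES criterion over a lottery for one indivisible treatment, and a near-linear CES concentrates all probability on a single claimant while a more inequality-averse CES spreads it. The survival numbers and equal welfare weights are invented, and the setup is far simpler than the model in the dissertation.

```python
# Toy illustration of how a CES social welfare function over a treatment
# lottery can yield a degenerate allocation; not the dissertation's model.
import numpy as np
from scipy.optimize import minimize

survive_untreated = np.array([0.10, 0.30, 0.50])  # hypothetical chances without treatment
survive_treated = np.array([0.60, 0.70, 0.80])    # hypothetical chances with treatment

def optimal_lottery(r):
    """Maximize the CES welfare (sum_i u_i^r)^(1/r) over lotteries p on the simplex."""
    def neg_welfare(p):
        u = p * survive_treated + (1 - p) * survive_untreated  # expected survival
        return -(np.sum(u ** r)) ** (1.0 / r)
    result = minimize(neg_welfare, x0=np.full(3, 1 / 3), method="SLSQP",
                      bounds=[(0, 1)] * 3,
                      constraints={"type": "eq", "fun": lambda p: p.sum() - 1})
    return np.round(result.x, 3)

# Near-utilitarian CES: probability concentrates on the largest survival gain.
print("r = 1.0 :", optimal_lottery(1.0))
# Strongly inequality-averse CES: probability is spread across claimants.
print("r = -5.0:", optimal_lottery(-5.0))
```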
44

Efficient Inversion of Large-Scale Problems Exploiting Structure and Randomization

January 2020 (has links)
abstract: Dimensionality reduction methods are examined for large-scale discrete problems, specifically for the solution of three-dimensional geophysics problems: the inversion of gravity and magnetic data. The matrices for the associated forward problems have beneficial structure for each depth layer of the volume domain, under mild assumptions, which facilitates the use of the two dimensional fast Fourier transform for evaluating forward and transpose matrix operations, providing considerable savings in both computational costs and storage requirements. Application of this approach for the magnetic problem is new in the geophysics literature. Further, the approach is extended for padded volume domains. Stabilized inversion is obtained efficiently by applying novel randomization techniques within each update of the iteratively reweighted scheme. For a general rectangular linear system, a randomization technique combined with preconditioning is introduced and investigated. This is shown to provide well-conditioned inversion, stabilized through truncation. Applying this approach, while implementing matrix operations using the two dimensional fast Fourier transform, yields computationally effective inversion, in memory and cost. Validation is provided via synthetic data sets, and the approach is contrasted with the well-known LSRN algorithm when applied to these data sets. The results demonstrate a significant reduction in computational cost with the new algorithm. Further, this new algorithm produces results for inversion of real magnetic data consistent with those provided in literature. Typically, the iteratively reweighted least squares algorithm depends on a standard Tikhonov formulation. Here, this is solved using both a randomized singular value decomposition and the iterative LSQR Krylov algorithm. The results demonstrate that the new algorithm is competitive with these approaches and offers the advantage that no regularization parameter needs to be found at each outer iteration. Given its efficiency, investigating the new algorithm for the joint inversion of these data sets may be fruitful. Initial research on joint inversion using the two dimensional fast Fourier transform has recently been submitted and provides the basis for future work. Several alternative directions for dimensionality reduction are also discussed, including iteratively applying an approximate pseudo-inverse and obtaining an approximate Kronecker product decomposition via randomization for a general matrix. These are also topics for future consideration. / Dissertation/Thesis / Doctoral Dissertation Applied Mathematics 2020
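The randomized-SVD ingredient mentioned above follows a standard prototype: probe the range of the matrix with a Gaussian test matrix, orthonormalize, then take a small dense SVD of the projected matrix. The sketch below applies that prototype to a synthetic low-rank-plus-noise matrix; it is not the FFT-accelerated, structure-exploiting implementation developed in the dissertation.

```python
# Minimal randomized-SVD sketch (range finding + small dense SVD); illustrative only.
import numpy as np

def randomized_svd(A, rank, oversample=10, seed=0):
    rng = np.random.default_rng(seed)
    Y = A @ rng.standard_normal((A.shape[1], rank + oversample))  # sample the range of A
    Q, _ = np.linalg.qr(Y)                                        # orthonormal basis for the range
    U_small, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)  # small dense SVD
    return (Q @ U_small)[:, :rank], s[:rank], Vt[:rank]

rng = np.random.default_rng(1)
A = rng.standard_normal((2000, 20)) @ rng.standard_normal((20, 500))  # ~rank-20 matrix
A += 0.01 * rng.standard_normal(A.shape)                              # small noise
U, s, Vt = randomized_svd(A, rank=20)
err = np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A)
print(f"relative error of the rank-20 reconstruction: {err:.4f}")
```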
45

Faire preuve par le chiffre ? Le cas des expérimentations aléatoires en économie / Evidence by numbers? The case of randomized controlled trials

Jatteau, Arthur 05 December 2016 (has links)
With Esther Duflo and her lab, the J-PAL, randomized controlled trials (RCTs) have enjoyed a remarkable rise in economics since the 2000s and are presented by their advocates as a particularly robust method for impact evaluation. Combining quantitative and qualitative methodologies, this thesis investigates the social construction of experimental evidence and contributes to a social and historical epistemology of RCTs and to the socio-economics of quantification. The first part develops a socio-history of the method. The origins of RCTs are multidisciplinary and precede their extensive use in medicine from the 1940s and in economics from the late 1960s onward; we draw methodological lessons from this history that illuminate the current practice of RCTs. In the second part, we examine the actors behind this method, chiefly J-PAL researchers. A prosopographical analysis, supplemented by a network analysis, shows that these researchers' high level of academic capital and the presence of leaders allow the diffusion of the method to be controlled and promoted. In the last part, we scrutinize the production of experimental evidence. By examining RCTs in operation, we show that both their internal and external validity are in many cases compromised. Finally, we explore the convoluted links between RCTs, policy and politics.
46

An Instructional Strategy with Simulations Used to Increase Statistical Literacy among Students in a Hispanic Serving Institution

Hernandez, Eric O 05 November 2018 (has links)
This study analyzed the effects of a randomization-based inference teaching methodology on students' content mastery in an introductory college statistics course. The sample comprised 125 undergraduate students from Miami Dade College, a large Hispanic Serving Institution in the Southeast. A pretest-posttest nonequivalent group design was used. Students in the randomization-based teaching modality were exposed to simulation activities, specifically bootstrap confidence intervals and randomization tests, that aim to enhance conceptual understanding of inferential statistics, an important component of statistical literacy. The instructional strategy was designed to trigger critical reflection that confronted students with their own thinking and led them through a process of reorganizing, restructuring, and improving their concepts. The 40-item Comprehensive Assessment of Outcomes in a first Statistics course (CAOS) instrument was used to measure students' conceptual understanding of important statistical ideas, along with a demographic and academic survey that collected data on student characteristics. A stepwise linear regression was used to examine the effect of group membership while controlling for pre-CAOS scores, age, gender, first-generation status, prior experience with statistics, student status (part/full time), native speaker, STEM or non-STEM major, Hispanic, highest math course taken in high school, and GPA. Group and pre-CAOS score were the only significant predictors of post-CAOS scores; none of the other variables were significant. The full model was a significant predictor of post-CAOS score, F(2, 121) = 16.96. The results supported the claim that the randomization-based teaching modality for inferential statistics helps Hispanic students achieve a better understanding of the learning outcomes associated with an undergraduate introductory statistics course.
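The two simulation activities named above have a compact resampling core. The sketch below runs a bootstrap confidence interval and a randomization test on two invented groups of scores; the numbers are purely illustrative, and classroom versions would typically use applets rather than code.

```python
# Minimal sketch of a bootstrap confidence interval and a randomization test
# for a difference in means; made-up data, illustrative only.
import numpy as np

rng = np.random.default_rng(0)
group_a = np.array([72.0, 85, 78, 90, 66, 81, 74, 88])
group_b = np.array([64.0, 70, 77, 69, 73, 80, 68, 75])
observed = group_a.mean() - group_b.mean()

# Bootstrap: resample each group with replacement and collect the statistic.
boot = [rng.choice(group_a, group_a.size).mean()
        - rng.choice(group_b, group_b.size).mean() for _ in range(10_000)]
ci = np.percentile(boot, [2.5, 97.5])

# Randomization test: reshuffle group labels under the null of no difference.
pooled = np.concatenate([group_a, group_b])
perm = []
for _ in range(10_000):
    rng.shuffle(pooled)
    perm.append(pooled[:group_a.size].mean() - pooled[group_a.size:].mean())
p_value = np.mean(np.abs(perm) >= abs(observed))

print(f"observed diff = {observed:.2f}, 95% bootstrap CI = {ci.round(2)}, p = {p_value:.3f}")
```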
47

Power Analysis for Alternative Tests for the Equality of Means.

Li, Haiyin 07 May 2011 (has links) (PDF)
The two-sample t-test is the test usually taught in introductory statistics courses to test for the equality of means of two populations. However, the t-test is not the only test available to compare the means of two populations. The randomization test is being incorporated into some introductory courses, and there is also the bootstrap test. It is also not uncommon to decide the equality of the means based on confidence intervals for the means of the two populations. Are all these methods equally powerful? Can the idea of non-overlapping t confidence intervals be extended to bootstrap confidence intervals? The powers of seven alternative ways of comparing two population means are analyzed using small samples, with data coming from distributions with different degrees of skewness and kurtosis. The analysis is done by simulation; programs were written in GAUSS specifically for this purpose.
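Power comparisons of this kind are driven by a simple Monte Carlo loop: draw many small samples from populations that differ in mean, apply a test, and record the rejection rate. The sketch below estimates the power of the two-sample t-test for skewed data; swapping a randomization or bootstrap test into the inner loop yields the other entries of such a comparison (the thesis used purpose-written GAUSS programs, not Python).

```python
# Minimal sketch of estimating test power by simulation; illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, reps, alpha = 15, 5_000, 0.05
rejections = 0
for _ in range(reps):
    x = rng.exponential(scale=1.0, size=n)   # skewed population, mean 1.0
    y = rng.exponential(scale=1.8, size=n)   # skewed population, mean 1.8
    if stats.ttest_ind(x, y).pvalue < alpha:
        rejections += 1
print("estimated power of the t-test:", rejections / reps)
```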
48

Treatment Effect Heterogeneity and Statistical Decision-making in the Presence of Interference

Owusu, Julius January 2023 (has links)
This dissertation consists of three chapters that focus on the design of welfare-maximizing treatment assignment rules in heterogeneous populations with interactions. In the first two chapters, I focus on an important pre-step in the design of treatment assignment rules: inference for heterogeneous treatment effects in populations with interactions. In the final chapter, my co-authors and I study treatment assignment rules in the presence of social interaction in heterogeneous populations.

In chapter one, I argue that statistical inference of heterogeneous treatment effects (HTEs) across predefined subgroups is complicated when economic units interact, because treatment effects may vary by pretreatment variables, by post-treatment exposure variables (which measure the exposure to other units' treatment statuses), or by both. This invalidates the standard hypothesis-testing technique used to infer HTEs. To address the problem, I develop statistical methods (asymptotic and bootstrap) to infer HTEs and to disentangle the drivers of treatment-effect heterogeneity in populations where units interact. Specifically, I incorporate clustered interference into the potential outcomes model and propose kernel-based test statistics for the null hypotheses of (a) no HTEs by treatment assignment (or post-treatment exposure variables) for all pretreatment variable values, and (b) no HTEs by pretreatment variables for all treatment assignment vectors. To disentangle the source of heterogeneity in treatment effects, I recommend a multiple-testing algorithm. In addition, I prove the asymptotic properties of the proposed test statistics via a modern poissonization technique.

In chapter two, as a robust alternative to the inferential methods of chapter one, I design randomization tests of heterogeneous treatment effects when units interact on a single network. My modeling strategy allows network interference into the potential outcomes framework using the concept of network exposure mapping. I consider three null hypotheses that represent different notions of homogeneous treatment effects, but due to nuisance parameters and the multiplicity of potential outcomes, the hypotheses are not sharp. To address the issue of multiple potential outcomes, I propose a conditional randomization inference method that expands on existing methods. Additionally, I consider two techniques that overcome the nuisance-parameter issue. I show that my conditional randomization inference method, combined with either of the proposed techniques for handling nuisance parameters, produces asymptotically valid p-values.

Chapter three is based on a joint paper with Young Ki Shin and Seungjin Han. We study treatment assignment rules in the presence of social interaction in heterogeneous populations. We construct an analytical framework under the anonymous interaction assumption, where the decision problem becomes choosing a treatment fraction. We propose a multinomial empirical success (MES) rule that includes the empirical success rule of Manski (2004) as a special case. We investigate non-asymptotic bounds on the expected utility based on the MES rule. / Dissertation / Doctor of Philosophy (PhD)
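As a much-simplified illustration of randomization inference for treatment-effect heterogeneity, the sketch below tests the null of a constant treatment effect across a binary pretreatment subgroup, with no interference and with the estimated overall effect plugged in for the nuisance parameter. Handling interference and the nuisance parameter rigorously is precisely what the chapters above develop, so this is only a toy baseline.

```python
# Plug-in Fisher randomization test of a constant (homogeneous) treatment
# effect across a binary pretreatment covariate; no interference assumed.
import numpy as np

rng = np.random.default_rng(0)
n = 400
w = rng.integers(0, 2, size=n)                       # pretreatment subgroup
d = rng.integers(0, 2, size=n)                       # randomized treatment
y = 1.0 + (0.5 + 1.0 * w) * d + rng.normal(size=n)   # effect differs by subgroup

def hte_stat(y, d, w):
    """|ATE-hat in subgroup w=1 minus ATE-hat in subgroup w=0|."""
    ate = lambda m: y[m & (d == 1)].mean() - y[m & (d == 0)].mean()
    return abs(ate(w == 1) - ate(w == 0))

t_obs = hte_stat(y, d, w)
tau_hat = y[d == 1].mean() - y[d == 0].mean()        # plug-in nuisance estimate
y0 = y - tau_hat * d                                 # imputed control outcomes under the null
t_sim = []
for _ in range(2_000):
    d_star = rng.permutation(d)                      # re-randomize treatment
    t_sim.append(hte_stat(y0 + tau_hat * d_star, d_star, w))
print("approximate randomization p-value:", np.mean(np.array(t_sim) >= t_obs))
```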
49

Using nonrandomized vs. randomized interdependent group contingency components: Comparing the effects on disruptive behaviors and academic engagement in elementary students

McKissick, Chele January 2011 (has links)
No description available.
50

A CLUSTER RANDOMIZATION TRIAL TO TEST THE EFFECTIVENESS OF A CLINICAL PATHWAY IN MANAGING ATRIAL FIBRILLATION IN THE EMERGENCY DEPARTMENT

Nair, Madhavan Girish 10 1900 (has links)
BACKGROUND: Emergency Department (ED) visit rates for patients with symptomatic, recent-onset atrial fibrillation or atrial flutter (RAFF) are on the rise, which in turn has resulted in high hospital admission rates (38-45%). Optimal management strategies need to be developed to reduce hospital admission rates.

STUDY QUESTION: The primary objective of this study is to evaluate the effectiveness of a clinical pathway for the management of low-risk RAFF patients in the ED. The hypothesis is that an evidence-based clinical pathway will help ED physicians better manage AF patients and reduce hospital admissions.

STUDY DESIGN: A prospective, blinded, stratified, two-arm cluster-randomized trial will be conducted. The intervention is a clinical pathway for the management of RAFF.

PRIMARY AND SECONDARY OUTCOMES: The primary outcome of this trial will be the proportion of low-risk RAFF patients admitted to the hospital from the ED. Secondary outcomes will include a range of safety outcomes.

STATISTICAL CONSIDERATIONS: An intention-to-treat analysis will be conducted at the individual level using proportions and means according to the variable in question, with an alpha level of 0.05 and power of 0.80 for the primary outcome. The cRCT (assuming a 30% relative risk reduction) will be conducted over a two-year period. A total of 13 clusters and 3500 ED visits will be recruited to each intervention arm.

SUMMARY: This thesis explores the methodological issues relevant to the design of a cRCT evaluating a clinical pathway in the management of acute-onset, low-risk AF patients presenting to the ED. / Master of Science (MSc)
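The statistical considerations above translate into standard sample-size arithmetic for a cluster-randomized comparison of two proportions. In the sketch below, the baseline admission proportion, the 30% relative risk reduction, alpha, and power come from the abstract, while the intracluster correlation and average cluster size are invented assumptions chosen only to show how figures on the order of 13 clusters and roughly 3500 visits per arm can arise.

```python
# Sample-size arithmetic for a two-arm cluster RCT on a binary outcome;
# the ICC and mean cluster size are illustrative assumptions, not protocol values.
from scipy.stats import norm

alpha, power = 0.05, 0.80
p1 = 0.40                        # baseline admission proportion (within the 38-45% range)
p2 = p1 * (1 - 0.30)             # 30% relative risk reduction -> 0.28
icc, cluster_size = 0.05, 270    # assumed ICC and mean ED visits per cluster

z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)
n_individual = (z_a + z_b) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2
design_effect = 1 + (cluster_size - 1) * icc        # inflation for clustering
n_cluster_trial = n_individual * design_effect
print(f"per arm: {n_individual:.0f} patients if individually randomized, "
      f"{n_cluster_trial:.0f} after the design effect "
      f"(~{n_cluster_trial / cluster_size:.0f} clusters)")
```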
