111

EVALUATING THE IMPACTS OF ANTIDEPRESSANT USE ON THE RISK OF DEMENTIA

Duan, Ran 01 January 2019 (has links)
Dementia is a clinical syndrome caused by neurodegeneration or cerebrovascular injury. Patients with dementia suffer from deterioration in memory, thinking, behavior, and the ability to perform everyday activities. Since there are no cures or disease-modifying therapies for dementia, there is much interest in identifying modifiable risk factors that may help prevent or slow the progression of cognitive decline. Medications are a common focus of this type of research. Importantly, according to a report from the Centers for Disease Control and Prevention (CDC), 19.1% of the population aged 60 and over reported taking antidepressants during 2011-2014, and this proportion continues to rise. However, antidepressant use among the elderly may be concerning because of potentially harmful effects on cognition. To assess the impacts of antidepressants on the risk of dementia, we conducted three consecutive projects. In the first project, a retrospective cohort study using a marginal structural Cox proportional hazards regression model with inverse probability weighting (IPW) was conducted to evaluate the average causal effects of different classes of antidepressants on the risk of dementia. Potential causal effects of selective serotonin reuptake inhibitors (SSRIs), serotonin and norepinephrine reuptake inhibitors (SNRIs), atypical antidepressants (AAs), and tricyclic antidepressants (TCAs) on the risk of dementia were observed at the 0.05 significance level. Multiple sensitivity analyses supported these findings. Unmeasured confounding is a threat to the validity of causal inference methods. In evaluating the effects of antidepressants, it is important to consider how common comorbidities of depression, such as sleep disorders, may affect both the exposure to antidepressants and the onset of cognitive impairment. In this dissertation, sleep apnea and rapid-eye-movement behavior disorder (RBD) were unmeasured, and thus uncontrolled, confounders for the association between antidepressant use and the risk of dementia. In the second project, a bias factor formula for two binary unmeasured confounders was derived in order to account for these variables. Monte Carlo analysis was implemented to estimate the distribution of the bias factor for each class of antidepressant. The effects of antidepressants on the risk of dementia, adjusted for both measured and unmeasured confounders, were then estimated. Sleep apnea and RBD attenuated the effect estimates for SSRIs, SNRIs, and AAs on the risk of dementia. In the third project, to account for potential time-varying confounding and observed time-varying treatment, a multi-state Markov chain with three transient states (normal cognition, mild cognitive impairment (MCI), and impaired but not MCI) and two absorbing states (dementia and death) was fitted to estimate the probabilities of moving between finite and mutually exclusive cognitive states. This analysis also allowed participants to recover from mild impairments (i.e., mild cognitive impairment, impaired but not MCI) to normal cognition, and accounted for the competing risk of death prior to dementia. These findings supported the results of the main analysis in the first project.
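As a rough illustration of the third project's multi-state approach (a sketch only, not the dissertation's fitted model or data), the code below builds a hypothetical transition intensity matrix over three transient cognitive states and two absorbing states and converts it to transition probabilities over a follow-up interval; all intensity values are invented for illustration.

```python
# Sketch of a five-state continuous-time Markov model: three transient
# cognitive states and two absorbing states (dementia, death).
# The intensity values are illustrative, not estimates from the dissertation.
import numpy as np
from scipy.linalg import expm

states = ["normal", "MCI", "impaired_not_MCI", "dementia", "death"]

# Transition intensity matrix Q: off-diagonal entries are instantaneous
# transition rates; each row sums to zero. Recovery to normal cognition
# is allowed from both mild-impairment states.
Q = np.array([
    [-0.30, 0.15, 0.10, 0.02, 0.03],   # normal
    [ 0.10, -0.40, 0.10, 0.12, 0.08],  # MCI
    [ 0.12, 0.08, -0.35, 0.07, 0.08],  # impaired but not MCI
    [ 0.00, 0.00, 0.00, 0.00, 0.00],   # dementia (absorbing)
    [ 0.00, 0.00, 0.00, 0.00, 0.00],   # death (absorbing)
])

# Transition probability matrix over t years: P(t) = expm(Q * t).
t = 2.0
P = expm(Q * t)
for i, s in enumerate(states[:3]):
    print(f"P({s} -> dementia in {t:.0f}y) = {P[i, 3]:.3f}")
```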
112

Statistical Monitoring of Queuing Networks

Kaya, Yaren Bilge 26 October 2018 (has links)
Queuing systems are important parts of our daily lives, and to keep their operations at an efficient level they need to be monitored using queuing performance metrics, such as average queue lengths and average waiting times. On the other hand, queue lengths and waiting times are generally random variables, and their distributions depend on properties such as arrival rates, service times, and the number of servers. In this report, we focused on detecting changes in the service rate. We therefore monitored queues using Cumulative Sum (CUSUM) charts based on likelihood ratios and compared the average run length (ARL) values for different service rates.
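A minimal sketch of the idea (not the thesis's exact monitoring scheme): a likelihood-ratio CUSUM on observed exponential service times, testing a known in-control service rate mu0 against a shifted alternative mu1; the rates and the threshold h are illustrative assumptions.

```python
# Likelihood-ratio CUSUM for detecting a shift in an exponential service rate.
# mu0 is the in-control rate, mu1 the shifted rate to detect; the threshold h
# and all rates are illustrative assumptions, not values from the thesis.
import numpy as np

rng = np.random.default_rng(42)
mu0, mu1, h = 1.0, 0.7, 5.0   # service slows from rate 1.0 to 0.7

def run_length(mu_actual, n_max=100_000):
    """Number of observed service times until the CUSUM statistic exceeds h."""
    s = 0.0
    for n in range(1, n_max + 1):
        x = rng.exponential(1.0 / mu_actual)        # one observed service time
        llr = np.log(mu1 / mu0) - (mu1 - mu0) * x   # log f1(x) / f0(x)
        s = max(0.0, s + llr)
        if s > h:
            return n
    return n_max

# In-control ARL should be large, out-of-control ARL small.
arl0 = np.mean([run_length(mu0) for _ in range(200)])
arl1 = np.mean([run_length(mu1) for _ in range(200)])
print(f"ARL0 (no change) ~ {arl0:.0f}   ARL1 (rate shifted) ~ {arl1:.0f}")
```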
113

Statistical Modeling and Prediction of HIV/AIDS Prognosis: Bayesian Analyses of Nonlinear Dynamic Mixtures

Lu, Xiaosun 10 July 2014 (has links)
Statistical analyses and modeling have contributed greatly to our understanding of the pathogenesis of HIV-1 infection; they also provide guidance for the treatment of AIDS patients and the evaluation of antiretroviral (ARV) therapies. Various statistical methods, nonlinear mixed-effects models in particular, have been applied to model CD4 and viral load trajectories. A common assumption in these methods is that all patients come from a homogeneous population following one mean trajectory. This assumption unfortunately obscures important characteristic differences between subgroups of patients whose responses to treatment and whose disease trajectories are biologically different. It may also lack robustness against population heterogeneity, resulting in misleading or biased inference. Finite mixture models, also known as latent class models, are commonly used to model non-predetermined heterogeneity in a population; they provide an empirical representation of heterogeneity by grouping the population into a finite number of latent classes and modeling the population through a mixture distribution. For each latent class, a finite mixture model allows individuals to vary around their own mean trajectory, instead of a common one shared by all classes. Furthermore, a mixture model has the ability to cluster and estimate class membership probabilities at both the population and individual levels. This important feature may help physicians to better understand a particular patient's disease progression and refine the therapeutic strategy in advance. In this research, we developed mixture dynamic models and related Bayesian inference via Markov chain Monte Carlo (MCMC). One real data set from HIV/AIDS clinical management and another from a clinical trial were used to illustrate the proposed models and methods. This dissertation explored three topics. First, we modeled CD4 trajectories using a finite mixture model with four distinct components whose mean functions are based on the Michaelis-Menten function. Relevant covariates, both baseline and time-varying, were considered, and model comparison and selection were based on criteria such as the Deviance Information Criterion (DIC). The class membership model was allowed to depend on covariates for prediction. Second, we explored disease status prediction for HIV/AIDS using the latent class membership model. Third, we modeled viral load trajectories using a finite mixture model with three components whose mean functions are based on published HIV dynamic systems. Although this research is motivated by HIV/AIDS studies, the basic concepts and methods developed here have much broader applications in the management of other chronic diseases; they can also be applied to dynamic systems in other fields. Implementation of our methods using the publicly available WinBUGS package suggests that our approach can be made quite accessible to practicing statisticians and data analysts.
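To make the mixture-of-trajectories idea concrete (a toy sketch, not the dissertation's model or data), the code below simulates CD4-like trajectories from a two-class mixture in which each latent class has its own Michaelis-Menten mean curve V*t/(K+t); all parameter values are invented for illustration.

```python
# Toy two-class mixture of longitudinal trajectories with
# Michaelis-Menten mean functions m(t) = V * t / (K + t).
# All parameters are illustrative, not estimates from the dissertation.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.5, 48, 12)          # months since treatment start

classes = {
    "strong responder": dict(V=350.0, K=6.0,  sigma=25.0, weight=0.6),
    "weak responder":   dict(V=120.0, K=12.0, sigma=25.0, weight=0.4),
}

def simulate_patient():
    """Draw a latent class, then a noisy trajectory around its mean curve."""
    names = list(classes)
    weights = [classes[c]["weight"] for c in names]
    c = rng.choice(names, p=weights)
    p = classes[c]
    mean = p["V"] * t / (p["K"] + t)
    return c, mean + rng.normal(0.0, p["sigma"], size=t.shape)

for _ in range(3):
    label, y = simulate_patient()
    print(label, np.round(y[:4], 1))
```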
114

A Markov Chain Approach to IEEE 802.11 WLAN Performance Analysis

Xiong, Lixiang January 2008 (has links)
Doctor of Philosophy (PhD) / Wireless communication always attracts extensive research interest, as it is a core part of modern communication technology. During my PhD study, I have focused on two research areas of wireless communication: IEEE 802.11 network performance analysis, and wireless cooperative retransmission. The first part of this thesis focuses on IEEE 802.11 network performance analysis. Since IEEE 802.11 technology is the most popular wireless access technology, IEEE 802.11 network performance analysis is always an important research area. In this area, my work includes the development of three analytical models for various aspects of IEEE 802.11 network performance analysis. First, a two-dimensional Markov chain model is proposed for analysing the performance of IEEE 802.11e EDCA (Enhanced Distributed Channel Access). With this analytical model, the saturated throughput is obtained. Compared with the existing analytical models of EDCA, the proposed model includes more correct details of EDCA, and accordingly its results are more accurate. This better accuracy is also confirmed by the simulation study. Second, another two-dimensional Markov chain model is proposed for analysing the coexistence performance of IEEE 802.11 DCF (Distributed Coordination Function) and IEEE 802.11e EDCA wireless devices. The saturated throughput is obtained with the proposed analytical model. The simulation study verifies the proposed analytical model, and it shows that the channel access priority of DCF is similar to that of the best-effort access category in EDCA in the coexistence environment. The final work in this area is a hierarchical Markov chain model for investigating the impact of data-rate switching on the performance of IEEE 802.11 DCF. With this analytical model, the saturated throughput can be obtained. The simulation study verifies the accuracy of the model and shows the impact of data-rate switching under different network conditions. A series of threshold values for the channel condition as well as the number of stations are obtained to decide whether data-rate switching should be active or not. The second part of this thesis focuses on wireless cooperative retransmission. In this thesis, two uncoordinated distributed wireless cooperative retransmission strategies for single-hop connections are presented. In the proposed strategies, each uncoordinated cooperative neighbour randomly decides whether it should transmit to help the frame delivery, depending on some pre-calculated optimal transmission probabilities. In Strategy 1, the source only transmits once in the first slot, and only the neighbours are involved in the retransmission attempts in the subsequent slots. In Strategy 2, both the source and the neighbours participate in the retransmission attempts. Both strategies are first analysed with a simple memoryless channel model, and the results show the superior performance of Strategy 2. With the elementary results for the memoryless channel model, a more realistic two-state Markov fading channel model is used to investigate the performance of Strategy 2. The simulation study verifies the accuracy of our analysis and indicates the superior performance of Strategy 2 compared with the simple retransmission strategy and the traditional two-hop strategy.
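For a flavour of the kind of two-dimensional Markov chain analysis involved, the sketch below solves the classic Bianchi-style fixed point for saturated IEEE 802.11 DCF (the family of models the thesis extends to EDCA); it is not the thesis's own model, and the contention window, backoff stages, and station counts are illustrative.

```python
# Fixed-point solution of the classic Bianchi-style two-dimensional Markov
# chain model for saturated IEEE 802.11 DCF -- the kind of model the thesis
# extends to EDCA. W (CWmin), m (backoff stages) and the station counts are
# illustrative, not the thesis's settings.
from scipy.optimize import brentq

def dcf_fixed_point(n, W=32, m=5):
    """Solve for tau (per-slot transmit probability) and p (collision prob.)."""
    def resid(tau):
        p = 1.0 - (1.0 - tau) ** (n - 1)
        return tau - 2.0 * (1.0 - 2.0 * p) / (
            (1.0 - 2.0 * p) * (W + 1) + p * W * (1.0 - (2.0 * p) ** m)
        )
    tau = brentq(resid, 1e-9, 0.999)
    p = 1.0 - (1.0 - tau) ** (n - 1)
    return tau, p

for n in (5, 10, 20, 50):
    tau, p = dcf_fixed_point(n)
    p_tr = 1.0 - (1.0 - tau) ** n                    # slot holds >= 1 transmission
    p_s = n * tau * (1.0 - tau) ** (n - 1) / p_tr    # that transmission succeeds
    print(f"n={n:2d}  tau={tau:.4f}  p={p:.4f}  P(success | busy slot)={p_s:.4f}")
```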
115

最大利潤下規格上限與EWMA管制圖之設計 / Design of upper specification and EWMA control chart with maximal profit

蔡佳宏, Tsai, Chia Hung Unknown Date (has links)
The determination of economic control charts and the determination of specification limits with minimum cost are two different research topics. In this study, we first combine the design of economic control charts and the determination of specification limits to maximize the expected profit per unit time for a smaller-the-better quality variable following the gamma distribution. Because of the asymmetric distribution, we design the EWMA control chart with asymmetric control limits. We simultaneously determine the economic EWMA control chart and the upper specification limit that maximize the expected profit per unit time. We then extend the approach to determine the economic variable sampling interval (VSI) EWMA control chart and upper specification limit with maximum expected profit per unit time. In all our numerical examples of the two profit models, the optimal expected profit per unit time under inspection is higher than that of no inspection. The detection ability of the EWMA chart with an appropriate weight is always better than that of the X-bar probability chart. The detection ability of the VSI EWMA chart is also superior to that of the fixed sampling interval EWMA chart. Sensitivity analyses are provided to determine the significant parameters for the optimal design parameters and the optimal expected profit per unit time.
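A minimal sketch of an EWMA chart with asymmetric limits for skewed data (not the thesis's economic design): the limits are taken as extreme quantiles of the simulated in-control EWMA statistic rather than symmetric multiples of a standard deviation; the gamma parameters, weight, and quantiles are illustrative assumptions.

```python
# EWMA chart with asymmetric control limits for a gamma-distributed,
# smaller-the-better quality variable. The gamma parameters, the weight lam,
# and the limit quantiles are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
shape, scale = 2.0, 1.0      # in-control gamma parameters (mean = 2)
lam = 0.2                    # EWMA weight
mu0 = shape * scale

def ewma_path(x, z0):
    z, prev = np.empty_like(x), z0
    for i, xi in enumerate(x):
        prev = lam * xi + (1 - lam) * prev
        z[i] = prev
    return z

# Asymmetric limits from the simulated in-control distribution of the
# steady-state EWMA statistic (0.135% and 99.865% quantiles), reflecting
# the skewness of the gamma distribution.
sim = np.array([ewma_path(rng.gamma(shape, scale, 150), mu0)[-1]
                for _ in range(10_000)])
lcl, ucl = np.quantile(sim, [0.00135, 0.99865])
print(f"asymmetric limits around mu0={mu0:.2f}: LCL={lcl:.3f}, UCL={ucl:.3f}")

# Monitor a shifted process (mean increases, which is bad when smaller is better).
z = ewma_path(rng.gamma(shape, 1.4 * scale, 200), mu0)
out = (z > ucl) | (z < lcl)
print("first out-of-control signal at sample:",
      int(np.argmax(out)) + 1 if out.any() else "none")
```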
116

Modeling and Analysis of Two-Part Type Manufacturing Systems

Jang, Young Jae, Gershwin, Stanley B. 01 1900 (has links)
This paper presents a model and analysis of a synchronous tandem flow line that produces different part types on unreliable machines. The machines operate according to a static priority rule, operating on the highest-priority part whenever possible, and operating on lower-priority parts only when unable to produce those with higher priorities. We develop a new decomposition method to analyze the behavior of the manufacturing system by decomposing the long production line into small, analytically tractable components. As a first step in modeling a production line with more than one part type, we restrict ourselves to the case where there are two part types. Detailed modeling and derivations are presented for a small two-part-type production line that consists of two processing machines and two demand machines. Then, a generalized longer flow line is analyzed. Furthermore, estimates for performance measures, such as average buffer levels and production rates, are presented and compared to extensive discrete event simulation. The quantitative behavior of the two-part-type processing line under different demand scenarios is also provided. / Singapore-MIT Alliance (SMA)
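For intuition only, the sketch below simulates a much-simplified single-part-type, two-machine, one-buffer line with time-dependent failures and estimates the performance measures the decomposition targets (production rate and mean buffer level); it is not the paper's two-part-type model, and all probabilities and the buffer size are illustrative.

```python
# Much-simplified discrete-time simulation of an unreliable two-machine,
# one-buffer flow line (single part type), estimating production rate and
# mean buffer level. Failure/repair probabilities and buffer size are
# illustrative, not the paper's parameters.
import numpy as np

rng = np.random.default_rng(1)
p1, r1 = 0.01, 0.10      # machine 1: failure / repair probability per step
p2, r2 = 0.02, 0.15      # machine 2
N = 10                   # buffer capacity
steps = 200_000

up1 = up2 = True
buf = produced = buf_sum = 0

for _ in range(steps):
    # Machines fail while up and get repaired while down.
    up1 = (rng.random() >= p1) if up1 else (rng.random() < r1)
    up2 = (rng.random() >= p2) if up2 else (rng.random() < r2)
    # Machine 2 pulls from the buffer, machine 1 pushes into it.
    if up2 and buf > 0:
        buf -= 1
        produced += 1
    if up1 and buf < N:
        buf += 1
    buf_sum += buf

print(f"production rate ~ {produced / steps:.3f} parts/step")
print(f"average buffer level ~ {buf_sum / steps:.2f}")
```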
117

The relationship between citing and cited patterns in research papers and the fluctuation of journal ranking

Huang, Shou-ching 31 July 2007 (has links)
The cited frequency of a journal is usually used as an index to weigh academic research achievement and may provide useful information for the academic community. However, it is speculated that it may be influenced by factors such as the citing frequencies of other journals, the price of the journal, and so on. In this work, as an initial attempt, we investigate the correlation between the citing frequency and the cited frequency of the same journal. The data are taken from the JCR (Journal Citation Reports), published annually by ISI (Institute for Scientific Information), to understand the relationship between citing and cited patterns. Moreover, since the Impact Factor from the JCR has also been used as a basis for ranking, we discuss the variation in journal ranking across all fields based on Markov chain modeling. The ranking based on a modified impact factor is then compared with that based on the original impact factor provided by the JCR.
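As a generic illustration of Markov-chain-based journal ranking (not the thesis's modified impact factor), the sketch below builds a small invented citation matrix, turns it into a column-stochastic transition matrix, and ranks journals by the stationary distribution; the journal names and citation counts are fabricated for illustration.

```python
# Toy Markov-chain ranking of journals from a citation matrix:
# C[i, j] = number of citations from journal j to journal i.
# The matrix and journal names are invented; this is the generic
# stationary-distribution idea, not the thesis's modified impact factor.
import numpy as np

journals = ["A", "B", "C", "D"]
C = np.array([
    [ 0, 30, 10,  5],
    [20,  0, 25, 10],
    [ 5, 15,  0, 30],
    [10,  5, 20,  0],
], dtype=float)

# Column-stochastic transition matrix: a "random reader" follows citations.
P = C / C.sum(axis=0, keepdims=True)

# Power iteration for the stationary distribution pi = P pi.
pi = np.full(len(journals), 1.0 / len(journals))
for _ in range(1000):
    pi = P @ pi
pi /= pi.sum()

for j, score in sorted(zip(journals, pi), key=lambda t: -t[1]):
    print(f"journal {j}: stationary score {score:.3f}")
```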
118

Probability calculations of orthologous genes

Lagervik Öster, Alice January 2005 (has links)
The aim of this thesis is to formulate and implement an algorithm that calculates the probability of two genes being orthologs, given a gene tree and a species tree. To do this, reconciliations between the gene tree and the species tree are used. A birth-and-death process is used to model the evolution and to calculate the orthology probability. The birth and death parameters are approximated with Markov chain Monte Carlo (MCMC). An MCMC framework for probability calculations of reconciliations written by Arvestad et al. (2003) is used. Rules for orthologous reconciliations are developed and implemented to calculate the probability of the reconciliations that have two genes as orthologs. The rules were integrated with the Arvestad et al. (2003) framework, and the algorithm was then validated and tested.
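A minimal sketch of the modeling ingredient behind such calculations (not the thesis's reconciliation algorithm): a Gillespie-style simulation of a linear birth-death process for gene lineages, which can be used to estimate quantities such as the probability that a lineage is extinct by a time horizon; the birth/death rates and horizon are illustrative assumptions.

```python
# Gillespie-style simulation of a linear birth-death process for gene
# lineages; birth/death rates and the time horizon are illustrative.
import numpy as np

rng = np.random.default_rng(3)
birth, death, horizon = 0.4, 0.3, 5.0

def surviving_lineages(n0=1):
    """Number of lineages alive at the time horizon, starting from n0."""
    n, t = n0, 0.0
    while n > 0:
        rate = n * (birth + death)
        t += rng.exponential(1.0 / rate)
        if t >= horizon:
            break
        n += 1 if rng.random() < birth / (birth + death) else -1
    return n

samples = [surviving_lineages() for _ in range(20_000)]
print("P(lineage extinct by the horizon) ~", np.mean(np.array(samples) == 0))
print("E[number of surviving copies]     ~", np.mean(samples))
```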
119

Bayesian Modeling of Conditional Densities

Li, Feng January 2013 (has links)
This thesis develops models and associated Bayesian inference methods for flexible univariate and multivariate conditional density estimation. The models are flexible in the sense that they can capture widely differing shapes of the data. The estimation methods are specifically designed to achieve flexibility while still avoiding overfitting. The models are flexible both for a given covariate value and across covariate space. A key contribution of this thesis is that it provides general approaches to density estimation with highly efficient Markov chain Monte Carlo methods. The methods are illustrated on several challenging non-linear and non-normal datasets. In the first paper, a general model is proposed for flexibly estimating the density of a continuous response variable conditional on a possibly high-dimensional set of covariates. The model is a finite mixture of asymmetric Student-t densities with covariate-dependent mixture weights. The four parameters of the components, the mean, degrees of freedom, scale and skewness, are all modeled as functions of the covariates. The second paper explores how well a smooth mixture of symmetric components can capture skewed data. Simulations and applications on real data show that including covariate-dependent skewness in the components can lead to substantially improved performance on skewed data, often using a much smaller number of components. We also introduce smooth mixtures of gamma and log-normal components to model positively valued response variables. In the third paper we propose a multivariate Gaussian surface regression model that combines both additive splines and interactive splines, and a highly efficient MCMC algorithm that updates all the multi-dimensional knot locations jointly. We use shrinkage priors to avoid overfitting, with different estimated shrinkage factors for the additive and surface parts of the model, and also different shrinkage parameters for the different response variables. In the last paper we present a general Bayesian approach for directly modeling dependencies between variables as functions of explanatory variables in a flexible copula context. In particular, the Joe-Clayton copula is extended to have covariate-dependent tail dependence and correlations. Posterior inference is carried out using a novel and efficient simulation method. The appendix of the thesis documents the computational implementation details. / At the time of the doctoral defense, the following papers were unpublished and had a status as follows: Paper 3: In press. Paper 4: Manuscript.
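To illustrate the smooth-mixture idea in its simplest form (a toy sketch, not the thesis's models or estimation method), the code below evaluates a two-component mixture of symmetric Student-t densities whose mixture weight and component means depend on a covariate through simple link functions; every coefficient is invented for illustration.

```python
# Sketch of a smooth mixture of symmetric Student-t densities with
# covariate-dependent mixture weights and component means (the kind of
# conditional density explored in the second paper); all coefficients
# below are invented for illustration.
import numpy as np
from scipy.stats import t as student_t

def conditional_density(y, x):
    """p(y | x) for a two-component smooth mixture."""
    # Mixture weight via a logistic link in the covariate x.
    w = 1.0 / (1.0 + np.exp(-(-0.5 + 1.2 * x)))
    # Component means are linear in x; scales and df are fixed here.
    mu1, mu2 = 1.0 + 0.8 * x, -2.0 + 0.3 * x
    f1 = student_t.pdf(y, df=5, loc=mu1, scale=0.7)
    f2 = student_t.pdf(y, df=10, loc=mu2, scale=1.5)
    return w * f1 + (1.0 - w) * f2

# The conditional density changes shape as the covariate moves.
ys = np.linspace(-6, 6, 5)
for x in (-1.0, 0.0, 1.0):
    print(f"x={x:+.1f}:", np.round(conditional_density(ys, x), 3))
```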
120

On time duality for quasi-birth-and-death processes

Keller, Peter, Roelly, Sylvie, Valleriani, Angelo January 2012 (has links)
We say that (weak/strong) time duality holds for continuous-time quasi-birth-and-death processes if, starting from a fixed level, the first hitting time of the next upper level and the first hitting time of the next lower level have the same distribution. We present here a criterion for time duality in the case where transitions from one level to another have to pass through a given single state, the so-called bottleneck property. We also prove that a weaker form of reversibility, called balanced under permutation, is sufficient for the time duality to hold. We then discuss the general case.
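As an empirical companion to the definition (not the paper's proof technique), the sketch below simulates a small invented QBD-like chain and compares the two hitting-time distributions from a fixed level; for an arbitrary chain the distributions differ, and the paper's criteria (e.g. the bottleneck property) describe when they coincide. All rates and the level/phase structure are illustrative assumptions.

```python
# Empirical check of (weak) time duality for a small QBD-like CTMC: compare
# the first hitting time of the next upper level with that of the next lower
# level, starting from a fixed level. The chain is invented for illustration.
import numpy as np

rng = np.random.default_rng(11)
LEVELS = 6

def rates_from(level, phase):
    """Outgoing transitions as (target_level, target_phase, rate)."""
    out = []
    # Level changes always land in phase 0 -- a crude bottleneck structure.
    if level < LEVELS - 1:
        out.append((level + 1, 0, 1.0))
    if level > 0:
        out.append((level - 1, 0, 1.0))
    # Phase switch within the current level.
    out.append((level, 1 - phase, 0.5))
    return out

def hitting_times(start, target_level, n_runs=10_000):
    times = np.empty(n_runs)
    for k in range(n_runs):
        (lvl, ph), t = start, 0.0
        while lvl != target_level:
            trans = rates_from(lvl, ph)
            total = sum(r for *_, r in trans)
            t += rng.exponential(1.0 / total)
            probs = [r / total for *_, r in trans]
            lvl, ph, _ = trans[rng.choice(len(trans), p=probs)]
        times[k] = t
    return times

up = hitting_times((2, 0), 3)     # first hitting time of the next upper level
down = hitting_times((2, 0), 1)   # first hitting time of the next lower level
print(f"up  : mean {up.mean():.3f}, median {np.median(up):.3f}")
print(f"down: mean {down.mean():.3f}, median {np.median(down):.3f}")
```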
