61

Understanding the effects of obesity and age on likelihood of tripping and subsequent balance recovery

Garman, Christina Maria Rossi 15 April 2015 (has links)
Fall-related injuries are a major public health concern due to their high associated medical costs and negative impact on quality of life. Obese and older adults are reported to fall more frequently than their normal-weight and young counterparts. To help identify potential mechanisms of these falls, the research in this dissertation investigated the effects of obesity and age on the likelihood of tripping and on subsequent balance recovery. Four experimental studies were conducted. The first study investigated the effects of obesity, age, and gender on the likelihood of tripping during level walking. Likelihood of tripping was assessed with median minimum foot clearance (MFC) and MFC interquartile range (IQR). Obesity did not increase the likelihood of tripping, suggesting that the increased rate of falls among obese adults is not likely due to a greater likelihood of tripping over an unseen obstacle. Additional results suggested that females and individuals of shorter stature have an increased likelihood of tripping compared to their male and taller counterparts. The second study had two aims: first, to investigate the effects of load carriage and ramp walking on the likelihood of tripping, and second, to investigate the effects of age and obesity on the likelihood of tripping during load carriage and ramp walking. Again, likelihood of tripping was assessed with median MFC and MFC IQR. Load carriage increased the likelihood of tripping during both level and ramp walking, and obesity and age increased the likelihood of tripping during selected combinations of load carriage and/or ramp walking. These results suggest that the increased rates of falls during load carriage and among obese and older adult workers reported elsewhere may be due in part to an increased likelihood of tripping. The third study proposed a new method for investigating the likelihood of tripping as a function of obstacle height. The proposed method aimed to resolve the ambiguous results often encountered when MFC central tendency and variability are used to quantify likelihood of tripping. The method used trip probability curves and a statistical bootstrapping technique to compare trip probability at specific obstacle heights between groups of interest. An additional benefit of this method was that it identified effects of factors not identifiable by the commonly used ANOVA analysis of MFC central tendency and variability. The fourth study investigated the effects of obesity, age, and gender on balance recovery following a laboratory-induced trip perturbation. Measures of balance recovery included fall rate, stepping strategy and characteristics, and trunk kinematics. Obese, older, and female adults fell more frequently after the induced trip, which may help explain the higher fall rates among obese, older, and female adults reported elsewhere. Failed recoveries were associated with higher peak trunk angles and angular velocities and with the use of a lowering strategy. Obese, older, and female adults had higher peak trunk angles and angular velocities, and older adults and females used lowering strategies more often. These alterations in trunk kinematics and stepping strategy may have contributed to the higher fall rate among these individuals. / Ph. D.
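A minimal sketch of the bootstrapping idea behind the third study's trip probability curves: estimate P(trip) = P(MFC below a given obstacle height) for two groups and bootstrap the between-group difference. This is not the dissertation's code; the MFC distributions, group labels, sample sizes, and the 10-mm obstacle height are illustrative assumptions only.

    # Hypothetical MFC samples (metres) for two groups; all values are invented.
    import numpy as np

    rng = np.random.default_rng(0)
    mfc_group_a = rng.lognormal(mean=np.log(0.018), sigma=0.35, size=400)
    mfc_group_b = rng.lognormal(mean=np.log(0.015), sigma=0.45, size=400)

    def trip_probability(mfc, obstacle_height):
        """Empirical P(trip) = P(MFC < obstacle height)."""
        return np.mean(mfc < obstacle_height)

    def bootstrap_diff(a, b, obstacle_height, n_boot=2000):
        """Percentile bootstrap CI for the difference in trip probability (B - A)."""
        diffs = np.empty(n_boot)
        for i in range(n_boot):
            ra = rng.choice(a, size=a.size, replace=True)
            rb = rng.choice(b, size=b.size, replace=True)
            diffs[i] = trip_probability(rb, obstacle_height) - trip_probability(ra, obstacle_height)
        return np.percentile(diffs, [2.5, 97.5])

    h = 0.010  # 10-mm obstacle, an arbitrary example height
    print("P(trip) group A:", trip_probability(mfc_group_a, h))
    print("P(trip) group B:", trip_probability(mfc_group_b, h))
    print("95% bootstrap CI for the difference:", bootstrap_diff(mfc_group_a, mfc_group_b, h))

Repeating the comparison over a grid of obstacle heights would trace out the trip probability curves the abstract describes.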
62

ModPET: Novel Applications of Scintillation Cameras to Preclinical PET

Moore, Stephen K. January 2011 (has links)
We have designed, developed, and assessed a novel preclinical positron emission tomography (PET) imaging system named ModPET. The system was developed using modular gamma cameras, originally developed for SPECT applications at the Center for Gamma Ray Imaging (CGRI), but configured for PET imaging by enabling coincidence timing. A pair of cameras is mounted on a flexible system gantry that also allows for acquisition of optical images, so that PET images can be registered to an anatomical reference. Data are acquired in a super list-mode form in which raw PMT signals and event times are accumulated in event lists for each camera. Event parameter estimation of position and energy is carried out with maximum-likelihood methods using careful camera calibrations, accomplished with collimated beams of 511-keV photons and a new iterative mean-detector-response-function processing routine. Intrinsic lateral spatial resolution for 511-keV photons was found to be approximately 1.6 mm in each direction. Lists of coincidence pairs are found by comparing event times in the two independent camera lists, using a timing window of 30 nanoseconds. By bringing the 4.5-inch-square cameras into close proximity, with a 32-mm separation for mouse imaging, a solid-angle coverage of ∼75% partially compensates for the relatively low stopping power of the 5-mm-thick NaI crystals to give a measured sensitivity of up to 0.7%. An NECR analysis yields 11,000 pairs per second with 84 μCi of activity. A list-mode MLEM reconstruction algorithm was developed to reconstruct objects in an 88 x 88 x 30 mm field of view. Tomographic resolution tests with a phantom suggest a lateral resolution of 1.5 mm and a slightly degraded resolution of 2.5 mm in the direction normal to the camera faces. The system can also be configured to provide (99m)Tc planar scintigraphy images. Selected biological studies of inflammation, apoptosis, tumor metabolism, and bone osteogenic activity are presented.
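A rough sketch of the coincidence-pairing step described above: given two time-sorted event lists from independent cameras, keep pairs whose time difference falls within the coincidence window. Only the 30-ns window is taken from the abstract; the event times, list sizes, and the two-pointer search are illustrative assumptions, not the ModPET software.

    import numpy as np

    WINDOW_NS = 30.0

    def find_coincidences(times_a_ns, times_b_ns, window_ns=WINDOW_NS):
        """Return index pairs (i, j) with |t_a[i] - t_b[j]| <= window_ns.
        Both event lists are assumed sorted in time."""
        pairs = []
        j_start = 0
        for i, ta in enumerate(times_a_ns):
            # advance the lower bound in list B past events too early to match
            while j_start < len(times_b_ns) and times_b_ns[j_start] < ta - window_ns:
                j_start += 1
            j = j_start
            while j < len(times_b_ns) and times_b_ns[j] <= ta + window_ns:
                pairs.append((i, j))
                j += 1
        return pairs

    rng = np.random.default_rng(1)
    t_a = np.sort(rng.uniform(0, 1e6, size=2000))   # camera A event times (ns), synthetic
    t_b = np.sort(rng.uniform(0, 1e6, size=2000))   # camera B event times (ns), synthetic
    print("candidate coincidence pairs:", len(find_coincidences(t_a, t_b)))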
63

Likelihood-Based Tests for Common and Idiosyncratic Unit Roots in the Exact Factor Model

Solberger, Martin January 2013 (has links)
Dynamic panel data models are widely used by econometricians to study the economics of, for example, people, firms, regions, or countries over time, by pooling information over the cross-section. Though much of the panel research concerns inference in stationary models, macroeconomic data such as GDP, prices, and interest rates are typically trending over time and require, in one way or another, a nonstationary analysis. In time series analysis it is well established how autoregressive unit roots give rise to stochastic trends, implying that random shocks to a dynamic process are persistent rather than transitory. Because the implications of, say, government policy actions are fundamentally different if shocks to the economy are lasting than if they are temporary, there is now a vast number of univariate time series unit root tests available. Similarly, panel unit root tests have been designed to test for the presence of stochastic trends within a panel data set and to what degree they are shared by the panel individuals. Today, growing data sets certainly offer new possibilities for panel data analysis, but also pose new problems concerning double-indexed limit theory, unobserved heterogeneity, and cross-sectional dependencies. For example, economic shocks, such as technological innovations, are often global and make national aggregates cross-country dependent and related in international business cycles. To impose a strong form of cross-sectional dependence, panel unit root tests often assume that the unobserved panel errors follow a dynamic factor model. The errors will then contain one part which is shared by the panel individuals, a common component, and one part which is individual-specific, an idiosyncratic component. This is appealing from the perspective of economic theory, because unobserved heterogeneity may be driven by global common shocks, which are well captured by dynamic factor models. Yet, only a handful of tests have been derived to test for unit roots in the common and in the idiosyncratic components separately. More importantly, likelihood-based methods, which are commonly used in classical factor analysis, have been ruled out for large dynamic factor models due to the considerable number of parameters. This thesis consists of four papers in which we consider the exact factor model, where the idiosyncratic components are mutually independent, so that any cross-sectional dependence is through the common factors only. Within this framework we derive likelihood-based tests for common and idiosyncratic unit roots. In doing so we address an important issue for dynamic factor models, because likelihood-based tests, such as the Wald test, the likelihood ratio test, and the Lagrange multiplier test, are well known to be asymptotically most powerful against local alternatives. Our approach is specific-to-general, meaning that we start with restrictions on the parameter space that allow us to use explicit maximum likelihood estimators. We then proceed by relaxing some of the assumptions and consider a more general framework requiring numerical maximum likelihood estimation. By simulation we compare the size and power of our tests with some established panel unit root tests. The simulations suggest that the likelihood-based tests are locally powerful and in some cases more robust in terms of size. / Solving Macroeconomic Problems Using Non-Stationary Panel Data
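A minimal illustrative sketch of the data-generating structure the abstract describes: panel errors decomposed as e_it = lambda_i * F_t + u_it, with a possible unit root in the common factor F_t and mutually independent idiosyncratic AR(1) components u_it. This is not the thesis' estimation code; the panel dimensions, autoregressive parameters, and loadings are illustrative assumptions, and the likelihood-ratio testing itself is only indicated in the final comment.

    import numpy as np

    rng = np.random.default_rng(2)
    N, T = 20, 200                      # panel individuals and time points (arbitrary)
    rho_common, rho_idio = 1.0, 0.5     # unit root in the common factor, stationary idiosyncratics

    # common factor: a random walk when rho_common = 1
    F = np.zeros(T)
    for t in range(1, T):
        F[t] = rho_common * F[t - 1] + rng.normal()

    # idiosyncratic components: independent AR(1) series, one per individual
    U = np.zeros((N, T))
    for i in range(N):
        for t in range(1, T):
            U[i, t] = rho_idio * U[i, t - 1] + rng.normal()

    loadings = rng.uniform(0.5, 1.5, size=N)
    E = loadings[:, None] * F[None, :] + U      # N x T panel of errors

    # A likelihood-ratio statistic would compare the maximized log-likelihood under the
    # restriction rho_common = 1 with an unrestricted fit; only the model structure on
    # which such tests are built is shown here.
    print("panel shape:", E.shape)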
64

Verossimilhança hierárquica em modelos de fragilidade / Hierarchical likelihood in frailty models

Amorim, William Nilson de 12 February 2015 (has links)
Estimation procedures for frailty models have been widely discussed in the statistical literature due to their widespread use in survival analysis. Several methods for estimating the model parameters have been developed: procedures based on the EM algorithm, Markov chain Monte Carlo, estimation based on partial likelihood, penalized likelihood, quasi-likelihood, and others. An alternative currently in use is the hierarchical likelihood. The main objective of this work was to study the advantages and disadvantages of the hierarchical likelihood for inference in frailty models relative to the penalized likelihood, currently the most widely used method. We applied both methodologies to a real data set, using the statistical packages available in the R software, and carried out a simulation study to compare the bias and mean squared error of the estimates from each approach. Based on the results, the two methodologies produced very similar estimates, especially for the fixed effects. From a practical standpoint, the largest difference found was the running time of the estimation algorithm, which was much longer for the hierarchical approach.
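A hedged sketch of the kind of simulation study described above: generate clustered survival data from a shared gamma-frailty model and compare estimators by bias and mean squared error. The fitting step (hierarchical versus penalized likelihood, done in R in the dissertation) is not reproduced here; the "estimates" at the end are stand-ins used only to show the comparison, and all parameter values are invented.

    import numpy as np

    rng = np.random.default_rng(3)

    def simulate_frailty_data(n_clusters=50, cluster_size=4, frailty_var=0.5, beta=0.7):
        """Exponential survival times with a shared gamma frailty per cluster
        (gamma with mean 1 and variance frailty_var) and one covariate."""
        shape = 1.0 / frailty_var
        frailties = rng.gamma(shape, scale=frailty_var, size=n_clusters)
        x = rng.normal(size=(n_clusters, cluster_size))
        rate = frailties[:, None] * np.exp(beta * x)    # subject-level hazard
        times = rng.exponential(1.0 / rate)
        return x, times

    def bias_and_mse(estimates, truth):
        estimates = np.asarray(estimates)
        return estimates.mean() - truth, np.mean((estimates - truth) ** 2)

    x, times = simulate_frailty_data()
    print("simulated covariates and times:", x.shape, times.shape)

    # stand-in fitted values for the two approaches (illustrative placeholders only)
    beta_true = 0.7
    est_hlik = rng.normal(beta_true, 0.05, size=200)
    est_plik = rng.normal(beta_true, 0.05, size=200)
    print("h-likelihood bias, MSE:", bias_and_mse(est_hlik, beta_true))
    print("penalized    bias, MSE:", bias_and_mse(est_plik, beta_true))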
66

Rarities of genotype profiles in a normal Swedish population

Hedell, Ronny January 2010 (has links)
Investigation of stains from crime scenes is commonly used in the search for criminals. At The National Laboratory of Forensic Science, where these stains are examined, a number of questions of theoretical and practical interest regarding the databases of DNA profiles and the strength of DNA evidence against a suspect in a trial have not been fully investigated. The first part of this thesis deals with how a sample of DNA profiles from a population is used in the process of estimating the strength of DNA evidence in a trial, taking population genetic factors into account. We then consider how to combine hypotheses regarding the relationship between a suspect and other possible donors of the stain from the crime scene by two applications of Bayes’ theorem. After that we assess the DNA profiles that minimize the strength of DNA evidence against a suspect, and investigate how the strength is affected by sampling error using the bootstrap method and a Bayesian method. In the last part of the thesis we examine discrepancies between different databases of DNA profiles by both descriptive and inferential statistics, including likelihood ratio tests and Bayes factor tests. Little evidence of major differences is found.
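A rough illustration, not the thesis' calculations, of how an allele-frequency sample feeds the strength of DNA evidence and how the bootstrap can expose sampling error in it. The match-probability formula below is one commonly cited theta-corrected form for a heterozygous genotype (a Balding–Nichols-style population-genetics correction); the allele counts, theta value, and locus are invented for the example.

    import numpy as np

    rng = np.random.default_rng(4)

    def het_match_probability(p_a, p_b, theta):
        """Theta-corrected match probability for an unrelated person sharing a
        heterozygous genotype A/B (one commonly cited correction; an assumption here)."""
        num = 2 * (theta + (1 - theta) * p_a) * (theta + (1 - theta) * p_b)
        return num / ((1 + theta) * (1 + 2 * theta))

    # illustrative allele database: counts for alleles at a single locus
    allele_counts = {"A": 120, "B": 80, "C": 300, "D": 500}
    total = sum(allele_counts.values())
    theta = 0.01

    p_a, p_b = allele_counts["A"] / total, allele_counts["B"] / total
    lr_point = 1.0 / het_match_probability(p_a, p_b, theta)
    print("point likelihood ratio:", round(lr_point, 1))

    # bootstrap the allele sample to see how sampling error moves the likelihood ratio
    alleles = np.repeat(list(allele_counts), list(allele_counts.values()))
    lrs = []
    for _ in range(2000):
        resample = rng.choice(alleles, size=total, replace=True)
        pa = np.mean(resample == "A")
        pb = np.mean(resample == "B")
        lrs.append(1.0 / het_match_probability(pa, pb, theta))
    print("bootstrap 2.5/97.5 percentiles of the LR:", np.percentile(lrs, [2.5, 97.5]).round(1))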
67

Aspects of Composite Likelihood Inference

Jin, Zi 07 March 2011 (has links)
A composite likelihood consists of a combination of valid likelihood objects, and it is typically of interest to adopt lower-dimensional marginal likelihoods. Composite marginal likelihood appears to be an attractive alternative for modeling complex data, and has received increasing attention for handling high-dimensional data sets when the joint distribution is computationally difficult to evaluate, or intractable due to a complex dependence structure. We present some aspects of methodological development in composite likelihood inference. The resulting estimator enjoys desirable asymptotic properties such as consistency and asymptotic normality. Composite likelihood based test statistics and their asymptotic distributions are summarized. Higher-order asymptotic properties of the signed composite likelihood root statistic are explored. Moreover, we aim to compare the accuracy and efficiency of composite likelihood estimation relative to estimation based on the ordinary likelihood. Analytical and simulation results are presented for different models, including multivariate normal distributions, time series models, and correlated binary data.
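A small sketch, under simplifying assumptions, of the pairwise composite likelihood idea for one of the models mentioned: a zero-mean multivariate normal with unit variances and a single exchangeable correlation rho. The full joint likelihood is replaced by the sum of bivariate marginal log-likelihoods over all coordinate pairs; the dimension, sample size, true rho, and the grid search are illustrative choices, not the thesis' setup.

    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(5)

    def bivariate_normal_logpdf(x, y, rho):
        """Log-density of a standard bivariate normal with correlation rho."""
        q = (x**2 - 2 * rho * x * y + y**2) / (1 - rho**2)
        return -np.log(2 * np.pi) - 0.5 * np.log(1 - rho**2) - 0.5 * q

    def pairwise_composite_loglik(data, rho):
        """Sum bivariate marginal log-likelihoods over all pairs of coordinates."""
        total = 0.0
        for i, j in combinations(range(data.shape[1]), 2):
            total += bivariate_normal_logpdf(data[:, i], data[:, j], rho).sum()
        return total

    # simulate an exchangeable-correlation normal sample (true rho = 0.4, dimension 6)
    d, n, true_rho = 6, 500, 0.4
    cov = np.full((d, d), true_rho) + (1 - true_rho) * np.eye(d)
    data = rng.multivariate_normal(np.zeros(d), cov, size=n)

    # maximize the composite log-likelihood over a grid of rho values
    grid = np.linspace(-0.15, 0.9, 211)
    cl = [pairwise_composite_loglik(data, r) for r in grid]
    print("maximum composite likelihood estimate of rho:", grid[int(np.argmax(cl))].round(3))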
69

On Intraclass Correlation Coefficients

Yu, Jianhui 17 July 2009 (has links)
This paper uses the maximum likelihood estimation method to estimate the common intraclass correlation coefficients for multivariate datasets. We discuss a graphical tool, the Q-Q plot, for testing equality of the common intraclass correlation coefficients. The Kolmogorov-Smirnov test and the Cramér-von Mises test are used to check whether the intraclass correlation coefficients are the same among populations. Bootstrap and empirical likelihood methods are applied to construct confidence intervals for the common intraclass correlation coefficients.
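A brief sketch, with invented data, of one of the approaches the abstract mentions: estimating an intraclass correlation coefficient from a balanced one-way layout via the ANOVA estimator and bootstrapping a confidence interval for it. The estimator form, group sizes, and variance components are illustrative assumptions, not the paper's analysis.

    import numpy as np

    rng = np.random.default_rng(6)

    def icc_oneway(groups):
        """ANOVA estimator of ICC(1) for a balanced one-way design (rows = groups)."""
        groups = np.asarray(groups, dtype=float)
        n_groups, k = groups.shape
        grand = groups.mean()
        group_means = groups.mean(axis=1)
        msb = k * np.sum((group_means - grand) ** 2) / (n_groups - 1)
        msw = np.sum((groups - group_means[:, None]) ** 2) / (n_groups * (k - 1))
        return (msb - msw) / (msb + (k - 1) * msw)

    # synthetic balanced data: 30 groups of size 5 with a true ICC of about 0.5
    n_groups, k, sigma_b, sigma_w = 30, 5, 1.0, 1.0
    group_effects = rng.normal(0, sigma_b, size=n_groups)
    data = group_effects[:, None] + rng.normal(0, sigma_w, size=(n_groups, k))

    point = icc_oneway(data)
    boot = [icc_oneway(data[rng.integers(0, n_groups, size=n_groups)]) for _ in range(4000)]
    print("ICC estimate:", round(point, 3))
    print("95% bootstrap CI:", np.percentile(boot, [2.5, 97.5]).round(3))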
70

Inference for Cox's Regression Model via a New Version of Empirical Likelihood

Jinnah, Ali 28 November 2007 (has links)
The Cox proportional hazards model is one of the most popular tools used in survival analysis. The empirical likelihood (EL) method has been used to study the Cox proportional hazards model. In recent work by Qin and Jing (2001), an empirical-likelihood-based confidence region is constructed under the assumption that the baseline hazard function is known. However, in Cox's regression model the baseline hazard function is unspecified. In this thesis, we re-formulate the empirical likelihood for the vector of regression parameters by estimating the baseline hazard function, and the EL confidence regions are obtained accordingly. In addition, an adjusted empirical likelihood (AEL) method is proposed. Furthermore, we conduct extensive simulation studies to evaluate the performance of the proposed empirical likelihood methods in terms of coverage probabilities, comparing them with the normal-approximation-based method. The simulation studies show that all three methods produce similar coverage probabilities.
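A schematic sketch of the kind of coverage-probability simulation described above, restricted to the normal-approximation (Wald) interval for a single Cox regression coefficient; the empirical-likelihood and adjusted-EL regions from the thesis are not reproduced here. The data-generating settings, sample sizes, and the simple one-covariate Newton–Raphson fit are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(7)

    def fit_cox_one_covariate(time, event, x, n_iter=20):
        """Newton-Raphson for the Cox partial likelihood with one covariate (no ties).
        Returns (beta estimate, standard error from the observed information)."""
        beta = 0.0
        order = np.argsort(time)
        time, event, x = time[order], event[order], x[order]
        for _ in range(n_iter):
            score, info = 0.0, 0.0
            for i in np.where(event)[0]:
                at_risk = slice(i, len(time))        # sorted times: risk set = this time or later
                w = np.exp(beta * x[at_risk])
                xbar = np.sum(w * x[at_risk]) / np.sum(w)
                x2bar = np.sum(w * x[at_risk] ** 2) / np.sum(w)
                score += x[i] - xbar
                info += x2bar - xbar ** 2
            beta += score / info
        return beta, 1.0 / np.sqrt(info)

    def coverage(beta_true=0.5, n=100, n_rep=200):
        """Fraction of replications in which the 95% Wald interval covers beta_true."""
        hits = 0
        for _ in range(n_rep):
            x = rng.normal(size=n)
            t = rng.exponential(1.0 / np.exp(beta_true * x))   # baseline hazard 1
            c = rng.exponential(2.0, size=n)                   # independent censoring
            time, event = np.minimum(t, c), t <= c
            b, se = fit_cox_one_covariate(time, event, x)
            hits += (b - 1.96 * se) <= beta_true <= (b + 1.96 * se)
        return hits / n_rep

    print("estimated coverage of the 95% Wald interval:", coverage())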
