191

Life Data Analysis of Repairable Systems: A Case Study on Brigham Young University Media Rooms

Manortey, Stephen Oluaku, 05 December 2006
It is an indisputable fact that most systems, under consistent usage, are bound to fail at some point in the performance of their intended functions. When this occurs, various strategies are employed to restore them to satisfactory performance. These may include replacing the failed component with a new one, swapping parts, or resetting adjustable parts, among others. Any such system is referred to as a repairable system. There is a need to study these systems, use statistical models to predict their failure times, and set modalities in place to repair them at the least cost to the operator. The main objective of this paper is to analyze data collected on the projectors used for teaching and learning activities in designated rooms at Brigham Young University (BYU), under the auspices of the Office of Information Technology (OIT), in order to estimate the failure rate of such systems, predict the optimal replacement time for parts with a view to maximizing system reliability, and finally formulate a cost model for estimating the optimal cost involved in servicing a failed projector.
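The abstract does not name a specific failure model, but a common choice for repairable-system data of this kind is the power-law nonhomogeneous Poisson process (the Crow-AMSAA model), whose maximum likelihood estimates have a closed form for time-truncated data. The sketch below is illustrative only, with simulated failure times rather than the BYU projector data:

```python
import math
import random

def power_law_mle(times, T):
    """Time-truncated MLE for the power-law NHPP (Crow-AMSAA model)
    observed on (0, T], with intensity u(t) = lam * beta * t**(beta - 1).
    beta > 1 indicates a deteriorating system, beta < 1 an improving one."""
    n = len(times)
    beta = n / sum(math.log(T / t) for t in times)
    lam = n / T ** beta
    return lam, beta

# Check on simulated data: conditional on n failures in (0, T], the
# failure times of this process are distributed as T * U**(1/beta)
# for uniform U, so the estimator should approximately recover beta.
random.seed(1)
T, true_beta = 1000.0, 1.5
times = sorted(T * random.random() ** (1 / true_beta) for _ in range(400))
lam_hat, beta_hat = power_law_mle(times, T)
```

An estimated beta above 1, as here, would signal an increasing failure intensity and hence a case for preventive replacement.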
192

Parameter Estimation for the Beta Distribution

Owen, Claire Elayne Bangerter, 20 November 2008
The beta distribution is useful in modeling continuous random variables that lie between 0 and 1, such as proportions and percentages. The beta distribution takes on many different shapes and may be described by two shape parameters, alpha and beta, that can be difficult to estimate. Maximum likelihood and method of moments estimation are possible, though method of moments is much more straightforward. We examine both of these methods here, and compare them to three more proposed methods of parameter estimation: 1) a method used in the Program Evaluation and Review Technique (PERT), 2) a modification of the two-sided power distribution (TSP), and 3) a quantile estimator based on the first and third quartiles of the beta distribution. We find the quantile estimator performs as well as maximum likelihood and method of moments estimators for most beta distributions. The PERT and TSP estimators do well for a smaller subset of beta distributions, though they never outperform the maximum likelihood, method of moments, or quantile estimators. We apply these estimation techniques to two data sets to see how well they approximate real data from Major League Baseball (batting averages) and the U.S. Department of Energy (radiation exposure). We find the maximum likelihood, method of moments, and quantile estimators perform well with batting averages (sample size 160), and the method of moments and quantile estimators perform well with radiation exposure proportions (sample size 20). Maximum likelihood estimators would likely do fine with such a small sample size were it not for the iterative method needed to solve for alpha and beta, which is quite sensitive to starting values. The PERT and TSP estimators do more poorly in both situations. We conclude that in addition to maximum likelihood and method of moments estimation, our method of quantile estimation is efficient and accurate in estimating parameters of the beta distribution.
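Of the estimators compared, the method of moments is the most straightforward, as the abstract notes: with sample mean m and variance v, set c = m(1 - m)/v - 1, then alpha-hat = m*c and beta-hat = (1 - m)*c. A minimal sketch (simulated data, not the baseball or radiation data sets):

```python
import random
import statistics

def beta_mom(xs):
    """Method-of-moments estimates of (alpha, beta) for a beta sample.
    Matches the sample mean m and variance v to the theoretical moments:
    alpha = m*c and beta = (1-m)*c, where c = m(1-m)/v - 1."""
    m = statistics.fmean(xs)
    v = statistics.variance(xs)
    c = m * (1 - m) / v - 1
    return m * c, (1 - m) * c

# Simulated batting-average-like data from a Beta(2, 5) distribution.
random.seed(7)
sample = [random.betavariate(2.0, 5.0) for _ in range(5000)]
a_hat, b_hat = beta_mom(sample)
```

Maximum likelihood, by contrast, has no closed form for the beta distribution and requires iterating on the digamma equations, which is the sensitivity-to-starting-values issue the abstract raises.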
193

Modeling Temperature Reduction in Tendons Using Gaussian Processes Within a Dynamic Linear Model

Wyss, Richard David, 02 July 2009
The time it takes an athlete to recover from an injury can be highly influenced by training procedures as well as the medical care and physical therapy received. When an injury occurs to the muscles or tendons of an athlete, it is desirable to cool the muscles and tendons within the body to reduce inflammation, thereby reducing the recovery time. Consequently, finding a method of treatment that is effective in reducing tendon temperatures is beneficial to increasing the speed at which the athlete is able to recover. In this project, Bayesian inference with Gaussian processes will be used to model the effect that different treatments have in reducing tendon temperature within the ankle. Gaussian processes provide a powerful methodology for modeling data that exhibit complex characteristics such as nonlinear behavior while retaining mathematical simplicity.
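As a small illustration of the Gaussian-process machinery the abstract refers to (not the thesis's actual model, which embeds the GP in a dynamic linear model), the posterior mean of a zero-mean GP with a squared-exponential kernel has a closed form; for two training points the 2x2 kernel matrix can even be inverted by hand:

```python
import math

def sq_exp(x1, x2, length=1.0):
    """Squared-exponential (RBF) covariance between two inputs."""
    return math.exp(-0.5 * ((x1 - x2) / length) ** 2)

def gp_posterior_mean(x_train, y_train, x_star, length=1.0, nugget=1e-6):
    """Zero-mean GP posterior mean at x_star given exactly two training
    points, using the closed-form inverse of the 2x2 kernel matrix."""
    (xa, xb), (ya, yb) = x_train, y_train
    kaa = sq_exp(xa, xa, length) + nugget
    kbb = sq_exp(xb, xb, length) + nugget
    kab = sq_exp(xa, xb, length)
    det = kaa * kbb - kab * kab
    # Weights w = K^{-1} y via the 2x2 inverse formula.
    wa = (kbb * ya - kab * yb) / det
    wb = (-kab * ya + kaa * yb) / det
    return sq_exp(x_star, xa, length) * wa + sq_exp(x_star, xb, length) * wb

# Interpolating two hypothetical temperature readings: the posterior
# mean passes (almost) through the data and shrinks toward the prior
# mean of 0 far away from it.
m_mid = gp_posterior_mean((0.0, 2.0), (10.0, 6.0), 1.0)
m_far = gp_posterior_mean((0.0, 2.0), (10.0, 6.0), 50.0)
```

This smooth, data-interpolating behavior is what makes GPs attractive for nonlinear temperature curves while keeping the mathematics tractable.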
194

Parameter Estimation for the Lognormal Distribution

Ginos, Brenda Faith, 13 November 2009
The lognormal distribution is useful in modeling continuous random variables which are greater than or equal to zero. Example scenarios in which the lognormal distribution is used include, among many others: in medicine, latent periods of infectious diseases; in environmental science, the distribution of particles, chemicals, and organisms in the environment; in linguistics, the number of letters per word and the number of words per sentence; and in economics, age of marriage, farm size, and income. The lognormal distribution is also useful in modeling data which would be considered normally distributed except for the fact that it may be more or less skewed (Limpert, Stahel, and Abbt 2001). Appropriately estimating the parameters of the lognormal distribution is vital for the study of these and other subjects. Depending on the values of its parameters, the lognormal distribution takes on various shapes, including a bell curve similar to the normal distribution. This paper contains a simulation study concerning the effectiveness of various estimators for the parameters of the lognormal distribution. A comparison is made among Maximum Likelihood estimators, Method of Moments estimators, estimators by Serfling (2002), and estimators by Finney (1941). A simulation is conducted to determine which parameter estimators work better in various parameter combinations and sample sizes of the lognormal distribution. We find that the Maximum Likelihood and Finney estimators perform the best overall, with a preference given to Maximum Likelihood over the Finney estimators because of its relative simplicity. The Method of Moments estimators seem to perform best when σ is less than or equal to one, and the Serfling estimators are quite accurate in estimating μ but not σ in all regions studied.
Finally, these parameter estimators are applied to a data set counting the number of words in each sentence for various documents, following which a review of each estimator's performance is conducted. Again, we find that the Maximum Likelihood estimators perform best for the given application, but that Serfling's estimators are preferred when outliers are present.
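Both headline estimators for the lognormal have closed forms: the MLE simply fits a normal to the log-data, while the method of moments matches the sample mean and variance to E[X] = exp(mu + sigma^2/2) and Var[X] = (exp(sigma^2) - 1) E[X]^2. A sketch on simulated data (not the thesis's word-count data):

```python
import math
import random
import statistics

def lognormal_mle(xs):
    """Closed-form MLE: fit a normal distribution to the log-data."""
    logs = [math.log(x) for x in xs]
    mu = statistics.fmean(logs)
    sigma = math.sqrt(statistics.fmean([(L - mu) ** 2 for L in logs]))
    return mu, sigma

def lognormal_mom(xs):
    """Method of moments: invert E[X] = exp(mu + sigma^2/2) and
    Var[X] = (exp(sigma^2) - 1) * E[X]^2 at the sample moments."""
    m = statistics.fmean(xs)
    v = statistics.variance(xs)
    sigma2 = math.log(1 + v / m ** 2)
    return math.log(m) - sigma2 / 2, math.sqrt(sigma2)

# Simulated data with known parameters mu = 1.0, sigma = 0.8.
random.seed(3)
data = [random.lognormvariate(1.0, 0.8) for _ in range(5000)]
mu_mle, sigma_mle = lognormal_mle(data)
mu_mom, sigma_mom = lognormal_mom(data)
```

On a sample this size both recover the truth, but the moment estimates are noticeably more variable as sigma grows, consistent with the simulation findings summarized above.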
195

Application of Convex Methods to Identification of Fuzzy Subpopulations

Eliason, Ryan Lee, 10 September 2010
In large observational studies, data are often highly multivariate with many discrete and continuous variables measured on each observational unit. One often derives subpopulations to facilitate analysis. Traditional approaches suggest modeling such subpopulations with a compilation of interaction effects. However, when many interaction effects define each subpopulation, it becomes easier to model membership in a subpopulation rather than numerous interactions. In many cases, subjects are not complete members of a subpopulation but rather partial members of multiple subpopulations. Grade of Membership scores preserve the integrity of this partial membership. By generalizing an analytical chemistry concept related to chromatography-mass spectrometry, we obtain a method that can identify latent subpopulations and corresponding Grade of Membership scores for each observational unit.
196

An Introduction to Bayesian Methodology via WinBUGS and PROC MCMC

Lindsey, Heidi Lula, 06 July 2011
Bayesian statistical methods have long been computationally out of reach because the analysis often requires integration of high-dimensional functions. Recent advancements in computational tools to apply Markov Chain Monte Carlo (MCMC) methods are making Bayesian data analysis accessible for all statisticians. Two such computer tools are WinBUGS and SAS 9.2's PROC MCMC. Bayesian methodology will be introduced through discussion of fourteen statistical examples with code and computer output to demonstrate the power of these computational tools in a wide variety of settings.
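WinBUGS and PROC MCMC construct their samplers automatically; to make the underlying MCMC idea concrete, here is a minimal random-walk Metropolis sampler (a sketch, not code from the thesis) for the mean of normal data with known variance under a flat prior, a case where the exact posterior N(ybar, sigma^2/n) is available for comparison:

```python
import math
import random
import statistics

def metropolis_normal_mean(y, sigma, n_iter=20000, step=0.5, seed=0):
    """Random-walk Metropolis for the mean mu of N(mu, sigma^2) data
    under a flat prior, so the target is proportional to the likelihood
    and the exact posterior is N(ybar, sigma^2 / len(y))."""
    rng = random.Random(seed)
    n, ybar = len(y), statistics.fmean(y)

    def log_target(mu):
        # Terms not involving mu cancel in the acceptance ratio.
        return -n * (mu - ybar) ** 2 / (2 * sigma ** 2)

    mu, draws = ybar, []
    for _ in range(n_iter):
        prop = mu + rng.gauss(0, step)
        delta = log_target(prop) - log_target(mu)
        if delta >= 0 or rng.random() < math.exp(delta):
            mu = prop  # accept the proposal
        draws.append(mu)
    return draws[n_iter // 4:]  # discard the first quarter as burn-in

random.seed(11)
y = [random.gauss(5.0, 2.0) for _ in range(50)]
draws = metropolis_normal_mean(y, sigma=2.0)
post_mean = statistics.fmean(draws)
```

The retained draws can be summarized exactly as one would summarize WinBUGS or PROC MCMC output: posterior means, quantiles, and trace plots for convergence checking.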
197

Hierarchical Bayesian Methods for Evaluation of Traffic Project Efficacy

Olsen, Andrew Nolan, 07 March 2011
A main objective of Departments of Transportation is to improve the safety of the roadways over which they have jurisdiction. Safety projects, such as cable barriers and raised medians, are utilized to reduce both crash frequency and crash severity. The efficacy of these projects must be evaluated in order to use resources in the best way possible. Five models are proposed for the evaluation of traffic projects: (1) a Bayesian Poisson regression model; (2) a hierarchical Poisson regression model building on model (1) by adding hyperpriors; (3) a similar model correcting for overdispersion; (4) a dynamic linear model; and (5) a traditional before-after study model. Evaluation of these models is discussed using various metrics including DIC. Using the models selected for analysis, it was determined that cable barriers are quite effective at reducing severe crashes and cross-median crashes on Utah highways. Raised medians are also largely effective at reducing severe crashes. The results of before and after analyses are highly valuable to Departments of Transportation in identifying effective projects and in determining which roadway segments will benefit most from their implementation.
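The abstract mentions DIC among the model-comparison metrics. A self-contained way to see how DIC is computed from posterior draws is a conjugate toy model: crash counts y_i ~ Poisson(theta) with a Gamma(a, b) prior, so the posterior is Gamma(a + sum(y), b + n) and can be sampled directly. The counts below are hypothetical, not the Utah data:

```python
import math
import random

def poisson_log_lik(theta, y):
    """Log-likelihood of iid Poisson(theta) counts."""
    return sum(-theta + yi * math.log(theta) - math.lgamma(yi + 1) for yi in y)

def dic_poisson_gamma(y, a=1.0, b=1.0, n_draws=5000, seed=0):
    """DIC for y_i ~ Poisson(theta) with a conjugate Gamma(a, b) prior.
    DIC = Dbar + pD, where D(theta) = -2 * log p(y | theta), Dbar is the
    posterior mean deviance, and pD = Dbar - D(posterior mean of theta)
    is the effective number of parameters."""
    rng = random.Random(seed)
    shape, rate = a + sum(y), b + len(y)
    draws = [rng.gammavariate(shape, 1 / rate) for _ in range(n_draws)]
    dev = [-2 * poisson_log_lik(t, y) for t in draws]
    dbar = sum(dev) / n_draws
    theta_bar = sum(draws) / n_draws
    pd = dbar - (-2 * poisson_log_lik(theta_bar, y))
    return dbar + pd, pd

# Hypothetical yearly crash counts for one roadway segment.
crashes = [3, 1, 4, 2, 0, 5, 2, 3]
dic, pd = dic_poisson_gamma(crashes)
```

With a single free parameter, pD should come out close to 1; in hierarchical models like those in the thesis, pD falls between the number of hyperparameters and the full parameter count, which is what makes DIC useful for comparing them.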
198

The Effect of Smoking on Tuberculosis Incidence in Burdened Countries

Ellison, Natalie Noel, 06 March 2012
It is estimated that one third of the world's population is infected with tuberculosis. Though once thought a "dead" disease, tuberculosis is very much alive. The rise of drug-resistant strains of tuberculosis and TB-HIV coinfection have made tuberculosis an even greater worldwide threat. While HIV, poverty, and public health infrastructure are historically assumed to affect the burden of tuberculosis, recent research has sought to add smoking to this list. This analysis involves combining data from multiple sources in order to determine whether smoking is a statistically significant factor in predicting the number of incident tuberculosis cases in a country. Quasi-Poisson generalized linear models and negative binomial regression will be used to analyze the effect of smoking, as well as the other factors, on tuberculosis incidence. This work will enhance tuberculosis control efforts by helping to identify new hypotheses that can be tested in future studies. One of the main hypotheses is whether or not smoking increases the number of tuberculosis cases above and beyond the effects of other factors that are known to influence tuberculosis incidence. These known factors include TB-HIV coinfection, poverty, and public health infrastructure as represented by treatment outcomes.
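Quasi-Poisson and negative binomial models are both responses to overdispersion: count data whose variance exceeds the Poisson benchmark Var = mean. The sketch below (hypothetical counts, not the TB data) simulates a negative binomial as a gamma-Poisson mixture, which yields E[y] = mu and the inflated variance Var[y] = mu + mu^2/theta:

```python
import math
import random

def poisson_sample(rng, lam):
    """Knuth's Poisson sampler (adequate for small rates, as here)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def neg_binomial_sample(rng, mu, theta):
    """Gamma-Poisson mixture: lam ~ Gamma(theta, mu/theta), then
    y ~ Poisson(lam), giving E[y] = mu and Var[y] = mu + mu**2/theta."""
    lam = rng.gammavariate(theta, mu / theta)
    return poisson_sample(rng, lam)

rng = random.Random(4)
mu, theta, n = 4.0, 2.0, 20000
ys = [neg_binomial_sample(rng, mu, theta) for _ in range(n)]
mean_y = sum(ys) / n
var_y = sum((y - mean_y) ** 2 for y in ys) / (n - 1)
# A Poisson model would imply var_y close to mean_y; here the
# theoretical variance is mu + mu**2 / theta = 12, three times the mean.
```

When a country-level count data set shows this pattern, Poisson standard errors are too small, which is precisely why the analysis turns to quasi-Poisson or negative binomial regression.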
199

Predicting Maximal Oxygen Consumption (VO2max) Levels in Adolescents

Shepherd, Brent A., 09 March 2012
Maximal oxygen consumption (VO2max) is considered by many to be the best overall measure of an individual's cardiovascular health. Collecting the measurement, however, requires subjecting an individual to prolonged periods of intense exercise until their maximal level, the point at which their body uses no additional oxygen from the air despite increased exercise intensity, is reached. Collecting accurate VO2max data also requires expensive equipment and entails considerable subject discomfort. Because of this inherent difficulty, it is often avoided despite its usefulness. In this research, we propose a set of Bayesian hierarchical models to predict VO2max levels in adolescents, ages 12 through 17, using less extreme measurements. Two models are developed separately, one that uses submaximal exercise data and one that uses physical fitness questionnaire data. The best submaximal model was found to include age, gender, BMI, heart rate, rate of perceived exertion, treadmill miles per hour, and an interaction between age and heart rate. The second model, designed for those with physical limitations, uses age, gender, BMI, and two separate questionnaire results measuring physical activity levels and functional ability levels, as well as an interaction between the physical activity level score and gender. Both models use separate model variances for males and females.
200

Bayesian Pollution Source Apportionment Incorporating Multiple Simultaneous Measurements

Christensen, Jonathan Casey, 12 March 2012
We describe a method to estimate pollution profiles and contribution levels for distinct prominent pollution sources in a region based on daily pollutant concentration measurements from multiple measurement stations over a period of time. In an extension of existing work, we will estimate common source profiles but distinct contribution levels based on measurements from each station. In addition, we will explore the possibility of extending existing work to allow adjustments for synoptic regimes, large-scale weather patterns which may affect the amount of pollution measured from individual sources as well as for particular pollutants. For both extensions we propose Bayesian methods to estimate pollution source profiles and contributions.
