51

Smoothing Spline Analysis of Variance Models On Accelerometer Data

Chen, Lulu 01 January 2023 (has links) (PDF)
In this thesis, the basics of smoothing spline analysis of variance are first introduced. Regular physical activity has been shown to reduce the risk of chronic diseases in older adults, such as heart disease, stroke, diabetes, and certain forms of cancer. Accurate measurement of physical activity levels in older adults is crucial for identifying those who may require interventions to increase their activity levels and prevent functional decline. In our study, we collected data on the physical activity of older individuals using accelerometer devices. To estimate the underlying patterns related to each covariate, we apply smoothing spline analysis of variance (SSANOVA) methods to two types of measurements from the accelerometer device. We investigate the underlying patterns of different participant groups and compare the patterns among groups. The analysis reveals clear patterns of activity levels throughout the day and across days, with differences observed among groups. Additionally, the study compares the mean-curve method with the SSANOVA model and shows that the SSANOVA model is the more suitable method for analyzing physical activity data. The study provides valuable insights into daily physical activity patterns in older people and highlights the usefulness of the SSANOVA model for such data analysis.
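As a rough illustration of the functional-ANOVA idea behind SSANOVA, the sketch below decomposes simulated accelerometer curves into a shared time-of-day main effect plus smooth group-specific deviations. It uses scipy smoothing splines as a crude stand-in for a real SSANOVA fit (e.g., the penalized tensor-product splines of R's gss package); the data, group names, and smoothing parameters are all invented.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
t = np.linspace(0, 24, 97)                       # time of day in hours
offsets = {"active": 25.0, "sedentary": -25.0}   # hypothetical group effects

# Simulated accelerometer counts: a shared diurnal curve plus a smooth
# group-specific deviation plus noise (all numbers invented).
data = {
    g: 100 + 60 * np.sin(np.pi * t / 24)
       + off * np.sin(np.pi * t / 24) ** 2
       + rng.normal(0, 10, t.size)
    for g, off in offsets.items()
}

# "Main effect of time": a smoothing spline through the across-group mean.
grand_mean = np.mean(list(data.values()), axis=0)
main = UnivariateSpline(t, grand_mean, s=t.size * 100)

# "Time-by-group interaction": a smoothing spline through each group's
# residuals from the shared curve.
interaction = {g: UnivariateSpline(t, y - main(t), s=t.size * 100)
               for g, y in data.items()}

for g, f in interaction.items():
    print(f"{g}: midday deviation from the common curve = {float(f(12.0)):+.1f}")
```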
52

Probability driven heuristic nets

Carter, Lynn Robert 25 July 1974 (has links)
Let a probability driven switch be defined as a switch with three input paths and three output paths. The status of the input paths defines a probability for each output path (as to whether it will generate a signal or not). One output path is linked back to one input path, so the result of the switch at time t can affect the switch at time t+1. A switch so constructed can be configured (by the probabilities) to take on the function of the standard logic gates (AND, OR, …). A net constructed of these switches can be “taught” by “reward” and “punish” algorithms to recognize input patterns. A simulation model showed that a repetitive learning algorithm coupled with a base of knowledge (where new patterns are learned while continually checking past learned patterns) gives the best results as a function of time. A good measure of the level of stability in response is how many probabilities have converged to one or zero: the larger the number, the more stable the net.
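A minimal simulation sketch of one such probability-driven switch with reward/punish training follows. The three input paths are two external inputs plus the fed-back output; the update rule, learning rate, and AND-gate training target are illustrative choices of mine, not the thesis's exact algorithms.

```python
import random

class ProbSwitch:
    def __init__(self, rate=0.1):
        # For each of the 8 possible 3-bit input states, a firing
        # probability for each of the 3 output paths (start undecided).
        self.p = {s: [0.5, 0.5, 0.5] for s in range(8)}
        self.rate = rate
        self.feedback = 0  # output path 0 loops back to input path 0

    def step(self, in1, in2):
        state = (self.feedback << 2) | (in1 << 1) | in2
        out = [1 if random.random() < q else 0 for q in self.p[state]]
        self.feedback = out[0]
        self.last = (state, out)
        return out

    def reward(self):
        # Nudge each probability toward the output just emitted.
        state, out = self.last
        self.p[state] = [q + self.rate * (o - q)
                         for q, o in zip(self.p[state], out)]

    def punish(self):
        # Nudge each probability away from the output just emitted.
        state, out = self.last
        self.p[state] = [min(max(q - self.rate * (o - q), 0.0), 1.0)
                         for q, o in zip(self.p[state], out)]

# Teach the switch to act like an AND gate; the whole switch is rewarded
# or punished based on output path 1 alone, which keeps the sketch simple.
sw = ProbSwitch()
for _ in range(2000):
    a, b = random.randint(0, 1), random.randint(0, 1)
    out = sw.step(a, b)
    sw.reward() if out[1] == (a & b) else sw.punish()

# The stability measure from the abstract: probabilities near 0 or 1.
converged = sum(q < 0.05 or q > 0.95 for probs in sw.p.values() for q in probs)
print(f"{converged}/24 probabilities have converged toward 0 or 1")
```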
53

Statistical Analysis for Pollutants in Precipitation Observed in a Selected Region

Malik, Rajat January 2010 (has links)
The study of pollution and its effects on the environment has long been an interest for many researchers. Short- and long-term effects, as well as future predictions based on past observations, are important factors to consider when one undertakes such a study. The purpose of this thesis is to observe the long-term changes and trends of pollutants in precipitation in a selected region of the north-eastern United States of America and Canada. The data were collected on a weekly basis between 1995 and 2006 for the pollutants Ammonium, Nitrate, excess Sulphate, and Hydron (NH4, NO3, XSO4, and H+, respectively). In total, 19 different stations were investigated. Two types of statistical models were fit to the data: a gamma regression and a random-effects model. The gamma regression assumed independence of any spatial and temporal factors and was used to conceptualize the overall trend and yearly fit. The preliminary analysis found strong evidence of spatial dependence, but temporal dependence was so weak that it could be ignored. The random-effects model was adopted to handle dependencies caused by any underlying mechanisms. For pollutant NH4, the random-effects model produced no significant factors and the Year effect was non-significant. For pollutant NO3, the coefficient of Year was significant and decreasing. Pollutant XSO4 was revealed to have a significant and decreasing Year effect. The random-effects model did not produce any significant factors for pollutant H+. Overall, the random-effects model is a more reasonable approach to fitting these data because it accounts for spatial dependence, whereas the gamma regression assumes independent responses. / Master of Science (MS)
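For the gamma-regression piece of such an analysis, a hedged sketch using statsmodels follows. The simulated weekly concentrations, the trend size, and the model terms are hypothetical; the thesis's actual covariates may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "year": rng.integers(1995, 2007, n),           # weekly samples, 1995-2006
    "station": rng.integers(1, 20, n).astype(str)  # 19 hypothetical stations
})
# Simulated concentrations with a mild downward trend over the years.
mu = np.exp(2.0 - 0.03 * (df["year"] - 1995))
df["conc"] = rng.gamma(shape=5.0, scale=mu / 5.0)

# Gamma GLM with a log link; like the thesis's gamma regression, this
# assumes independent responses. A random-effects model would add a
# station-level term to capture the spatial dependence.
fit = smf.glm("conc ~ year", data=df,
              family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print(fit.summary().tables[1])
```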
54

Počátky teorie pravděpodobnosti / The beginnings of probability theory

Marcinčín, Martin January 2015 (has links)
The purpose of this thesis is to give a summary of the historical development of probability theory and to explain its fundamentals. It traces early systematic thought on the subject and the emergence of the classical (Laplace), geometric, and statistical definitions of probability, along with the development of the theory: independence, conditional probability, and Bayes' theorem. The thesis describes the first mentions of random variables and the central limit theorem. The alternative (Bernoulli), discrete uniform, binomial, Poisson, continuous uniform, normal, and exponential distributions are discussed together with the historical background of their discoveries. The theory is supplemented with illustrative and contemporary examples. The thesis covers developments in various fields of probability up to the publication of Kolmogorov's axiomatic probability theory in 1933.
55

Methods for Rigorous Uncertainty Quantification with Application to a Mars Atmosphere Model

Balch, Michael Scott 08 January 2011 (has links)
The purpose of this dissertation is to develop and demonstrate methods appropriate for the quantification and propagation of uncertainty in large, high-consequence engineering projects. The term "rigorous uncertainty quantification" refers to methods equal to the proposed task. The motivating practical example is uncertainty in a Mars atmosphere model due to the incompletely characterized presence of dust. The contributions made in this dissertation, though primarily mathematical and philosophical, are driven by the immediate needs of engineers applying uncertainty quantification in the field. Arguments are provided to explain how the practical needs of engineering projects like Mars lander missions motivate the use of the objective probability bounds approach, as opposed to the subjectivist theories which dominate uncertainty quantification in many research communities. An expanded formalism for Dempster-Shafer structures is introduced, allowing for the representation of continuous random variables and fuzzy variables as Dempster-Shafer structures. Then, the correctness and incorrectness of probability bounds analysis and the Cartesian product propagation method for Dempster-Shafer structures under certain dependency conditions are proven. It is also conclusively demonstrated that there exist some probability bounds problems in which the best-possible bounds on probability cannot be represented using Dempster-Shafer structures. Nevertheless, Dempster-Shafer theory is shown to provide a useful mathematical framework for a wide range of probability bounds problems. The dissertation concludes with the application of these new methods to the problem of propagating uncertainty from the dust parameters in a Mars atmosphere model to uncertainty in that model's prediction of atmospheric density. A thirty-day simulation of the weather at Holden Crater on Mars is conducted using a meso-scale atmosphere model, MRAMS. Although this analysis addresses only one component of Mars atmosphere uncertainty, it demonstrates the applicability of probability bounds methods in practical engineering work. More importantly, the Mars atmosphere uncertainty analysis provides a framework in which to conclusively establish the practical importance of epistemology in rigorous uncertainty quantification. / Ph. D.
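The core objects here, Dempster-Shafer structures, can be sketched compactly: a set of interval focal elements with masses summing to one, whose belief and plausibility for an event give lower and upper probability bounds. The sketch below is a minimal illustration with invented dust-parameter numbers, not the expanded formalism developed in the dissertation.

```python
from dataclasses import dataclass

@dataclass
class Focal:
    lo: float
    hi: float
    mass: float   # basic probability assignment for the interval [lo, hi]

# Hypothetical uncertainty about a Mars dust parameter, encoded as
# interval focal elements whose masses sum to one.
structure = [Focal(0.1, 0.4, 0.3), Focal(0.2, 0.7, 0.5), Focal(0.5, 0.9, 0.2)]

def belief(structure, a, b):
    # Mass of focal elements entirely inside [a, b]: the lower bound on P.
    return sum(f.mass for f in structure if a <= f.lo and f.hi <= b)

def plausibility(structure, a, b):
    # Mass of focal elements that intersect [a, b]: the upper bound on P.
    return sum(f.mass for f in structure if f.hi >= a and f.lo <= b)

a, b = 0.2, 0.7
print(f"P(parameter in [{a}, {b}]) is bounded by "
      f"[{belief(structure, a, b):.2f}, {plausibility(structure, a, b):.2f}]")
```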
56

A Comparison of Learning Subjective and Traditional Probability in Middle Grades

Rast, Jeanne D 20 December 2005 (has links)
The emphasis given to probability and statistics in the K-12 mathematics curriculum has brought attention to the various approaches to probability and statistics concepts, as well as how to teach these concepts. Teachers of the fourth, fifth, and sixth grades at a small suburban Catholic school engaged their students (n=87) in a study comparing the learning of traditional probability concepts with the learning of traditional and subjective probability concepts. The control group (n=44) received instruction in traditional probability, while the experimental group (n=43) received instruction in traditional and subjective probability. A Multivariate Analysis of Variance and a Bayesian t-test were used to analyze pretest and posttest scores from the Making Decisions about Chance Questionnaire (MDCQ). Researcher observational notes, teacher journal entries, student activity worksheet explanations, pre- and post-test answers, and student interviews were coded for themes. All groups showed significant improvement on the post-MDCQ (p < .01). There was a disordinal interaction between the combined fifth- and sixth-grade experimental group (n=28) and the control group (n=28); however, the mean difference in performance on the pre-MDCQ and post-MDCQ was not significant (p=.096). A Bayesian t-test indicated that there is reasonable evidence to believe that the mean of the experimental group exceeded the mean of the control group. Qualitative data showed that while students have beliefs about probabilistic situations based on their past experiences and prior knowledge, and often use this information to make probability judgments, they find traditional probability problems easier than subjective ones. Further research with different grade levels, larger sample sizes, or different activities would develop learning theory in this area and may provide insight into probability judgments previously labeled as misconceptions by researchers.
57

SELECTING THE MOST PROBABLE CATEGORY: THE R PACKAGE RS

Jin, Hong 10 1900 (has links)
Selecting the most probable multinomial or multivariate hypergeometric category is a multiple-decision selection problem. In this package, fixed sampling and inverse sampling are used for selecting the most probable category. The package aims at providing functionality to calculate, display, and plot the probabilities of correctly selecting the most probable category under the least favorable configuration for these two sampling types. A function for finding the specified smallest acceptable sample size (or cell quota and expected sample size) is included as well. / Master of Science (MSc)
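What such a package computes analytically can be approximated by Monte Carlo. The sketch below estimates the probability of correct selection under fixed sampling at a least favorable configuration, taken here (in the indifference-zone spirit) as one cell theta times as likely as the rest, which share equally; the RS package's exact definitions and methods may differ.

```python
import numpy as np

def prob_correct_selection(k=4, theta=1.4, n=100, reps=100_000, seed=2):
    # LFC probabilities: the best cell is theta times as likely as each
    # of the other k-1 cells.
    p = np.full(k, 1.0)
    p[0] = theta
    p /= p.sum()
    rng = np.random.default_rng(seed)
    counts = rng.multinomial(n, p, size=reps)
    # Correct selection: cell 0 is the strictly modal cell in the sample
    # (requiring a strict maximum is conservative about ties).
    best = counts[:, 0]
    others = counts[:, 1:].max(axis=1)
    return np.mean(best > others)

for n in (50, 100, 200):
    print(f"n={n}: P(correct selection) ~ {prob_correct_selection(n=n):.3f}")
```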
58

Some new statistical methods for a class of zero-truncated discrete distributions with applications

Ding, Xiqian, 丁茜茜 January 2015 (has links)
Count data without a zero category often occur in various fields. Examples include days of hospital stay for patients, numbers of publications for tenure-track faculty in a university, numbers of traffic violations for drivers during a certain period, and so on. A class of zero-truncated discrete models, such as the zero-truncated Poisson, zero-truncated binomial, and zero-truncated negative-binomial distributions, has been proposed in the literature to model such count data. In this thesis, Chapter 1 first reviews a class of commonly used univariate zero-truncated discrete distributions. In Chapter 2, a unified method is proposed to derive the distribution of the sum of i.i.d. zero-truncated random variables, which has important applications in the construction of the shortest Clopper-Pearson confidence intervals for parameters of interest and in the calculation of the exact p-value of a two-sided test for small sample sizes in the one-sample problem. These problems are discussed in Section 2.4. A novel expectation-maximization (EM) algorithm is then developed for calculating the maximum likelihood estimates (MLEs) of parameters in general zero-truncated discrete distributions. An important feature of the proposed EM algorithm is that the latent variables and the observed variables are independent, which is unusual in general EM-type algorithms. In addition, a unified minorization-maximization (MM) algorithm for obtaining the MLEs of parameters in a class of zero-truncated discrete distributions is provided. The first objective of Chapter 3 is to propose the multivariate zero-truncated Charlier series (ZTCS) distribution by developing its important distributional properties and providing efficient MLE methods via a novel data augmentation in the framework of the EM algorithm. Since the joint marginal distribution of any r-dimensional sub-vector of the m-dimensional multivariate ZTCS random vector is an r-dimensional zero-deflated Charlier series (ZDCS) distribution (1 ≤ r < m), the second objective of Chapter 3 is to propose a new family of multivariate zero-adjusted Charlier series (ZACS) distributions (including the multivariate ZDCS distribution as a special member) with a more flexible correlation structure that accounts for both inflation and deflation at zero. The corresponding distributional properties are explored and the associated MLE method via the EM algorithm is provided for analyzing correlated count data. / Master of Philosophy
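As a concrete instance of the EM idea for zero-truncated models, the sketch below computes the zero-truncated Poisson MLE by treating the unobserved zero counts as missing data; for this model the EM update collapses to a one-line fixed-point iteration. The thesis's unified EM/MM algorithms cover a much broader class of distributions.

```python
import numpy as np

def zt_poisson_mle(y, tol=1e-10, max_iter=500):
    ybar = np.mean(y)    # sample mean of the observed nonzero counts
    lam = ybar           # starting value
    for _ in range(max_iter):
        # E-step: the expected number of hidden zeros per observed count
        # is exp(-lam) / (1 - exp(-lam)); the M-step complete-data MLE
        # then simplifies to the update below.
        new = ybar * (1.0 - np.exp(-lam))
        if abs(new - lam) < tol:
            break
        lam = new
    return lam

rng = np.random.default_rng(3)
sample = rng.poisson(2.5, 5000)
sample = sample[sample > 0]   # zero-truncate, e.g. days of hospital stay
lam_hat = zt_poisson_mle(sample)
print(f"MLE lambda ~ {lam_hat:.3f} (truncated sample mean {sample.mean():.3f})")
```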
59

Epistemic probability

Watt, D. E. January 1987 (has links)
No description available.
60

Representing uncertainty in models

Dixon, Elsbeth Clare January 1989 (has links)
No description available.
