1

A Kolmogorov-Smirnov Test for r Samples

Böhm, Walter; Hornik, Kurt (PDF)
We consider the problem of testing whether r (>= 2) samples are drawn from the same continuous distribution F(x). The test statistic we study in some detail is defined as the maximum of the circular differences of the empirical distribution functions, a generalization of the classical 2-sample Kolmogorov-Smirnov test to r (>= 2) independent samples. For the case of equal sample sizes we derive the exact null distribution by counting lattice paths confined to stay in the scaled alcove $\mathcal{A}_r$ of the affine Weyl group $A_{r-1}$. This is done using a generalization of the classical reflection principle. By a standard diffusion scaling we also derive the asymptotic distribution of the test statistic in terms of a multivariate Dirichlet series. When the sample sizes are not equal the reflection principle no longer works, but we are able to establish a weak convergence result even in this case, showing that, after a proper rescaling, a test statistic based on a linear transformation of the circular differences of the empirical distribution functions has the same asymptotic distribution as the test statistic in the case of equal sample sizes.
Series: Research Report Series, Department of Statistics and Mathematics
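
A minimal Python sketch of how such a statistic might be computed, assuming the "circular differences" are the pairwise differences F_i(x) - F_{i+1}(x) of the empirical distribution functions with indices taken cyclically (F_{r+1} = F_1), evaluated over the pooled sample points; the exact normalisation used in the paper may differ. For r = 2 this reduces to the classical two-sample Kolmogorov-Smirnov statistic.

```python
import numpy as np

def circular_ks_statistic(samples):
    """Maximum circular difference of the empirical distribution functions.

    `samples` is a list of r one-dimensional arrays. The statistic sketched
    here is max over i and x of (F_i(x) - F_{i+1}(x)), with F_{r+1} = F_1,
    where x ranges over the pooled observations.
    """
    pooled = np.sort(np.concatenate(samples))
    # Empirical CDF of each sample evaluated at every pooled observation.
    ecdfs = np.array([
        np.searchsorted(np.sort(s), pooled, side="right") / len(s)
        for s in samples
    ])
    # Circular differences: row i holds F_i - F_{i+1 mod r} at each x.
    diffs = ecdfs - np.roll(ecdfs, -1, axis=0)
    return diffs.max()

# Example: three samples of equal size drawn from the same distribution.
rng = np.random.default_rng(0)
samples = [rng.normal(size=50) for _ in range(3)]
print(circular_ks_statistic(samples))
```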
2

Using Neural Networks to Classify Discrete Circular Probability Distributions

Gaumer, Madelyn, 01 January 2019
Given the rise in the application of neural networks to all sorts of interesting problems, it seems natural to apply them to statistical tests. This senior thesis studies whether neural networks built to classify discrete circular probability distributions can outperform a class of well-known statistical tests for uniformity of discrete circular data that includes the Rayleigh Test [1], the Watson Test [2], and the Ajne Test [3]. Each neural network used is relatively small, with no more than 3 layers: an input layer taking in discrete data sets on a circle, a hidden layer, and an output layer outputting probability values between 0 and 1, with 0 mapping to uniform and 1 mapping to nonuniform. In evaluating performance, I compare the accuracy, type I error, and type II error of this class of statistical tests and of the neural networks built to compete with them.

[1] Jammalamadaka, S. Rao; SenGupta, A. Topics in Circular Statistics. Series on Multivariate Analysis, 5. World Scientific Publishing Co., River Edge, NJ, 2001. ISBN: 981-02-3778-2.
[2] Watson, G. S. Goodness-of-fit tests on a circle. II. Biometrika 49 (1962), 57–63.
[3] Ajne, B. A simple test for uniformity of a circular distribution. Biometrika 55 (1968), 343–354.
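
A minimal Python sketch of a classifier with the input/hidden/output structure described above. The thesis's exact architecture, bin count, training data, and alternative (nonuniform) distributions are not given here, so the settings and the von Mises data generator below are illustrative assumptions only.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
N_BINS, SAMPLE_SIZE, N_TRAIN = 36, 100, 2000  # hypothetical settings

def binned_circular_sample(uniform: bool) -> np.ndarray:
    """Normalized histogram of angles on the circle: uniform or von Mises."""
    if uniform:
        angles = rng.uniform(0.0, 2 * np.pi, SAMPLE_SIZE)
    else:
        angles = rng.vonmises(mu=0.0, kappa=1.0, size=SAMPLE_SIZE) % (2 * np.pi)
    counts, _ = np.histogram(angles, bins=N_BINS, range=(0.0, 2 * np.pi))
    return counts / SAMPLE_SIZE

labels = rng.integers(0, 2, N_TRAIN)  # 0 = uniform, 1 = nonuniform
features = np.array([binned_circular_sample(uniform=(y == 0)) for y in labels])

# One hidden layer; the predicted probability of class 1 plays the role of the
# 0-to-1 output score described in the abstract.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(features, labels)

test = binned_circular_sample(uniform=False)
print("P(nonuniform) =", clf.predict_proba(test.reshape(1, -1))[0, 1])
```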
