About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

A minimal-maximal correlation-type goodness-of-fit test

White, Josie 26 June 2014
In goodness-of-fit testing, the goal is to determine whether data come from a particular distribution. One graphical approach is the probability plot; the two most common are the probability-probability plot and the quantile-quantile plot, both of which require plotting points. Balakrishnan et al. (2010) proposed a new plotting point based on simultaneous closeness probabilities, followed by a correlation-type goodness-of-fit test based on these points. In this thesis, two tests derived from that correlation-coefficient test are developed: a maximal-correlation coefficient test and a minimal-correlation coefficient test, both based on simultaneous closeness probabilities. Two approaches are considered to investigate these tests: a grid-search method and an averaging method. Numerical results, including illustrative examples, critical values, and a power study, are also provided.
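The correlation-type statistic underlying such tests can be sketched in a few lines. The sketch below uses the simple midpoint plotting points (i - 0.5)/n rather than the simultaneous-closeness-probability points of Balakrishnan et al. (2010), which require simulation; it illustrates the general idea, not the thesis's test.

```python
import math
import random

def qq_correlation(sample, inv_cdf, plot_point=lambda i, n: (i - 0.5) / n):
    """Correlation between the ordered sample and theoretical quantiles.

    Values near 1 support the hypothesised distribution.  The plotting
    points here are the simple midpoints (i - 0.5)/n, a stand-in for the
    simultaneous-closeness-probability points, which require simulation.
    """
    xs = sorted(sample)
    n = len(xs)
    qs = [inv_cdf(plot_point(i, n)) for i in range(1, n + 1)]
    mx, mq = sum(xs) / n, sum(qs) / n
    sxq = sum((x - mx) * (q - mq) for x, q in zip(xs, qs))
    sxx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sqq = math.sqrt(sum((q - mq) ** 2 for q in qs))
    return sxq / (sxx * sqq)

random.seed(1)
data = [random.expovariate(1.0) for _ in range(200)]
# Test the (true) exponential hypothesis: inverse CDF of Exp(1).
r = qq_correlation(data, lambda p: -math.log(1.0 - p))
```

A correctly specified distribution yields a correlation close to 1; the test rejects when the statistic falls below a simulated critical value.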
2

Estimating water loss from the Renoster River, Free State province, South Africa

Brown, Fred Herman 17 November 2006
Faculty of Engineering School of Civil and Environmental Engineering 9308302w fredb@ananzi.co.za / The Renoster River is a minor left-bank tributary of the Vaal River located some 120 km south-west of Johannesburg in the Free State Province of South Africa. The river is dammed by the Koppies Dam, which divides it into upper and lower basins. No water is released directly from the Koppies Dam into the Renoster River channel except as a protective measure during natural flooding events; irrigation releases are made through a separate canal system. The lower Renoster River was modeled as a series of channels crossing four quaternary catchments. Hydraulic routing was used to estimate potential flow losses for discrete flow events; out-of-bank flow was found to occur when flow exceeded 300 m³/s. A Pitman monthly rainfall-runoff hydrologic model was used to model longer-term historical streamflow data and the associated losses. Flow losses for individual flood events along the lower Renoster River were found to vary between 3% and 17%; for a longer-term constant flow regime, losses can be expected to range between 10% and 40%. Based on both the hydraulic and hydrologic models, the primary cause of flow loss was found to be evaporation, with insignificant transpiration and transmission losses. For eighty years, flow in the lower Renoster River has been artificially controlled, and within that period the river has adjusted to the imposed anthropogenic flow regime. Water losses associated with planned releases into the lower Renoster River can be minimized by using a high discharge rate, up to a maximum of 300 m³/s, combined with larger volumetric flows. Evaporation losses can be significant, and release strategies will need to take this into consideration.
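Estimating the percentage flow loss for a discrete event amounts to comparing upstream and downstream hydrograph volumes. A minimal sketch with invented hourly gauge readings (the values below are for illustration only, not data from the thesis):

```python
def event_volume(flows_m3s, dt_s=3600.0):
    """Trapezoidal volume (m^3) of a hydrograph sampled every dt_s seconds."""
    return sum((a + b) / 2.0 * dt_s for a, b in zip(flows_m3s, flows_m3s[1:]))

def percent_loss(inflow, outflow, dt_s=3600.0):
    """Percentage of the inflow event volume lost between two gauges."""
    v_in = event_volume(inflow, dt_s)
    v_out = event_volume(outflow, dt_s)
    return 100.0 * (v_in - v_out) / v_in

# Invented hourly hydrographs (m^3/s) at an upstream and downstream gauge.
upstream   = [0, 50, 150, 250, 180, 90, 30, 0]
downstream = [0, 45, 140, 235, 170, 85, 28, 0]
loss = percent_loss(upstream, downstream)
```

For this invented event the loss comes out near 6%, within the 3-17% range the thesis reports for individual flood events.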
3

Building people and teams for effective ministry at the Pitman Creek Church of Christ

Jordan, Richard Wayne, January 2001
Thesis (D. Min.)--Abilene Christian University, 2001. / Includes abstract and vita. Includes bibliographical references (leaves 101-108).
4

Nonparametric Bayesian Clustering under Structural Restrictions

Hanxi Sun (11009154) 23 July 2021
<div>Model-based clustering, with its flexibility and solid statistical foundations, is an important tool for unsupervised learning and has numerous applications in a variety of fields. This dissertation focuses on nonparametric Bayesian approaches to model-based clustering under structural restrictions: additional constraints on the model that embody prior knowledge, either regularizing the model structure to encourage interpretability and parsimony, or encouraging statistical sharing through an underlying tree or network structure.</div><div><br></div><div>The first part of the dissertation focuses on the most commonly used model-based clustering models, mixture models. Current approaches typically model the parameters of the mixture components as independent variables, which can lead to overfitting that produces poorly separated clusters and can be sensitive to model misspecification. To address this problem, we propose a novel Bayesian mixture model with the structural restriction that the clusters repel each other. The repulsion is induced by the generalized Matérn type-III repulsive point process. We derive an efficient Markov chain Monte Carlo (MCMC) algorithm for posterior inference and demonstrate its utility on a number of synthetic and real-world problems.<br></div><div><br></div><div>The second part of the dissertation focuses on clustering populations with a hierarchical dependency structure that can be described by a tree. A classic example, and the focus of our work, is the phylogenetic tree, whose nodes often represent biological species. Clustering the populations in this setting is equivalent to identifying branches of the tree where the distributions at the parent and child nodes differ significantly. We construct a nonparametric Bayesian model based on hierarchical Pitman-Yor and Poisson processes to exploit this structure, and develop an efficient particle MCMC algorithm for inference. We illustrate the efficacy of our approach on both synthetic and real-world problems.</div>
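The Pitman-Yor process underlying the hierarchical model above admits a simple Chinese-restaurant-style sampler. A minimal sketch in plain Python (an illustration of the prior, not the dissertation's particle MCMC):

```python
import random

def pitman_yor_partition(n, d, theta, rng=None):
    """Sample a random partition of n items from a Pitman-Yor process
    via its Chinese-restaurant construction: the (i+1)-th item joins
    existing cluster k with probability (n_k - d)/(i + theta), or starts
    a new cluster with probability (theta + d*K)/(i + theta), where K is
    the current number of clusters.  Setting d = 0 recovers the
    Dirichlet process."""
    rng = rng or random.Random(0)
    counts, labels = [], []
    for i in range(n):
        K = len(counts)
        weights = [c - d for c in counts] + [theta + d * K]
        u = rng.random() * (i + theta)   # the weights sum to i + theta
        acc, k = 0.0, 0
        for k, w in enumerate(weights):
            acc += w
            if u < acc:
                break
        if k == K:                       # open a new cluster
            counts.append(1)
        else:
            counts[k] += 1
        labels.append(k)
    return labels, counts

labels, counts = pitman_yor_partition(500, d=0.5, theta=1.0)
```

With discount d > 0 the cluster sizes exhibit the power-law behaviour that makes the Pitman-Yor process attractive for hierarchical sharing.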
5

Estimation of the parameters of the two-parameter exponential distribution under restriction

Ραφτοπούλου, Χριστίνα 10 June 2014
This master's thesis falls within statistical decision theory, specifically the estimation of the location parameter μ and the scale parameter σ of the two-parameter exponential distribution when it is known a priori that μ ≤ c, for a known constant c. We establish that, under the mean squared error (MSE) criterion, the best affine equivariant estimators of μ and σ that ignore the restriction are inadmissible when μ ≤ c, and we propose improved estimators. We also compare these estimators under the Pitman nearness criterion, and propose estimators that dominate the standard unrestricted ones under a suitable choice of LINEX loss. Finally, the theory developed is applied to estimating the location and scale parameters of two independent exponential samples when the location parameters are ordered.
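How the restriction μ ≤ c enters can be illustrated by truncating the unrestricted maximum-likelihood-type estimators at the bound. This is a minimal sketch with an invented sample; the truncated estimator shown is simpler than the improved estimators the thesis develops:

```python
def restricted_exponential_estimates(sample, c):
    """Estimates for a two-parameter exponential Exp(mu, sigma) under the
    prior restriction mu <= c.  The usual location estimate min(X) is
    truncated at c -- a simple way to use the bound, not the dominating
    estimators constructed in the thesis."""
    x1 = min(sample)
    mu_hat = min(x1, c)
    sigma_hat = sum(sample) / len(sample) - mu_hat
    return mu_hat, sigma_hat

# Invented sample; the restriction mu <= 1.0 binds because min(X) = 1.5.
mu_hat, sigma_hat = restricted_exponential_estimates([2.3, 1.7, 3.1, 1.5, 2.9], c=1.0)
```

When the bound binds, the location estimate drops to c and the scale estimate grows correspondingly, which is exactly the interaction the MSE and Pitman-nearness comparisons in the thesis quantify.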
6

A Littelmann model for geometric crystals, Whittaker functions on Lie groups, and Brownian motion

Chhaibi, Reda 24 January 2013
Broadly, this thesis is concerned with the links between representation theory and probability, and is divided into three main parts. In a first, mostly algebraic part, we construct a path model for the geometric crystals of Berenstein and Kazhdan, for a complex semisimple Lie group. Essentially, this amounts to describing the algebraic structure, its natural morphisms, and its parametrizations; the theory of total positivity plays a particularly important role. Next, anticipating the probabilistic results, we exhibit a canonical measure on geometric crystals. Its ingredients are the superpotential of the flag variety and a measure invariant under the crystal actions. The image measure under the weight map plays the role of a Duistermaat-Heckman measure; its Laplace transform defines the Whittaker functions, yielding a particularly interesting integral formula valid for all Lie groups. It then becomes clear that Whittaker functions are to geometric crystals what characters are to classical combinatorial crystals. The Littlewood-Richardson rule is also worked out. Finally, we present the probabilistic approach that identifies the canonical measure, resting on the fundamental idea that Wiener measure induces the right measure on the algebraic structures of the path model. In a last part, we show how our geometric model degenerates to the classical continuous Littelmann model and recover known results; for example, the canonical measure on a highest-weight geometric crystal degenerates to a uniform measure on a polytope, and we recover the parametrizations of continuous crystals.
7

Suspended Sediment Transport Dynamics and Sediment Yields in Relation to Watershed Characteristics, Upper Green River Basin, Kentucky

Otoo, James Nii Aboh 01 April 2010
Sediment delivery is a major problem in the Green River, Kentucky, home to 71 of the state's 103 known mussel species and 151 fish species; the river also supplies water to many of its surrounding counties. This research characterized suspended sediment loads, grain size, and sediment concentration during runoff events and related them to watershed characteristics such as hydro-climatic regime, watershed size, geology and soils, topography, land use, and land cover. The study focused on the Brush Creek and Pitman Creek watersheds in the Upper Green River Basin. The results can support the planning and development of effective environmental strategies by screening out mitigation measures that would not effectively reduce sediment load and suspended sediment concentration in the Green River, thereby improving its water quality. Water quality was monitored using data sondes positioned at selected sites in the two watersheds. Water samples were collected when turbidity exceeded a threshold of 100 NTU and were analyzed for suspended sediment concentration. Regression models between discharge and stage, and between average turbidity and suspended sediment concentration, were formulated, and load estimates were made and compared. Four sets of samples were collected: two at Brush Creek, on 11 April (Brush Creek event 1) and 3 May (Brush Creek event 2), and two at Pitman Creek, on 12 February (Pitman Creek event 1) and 3 March (Pitman Creek event 2), all in 2008. The suspended sediment samples for all four events were well graded but contained relatively more silt than clay or sand.
This could be because more time and energy are needed to break the bonds in clay minerals or particles, and more energy is needed to transport sand than silt. Brush Creek watershed's particles had smaller grain sizes than Pitman Creek watershed's. All four events showed clockwise hysteresis, indicating that most of the sediment from both watersheds during the events was derived from the bed and banks of the channel or areas adjacent to the channel. The 11 April event (Brush Creek event 1) produced an estimated load of 1.1 x 10^5 kg and a sediment yield of 5.3 x 10^2 kg/km². The 3 May event (Brush Creek event 2) produced an estimated load of 3.8 x 10^4 kg and a sediment yield of 1.8 x 10^2 kg/km². Brush Creek watershed's estimated load for the period compared was 4.9 x 10^5 kg, with a sediment yield of 2.3 x 10^3 kg/km² (53 kg/km²/day). The 12 February event (Pitman Creek event 1) produced an estimated load of 2.9 x 10^5 kg and a sediment yield of 8.4 x 10^2 kg/km². The 3 March event (Pitman Creek event 2) produced an estimated load of 5.7 x 10^5 kg and a sediment yield of 1.6 x 10^3 kg/km². Pitman Creek watershed's estimated load for the period compared was 1.1 x 10^6 kg, with a sediment yield of 3.1 x 10^3 kg/km² (71 kg/km²/day). Pitman Creek watershed's denser stream network, higher elevation and relief, higher percentage of erodible soil per unit area, and lesser vegetative protection of erodible soil compared to Brush Creek watershed are responsible for its higher sediment load and yield.
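Turbidity-SSC rating curves of the kind used above are typically fitted as power laws in log-log space. A minimal sketch with synthetic data generated exactly on a power curve (the coefficients 0.5 and 1.1 and the sample values are invented for the demonstration, not taken from the thesis):

```python
import math

def fit_power_law(x, y):
    """Least-squares fit of y = a * x**b via linear regression in log-log
    space -- the usual form of a turbidity-SSC or stage-discharge rating."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    a = math.exp(my - b * mx)
    return a, b

turbidity = [100.0, 150.0, 220.0, 400.0, 650.0]  # NTU, invented samples
ssc = [0.5 * t ** 1.1 for t in turbidity]        # mg/L, generated on the curve
a, b = fit_power_law(turbidity, ssc)
ssc_at_300 = a * 300.0 ** b                      # rating prediction at 300 NTU
```

Once fitted, the rating converts a continuous turbidity record into a suspended sediment concentration series, which, multiplied by discharge and integrated over the event, gives the load estimates reported above.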
8

Bayesian Nonparametric Modeling and Inference for Multiple Object Tracking

January 2019
abstract: The problem of multiple object tracking seeks to jointly estimate the time-varying cardinality and the trajectory of each object. Tracking multiple objects presents numerous challenges, including a time-varying number of measurements, varying constraints, and changing environmental conditions. In this thesis, the proposed statistical methods integrate physics-based models with Bayesian nonparametric methods to address the main challenges of the tracking problem. In particular, Bayesian nonparametric methods are exploited to efficiently and robustly infer object identity and learn time-dependent cardinality; together with Bayesian inference methods, they are also used to associate measurements with objects and estimate object trajectories. These methods differ fundamentally from existing approaches, which are mainly based on random finite set theory. The first contribution proposes dependent nonparametric models, such as the dependent Dirichlet process and the dependent Pitman-Yor process, to capture the inherent time-dependency of the problem. These processes serve as priors for object state distributions, learning dependent information between previous and current time steps. Markov chain Monte Carlo (MCMC) sampling methods exploit the learned information to sample from posterior distributions and update the estimated object parameters. The second contribution proposes a novel, robust, and fast nonparametric approach based on a diffusion process over infinite random trees to infer object cardinality and trajectory. This method follows the hierarchy induced by objects entering and leaving a scene and the time-dependency between unknown object parameters. MCMC sampling methods integrate the prior distributions over the infinite random trees with time-dependent diffusion processes to update object states.
The third contribution develops hierarchical models as a prior for statistically dependent measurements in a single-object tracking setup. Dependency among the sensor measurements provides extra information, which is incorporated to achieve optimal tracking performance. The hierarchical Dirichlet process prior provides the required flexibility for inference, and a Bayesian tracker is integrated with it to accurately estimate the object trajectory. The fourth contribution proposes an approach that models both multiple dependent objects and multiple dependent measurements, integrating dependent Dirichlet process modeling of the objects with hierarchical Dirichlet process modeling of the measurements to fully capture the dependency among both. These Bayesian nonparametric models can associate each measurement with the corresponding object and exploit the dependency among them to infer object trajectories more accurately. MCMC methods amalgamate the dependent Dirichlet process with the hierarchical Dirichlet process to infer object identity and cardinality. Simulations demonstrate the improvement in multiple object tracking performance compared to approaches based on random finite set theory. / Dissertation/Thesis / Doctoral Dissertation Electrical Engineering 2019
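The measurement-to-object association step described above can be sketched with a Chinese-restaurant-style rule: each measurement joins an existing object in proportion to a Gaussian likelihood around that object's state, or opens a new object with prior weight alpha. This is a toy scalar illustration, not the dissertation's sampler; the new-object weight in particular is a crude placeholder:

```python
import math
import random

def assign_measurements(measurements, means, alpha, sigma=1.0, rng=None):
    """One CRP-style association sweep over scalar measurements.

    Each measurement joins existing object k with weight proportional to
    (count_k + 1) times a Gaussian likelihood around mean_k, or opens a
    new object with weight alpha (a crude placeholder choice).  `means`
    is extended in place when a new object is created."""
    rng = rng or random.Random(0)
    counts = [0] * len(means)
    assignments = []
    for zm in measurements:
        K = len(means)
        weights = [(counts[k] + 1) * math.exp(-0.5 * ((zm - m) / sigma) ** 2)
                   for k, m in enumerate(means)]
        weights.append(alpha * math.exp(-0.5))   # placeholder new-object weight
        u = rng.random() * sum(weights)
        acc, k = 0.0, 0
        for k, w in enumerate(weights):
            acc += w
            if u < acc:
                break
        if k == K:                               # birth of a new object
            means.append(zm)
            counts.append(1)
        else:
            counts[k] += 1
        assignments.append(k)
    return assignments

# Two well-separated objects at 0 and 10; tiny alpha keeps the set fixed.
objects = [0.0, 10.0]
assignments = assign_measurements([0.1, 9.9, -0.2, 10.2], objects, alpha=1e-9)
```

With well-separated objects the likelihood dominates and each measurement is routed to its nearest object; a larger alpha lets the sweep create new objects, which is how a nonparametric prior accommodates a time-varying cardinality.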
9

Evaluation of the Catchment Parameter (CAPA) and Midgley and Pitman (MIPI) empirical design flood estimation methods

Smal, Ruan 12 1900
Thesis (MScEng)--Stellenbosch University, 2012. / ENGLISH ABSTRACT: The devastating effects floods have at both social and economic levels make effective flood risk management an essential part of rural and urban development. A major part of effective flood risk management is the application of reliable design flood estimation methods. Research over the years has shown that current design flood estimation methods typically exhibit large discrepancies, mainly because the methods are outdated (Smithers, 2007). This research focused on evaluating and updating the Midgley and Pitman (MIPI) and the Catchment Parameter (CAPA or McPherson) empirical design flood estimation methods. The evaluation compared design floods estimated by each method with more reliable probabilistic design floods derived from historical flow records. Flow gauging stations were selected as drainage data points based on the availability of flow data and catchment characteristics; a selection criterion was developed, resulting in 53 gauging stations. The Log Normal (LN) and Log Pearson Type III (LP III) distributions were used to derive the probabilistic floods for each gauging station. The gauging stations were used to delineate catchments and to quantify catchment characteristics using Geographic Information Systems (GIS) software and its associated applications. The two methods were approximated by means of derived formulas rather than evaluated and updated from first principles, owing to time constraints and limited access to the relevant literature. The formulas were derived by plotting method inputs and outputs in graphs, fitting a trendline through the points, and deriving a formula that best described each trendline. The derived formulas and the catchment characteristics were then used to estimate the design floods for each method. A comparison was made between the design flood results of the two methods and the probabilistic design floods, and the results were used to derive correction factors that could potentially increase the reliability of the two methods. The effectiveness of any updating is the degree to which the reliability of a method can be increased. The correction factors did reduce the difference between the assumed, more reliable probabilistic design floods and the methods' estimates; however, the increase in reliability achieved through the recommended correction factors is questionable, owing to factors such as the reliability of the flow data and the methods that had to be used to derive the correction factors.
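Deriving a probabilistic design flood from an annual-maximum series with the Log Normal distribution reduces to fitting a normal to the log flows and reading off the (1 - 1/T) quantile. A minimal sketch (the peak-flow series is invented; an LP III fit would additionally use the skew of the log flows):

```python
import math
from statistics import NormalDist

def log_normal_design_flood(annual_maxima, return_period):
    """T-year design flood from a Log Normal fit: fit a normal
    distribution to ln(Q) and read off the (1 - 1/T) quantile."""
    logs = [math.log(q) for q in annual_maxima]
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / (n - 1))
    z = NormalDist().inv_cdf(1.0 - 1.0 / return_period)
    return math.exp(mu + z * sigma)

peaks = [120, 95, 210, 160, 300, 85, 140, 190, 110, 250]  # invented, m^3/s
q100 = log_normal_design_flood(peaks, 100)                # ~100-year flood
```

Quantiles computed this way from gauged records are the "more reliable probabilistic design floods" against which the empirical methods are benchmarked.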
10

Instrumental variables estimation: an alternative approach?

Ba, Bocar 10 1900
This thesis proposes an alternative way to obtain the instrumental-variables estimator. First, a set of candidate estimates is obtained through generalized method of moments estimation; the alternative estimator is then obtained by combining the candidate estimates through an optimal weight matrix. The assumptions guaranteeing the efficiency and asymptotic normality of the alternative estimator are presented, and a test for instrument validity is suggested. Monte Carlo experiments show that the bias of the proposed estimator grows with the number of candidate estimates. The average standard error of the proposed estimator is smaller than that of the standard instrumental-variables estimator when the instruments are not weak. As the range of candidate estimators widens, the alternative estimator outperforms the standard one when endogeneity is weak or moderate. Power properties under Pitman-local alternatives are also derived; Monte Carlo simulations indicate that the test's large-sample size and power are satisfactory, though it may suffer from a small-sample bias. Finally, using the approach of Angrist and Krueger (1991) for the empirical application, the alternative and standard estimators give similar results, with the proposed method yielding smaller standard errors. ______________________________________________________________________________ AUTHOR KEYWORDS: Pitman-local alternatives, power function, generalized method of moments, minimum distance, Monte Carlo simulation, Wald test, instrumental variables, hypothesis testing.
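The standard instrumental-variables estimator that the thesis benchmarks against can be sketched in its simplest single-instrument (Wald) form, alongside the OLS slope it corrects. The simulated design below is invented for illustration:

```python
import random

def iv_slope(y, x, z):
    """Single-instrument IV (Wald) slope: cov(z, y) / cov(z, x), consistent
    when z correlates with x but not with the structural error."""
    n = len(y)
    mz, my, mx = sum(z) / n, sum(y) / n, sum(x) / n
    czy = sum((a - mz) * (b - my) for a, b in zip(z, y))
    czx = sum((a - mz) * (b - mx) for a, b in zip(z, x))
    return czy / czx

def ols_slope(y, x):
    """Ordinary least squares slope, biased when x is endogenous."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

# Simulated design (invented): x is endogenous through the error u.
rng = random.Random(42)
beta = 2.0
z = [rng.gauss(0, 1) for _ in range(5000)]                     # instrument
u = [rng.gauss(0, 1) for _ in range(5000)]                     # structural error
x = [zi + 0.8 * ui + rng.gauss(0, 1) for zi, ui in zip(z, u)]  # endogenous regressor
y = [beta * xi + ui for xi, ui in zip(x, u)]
b_iv = iv_slope(y, x, z)
b_ols = ols_slope(y, x)
```

With a strong instrument, b_iv recovers the true slope while b_ols is biased upward by the x-u correlation; the thesis's proposal replaces this single estimate with an optimally weighted combination of GMM candidate estimates.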
