501

Dimensionality Reduction in High-Dimensional Profile Analysis Using Scores

Vikbladh, Jonathan January 2022
Profile analysis is a multivariate statistical method for comparing the mean vectors of different groups. It consists of three tests: the tests for parallelism, level, and flatness. The results from each test give information about the behaviour of the groups and of the variables within the groups. The test statistics used when there are more than two groups are likelihood-ratio tests. However, issues in the form of indeterminate test statistics occur in the high-dimensional setting, that is, when there are more variables than observations. This thesis investigates a method for approaching this problem by reducing the dimensionality of the data using scores, that is, linear combinations of the variables. Three different ways of choosing this score are compared: the eigendecomposition and two variations of the non-negative matrix factorization. The methods are compared using simulations for five different types of mean parameter settings. The results show that the eigendecomposition is the best technique for choosing the score, and that using more scores only slightly improves the results. Moreover, the results for the parallelism and flatness tests are shown to be very good, but the results for the level hypothesis deviate from expectation.
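A minimal sketch of the score construction described here, assuming the score weights are taken as leading eigenvectors of the pooled sample covariance matrix; the function name, shapes, and this particular choice of matrix are illustrative assumptions, not taken from the thesis:

```python
import numpy as np

def eigen_scores(groups, q=1):
    """Reduce p-dimensional group data to q score variables.

    groups: list of (n_g, p) arrays, one per group, with p possibly
    larger than the total sample size (the high-dimensional setting).
    Returns the per-group score matrices of shape (n_g, q).
    """
    n_total = sum(len(X) for X in groups)
    # pooled sample covariance across groups
    S = sum((len(X) - 1) * np.cov(X, rowvar=False) for X in groups)
    S /= n_total - len(groups)
    # eigh sorts eigenvalues ascending, so reverse to take the q leading
    # eigenvectors as score weights (linear combinations of the variables)
    _, vecs = np.linalg.eigh(S)
    W = vecs[:, ::-1][:, :q]                 # (p, q) weight matrix
    return [X @ W for X in groups]
```

Once the data are reduced to q scores with q smaller than the number of observations, the ordinary likelihood-ratio tests for parallelism, level, and flatness are again well defined.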
502

On Steiner Symmetrizations of First Exit Time Distributions and Levy Processes

Timothy M Rolling (16642125) 25 July 2023
The goal of this thesis is to establish generalized isoperimetric inequalities on first exit time distributions as well as expectations of Lévy processes.

Firstly, we prove inequalities on first exit time distributions in the case that the Lévy process is an $\alpha$-stable symmetric process $A_t$ on $\mathbb{R}^d$, $\alpha\in(0,2]$. Given $A_t$ and a bounded domain $D\subset\mathbb{R}^d$, we present a proof, based on the classical Brascamp-Lieb-Luttinger inequalities for multiple integrals, that the distribution of the first exit time of $A_t$ from $D$ increases under Steiner symmetrization. Further, it is shown that when a sequence of domains $\{D_m\}$, each contained in a ball $B\subset\mathbb{R}^d$ and satisfying the $\varepsilon$-cone property, converges to a domain $D'$ with respect to the Hausdorff metric, the sequence of distributions of first exit times for Brownian motion from $D_m$ converges to the distribution of the exit time of Brownian motion from $D'$. The second set of results in this thesis extends the theorems of [BanMen] by proving generalized isoperimetric inequalities on expectations of Lévy processes in the case of Steiner symmetrization.

These results are then used to establish inequalities involving distributions of first exit times of $\alpha$-stable symmetric processes $A_t$ from triangles and quadrilaterals. The primary application of these inequalities is verifying a conjecture of Bañuelos for these planar domains. This extends a classical result of Pólya and Szegő to the fractional Laplacian with Dirichlet boundary conditions.
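A hedged formal reading of the first symmetrization result, in notation assumed here rather than taken from the thesis: write $D^*$ for the Steiner symmetrization of $D$, $x^*$ for the correspondingly symmetrized starting point, and $\tau_D$ for the first exit time of $A_t$ from $D$.

```latex
% Sketch of the statement that the exit-time distribution increases
% under Steiner symmetrization (assumed notation):
\[
  \mathbb{P}_{x^*}\!\bigl(\tau_{D^*} > t\bigr)
  \;\ge\;
  \mathbb{P}_{x}\!\bigl(\tau_{D} > t\bigr),
  \qquad t > 0 .
\]
```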
503

Comparison of MaxEnt and boosted regression tree model performance in predicting the spatial distribution of threatened plant, Telephus spurge (Euphorbia telephioides)

Mainella, Alexa Marie 29 April 2016
No description available.
504

Associations between stream macroinvertebrate communities and surface substrate size distributions

Stamp, Jennifer January 2004
No description available.
505

Electric Field and Voltage Distributions along Non-ceramic Insulators

Que, Weiguo January 2002
No description available.
506

Proximity curves for potential-based clustering

Csenki, Attila, Neagu, Daniel, Torgunov, Denis, Micic, Natasha 11 January 2020
The concept of a proximity curve and a new algorithm are proposed for obtaining clusters in a finite set of data points in finite-dimensional Euclidean space. Each point is endowed with a potential constructed by means of a multi-dimensional Cauchy density, contributing to an overall anisotropic potential function. Guided by the steepest descent algorithm, the data points are successively visited and removed one by one; at each stage the overall potential is updated and the magnitude of its local gradient is calculated. The result is a finite sequence of tuples, the proximity curve, whose pattern is analysed to give rise to a deterministic clustering. The finite set of all such proximity curves, in conjunction with a simulation study of their distribution, results in a probabilistic clustering represented by a distribution on the set of dendrograms. A two-dimensional synthetic data set is used to illustrate the proposed potential-based clustering idea. The results achieved are plausible, since both the ‘geographic distribution’ of the data points and the ‘topographic features’ imposed by the potential function are well reflected in the suggested clustering. For validation, experiments are conducted on the Iris data set, a standard classification and clustering benchmark. The results are consistent with the proposed theoretical framework and the properties of the data, and they open new approaches to processing data from different perspectives and to interpreting the contribution of data attributes to patterns.
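One plausible reading of the procedure as a sketch, assuming a Cauchy-type kernel with an illustrative bandwidth `gamma` and a visit order that always removes the remaining point whose local gradient magnitude is smallest; none of these names or choices come from the paper itself:

```python
import numpy as np

def cauchy_potential_grad(x, pts, gamma=1.0):
    """Gradient at x of the summed Cauchy-type potentials of pts."""
    diffs = x - pts                                   # (m, d)
    r2 = (diffs ** 2).sum(axis=1)
    # phi(x) = sum_i 1 / (1 + |x - p_i|^2 / gamma^2); differentiate in x
    w = -2.0 / (gamma ** 2 * (1.0 + r2 / gamma ** 2) ** 2)
    return (w[:, None] * diffs).sum(axis=0)

def proximity_curve(points, gamma=1.0):
    points = np.asarray(points, dtype=float)
    remaining = list(range(len(points)))
    curve = []
    while remaining:
        # evaluate the local gradient magnitude of the current overall
        # potential at every remaining point ...
        grads = [np.linalg.norm(
                     cauchy_potential_grad(points[i], points[remaining], gamma))
                 for i in remaining]
        # ... and visit/remove the point where it is smallest (assumed order)
        k = int(np.argmin(grads))
        curve.append((remaining[k], grads[k]))        # tuple on the curve
        remaining.pop(k)                              # removal updates the potential
    return curve
```

The returned sequence of (index, gradient-magnitude) tuples is the raw material whose pattern the paper analyses to derive a deterministic clustering.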
507

Toward Error-Statistical Principles of Evidence in Statistical Inference

Jinn, Nicole Mee-Hyaang 02 June 2014
The context for this research is statistical inference, the process of making predictions or inferences about a population from observation and analysis of a sample. In this context, many researchers want to know which inferences are valid, in the sense of being defensible by argument or evidence. Another pressing question among users of statistical methods is: how can spurious relationships be distinguished from genuine ones? Underlying both of these issues is the concept of evidence. In response to these (and similar) questions, the two questions I work on in this essay are: (1) what is a genuine principle of evidence? and (2) do error probabilities have more than a long-run role? Concisely, I propose that genuine principles of evidence should provide concrete guidelines on precisely how to examine error probabilities, with respect to a test's aptitude for unmasking pertinent errors, thereby establishing sound interpretations of results from statistical techniques. The starting point for my definition of genuine principles of evidence is Allan Birnbaum's confidence concept, an attempt to control misleading interpretations. However, Birnbaum's confidence concept is inadequate for interpreting statistical evidence, because using only pre-data error probabilities would not pick up on a test's ability to detect a discrepancy of interest (e.g., "even if the discrepancy exists") with respect to the actual outcome. Instead, I argue that Deborah Mayo's severity assessment is the most suitable characterization of evidence under my definition of genuine principles of evidence.
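A hedged numerical illustration of Mayo's severity assessment for the standard one-sided normal test (H0: mu <= mu0 vs H1: mu > mu0), a textbook case rather than code from the thesis; sigma, n, and the observed mean are assumed inputs:

```python
from scipy.stats import norm

def severity(mu1, xbar, sigma=1.0, n=25):
    """SEV(mu > mu1): probability the test would have produced a result
    no larger than the observed mean xbar, were mu exactly mu1 (i.e.,
    were the claim 'mu > mu1' just false)."""
    se = sigma / n ** 0.5
    return norm.cdf((xbar - mu1) / se)

# Example: with xbar = 0.4, sigma = 1, n = 25, the claim mu > 0.2
# passes with severity about 0.84
print(round(severity(mu1=0.2, xbar=0.4), 2))
```

Unlike a pre-data error probability, this quantity depends on the actual outcome, which is the post-data role for error probabilities that the essay argues for.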
508

Intuitive Numerical Information Processes in Consumer Judgment

Villanova, Daniel Joseph Bodin 09 April 2018
Numerical information is ubiquitous in modern life. The prevalence of numerical information in the marketplace necessitates understanding how consumers handle and interpret that information, for both theoretical and practical reasons. Past research has largely focused on consumers’ encoding of numbers, calculative limitations, and usage of heuristics. This dissertation will contribute to this burgeoning literature in several ways. First, I identify a general tendency in how consumers calculate ratios based on an intuitive model of division. Specifically, consumers tend to divide larger numbers by smaller numbers. The intuitive model of division has marketing implications for both consumers’ evaluations of quantity offers and sensitivities to promotions. Next, I examine how consumers draw inferences from distributional information. In contrast to the assumption that consumers utilize means to assess central tendency, I demonstrate that consumers use the modal response to judge what is typical, with implications for consumers’ inferences about product ratings and other social distributions.
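A tiny sketch, on invented data, of the distinction the second essay turns on: for a polarized one-to-five-star distribution, the modal response and the mean can tell quite different stories.

```python
import numpy as np

# Polarized 1-to-5-star ratings (invented counts: 35/5/5/10/45)
ratings = np.array([1] * 35 + [2] * 5 + [3] * 5 + [4] * 10 + [5] * 45)
values, counts = np.unique(ratings, return_counts=True)
mode = values[counts.argmax()]
print(f"mean = {ratings.mean():.2f}, mode = {mode}")   # mean = 3.25, mode = 5
```

A consumer who judges typicality by the mode would read this product as "typically five stars," while the mean summary points to a middling product.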
509

Complete Surface Current Distribution in a Normal-Mode Helical Antenna using a Galerkin Solution with Sinusoidal Basis Functions

Abd-Alhameed, Raed, Excell, Peter S. January 2002
An investigation of the surface current distribution in a normal-mode helical antenna (NMHA) is reported. This enables precise prediction of the performance of NMHAs, since traditional wire-antenna simulations ignore important details, such as non-uniform and transverse current distributions. A moment-method formulation is developed, using two-dimensional basis functions to represent the total non-uniform surface current distribution over the surface of the wire of the helix. Piecewise-sinusoidal basis functions are employed in two normal directions, with an exact kernel formulation and application of Galerkin's solution method. The numerical solution of the singular integrals associated with self-impedance terms was computed with a very low relative error. The surface current distribution was computed for different helix geometries. It was found that the axially-directed component of the current distribution around the surface of the wire was highly non-uniform and that there was also a significant circumferential current flow due to inter-turn capacitance, both effects that are overlooked by standard filamentary current representations.
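A schematic sketch of the Galerkin method-of-moments structure on a toy one-dimensional integral equation, showing only the assemble-and-solve pattern (Z i = v); the placeholder kernel, sine basis, and all names here are assumptions, not the paper's exact-kernel helix formulation:

```python
import numpy as np

L, N, M = 1.0, 8, 400                      # domain length, basis count, quad pts
x = np.linspace(0.0, L, M)
dx = x[1] - x[0]

def basis(n, t):
    # entire-domain sine basis vanishing at the endpoints (placeholder)
    return np.sin((n + 1) * np.pi * t / L)

def kernel(t, s):
    return np.exp(-np.abs(t - s))          # placeholder smooth kernel

B = np.array([basis(n, x) for n in range(N)])     # (N, M) sampled basis
K = kernel(x[:, None], x[None, :])                # (M, M) sampled kernel
Z = B @ K @ B.T * dx * dx                         # Galerkin: Z_mn = <f_m, K f_n>
v = B @ np.ones(M) * dx                           # excitation: v_m = <f_m, g>, g = 1
i = np.linalg.solve(Z, v)                         # expansion coefficients
current = i @ B                                   # reconstructed solution on the grid
```

In the paper's setting the basis is two-dimensional over the wire surface (axial and circumferential directions), which is what lets the formulation capture the non-uniform and transverse current components that filamentary models miss.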
510

Utilizing soil quality data for premium rate making in the federal crop insurance program

Moore, Rylan 08 August 2023
The federal crop insurance program provides coverage for millions of acres and many commodities every year. The Risk Management Agency of the USDA is responsible for determining the premium rates for these covered commodities. Currently, soil quality is not considered when determining baseline yields and expected premium rates. This study utilizes the moment-based maximum entropy method to assess the effect of incorporating soil quality into the rate-making methodology. Several moments of upland cotton yield in Arkansas, Mississippi, and Texas are conditioned on weather, irrigation, and soil control variables. Ultimately, I find evidence of mispriced premium rates for counties in all three states, for both irrigated and non-irrigated upland cotton.
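A hedged sketch of the moment-based maximum-entropy step: fit a density proportional to exp(sum_k lambda_k y^k) on a grid by minimizing the convex dual (log-partition minus lambda-dot-moments). The yield grid and target moments below are illustrative assumptions; the study's actual conditioning on weather, irrigation, and soil is not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

grid = np.linspace(0.0, 3.0, 600)                     # standardized yield support
dx = grid[1] - grid[0]
powers = np.vander(grid, 4, increasing=True)[:, 1:]   # columns: y, y^2, y^3
target = np.array([1.0, 1.2, 1.7])                    # assumed sample moments E[y^k]

def dual(lam):
    # convex dual: log-partition (rectangle-rule integral) minus lam . target
    return logsumexp(powers @ lam) + np.log(dx) - lam @ target

lam = minimize(dual, np.zeros(3), method="BFGS").x
log_z = logsumexp(powers @ lam) + np.log(dx)
density = np.exp(powers @ lam - log_z)                # integrates to ~1 on the grid
```

The fitted density then yields the conditional yield distribution from which expected indemnities, and hence premium rates, can be computed and compared with current rates.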
