181

Uncertainty, Identification, and Privacy: Experiments in Individual Decision-Making

Rivenbark, David 01 January 2010 (has links)
The alleged privacy paradox states that individuals report high values for personal privacy, while at the same time they report behavior that contradicts a high privacy value. This is a misconception: reported privacy behaviors are explained by asymmetric subjective beliefs. Beliefs may or may not be uncertain, and non-neutral attitudes towards uncertainty are not necessary to explain behavior. This research was conducted in three related parts.

Part one presents an experiment in individual decision-making under uncertainty. Ellsberg's canonical two-color choice problem was used to estimate attitudes towards uncertainty. Subjects believed bets on the color of the ball drawn from Ellsberg's ambiguous urn were equally likely to pay, and estimated attitudes towards uncertainty were insignificant. Subjective expected utility explained subjects' choices better than uncertainty aversion and the uncertain priors model. A second treatment tested Vernon Smith's conjecture that preferences in Ellsberg's problem would be unchanged when the ambiguous lottery is replaced by a compound objective lottery; using an objective compound lottery to induce uncertainty did not affect subjects' choices.

The second part extended the concept of uncertainty to commodities for which quality, and the accuracy of a report on that quality, are potentially ambiguous. The uncertain priors model extends naturally to allow for different attitudes towards these two sources of uncertainty. As they relate to privacy, quality and the accuracy of a quality report serve as metaphors for online security and consumer trust in e-commerce, respectively. The results of parametric structural tests were mixed: subjects made choices consistent with neutral attitudes towards uncertainty in both the quality and accuracy domains, but a model allowing for uncertainty aversion in the quality domain only outperformed the alternative allowing for uncertainty aversion in the accuracy domain only.

Finally, part three integrated a public-goods game and punishment opportunities with the Becker-DeGroot-Marschak mechanism to elicit privacy values, replicating previously reported privacy behaviors. The procedures developed elicited punishment (consequence) beliefs and information confidentiality beliefs in the context of individual privacy decisions. Three contributions are made to the literature. First, by using cash rewards to map actions to consequences, the study eliminated hypothetical bias, a confounding behavioral factor pervasive in the privacy literature; econometric results support the 'privacy paradox' only at significance levels greater than 10 percent. Second, the roles of asymmetric beliefs and attitudes towards uncertainty were identified using parametric structural likelihood methods. Subjects were, in general, uncertainty neutral and believed 'bad' events were more likely to occur when their private information was not kept confidential. Third, a partial test determined which uncertain process, the loss of privacy or the resolution of consequences, is of primary importance to individual decision-makers; choices were consistent with uncertainty-neutral preferences in both the privacy and consequences domains.
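The contrast between subjective expected utility and the uncertain priors model tested in part one can be made concrete with a small numerical sketch. The Python snippet below is illustrative only, not the dissertation's estimation code: the CRRA utility, the prize, and the prior interval [0.3, 0.7] are all assumed values. It shows that an SEU agent with symmetric beliefs is indifferent between Ellsberg's risky and ambiguous urns, while a maxmin (uncertainty-averse) agent strictly prefers the risky one.

```python
# Minimal sketch of SEU vs. maxmin "uncertain priors" valuations of bets in
# Ellsberg's two-color problem. All parameter values are illustrative.

def crra_utility(x, r=0.5):
    """CRRA utility; r is an assumed risk-aversion coefficient."""
    return x ** (1 - r) / (1 - r)

def seu_value(prize, p_win, r=0.5):
    """SEU value of a bet paying `prize` with subjective win probability p_win."""
    return p_win * crra_utility(prize, r)

def maxmin_value(prize, p_low, p_high, r=0.5):
    """Maxmin EU over the prior set [p_low, p_high]: the bet is evaluated
    at the least favorable prior, so the lower endpoint is decisive."""
    return min(seu_value(prize, p, r) for p in (p_low, p_high))

prize = 10.0
risky = seu_value(prize, 0.5)            # risky urn: objective 50/50
seu_ambiguous = seu_value(prize, 0.5)    # SEU with symmetric beliefs: identical
maxmin_ambiguous = maxmin_value(prize, 0.3, 0.7)  # uncertainty-averse valuation

print(f"risky bet:                  {risky:.3f}")
print(f"ambiguous bet under SEU:    {seu_ambiguous:.3f}")
print(f"ambiguous bet under maxmin: {maxmin_ambiguous:.3f}")
```

Under symmetric beliefs the first two values coincide, which is why insignificant uncertainty attitudes leave subjective expected utility as the better explanation of observed choices.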
182

Wavelet-Domain Hyperspectral Soil Texture Classification

Zhang, Xudong 08 May 2004 (has links)
This thesis presents an automatic soil texture classification system using hyperspectral soil signals and wavelet-based statistical models. Previous soil texture classification systems are closely related to texture classification methods, which use images for training and testing. Although image-based algorithms are a straightforward way to conduct soil texture classification, our research shows that they do not provide reliable and consistent results. Instead, we develop a novel system using hyperspectral soil textures, better known as hyperspectral soil signals, which provide rich information about the intrinsic properties of soil textures. Hyperspectral soil signals are by their very nature nonstationary and time-varying, so the wavelet transform, which has proven successful in such applications, is incorporated. In this study, we employ two wavelet-domain statistical models for the classification task: a maximum likelihood (ML) model and a hidden Markov model (HMM). Experimental results show that this method is reliable and robust, and that it is more effective and efficient in practical implementation than traditional image-based methods.
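As a rough illustration of the ML branch of such a system, the sketch below trains per-class wavelet subband variances and classifies a test signal by its wavelet-domain Gaussian log-likelihood. It is not the thesis code: the 'db4' wavelet, the four decomposition levels, the zero-mean Gaussian subband model, and the synthetic stand-in signals are all assumptions made for illustration.

```python
import numpy as np
import pywt  # PyWavelets

def subband_variances(signals, wavelet="db4", level=4):
    """Per-subband coefficient variances, averaged over a class's training signals."""
    coeff_sets = [pywt.wavedec(s, wavelet, level=level) for s in signals]
    return [np.mean([np.var(c[b]) for c in coeff_sets]) for b in range(level + 1)]

def loglik(signal, variances, wavelet="db4", level=4):
    """Zero-mean Gaussian log-likelihood of the signal's wavelet coefficients."""
    ll = 0.0
    for band, v in zip(pywt.wavedec(signal, wavelet, level=level), variances):
        ll += -0.5 * np.sum(band ** 2 / v + np.log(2 * np.pi * v))
    return ll

def classify(signal, class_models):
    """Assign the class whose subband-variance model best explains the signal."""
    return max(class_models, key=lambda c: loglik(signal, class_models[c]))

# Toy data standing in for hyperspectral soil signals of two texture classes.
rng = np.random.default_rng(0)
train = {"sandy": [rng.normal(0.0, 1.0, 512) for _ in range(20)],
         "clay":  [rng.normal(0.0, 3.0, 512) for _ in range(20)]}
models = {c: subband_variances(sigs) for c, sigs in train.items()}
print(classify(rng.normal(0.0, 3.0, 512), models))  # expected: clay
```

An HMM variant would additionally model dependencies of coefficients across scales rather than treating each subband independently.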
183

Analysis of Agreement Between Two Long Ranked Lists

Sampath, Srinath January 2013 (has links)
No description available.
184

Reliability Assessment for Complex Systems Using Multi-level, Multi-type Reliability Data and Maximum Likelihood Method

Li, Xiangfei 24 September 2014 (has links)
No description available.
185

Updating Bridge Deck Condition Transition Probabilities as New Inspection Data are Collected: Methodology and Empirical Evaluation

Li, Zequn January 2017 (has links)
No description available.
186

Stochastic modeling of the sleep process

Gibellato, Marilisa Gail 09 March 2005 (has links)
No description available.
187

Blind rate detection for multirate DS-CDMA signals

Sharma, Abhay January 2000 (has links)
No description available.
188

Likelihood Inference for Log-Logistic Distribution under Progressive Type-II Right Censoring

Alzahrani, Alya 10 1900 (has links)
Censoring arises quite often in lifetime data, and its presence may be planned or unplanned. In this project, we consider progressive Type-II right censoring when the underlying distribution is log-logistic. The objective is to discuss inferential methods for the unknown parameters of the distribution based on maximum likelihood estimation. The Newton-Raphson method is proposed as a numerical technique for solving the pertinent non-linear equations. In addition, confidence intervals for the unknown parameters are constructed based on (i) the asymptotic normality of the maximum likelihood estimates, and (ii) the percentile bootstrap resampling technique. A Monte Carlo simulation study is conducted to evaluate the performance of the methods of inference developed here, and some illustrative examples are presented. / Master of Science (MSc)
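A minimal sketch of this likelihood setup follows. It is not the project's code: a general-purpose quasi-Newton optimizer stands in for the hand-coded Newton-Raphson iteration, and the failure times and censoring scheme R_i are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, t, R):
    """Negative log-likelihood of progressively Type-II censored log-logistic data.
    The likelihood is prod_i f(t_i) * S(t_i)**R_i, with R_i units withdrawn at
    the i-th observed failure; params holds (log alpha, log beta) so the search
    is unconstrained while both parameters stay positive."""
    alpha, beta = np.exp(params)
    z = (t / alpha) ** beta
    log_f = np.log(beta / alpha) + (beta - 1) * np.log(t / alpha) - 2 * np.log1p(z)
    log_S = -np.log1p(z)
    return -np.sum(log_f + R * log_S)

# Illustrative sample: m = 5 observed failures, R_i withdrawals at each failure.
t = np.array([0.8, 1.3, 1.9, 2.7, 4.1])
R = np.array([2, 0, 1, 0, 3])

fit = minimize(neg_loglik, x0=np.zeros(2), args=(t, R), method="BFGS")
alpha_hat, beta_hat = np.exp(fit.x)
print(f"alpha = {alpha_hat:.3f}, beta = {beta_hat:.3f}")

# Asymptotic-normality intervals from BFGS's inverse-Hessian approximation,
# built on the log scale and exponentiated back.
se = np.sqrt(np.diag(fit.hess_inv))
lo, hi = np.exp(fit.x - 1.96 * se), np.exp(fit.x + 1.96 * se)
print(f"95% CI for alpha: ({lo[0]:.3f}, {hi[0]:.3f}), beta: ({lo[1]:.3f}, {hi[1]:.3f})")
```

The percentile bootstrap alternative would refit this likelihood to many resampled data sets and read the interval endpoints off the empirical quantiles of the estimates.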
189

Phase Transform Time Delay Estimation to Counteract Spectral Haystacking Effects in Jet Exhaust Flow Measurements

Silas, Kevin Alexander 01 September 2021 (has links)
This study determined a superior data processing technique for correlating an acoustic signal passing through a subsonic jet engine exhaust in order to estimate the traversal time of the signal. Thrust measurement is possible with enough time delay estimates across different portions of the exhaust. This preliminary study did not take the full array of data necessary to measure thrust, but did validate key aspects of the measurement process. The turbulent shear layers of the exhaust spectrally broaden the signal, creating the appearance of spectral "haystacks", making traditional correlation methods unworkable. An experiment was performed to evaluate the ability of a novel sound source to produce a signal from which a reliable and precise time delay estimate could be found. The test apparatus was installed on either side of a Honeywell TFE731-2 turbofan research engine exhaust cone, with the source and receivers placed near the jet exit plane. The signal was then directed across the jet exhaust. This flow environment is considered an extreme challenge for accurate acoustic signal propagation. A key contribution of this paper is the determination that the Phase Transform processor of the Generalized Cross-Correlation (GCC) method produces the most reliable time delay estimates, for the given signal and flow conditions. Several alternative time delay estimators and GCC processors were examined and evaluated on this data. A proposed explanation is provided for why this time delay estimation technique produces the most accurate results, as well as explanations for why the technique became less reliable as the flow environment became more challenging, with an observed 22% anomalous TDE selection rate for the N1Corr = 60% and N1Corr = 70% conditions combined, versus only 6% for the idle and N1Corr = 50% conditions combined. This paper also details the development and first use of a novel acoustic source that produces a two-tone narrowband signal emanating from a single point – the dual Hartmann generator. / Master of Science / This study builds on a Computational Tomography (CT) technique that uses an acoustic signal and an array of receivers to measure the velocity and temperature of a gas flow field. In particular, the velocity and temperature field tested involves multiple turbulent and disruptive elements, requiring a loud and specifically designed signal. As such, a novel acoustic signal generator, the dual Hartmann generator, was designed that is both loud and produces a specific two-toned signal. The key contribution of the study was to process the data, comparing the sets of transmitted and received signals, in order to estimate the time delay amongst receiver pairs – a key input in the CT method. Traditional cross-correlation methods were inadequate, and multiple alternatives were evaluated. The Phase Transform (PHAT) technique showed the most promise, and an explanation is given for why this technique is most suitable for this type of signal.
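For reference, a minimal GCC-PHAT sketch is given below. It is not the study's processing pipeline: a broadband noise burst stands in for the dual Hartmann generator's two-tone signal, and the sample rate, delay, and noise level are invented. The phase transform normalizes the cross-spectrum to unit magnitude so the correlation peak depends only on phase, which is what makes the estimator resistant to the spectral broadening described above.

```python
import numpy as np

def gcc_phat(sig, ref, fs, max_tau=None):
    """Estimate the delay of `sig` relative to `ref` with GCC-PHAT."""
    n = len(sig) + len(ref)            # zero-pad to avoid circular wraparound
    cross = np.fft.rfft(sig, n=n) * np.conj(np.fft.rfft(ref, n=n))
    cross /= np.abs(cross) + 1e-15     # phase transform: keep phase, drop magnitude
    cc = np.fft.irfft(cross, n=n)
    max_shift = n // 2 if max_tau is None else min(int(fs * max_tau), n // 2)
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))  # lags -max..+max
    return (np.argmax(np.abs(cc)) - max_shift) / fs

# Toy example: a broadband burst stands in for the study's two-tone signal.
fs = 48_000
d = 120                                # true delay: 120 samples = 2.5 ms
rng = np.random.default_rng(1)
src = rng.normal(size=6000)
ref = src[1000:5800]                   # transmitted snippet
sig = src[1000 - d:5800 - d] + 0.5 * rng.normal(size=4800)  # delayed + noisy copy
print(f"estimated delay: {gcc_phat(sig, ref, fs) * 1e3:.2f} ms")  # ~2.50 ms
```

In practice max_tau would be set from the known source-receiver geometry to exclude physically impossible lags, which also suppresses the anomalous peak selections discussed above.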
190

Mapping quantitative trait loci using multiple linked markers via Residual Maximum Likelihood

Grignola, Fernando E. 10 November 2005 (has links)
Mapping quantitative trait loci (QTLs) in outbred populations is important because the development of inbred lines in livestock species is usually not feasible. Traditional genetic mapping methods, such as least squares and maximum likelihood, cannot fully accommodate complex pedigree structures, and more sophisticated methods such as Bayesian analysis are computationally very demanding. In this thesis, an alternative approach is presented, based on a Residual Maximum Likelihood (REML) method, for estimating the position and variance of one or two linked QTLs along with the additive polygenic and residual variances. The method is based on a mixed linear model including polygenic and random QTL allelic effects. The variance-covariance matrix of QTL allelic effects, and its inverse, is computed conditional on incomplete information from multiple linked markers. The method is implemented using interval mapping and a derivative-free algorithm, where the required coefficient matrix of the Mixed Model Equations is derived from a Reduced Animal Model. Simulation studies based on a granddaughter design with 2000 sons, 20 sires, and 9 ancestors were performed to evaluate parameter estimation and the power of QTL detection. Daughter Yield Deviations of sons were simulated under three QTL models: biallelic, multiallelic (10 alleles), and normal-effects. A linkage group of five or nine markers located on the same chromosome was assumed, and genotypes were available on sons, sires, and ancestors. Likelihood ratio statistics were used to test for the presence of one or two linked QTLs. Parameters were estimated quite accurately for all three QTL models, showing that the method is robust to the number of alleles at the QTL. Ignoring relationships in the analyses did not have a major impact on parameter estimates but reduced the power of QTL detection. In general, power tended to decrease as the number of sons per sire, the QTL contribution to additive genetic variance, or the distance between QTLs was reduced. The method allowed for detection of a single QTL explaining 25% of the additive genetic variance, and for detection of two QTLs when they jointly accounted for 50% or 12.5% of the additive genetic variance. Although the REML analysis is an approximate method, incorporating an expected covariance matrix of the QTL effects conditional on marker information, it is a computationally less expensive alternative to Bayesian analysis for accounting for the distribution of marker-QTL genotypes given marker and phenotypic information. For the designs studied, parameters were estimated accurately and QTLs were mapped with satisfactory power. / Ph. D.
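The flavor of such a derivative-free REML analysis can be conveyed with a heavily simplified sketch; it is not the thesis implementation, which works through the Mixed Model Equations of a Reduced Animal Model. Here, at a putative QTL position, a covariance matrix G of QTL effects is taken as given, the restricted likelihood is profiled over the QTL-to-residual variance ratio h without derivatives, and a likelihood-ratio statistic against h = 0 tests for the QTL. All matrices and data below are simulated stand-ins.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def reml_loglik(h, y, X, G):
    """Profiled restricted log-likelihood of y ~ N(Xb, s2 * (h*G + I)),
    up to an additive constant; the residual variance s2 is profiled out."""
    n, p = X.shape
    W = h * G + np.eye(n)
    Wi = np.linalg.inv(W)
    XtWiX = X.T @ Wi @ X
    P = Wi - Wi @ X @ np.linalg.solve(XtWiX, X.T @ Wi)
    s2 = (y @ P @ y) / (n - p)                 # profiled residual variance
    _, logdet_W = np.linalg.slogdet(W)
    _, logdet_XtWiX = np.linalg.slogdet(XtWiX)
    return -0.5 * ((n - p) * np.log(s2) + logdet_W + logdet_XtWiX + (n - p))

rng = np.random.default_rng(2)
n = 100
X = np.ones((n, 1))                            # fixed effects: intercept only
L = 0.1 * rng.normal(size=(n, n)) + np.eye(n)
G = L @ L.T                                    # stand-in QTL covariance matrix
q = np.linalg.cholesky(G) @ rng.normal(size=n) # simulated QTL effects
y = 1.0 + q + rng.normal(size=n)               # phenotypes: mean + QTL + residual

fit = minimize_scalar(lambda h: -reml_loglik(h, y, X, G),
                      bounds=(0.0, 10.0), method="bounded")
lr = 2 * (reml_loglik(fit.x, y, X, G) - reml_loglik(0.0, y, X, G))
print(f"h_hat = {fit.x:.3f}, LR statistic = {lr:.2f}")
```

Interval mapping repeats this scan with G recomputed at each candidate position along the linkage group, taking the position with the largest likelihood-ratio statistic as the QTL estimate.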
