381

Detection of Latent Heteroscedasticity and Group-Based Regression Effects in Linear Models via Bayesian Model Selection

Metzger, Thomas Anthony 22 August 2019 (has links)
Standard linear modeling approaches make potentially simplistic assumptions regarding the structure of categorical effects that may obfuscate more complex relationships governing data. For example, recent work focused on the two-way unreplicated layout has shown that hidden groupings among the levels of one categorical predictor frequently interact with the ungrouped factor. We extend the notion of a "latent grouping factor" to linear models in general. The proposed work allows researchers to determine whether an apparent grouping of the levels of a categorical predictor reveals a plausible hidden structure given the observed data. Specifically, we offer Bayesian model selection-based approaches to reveal latent group-based heteroscedasticity, regression effects, and/or interactions. Failure to account for such structures can produce misleading conclusions. Since the presence of latent group structures is frequently unknown a priori to the researcher, we use fractional Bayes factor methods and mixture g-priors to overcome the lack of prior information. We provide an R package, slgf, that implements our methodology and demonstrate its usage in practice. / Doctor of Philosophy / Statistical models are a powerful tool for describing a broad range of phenomena in our world. However, many common statistical models may make assumptions that are overly simplistic and fail to account for key trends and patterns in data. Specifically, we search for hidden structures formed by partitioning a dataset into two groups. These two groups may have distinct variability, statistical effects, or other hidden effects that are missed by conventional approaches. We illustrate the ability of our method to detect these patterns across a variety of disciplines and data layouts, and provide software for researchers to implement this approach in practice.
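The fractional-Bayes-factor and mixture g-prior machinery of slgf is beyond the scope of this listing, but the core idea of detecting group-based heteroscedasticity can be sketched in Python, with BIC as a rough stand-in for a Bayesian model score. The data and the two-model comparison below are invented for illustration; they are not the slgf implementation:

```python
import math

def normal_loglik(xs, mu, var):
    # Gaussian log-likelihood of xs under N(mu, var)
    n = len(xs)
    return -0.5 * n * math.log(2 * math.pi * var) \
           - sum((x - mu) ** 2 for x in xs) / (2 * var)

def bic(loglik, k, n):
    # Bayesian information criterion: lower is better
    return k * math.log(n) - 2 * loglik

def compare_variance_models(group_a, group_b):
    """Compare a common-variance model against a two-group-variance model."""
    data = group_a + group_b
    n = len(data)
    mu = sum(data) / n
    # Model 1: one mean, one pooled variance (k = 2 parameters)
    var_pooled = sum((x - mu) ** 2 for x in data) / n
    bic1 = bic(normal_loglik(data, mu, var_pooled), 2, n)
    # Model 2: one mean, a separate variance per group (k = 3 parameters)
    var_a = sum((x - mu) ** 2 for x in group_a) / len(group_a)
    var_b = sum((x - mu) ** 2 for x in group_b) / len(group_b)
    ll2 = normal_loglik(group_a, mu, var_a) + normal_loglik(group_b, mu, var_b)
    bic2 = bic(ll2, 3, n)
    return "heteroscedastic" if bic2 < bic1 else "homoscedastic"
```

The two-variance model wins only when its likelihood gain outweighs the extra-parameter penalty, mirroring how a Bayes factor trades fit against complexity.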
382

UAS Risk Analysis using Bayesian Belief Networks: An Application to the Virginia Tech ESPAARO

Kevorkian, Christopher George 27 September 2016 (has links)
Small Unmanned Aerial Vehicles (SUAVs) are rapidly being adopted in the National Airspace (NAS) but experience a much higher failure rate than traditional aircraft. These SUAVs are quickly becoming complex enough to warrant alternative methods of failure analysis. This thesis proposes a method of expanding the Fault Tree Analysis (FTA) method into a Bayesian Belief Network (BBN) model. FTA is demonstrated to be a special case of BBN, and BBN allows for more complex interactions between nodes than FTA does. A model can be investigated to determine the components to which failure is most sensitive and to allow for redundancies or mitigations against those failures. The introduced method is then applied to the Virginia Tech ESPAARO SUAV. / Master of Science
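The claim that FTA is a special case of BBN can be illustrated with a two-component toy network (the component names and failure probabilities below are invented): an OR gate becomes a deterministic conditional probability table, and exact inference by enumeration reproduces the fault-tree formula P(fail) = 1 - (1 - p1)(1 - p2):

```python
from itertools import product

# Hypothetical component failure probabilities (illustrative only)
p_fail = {"motor": 0.01, "battery": 0.02}

def or_gate(*states):
    # Deterministic CPT reproducing a fault-tree OR gate:
    # the system fails if any input component has failed
    return any(states)

def system_failure_probability():
    """Enumerate the joint distribution of the two-node toy BN."""
    total = 0.0
    for motor, battery in product([True, False], repeat=2):
        p = (p_fail["motor"] if motor else 1 - p_fail["motor"]) * \
            (p_fail["battery"] if battery else 1 - p_fail["battery"])
        if or_gate(motor, battery):
            total += p
    return total
```

Replacing the deterministic CPT with a probabilistic one is exactly the generalization a BBN offers over FTA.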
383

Discrete-Time Bayesian Networks Applied to Reliability of Flexible Coping Strategies of Nuclear Power Plants

Sahin, Elvan 11 June 2021 (has links)
The Fukushima Daiichi accident prompted the nuclear community to find new solutions to reduce risk in nuclear power plants (NPPs) due to beyond-design-basis external events (BDBEEs). An implementation guide for diverse and flexible coping strategies (FLEX) has been presented by the Nuclear Energy Institute (NEI) to manage the challenge of BDBEEs and to enhance reactor safety against extended station blackout (SBO). To assess the effectiveness of FLEX strategies, probabilistic risk assessment (PRA) methods can be used to calculate the reliability of such systems. Due to the uniqueness of FLEX systems, they can potentially carry dependencies among components not commonly modeled in NPPs. Therefore, a suitable method is needed to analyze the reliability of FLEX systems in nuclear reactors. This thesis investigates the effectiveness and applicability of Bayesian networks (BNs) and Discrete-Time Bayesian Networks (DTBNs) in the reliability analysis of FLEX equipment that is utilized to reduce risk in nuclear power plants. To this end, the thesis compares BNs with two other reliability assessment methods: Fault Tree (FT) and Markov chain (MC). It is also shown that these two methods can be transformed into BNs to perform the reliability analysis of FLEX systems. The comparison of the three reliability methods is shown and discussed in three different applications. The results show that BNs are not only a powerful method for modeling FLEX strategies but also an effective technique for the reliability analysis of FLEX equipment in nuclear power plants. / Master of Science / Some external events, like earthquakes, flooding, and severe wind, may damage nuclear reactors. To reduce the consequences of such damage, the Nuclear Energy Institute (NEI) has proposed mitigating strategies known as FLEX (Diverse and Flexible Coping Strategies).
After the implementation of FLEX in nuclear power plants, we need to analyze the failure or success probability of these engineering systems through one of the existing methods. However, the existing methods are limited in analyzing the dependencies among components in complex systems. Bayesian networks (BNs) are a graphical and quantitative technique for modeling dependency among events. This thesis shows the effectiveness and applicability of BNs in the reliability analysis of FLEX strategies by comparing them with two other reliability analysis tools, Fault Tree Analysis and Markov chains. According to the reliability analysis results, BN is a powerful and promising method for modeling and analyzing FLEX strategies.
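As a minimal illustration of the discrete-time view the thesis compares against, a non-repairable component can be modeled as a two-state Markov chain with an absorbing Failed state; a discrete-time BN unrolls the same chain into one node per time slice. The per-step failure probability below is invented:

```python
def reliability_over_time(p_fail_per_step, steps):
    """Two-state discrete-time Markov chain: Working -> Failed (absorbing).

    Returns P(still working) after each time step; a DTBN would represent
    each step as its own node conditioned on the previous one.
    """
    p_working = 1.0
    history = []
    for _ in range(steps):
        # Only the Working -> Working transition keeps the component alive
        p_working *= (1 - p_fail_per_step)
        history.append(p_working)
    return history
```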
384

Interferometry in diffusive systems: Theory, limitation to its practical application and its use in Bayesian estimation of material properties

Shamsalsadati, Sharmin 01 May 2013 (has links)
Interferometry in the geosciences uses mathematical techniques to image subsurface properties. This method turns a receiver into a virtual source by utilizing either random noise or engineered sources. The method has been discussed extensively in seismology. Electromagnetic interferometry at high frequencies with coupled electromagnetic fields was developed in the past. However, the problem was not addressed for diffusive electromagnetic fields where the quasi-static limit holds. One of the objectives of this dissertation was to theoretically derive the impulse response of the Earth for low-frequency electromagnetic fields. Applying the theory of interferometry in regions where the wavefields are diffusive requires volumetrically distributed sources in an infinite domain. That precondition imposed by the theory is not practical in experiments. Hence, the aim of this study was to quantify, through numerical experiments, the important areas and distributions of sources that make it possible to apply the theory in practice. Results of the numerical analysis in double half-space models revealed that for surface-based exploration scenarios, sources are required to reside in the region with higher diffusivity. In contrast, when the receivers straddle an interface, as in borehole experiments, there is no universal rule for which region is more important; it depends on the frequency, receiver separation, and diffusivity contrast between the layers, and varies for different scenarios. Time-series analysis of the sources confirmed previous findings that the accuracy of the Green's function retrieval is a function of both source density and its width. Extending previous work in homogeneous media to inhomogeneous models, it was found that sources must be distributed asymmetrically in the system and extend deeper into the high-diffusivity region than into the low-diffusivity area.
The findings were applied in a three-layered example with a reservoir layer between two impermeable layers. Bayesian statistical inversion of the data obtained by interferometry was then used to estimate the fluid diffusivity (and permeability) along with associated uncertainties. The inversion expresses the estimated model parameters as probability distributions. The output demonstrated that the algorithm converges closely to the true model. / Ph. D.
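A grid-based sketch of that Bayesian inversion step, with a flat prior and a hypothetical exponential-decay forward model standing in for the physics (the dissertation's actual forward model is the diffusive interferometric response, not this toy):

```python
import math

def grid_posterior(observations, candidates, noise_sd, forward):
    """Posterior over candidate diffusivities on a grid, flat prior.

    Each candidate's weight is its Gaussian likelihood of the observed
    amplitudes; normalizing yields a probability distribution, so the
    estimate carries its own uncertainty.
    """
    weights = []
    for d in candidates:
        pred = forward(d)
        ll = sum(-((y - p) ** 2) / (2 * noise_sd ** 2)
                 for y, p in zip(observations, pred))
        weights.append(math.exp(ll))
    z = sum(weights)
    return [w / z for w in weights]
```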
385

Variable selection for generalized linear mixed models and non-Gaussian genome-wide association study data

Xu, Shuangshuang 11 June 2024 (has links)
Genome-wide association studies (GWAS) aim to identify single nucleotide polymorphisms (SNPs) associated with phenotypes. The number of SNPs ranges from hundreds of thousands to millions. If p is the number of SNPs and n is the sample size, this is a p >> n variable selection problem. To solve this p >> n problem, the common method for GWAS is single marker analysis (SMA). However, since SNPs are highly correlated, SMA identifies true causal SNPs with a high false discovery rate. In addition, SMA does not consider interactions between SNPs. In this dissertation, we propose novel Bayesian variable selection methods, BG2 and IBG3, for non-Gaussian GWAS data. To address the ultra-high dimensionality and the high correlation among SNPs, BG2 and IBG3 proceed in two steps: a screening step and a fine-mapping step. In the screening step, BG2 and IBG3, like SMA, consider only one SNP per model and screen to obtain a subset of the most associated SNPs. In the fine-mapping step, BG2 and IBG3 consider all possible combinations of the screened candidate SNPs to find the best model. The fine-mapping step helps to reduce false positives. In addition, IBG3 iterates these two steps to detect more SNPs with small effect sizes. In simulation studies, we compare our methods with SMA methods and fine-mapping methods. We also compare our methods under different priors for variables, including the nonlocal prior, unit information prior, Zellner g-prior, and Zellner-Siow prior. Our methods are applied to substance use disorder (alcohol consumption and cocaine dependence), human health (breast cancer), and plant science (the number of root-like structures). / Doctor of Philosophy / Genome-wide association studies (GWAS) aim to identify genomic variants for a targeted phenotype, such as a disease or trait. The genomic variants of interest are single nucleotide polymorphisms (SNPs). A SNP is a substitution mutation in the DNA sequence.
GWAS asks which SNPs are associated with the phenotype. However, the number of possible SNPs ranges from hundreds of thousands to millions. The common method for GWAS is called single marker analysis (SMA). SMA considers only one SNP's association with the phenotype at a time. In this way, SMA avoids the problem arising from the large number of SNPs and the small sample size. However, SMA does not consider interactions between SNPs. In addition, SNPs that are close to each other in the DNA sequence may be highly correlated, causing SMA to have a high false discovery rate. To solve these problems, this dissertation proposes two variable selection methods (BG2 and IBG3) for non-Gaussian GWAS data. Compared with SMA methods, BG2 and IBG3 detect true causal SNPs with a low false discovery rate. In addition, IBG3 can detect SNPs with small effect sizes. Our methods are applied to substance use disorder (alcohol consumption and cocaine dependence), human health (breast cancer), and plant science (the number of root-like structures).
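BG2 and IBG3 score models with Bayesian criteria built on the priors named above; the screen-then-fine-map logic itself can be sketched in Python with correlation screening and BIC scoring as crude stand-ins. All names, data, and thresholds below are invented for illustration:

```python
import math
from itertools import combinations

def corr(x, y):
    # Pearson correlation, used here as a single-marker screening score
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def ols_rss(y, cols):
    # Least-squares residual sum of squares via normal equations
    # (tiny Gaussian elimination; fine for a handful of screened SNPs)
    X = [[1.0] + [c[i] for c in cols] for i in range(len(y))]
    k = len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(len(y))) for j in range(k)]
         for i in range(k)]
    b = [sum(X[r][i] * y[r] for r in range(len(y))) for i in range(k)]
    for i in range(k):  # elimination with partial pivoting
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for j in range(i, k):
                A[r][j] -= f * A[i][j]
            b[r] -= f * b[i]
    beta = [0.0] * k
    for i in reversed(range(k)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, k))) / A[i][i]
    return sum((y[r] - sum(X[r][j] * beta[j] for j in range(k))) ** 2
               for r in range(len(y)))

def screen_and_finemap(y, snps, n_screen=3):
    n = len(y)
    # Step 1 (screening): keep the SNPs most correlated with the phenotype
    ranked = sorted(snps, key=lambda s: -abs(corr(snps[s], y)))
    candidates = ranked[:n_screen]
    # Step 2 (fine-mapping): score every subset of the candidates with BIC
    best, best_bic = set(), float("inf")
    for size in range(len(candidates) + 1):
        for subset in combinations(candidates, size):
            if subset:
                rss = ols_rss(y, [snps[s] for s in subset])
            else:
                mu = sum(y) / n
                rss = sum((v - mu) ** 2 for v in y)
            bic = n * math.log(rss / n + 1e-12) + (size + 1) * math.log(n)
            if bic < best_bic:
                best, best_bic = set(subset), bic
    return best
```

Scoring whole subsets in step 2, rather than one SNP at a time, is what lets the approach drop a correlated decoy that single-marker screening would flag.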
386

Statistical estimation of the locations of lightning events

Elhor, Aicha 01 April 2000 (has links)
No description available.
387

Three papers on belief updating and its applications

Chan, Chao Hung 23 May 2024 (has links)
The normative foundation (axioms) of Bayesian belief updating has long been established in the decision science literature. However, psychology and experiments suggest that while rational decision making is ideal, it is rarely achievable for ordinary people. Therefore, it is important to explore the foundations and consequences of rational decision making within the field of economics. This thesis comprises three papers on this topic. In the first paper, I explore the consequences of wishful thinking for mechanism design. It suggests that wishful thinking bias could be profit-generating for mechanism designers. In the second paper, I investigate conservative updating and provide a foundation for it. The main behavioral axiom, "conservative consistency," suggests that decision-makers may only partially incorporate information, particularly when it requires them to revise their previous preferences (the preference order induced by their prior beliefs). In the third paper, I reframe the model selection problem as a rational decision-making problem. The decision-maker is restricted to choosing an advisor to whom to delegate their choices. I explore the conditions under which a rational decision-maker selects models (or advisors) according to Bayes factor criteria. / Doctor of Philosophy / Most of us do not always make decisions completely rationally. This thesis digs into how irrational decision-making fits into economics, with three papers to break it down. The first paper looks at wishful thinking and how it affects our decisions. It suggests that if we understand our biases, we can design better mechanisms to generate profit. The second paper talks about conservative updating, which is all about how we pick and choose what information matters, especially when it clashes with our existing beliefs. Lastly, the third paper explores how we choose advisors to help us make decisions. It looks at when it is smart to pick based on Bayes factor criteria.
Through these papers, this thesis helps us understand how rational decision-making plays out in real-life economics.
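The two basic operations the papers build on, Bayesian belief updating and Bayes factor comparison of candidate "advisors," can be sketched for Bernoulli data (all probabilities below are invented for illustration):

```python
def update(prior, likelihoods):
    # Bayesian belief updating: posterior is proportional to prior x likelihood
    post = [p * l for p, l in zip(prior, likelihoods)]
    z = sum(post)
    return [x / z for x in post]

def bayes_factor(data, model_a, model_b):
    """Ratio of the likelihoods of binary data under two candidate
    'advisors', each committed to a fixed success probability."""
    def lik(p):
        out = 1.0
        for x in data:
            out *= p if x else (1 - p)
        return out
    return lik(model_a) / lik(model_b)
```

A Bayes factor above 1 favors the first advisor; the third paper asks when a rational delegator should select advisors by exactly this criterion.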
388

Safety of Flight Prediction for Small Unmanned Aerial Vehicles Using Dynamic Bayesian Networks

Burns, Meghan Colleen 23 May 2018 (has links)
This thesis compares three variations of the Bayesian network as an aid for decision-making using uncertain information. After reviewing the basic theory underlying probabilistic graphical models and Bayesian estimation, the thesis presents a user-defined static Bayesian network, a static Bayesian network in which the parameter values are learned from data, and a dynamic Bayesian network with learning. As a basis for the comparison, these models are used to provide a prior assessment of the safety of flight of a small unmanned aircraft, taking into consideration the state of the aircraft and weather. The results of the analysis indicate that the dynamic Bayesian network is more effective than the static networks at predicting safety of flight. / Master of Science / This thesis uses probabilities to aid decision-making with uncertain information. It presents three models in the form of networks that use probabilities to aid the assessment of flight safety for a small unmanned aircraft. All three methods are forms of Bayesian networks, graphs that map causal relationships between random variables. Each network models the flight conditions and state of the aircraft; two of the networks are static and one varies with time. The results of the analysis indicate that the dynamic Bayesian network is more effective than the static networks at predicting safety of flight.
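The time-varying model is essentially forward filtering in the simplest dynamic Bayesian network: at each step, the belief over {safe, unsafe} is pushed through a transition model and then conditioned on the new observation. A sketch with invented probabilities (not the thesis's learned parameters):

```python
def forward_filter(prior, transition, emission, observations):
    """Forward filtering in a one-variable dynamic Bayesian network.

    transition[j][i] = P(state_t = i | state_{t-1} = j)
    emission[i][o]   = P(observation = o | state = i)
    """
    belief = list(prior)
    n = len(belief)
    for obs in observations:
        # Time update: propagate the belief through the transition model
        predicted = [sum(transition[j][i] * belief[j] for j in range(n))
                     for i in range(n)]
        # Measurement update: weight by the observation likelihood, renormalize
        post = [emission[i][obs] * predicted[i] for i in range(n)]
        z = sum(post)
        belief = [x / z for x in post]
    return belief
```

A static network scores a single snapshot; iterating this loop is what lets the dynamic version accumulate evidence across time.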
389

Empirical Analysis of User Passwords across Online Services

Wang, Chun 05 June 2018 (has links)
Leaked passwords from data breaches can pose a serious threat if users reuse or slightly modify the passwords for other services. With more and more online services getting breached today, there is still a lack of large-scale quantitative understanding of the risks of password reuse and modification. In this project, we perform the first large-scale empirical analysis of password reuse and modification patterns using a ground-truth dataset of 28.8 million users and their 61.5 million passwords in 107 services over 8 years. We find that password reuse and modification is a very common behavior (observed on 52% of the users). More surprisingly, sensitive online services such as shopping websites and email services received the most reused and modified passwords. We also observe that users would still reuse the already-leaked passwords for other online services for years after the initial data breach. Finally, to quantify the security risks, we develop a new training-based guessing algorithm. Extensive evaluations show that more than 16 million password pairs (30% of the modified passwords and all the reused passwords) can be cracked within just 10 guesses. We argue that more proactive mechanisms are needed to protect user accounts after major data breaches. / Master of Science / Since most of the internet services use text-based passwords for user authentication, the leaked passwords from data breaches pose a serious threat, especially if users reuse or slightly modify the passwords for other services. The attacker can leverage a known password from one site to guess the same user’s passwords at other sites more easily. In this project, we perform the first large-scale study of password usage based on the largest ever leaked password dataset. The dataset consists of 28.8 million users and their 61.5 million passwords from 107 internet services over 8 years. We find that password reuse and modification is a very common behavior (observed on 52% of the users). 
More surprisingly, we find that sensitive online services such as shopping websites and email services received the most reused and modified passwords. In addition, users would still reuse the already-leaked passwords for other online services for years after the initial data breach. Finally, we develop a cross-site password-guessing algorithm to guess the modified passwords based on one of the user’s leaked passwords. Our password guessing experiments show that 30% of the modified passwords can be cracked within only 10 guesses. Therefore, we argue that more proactive mechanisms are needed to protect user accounts after major data breaches.
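The reuse/modification classification can be sketched with Levenshtein edit distance; the threshold and example passwords below are illustrative, not the trained guessing algorithm from the thesis:

```python
def edit_distance(a, b):
    # Levenshtein distance via dynamic programming, one row at a time
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def classify_pair(pw1, pw2, threshold=3):
    """Label one user's password pair across two services."""
    if pw1 == pw2:
        return "reused"
    if edit_distance(pw1, pw2) <= threshold:
        return "modified"
    return "distinct"
```

A cross-site guesser exploits exactly the "modified" class: small edits of a leaked password cover a large share of a user's other accounts.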
390

Resolving multisensory conflict: a strategy for balancing the costs and benefits of audio-visual integration.

Roach, N.W., Heron, James, McGraw, Paul V. January 2006 (has links)
In order to maintain a coherent, unified percept of the external environment, the brain must continuously combine information encoded by our different sensory systems. Contemporary models suggest that multisensory integration produces a weighted average of sensory estimates, where the contribution of each system to the ultimate multisensory percept is governed by the relative reliability of the information it provides (maximum-likelihood estimation). In the present study, we investigate interactions between auditory and visual rate perception, where observers are required to make judgments in one modality while ignoring conflicting rate information presented in the other. We show a gradual transition between partial cue integration and complete cue segregation with increasing inter-modal discrepancy that is inconsistent with mandatory implementation of maximum-likelihood estimation. To explain these findings, we implement a simple Bayesian model of integration that is also able to predict observer performance with novel stimuli. The model assumes that the brain takes into account prior knowledge about the correspondence between auditory and visual rate signals, when determining the degree of integration to implement. This provides a strategy for balancing the benefits accrued by integrating sensory estimates arising from a common source, against the costs of conflating information relating to independent objects or events.
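The maximum-likelihood baseline that the study argues against as mandatory weights each cue by its reliability, i.e. inverse variance; a sketch with invented estimates:

```python
def fuse_cues(x_a, var_a, x_v, var_v):
    """Maximum-likelihood fusion of two noisy estimates.

    Each cue is weighted by its reliability (inverse variance); the fused
    estimate is the reliability-weighted average, and its variance is
    smaller than either input's.
    """
    w_a = 1.0 / var_a
    w_v = 1.0 / var_v
    x = (w_a * x_a + w_v * x_v) / (w_a + w_v)
    var = 1.0 / (w_a + w_v)
    return x, var
```

The Bayesian model in the paper goes beyond this formula, down-weighting integration as the audio-visual discrepancy grows relative to prior knowledge of cue correspondence.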
