  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Pamokų tvarkaraščio optimizavimas profiliuotoms mokykloms / Optimization of profiled school schedule

Norkus, Aurimas 25 May 2005 (has links)
This work implements three algorithms: lesson permutation, lesson permutation with simulated annealing (SA) adjustment, and lesson permutation using the Bayesian approach to optimize the SA parameters. The algorithms and the graphical user interface are programmed in JSP, which is based on the Java object-oriented programming language. To evaluate schedule quality, the algorithms compute penalty points assigned for various inconveniences. The user can define how many penalty points are given when a particular inconvenience occurs, and can also set the stochastic algorithm parameters. A literature review examines the use of simulated annealing and the Bayesian approach in other stochastic algorithms and in their various combinations. The profiled school schedule optimization algorithm is based on the SA search methodology: searching for the optimum through lower-quality solutions, using a convergent temperature function and differences in solution quality. The algorithm using the Bayesian approach was created to improve the SA search methodology. By changing the system temperature or the annealing speed through the parameters, the user can strongly influence SA behaviour. When parameters are instead passed to the algorithm with the Bayesian approach, their influence on behaviour is weaker, because this method predicts the parameters with which SA should work effectively and adjusts them itself. Experiments with the three stochastic algorithms were conducted... [to full text]
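The SA acceptance rule and geometric cooling described above can be sketched as follows (an illustrative stand-in, not the thesis's JSP implementation; the function and parameter names are invented):

```python
import math
import random

def anneal(schedule, penalty, neighbor, t0=10.0, cooling=0.995, steps=2000, seed=0):
    """Minimize a penalty function by simulated annealing (sketch).

    `schedule` is any state; `penalty(s)` returns its total penalty
    points; `neighbor(s)` returns a permuted variant of s (for a
    timetable, a random lesson swap).
    """
    rng = random.Random(seed)
    best = cur = schedule
    t = t0
    for _ in range(steps):
        cand = neighbor(cur)
        delta = penalty(cand) - penalty(cur)
        # Accept a worse solution with probability exp(-delta/t):
        # this is how SA moves through lower-quality solutions.
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            cur = cand
        if penalty(cur) < penalty(best):
            best = cur
        t *= cooling  # geometric cooling: temperature converges to 0
    return best
```

The cooling rate plays the role of the "annealing speed" parameter mentioned in the abstract: closer to 1 means a slower, more thorough search.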
2

Vortex Detection in CFD Datasets Using a Multi-Model Ensemble Approach

Bassou, Randa 09 December 2016 (has links)
Over the past few decades, visualization and application researchers have investigated vortices and developed several algorithms for detecting vortex-like structures in flow fields. These techniques can adequately identify vortices in most computational datasets, each with its own degree of accuracy. Despite these efforts, however, there still exists no entirely reliable vortex detection method that does not require significant user intervention. The objective of this research is to address this problem by introducing a novel vortex analysis technique that provides more accurate results by optimizing the thresholds of several computationally efficient, local vortex detectors, and then merging them using a Bayesian method into a more robust detector that assimilates global domain knowledge from labeling performed by an expert. Results show that when the thresholds are chosen well, combining the methods does not improve accuracy, whereas if the thresholds are chosen poorly, combining the methods produces significant improvement.
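The fusion step can be illustrated with a naive-Bayes combination of thresholded binary detector outputs trained on expert labels (a simplified sketch under a conditional-independence assumption; the thesis's exact formulation may differ):

```python
def fuse(votes, labels, query, alpha=1.0):
    """Naive-Bayes fusion of binary vortex detectors (sketch).

    votes:  per-point detector outputs, each a tuple of 0/1 votes
    labels: expert 0/1 label per point (1 = vortex)
    query:  detector outputs for a new point
    Returns P(vortex | query), with Laplace smoothing `alpha`.
    """
    n = len(labels)
    by_class = {c: [v for v, y in zip(votes, labels) if y == c] for c in (0, 1)}
    post = {}
    for c, rows in by_class.items():
        p = (len(rows) + alpha) / (n + 2 * alpha)  # smoothed class prior
        for j, q in enumerate(query):
            hits = sum(1 for r in rows if r[j] == q)
            p *= (hits + alpha) / (len(rows) + 2 * alpha)
        post[c] = p
    return post[1] / (post[0] + post[1])
```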
3

A Bayesian inversion framework for subsurface seismic imaging problems

Urozayev, Dias 11 1900 (has links)
This thesis considers the reconstruction of subsurface models from seismic observations, a well-known high-dimensional and ill-posed problem. As a first regularization, the parameter space is reduced using a truncated Discrete Cosine Transform (DCT), which helps regularize the seismic inverse problem and alleviates its computational complexity. A second regularization based on Laplace priors, as a way of accounting for sparsity in the model, is further proposed to enhance the reconstruction quality. More specifically, two Laplace-based penalizations are applied: one on the DCT coefficients and another on the spatial variations of the subsurface model, which leads to an enhanced representation of the cross-correlations of the DCT coefficients. The Laplace priors are represented in hierarchical forms that are suitable for deriving efficient inversion schemes. The corresponding inverse problem, formulated within a Bayesian framework, lies in computing the joint posteriors of the target model parameters and the hyperparameters of the introduced priors. This joint posterior is approximated using the Variational Bayesian (VB) approach with a separable form of marginals under minimization of the Kullback-Leibler divergence criterion. In contrast with classical deterministic optimization methods, the VB approach provides an efficient means of obtaining not only point estimates but also closed forms of the posterior probability distributions of the quantities of interest. The case in which the observations are contaminated with outliers is further considered; for that case, a robust inversion scheme is proposed based on a Student-t prior for the observation noise. The proposed approaches are applied to successfully reconstruct the subsurface acoustic impedance model of the Volve oilfield.
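The truncated-DCT parameterization works as in the following sketch (a plain-Python orthonormal DCT-II for illustration; a real inversion code would use an optimized transform):

```python
import math

def dct(x):
    """Orthonormal DCT-II of a 1-D signal."""
    n = len(x)
    out = []
    for k in range(n):
        s = sum(x[i] * math.cos(math.pi * (i + 0.5) * k / n) for i in range(n))
        scale = math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)
        out.append(scale * s)
    return out

def idct_truncated(c, n, keep):
    """Reconstruct a length-n signal from the first `keep` DCT
    coefficients only -- the dimensionality reduction used as the
    first regularization."""
    x = []
    for i in range(n):
        s = 0.0
        for k in range(keep):
            scale = math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)
            s += scale * c[k] * math.cos(math.pi * (i + 0.5) * k / n)
        x.append(s)
    return x
```

With `keep` well below `n`, smooth subsurface profiles are represented by far fewer unknowns, which is what tames the dimensionality of the inverse problem.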
4

Risk Based Maintenance Optimization using Probabilistic Maintenance Quantification Models of Circuit Breaker

Natti, Satish 14 January 2010 (has links)
New maintenance techniques for circuit breakers are studied in this dissertation by proposing a probabilistic maintenance model and a new methodology for assessing circuit breaker condition from its control circuit data. A risk-based decision approach that uses the new methodology is proposed at the system level for optimizing maintenance schedules and the allocation of resources. The dissertation focuses on developing optimal maintenance strategies for circuit breakers at both the component and the system level. A probabilistic maintenance model is proposed using an approach similar to one recently introduced for power transformers. Probabilistic models give better insight into the interplay among the monitoring techniques, failure modes, and maintenance techniques of a component. The model represents the component lifetime as several deterioration stages; inspection and maintenance are introduced at each stage, and the model parameters are defined. A sensitivity analysis is carried out to understand the importance of the model parameters in obtaining optimal maintenance strategies. The analysis covers the inspection rate calculated for each stage and its impact on failure probability, inspection cost, maintenance cost, and failure cost. This maintenance model is best suited for long-term maintenance planning. All simulations are carried out in MATLAB, and how the analysis results may be used to achieve optimal maintenance schedules is discussed. A new methodology is proposed to convert data from the control circuit of a breaker into a condition assessment of the breaker by defining performance indices for the breaker assemblies. Control circuit signal timings are extracted, and a probability distribution is fitted to each timing parameter. Performance indices for assemblies such as the trip coil, close coil, and auxiliary contacts are defined based on these probability distributions.
These indices are updated using a Bayesian approach as new data arrive. The process can be made practical by approximating the Bayesian approach and calculating the indices on-line. Maintenance is quantified by computing the indices after a maintenance action and comparing them with the previously estimated values. A risk-based decision approach to maintenance planning is proposed based on the new methodology developed for maintenance quantification. A list of events is identified for the test system under consideration, and the event probability, event consequence, and hence the risk associated with each event are computed. Optimal maintenance decisions are made based on the computed risk levels for each event. Two case studies are presented to evaluate the performance of the proposed methodology for maintenance quantification. The risk-based decision approach is tested on the IEEE Reliability Test System. All simulations are carried out in MATLAB, and a discussion of the results is provided.
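The Bayesian updating of a fitted timing distribution can be illustrated with a conjugate Normal-Normal update for a single timing parameter (an illustrative sketch assuming a known observation variance; the dissertation's indices and distributions are more elaborate):

```python
def normal_update(mu0, tau0_sq, sigma_sq, data):
    """Conjugate Normal-Normal update (known observation variance).

    mu0, tau0_sq: prior mean and variance of the timing parameter
                  (e.g. trip-coil operating time)
    sigma_sq:     known measurement variance
    data:         newly arrived timing measurements
    Returns the posterior mean and variance.
    """
    n = len(data)
    xbar = sum(data) / n
    # Precision (1/variance) adds; the posterior mean is a
    # precision-weighted blend of prior mean and sample mean.
    post_var = 1.0 / (1.0 / tau0_sq + n / sigma_sq)
    post_mu = post_var * (mu0 / tau0_sq + n * xbar / sigma_sq)
    return post_mu, post_var
```

Because the posterior has the same form as the prior, each batch of new control-circuit data can be folded in cheaply, which is the kind of approximation that makes on-line index calculation practical.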
5

Evaluating the Teacher-Intern-Professor Model in a Professional Development School Partnership Setting using a Bayesian Approach to Mix Methods

Ogletree, August Elena 24 September 2009 (has links)
Two needs of Georgia State University Professional Development School Partnerships are to show increases in both student academic achievement and teacher efficacy. The Teacher-Intern-Professor (TIP) Model was designed to address these needs. The TIP Model focuses on using the university-school partnership to support the preparedness of Georgia State University student interns and the academic achievement of students participating in the program. TIP Model outcomes were analyzed using a quasi-experimental design for the achievement data and a Bayesian approach to mixing methods for the efficacy data. Quantitative data, in the form of test scores, were analyzed to compare mean student academic achievement at the classroom level. Mean differences between treatment and comparison groups were not significant for the TIP treatment factor (F(1, 60) = .248, p = .620) as measured by a benchmark test. Results favored the treatment group over the control group for the TIP treatment factor (F(1, 56) = 17.967, p < .001) on a geometry test. A methodological contribution is the exploration and development of an approach to mixing methods that uses Bayesian statistics to combine quantitative and qualitative data. Bayesian statistics allows the researcher's prior belief to be incorporated into the data analysis. Narrative inquiry was the qualitative framework employed to understand the participants' qualitative data, thus providing a particular way of eliciting prior belief. More specifically, a content analysis of the qualitative data, which included interviews, observations, and artifacts, was used in conjunction with quantitative historical data to elicit prior beliefs. The Bayesian approach combined prior beliefs from the teacher efficacy qualitative data with the quantitative data from Gibson and Dembo's Teacher Efficacy Scale to obtain posterior distributions, which summarized beliefs for the themes of teacher efficacy and personal efficacy.
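The prior-plus-data mechanics behind this mixing can be illustrated with a conjugate Beta-Binomial update (a toy stand-in: the study elicits its prior from qualitative narrative data and uses efficacy-scale scores, not the binary outcomes and hand-picked prior assumed here):

```python
def beta_update(a, b, successes, failures):
    """Beta-Binomial conjugate update.

    (a, b) encode the prior belief -- here standing in for a belief
    elicited from qualitative data -- and the counts come from the
    quantitative instrument. Returns the posterior and its mean.
    """
    a_post, b_post = a + successes, b + failures
    return a_post, b_post, a_post / (a_post + b_post)
```

The posterior mean sits between the prior mean and the observed proportion, which is exactly the sense in which the Bayesian approach "mixes" the two data sources.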
6

RESPONSE ADAPTIVE CLINICAL TRIALS WITH CENSORED LIFETIMES

2013 October 1900 (has links)
We constructed a response adaptive clinical trial that treats patients sequentially in order to maximize the total survival time of all patients. Response adaptive designs are typically based on urn models or on sequential estimation procedures, but this dissertation uses a bandit process. The objective of a bandit process is to optimize a measure of sequential selections from several treatments. Each treatment consists of a sequence of conditionally independent and identically distributed random variables, and some of these treatments have unknown distribution functions. For the purposes of this clinical trial, we focus on a bandit process with delayed responses. These responses are lifetime variables that may be censored when observed. Following a Bayesian approach and a dynamic programming technique, we formulated a controlled stochastic dynamic model. In addition, we used an example to illustrate a possible application of the main results, and "R" to implement a model simulation.
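A simplified flavor of the Bayesian treatment-selection step, using Thompson sampling with exponential lifetimes and Gamma priors as a stand-in for the dissertation's dynamic-programming solution (note how right-censored lifetimes add observation time but no event, which the conjugacy handles naturally):

```python
import random

def thompson_exponential(history, rng):
    """Pick the treatment with the larger sampled mean lifetime.

    history: {arm: (events, total_time)} where each censored lifetime
    contributes to total_time but not to events. With an Exp(lam)
    lifetime model and a Gamma(1, 1) prior on the rate lam, the
    posterior is Gamma(1 + events, rate 1 + total_time).
    """
    best, best_mean = None, -1.0
    for arm, (events, total) in history.items():
        # random.gammavariate takes (shape, scale); scale = 1/rate.
        lam = rng.gammavariate(1.0 + events, 1.0 / (1.0 + total))
        mean_lifetime = 1.0 / lam
        if mean_lifetime > best_mean:
            best, best_mean = arm, mean_lifetime
    return best
```

Sampling from the posterior, rather than taking its mode, keeps some exploration alive while still favoring the treatment that currently looks better.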
7

Genes de efeito principal e locos de características quantitativas (QTL) em suínos / Major genes and quantitative trait loci (QTL) in pigs

Gonçalves, Tarcísio de Moraes [UNESP] January 2003 (has links) (PDF)
A Bayesian marker-free segregation analysis was applied to search for evidence of major genes affecting two carcass traits, intramuscular fat in % (IMF) and backfat thickness in mm (BF), and one growth trait, liveweight gain from approximately 25 to 90 kg liveweight, in g/day (LG). The study used data from 1,257 animals from an experimental cross between Meishan (Chinese breed) boars and sows from Dutch Large White and Landrace lines. In animal breeding, finite polygenic models (FPM) may be an alternative to the infinitesimal polygenic model (IPM) for the genetic evaluation of quantitative traits in complex, multiple-generation pedigrees. FPM, IPM, and FPM combined with IPM were empirically tested for estimating variance components and the number of genes in the FPM. Marginal posterior means of variance components and parameters were estimated with Markov chain Monte Carlo (MCMC) techniques, using the Gibbs sampler and the reversible-jump sampler (Metropolis-Hastings). The results showed evidence for four major genes (MG): two for IMF and two for BF. For BF, the MG explained almost all of the genetic variance, while for IMF the MG significantly reduced the polygenic variance. For LG, no influence of an MG could be established. The polygenic heritability estimates for IMF, BF, and LG were 0.37, 0.24, and 0.37, respectively. The Bayesian methodology was satisfactorily implemented in the software package FlexQTL™. Future molecular genetic research based on the same experimental data, using molecular markers to map the major genes affecting mainly IMF and BF, has a high probability of success.
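A minimal Gibbs sampler illustrates the MCMC machinery underlying this kind of analysis (a toy bivariate-normal target for clarity; the thesis itself samples variance components and, via reversible jump, the number of major genes):

```python
import random

def gibbs_bivariate_normal(rho, n, burn=500, seed=1):
    """Gibbs sampling from a standard bivariate normal with
    correlation rho, by alternating the full conditionals
    x | y ~ N(rho*y, 1 - rho^2) and y | x ~ N(rho*x, 1 - rho^2)."""
    rng = random.Random(seed)
    x = y = 0.0
    s = (1.0 - rho * rho) ** 0.5
    out = []
    for i in range(n + burn):
        x = rng.gauss(rho * y, s)
        y = rng.gauss(rho * x, s)
        if i >= burn:  # discard burn-in draws
            out.append((x, y))
    return out
```

Each full-conditional draw is cheap, and the chain's long-run draws approximate the joint posterior; the same alternating logic extends to variance components in a genetic model.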
8

Radar Target Tracking with Varying Levels of Communications Interference for Shared Spectrum Access

January 2015 (has links)
As the demand for spectrum sharing between radar and communications systems steadily increases, coexistence between the two systems is a growing and very challenging problem. Radar tracking in the presence of strong communications interference can result in a low probability of detection, even when sequential Monte Carlo tracking methods such as the particle filter (PF), which better match the target kinematic model, are used. In particular, tracking performance can fluctuate because the power level of the communications interference can vary dynamically and unpredictably. This work proposes integrating the interacting multiple model (IMM) selection approach with the PF tracker to allow for dynamic variations in the power spectral density of the communications interference. The model switching allows the necessary transitions between different communications interference power spectral density (CI-PSD) values in order to reduce prediction errors. Simulations demonstrate the high performance of the integrated approach with as many as six dynamic CI-PSD value changes during the target track. For low signal-to-interference-plus-noise ratios (SINR), a derivation for estimating the high power levels of the communications interference is provided; the estimated power levels would be used dynamically in the IMM when integrated with a track-before-detect filter better matched to low-SINR tracking applications. / Dissertation/Thesis / Masters Thesis Electrical Engineering 2015
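The model-probability recursion at the heart of IMM switching can be sketched as follows (notation illustrative; here the models would index assumed CI-PSD levels):

```python
def imm_probs(mu, trans, likelihoods):
    """One IMM model-probability update.

    mu:          current model probabilities
    trans:       Markov transition matrix, trans[i][j] = P(j | i)
    likelihoods: each model's likelihood of the new measurement
    Returns the updated model probabilities.
    """
    m = len(mu)
    # Mixing step: propagate probabilities through the Markov chain.
    pred = [sum(trans[i][j] * mu[i] for i in range(m)) for j in range(m)]
    # Reweight by how well each model explains the measurement.
    post = [pred[j] * likelihoods[j] for j in range(m)]
    z = sum(post)
    return [p / z for p in post]
```

When the interference power jumps, the model whose assumed CI-PSD matches the new level earns a larger likelihood and its probability grows, which is what drives the switching described above.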
9

Application of Bayesian Methods to Structural Models and Stochastic Frontier Production Models

January 2014 (has links)
This dissertation applies the Bayesian approach as a method to improve the estimation efficiency of existing econometric tools. The first chapter proposes the Continuous Choice Bayesian (CCB) estimator, which combines the Bayesian approach with the Continuous Choice (CC) estimator suggested by Imai and Keane (2004). Using a simulation study, I provide two important findings. First, the CC estimator clearly has better finite sample properties than a frequently used Discrete Choice (DC) estimator. Second, the CCB estimator has better estimation efficiency when the data size is relatively small, and it retains the advantage of the CC estimator over the DC estimator. The second chapter estimates baseball's managerial efficiency using a stochastic frontier function with the Bayesian approach. When applying a stochastic frontier model to baseball panel data, the difficulty is that the dataset often covers a small number of periods, which results in large estimation variance. To overcome this problem, I apply the Bayesian approach to stochastic frontier analysis and compare the confidence interval of the efficiencies from the Bayesian estimator with the classical frequentist confidence interval. Simulation results show that the Bayesian approach achieves smaller estimation variance without losing any reliability in point estimation. I then apply Bayesian stochastic frontier analysis to answer some interesting questions in baseball. / Dissertation/Thesis / Doctoral Dissertation Economics 2014
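The likelihood that either a Bayesian or a classical stochastic frontier analysis builds on can be written in closed form; the sketch below implements the standard normal/half-normal (Aigner-Lovell-Schmidt) density, not necessarily the chapter's exact specification:

```python
import math

def norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def sf_loglik(residuals, sigma, lam):
    """Log-likelihood of the normal/half-normal stochastic frontier.

    residuals: e = y - x'b, the composed error v - u
    sigma:     sqrt(sigma_v^2 + sigma_u^2)
    lam:       sigma_u / sigma_v (lam = 0 recovers the plain normal)
    Density per observation: (2/sigma) * phi(e/sigma) * Phi(-e*lam/sigma).
    """
    ll = 0.0
    for e in residuals:
        ll += (math.log(2.0 / sigma)
               + math.log(norm_pdf(e / sigma))
               + math.log(norm_cdf(-e * lam / sigma)))
    return ll
```

A Bayesian treatment places priors on `sigma`, `lam`, and the frontier coefficients and samples this likelihood, which is how the smaller-variance interval estimates for short panels arise.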
10

Influence of the Estimator Selection in Scalloped Hammerhead Shark Stock Assessment

Ballesta Artero, Irene Maria 13 January 2014 (has links)
In the natural sciences, the frequentist paradigm has long dominated statistical practice; however, the Bayesian approach has been gaining strength in recent decades. Our study assessed the scalloped hammerhead shark population in the western North Atlantic Ocean using Bayesian methods. This approach allowed us to incorporate diverse types of error in the surplus production model and to compare the influence of different statistical estimators on the values of the key parameters (r, growth rate; K, carrying capacity; depletion; FMSY, the fishing level that would sustain maximum yield; and NMSY, abundance at maximum sustainable yield). Furthermore, we considered multi-level priors because of the variety of published results on the population growth rate of this species. Our research showed that estimator selection influences the results of the surplus production model and, therefore, the values of the target management points. Based on the key parameter estimates with uncertainty and the Deviance Information Criterion, we suggest that state-space Bayesian models be used for assessing the scalloped hammerhead shark and other fish stocks with poor data. This study found the population was overfished and experiencing overfishing. Therefore, based on our research and the very low evidence of recovery in the most recent data available, we suggest prohibiting fishing for this species because: (1) it is highly depleted (14% of its initial population); (2) the fishery status is very unstable over time; (3) it has a low reproductive rate, contributing to a higher risk of overexploitation; and (4) it is easily misidentified among different hammerhead sharks (smooth, great, scalloped, and cryptic species). / Master of Science
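The deterministic core of a Schaefer surplus production model looks like the following sketch (in the assessment these dynamics sit inside a Bayesian state-space model with process and observation error; the parameter values used here are invented):

```python
def schaefer(b0, r, k, catches):
    """Schaefer surplus production dynamics:
    B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t].
    Under this model MSY = r*K/4, taken at biomass K/2.
    Returns the biomass trajectory, floored at a tiny positive value.
    """
    b = [b0]
    for c in catches:
        nxt = b[-1] + r * b[-1] * (1.0 - b[-1] / k) - c
        b.append(max(nxt, 1e-9))
    return b
```

Depletion is then the final biomass relative to K (the abstract's 14% figure is this ratio), and the management points FMSY and NMSY follow directly from r and K, which is why the estimator chosen for those parameters matters so much.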
