About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Designing computer experiments to estimate integrated response functions

Marin, Ofelia, January 2005 (has links)
Thesis (Ph. D.)--Ohio State University, 2005. / Title from first page of PDF file. Includes bibliographical references (p. 115-117).
2

Robust and adaptive sampled data I-control

Ozdemir, Necati January 2000 (has links)
No description available.
3

Performance evaluation in Bayesian adaptive randomization.

Wang, Degang. Lee, Jack J., Fu, Yunxin, Lai, Dajian, Boerwinkle, Eric, January 2008 (has links)
Source: Masters Abstracts International, Volume: 47-03, page: 1686. Advisers: Jack J. Lee; Yunxin Fu. Includes bibliographical references.
4

Sequential Optimal Recovery: A Paradigm for Active Learning

Niyogi, Partha 12 May 1995 (has links)
In most classical frameworks for learning from examples, it is assumed that examples are randomly drawn and presented to the learner. In this paper, we consider the possibility of a more active learner who is allowed to choose his/her own examples. Our investigations are carried out in a function approximation setting. In particular, using arguments from optimal recovery (Micchelli and Rivlin, 1976), we develop an adaptive sampling strategy (equivalent to adaptive approximation) for arbitrary approximation schemes. We provide a general formulation of the problem and show how it can be regarded as sequential optimal recovery. We demonstrate the application of this general formulation to two special cases of functions on the real line: 1) monotonically increasing functions and 2) functions with bounded derivative. An extensive investigation of the sample complexity of approximating these functions is conducted, yielding both theoretical and empirical results on test functions. Our theoretical results (stated in PAC-style), along with the simulations, demonstrate the superiority of our active scheme over both passive learning and classical optimal recovery. The analysis of active function approximation is conducted in a worst-case setting, in contrast with other Bayesian paradigms obtained from optimal design (MacKay, 1992).
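For the first special case, the active strategy the abstract describes has a particularly simple flavor: given samples of a monotone increasing function, every interval between adjacent samples carries a worst-case uncertainty rectangle whose area is the interval width times the rise between the bracketing values, and a natural active learner repeatedly queries the midpoint of the interval with the largest such area. The sketch below illustrates that idea; it is an illustration of the optimal-recovery style of reasoning, not a reproduction of the paper's exact algorithm, and the function names are ours.

```python
import numpy as np

def active_sample_monotone(f, a=0.0, b=1.0, n_queries=20):
    """Actively sample a monotone increasing function f on [a, b].

    At every step, query the midpoint of the interval whose worst-case
    uncertainty rectangle (width times rise between the bracketing samples)
    is largest -- a simple optimal-recovery-style heuristic.
    """
    xs, ys = [a, b], [f(a), f(b)]
    for _ in range(n_queries):
        # Worst-case uncertainty of each interval between adjacent samples.
        areas = [(xs[i + 1] - xs[i]) * (ys[i + 1] - ys[i]) for i in range(len(xs) - 1)]
        i = int(np.argmax(areas))
        x_new = 0.5 * (xs[i] + xs[i + 1])
        xs.insert(i + 1, x_new)
        ys.insert(i + 1, f(x_new))
    return np.array(xs), np.array(ys)

# Toy usage: samples cluster where a monotone test function rises fastest.
xs, ys = active_sample_monotone(lambda x: x ** 3, n_queries=10)
print(xs)
```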
5

Implementation of an adaptive importance sampling technique in MCNP for monoenergetic slab problems

Mosher, Scott William 05 1900 (has links)
No description available.
6

An analysis of the adaptive cluster sampling design with rare plant point distributions /

Tout, Jeremy. January 1900 (has links)
Thesis (M.A.)--Humboldt State University, 2009. / Includes bibliographical references (leaves 29-31). Also available via Humboldt Digital Scholar.
7

Adaptive Sampling Line Search for Simulation Optimization

Ragavan, Prasanna Kumar 08 March 2017 (has links)
This thesis is concerned with the development of algorithms for simulation optimization (SO), a special case of stochastic optimization where the objective function can only be evaluated through noisy observations from a simulation. Deterministic techniques, when applied directly to simulation optimization problems, fail to converge because of their inability to handle randomness, so more sophisticated algorithms are required. However, many existing algorithms dedicated to simulation optimization often perform poorly in practice because they require extensive parameter tuning. To overcome these shortfalls of existing SO algorithms, we develop ADALINE, a line-search-based algorithm that eliminates the need for any user-defined parameters. ADALINE is designed to identify a local minimum on continuous and integer-ordered feasible sets. On a continuous feasible set, ADALINE mimics deterministic line search algorithms, while on integer-ordered feasible sets it iterates between a line search and an enumeration procedure in its quest to identify a local minimum. ADALINE improves upon many of the existing SO algorithms by determining the sample size adaptively as a trade-off between the error due to estimation and the optimization error; that is, the algorithm expends simulation effort in proportion to the quality of the incumbent solution. We also show that ADALINE converges "almost surely" to the set of local minima. Finally, our numerical results suggest that ADALINE converges to a local minimum faster, outperforming other advanced SO algorithms that utilize variable sampling strategies. To demonstrate the performance of our algorithm on a practical problem, we apply ADALINE to a surgery rescheduling problem. In the rescheduling problem, the objective is to minimize the cost of disruptions to an existing schedule shared between multiple surgical specialties while accommodating semi-urgent surgeries that require expedited intervention. The disruptions to the schedule are determined using a threshold-based heuristic, and ADALINE identifies the threshold levels for the various surgical specialties that minimize the expected total cost of disruption. A comparison of the solutions obtained using a Sample Average Approximation (SAA) approach and ADALINE is provided. We find that the adaptive sampling strategy in ADALINE identifies a better solution more quickly than SAA. / Ph. D.
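The central idea of spending simulation effort in proportion to the precision the search currently needs can be sketched independently of ADALINE's full line-search machinery. The fragment below is a generic adaptive sample-size rule, not the thesis's actual criterion: replications at a point are added until the standard error of the sample mean drops below a tolerance that a caller would tighten as the search closes in on a minimum. The names (`adaptive_estimate`, `simulate`, `tol`) are illustrative.

```python
import numpy as np

def adaptive_estimate(simulate, x, tol, n0=10, n_max=10_000):
    """Estimate the objective at x from noisy simulation replications.

    Replications are added until the standard error of the sample mean falls
    below `tol`, so estimation effort scales with the precision the search
    currently needs (a stand-in for an adaptive sample-size rule, not
    ADALINE's exact criterion).
    """
    samples = [simulate(x) for _ in range(n0)]
    while len(samples) < n_max:
        std_err = np.std(samples, ddof=1) / np.sqrt(len(samples))
        if std_err <= tol:
            break
        samples.append(simulate(x))
    return float(np.mean(samples)), len(samples)

# Noisy quadratic: the true objective is x**2, observed with unit-variance noise.
rng = np.random.default_rng(0)
estimate, n_used = adaptive_estimate(lambda x: x ** 2 + rng.normal(), x=1.5, tol=0.05)
print(estimate, n_used)
```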
8

An efficient approach for high-fidelity modeling incorporating contour-based sampling and uncertainty

Crowley, Daniel R. 13 January 2014 (has links)
During the design process for an aerospace vehicle, decision-makers must have an accurate understanding of how each choice will affect the vehicle and its performance. This understanding is based on experiments and, increasingly often, computer models. In general, as a computer model captures a greater number of phenomena, its results become more accurate for a broader range of problems. This improved accuracy typically comes at the cost of significantly increased computational expense per analysis. Although rapid analysis tools have been developed that are sufficient for many design efforts, those tools may not be accurate enough for revolutionary concepts subject to grueling flight conditions such as transonic or supersonic flight and extreme angles of attack. At such conditions, the simplifying assumptions of the rapid tools no longer hold. Accurate analysis of such concepts would require models that do not make those simplifying assumptions, with the corresponding increases in computational effort per analysis. As computational costs rise, exploration of the design space can become exceedingly expensive. If this expense cannot be reduced, decision-makers would be forced to choose between a thorough exploration of the design space using inaccurate models, or the analysis of a sparse set of options using accurate models. This problem is exacerbated as the number of free parameters increases, limiting the number of trades that can be investigated in a given time. In the face of limited resources, it can become critically important that only the most useful experiments be performed, which raises multiple questions: how can the most useful experiments be identified, and how can experimental results be used in the most effective manner? This research effort focuses on identifying and applying techniques which could address these questions. The demonstration problem for this effort was the modeling of a reusable booster vehicle, which would be subject to a wide range of flight conditions while returning to its launch site after staging. Contour-based sampling, an adaptive sampling technique, seeks cases that will improve the prediction accuracy of surrogate models for particular ranges of the responses of interest. In the case of the reusable booster, contour-based sampling was used to emphasize configurations with small pitching moments; the broad design space included many configurations which produced uncontrollable aerodynamic moments for at least one flight condition. By emphasizing designs that were likely to trim over the entire trajectory, contour-based sampling improves the predictive accuracy of surrogate models for such designs while minimizing the number of analyses required. The simplified models mentioned above, although less accurate for extreme flight conditions, can still be useful for analyzing performance at more common flight conditions. The simplified models may also offer insight into trends in the response behavior. Data from these simplified models can be combined with more accurate results to produce useful surrogate models with better accuracy than the simplified models but at less cost than if only expensive analyses were used. Of the data fusion techniques evaluated, Ghoreyshi cokriging was found to be the most effective for the problem at hand. Lastly, uncertainty present in the data was found to negatively affect predictive accuracy of surrogate models. Most surrogate modeling techniques neglect uncertainty in the data and treat all cases as deterministic. 
This is plausible, especially for data produced by computer analyses, which are assumed to be perfectly repeatable and thus truly deterministic. However, a number of sources of uncertainty, such as solver iteration or surrogate model prediction accuracy, can introduce noise into the data. If these sources of uncertainty could be captured and incorporated when surrogate models are trained, the resulting surrogate models would be less susceptible to that noise and correspondingly have better predictive accuracy. This was accomplished in the present effort by capturing the uncertainty information via nuggets added to the Kriging model. By combining these techniques, surrogate models could be created that exhibited better predictive accuracy while only the most informative experiments were selected. This significantly reduced the computational effort expended compared to a more standard approach using space-filling samples and data from a single source. The relative contributions of each technique were identified, and observations were made pertaining to the most effective way to apply the separate and combined methods.
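As a rough illustration of the two ingredients highlighted above, contour-based sampling and a nugget that absorbs noise, the sketch below scores candidate designs by the surrogate's probability of landing inside a narrow response band (standing in for "near-zero pitching moment") and fits the surrogate with an explicit noise term. It uses scikit-learn's Gaussian process regressor, where a WhiteKernel plays the role of the kriging nugget; the data, band limits, and function names are invented for the example and are not taken from the thesis.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def contour_score(gp, candidates, lo, hi):
    """Probability that the surrogate response lies inside the band [lo, hi].

    Candidates that are plausibly inside the target band (e.g. near-zero
    pitching moment) but still uncertain score highly, so sampling them
    refines the surrogate along the contour of interest.
    """
    mu, sd = gp.predict(candidates, return_std=True)
    return norm.cdf(hi, mu, sd) - norm.cdf(lo, mu, sd)

rng = np.random.default_rng(1)
X = rng.uniform(-2.0, 2.0, size=(15, 1))
y = X[:, 0] ** 3 - X[:, 0] + rng.normal(scale=0.05, size=15)  # noisy "moment" data

# The WhiteKernel plays the role of a kriging nugget, absorbing noise in the data.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True).fit(X, y)

candidates = np.linspace(-2.0, 2.0, 200).reshape(-1, 1)
scores = contour_score(gp, candidates, lo=-0.1, hi=0.1)
print("next sample near x =", candidates[np.argmax(scores), 0])
```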
9

New methods for studying complex diseases via genetic association studies

Schu, Matthew Charles 22 January 2016 (has links)
Genome-wide association studies (GWAS) have delivered many novel insights into the etiology of common heritable diseases. However, in most disorders studied by GWAS, the known single nucleotide polymorphisms (SNPs) associated with the disease do not account for a large portion of the genetic factors underlying the condition. This suggests that many of the undiscovered variants contributing to the risk of common diseases have weak effects or are relatively rare. This thesis introduces novel adaptations of techniques for improving detection power for both of these types of risk variants, and reports the results of analyses applying these methods to real datasets for common diseases. Chapter 2 describes a novel approach to improve the detection of weak-effect risk variants that is based on an adaptive sampling technique known as Distilled Sensing (DS). This procedure uses a portion of the total sample to exclude from consideration regions of the genome where there is no evidence of genetic association, and then tests for association with a greatly reduced number of variants in the remaining sample. Application of the method to simulated data sets and GWAS data from studies of age-related macular degeneration (AMD) demonstrated that, in many situations, DS can have superior power over traditional meta-analysis techniques to detect weak-effect loci. Chapter 3 describes an innovative pipeline to screen for rare variants in next-generation sequencing (NGS) data. Since rare variants, by definition, are likely to be present in only a few individuals even in large samples, efficient methods to screen for rare causal variants are critical for advancing the utility of NGS technology. Application of our approach, which uses family-based data to identify candidate rare variants that could explain aggregation of disease in some pedigrees, resulted in the discovery of novel protein-coding variants linked to increased risk for Alzheimer's disease (AD) in African Americans. The techniques presented in this thesis address different aspects of the "missing heritability" problem and offer efficient approaches to discover novel risk variants, and thereby facilitate development of a more complete picture of genetic risk for common diseases.
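The Distilled Sensing idea described for Chapter 2, spending part of the sample to rule out regions with no signal and then testing only the survivors in the rest, can be illustrated with a toy two-stage screen. The sketch below uses a simple per-SNP correlation test and a Bonferroni correction over the surviving set; the split fraction, thresholds, and function names are placeholders, and the test statistic is deliberately simplified relative to the thesis's pipeline.

```python
import numpy as np
from scipy import stats

def two_stage_screen(geno, pheno, split=0.5, screen_p=0.10, alpha=0.05, rng=None):
    """Toy two-stage screen in the spirit of Distilled Sensing.

    Stage 1 uses a fraction of the subjects to discard SNPs showing no hint
    of association; stage 2 tests only the survivors in the held-out
    subjects, with a Bonferroni correction over the (much smaller) surviving
    set. The per-SNP test is a simple correlation between allele count and
    trait value.
    """
    rng = np.random.default_rng() if rng is None else rng
    idx = rng.permutation(len(pheno))
    stage1, stage2 = idx[: int(split * len(pheno))], idx[int(split * len(pheno)):]

    def pvals(rows):
        return np.array([stats.pearsonr(geno[rows, j], pheno[rows])[1]
                         for j in range(geno.shape[1])])

    survivors = np.where(pvals(stage1) < screen_p)[0]   # stage 1: screen
    corrected = alpha / max(len(survivors), 1)          # stage 2: confirm
    return survivors[pvals(stage2)[survivors] < corrected]

# Toy data: 500 subjects, 200 SNPs, with SNP 0 weakly associated with the trait.
rng = np.random.default_rng(2)
geno = rng.binomial(2, 0.3, size=(500, 200)).astype(float)
pheno = 0.4 * geno[:, 0] + rng.normal(size=500)
print(two_stage_screen(geno, pheno, rng=rng))
```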
10

Active Machine Learning for Computational Design and Analysis under Uncertainties

Lacaze, Sylvain January 2015 (has links)
Computational design has become a predominant element of various engineering tasks. However, the ever-increasing complexity of numerical models creates the need for efficient methodologies. Specifically, computational design under uncertainties remains sparsely used in engineering settings due to its computational cost. This dissertation proposes a coherent framework for various branches of computational design under uncertainties, including model update, reliability assessment, and reliability-based design optimization. Through the use of machine learning techniques, computationally inexpensive approximations of the constraints, limit states, and objective functions are constructed. Specifically, a novel adaptive sampling strategy, referred to as generalized max-min, has been developed that allows any approximation to be refined only in relevant regions. This technique presents various computational advantages, such as ease of parallelization and applicability to any metamodel. Three approaches tailored for computational design under uncertainties are derived from this approximation technique. An algorithm for reliability assessment is proposed and its efficiency is demonstrated for different probabilistic settings, including dependent variables using copulas. Additionally, the notion of a fidelity map is introduced for model update settings with a large number of dependent responses to be matched. Finally, a new reliability-based design optimization method with local refinement has been developed. A derivation of sampling-based probability-of-failure derivatives is also provided, along with a discussion of numerical estimates. This derivation brings additional flexibility to the field of computational design. The knowledge acquired and techniques developed during this Ph.D. have been synthesized into an object-oriented MATLAB toolbox. The help and ergonomics of the toolbox have been designed so as to be accessible to a large audience.
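The "generalized max-min" strategy is described only at a high level in the abstract; one hedged reading is a max-min distance rule restricted to a region the current metamodel flags as relevant, for instance a band around an approximated limit state. The sketch below implements that reading; the relevance criterion, names, and toy limit state are ours, not the dissertation's.

```python
import numpy as np

def generalized_max_min_pick(existing, candidates, relevant):
    """Pick the next training point by a max-min distance rule.

    Among candidates flagged as relevant (here, a band around an
    approximated limit state), choose the one farthest from every existing
    sample, so the metamodel is refined only where it matters.
    """
    pool = candidates[relevant]
    # Distance from each relevant candidate to its nearest existing sample.
    nearest = np.linalg.norm(pool[:, None, :] - existing[None, :, :], axis=-1).min(axis=1)
    return pool[np.argmax(nearest)]

rng = np.random.default_rng(3)
X = rng.uniform(-1.0, 1.0, size=(12, 2))                 # current training set
candidates = rng.uniform(-1.0, 1.0, size=(2000, 2))      # space-filling candidate pool
g_hat = candidates[:, 0] ** 2 + candidates[:, 1] - 0.5   # toy surrogate limit state
relevant = np.abs(g_hat) < 0.1                           # region near g(x) = 0
print(generalized_max_min_pick(X, candidates, relevant))
```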
