1

A Dual Metamodeling Perspective for Design and Analysis of Stochastic Simulation Experiments

Wang, Wenjing 17 July 2019
Fueled by a growing number of applications in science and engineering, the development of stochastic simulation metamodeling methodologies has gained momentum in recent years. A majority of the existing methods, such as stochastic kriging (SK), only focus on efficiently metamodeling the mean response surface implied by a stochastic simulation experiment. As the simulation outputs are stochastic with the simulation variance varying significantly across the design space, suitable methods for variance modeling are required. This thesis takes a dual metamodeling perspective and aims at exploiting the benefits of fitting the mean and variance functions simultaneously for achieving an improved predictive performance. We first explore the effects of replacing the sample variances with various smoothed variance estimates on the performance of SK and propose a dual metamodeling approach to obtain an efficient simulation budget allocation rule. Second, we articulate the links between SK and least-squares support vector regression and propose to use a "dense and shallow" initial design to facilitate selection of important design points and efficient allocation of the computational budget. Third, we propose a variational Bayesian inference-based Gaussian process (VBGP) metamodeling approach to accommodate the situation where either one or multiple simulation replications are available at every design point. VBGP can fit the mean and variance response surfaces simultaneously, while taking into full account the uncertainty in the heteroscedastic variance. Lastly, we generalize VBGP for handling large-scale heteroscedastic datasets based on the idea of "transductive combination of GP experts." / Doctor of Philosophy / In solving real-world complex engineering problems, it is often helpful to learn the relationship between the decision variables and the response variables to better understand the real system of interest. Directly conducting experiments on the real system can be impossible or impractical, due to the high cost or time involved. Instead, simulation models are often used as surrogates for modeling complex stochastic systems and conducting simulation-based design and analysis. However, even simulation models can be very expensive to run. To alleviate the computational burden, a metamodel is often built from the outputs of simulation runs at selected design points to map the performance response surface as a function of the controllable decision variables or uncontrollable environmental variables, thereby approximating the behavior of the original simulation model. There has been a plethora of work in the simulation research community dedicated to studying stochastic simulation metamodeling methodologies suitable for analyzing stochastic simulation experiments in science and engineering. A majority of the existing methods, such as stochastic kriging (SK), are known as effective metamodeling tools for approximating the mean response surface implied by a stochastic simulation. Although SK has been extensively used as an effective metamodeling methodology for stochastic simulations, SK and similar metamodeling techniques still face four methodological barriers: 1) lack of study of variance estimation methods; 2) absence of an efficient experimental design for simultaneous mean and variance metamodeling; 3) lack of flexibility to accommodate situations where simulation replications are not available; and 4) lack of scalability.
To overcome the aforementioned barriers, this thesis takes a dual metamodeling perspective and aims at exploiting the benefits of fitting the mean and variance functions simultaneously for achieving an improved predictive performance. We first explore the effects of replacing the sample variances with various smoothed variance estimates on the performance of SK and propose a dual metamodeling approach to obtain an efficient simulation budget allocation rule. Second, we articulate the links between SK and least-squares support vector regression and propose to use a "dense and shallow" initial design to facilitate selection of important design points and efficient allocation of the computational budget. Third, we propose a variational Bayesian inference-based Gaussian process (VBGP) metamodeling approach to accommodate the situation where either one or multiple simulation replications are available at every design point. VBGP can fit the mean and variance response surfaces simultaneously, while taking into full account the uncertainty in the heteroscedastic variance. Lastly, we generalize VBGP for handling large-scale heteroscedastic datasets based on the idea of "transductive combination of GP experts."
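For orientation, a schematic of the stochastic kriging setup that the dual-metamodeling idea builds on (the notation is generic and not necessarily the thesis's): the output of replication j at design point x is modeled as
\[
\mathcal{Y}_j(\mathbf{x}) = \beta_0 + M(\mathbf{x}) + \varepsilon_j(\mathbf{x}), \qquad \varepsilon_j(\mathbf{x}) \sim \mathcal{N}\big(0, V(\mathbf{x})\big),
\]
where M(·) is a zero-mean Gaussian process capturing extrinsic (spatial) uncertainty and V(·) is the heteroscedastic intrinsic variance. With n_i replications at design point x_i, SK predicts from the sample means, whose covariance matrix is \(\Sigma_M + \mathrm{diag}\{V(\mathbf{x}_1)/n_1, \ldots, V(\mathbf{x}_k)/n_k\}\); in practice V(x_i) must be estimated, for example by the sample variances or by a second ("variance") metamodel fitted to them, which is the dual-metamodeling view taken in this work.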
2

Integer-valued ARCH and GARCH models

Choden C, Kezang 01 August 2016
The volatility models, autoregressive conditional heteroscedastic (ARCH) and generalized autoregressive conditional heteroscedastic (GARCH), are discussed. Stationarity conditions and forecasting for the simple ARCH(1) and GARCH(1,1) models are given. For discrete-valued time series, a negative binomial integer-valued GARCH (NB-INGARCH) model is proposed, which generalizes the Poisson INGARCH model. The stationarity conditions and the autocorrelation function are given. For parameter estimation, three methodologies are presented, with a focus on the maximum likelihood approach. Simulation studies with sample sizes of 100 and 500 are carried out and the results are presented. An application of the model to a real time series, with a numerical example, indicates that the proposed methodology performs better than the Poisson and double Poisson model-based methods.
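As a sketch of the model class being generalized (standard formulation, which may differ in parametrization from the thesis): the Poisson INGARCH(1,1) model for counts X_t specifies
\[
X_t \mid \mathcal{F}_{t-1} \sim \mathrm{Poisson}(\lambda_t), \qquad \lambda_t = \delta + \alpha X_{t-1} + \beta \lambda_{t-1}, \qquad \delta > 0,\ \alpha, \beta \ge 0,
\]
with a stationary solution existing when \(\alpha + \beta < 1\). The negative binomial INGARCH variant keeps the same recursion for the conditional mean \(\lambda_t\) but replaces the Poisson conditional distribution with a negative binomial one, introducing a dispersion parameter that lets the conditional variance exceed the conditional mean.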
3

An ARCH/GARCH arbitrage pricing theory approach to modelling the return generating process of South African stock returns.

Szczygielski, Jan Jakub 14 August 2013
This study investigates the return generating process underlying the South African stock market. The investigation is framed within the Arbitrage Pricing Theory (APT), reinterpreted so as to provide a conceptual framework within which the return generating process can be investigated. In modelling the return generating process, the properties of South African stock returns are taken into consideration and an appropriate econometric framework in the form of Autoregressive Conditional Heteroscedastic (ARCH) and Generalized Autoregressive Conditional Heteroscedastic (GARCH) models is applied. Results indicate that the return generating process of South African stock returns is described by innovations in multiple risk factors representative of several risk categories. The multifactor model explains a substantial amount of the variation in South African stock returns, and the ARCH/GARCH methodology is an appropriate econometric framework for estimating models of the return generating process. The APT framework is thus successfully applied to model and investigate the return generating process of South African stock returns.
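A minimal illustration of the kind of specification this approach implies (the actual factors, lag orders, and error distribution used in the study are not reproduced here): a multifactor return-generating model with GARCH(1,1) errors can be written as
\[
r_t = \alpha + \sum_{k=1}^{K} \beta_k f_{k,t} + \varepsilon_t, \qquad \varepsilon_t = \sqrt{h_t}\, z_t, \qquad h_t = \omega + a\, \varepsilon_{t-1}^2 + b\, h_{t-1},
\]
where f_{k,t} are innovations in the risk factors, z_t is an i.i.d. innovation with zero mean and unit variance, and h_t is the conditional variance capturing volatility clustering in the residual returns.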
4

Design space exploration of stochastic system-of-systems simulations using adaptive sequential experiments

Kernstine, Kemp H. 25 June 2012
The complexities of our surrounding environments are becoming increasingly diverse, more integrated, and continuously more difficult to predict and characterize. These modeling complexities are ever more prevalent in System-of-Systems (SoS) simulations, where computational times can surpass real time and are often dictated by stochastic processes and non-continuous emergent behaviors. As the number of connections in modeling environments continues to increase and the number of external noise variables continues to multiply, these SoS simulations can no longer be explored by traditional means without significantly wasting computational resources. This research develops and tests an adaptive sequential design of experiments to reduce the computational expense of exploring these complex design spaces. Prior to developing the algorithm, the defining statistical attributes of these spaces are researched and identified. Following this identification, various techniques capable of capturing these features are compared and an algorithm is synthesized. The final algorithm is shown to improve the exploration of stochastic simulations over existing methods by increasing global accuracy and computational speed, while reducing the number of simulations required to learn these spaces.
5

NONPARAMETRIC ESTIMATION OF DERIVATIVES WITH APPLICATIONS

Hall, Benjamin 01 January 2010
We review several nonparametric regression techniques and discuss their various strengths and weaknesses with an emphasis on derivative estimation and confidence band creation. We develop a generalized C(p) criterion for tuning parameter selection when interest lies in estimating one or more derivatives and the estimator is both linear in the observed responses and self-consistent. We propose a method for constructing simultaneous confidence bands for the mean response and one or more derivatives, where simultaneous now refers both to values of the covariate and to all derivatives under consideration. In addition we generalize the simultaneous confidence bands to account for heteroscedastic noise. Finally, we consider the characterization of nanoparticles and propose a method for identifying a proper subset of the covariate space that is most useful for characterization purposes.
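For context, the classical Mallows-type criterion for a linear smoother \(\hat{\mathbf{y}} = L_\lambda \mathbf{y}\) is shown below; the thesis's generalized C(p) modifies this to target derivative estimation, so the form here is only the standard starting point:
\[
C_p(\lambda) = \frac{\mathrm{RSS}(\lambda)}{\hat{\sigma}^2} - n + 2\,\mathrm{tr}(L_\lambda),
\]
where RSS(\(\lambda\)) is the residual sum of squares, \(\hat{\sigma}^2\) is an estimate of the noise variance, and \(\mathrm{tr}(L_\lambda)\) plays the role of the effective number of parameters.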
6

Consistency and Uniform Bounds for Heteroscedastic Simulation Metamodeling and Their Applications

Zhang, Yutong 05 September 2023
Heteroscedastic metamodeling has gained popularity as an effective tool for analyzing and optimizing complex stochastic systems. A heteroscedastic metamodel provides an accurate approximation of the input-output relationship implied by a stochastic simulation experiment whose output is subject to input-dependent noise variance. Several challenges remain unsolved in this field. First, in-depth investigations into the consistency of heteroscedastic metamodeling techniques, particularly from the sequential prediction perspective, are lacking. Second, sequential heteroscedastic metamodel-based level-set estimation (LSE) methods are scarce. Third, the increasingly high computational cost required by heteroscedastic Gaussian process-based LSE methods in the sequential sampling setting is a concern. Additionally, when constructing a valid uniform bound for a heteroscedastic metamodel, the impact of noise variance estimation is not adequately addressed. This dissertation aims to tackle these challenges and provide promising solutions. First, we investigate the information consistency of a widely used heteroscedastic metamodeling technique, stochastic kriging (SK). Second, we propose SK-based LSE methods leveraging novel uniform bounds for input-point classification. Moreover, we incorporate the Nystrom approximation and a principled budget allocation scheme to improve the computational efficiency of SK-based LSE methods. Lastly, we investigate empirical uniform bounds that take into account the impact of noise variance estimation, ensuring an adequate coverage capability. / Doctor of Philosophy / In real-world engineering problems, understanding and optimizing complex systems can be challenging and prohibitively expensive. Computer simulation is a valuable tool for analyzing and predicting system behaviors, allowing engineers to explore different scenarios without relying on costly physical prototypes. However, the increasing complexity of simulation models leads to a higher computational burden. Metamodeling techniques have emerged to address this issue by accurately approximating the system performance response surface based on limited simulation experiment data to enable real-time decision-making. Heteroscedastic metamodeling goes further by considering varying noise levels inherent in simulation outputs, resulting in more robust and accurate predictions. Among various techniques, stochastic kriging (SK) stands out by striking a good balance between computational efficiency and statistical accuracy. Despite extensive research on SK, challenges persist in its application and methodology. These include little understanding of SK's consistency properties, an absence of sequential SK-based algorithms for level-set estimation (LSE) under heteroscedasticity, and the increasingly low computational efficiency of SK-based LSE methods in implementation. Furthermore, a precise construction of uniform bounds for the SK predictor is also missing. This dissertation aims at addressing these aforementioned challenges. First, the information consistency of SK from a prediction perspective is investigated. Then, sequential SK-based procedures for LSE in stochastic simulation, incorporating novel uniform bounds for accurate input-point classification, are proposed. Furthermore, a popular approximation technique is incorporated to enhance the computational efficiency of the SK-based LSE methods. Lastly, empirical uniform bounds are investigated considering the impact of noise variance estimation.
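The generic logic behind uniform-bound-based input-point classification in level-set estimation is as follows (this is the common GP-based LSE template; the thesis's specific bounds and budget allocation scheme are its own contributions and are not reproduced here). Given a target level \(\ell\) and a bound that holds uniformly over the design space,
\[
|f(\mathbf{x}) - \mu_n(\mathbf{x})| \le \beta_n\, \sigma_n(\mathbf{x}) \quad \text{for all } \mathbf{x},
\]
a point is classified into the superlevel set if \(\mu_n(\mathbf{x}) - \beta_n \sigma_n(\mathbf{x}) > \ell\), into the sublevel set if \(\mu_n(\mathbf{x}) + \beta_n \sigma_n(\mathbf{x}) < \ell\), and left undetermined (a candidate for further sampling) otherwise, where \(\mu_n\) and \(\sigma_n\) are the metamodel's predictive mean and standard deviation after n observations.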
7

Rank Estimation in Elliptical Models: Estimation of Structured Rank Covariance Matrices and Asymptotics for Heteroscedastic Linear Regression

Kuljus, Kristi January 2008
This thesis deals with univariate and multivariate rank methods for making statistical inference. It is assumed that the underlying distributions belong to the class of elliptical distributions. The class of elliptical distributions is an extension of the normal distribution and includes distributions with both lighter and heavier tails than the normal distribution. In the first part of the thesis the rank covariance matrices defined via the Oja median are considered. The Oja rank covariance matrix has two important properties: it is affine equivariant and it is proportional to the inverse of the regular covariance matrix. We employ these two properties to study the problem of estimating the rank covariance matrices when they have a certain structure. The second part, which is the main part of the thesis, is devoted to rank estimation in linear regression models with symmetric heteroscedastic errors. We are interested in asymptotic properties of rank estimates. Asymptotic uniform linearity of a linear rank statistic in the case of heteroscedastic variables is proved. The asymptotic uniform linearity property makes it possible to study the asymptotic behaviour of rank regression estimates and rank tests. Existing results are generalized and it is shown that the Jaeckel estimate is consistent and asymptotically normally distributed for heteroscedastic symmetric errors as well.
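For reference, the Jaeckel estimate mentioned here minimizes a rank-based dispersion of the residuals (standard formulation; the thesis studies its behaviour under heteroscedastic symmetric errors):
\[
\hat{\beta} = \arg\min_{\beta} D_n(\beta), \qquad D_n(\beta) = \sum_{i=1}^{n} a_n\!\big(R_i(\beta)\big)\,\big(y_i - \mathbf{x}_i^{\top}\beta\big),
\]
where \(R_i(\beta)\) is the rank of the i-th residual and \(a_n(\cdot)\) is a nondecreasing score function satisfying \(\sum_i a_n(i) = 0\), for example Wilcoxon scores \(a_n(i) = \varphi(i/(n+1))\) with \(\varphi(u) = \sqrt{12}\,(u - 1/2)\).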
8

Applications of modern regression techniques in empirical economics

März, Alexander 14 July 2016
No description available.
9

Some extensions in measurement error models / Algumas extensões em modelos com erros de medição

Tomaya, Lorena Yanet Cáceres 14 December 2018
In this dissertation, we make three different contributions to measurement error models (MEMs). Initially, we carry out maximum penalized likelihood inference in MEMs under the normality assumption. The methodology is based on the method proposed by Firth (1993), which can be used to improve some asymptotic properties of the maximum likelihood estimators. In the second contribution, we develop two new estimation methods based on generalized fiducial inference for the precision parameters and the product variability under the Grubbs model, considering the two-instrument case. One method is based on a fiducial generalized pivotal quantity and the other is built on the method of the generalized fiducial distribution. Comparisons with two existing approaches are reported. Finally, we study inference in a heteroscedastic MEM with known error variances. Instead of the normal distribution for the random components, we develop a model that assumes a skew-t distribution for the true covariate and a centered Student's t distribution for the error terms. The proposed model accommodates skewness and heavy-tailedness in the data, while the degrees of freedom of the two distributions can differ. We use the maximum likelihood method to estimate the model parameters and compute the estimates via an EM-type algorithm. All proposed methodologies are assessed numerically through simulation studies and illustrated with real datasets extracted from the literature.
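A generic statement of the Firth (1993) penalty used in the first contribution (how it is adapted to the measurement error setting is the thesis's contribution and is not reproduced here): the penalized log-likelihood is
\[
\ell^{*}(\theta) = \ell(\theta) + \tfrac{1}{2} \log \det I(\theta),
\]
where \(I(\theta)\) is the Fisher information matrix; the penalty corresponds to the Jeffreys invariant prior and removes the \(O(n^{-1})\) term of the bias of the maximum likelihood estimator in regular problems.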
10

Modelos de regressão linear heteroscedásticos com erros t-Student: uma abordagem bayesiana objetiva / Heteroscedastic linear regression models with Student-t errors: an objective Bayesian analysis.

Souza, Aline Campos Reis de 18 February 2016
In this work, we present an extension of the objective Bayesian analysis of Fonseca et al. (2008), based on Jeffreys priors for linear regression models with Student-t errors, for which we consider the assumption of heteroscedasticity. We show that the posterior distribution of the regression model parameters generated by the proposed Jeffreys prior is proper. Through a simulation study, we evaluate the frequentist properties of the resulting Bayesian estimators and compare their performance with that obtained under other prior distributions proposed in the literature. In addition, a diagnostic analysis based on the Kullback-Leibler divergence measure is developed to detect possible influential points and to study the robustness of the estimates in the presence of atypical observations, and the model selection criteria EAIC, EBIC, DIC and LPML are used to compare models. Finally, a real data set is used to illustrate the fit of the proposed model.
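Schematically (generic notation; the exact heteroscedastic structure and the form of the Jeffreys-based prior follow Fonseca et al. (2008) and the thesis and are not reproduced here), the sampling model is a linear regression with independent Student-t errors whose scales may vary across observations,
\[
y_i = \mathbf{x}_i^{\top}\beta + \varepsilon_i, \qquad \varepsilon_i \sim t_{\nu}(0, \sigma_i^2), \qquad i = 1, \ldots, n,
\]
which admits the usual scale-mixture representation \(\varepsilon_i \mid u_i \sim \mathcal{N}(0, \sigma_i^2/u_i)\) with \(u_i \sim \mathrm{Gamma}(\nu/2, \nu/2)\), a representation commonly exploited for posterior computation and for influence diagnostics in Student-t models.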
