81

On the robustness of total indirect effects estimated in the Joreskog-Keesling-Wiley covariance structure model.

Stone, Clement Addison. January 1987 (has links)
In structural equation models, researchers often examine two types of causal effects: direct and indirect effects. Direct effects involve variables that "directly" influence other variables, whereas indirect effects are transmitted via intervening variables. While researchers have paid considerable attention to the distribution of sample direct effects, the distribution of sample indirect effects has only recently been considered. Using the delta method (Rao, 1973), Sobel (1982) derived the asymptotic distribution for estimators of indirect effects in recursive systems. Sobel (1986) then derived the asymptotic distribution for estimators of total indirect effects in the Joreskog covariance structure model (Joreskog, 1977). This study examined the applicability of the large sample theory described by Sobel (1986) in small samples. Monte Carlo methods were used to evaluate the behavior of estimated total indirect effects in sample sizes of 50, 100, 200, 400, and 800. Two models were used in the analysis. Model 1 was a nonrecursive model with latent variables, feedback, and functional constraints among the effects (Duncan, Haller, & Portes, 1968; Sobel, 1986). Model 2 was a recursive model with observable variables (Duncan, Featherman, & Duncan, 1972). In addition, variations in these models were studied by randomly increasing and decreasing model parameters. The principal findings of the study suggest certain guidelines for researchers who use Sobel's procedures to evaluate total indirect effects in structural equation models. In order for the behavior of the estimates to approximate the asymptotic properties, sample sizes of 400 or more are indicated for nonrecursive systems similar to Model 1, and for recursive systems such as Model 2, sample sizes of 200 or more are suggested. At these sample sizes, researchers can expect sample indirect effects to be accurate point estimators, and confidence intervals for the effects to behave as theory predicts. A caveat to the above guidelines is that, when the total indirect effects are "small" in magnitude, relative to the scale of the model, convergence to the asymptotic properties appears to be very slow. Under these conditions, sampling distributions for the "smaller" valued estimates were positively skewed. This caused estimates to be significantly different from true values, and confidence intervals to behave contrary to theoretical expectations.
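[Editorial note] A minimal sketch of the delta-method (Sobel) standard error for a single indirect effect a·b, checked by a small Monte Carlo of the kind described above. The one-mediator chain, true coefficients, and sample size below are illustrative assumptions, not the dissertation's Models 1 and 2.

```python
import numpy as np

rng = np.random.default_rng(0)

def ols_slope(x, y):
    """Simple-regression slope (with intercept) and its standard error."""
    b = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    a0 = y.mean() - b * x.mean()
    resid = y - (a0 + b * x)
    n = len(x)
    se = np.sqrt(resid @ resid / (n - 2) / np.sum((x - x.mean()) ** 2))
    return b, se

def sobel_se(a, b, se_a, se_b):
    """Delta-method (Sobel) standard error of the indirect effect a*b."""
    return np.sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)

def one_replication(n, a_true=0.4, b_true=0.3):
    """Simulate a simple mediation chain X -> M -> Y and return the
    estimated indirect effect and its delta-method standard error."""
    x = rng.normal(size=n)
    m = a_true * x + rng.normal(size=n)
    y = b_true * m + rng.normal(size=n)
    a_hat, se_a = ols_slope(x, m)
    b_hat, se_b = ols_slope(m, y)
    return a_hat * b_hat, sobel_se(a_hat, b_hat, se_a, se_b)

# Empirical coverage of the nominal 95% delta-method interval at a small n
n, reps, truth = 100, 2000, 0.4 * 0.3
hits = sum(abs(est - truth) <= 1.96 * se
           for est, se in (one_replication(n) for _ in range(reps)))
print(f"empirical 95% CI coverage at n={n}: {hits / reps:.3f}")
```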
82

Bayesian uncertainty analysis for complex computer codes

Oakley, Jeremy January 1999 (has links)
No description available.
83

Computer simulations of flux pinning in type II superconductors

Spencer, Steven Charles January 1996 (has links)
No description available.
84

Molecular similarity : alignment and advanced applications

Parretti, Martin Frank January 1999 (has links)
No description available.
85

A Bayesian approach to the job search model and its application to unemployment durations using MCMC methods

Walker, Neil Rawlinson January 1999 (has links)
No description available.
86

Multiple profile models

Rimmer, Martin John January 1999 (has links)
No description available.
87

Kinetic Monte Carlo simulation of atomic diffusion in the FeAl alloy

Manrique Castillo, Erich Víctor January 2014 (has links)
The physical properties of technologically important materials originate in the reactions and processes to which they have been subjected. In all of these, atomic diffusion plays a key role, because diffusion is relevant to the kinetics of many of the microstructural changes that occur during the preparation, processing, and heat treatment of these materials. Superalloys such as FeAl are technologically important materials: they withstand high temperatures and retain their structural and surface stability as well as the stability of their physical properties. For all these reasons, deepening scientific knowledge of diffusion is necessary. To that end, the present work studies atomic migration in the ordered binary alloy with B2 structure by means of kinetic Monte Carlo simulations, in which atomic migration results from the exchange of positions between an atom and a vacancy on a rigid lattice. The atomistic kinetic model used is based on jump-rate theory and the residence-time algorithm. We also use pair interactions up to second-nearest neighbours, taking the values that were used in simulations of the phase diagram, B2 ordering, and precipitation of FeAl [41]. We determine the diffusion constants as a function of temperature. In addition, we investigate the mobility of antiphase boundaries in the final stages of the ordering process. Finally, the autocorrelation function is computed, which reveals that the vacancy performs highly correlated jumps on the lattice at low temperatures and that atoms jump to positions on their own sublattice at moderate temperatures.
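[Editorial note] A minimal sketch of the residence-time kinetic Monte Carlo step described in the abstract, reduced to a single vacancy hopping on a one-dimensional lattice. The attempt frequency, migration barrier, and lattice spacing are generic placeholder values, not the FeAl pair-interaction parameterization used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)
K_B = 8.617e-5                       # Boltzmann constant in eV/K

def kmc_vacancy_1d(n_steps=5_000, T=600.0, barrier=0.8, nu0=1e13, a=2.9e-10):
    """One residence-time KMC run for a single vacancy on a 1-D lattice.

    Each step picks one of the two neighbour exchanges with probability
    proportional to its Arrhenius rate and advances the clock by an
    exponentially distributed residence time 1/(sum of rates).
    """
    rate = nu0 * np.exp(-barrier / (K_B * T))   # identical left/right rates here
    rates = np.array([rate, rate])
    total = rates.sum()
    x, t = 0, 0.0
    for _ in range(n_steps):
        event = rng.choice(2, p=rates / total)  # 0 = hop left, 1 = hop right
        x += 1 if event == 1 else -1
        t += rng.exponential(1.0 / total)       # residence-time clock update
    return x * a, t                             # displacement in m, time in s

# Crude diffusion constant from <x^2> = 2 D t, averaged over independent runs
runs = [kmc_vacancy_1d() for _ in range(200)]
D = np.mean([x ** 2 / (2.0 * t) for x, t in runs])
print(f"estimated D at 600 K: {D:.3e} m^2/s")
```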
88

Evaluating Atlantic tropical cyclone track error distributions based on forecast confidence

Hauke, Matthew D. 06 1900 (has links)
A new Tropical Cyclone (TC) surface wind speed probability product from the National Hurricane Center (NHC) takes into account uncertainty in track, maximum wind speed, and wind radii. A Monte Carlo (MC) model is used that draws from probability distributions based on historical track errors. In this thesis, distributions of forecast track errors conditioned on forecast confidence are examined to determine if significant differences exist in distribution characteristics. Two predictors are used to define forecast confidence: the Goerss Predicted Consensus Error (GPCE) and the Global Forecast System (GFS) ensemble spread. The distributions of total-, along-, and cross-track errors from NHC official forecasts are defined for low, average, and high forecast confidence. Also, distributions of the GFS ensemble mean total-track errors are defined based on similar confidence levels. Standard hypothesis testing methods are used to examine distribution characteristics. Using the GPCE values, significant differences in nearly all track error distributions existed for each level of forecast confidence. The GFS ensemble spread did not provide a basis for statistically different distributions. These results suggest that the NHC probability model would likely be improved if the MC model drew from distributions of track errors based on the GPCE measures of forecast confidence. / US Air Force (USAF) author.
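[Editorial note] A schematic sketch of the kind of confidence-conditioned sampling the thesis argues for: a Monte Carlo track model drawing along- and cross-track errors from distributions stratified by a GPCE-style confidence category. The spreads, the Gaussian form, and the category labels are placeholder assumptions, not NHC or GPCE values.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical along-/cross-track error spreads (n mi) at a fixed lead time,
# stratified by a GPCE-style confidence category; values are placeholders.
TRACK_ERROR_SD = {
    "high":    {"along": 40.0,  "cross": 30.0},
    "average": {"along": 70.0,  "cross": 50.0},
    "low":     {"along": 110.0, "cross": 80.0},
}

def sample_track_errors(confidence, n_draws=1000):
    """Draw along- and cross-track errors from the confidence-conditioned
    distributions (zero-mean Gaussians here, purely for illustration)."""
    sd = TRACK_ERROR_SD[confidence]
    along = rng.normal(0.0, sd["along"], n_draws)
    cross = rng.normal(0.0, sd["cross"], n_draws)
    return along, cross

for level in ("high", "average", "low"):
    along, cross = sample_track_errors(level)
    total = np.hypot(along, cross)           # total-track error per draw
    print(f"{level:>7} confidence: mean total-track error {total.mean():6.1f} n mi")
```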
89

The applicability of Monte Carlo studies to empirical data from the South African economy

29 July 2014 (has links)
M.Com. (Econometrics) / The objective of this study is to evaluate different estimation techniques that can be used to estimate the coefficients of a model. The estimation techniques were applied to empirical data drawn from the South African economy. Monte Carlo studies are unique in that the data for the experiments are statistically generated, because actual observations on economic variables exhibit several econometric problems, such as autocorrelation and multicollinearity, simultaneously. The approach in this study differs in that empirical data are used to evaluate the estimation techniques. The estimation techniques evaluated are:
  • Ordinary least squares method
  • Two-stage least squares method
  • Limited information maximum likelihood method
  • Three-stage least squares method
  • Full information maximum likelihood method
The estimates of the different coefficients are evaluated on the following criteria:
  • The bias of the estimates
  • The variance of the estimates
  • The t-values of the estimates
  • The root mean square error
The ranking of the estimation techniques on the bias criterion is as follows:
  1. Full information maximum likelihood method
  2. Ordinary least squares method
  3. Three-stage least squares method
  4. Two-stage least squares method
  5. Limited information maximum likelihood method
The ranking on the variance criterion is as follows:
  1. Full information maximum likelihood method
  2. Ordinary least squares method
  3. Three-stage least squares method
  4. Two-stage least squares method
  5. Limited information maximum likelihood method
All the estimation techniques performed poorly with regard to the statistical significance of the estimates. The ranking on the t-values of the estimates is thus as follows:
  1. Three-stage least squares method
  2. Ordinary least squares method
  3. Two-stage least squares method and the limited information maximum likelihood method
  4. Full information maximum likelihood method
The ranking on the root mean square error criterion is as follows:
  1. Full information maximum likelihood method and the ordinary least squares method
  2. Two-stage least squares method
  3. Limited information maximum likelihood method and the three-stage least squares method
The results achieved in this study are very similar to those of the Monte Carlo studies. The only exception is the ordinary least squares method, which performed better on every criterion considered in this study. Although the full information maximum likelihood method performed best on two of the criteria, its performance was extremely poor on the t-value criterion. The ordinary least squares method is shown, in this study, to be the most consistent performer.
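[Editorial note] A minimal Monte Carlo sketch of the kind of estimator comparison the study performs, reduced to ordinary least squares versus two-stage least squares in a one-equation simultaneous system. The model, coefficient values, and error correlation are illustrative assumptions, not the South African model estimated in the thesis.

```python
import numpy as np

rng = np.random.default_rng(3)

def one_draw(n=200, beta=0.5, pi=1.0, rho=0.6):
    """One simulated sample from a simple simultaneous system:
    structural equation y = beta*x + u, first stage x = pi*z + v,
    with corr(u, v) = rho so that OLS is biased but IV/2SLS is consistent."""
    z = rng.normal(size=n)
    u, v = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n).T
    x = pi * z + v
    y = beta * x + u
    ols = (x @ y) / (x @ x)    # through-the-origin OLS slope (biased here)
    tsls = (z @ y) / (z @ x)   # simple instrumental-variable / 2SLS slope
    return ols, tsls

reps, beta_true = 5000, 0.5
est = np.array([one_draw() for _ in range(reps)])
for name, col in zip(("OLS", "2SLS"), est.T):
    bias = col.mean() - beta_true
    rmse = np.sqrt(np.mean((col - beta_true) ** 2))
    print(f"{name:>4}: bias {bias:+.3f}, RMSE {rmse:.3f}")
```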
90

The effect of simulation bias on action selection in Monte Carlo Tree Search

James, Steven Doron January 2016 (has links)
A dissertation submitted to the Faculty of Science, University of the Witwatersrand, in fulfilment of the requirements for the degree of Master of Science. August 2016. / Monte Carlo Tree Search (MCTS) is a family of directed search algorithms that has gained widespread attention in recent years. It combines a traditional tree-search approach with Monte Carlo simulations, using the outcome of these simulations (also known as playouts or rollouts) to evaluate states in a look-ahead tree. That MCTS does not require an evaluation function makes it particularly well-suited to the game of Go — seen by many to be chess's successor as a grand challenge of artificial intelligence — with MCTS-based agents recently able to achieve expert-level play on 19×19 boards. Furthermore, its domain-independent nature also makes it a focus in a variety of other fields, such as Bayesian reinforcement learning and general game-playing. Despite the vast amount of research into MCTS, the dynamics of the algorithm are still not fully understood. In particular, the effect of using knowledge-heavy or biased simulations in MCTS remains unknown, with interesting results indicating that better-informed rollouts do not necessarily result in stronger agents. This research provides support for the notion that MCTS is well-suited to a class of domains possessing a smoothness property. In these domains, biased rollouts are more likely to produce strong agents. Conversely, any error due to incorrect bias is compounded in non-smooth domains, and in particular for low-variance simulations. This is demonstrated empirically in a number of single-agent domains. / LG2017
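[Editorial note] A small sketch of what a biased (knowledge-heavy) rollout policy looks like next to a uniform one, the contrast at the heart of the abstract above. The softmax weighting, temperature parameter, and toy heuristic are illustrative assumptions, and the surrounding MCTS selection/expansion/backup machinery is omitted.

```python
import math
import random

random.seed(0)

def uniform_rollout_policy(state, legal_actions, heuristic=None):
    """Unbiased playout: every legal action is equally likely."""
    return random.choice(legal_actions)

def biased_rollout_policy(state, legal_actions, heuristic, tau=1.0):
    """Knowledge-heavy playout: sample actions from a softmax over a
    heuristic score, so moves the heuristic prefers are tried more often.
    An inaccurate heuristic therefore concentrates playouts on misleading
    lines, which is the compounding effect studied in non-smooth domains."""
    scores = [heuristic(state, a) for a in legal_actions]
    m = max(scores)
    weights = [math.exp((s - m) / tau) for s in scores]
    return random.choices(legal_actions, weights=weights, k=1)[0]

# Toy usage: the state is ignored and the heuristic simply prefers larger actions.
actions = [0, 1, 2, 3]
toy_heuristic = lambda state, a: float(a)
picks = [biased_rollout_policy(None, actions, toy_heuristic, tau=0.5)
         for _ in range(1000)]
print("biased pick frequencies:",
      {a: picks.count(a) / len(picks) for a in actions})
```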
