611

Eliciting and combining expert opinion : an overview and comparison of methods

Chinyamakobvu, Mutsa Carole January 2015 (has links)
Decision makers have long relied on experts to inform their decision making. Expert judgment analysis is a way to elicit and combine the opinions of a group of experts to facilitate decision making. The use of expert judgment is most appropriate when there is a lack of data for obtaining reasonable statistical results. The experts are asked for advice by one or more decision makers who face a specific real decision problem. The decision makers are outside the group of experts and are jointly responsible and accountable for the decision, and committed to finding solutions that everyone can live with. The emphasis is on the decision makers learning from the experts. The focus of this thesis is an overview and comparison of the various elicitation and combination methods available. These include the traditional committee method, the Delphi method, the paired comparisons method, the negative exponential model, Cooke’s classical model, the histogram technique, the use of the Dirichlet distribution in the case of a set of uncertain proportions which must sum to one, and the method of overfitting. The supra-Bayesian approach, the determination of weights for the experts, and the combination of expert opinions where each opinion carries a confidence level representing the expert’s conviction in his own judgment are also considered.
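For illustration, a minimal sketch of one simple combination rule, the weighted linear opinion pool, is given below; the weights and assessments are hypothetical, not code or data from the thesis. In Cooke's classical model the weights would come from the experts' calibration scores rather than being assumed.

```python
import numpy as np

def linear_opinion_pool(expert_probs, weights):
    """Combine expert probability distributions as a weighted average.

    expert_probs: (n_experts, n_outcomes) array, each row summing to 1.
    weights: (n_experts,) non-negative weights summing to 1.
    """
    expert_probs = np.asarray(expert_probs, dtype=float)
    weights = np.asarray(weights, dtype=float)
    return weights @ expert_probs  # pooled distribution over outcomes

# Hypothetical example: three experts assess P(low, medium, high demand).
experts = [[0.5, 0.3, 0.2],
           [0.2, 0.5, 0.3],
           [0.3, 0.4, 0.3]]
# Weights are assumed here; a calibration-based scheme would supply them.
weights = [0.5, 0.3, 0.2]
print(linear_opinion_pool(experts, weights))  # [0.37 0.38 0.25]
```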
612

Maximization of power in randomized clinical trials using the minimization treatment allocation technique

Marange, Chioneso Show January 2010 (has links)
Generally, the primary goal of randomized clinical trials (RCTs) is to make comparisons among two or more treatments; clinical investigators therefore require the most appropriate treatment allocation procedure to yield reliable results, regardless of whether the ultimate data suggest a clinically important difference between the treatments being studied. Although recommended by many researchers, minimization has seldom been reported in randomized trials, mainly because of the controversy surrounding its statistical efficiency in detecting treatment effects and its complexity of implementation. Methods: A SAS simulation code was designed for allocating patients into two different treatment groups. Categorical prognostic factors were used together with multi-level response variables, and it was demonstrated how simulated data can help determine the power of the minimization technique using ordinal logistic regression models. Results: Several scenarios were simulated in this study. Within the selected scenarios, increasing the sample size significantly increased the power of detecting the treatment effect, whereas decreasing the probability of allocation reduced it. Power did not change when the probability of allocation given balanced treatment groups was increased. The allocation probability P_k was seen to be the only parameter with a significant effect on treatment balance. Conclusion: Maximum power can be achieved with a sample of size 300, although a smaller sample of size 200 can be adequate to attain at least 80% power. To maximize power, the allocation probability should be fixed at 0.75, and set to 0.5 when the treatment groups are equally balanced.
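A minimal sketch of the minimization idea follows, in Python rather than the SAS code used in the study; the two categorical prognostic factors and the 0.75 allocation probability follow the abstract, while the function and example data are assumed for illustration.

```python
from collections import defaultdict
import random

def minimization_assign(patient, counts, factors, p_allocate=0.75):
    """Assign a patient to arm 0 or 1 via Pocock-Simon minimization.

    patient: dict mapping factor name -> level, e.g. {"sex": "F"}.
    counts: counts[arm][factor][level] = patients already assigned.
    The arm minimizing total marginal imbalance is chosen with
    probability p_allocate; exact ties are randomized 50/50.
    """
    imbalance = []
    for arm in (0, 1):
        total = 0
        for f in factors:
            lvl = patient[f]
            n0 = counts[0][f][lvl] + (1 if arm == 0 else 0)
            n1 = counts[1][f][lvl] + (1 if arm == 1 else 0)
            total += abs(n0 - n1)
        imbalance.append(total)
    if imbalance[0] == imbalance[1]:
        arm = random.randrange(2)  # balanced groups: pure randomization
    else:
        best = 0 if imbalance[0] < imbalance[1] else 1
        arm = best if random.random() < p_allocate else 1 - best
    for f in factors:
        counts[arm][f][patient[f]] += 1
    return arm

# Hypothetical usage with two categorical prognostic factors.
factors = ["sex", "age"]
counts = {a: {f: defaultdict(int) for f in factors} for a in (0, 1)}
for patient in [{"sex": "F", "age": "<40"}, {"sex": "M", "age": "40+"},
                {"sex": "F", "age": "40+"}]:
    print(minimization_assign(patient, counts, factors))
```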
613

Multicollinearity in transportation models

Chan, Sheung-Ling January 1970 (has links)
This thesis explores some of the limitations and implications of using multiple regression analysis in transportation models. Specifically, it investigates how the problem of multicollinearity, which results from using intercorrelated variables in trip generation models, adversely affects the validation of hypotheses, the discovery of underlying relationships, and prediction. The research methodology consists, first, of a review of the literature on trip generation analysis and a theoretical exposition on multicollinearity. Second, trip generation data for Greater Vancouver (1968) is used for empirical analysis, employing factor analysis and multiple regression techniques. The results demonstrate that multicollinearity is both an explanatory and a prediction problem which can be overcome by a combined factor-analytic and regression method. This method is also capable of identifying and incorporating causal relationships between land use and trip generation into a single model. It is concluded that the distinction between the explanatory, analytic and predictive abilities of a regression model is artificial, and that greater emphasis on theorizing in model construction is needed. / Applied Science, Faculty of / Community and Regional Planning (SCARP), School of / Graduate
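As an illustration of detecting the problem studied here, the sketch below computes variance inflation factors for intercorrelated trip-generation predictors; the data are made up, not the 1968 Greater Vancouver dataset.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of predictor matrix X.

    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j
    on the remaining predictors plus an intercept. Values well above 10
    are a common rule-of-thumb signal of harmful multicollinearity.
    """
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    out = []
    for j in range(k):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
        out.append(1.0 / (1.0 - r2))
    return out

# Made-up example: household income and car ownership are highly correlated.
rng = np.random.default_rng(0)
income = rng.normal(50, 10, 200)
cars = 0.08 * income + rng.normal(0, 0.2, 200)   # nearly a function of income
hh_size = rng.normal(3, 1, 200)                  # independent predictor
print(vif(np.column_stack([income, cars, hh_size])))
```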
614

POUŽITÍ STATISTICKÝ METOD PŘI OCEŇOVÁNÍ PODNIKU / USE OF STATISTICAL METHODS IN THE BUSINESS VALUATION

Zelenka, Martin January 2010 (has links)
The aim of this paper is to outline the possibilities of applying statistical methods to business valuation. The paper provides a basic overview of the subject, particularly from a mathematical-statistical point of view. The first chapter contains an introduction to the field of business valuation and presents the areas of valuation in which different statistical methods can be used. The following parts of the work describe the methods and briefly characterize the problems involved. The work is mainly focused on the analysis of time series. At the end of the theoretical part, problems in applying regression models to time series analysis are mentioned, together with the difficulties of applying them in practice, and potential solutions to these problems are outlined. The final chapter is devoted to a practical demonstration of the proposed methods on real data for a selected company. The work presents the suitability of statistical methods in business valuation and demonstrates their practical application.
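A small sketch of the kind of time-series regression discussed in the work, fitting a linear trend to annual revenues as one input to a valuation forecast; the figures are hypothetical, not the case-study company's data.

```python
import numpy as np

# Hypothetical annual revenues (in millions) for the valued company.
years = np.arange(2003, 2010)
revenue = np.array([10.2, 11.0, 11.9, 12.4, 13.5, 14.1, 15.0])

# Fit a linear trend revenue ~ b0 + b1 * t by least squares.
b1, b0 = np.polyfit(years, revenue, deg=1)
forecast_2010 = b0 + b1 * 2010
print(f"trend slope: {b1:.2f} per year, 2010 forecast: {forecast_2010:.1f}")
```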
615

On the nature of the stock market : simulations and experiments

Blok, Hendrik J. 11 1900 (has links)
Over the last few years there has been a surge of activity within the physics community in the emerging field of Econophysics—the study of economic systems from a physicist's perspective. Physicists tend to take a different view than economists and other social scientists, being interested in such topics as phase transitions and fluctuations. In this dissertation two simple models of stock exchange are developed and simulated numerically. The first is characterized by centralized trading with a market maker. Fluctuations are driven by a stochastic component in the agents' forecasts. As the scale of the fluctuations is varied a critical phase transition is discovered. Unfortunately, this model is unable to generate realistic market dynamics. The second model discards the requirement of centralized trading. In this case the stochastic driving force is Gaussian-distributed "news events" which are public knowledge. Under variation of the control parameter the model exhibits two phase transitions: both a first- and a second-order (critical). The decentralized model is able to capture many of the interesting properties observed in empirical markets such as fat tails in the distribution of returns, a brief memory in the return series, and long-range correlations in volatility. Significantly, these properties only emerge when the parameters are tuned such that the model spans the critical point. This suggests that real markets may operate at or near a critical point, but is unable to explain why this should be. This remains an interesting open question worth further investigation. One of the main points of the thesis is that these empirical phenomena are not present in the stochastic driving force, but emerge endogenously from interactions between agents. Further, they emerge despite the simplicity of the modeled agents; suggesting complex market dynamics do not arise from the complexity of individual investors but simply from interactions between (even simple) investors. Although the emphasis of this thesis is on the extent to which multi-agent models can produce complex dynamics, some attempt is also made to relate this work with empirical data. Firstly, the trading strategy applied by the agents in the second model is demonstrated to be adequate, if not optimal, and to have some surprising consequences. Secondly, the claim put forth by Sornette et al. that large financial crashes may be heralded by accelerating precursory oscillations is also tested. It is shown that there is weak evidence for the existence of log-periodic precursors but the signal is probably too indistinct to allow for reliable predictions. / Science, Faculty of / Physics and Astronomy, Department of / Graduate
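The stylized facts mentioned, fat tails and persistent volatility, can be checked with simple diagnostics. The sketch below is assumed illustration code, not the dissertation's simulation, and uses a heavy-tailed distribution as a stand-in for model output.

```python
import numpy as np

def excess_kurtosis(r):
    """Excess kurtosis; values > 0 indicate fatter tails than a Gaussian."""
    r = (r - r.mean()) / r.std()
    return np.mean(r ** 4) - 3.0

def autocorr(x, lag):
    """Sample autocorrelation of series x at the given lag."""
    x = x - x.mean()
    return (x[:-lag] @ x[lag:]) / (x @ x)

rng = np.random.default_rng(1)
gaussian = rng.normal(size=10_000)          # benchmark: no fat tails
fat = rng.standard_t(df=3, size=10_000)     # heavy-tailed stand-in for returns

print(excess_kurtosis(gaussian))  # ~ 0
print(excess_kurtosis(fat))       # clearly positive
# For i.i.d. draws this is ~ 0; empirical markets (and the decentralized
# model near its critical point) show slowly decaying positive values.
print(autocorr(np.abs(fat), 1))
```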
616

An exploratory study of student attitudes toward statistics and their retention of statistical concepts

Araki, Linda 01 January 1995 (has links)
The purpose of this exploratory research was to investigate potential factors (i.e., gender, time elapsed between the latest statistics class and completion of the statistics test, grade point average (GPA), and attitude toward statistics) associated with the comprehension and retention of statistical knowledge in baccalaureate psychology students. The criterion variable was statistical competency, which was measured in five subdomains: basic concepts, descriptives, correlation/regression, hypothesis testing, and inferential statistics.
617

Construção dos gráficos de Shewhart e avaliação de sua eficiência no controle de processos de envase / Construction of Shewhart charts and evaluation of their efficiency in the control of filling processes

Cruz, Raul Acedo Pinto Alves da. January 2019 (has links)
Advisor: Marcela Aparecida Guerreiro Machado de Freitas / Committee: Paloma Maria Silva Rocha Rizol / Committee: Fabricio Maciel Gomes / Abstract: Shewhart control charts have long assisted in maintaining the stability of many production processes. Filling processes are among those most dependent on fine adjustment and monitoring, because they run at high speed and contain many critical control points. This work covers the implementation of Shewhart control charts in the liquid-filling equipment of the DPD at FEG-UNESP and the difficulties inherent in adjusting and operating the equipment since its installation. Trials were performed to generate a database for verifying whether special causes were interfering in the process, using a histogram, an Ishikawa cause-and-effect diagram, and data-normality checks such as the Shapiro-Wilk test. The average number of samples until a false alarm (the in-control average run length, ARL0) was also verified, and shifts in the process mean were simulated to check the charts' efficiency. The results show that although special causes initially acted on the process, they were corrected, and after adjustment the equipment is in statistical control. / Master's
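A minimal sketch of the X-bar chart construction described in the abstract, with made-up fill volumes in place of the equipment's actual measurements:

```python
import numpy as np

# Hypothetical fill-volume samples: 20 subgroups of 5 bottles (mL).
rng = np.random.default_rng(7)
samples = rng.normal(loc=500.0, scale=1.2, size=(20, 5))

xbar = samples.mean(axis=1)              # subgroup means
sbar = samples.std(axis=1, ddof=1).mean()
n = samples.shape[1]
c4 = 0.9400                              # bias-correction constant for n = 5
center = xbar.mean()
sigma_hat = sbar / c4
ucl = center + 3 * sigma_hat / np.sqrt(n)
lcl = center - 3 * sigma_hat / np.sqrt(n)

out_of_control = np.where((xbar > ucl) | (xbar < lcl))[0]
print(f"CL={center:.2f}  LCL={lcl:.2f}  UCL={ucl:.2f}")
print("subgroups signalling special causes:", out_of_control)
# With 3-sigma limits, the in-control ARL0 is about 370 samples per false alarm.
```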
618

Variabilidad en el Sistema de Medición de los Contenidos de Cobre y Oro en el Concentrado, entre el almacén de acopio y el punto de despacho y su posible impacto económico - Caso de estudio compañía minera en el norte del Perú / Variability in the Measurement System of the Copper and Gold Contents in the Concentrate, between the storage warehouse and the dispatch point and its possible economic impact - Case study of a mining company in northern Peru

Cadenillas Luna, Hernán Ulises, Manay Llontop, Marco Antonio, Rodríguez Salinas, Walter Felipe 09 May 2020 (has links)
The present research work proposes to determine the acceptable range of variability in the measurement system for the copper and gold contents of the concentrate, between the storage warehouse at the port and the dispatch point at the mine, by carrying out a series of analyses of the variables within the process under study at the levels of sampling, sample preparation, laboratory analysis, and the statistical methods used, in order to determine whether the economic impact on the company is significant. To that end, the economic analysis of the mining sector at the national and international level is described; this is where the importance of the differences in copper and gold content between the dispatch and storage points of a mining unit comes from. In other words, if the acceptable variables or ranges of the copper and gold differences are not identified, the real economic impact on the company cannot be determined. This work used the ANOVA statistical method and hypothesis t-tests, as well as Measurement System Analysis (MSA) using Gage R&R, which allowed us to determine that the measurement systems at both locations (mine and storage warehouse) must be improved through process standardization: sampling procedures, operator training, implementation and calibration of similar equipment, and weekly balances between the two points. This will allow us to reduce the variability between the copper and gold results at both the mine and the storage warehouse. / Research paper
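One comparison described here, testing whether assays differ systematically between the mine dispatch point and the port warehouse, can be sketched as a paired t-test; the grades below are invented, and the study's actual data and full Gage R&R analysis are not reproduced.

```python
import numpy as np
from scipy import stats

# Hypothetical paired copper grades (%) for the same lots assayed
# at the mine dispatch point and at the port storage warehouse.
mine = np.array([24.1, 23.8, 24.5, 23.9, 24.2, 24.0, 24.4, 23.7])
port = np.array([23.9, 23.6, 24.4, 23.6, 24.1, 23.8, 24.1, 23.5])

t_stat, p_value = stats.ttest_rel(mine, port)
print(f"mean difference: {np.mean(mine - port):.3f} %Cu")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value would suggest a systematic bias between the two
# measurement points rather than random sampling variability alone.
```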
619

Machine Learning Methods for Causal Inference with Observational Biomedical Data

Averitt, Amelia Jean January 2020 (has links)
Causal inference -- the process of drawing a conclusion about the impact of an exposure on an outcome -- is foundational to biomedicine, where it is used to guide intervention. The current gold-standard approach for causal inference is randomized experimentation, such as randomized controlled trials (RCTs). Yet, randomized experiments, including RCTs, often enforce strict eligibility criteria that impede the generalizability of causal knowledge to the real world. Observational data, such as the electronic health record (EHR), is often regarded as a more representative source from which to generate causal knowledge. However, observational data is non-randomized, and therefore causal estimates from this source are susceptible to bias from confounders. This weakness complicates two central tasks of causal inference: the replication or evaluation of existing causal knowledge and the generation of new causal knowledge. In this dissertation I (i) address the feasibility of observational data to replicate existing causal knowledge and (ii) present new methods for the generation of causal knowledge with observational data, with a focus on the causal tasks of comparing an outcome between two cohorts and the estimation of attributable risks of exposures in a causal system.
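A minimal sketch of one standard confounder-adjustment technique, inverse probability weighting, on simulated data; this is a generic illustration, not the dissertation's own methods.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000
confounder = rng.normal(size=n)                  # e.g. age, standardized
p_treat = 1 / (1 + np.exp(-confounder))          # confounded treatment assignment
treated = rng.binomial(1, p_treat)
outcome = 1.0 * treated + 2.0 * confounder + rng.normal(size=n)  # true effect = 1

# Propensity scores from a logistic model of treatment on the confounder.
X = confounder.reshape(-1, 1)
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Weighted (Hajek-style) IPW estimate of the average treatment effect.
w1 = treated / ps
w0 = (1 - treated) / (1 - ps)
ate = np.sum(w1 * outcome) / np.sum(w1) - np.sum(w0 * outcome) / np.sum(w0)
naive = outcome[treated == 1].mean() - outcome[treated == 0].mean()
print(f"naive difference: {naive:.2f}")          # biased by confounding
print(f"IPW estimate:     {ate:.2f}  (true effect = 1.0)")
```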
620

An Analysis of the Mathematics Necessary for a Course in Research Statistics for the Behavioral Sciences

Peterson, Daniel Ray 12 1900 (has links)
This study attempted to determine the specific mathematics necessary for a student in a beginning course in behavioral science research statistics. To determine the most desirable form for a review of mathematics prior to a research statistics course, it was first necessary to determine the following: (1) the specific overall content of such a course, (2) the specific mathematics topics of such a course, and (3) the specific mathematics operations utilized in such a course. The study consisted of three parts. The first phase was a determination of the content of a typical beginning course in research statistics for the behavioral sciences. To make this determination, a survey was conducted among forty universities chosen by random sampling from those in the United States offering the Doctor of Education degree. Course outlines and textbooks used by these universities were analyzed, and topics were tabulated. In addition, a selection of recent statistics texts was analyzed, and these topics were also tabulated. These tables were used as a means of content determination.
