11 |
Aggregate models for target acquisition in urban terrain. Mlakar, Joseph A., 06 1900.
Approved for public release, distribution is unlimited. / High-resolution combat simulations that model urban combat currently use computationally expensive algorithms to represent urban target acquisition at the entity level. While this may be suitable for small-scale urban combat scenarios, simulation run time can become unacceptably long for larger scenarios. Consequently, there is a need for models that can lend insight into target acquisition in urban terrain for large-scale scenarios in an acceptable length of time. This research develops urban target acquisition models that can be substituted for existing physics-based or computationally expensive combat simulation algorithms, resulting in faster simulation run time with an acceptable loss of aggregate simulation accuracy. Specifically, this research explores (1) the adaptability of probability-of-line-of-sight estimates to urban terrain; (2) how cumulative distribution functions can be used to model the outcomes when a set of sensors is employed against a set of targets; (3) the use of Markov chains and event graphs to model the transition of a target among acquisition states; and (4) how a system of differential equations may be used to model the aggregate flow of targets from one acquisition state to another. / Captain, United States Marine Corps
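The fourth approach can be sketched compactly. Below is a minimal, hypothetical Python example of a system of differential equations moving an aggregate target population among acquisition states; the states and transition rates are invented for illustration and are not taken from the thesis.

```python
# Hypothetical aggregate acquisition-state flow model. States and
# rates are illustrative assumptions, not values from the thesis.
import numpy as np
from scipy.integrate import solve_ivp

# States: U = undetected, D = detected, I = identified targets.
# Constant transition rates (per target per minute).
r_ud = 0.30  # undetected -> detected
r_di = 0.10  # detected   -> identified
r_du = 0.05  # detected   -> undetected (lost to urban occlusion)

def flow(t, y):
    u, d, i = y
    du = -r_ud * u + r_du * d          # net change in undetected
    dd = r_ud * u - (r_di + r_du) * d  # net change in detected
    di = r_di * d                      # identified is absorbing
    return [du, dd, di]

# Start with 100 undetected targets; integrate over 30 minutes.
sol = solve_ivp(flow, (0.0, 30.0), [100.0, 0.0, 0.0])
u, d, i = sol.y[:, -1]
print(f"After 30 min: {u:.1f} undetected, {d:.1f} detected, {i:.1f} identified")
```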
|
12 |
Application of statistical analysis to the σ-N curves of the structural aluminum alloy 7050-T7451. Canto, Osmari Ademir Hoffmann do, 28 August 2013.
Given the need for ever more reliable results regarding the fatigue life of the structural aluminum alloy 7050-T7451, this work focuses on a statistical analysis of fatigue data collected from axial fatigue tests on specimens of this alloy. The data collected from such tests can be treated with several types of distribution function. Each function has particular characteristics directly related to the nature of the data, so different functions may give different results when applied to the same data set, and it is not always clear which distribution function gives the most reliable result for the data analyzed. For the statistical treatment, the Minitab Release 13.0 package was used, and four distribution functions were analyzed: Weibull, base-e lognormal, loglogistic, and normal. The function that best represented the data was chosen by two methods: a graphical method, which gave a qualitative view of how the fatigue data are distributed, and an analytical method, which allowed a quantitative analysis of the results. To identify the function with the best results, the most relevant statistics were compared, chiefly the Anderson-Darling statistic, the goodness-of-fit test used in this work. In addition, comparisons were made of the errors in the 50% failure-probability estimates and of the widths of the confidence intervals. The σ-N curves were plotted from the best values obtained at each stress level. The results of the statistical treatment of all fatigue-life values showed that at the highest stress levels (low-cycle fatigue) any of the four functions fits the fatigue data well, with small errors, as expected. At intermediate stress levels, where more substantial scatter begins to appear, the four functions still fit well but show more significant errors. For the high-cycle fatigue data sets, at the lowest stress levels, the functions fit the data poorly, producing very high Anderson-Darling values; in these cases the loglogistic function proved the most effective, giving good results even in data sets containing censored values (specimens that did not fail). It can thus be stated that fatigue curves obtained by this method provide more reliable results, giving greater confidence in fatigue-life predictions for components built from aluminum alloy 7050-T7451.
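As a rough illustration of the kind of comparison this abstract describes, the following sketch fits the same four families (Weibull, lognormal, loglogistic, normal) to a set of fatigue-life values and ranks them by the Anderson-Darling statistic. It uses Python/scipy rather than Minitab, the sample data are synthetic, and censored observations are not handled; it is a sketch of the method, not the author's analysis.

```python
# Fit four candidate distributions to fatigue-life data and compare
# Anderson-Darling statistics (smaller = better fit). Sample data are
# synthetic; censored observations are not handled in this sketch.
import numpy as np
from scipy import stats

lives = np.array([1.2e5, 1.5e5, 1.8e5, 2.3e5, 2.9e5,
                  3.4e5, 4.1e5, 5.5e5, 7.2e5, 9.8e5])  # cycles to failure

def anderson_darling(data, cdf):
    """A^2 statistic for a fully specified CDF."""
    x = np.sort(data)
    n = len(x)
    f = np.clip(cdf(x), 1e-12, 1 - 1e-12)  # guard against log(0)
    i = np.arange(1, n + 1)
    return -n - np.mean((2 * i - 1) * (np.log(f) + np.log(1 - f[::-1])))

candidates = {
    "Weibull":     stats.weibull_min,
    "lognormal":   stats.lognorm,
    "loglogistic": stats.fisk,   # scipy's name for the log-logistic
    "normal":      stats.norm,
}

for name, dist in candidates.items():
    params = dist.fit(lives)  # maximum likelihood fit
    a2 = anderson_darling(lives, lambda x: dist.cdf(x, *params))
    print(f"{name:12s} A^2 = {a2:.3f}")
```

Note that with estimated parameters the Anderson-Darling statistic no longer follows its tabulated distribution, so the values here serve only to rank candidate fits, which matches how the abstract uses them.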
|
13 |
Measurements of the differential cross section and charge asymmetry for inclusive pp→W(μν) production with 8 TeV CMS data and CMS single muon trigger efficiency study. Ogul, Hasan, 01 May 2016.
This dissertation presents measurements of the muon charge asymmetry, the fiducial differential cross section, and the CMS single-muon trigger efficiency as functions of muon pseudorapidity for inclusive W→μν events produced in proton-proton collisions at the LHC. The data were recorded by the CMS detector at a center-of-mass energy of 8 TeV and correspond to an integrated luminosity of 18.8 fb⁻¹. Several comparisons are performed to cross-check the experimental results. Muon efficiency measurements are compared to values estimated from Monte Carlo simulations and to reference values recommended by the CMS physics object groups. The differential cross section and charge asymmetry measurements are compared to theoretical predictions based on next-to-leading-order and next-to-next-to-leading-order QCD calculations with different PDF models. Input from the charge asymmetry and differential cross section measurements to the determination of the next generation of PDF sets is expected to bring different predictions closer together and to help reduce PDF uncertainties. The impact of the charge asymmetry on PDFs has been investigated by including the asymmetry results in a QCD analysis at next-to-leading order and next-to-next-to-leading order together with inclusive deep-inelastic scattering data from HERA. A significant improvement in the accuracy of the valence-quark distributions is observed. This measurement is recommended for more accurate constraints in future PDF determinations; more precise PDFs will improve LHC predictions.
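For reference, the muon charge asymmetry being measured is A(η) = (dσ(W⁺)/dη - dσ(W⁻)/dη) / (dσ(W⁺)/dη + dσ(W⁻)/dη). Below is a minimal sketch of computing it per pseudorapidity bin from event counts, with simple Poisson error propagation; the counts are invented, and the real analysis also involves efficiency corrections and systematic uncertainties.

```python
# Muon charge asymmetry per pseudorapidity bin from W+ and W- yields.
# Counts are invented; the real measurement uses efficiency-corrected
# yields and includes systematic uncertainties.
import numpy as np

n_plus  = np.array([52100., 49800., 47200., 43100.])  # W+ -> mu+ nu counts
n_minus = np.array([38400., 37900., 36800., 35500.])  # W- -> mu- nu counts

asym = (n_plus - n_minus) / (n_plus + n_minus)

# Poisson error propagation for A = (a - b) / (a + b):
# sigma_A = 2 * sqrt(a * b) / (a + b)^(3/2)
sigma = 2.0 * np.sqrt(n_plus * n_minus) / (n_plus + n_minus) ** 1.5

for k, (a, s) in enumerate(zip(asym, sigma)):
    print(f"|eta| bin {k}: A = {a:.4f} +/- {s:.4f}")
```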
|
14 |
Estimation of Pareto distribution functions from samples contaminated by measurement errors. Lwando Orbet Kondlo, January 2010.
<p>The intention is to draw more specific connections between certain deconvolution methods and also to demonstrate the application of the statistical theory of estimation in the presence of measurement error. A parametric methodology for deconvolution when the underlying distribution is of the Pareto form is developed. Maximum likelihood estimation (MLE) of the parameters of the convolved distributions is considered. Standard errors of the estimated parameters are calculated from the inverse Fisher&rsquo / s information matrix and a jackknife method. Probability-probability (P-P) plots and Kolmogorov-Smirnov (K-S) goodnessof- fit tests are used to evaluate the fit of the posited distribution. A bootstrapping method is used to calculate the critical values of the K-S test statistic, which are not available.</p>
|
16 |
Estimation of Pareto distribution functions from samples contaminated by measurement errors. Kondlo, Lwando Orbet, January 2010.
Magister Scientiae - MSc / The intention is to draw more specific connections between certain deconvolution methods and also to demonstrate the application of the statistical theory of estimation in the presence of measurement error. A parametric methodology for deconvolution when the underlying distribution is of the Pareto form is developed. Maximum likelihood estimation (MLE) of the parameters of the convolved distributions is considered. Standard errors of the estimated parameters are calculated from the inverse Fisher's information matrix and a jackknife method. Probability-probability (P-P) plots and Kolmogorov-Smirnov (K-S) goodness-of-fit tests are used to evaluate the fit of the posited distribution. A bootstrapping method is used to calculate the critical values of the K-S test statistic, which are not available. / South Africa
|
17 |
Pion-induced polarized Drell-Yan process at COMPASS. Pešek, Michael, January 2020.
In this work we present the basic theoretical concepts used to describe the nucleon spin structure. The theoretical background of two processes of interest, semi-inclusive DIS and Drell-Yan, is presented in terms of transverse-momentum-dependent parton distribution functions. The COMPASS experiment, and particularly its unique polarised target, is described in detail, and several target-related measurements are presented. The express analysis and the detector-efficiency analysis are presented as examples of important hardware-related analyses. Finally, two measurements of transverse spin asymmetries are presented: the first in J/ψ production in semi-inclusive DIS on polarised protons, the second in J/ψ production in the π⁻p polarised Drell-Yan data.
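As a hedged illustration of how a raw transverse spin asymmetry is turned into a physics asymmetry, the sketch below divides the raw count asymmetry by an assumed dilution factor f and target polarization P_T; all numbers are invented, and the actual COMPASS analyses are considerably more involved.

```python
# Raw transverse spin asymmetry from up/down target-spin event counts,
# corrected by an assumed dilution factor f and target polarization P_T.
# All numbers are illustrative only.
import math

n_up, n_down = 12840, 12310  # J/psi candidates, target spin up / spin down
f, p_t = 0.18, 0.73          # assumed dilution factor and polarization

a_raw = (n_up - n_down) / (n_up + n_down)
sigma_raw = 2 * math.sqrt(n_up * n_down) / (n_up + n_down) ** 1.5

a_t = a_raw / (f * p_t)      # physics asymmetry
sigma_t = sigma_raw / (f * p_t)

print(f"A_T = {a_t:.4f} +/- {sigma_t:.4f} (stat)")
```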
|
18 |
Estimation of Pareto Distribution Functions from Samples Contaminated by Measurement Errors. Kondlo, Lwando Orbet, January 2010.
>Magister Scientiae - MSc / Estimation of population distributions, from samples that are contaminated
by measurement errors, is a common problem. This study considers the problem
of estimating the population distribution of independent random variables
Xi, from error-contaminated samples ~i (.j = 1, ... , n) such that Yi = Xi + f·.i,
where E is the measurement error, which is assumed independent of X. The
measurement error ( is also assumed to be normally distributed. Since the
observed distribution function is a convolution of the error distribution with
the true underlying distribution, estimation of the latter is often referred to
as a deconvolution problem. A thorough study of the relevant deconvolution
literature in statistics is reported.
We also deal with the specific case when X is assumed to follow a truncated Pareto form. If observations are subject to Gaussian errors, then the observed Y is distributed as the convolution of the finite-support Pareto and Gaussian error distributions. The convolved probability density function (PDF) and cumulative distribution function (CDF) of the finite-support Pareto and Gaussian distributions are derived.
The intention is to draw more specific connections between certain deconvolution methods and also to demonstrate the application of the statistical theory of estimation in the presence of measurement error. A parametric methodology for deconvolution when the underlying distribution is of the Pareto form is developed.
Maximum likelihood estimation (MLE) of the parameters of the convolved distributions is considered. Standard errors of the estimated parameters are calculated from the inverse Fisher's information matrix and a jackknife method. Probability-probability (P-P) plots and Kolmogorov-Smirnov (K-S) goodness-of-fit tests are used to evaluate the fit of the posited distribution. A bootstrapping method is used to calculate the critical values of the K-S test statistic, which are not otherwise available. Simulated data are used to validate the methodology. A real-life application of the methodology is illustrated by fitting convolved distributions to astronomical data.
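A numerical sketch of the convolved model: the density of Y = X + ε is f_Y(y) = ∫ f_X(x) φ_σ(y - x) dx over the Pareto support. The code below integrates this numerically and runs a crude maximum likelihood fit; the truncated-Pareto parameters, noise scale, and starting values are invented, and the thesis works with closed-form expressions rather than numerical quadrature.

```python
# Numerical PDF of Y = X + eps, with X ~ truncated Pareto on [L, U]
# and eps ~ N(0, sigma^2), plus a crude MLE. L is fixed at 1 for
# simplicity; the thesis derives this convolution in closed form.
import numpy as np
from scipy import stats, integrate, optimize

def trunc_pareto_pdf(x, a, L, U):
    """Pareto(a) density truncated to [L, U]."""
    norm = 1.0 - (L / U) ** a
    pdf = (a * L**a / x ** (a + 1)) / norm
    return np.where((x >= L) & (x <= U), pdf, 0.0)

def convolved_pdf(y, a, L, U, sigma):
    """f_Y(y) = integral of f_X(x) * N(y - x; 0, sigma^2) dx."""
    f = lambda x: trunc_pareto_pdf(x, a, L, U) * stats.norm.pdf(y - x, scale=sigma)
    val, _ = integrate.quad(f, L, U)
    return val

def neg_log_lik(theta, y):
    a, U, sigma = theta
    if a <= 0 or U <= 1.0 or sigma <= 0:
        return np.inf
    return -sum(np.log(max(convolved_pdf(yi, a, 1.0, U, sigma), 1e-300))
                for yi in y)

# Synthetic data: truncated Pareto (a=2, L=1, U=10) plus N(0, 0.3) noise,
# sampled by inverting the truncated-Pareto CDF.
rng = np.random.default_rng(1)
u = rng.uniform(size=300)
x = (1.0 - u * (1.0 - 10.0 ** -2.0)) ** (-1.0 / 2.0)
y = x + rng.normal(scale=0.3, size=x.size)

res = optimize.minimize(neg_log_lik, x0=[1.5, 8.0, 0.5], args=(y,),
                        method="Nelder-Mead")
print("MLE (a, U, sigma):", res.x)
```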
|
19 |
Improving Project Management With Simulation And Completion Distribution Functions. Cates, Grant, 01 January 2004.
Despite the critical importance of project completion timeliness, management practices in place today remain inadequate for addressing the persistent problem of project completion tardiness. Uncertainty has been identified as a contributing factor in late projects. This uncertainty resides in activity duration estimates, unplanned upsetting events, and the potential unavailability of critical resources. This research developed a comprehensive simulation-based methodology for conducting quantitative project completion-time risk assessments. The methodology enables project stakeholders to visualize uncertainty or risk, i.e., the likelihood of their project completing late and the magnitude of the lateness, by providing them with a completion-time distribution function for their project. Discrete event simulation is used to determine a project's completion distribution function. The project simulation is populated with both deterministic and stochastic elements. Deterministic inputs include planned activities and resource requirements. Stochastic inputs include activity duration growth distributions, probabilities for unplanned upsetting events, and other dynamic constraints upon project activities; they are based upon past data from similar projects. The time for an entity to complete the simulation network, subject to both the deterministic and stochastic factors, represents the time to complete the project. Multiple replications of the simulation are run to create the completion distribution function. The methodology was demonstrated to be effective for the ongoing project to assemble the International Space Station. Approximately $500 million per month is being spent on this project, which is scheduled to complete by 2010. Project stakeholders participated in determining and managing completion distribution functions. The first result was improved awareness of project completion risk; second, mitigation options were analyzed to improve project completion performance and reduce total project cost.
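A toy version of the methodology: sample stochastic activity durations through a small precedence network, inject an occasional upsetting event, and read the completion-time distribution off many replications. The network, duration-growth distributions, and probabilities below are invented for illustration.

```python
# Monte Carlo sketch of a project completion-time distribution:
# activity A, then B and C in parallel, then D after both.
# Lognormal duration growth plus a chance of an upsetting event.
# All numbers are illustrative, not from the dissertation.
import numpy as np

rng = np.random.default_rng(7)
N = 10_000
completions = np.empty(N)

def duration(planned, growth_sigma):
    """Planned duration times a lognormal growth factor."""
    return planned * rng.lognormal(mean=0.0, sigma=growth_sigma)

for k in range(N):
    a = duration(10, 0.20)
    b = duration(15, 0.30)
    c = duration(12, 0.25)
    d = duration(8, 0.20)
    if rng.random() < 0.10:        # 10% chance of an upsetting event
        d += rng.exponential(5.0)  # extra delay on the final activity
    completions[k] = a + max(b, c) + d  # critical path through the network

target = 50.0  # promised completion day
print(f"P(late)  = {np.mean(completions > target):.2%}")
print(f"median   = {np.median(completions):.1f} days")
print(f"95th pct = {np.percentile(completions, 95):.1f} days")
```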
|
20 |
Application Of Statistical Methods In Risk And ReliabilityHeard, Astrid 01 January 2005 (has links)
The dissertation considers the construction of confidence intervals for a cumulative distribution function F(z) and its inverse at fixed points z and u, on the basis of a relatively small i.i.d. sample X_1, ..., X_n. The sample is modeled as coming from the flexible generalized gamma distribution with all three parameters unknown. This approach can be viewed as an alternative to nonparametric techniques, which do not specify the distribution of X and lead to less efficient procedures. The confidence intervals are constructed by objective Bayesian methods using the Jeffreys noninformative prior. Performance of the resulting confidence intervals is studied via Monte Carlo simulations and compared to the performance of nonparametric confidence intervals based on the binomial proportion. In addition, techniques for change-point detection are analyzed and further evaluated via Monte Carlo simulations. The effect of a change point on the interval estimators is studied both analytically and via Monte Carlo simulations.
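The nonparametric baseline mentioned above is easy to sketch: for a fixed z, the number of observations at or below z is Binomial(n, F(z)), so a Clopper-Pearson interval for the binomial proportion is a confidence interval for F(z). A minimal sketch with synthetic generalized gamma data:

```python
# Nonparametric CI for F(z) at a fixed z via the Clopper-Pearson
# interval for a binomial proportion. Data are synthetic; the
# dissertation compares such intervals with objective-Bayes ones
# under a generalized gamma model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = stats.gengamma.rvs(a=2.0, c=1.5, size=25, random_state=rng)  # small sample
z = 1.0

k = int(np.sum(x <= z))  # successes: observations at or below z
n = len(x)

alpha = 0.05
lo = stats.beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
hi = stats.beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0

print(f"F_hat(z) = {k / n:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```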
|