41

Jackknife Empirical Likelihood for the Accelerated Failure Time Model with Censored Data

Bouadoumou, Maxime K 15 July 2011 (has links)
Kendall and Gehan estimating functions are used to estimate the regression parameter in the accelerated failure time (AFT) model with censored observations. The AFT model is a preferred survival analysis method because it maintains a consistent association between the covariate and the survival time. The jackknife empirical likelihood method is used because it overcomes a computational difficulty by circumventing the construction of the nonlinear constraint: it turns the statistic of interest into a sample mean based on jackknife pseudo-values. A U-statistic approach is used to construct confidence intervals for the regression parameter. We conduct a simulation study to compare the Wald-type procedure, the empirical likelihood, and the jackknife empirical likelihood in terms of coverage probability and average length of confidence intervals. The jackknife empirical likelihood method performs best and overcomes the under-coverage problem of the Wald-type method. A real data set is also used to illustrate the proposed methods.
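The jackknife pseudo-value construction at the heart of the method can be sketched in a few lines. This is a generic illustration under simplifying assumptions: the statistic, data and function names below are placeholders, not the Kendall or Gehan estimating functions from the thesis.

```python
import numpy as np

def jackknife_pseudovalues(x, stat):
    """Pseudo-value i: V_i = n*T(x) - (n-1)*T(x with observation i removed)."""
    n = len(x)
    t_full = stat(x)
    return np.array([n * t_full - (n - 1) * stat(np.delete(x, i))
                     for i in range(n)])

rng = np.random.default_rng(0)
x = rng.exponential(size=50)
pv = jackknife_pseudovalues(x, np.mean)
# For a linear statistic such as the mean, the pseudo-values reduce to the
# observations themselves, so their average recovers the original estimate.
# Empirical likelihood is then applied to this sample mean of pseudo-values.
print(np.allclose(pv.mean(), x.mean()))  # True
```

For nonlinear statistics (such as the estimating functions above) the pseudo-values are no longer the raw observations, which is exactly what makes the reduction to a sample mean useful.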
42

Statistical inference with randomized nomination sampling

Nourmohammadi, Mohammad 08 1900 (has links)
In this dissertation, we develop several new inference procedures based on randomized nomination sampling (RNS). The first problem we consider is that of constructing distribution-free confidence intervals for quantiles in finite populations. The required algorithms for computing coverage probabilities of the proposed confidence intervals are presented. The second problem we address is that of constructing nonparametric confidence intervals for infinite populations. We describe the procedures for constructing confidence intervals and compare the constructed intervals in the RNS setting, under both perfect and imperfect ranking scenarios, with their simple random sampling (SRS) counterparts. Recommendations for choosing the design parameters are made to achieve confidence intervals shorter than their SRS counterparts. The third problem we investigate is the construction of tolerance intervals using the RNS technique. We describe the procedures for constructing one- and two-sided RNS tolerance intervals and investigate the sample sizes required to achieve tolerance intervals that contain the specified proportions of the underlying population. We also investigate the efficiency of RNS-based tolerance intervals compared with their corresponding intervals based on SRS. A new method for estimating ranking error probabilities is proposed. The final problem we consider is that of parametric inference based on RNS. We introduce the different data types associated with the different situations that one might encounter using the RNS design and provide the maximum likelihood (ML) and method of moments (MM) estimators of the parameters in two classes of distributions: proportional hazard rate (PHR) and proportional reverse hazard rate (PRHR) models.
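The classical distribution-free interval that underlies such constructions pairs two order statistics whose binomial probability brackets the target quantile. A minimal sketch under SRS (illustrative only; the thesis works with the more involved RNS design, and all names and data here are placeholders):

```python
import numpy as np
from scipy.stats import binom

def quantile_ci(x, p=0.5, conf=0.95):
    """Distribution-free CI for the p-th quantile from order statistics:
    P(X_(l) <= q_p <= X_(u)) = F_bin(u-1; n, p) - F_bin(l-1; n, p)."""
    x = np.sort(np.asarray(x))
    n = len(x)
    alpha = 1 - conf
    l = max(int(binom.ppf(alpha / 2, n, p)), 1)          # 1-based lower index
    u = min(int(binom.ppf(1 - alpha / 2, n, p)) + 1, n)  # 1-based upper index
    coverage = binom.cdf(u - 1, n, p) - binom.cdf(l - 1, n, p)
    return x[l - 1], x[u - 1], coverage

rng = np.random.default_rng(0)
sample = rng.normal(size=200)
lo, hi, cov = quantile_ci(sample, p=0.5, conf=0.95)
# for moderate n and non-extreme p, the achieved coverage is at least
# the nominal level by construction of the binomial bounds
```

Under RNS the binomial coverage probabilities above are replaced by design-specific ones, which is what the dissertation's algorithms compute.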
43

Promoted ignition testing: an investigation of sample geometry and data analysis techniques

Suvorovs, Terese January 2007 (has links)
Metallic materials and oxygen can be a volatile combination when an ignition mechanism is present. Once ignited, metallic materials can readily burn in high-pressure oxygen atmospheres, releasing an enormous amount of energy, potentially destroying equipment and space missions and resulting in loss of life. The potential losses associated with these fires led to research into the conditions under which metal fires propagate. Several organisations, including the American Society for Testing and Materials (ASTM) and the International Organisation for Standardisation (ISO), have published recommended standard test practices with which to assess the relative flammability of metallic materials. These promoted ignition tests, so called because samples are ignited with an overwhelming source of energy, are typically used to examine two important parameters as indicators of a metallic material's flammability: the Threshold Pressure (TP) and the Regression Rate of the Melting Interface (RRMI). A material's TP is the minimum pressure at which it burns; the TPs of different materials can therefore be compared to assess which materials are best suited to a range of high-pressure applications. The RRMI is a useful measure for ranking materials, particularly if they have the same TP, but can be used as a ranking method irrespective of TP. In addition, it is a crucial parameter for understanding the complex burning process and is one of the few experimental parameters that can be measured. Promoted ignition test standards specify a standard sample geometry, typically a 3.2 mm diameter cylindrical rod. The recent addition of a 3.2 × 3.2 mm square rod as an optional standard sample geometry raises the issue of how the geometry of a sample affects its flammability.
Promoted ignition test results for standard geometries are often applied to assess the flammability risk for the complex geometries of real components within oxygen systems, including regulators, valves, piping etc. Literature shows that sample geometry has a significant effect on material rankings when rankings are based on testing of standard geometries, for example, cylindrical rods, compared to non-standard geometries, for example, sintered filters and meshes. In addition, the RRMI has been shown to be dependent on a sample's cross-sectional area (XA). However, it remains unclear, from a simple heat transfer analysis, why the RRMI is dependent on XA or how the shape of a sample affects its melting rate. These questions are particularly relevant since understanding how sample geometry affects burning contributes to two important research goals: to be able to accurately model and predict the flammability risk of a metallic component without the need for physical testing, and to understand the effects of different sample geometries on their relative flammabilities within the standard tests used. Promoted ignition tests were conducted on iron rods with cylindrical, rectangular and triangular cross sections for a range of XAs. Their RRMIs were measured and analysed using a statistical approach which allowed differences in RRMI to be quantitatively assessed. Statistically significant differences in RRMI were measured for rods with the same XA but of different shape. Furthermore, the magnitude of the difference was dependent on XA. Triangular rods had the fastest RRMIs, followed by rectangular rods and then cylindrical rods. Differences in RRMI based on rod shape are due to heat transfer effects and the dynamic motion of the attached molten mass during the drop cycle. The corners of the rectangular and triangular rods melt faster due to their locally higher Surface Area to Volume ratio (SA/V). 
This dynamic effect increases the area of contact between the molten mass and the solid rod (the solid-liquid interface, SLI), which facilitates increased heat transfer to the rod and results in a faster RRMI. This finding highlights the importance of the SLI in the heat transfer process. Although the SLI is largely dependent on the XA, the shape of the rod causes subtle changes to the size of the SLI and thus affects heat transfer, burning and the observed RRMI. The relationship between rod diameter, test pressure and the Extent of Reaction (ER), the proportion of metal that reacts (oxidises) whilst attached to the burning rod, was also investigated. During promoted ignition testing of iron rods of varying diameter, the detached drops were rapidly quenched by immersion in a water bath. Microanalysis techniques were used to qualitatively assess the ER as a function of pressure and rod diameter. It was found that pressure dramatically affects the ER. High-pressure tests resulted in a slag mass consisting of oxide with no unreacted iron, whereas low-pressure tests resulted in a significant fraction of unreacted iron within the slag. This indicates that the ER contributes directly to the observed increase in RRMI with increasing test pressure. At high pressures the ER is not affected by rod diameter, since all available liquid metal reacts; at low pressures the ER is a function of rod diameter, decreasing as XA increases. This thesis also investigates the analysis of promoted ignition test data through suitable statistical methods. Logistic regression is identified as an appropriate method for modelling binary burn/no-burn test data. The relationship between the reaction probability, defined as the probability that a sample will undergo sustained burning, and pressure is evaluated for two different data sets. The fits of the logistic regression models are assessed and found to model the available data well.
The logistic regression method is contrasted with the confidence levels associated with binary data based on the Bernoulli distribution. It is concluded that a modelling approach is beneficial in providing an overall understanding of the transition between pressures where no burning occurs and pressures where burning is expected.
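A minimal version of the burn/no-burn modelling described above can be sketched with a hand-rolled logistic fit. The pressures and outcomes below are invented for illustration, not test data from the thesis:

```python
import numpy as np

def logistic_fit(pressure, burned, iters=30):
    """Fit P(burn) = 1 / (1 + exp(-(b0 + b1*pressure))) by Newton-Raphson."""
    X = np.column_stack([np.ones(len(pressure)), pressure]).astype(float)
    beta = np.zeros(2)
    for _ in range(iters):
        mu = 1.0 / (1.0 + np.exp(-(X @ beta)))  # fitted burn probabilities
        W = mu * (1.0 - mu)                     # IRLS weights
        beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (burned - mu))
    return beta

# hypothetical outcomes (1 = sustained burning) at increasing oxygen pressures
pressure = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0])
burned   = np.array([0,   0,   0,   1,   0,   1,   1,   1,   1,   1])
b0, b1 = logistic_fit(pressure, burned)
reaction_prob = lambda p: 1.0 / (1.0 + np.exp(-(b0 + b1 * p)))
```

The fitted curve gives the reaction probability as a smooth function of pressure, which is what allows the transition region between no-burn and burn pressures to be characterised rather than reported as a single threshold.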
44

Stress, uncertainty and multimodality of risk measures

Li, Kehan 06 June 2017 (has links)
In this thesis, we focus on the stress, uncertainty and multimodality of risk measures, with special attention to two parts. The results have a direct influence on the computation of banks' economic and regulatory capital. First, we provide a novel risk measure, the Spectrum Stress VaR (SSVaR), to quantify and integrate the uncertainty of the Value-at-Risk. It is an implementation model of the stressed VaR proposed in Basel III. The SSVaR is based on the confidence interval of the VaR. We investigate the asymptotic distribution of the order statistic, which is a nonparametric estimator of the VaR, in order to build the confidence interval. Two confidence intervals are derived, from either the asymptotic Gaussian result or the saddlepoint approach. We compare them with the bootstrap confidence interval by simulation, showing that the confidence interval built from the saddlepoint approach is robust across sample sizes, underlying distributions and confidence levels. 
Stress-testing applications using the SSVaR are performed with historical stock index returns during a financial crisis, to identify potential violations of the VaR during turmoil periods in financial markets. Second, we investigate the impact of multimodality of distributions on VaR and ES calculations. Unimodal probability distributions have been widely used for parametric VaR computation by investors, risk managers and regulators. However, financial data may be characterized by distributions having more than one mode. For such data, we show that multimodal distributions may outperform unimodal distributions in the sense of goodness of fit. Two classes of multimodal distributions are considered: Cobb's family and the Distortion family. We develop an adapted rejection-sampling algorithm that generates random samples efficiently from the probability density function of Cobb's family. For the empirical study, two data sets are considered: a daily data set concerning operational risk, and a three-month scenario of market portfolio returns built from five-minute intraday data. Across a complete spectrum of confidence levels, the VaR and the ES are calculated for both unimodal and multimodal distributions. We analyze the results to assess the benefit of using a multimodal distribution instead of a unimodal one in practice.
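The nonparametric backbone of the SSVaR, an order-statistic estimate of the VaR together with a confidence interval for it, can be sketched as follows. This uses a bootstrap interval (one of the comparison methods in the thesis) on simulated heavy-tailed returns; names and data are placeholders:

```python
import numpy as np

def var_order_stat(returns, alpha=0.99):
    """Nonparametric VaR: the order statistic at level alpha of the losses."""
    losses = np.sort(-np.asarray(returns))  # losses are negative returns
    k = int(np.ceil(alpha * len(losses))) - 1
    return losses[k]

def bootstrap_ci(returns, alpha=0.99, conf=0.95, B=2000, seed=1):
    """Percentile-bootstrap confidence interval for the VaR estimator."""
    rng = np.random.default_rng(seed)
    n = len(returns)
    stats = np.array([var_order_stat(rng.choice(returns, n, replace=True), alpha)
                      for _ in range(B)])
    return tuple(np.quantile(stats, [(1 - conf) / 2, 1 - (1 - conf) / 2]))

rng = np.random.default_rng(0)
r = rng.standard_t(df=4, size=1000) * 0.01   # heavy-tailed daily returns
v = var_order_stat(r)
lo, hi = bootstrap_ci(r)
```

The thesis replaces the bootstrap with asymptotic Gaussian and saddlepoint intervals for the same order statistic; the SSVaR then integrates the whole interval rather than the point estimate alone.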
45

Statistical analysis software for the TRS-80 microcomputer

Isbell, Robert Paul 09 1900 (has links)
Approved for public release; distribution is unlimited. / This paper documents the development of a statistical analysis package for the TRS-80 microcoraputer. The package is comprised of six interactive programs which are generally divided into topical areas. The major emphasis is on exploratory data analysis and statistical inference, however, probability and inverse probability distributions are also included. The programming language is TRS-80 Level II BASIC enhanced by the input/output commands available through the ESF-80 (Exatron Stringy Floppy) mass storage subsystem. With the modification of these few commands, the package is compatible with most floppy disk operating systems designed for the TRS-80 Model I or Model III microcomputers. This statistical analysis capability implemented on a relatively inexpensive system provides a useful tool to the student or the trained analyst without ready access to a mainframe computer system. / Major, United States Marine Corps
46

Analysis of household consumption in the EU

Kolman, Martin January 2014 (has links)
The goal of this thesis is to analyze the evolution of household consumption in the EU member states. Consumption is examined through the COICOP classification (the classification of individual consumption by purpose). After mapping this evolution, future values are forecast from the known time series in two different ways: the first respects the composition of household consumption across the COICOP sections, while the second works only with the time series of average consumption over all sections together. To compare the states, a cluster analysis is carried out, again in two ways: the first analyzes the current situation and the second the evolution of household consumption. STATGRAPHICS X64 CENTURION and SPSS, rather than Microsoft Excel, are used in this thesis. The main contribution of the thesis is the household consumption prognosis, produced for all COICOP sections. The analysis shows that consumption should rise in the future, with a few exceptions, mainly countries in a poor economic situation such as Greece.
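The per-section forecasting step can be illustrated with a simple linear-trend extrapolation on a made-up consumption series. This is a sketch only; the thesis itself uses richer time-series models fitted in STATGRAPHICS, and the index values below are invented:

```python
import numpy as np

# hypothetical yearly household-consumption index for one COICOP section
years = np.arange(2000, 2014)
rng = np.random.default_rng(42)
index = 100 + 1.8 * (years - 2000) + rng.normal(scale=1.5, size=years.size)

# fit a linear trend and extrapolate one year ahead
slope, intercept = np.polyfit(years, index, deg=1)
forecast_2014 = slope * 2014 + intercept
```

Applying this per COICOP section corresponds to the first forecasting approach in the thesis; the second approach would fit a single such model to the average series over all sections.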
47

Subjective quality evaluation of video sequences

Krmela, Tomáš January 2012 (has links)
This master's thesis focuses on the comparison of subjective assessments of the quality of video sequences. Data obtained by hardware and software techniques are compared. The introduction describes video compression methods. The main part of the thesis explores different methods of subjective assessment of the quality of video sequences. Finally, the results obtained from the different methods are evaluated and discussed.
48

Association between household socioeconomic level and consumption of fast food and soft drinks: A cross-sectional assessment of the Young Lives cohort in Peru

Najar, Carol Argelia, Vila-Quispe, Jessi Nataly, Astete-Robilliard, Laura, Bernabe-Ortiz, Antonio 01 January 2020 (has links)
Introduction: The consumption of fast food and soft drinks is a risk factor for developing overweight and obesity. This study aimed to assess whether there is an association between household socioeconomic level and the consumption of fast food and soft drinks among children. Material and Methods: A cross-sectional assessment of data from the third round (2009-2010) of the youngest cohort of the Young Lives study in Peru was conducted. Sampling was conducted in three stages: in the first, the country was divided into equal geographical regions, excluding the richest 5% of districts; in the second, 20 sentinel sites were chosen and an area within each sentinel site was selected; in the third, eligible children were selected. Outcomes were the self-reported consumption of fast food and soft drinks (never, sometimes, and always), whereas the exposure was household socioeconomic status (in quintiles). Crude and adjusted models were built between the variables of interest using Poisson regression models with robust variance, to report prevalence ratios (PR) and 95% confidence intervals (95% CI). Results: Data from 1901 children, of whom 942 (49.6%) were girls, with a mean age of 7.5 years (SD: 0.5), were analyzed. A total of 24.1% (95% CI: 22.2%-26.1%) reported always consuming fast food, whilst the figure was 22.4% (95% CI: 20.5%-24.3%) for soft drinks. Compared with the lowest socioeconomic quintile, children of higher socioeconomic status were more likely to consume fast food and soft drinks (Chi-squared for trend, p<0.001). The highest socioeconomic quintile had a greater probability of always consuming fast food (PR=1.42; 95% CI: 1.08-1.88) and soft drinks (PR=1.71; 95% CI: 1.24-2.37). Conclusions: This study shows a significant association between household socioeconomic level and the consumption of soft drinks and fast food. / Peer reviewed
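The "Poisson regression with robust variance" approach for prevalence ratios can be sketched by hand. The counts below are fabricated to roughly mimic the reported fast-food PR; they are not the actual Young Lives data:

```python
import numpy as np

def modified_poisson(X, y, iters=50):
    """Poisson (log-link) regression on a binary outcome with a sandwich
    (robust) variance estimate -- the modified Poisson approach for PRs."""
    X = np.column_stack([np.ones(len(y)), X])
    beta = np.zeros(X.shape[1])
    for _ in range(iters):                      # Newton-Raphson
        mu = np.exp(X @ beta)
        beta += np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (y - mu))
    mu = np.exp(X @ beta)
    bread = np.linalg.inv(X.T @ (mu[:, None] * X))
    meat = X.T @ (((y - mu) ** 2)[:, None] * X)
    se = np.sqrt(np.diag(bread @ meat @ bread))  # robust standard errors
    return beta, se

# fabricated data: exposure = highest vs lowest SES quintile,
# outcome = "always consumes fast food" (0/1); 17% vs 24% prevalence
ses_high = np.r_[np.zeros(1000), np.ones(1000)]
always = np.r_[np.ones(170), np.zeros(830), np.ones(240), np.zeros(760)]
beta, se = modified_poisson(ses_high, always)
pr = np.exp(beta[1])                             # prevalence ratio, 0.24/0.17
```

With a single binary exposure the model is saturated, so the PR is exactly the ratio of the two prevalences; the robust variance corrects for the fact that a binary outcome is not truly Poisson.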
49

Quasinorms of Discrete Probability Distributions and their Applications

Šácha, Jakub January 2013 (has links)
This dissertation thesis is focused on the statistical problem of finding the probability distribution of a discrete random variable on the basis of observed data. The estimates are obtained by minimizing quasi-norms under given constraints. The thesis further focuses on deriving confidence intervals for the estimated probabilities, and contains practical applications of these methods.
50

Estimating the Difference of Percentiles from Two Independent Populations.

Tchouta, Romual Eloge 12 August 2008 (has links) (PDF)
We first consider confidence intervals for a normal percentile, an exponential percentile and a uniform percentile. We then develop confidence intervals for the difference of percentiles from two independent normal populations, two independent exponential populations and two independent uniform populations. In our study, we mainly rely on maximum likelihood to develop the confidence intervals. The efficiency of this method is examined via coverage rates obtained in a simulation study conducted with the statistical software R.
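For the normal case, a maximum-likelihood interval and the kind of coverage simulation described above can be sketched as follows. This is a generic illustration under standard asymptotics, not the thesis code (which used R):

```python
import numpy as np
from scipy.stats import norm

def normal_percentile_ci(x, p=0.95, conf=0.95):
    """ML-based CI for the p-th percentile of a normal population.
    theta_hat = xbar + z_p * s_ml; by the delta method,
    Var(theta_hat) ~ sigma^2/n * (1 + z_p^2 / 2)."""
    n = len(x)
    zp = norm.ppf(p)
    s = x.std()                         # ML estimate of sigma (ddof=0)
    theta = x.mean() + zp * s
    se = s * np.sqrt((1 + zp ** 2 / 2) / n)
    z = norm.ppf(1 - (1 - conf) / 2)
    return theta - z * se, theta + z * se

# simulated coverage check in the spirit of the simulation study
rng = np.random.default_rng(0)
true = norm.ppf(0.95)                   # true 95th percentile of N(0,1)
hits = 0
for _ in range(2000):
    lo, hi = normal_percentile_ci(rng.normal(size=100))
    hits += lo <= true <= hi
coverage = hits / 2000                  # should be near the nominal 0.95
```

The two-population difference-of-percentiles interval follows the same pattern, with the delta-method variances of the two independent estimates added together.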
