251.
Statistical analysis of type-II progressively hybrid censored samples and adaptive type-II progressively hybrid censored samples from extreme value distribution. January 2009 (has links)
Mak, Man Yung. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2009. / Includes bibliographical references (leaves 115-117). / Abstracts in English and Chinese. / Chapter 1 --- Introduction --- p.1 / Chapter 1.1 --- Introduction --- p.1 / Chapter 1.2 --- Conventional Censoring Schemes --- p.2 / Chapter 1.3 --- Type-II Progressively Hybrid Censoring Scheme --- p.4 / Chapter 1.4 --- Adaptive Type-II Progressively Hybrid Censoring Scheme --- p.6 / Chapter 1.5 --- Extreme Value Distribution --- p.8 / Chapter 1.6 --- The Scope of the Thesis --- p.11 / Chapter 2 --- Estimation Methods --- p.12 / Chapter 2.1 --- Introduction --- p.12 / Chapter 2.2 --- Maximum Likelihood Estimators --- p.13 / Chapter 2.2.1 --- Type-II Progressively Hybrid Censoring Scheme --- p.13 / Chapter 2.2.2 --- Adaptive Type-II Progressively Hybrid Censoring Scheme --- p.15 / Chapter 2.3 --- Approximate Maximum Likelihood Estimators --- p.18 / Chapter 2.3.1 --- Type-II Progressively Hybrid Censoring Scheme --- p.18 / Chapter 2.3.2 --- Adaptive Type-II Progressively Hybrid Censoring Scheme --- p.20 / Chapter 2.4 --- Monte Carlo Simulation and Results --- p.23 / Chapter 2.4.1 --- Numerical Comparisons --- p.33 / Chapter 3 --- Construction of Confidence Intervals --- p.35 / Chapter 3.1 --- Introduction --- p.35 / Chapter 3.2 --- Asymptotic Confidence Interval --- p.36 / Chapter 3.2.1 --- Type-II Progressively Hybrid Censoring Scheme --- p.37 / Chapter 3.2.2 --- Adaptive Type-II Progressively Hybrid Censoring Scheme --- p.39 / Chapter 3.3 --- Parametric Percentile Bootstrap Confidence Interval --- p.56 / Chapter 3.3.1 --- Parametric Percentile Bootstrap Confidence Interval based on Maximum Likelihood Estimation method --- p.57 / Chapter 3.3.2 --- Parametric Percentile Bootstrap Confidence Interval based on Approximate Maximum Likelihood Estimation method --- p.65 / Chapter 3.4 --- Parametric Bootstrap-t Confidence Interval --- p.71 / Chapter 3.4.1 --- Parametric Bootstrap-t Confidence Interval based on Maximum Likelihood Estimation method --- p.72 / Chapter 3.4.2 --- Parametric Bootstrap-t Confidence Interval based on Approximate Maximum Likelihood Estimation method --- p.79 / Chapter 3.5 --- Numerical Comparisons --- p.86 / Chapter 4 --- Expected Total Test Time --- p.88 / Chapter 4.1 --- Introduction --- p.88 / Chapter 4.2 --- Type-II Progressively Hybrid Censoring Scheme --- p.89 / Chapter 4.3 --- Adaptive Type-II Progressively Hybrid Censoring Scheme --- p.92 / Chapter 4.4 --- Numerical Comparisons --- p.99 / Chapter 5 --- Optimality Criteria and Censoring Schemes --- p.100 / Chapter 5.1 --- Introduction --- p.100 / Chapter 5.2 --- Optimality Criteria --- p.101 / Chapter 5.3 --- Expected Fisher Information Matrix --- p.102 / Chapter 5.3.1 --- Type-II Progressively Hybrid Censoring Scheme --- p.103 / Chapter 5.4 --- Optimal Censoring Scheme for Progressively Hybrid Censoring --- p.106 / Chapter 6 --- Conclusions and Further Research --- p.113 / Bibliography --- p.115
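The plain progressive Type-II scheme underlying the thesis's hybrid variants can be simulated directly. The sketch below assumes the smallest-extreme-value (Gumbel minimum) distribution of Section 1.5; it covers only the ordinary progressive scheme, not the hybrid or adaptive variants, and all function names are hypothetical:

```python
import math
import random

def gumbel_min_variate(rng, mu=0.0, sigma=1.0):
    # Smallest-extreme-value (Gumbel minimum) draw by inverting the CDF
    # F(x) = 1 - exp(-exp((x - mu) / sigma)).
    u = rng.random()
    return mu + sigma * math.log(-math.log(1.0 - u))

def progressive_type2_sample(rng, n, scheme, mu=0.0, sigma=1.0):
    """Simulate a progressively Type-II censored sample.

    n units go on test; after the i-th observed failure, scheme[i]
    surviving units are withdrawn at random.  Requires
    n == len(scheme) + sum(scheme).  Returns the ordered failure times."""
    alive = sorted(gumbel_min_variate(rng, mu, sigma) for _ in range(n))
    observed = []
    for r in scheme:
        observed.append(alive.pop(0))        # next failure = current minimum
        for _ in range(r):                   # random withdrawals after the failure
            alive.pop(rng.randrange(len(alive)))
    return observed
```

The hybrid scheme additionally stops the experiment at a prefixed time T; that termination rule is omitted here for brevity.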
252.
Reconciliação pró-ativa em empreendimentos mineiros. / Proactive reconciliation in the mining industry. Chieregati, Ana Carolina, 18 April 2007 (has links)
Reconciliation is the practice of comparing the tonnage and average grade of ore predicted from resource and grade control models with the tonnage and grade generated by the processing plant. The result is usually a group of factors, which are applied to future estimates in an attempt to better predict how the operation may perform.
The common practice of reconciliation is based on the definition of the mine call factor (MCF) and its application to resource or grade control estimates. The MCF expresses the difference, as a ratio or percentage, between the grade predicted by the models and the grade reported by the plant; applying it therefore corrects the model estimates. This practice is named reactive reconciliation. However, the use of generic factors applied across differing time scales and material types often disguises the causes of the errors responsible for the discrepancies. The root causes of any given variance can only be identified by analyzing the information behind that variance and then making changes to methodologies and processes. This practice is named prognostication, or proactive reconciliation: an iterative process resulting in constant recalibration of the inputs and the calculations. Prognostication allows personnel to adjust processes so that results align within acceptable tolerance ranges, rather than simply correcting model estimates. This study analyses the reconciliation practices performed at a gold mine in Brazil and suggests a new sampling protocol based on prognostication concepts.
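The reactive correction described above can be sketched in a few lines. This is an illustrative sketch, not the mine's actual procedure; the figures and the helper name `mine_call_factor` are hypothetical:

```python
def mine_call_factor(predicted, reported):
    """MCF = quantity reported by the plant / quantity predicted by the model.
    1.0 means perfect reconciliation; below 1.0 means the model over-predicted."""
    return reported / predicted

# Reactive reconciliation: scale the next forecast by the observed factor.
predicted_oz = 1250.0   # hypothetical ounces predicted by the grade control model
reported_oz = 1100.0    # hypothetical ounces recovered by the plant
mcf = mine_call_factor(predicted_oz, reported_oz)   # 0.88
corrected_forecast = 1300.0 * mcf                   # next period's estimate, scaled
```

Prognostication, by contrast, would treat the 0.88 as a symptom and trace it back to the sampling and estimation steps rather than folding it into future forecasts.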
253.
Transience: An Album-Length Recording for Solo Percussion and Electronics. Augspurger, Christine Anne, 01 May 2017 (has links)
No description available.
254.
A Quality of Service Monitoring System for Service Level Agreement Verification. Ta, Xiaoyuan, January 2006 (has links)
Master of Engineering by Research / Service level agreement (SLA) monitoring measures network Quality-of-Service (QoS) parameters to evaluate whether the service performance complies with the SLAs. It is becoming increasingly important for both Internet service providers (ISPs) and their customers. However, the rapid expansion of the Internet makes SLA monitoring a challenging task. As an efficient method to reduce both the complexity and the overheads of QoS measurements, sampling techniques have been used in SLA monitoring systems. In this thesis, I conduct a comprehensive study of sampling methods for network QoS measurements. I develop an efficient sampling strategy, which makes the measurements less intrusive and more efficient, and I design network performance monitoring software that monitors QoS parameters such as packet delay, packet loss and jitter for SLA monitoring and verification. The thesis starts with a discussion of the characteristics of QoS metrics related to the design of the monitoring system and the challenges in monitoring these metrics. Major measurement methodologies for monitoring these metrics are introduced. Existing monitoring systems can be broadly classified into two categories: active and passive measurement. The advantages and disadvantages of both methodologies are discussed, and an active measurement methodology is chosen to realise the monitoring system. Secondly, the thesis describes the most common sampling techniques, such as systematic sampling, Poisson sampling and stratified random sampling. Theoretical analysis is performed on the fundamental limits of sampling accuracy and on the performance of the sampling techniques, and is validated using simulation with real traffic. Both the theoretical analysis and the simulation results show that stratified random sampling with optimum allocation achieves the best performance, compared with the other sampling methods.
However, stratified sampling with optimum allocation requires extra statistics from the parent traffic traces, which cannot be obtained in real applications. To overcome this shortcoming, a novel adaptive stratified sampling strategy is proposed, based on stratified sampling with optimum allocation. A least-mean-square (LMS) linear prediction algorithm is employed to predict the required statistics from past observations. Simulation results show that the proposed adaptive stratified sampling method closely approaches the performance of stratified sampling with optimum allocation. Finally, a detailed description of the SLA monitoring software design is presented. Measurement results are presented, including a calibration of the systematic error in the measurements. Measurements between various remote sites have demonstrated the impressively good QoS provided by Australian ISPs for premium services.
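The optimum allocation that the proposed adaptive strategy approximates can be sketched as follows. This is a minimal illustration assuming the per-stratum standard deviations are known; in the thesis's adaptive scheme they would instead be predicted from past observations with an LMS filter. The function name and figures are hypothetical:

```python
def neyman_allocation(sizes, stdevs, budget):
    """Optimum (Neyman) allocation: sample stratum h in proportion to
    N_h * S_h, minimising the variance of the overall estimate for a
    fixed total number of samples."""
    weights = [n * s for n, s in zip(sizes, stdevs)]
    total = sum(weights)
    return [round(budget * w / total) for w in weights]

# Two equal-sized strata; the more variable stratum gets more probes.
allocation = neyman_allocation([1000, 1000], [1.0, 3.0], budget=100)
```

The design intuition is that measurement effort is wasted on strata whose delay or loss behaviour is already stable, so the budget shifts toward the volatile strata.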
255.
The role of the sampling distribution in developing understanding of statistical inference. Lipson, Kay, klipson@swin.edu.au, January 2000 (has links)
There has been widespread concern expressed by members of the statistics education community in the past few years about the lack of any real understanding demonstrated by many students completing courses in introductory statistics. This deficiency in understanding has been particularly noted in the area of inferential statistics, where students, particularly those studying statistics as a service course, have been inclined to view statistical inference as a set of unrelated recipes. As such, these students have developed skills that have little practical application and are easily forgotten.
This thesis is concerned with the development of understanding in statistical inference for beginning students of statistics at the post-secondary level. This involves consideration of the nature of understanding in introductory statistical inference, and how understanding can be measured in the context of statistical inference. In particular, the study has examined the role of the sampling distribution in the students' schemas for statistical inference, and its relationship to both conceptual and procedural understanding. The results of the study have shown that, as anticipated, students will construct highly individual schemas for statistical inference but that the degree of integration of the concept of sampling distribution within this schema is indicative of the level of development of conceptual understanding in that student. The results of the study have practical implications for the teaching of courses in introductory statistics, in terms of content, delivery and assessment.
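The central concept the thesis examines, the sampling distribution, can be demonstrated with a short simulation: repeatedly sample from a population and record each sample's mean. A minimal sketch with a hypothetical toy population; it is not drawn from the thesis itself:

```python
import random
import statistics

def sampling_distribution_of_mean(population, n, reps=2000, seed=0):
    """Build the empirical sampling distribution of the sample mean:
    draw `reps` simple random samples of size n (without replacement)
    and record each sample's mean."""
    rng = random.Random(seed)
    return [statistics.mean(rng.sample(population, n)) for _ in range(reps)]

# The means centre on the population mean, with far less spread than
# the population itself (roughly sigma / sqrt(n)).
population = list(range(100))
means = sampling_distribution_of_mean(population, n=25)
```

This is exactly the schema element the study argues students must integrate: inference reasons about where an observed statistic falls within such a distribution, not about the raw data alone.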
256.
Something to do with community structure : the influence of sampling and analysis on measures of community structure. Anderson, Barbara J., January 2006 (has links)
Diversity indices confound two components: species richness and evenness. Community structure should therefore be evaluated by employing separate measures of the number of species and their relative abundances. However, the relative abundances of species are dependent on the abundance measure used. Although the use of biomass or productivity is recommended by theory, in practice a surrogate measure is more often used. Frequency (local or relative) and point-quadrat cover provide two objective measures of abundance which are fast, less destructive and avoid problems associated with distinguishing individuals. However, both give discrete bounded data which may further alter the relative abundances of species. These measures have a long history of use and, as the need for objective information on biodiversity becomes more pressing, their use is likely to become more widespread. Consequently, it seems appropriate to investigate the effect of these abundance measures, and the resolution at which they are used, on calculated evenness.
Field, artificial and simulated data were used to investigate the effect of abundance measure and resolution on evidence for community structure. The field data consisted of seventeen sites. Sites from four vegetation types (saltmeadow, geothermal, ultramafic and high-altitude meadow) were sampled in three biogeographical regions. Most of the indices of community structure (species richness, diversity and evenness) detected differences between the vegetation types, and different niche-apportionment models were fitted to the field data from saltmeadow and geothermal vegetation. Estimates of community structure based on local frequency and point-quadrat data differed: local frequency tended to give higher calculated evenness, whereas point-quadrat data tended to fit niche-apportionment models where local frequency data failed.
The effect of resolution on the eighteen evenness indices investigated depended on community species richness and the particular index used. The investigated evenness indices were divided into three groups (symmetric, continuous and traditional indices) based on how they ranked real and artificially constructed communities. Contrary to Smith and Wilson's recommendation, the symmetric indices E[VAR] and E[Q] proved unsuitable for use with most types of plant data. In particular, E[Q] tends to assign most communities low values and has a dubious relationship with intrinsic evenness. The continuous indices, E[MS] and E[2,1], were the indices best able to discriminate between field, artificial and simulated communities, and their use should be re-evaluated. Traditional indices used with low resolution tended to elevate the calculated evenness, especially in species-rich communities. The relativized indices, E[Hurlbert] and EO[dis], were an exception, as they were always able to attain the minimum of zero; however, they were more sensitive to changes in resolution, particularly when resolution was low. Overall, traditional indices based on Hill's ratios, including E[1/D] (=E[2,0]), and G[2,1] gave the best performance, while the general criticism of the use of Pielou's J′ as an index of evenness was further substantiated by this study. As a final recommendation, ecologists are implored to investigate their data and the likely effects that sampling and analysis have had on the calculated values of their indices.
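A few of the indices discussed above can be written down compactly. The sketch below uses the standard textbook definitions of Pielou's J′ and the Hill-ratio evenness E[2,0] = N2/N0; it is not the author's code, and the abundance vectors are hypothetical:

```python
import math

def shannon_H(abundances):
    """Shannon diversity H' from raw abundances."""
    total = sum(abundances)
    ps = [a / total for a in abundances if a > 0]
    return -sum(p * math.log(p) for p in ps)

def pielou_J(abundances):
    """Pielou's J' = H' / ln(S): observed diversity over its maximum."""
    s = sum(1 for a in abundances if a > 0)
    return shannon_H(abundances) / math.log(s)

def hill_evenness_E20(abundances):
    """E[2,0] = N2 / N0: inverse-Simpson diversity over species richness."""
    total = sum(abundances)
    ps = [a / total for a in abundances if a > 0]
    n2 = 1.0 / sum(p * p for p in ps)   # Hill number of order 2
    return n2 / len(ps)                 # N0 = species richness
```

Both indices equal 1 for a perfectly even community and fall toward 0 as dominance increases, which is what makes their differing sensitivity to sampling resolution worth comparing.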
257.
Im Warenkorb nichts Neues? : zur Beobachtung und Bewertung digitaler Wiederholungstaten am Beispiel des Audiosampling ; systemische Interferenzen und medientheoretische Perspektiven zwischen Kunst und Künstlichkeit [Nothing new in the shopping cart? On observing and evaluating digital repeat offences, using audio sampling as an example; systemic interferences and media-theoretical perspectives between art and artificiality]. / Juhnke, Matthias. January 2007 (has links)
Also issued as the author's master's thesis (Magisterarbeit).
258.
Wireless acceptance in a university setting using the Unified Theory of Acceptance and Use of Technology. Cetron, Justin Franco. January 2007 (has links)
Thesis (M.S.)--University of Delaware, 2007. / Principal faculty advisor: Cihan Cobanoglu, Dept. of Hotel, Restaurant, & Institutional Management. Includes bibliographical references.
259.
Thresholds for peak-over-threshold theory. Amankonah, Frank O. January 2005 (has links)
Thesis (M.S.)--University of Nevada, Reno, 2005. / "August, 2005." Includes bibliographical references (leaf 43). Online version available on the World Wide Web.
260.
A sensitivity evaluation of EPA's CAL3QHC dispersion model for carbon monoxide analysis at urban intersections. Lindemann, Julie B., 25 October 1994
In February 1994, EPA proposed to "refine" its carbon monoxide (CO) hot-spot model, CAL3QHC, to increase its accuracy and reliability. The proposal currently includes using hourly meteorological data recorded at airports near the modeling sites instead of the worst-case conditions now recommended as a modeling default. The analysis in this study shows that airport data should be used only on a situational basis, as discrepancies exist between airports and local intersections, and it is unclear whether the added meteorological data will improve CAL3QHC's performance. This study also compares actual CO concentrations and their corresponding meteorology with the worst-case modeling defaults. The highest CO levels occur during calms (wind speed less than one meter per second), which cannot be modeled with the current Gaussian dispersion equation in CAL3QHC. Given the findings of this analysis and the lack of other scientific evidence supporting the proposed refinements, recommendations for developing an alternative model improvement plan are outlined. / Graduation date: 1995
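The calm-wind limitation noted above follows directly from the 1/u factor in the Gaussian plume equation. The sketch below uses a simplified ground-level point-source form rather than CAL3QHC's CALINE-3 line-source formulation, so it is only illustrative; the function name and inputs are hypothetical:

```python
import math

def plume_concentration(q, u, sigma_y, sigma_z, y=0.0):
    """Simplified Gaussian plume: ground-level concentration downwind of a
    ground-level point source (full ground reflection included).

    q : emission rate (g/s); u : wind speed (m/s);
    sigma_y, sigma_z : dispersion parameters (m) at the receptor's distance.
    The 1/u factor makes the result blow up as u -> 0, which is why calms
    (wind below 1 m/s) fall outside the model's range of validity.
    """
    return (q / (math.pi * u * sigma_y * sigma_z)) * math.exp(-(y * y) / (2.0 * sigma_y ** 2))

# Cutting wind speed from 2 m/s to 0.5 m/s quadruples the predicted concentration:
c_breeze = plume_concentration(10.0, 2.0, 20.0, 10.0)
c_calm = plume_concentration(10.0, 0.5, 20.0, 10.0)
```

Since the formula diverges rather than saturating as the wind dies, the highest observed CO episodes (during calms) are exactly the conditions the model cannot represent.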