341

Interviewer effects in sample surveys

Lau, Cheung-na. January 1991 (has links)
Thesis (M.Soc.Sc.)--University of Hong Kong, 1991. / Also available in print.
342

Importance resampling for global illumination /

Talbot, Justin F., January 2005 (has links) (PDF)
Thesis (M.S.)--Brigham Young University. Dept. of Computer Science, 2005. / Includes bibliographical references (p. 65-69).
343

'n Ondersoek na die eindige steekproefgedrag van inferensiemetodes in ekstreemwaarde-teorie [An investigation into the finite-sample behaviour of inference methods in extreme value theory] /

Van Deventer, Dewald. January 2005 (has links)
Assignment (MComm)--University of Stellenbosch, 2005. / Bibliography. Also available via the Internet.
344

On the computation and power of goodness-of-fit tests

Wang, Jingbo, January 2005 (has links)
Thesis (M. Phil.)--University of Hong Kong, 2005. / Title proper from title frame. Also available in printed format.
345

Statistical models for motion segmentation and tracking /

Wong, King Yuen. January 2005 (has links)
Thesis (Ph.D.)--York University, 2005. Graduate Programme in Computer Science. / Typescript. Includes bibliographical references (leaves 166-179). Also available on the Internet. MODE OF ACCESS via web browser by entering the following URL: http://wwwlib.umi.com/cr/yorku/fullcit?pNR11643
346

Rigorous methods for the analysis, reporting and evaluation of ESM style data

Carter, Lesley-Anne January 2016 (has links)
Experience sampling methodology (ESM) is a real-time data capture method that can be used to monitor symptoms and behaviours as they occur during everyday life. With measures completed multiple times a day, over several days, this intensive longitudinal data collection method results in multilevel data with observations nested within days, nested within subjects. The aim of this thesis was to investigate the optimal use of multilevel models for ESM in the design, reporting and analysis of ESM data, and to apply these models to a study in people with psychosis. A methodological systematic review was conducted to identify design, analysis and statistical reporting practices in current ESM studies. Seventy-four studies from 2012 were reviewed, and together with the analysis of a motivating example, four significant areas of interest were identified: power and sample size, missing data, momentary variation and predicting momentary change. Appropriate multilevel methods were sought for each of these areas and were evaluated in the three-level context of ESM. Missing data was found to be both underreported and rarely considered when choosing analysis methods in practice. This work has introduced a more detailed understanding of nonresponse in ESM studies and has discussed appropriate statistical methods in the presence of missing data. This thesis has extended two-level statistical methodology for data analysis to accommodate the three-level structure of ESM. Novel applications of time trends have been developed, where time can be measured at two separate levels. The suitability of predicting momentary change in ESM data has been questioned; it is argued that the first-difference and joint modelling methods that are claimed in the literature to remove bias may instead introduce more in this context.
Finally, Monte Carlo simulations were shown to be a flexible option for estimating empirical power under varying sample sizes at levels 3, 2 and 1, with recommendations made for conservative power estimates when a priori parameter estimates are unknown. In summary, this work demonstrates how multilevel models can be used to examine the rich data structure of ESM and fully utilize the variation in measures captured at all levels.
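The closing point about Monte Carlo power estimation can be sketched in miniature: simulate three-level data (beeps within days within subjects) many times and count how often a hypothetical subject-level effect is detected. Everything below — the variance components, the effect size, and the shortcut of z-testing subject means rather than fitting the full three-level model — is illustrative, not taken from the thesis:

```python
import math
import random

def _var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def simulate_power(n_subj=40, n_days=6, n_beeps=8, effect=1.0,
                   sd_subj=1.0, sd_day=0.5, sd_beep=1.0,
                   n_sims=200, z_crit=1.96, seed=1):
    """Empirical power for a subject-level (level-3) binary covariate,
    using a two-sample z-test on subject means as a conservative
    stand-in for fitting the full three-level model."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        means = {0: [], 1: []}
        for i in range(n_subj):
            g = i % 2                          # level-3 group indicator
            u = rng.gauss(0, sd_subj)          # subject random intercept
            total = 0.0
            for _d in range(n_days):
                v = rng.gauss(0, sd_day)       # day-within-subject intercept
                for _b in range(n_beeps):      # beep-level residual
                    total += effect * g + u + v + rng.gauss(0, sd_beep)
            means[g].append(total / (n_days * n_beeps))
        diff = sum(means[1]) / len(means[1]) - sum(means[0]) / len(means[0])
        se = math.sqrt(_var(means[0]) / len(means[0]) +
                       _var(means[1]) / len(means[1]))
        hits += abs(diff) / se > z_crit
    return hits / n_sims
```

Varying `n_subj`, `n_days` and `n_beeps` in such a loop is what "estimating empirical power under varying sample sizes at levels 3, 2 and 1" amounts to in practice.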
347

Undisturbed Sampling of Cohesionless Soil for Evaluation of Mechanical Properties and Micro-structure

January 2011 (has links)
abstract: As a prelude to a study on the post-liquefaction properties and structure of soil, ground freezing was investigated as an undisturbed sampling technique, to assess its ability to preserve soil structure and properties. Freezing the ground is widely regarded as an appropriate technique to recover undisturbed samples of saturated cohesionless soil for laboratory testing, despite the fact that water increases in volume when frozen. The explanation generally given for the preservation of soil structure using the freezing technique was that, as long as the freezing front advanced uni-directionally, the expanding pore water is expelled ahead of the freezing front as the front advances. However, a literature review on the transition of water to ice shows that the volume of ice expands approximately nine percent after freezing, bringing into question the hypothesized mechanism and the ability of a frozen and then thawed specimen to retain the properties and structure of the soil in situ. Bench-top models were created by pluviation of sand. The soil in the model was then saturated and subsequently frozen. Freezing was accomplished using a pan filled with alcohol and dry ice placed on the surface of the sand layer to induce a unidirectional freezing front in the sample container. Coring was used to recover frozen samples from model containers. Recovered cores were then placed in a triaxial cell, thawed, and subjected to consolidated undrained loading. The stress-strain-strength behavior of the thawed cores was compared to the behavior of specimens created in a split mold by pluviation and then saturated and sheared without freezing and thawing. The laboratory testing provided insight into the impact of freezing and thawing on the properties of cohesionless soil. / Dissertation/Thesis / M.S. Civil and Environmental Engineering 2011
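The "approximately nine percent" expansion cited above follows directly from handbook densities of liquid water and ice near 0 °C (the density values below are standard reference figures, not from the thesis):

```python
rho_water = 999.8  # kg/m^3, liquid water near 0 degrees C
rho_ice = 916.7    # kg/m^3, ice Ih at 0 degrees C

# The same mass occupies volume m/rho, so the volumetric expansion
# on freezing is rho_water / rho_ice - 1.
expansion = rho_water / rho_ice - 1
print(f"volumetric expansion on freezing: {expansion:.1%}")  # approx. 9.1%
```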
348

Adaptive distance sampling

Pollard, John January 2002 (has links)
We investigate mechanisms to improve efficiency for line and point transect surveys of clustered populations by combining the distance methods with adaptive sampling. In adaptive sampling, survey effort is increased when areas of high animal density are located, thereby increasing the number of observations. We begin by building on existing adaptive sampling techniques, to create both point and line transect adaptive estimators; these are then extended to allow the inclusion of covariates in the detection function estimator. However, the methods are limited, as the total effort required cannot be forecast at the start of a survey, and so a new fixed total effort adaptive approach is developed. A key difference in the new method is that it does not require the calculation of the inclusion probabilities typically used by existing adaptive estimators. The fixed effort method is primarily aimed at line transect sampling, but point transect derivations are also provided. We evaluate the new methodology by computer simulation, and report on surveys of harbour porpoise in the Gulf of Maine, in which the approach was compared with conventional line transect sampling. Line transect simulation results for a clustered population showed up to a 6% improvement in the adaptive density variance estimate over the conventional, whilst when there was no clustering the adaptive estimate was 1% less efficient than the conventional. For the harbour porpoise survey, the coefficients of variation of the adaptive density estimates showed improvements of 8% for individual porpoise density and 14% for school density over the conventional estimates. The primary benefit of the fixed effort method is the potential to improve survey coverage, allowing a survey to complete within a fixed time and effort; an important feature if expensive survey resources are involved, such as an aircraft, crew and observers.
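The adaptive idea — spend a fixed total effort, diverting part of it to neighbourhoods where a trigger count is met — can be illustrated with a toy one-dimensional strip survey of a clustered population. The trigger rule, the population model and the naive density estimator below are hypothetical simplifications for illustration, not Pollard's estimators (which, as the abstract notes, must handle inclusion probabilities or avoid them by design):

```python
import random

def clustered_population(n_cells=100, n_clusters=5, cluster_size=30,
                         spread=3, seed=0):
    """Place animals in clusters along a 1-D strip of cells."""
    rng = random.Random(seed)
    counts = [0] * n_cells
    for _ in range(n_clusters):
        centre = rng.randrange(n_cells)
        for _ in range(cluster_size):
            cell = min(n_cells - 1, max(0, centre + round(rng.gauss(0, spread))))
            counts[cell] += 1
    return counts

def fixed_effort_adaptive(counts, budget=20, trigger=2):
    """Toy fixed-total-effort rule: spend half the budget on a systematic
    pass, then spend the remainder on cells adjacent to any sampled cell
    whose count met the trigger."""
    n = len(counts)
    systematic = list(range(0, n, n // (budget // 2)))[: budget // 2]
    sampled = {c: counts[c] for c in systematic}
    frontier = [c for c in systematic if counts[c] >= trigger]
    while len(sampled) < budget and frontier:
        c = frontier.pop()
        for nb in (c - 1, c + 1):
            if 0 <= nb < n and nb not in sampled and len(sampled) < budget:
                sampled[nb] = counts[nb]
                if counts[nb] >= trigger:
                    frontier.append(nb)
    # Naive per-cell mean scaled up; a real design needs an estimator
    # that corrects for the preferential sampling of dense areas.
    return sum(sampled.values()) / len(sampled) * n
```

Because effort is capped at `budget` cells regardless of how many triggers fire, total effort is known before the survey starts — the property the abstract highlights.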
349

Sampling in the evaluation of ore deposits

Grant, D E C S 19 March 2013 (has links)
Sampling is an error-generating process, and these errors should be reduced to a minimum if an accurate ore reserve estimation is to be made from the sample values. Error in sampling can arise from the sampling procedure as well as from where and how each sample is taken from the deposit. Sampling procedure involves sample collection, sample reduction and analysis, and the error from each of these three stages has an equal influence on the total error of the process. Error due to sampling procedure should be identified and eliminated at an early stage in the evaluation programme. An ore deposit should be subdivided into sampling strata along geological boundaries, and once these boundaries have been established they should be adhered to for the evaluation programme. The sampling of each stratum depends on the small-scale structures in which the grade is distributed, and this distribution in relation to sample size controls sample variance, sample bias and the volume of influence of each sample. Cluster sampling can be used where an impractically large sample is necessary to reduce sample variance or increase the volume of influence of samples. Sample bias can be reduced by compositing a large number of small samples. Sampling patterns should be designed with reference to the volumes of influence of samples, and in favourable geology, geostatistical or statistical techniques can be used to predict the precision of an ore reserve estimation in terms of the number of samples taken. Different ore deposits have different sampling characteristics and problems which can be directly related to the geology of the mineralization. If geology is disregarded when sampling an ore deposit, an evaluation programme cannot claim to give an accurate estimate of the ore reserves.
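The point about combining many small samples can be seen numerically: averaging n independent subsamples into one composite shrinks the variance of the composite grade by roughly a factor of n. The normal grade distribution and parameter values below are hypothetical, chosen only to make the effect visible:

```python
import random
import statistics

def composite_variance(n_sub, n_composites=500, mu=2.5, sigma=1.2, seed=0):
    """Variance of composite grades, each composite being the mean of
    n_sub independent subsample grades drawn from N(mu, sigma^2)."""
    rng = random.Random(seed)
    composites = [
        statistics.fmean(rng.gauss(mu, sigma) for _ in range(n_sub))
        for _ in range(n_composites)
    ]
    return statistics.variance(composites)

# A composite of 16 subsamples has roughly 1/16 the variance of a
# single small sample (sigma^2 / n for independent subsamples).
```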
350

A HIGH PERFORMANCE GIBBS-SAMPLING ALGORITHM FOR ITEM RESPONSE THEORY MODELS

Patsias, Kyriakos 01 January 2009 (has links)
Item response theory (IRT) is a newer, improved alternative to classical measurement theory. The fully Bayesian approach shows promise for IRT models; however, it is computationally expensive, and therefore its use is limited in various applications. It is important to seek ways to reduce the execution time, and a suitable solution is the use of high performance computing (HPC). HPC offers considerably high computational power and can handle applications with high computation and memory requirements. In this work, we have modified the existing fully Bayesian algorithm for 2PNO IRT models so that it can be run on a high performance parallel machine. With this parallel version of the algorithm, the empirical results show that a speedup was achieved and the execution time was reduced considerably.
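A serial, pure-Python sketch of an Albert-style Gibbs sampler for the two-parameter normal-ogive (2PNO) model, P(y_ij = 1) = Phi(a_j * theta_i - b_j), shows the structure that such work parallelizes: the person loop and item loop are independent within each sweep, which is what makes the algorithm amenable to HPC. Priors, starting values and chain length here are illustrative assumptions, not the thesis's configuration:

```python
import random
from statistics import NormalDist

STD = NormalDist()  # standard normal: supplies cdf and inv_cdf

def trunc_normal(mean, positive, rng):
    """Draw from N(mean, 1) truncated to (0, inf) if positive, else (-inf, 0),
    by inverse-CDF sampling."""
    lo = STD.cdf(-mean)                       # P(N(mean,1) <= 0)
    u = rng.random()
    p = lo + u * (1 - lo) if positive else u * lo
    p = min(max(p, 1e-12), 1 - 1e-12)         # keep inv_cdf in-domain
    return mean + STD.inv_cdf(p)

def gibbs_2pno(y, n_iter=200, burn=100, seed=0):
    """Gibbs sampler for the 2PNO model with N(0,1) priors on theta_i,
    a_j and b_j, using the latent-variable augmentation z_ij = a_j*theta_i
    - b_j + e_ij, e_ij ~ N(0,1). Returns posterior means of a and b."""
    rng = random.Random(seed)
    n, m = len(y), len(y[0])                  # persons, items
    theta = [0.0] * n
    a = [1.0] * m
    b = [0.0] * m
    z = [[0.0] * m for _ in range(n)]
    a_sum, b_sum, kept = [0.0] * m, [0.0] * m, 0
    for it in range(n_iter):
        # 1. Latent responses: truncated normal given observed 0/1 data.
        for i in range(n):
            for j in range(m):
                eta = a[j] * theta[i] - b[j]
                z[i][j] = trunc_normal(eta, y[i][j] == 1, rng)
        # 2. Abilities: normal posterior from regression z + b = a*theta + e.
        for i in range(n):
            prec = 1 + sum(aj * aj for aj in a)
            mu = sum(a[j] * (z[i][j] + b[j]) for j in range(m)) / prec
            theta[i] = rng.gauss(mu, prec ** -0.5)
        # 3. Item parameters: univariate conditional updates.
        for j in range(m):
            prec = 1 + sum(t * t for t in theta)
            mu = sum(theta[i] * (z[i][j] + b[j]) for i in range(n)) / prec
            a[j] = rng.gauss(mu, prec ** -0.5)
            prec = 1 + n
            mu = sum(a[j] * theta[i] - z[i][j] for i in range(n)) / prec
            b[j] = rng.gauss(mu, prec ** -0.5)
        if it >= burn:
            kept += 1
            for j in range(m):
                a_sum[j] += a[j]
                b_sum[j] += b[j]
    return [s / kept for s in a_sum], [s / kept for s in b_sum]
```

Steps 1 and 2 involve n independent draws and step 3 involves m independent draws per sweep, so each can be distributed across processors with only a synchronization barrier between steps — the kind of decomposition a parallel implementation exploits.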
