  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
191

Computer solutions for the OC and ASN values of six sampling plans

Stralkowski, C. Michael. January 1964 (has links)
Thesis (M.S.)--University of Wisconsin--Madison, 1964. / eContent provider-neutral record in process. Description based on print version record. Bibliography: l. 86.
192

Improving sampled microprocessor simulation

Luo, Yue, John, Lizy Kurian, January 2005 (has links) (PDF)
Thesis (Ph. D.)--University of Texas at Austin, 2005. / Supervisor: Lizy K. John. Vita. Includes bibliographical references.
193

Designing computer experiments to estimate integrated response functions

Marin, Ofelia, January 2005 (has links)
Thesis (Ph. D.)--Ohio State University, 2005. / Title from first page of PDF file. Includes bibliographical references (p. 115-117).
194

Exploring network models under sampling

Zhou, Shu January 1900 (has links)
Master of Science / Department of Statistics / Perla Reyes / Networks are defined as sets of items and their connections. Interconnected items are represented by mathematical abstractions called vertices (or nodes), and the links connecting pairs of vertices are known as edges. Networks are easily seen in everyday life: a network of friends, the Internet, metabolic or citation networks. The increase in available data and the need to analyze networks have resulted in a proliferation of network models. However, for networks with billions of nodes and edges, computation and inference might not be achievable within a reasonable amount of time or budget. A sampling approach seems a natural choice, but traditional models assume access to the entire network. Moreover, when data is only available for a sampled sub-network, conclusions tend to be extrapolated to the whole network/population without regard to sampling error. The statistical problem this report addresses is how to sample a sub-network and then draw conclusions about the whole network. Are some sampling techniques better than others? Are there more efficient ways to estimate parameters of interest? How can we measure how effectively a method reproduces the original network? We explore these questions with a simulation study on the Mesa High School students' friendship network. First, to assess the characteristics of the whole network, we applied the traditional exponential random graph model (ERGM) and a stochastic blockmodel to the complete population of 205 students. Then, we drew simple random and stratified samples of 41 students, applied the traditional ERGM and the stochastic blockmodel again, and defined a way to generalize the sample findings to the population friendship network of 205 students. Finally, we used the degree distribution and other network statistics to compare the true friendship network with the projected one. We reached the following important results: 1) as expected, stratified sampling outperforms simple random sampling when selecting nodes; 2) ERGM without restrictions offers a poor estimate for most of the tested parameters; and 3) Bayesian stochastic blockmodel estimation using a stratified sample of nodes achieves the best results.
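The procedure outlined in this abstract, drawing simple random versus stratified node samples, inducing a subgraph, and comparing its degree distribution with the full network's, can be illustrated with the Python sketch below. This is a hedged illustration, not the report's code; the graph G, the node attribute "grade" used for stratification, and the data file name are assumptions for the example.

```python
# Minimal sketch (not the report's code): simple random vs. stratified node
# sampling of a friendship network, compared via degree distributions.
import random
import networkx as nx

def srs_nodes(G, k, seed=0):
    """Simple random sample of k nodes."""
    rng = random.Random(seed)
    return rng.sample(list(G.nodes()), k)

def stratified_nodes(G, k, attr="grade", seed=0):
    """Roughly proportional stratified sample of k nodes by a node attribute."""
    rng = random.Random(seed)
    strata = {}
    for v, data in G.nodes(data=True):
        strata.setdefault(data.get(attr), []).append(v)
    n = G.number_of_nodes()
    sample = []
    for members in strata.values():
        m = max(1, round(k * len(members) / n))
        sample.extend(rng.sample(members, min(m, len(members))))
    return sample

def degree_distribution(G):
    """Empirical degree distribution as {degree: proportion of nodes}."""
    degrees = [d for _, d in G.degree()]
    return {d: degrees.count(d) / len(degrees) for d in sorted(set(degrees))}

# Usage sketch: compare the induced subgraph's degree distribution with the
# full network's as one check of how well the sample reproduces the network.
# G = nx.read_gml("friendship.gml")          # hypothetical data file
# sub = G.subgraph(stratified_nodes(G, 41))
# print(degree_distribution(G))
# print(degree_distribution(sub))
```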
195

Successful Sampling Strategy Advances Laboratory Studies of NMR Logging in Unconsolidated Aquifers

Behroozmand, Ahmad A., Knight, Rosemary, Müller-Petke, Mike, Auken, Esben, Barfod, Adrian A. S., Ferré, Ty P. A., Vilhelmsen, Troels N., Johnson, Carole D., Christiansen, Anders V. 16 November 2017 (has links)
The nuclear magnetic resonance (NMR) technique has become popular in groundwater studies because it responds directly to the presence and mobility of water in a porous medium. There is a need to conduct laboratory experiments to aid in the development of NMR hydraulic conductivity models, as is typically done in the petroleum industry. However, the challenge has been obtaining high-quality laboratory samples from unconsolidated aquifers. At a study site in Denmark, we employed sonic drilling, which minimizes the disturbance of the surrounding material, and extracted twelve 7.6 cm diameter samples for laboratory measurements. We present a detailed comparison of the acquired laboratory and logging NMR data. The agreement observed between the laboratory and logging data suggests that the methodologies proposed in this study provide good conditions for studying NMR measurements of unconsolidated near-surface aquifers. Finally, we show how laboratory sample size and condition impact the NMR measurements.
196

Improved Sampling-based Alpha Matting in Images and Video

Hao, Chengcheng January 2012 (has links)
Foreground extraction plays an important role in image and video processing tasks and has been widely used in various industries. To better describe the overlap between foreground and background, the alpha channel is introduced; it encodes the opacity of foreground objects. Thus, fully extracting a foreground object requires determining the alpha value of each pixel, also known as extracting an alpha matte. In this thesis, we propose an improved sampling-based alpha matting algorithm that is capable of generating high-quality matting results. By analyzing the weaknesses of previous approaches, we optimize the sampling process and consider the cost of each sample pair to avoid missing good samples. Good performance is demonstrated even for complex images. Extracting foreground objects from video sequences is a more challenging task, since it places higher demands on accuracy and efficiency. Previous approaches usually require a significant amount of user input, and their results still suffer from inaccuracy. In this thesis, we extend our algorithm to video sequences and make it run automatically. The adaptive trimap, which is vital for matting, is automatically generated and properly propagated in this system. Our method not only reduces user intervention but also maintains matting quality.
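For orientation, sampling-based matting of this kind rests on the standard compositing model I = αF + (1 − α)B: for each unknown pixel, candidate foreground/background sample pairs are scored by how well their composite reconstructs the observed color, and the best pair yields the alpha estimate. The Python sketch below illustrates that generic idea only, not the thesis's algorithm; the pixel values and sample sets are made up, and a full method would add spatial and color-distance terms to the cost.

```python
# Minimal sketch: least-squares alpha for one unknown pixel under the
# compositing model I = alpha*F + (1 - alpha)*B, choosing the best (F, B)
# pair among candidate samples. Illustrative only, not the thesis method.
import numpy as np

def estimate_alpha(I, F, B, eps=1e-8):
    """Least-squares alpha for a given foreground/background pair (RGB vectors)."""
    d = F - B
    alpha = np.dot(I - B, d) / (np.dot(d, d) + eps)
    return float(np.clip(alpha, 0.0, 1.0))

def best_pair(I, fg_samples, bg_samples):
    """Pick the (F, B) pair whose composite best reconstructs the pixel I."""
    best = (None, None, 1.0, np.inf)
    for F in fg_samples:
        for B in bg_samples:
            a = estimate_alpha(I, F, B)
            cost = np.linalg.norm(I - (a * F + (1.0 - a) * B))
            if cost < best[3]:
                best = (F, B, a, cost)
    return best  # (F, B, alpha, reconstruction cost)

# Usage with made-up RGB values in [0, 1]:
I = np.array([0.55, 0.40, 0.35])
fg = [np.array([0.9, 0.6, 0.5]), np.array([0.8, 0.5, 0.4])]
bg = [np.array([0.1, 0.2, 0.3]), np.array([0.2, 0.2, 0.2])]
F, B, alpha, cost = best_pair(I, fg, bg)
print(round(alpha, 3), round(float(cost), 3))
```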
197

Properties of composite sampling procedures

Elder, Robert S. 08 September 2012 (has links)
In a composite sampling procedure, initial samples (increments) are drawn from a lot and physically mixed to form composite samples. Subsamples are then taken from these composite samples and tested to determine the lot quality, usually the lot mean, μ_x. Composite sampling procedures are typically employed with bulk materials, for which high testing costs preclude estimating μ_x from the arithmetic average of values from several individually tested increments. Because of the physical averaging that occurs when increments are mixed to form composite samples, it is possible to estimate μ_x with specified precision more economically using a composite sampling procedure than using a noncompositing procedure. This dissertation extends and interprets the work of Brown and Fisher on modeling procedures that involve subsampling mixtures of sampled material. Models are developed for sampling from segmented or nonsegmented lots, allowing for more than one finite composite, testing error, within-increment variability, or two subsampling stages. The result of each model is a formula expressing the variance of the estimator of μ_x in terms of model parameters. Each such formula is contrasted with the corresponding formula derived from the customarily employed random-effects linear model. / Ph. D.
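For context, a textbook one-stage version of this kind of variance formula (a hedged sketch under simple assumptions, not one of the dissertation's models): if one composite is formed by perfectly mixing n increments whose values have between-increment variance σ_b², and s subsamples are tested with independent subsampling-plus-testing error variance σ_e², then the subsample average estimates μ_x with variance approximately

```latex
% Textbook-style sketch under simple assumptions, not one of the dissertation's models
\operatorname{Var}(\hat{\mu}_x) \approx \frac{\sigma_b^{2}}{n} + \frac{\sigma_e^{2}}{s}
```

Compositing averages the increment-to-increment variation before any testing is done, which is the source of the economy described in the abstract.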
198

Ascertainment in Two-Phase Sampling Designs for Segregation and Linkage Analysis

Zhu, Guohua 07 April 2005 (has links)
No description available.
199

Work Sampling and Methods Improvements in Shipment Preparation

McCann, Michael P. 01 January 1976 (has links) (PDF)
A study was undertaken in the Shipping Center of the Rohm and Haas Chemical Company in Philadelphia, Pennsylvania, to determine what course of action should be taken to reduce overall costs in a labor-oriented shipment preparation operation. This activity, which currently utilizes a complement of twenty-seven people, involves the preshipment labeling and stenciling of product and customer information onto metal drums and pails, plus various other preparation requirements. A work sampling study was performed to determine manpower requirements by work category, and this information was used to direct the methods improvements study toward the areas offering the greatest cost-reduction potential. By transferring responsibility for label and stencil preparation from the Shipment Preparers, who work on the shipping platforms, to Shipping Office personnel, and by changing stencil cutting and label storage methods, a net reduction of five people is projected.
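As general background on the technique named here (not a figure from this thesis), work sampling estimates the proportion p of time spent in each work category from N random observations; the standard relation between N, the estimated proportion p, the desired absolute precision ℓ, and the normal deviate z for the chosen confidence level is

```latex
% Standard work-sampling sample-size relation (general background, not from the thesis)
N = \frac{z^{2}\, p\,(1 - p)}{\ell^{2}}
```

For example, estimating a category occupying about 30% of the time to within ±5% at 95% confidence would require roughly 1.96² · 0.3 · 0.7 / 0.05² ≈ 323 observations.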
200

The effect of additional information on mineral deposit geostatistical grade estimates

Milioris, George J. (George Joseph) January 1983 (has links)
No description available.
