
Distribution of Particle Image Velocimetry (PIV) Errors in a Planar Jet

Howell, Jaron A. 01 May 2018 (has links)
Particle Image Velocimetry (PIV) is an optical technique for measuring fluid velocity fields. Two PIV systems were used to capture data simultaneously, and the measurement error of the MS PIV system was calculated. The distribution of this error was investigated to determine when uncertainty estimates from the CS PIV-UQ method fail. The number of passes required in multi-pass PIV processing was also investigated so that reliable uncertainty estimates are produced with the CS method. A further investigation determined that error distributions in PIV systems are correlated with flow shear and particle seeding density, and the spatial correlation of random errors was computed in the jet core and shear regions of the flow. In flow regions with large shear, error distributions were found to be non-Gaussian, and CS uncertainty results did not match the error. For multi-pass PIV processing with 50% and 75% interrogation window (IW) overlap, it was found that 4 and 6 passes, respectively, should be used for CS uncertainty estimates to be reliable. It was also found that the spatial correlation of random errors is much larger in the shear regions of the jet flow than in the jet core.
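The non-Gaussianity of PIV errors in high-shear regions reported above can be illustrated with a small synthetic sketch. Everything below is an assumption for illustration: the error samples are simulated, with a Laplace distribution standing in for the heavy-tailed shear-region behaviour, and the sample sizes and scales are arbitrary rather than taken from the thesis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical velocity-measurement errors (m/s): Gaussian in the jet
# core, heavy-tailed (Laplace) in the shear layer -- a stand-in for the
# non-Gaussian behaviour reported in high-shear regions.
err_core = rng.normal(0.0, 0.05, 2000)
err_shear = rng.laplace(0.0, 0.05, 2000)

# Excess kurtosis is ~0 for a Gaussian and positive for heavy tails,
# so it flags the non-Gaussian shear-region error distribution.
kurt_core = stats.kurtosis(err_core)
kurt_shear = stats.kurtosis(err_shear)
print(f"excess kurtosis -- core: {kurt_core:.2f}, shear: {kurt_shear:.2f}")
```

A formal normality test (e.g. `scipy.stats.shapiro`) applied to each sample would typically reject Gaussianity only for the shear-region errors in this setup.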

Ecological momentary assessment versus traditional retrospective self-reports as predictors of health-relevant outcomes

Zielke, Desiree Joy 05 September 2013 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Proponents of ecological momentary assessment (EMA) assert that it is superior to standard paper-and-pencil measurement in the reliability and validity of the information obtained; however, this claim has not yet been fully evaluated in the literature. Accordingly, the purpose of this study was to evaluate one aspect of this assertion by comparing the utility of EMA and retrospective measures of depressive symptoms in predicting health-relevant biological and behavioral outcomes. It was hypothesized that (1) the EMA measure would have better predictive utility for objective sleep quality (a biological outcome), and (2) the retrospective measure would have better predictive utility for blood donation intention (a behavioral outcome). Ninety-six undergraduate females participated in this 2-week study. Depressive symptoms were measured momentarily and retrospectively using the Center for Epidemiological Studies-Depression Scale (CES-D). The biological outcome was assessed by actigraphy, whereas the behavioral outcome was measured via a self-report questionnaire. Unfortunately, it was not possible to fully test these hypotheses because no relationships were observed between the predictor variables and the outcomes. The reported results, although limited, did not support the hypotheses. Supplemental analyses revealed a moderate to high amount of shared variance between the EMA and retrospective measures, a similar extent of random error in both measures, and potentially a greater degree of systematic error in the retrospective measure. Given the paucity of literature examining the claim of superior reliability and validity of EMA over retrospective measures, and the failure of the current study to evaluate this assertion sufficiently, the claim remains unfounded. Suggestions for future research are therefore provided.

Understanding patterns of aggregation in count data

Sebatjane, Phuti 06 1900 (has links)
The term aggregation refers to overdispersion, and the two are used interchangeably in this thesis. In addressing the problem of prevalent infectious parasite species faced by most rural livestock farmers, we model the distribution of faecal egg counts of 15 parasite species (13 internal parasites and 2 ticks) common in sheep and goats. Aggregation and excess zeroes are addressed through the use of generalised linear models. The abundance of each species was modelled using six different distributions: the Poisson, negative binomial (NB), zero-inflated Poisson (ZIP), zero-inflated negative binomial (ZINB), zero-altered Poisson (ZAP) and zero-altered negative binomial (ZANB); their fits were then compared. Excess-zero models (ZIP, ZINB, ZAP and ZANB) were found to fit better than standard count models (Poisson and negative binomial) in all 15 cases. We further investigated how the distributional assumption affects aggregation and zero inflation. Aggregation and zero inflation (measured by the dispersion parameter k and the zero-inflation probability) were found to vary greatly with the distributional assumption; this in turn changed the fixed-effects structure. Serial autocorrelation between adjacent observations was then taken into account by fitting observation-driven time series models to the data. Simultaneously accounting for autocorrelation, overdispersion and zero inflation proved successful, as zero-inflated autoregressive models performed better than zero-inflated models in most cases. Beyond its contribution to scientific knowledge, predictability of parasite burden will help farmers with effective disease management interventions. Researchers confronted with the task of analysing count data with excess zeroes can use the findings of this illustrative study as a guideline, irrespective of their research discipline.
Statistical methods from model selection and quantification of zero inflation through to accounting for serial autocorrelation are described and illustrated. / Statistics / M.Sc. (Statistics)
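The model comparison described in this abstract can be sketched in miniature. The code below is a hedged illustration, not the thesis's analysis: the data are simulated (not faecal egg counts), the parameter values are made up, and scipy is used in place of whatever software the thesis employed. It fits a plain Poisson and a zero-inflated Poisson by maximum likelihood and compares them by AIC; on zero-inflated data the excess-zero model wins.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

rng = np.random.default_rng(0)

# Simulated zero-inflated counts: with probability pi emit a structural
# zero, otherwise draw from Poisson(lam) -- mimicking excess zeroes.
pi_true, lam_true, n = 0.4, 3.0, 500
y = np.where(rng.random(n) < pi_true, 0, rng.poisson(lam_true, n))

# Plain Poisson: the MLE of lam is the sample mean.
ll_pois = poisson.logpmf(y, y.mean()).sum()

# Zero-inflated Poisson: P(Y=0) = pi + (1-pi)*exp(-lam); positive
# counts come from the Poisson component with weight (1 - pi).
def zip_nll(params):
    pi, lam = params
    ll = np.where(y == 0,
                  np.log(pi + (1 - pi) * np.exp(-lam)),
                  np.log(1 - pi) + poisson.logpmf(y, lam))
    return -ll.sum()

res = minimize(zip_nll, x0=[0.3, y[y > 0].mean()],
               bounds=[(1e-6, 1 - 1e-6), (1e-6, None)])

# AIC = 2k - 2*loglik; lower is better. The ZIP pays for one extra
# parameter but gains far more in likelihood on zero-inflated data.
aic_pois = 2 * 1 - 2 * ll_pois
aic_zip = 2 * 2 - 2 * (-res.fun)
```

The same comparison extends to NB versus ZINB by swapping in the negative binomial likelihood; zero-altered (hurdle) models such as ZAP and ZANB instead model the zeros separately and condition the count part on y > 0.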

Design optimization of heterogeneous microstructured materials

Emami, Anahita January 2014 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Our ability to engineer materials is limited by our capacity to tailor the material's microstructure morphology and to predict the resulting properties. The insufficient knowledge of the microstructure-property relationship is due to the complexity and randomness present in all materials at different scales. The objective of this research is to establish a design optimization methodology for microstructured materials. The material design problem is stated as finding the optimum microstructure that maximizes the desired performance while satisfying material processing constraints. This problem is solved in this thesis by means of numerical techniques through four main steps: microstructure characterization, model reconstruction, property evaluation, and optimization. Two methods of microstructure characterization have been investigated, along with the advantages and disadvantages of each. The first is a statistical method that utilizes correlation functions to extract microstructural information; algorithms for calculating these correlation functions have been developed and optimized for computational cost using MATLAB. The second is physical characterization, which works by evaluating physical features in the microstructured domain; these features have been measured by means of MATLAB codes. Three model reconstruction techniques are proposed based on these characterization methods and employed to generate material models for further evaluation. The first reconstruction algorithm uses statistical functions to reconstruct a statistically equivalent model through the simulated annealing optimization method.
The second algorithm uses cellular automaton concepts to simulate grain growth from physical descriptors, and the third generates elliptical inclusions in a material matrix using physical characteristics of the microstructure. The finite element method is used to analyze the mechanical behavior of the material models. Several material samples with different microstructural characteristics have been generated to model the micro-scale design domain of AZ31 magnesium alloy and of a magnesium matrix composite with silicon carbide fibers. Surrogate models have then been created from these samples to approximate the entire design domain and to demonstrate the sensitivity of the desired mechanical property to two independent microstructural features. Finally, the optimum microstructure characteristics of the material samples for fracture strength maximization have been obtained.
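The statistical characterization step described above typically rests on two-point correlation functions. The sketch below is a hedged illustration under stated assumptions: the binary microstructure is randomly generated (not from the thesis), the 30% volume fraction is arbitrary, periodic boundaries are assumed, and NumPy stands in for the thesis's MATLAB codes. It computes the two-point correlation S2 of the inclusion phase via FFT.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical binary microstructure on a 128x128 grid:
# 1 = inclusion phase, 0 = matrix (about 30% volume fraction).
micro = (rng.random((128, 128)) < 0.3).astype(float)

# Two-point correlation S2(r): the probability that two points
# separated by vector r both lie in the inclusion phase. With periodic
# boundaries it is the autocorrelation, computed cheaply via FFT.
F = np.fft.fft2(micro)
S2 = np.fft.ifft2(F * np.conj(F)).real / micro.size

vol_frac = micro.mean()
# Known limits: S2 at zero separation equals the volume fraction, and
# for an uncorrelated medium S2 decays toward vol_frac**2 at large r.
```

A simulated-annealing reconstruction, as in the first algorithm above, would then perturb a trial microstructure (e.g. by swapping pixels of the two phases) to minimize the discrepancy between its S2 and the target's.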
