41

Test Validity and Statistical Analysis

Sargsyan, Alex 17 September 2018 (has links)
No description available.
42

Statistical Analysis of Linear Analog Circuits Using Gaussian Message Passing in Factor Graphs

Phadnis, Miti 01 December 2009 (has links)
This thesis introduces a novel application of factor graphs to the domain of analog circuits. It proposes a technique that leverages factor graphs to perform statistical yield analysis of analog circuits much faster than standard Monte Carlo/Simulation Program with Integrated Circuit Emphasis (SPICE) simulation techniques. We have designed a tool chain that models an analog circuit as a corresponding factor graph and then passes Gaussian messages along the edges of the graph to compute yield. The tool can also estimate unknown circuit parameters from known output statistics through backward message propagation in the factor graph. It builds on domain-specific modeling for representing and interpreting different kinds of analog circuits: the Generic Modeling Environment (GME), a configurable tool set that supports the creation of domain-specific design environments, is used to build the modeling environment for analog circuits. This research has developed a generalized methodology that could be applied to the design automation of different kinds of analog circuits, both linear and nonlinear. The tool has been successfully used to model linear amplifier circuits and a nonlinear Metal Oxide Semiconductor Field Effect Transistor (MOSFET) circuit. Monte Carlo simulations of these circuits serve as the reference against which the tool's run time and accuracy are evaluated.
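For intuition, here is a minimal sketch of forward Gaussian message passing through a single linear node, applied to a hypothetical inverting-amplifier gain with an assumed 1% resistor spread. The circuit, sensitivities, and spec limits below are illustrative assumptions, not the tool chain or circuits of the thesis.

```python
import numpy as np
from math import erf, sqrt

def gaussian_forward(weights, bias, mean, cov):
    """Forward Gaussian message through a linear node y = w.x + b:
    if x ~ N(mean, cov), then y ~ N(w.mean + b, w cov w^T)."""
    mu_y = float(weights @ mean + bias)
    var_y = float(weights @ cov @ weights)
    return mu_y, var_y

def phi(z):
    # Standard normal CDF.
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def yield_estimate(mu_y, var_y, lo, hi):
    """Probability that the Gaussian output lands inside the spec window."""
    sigma = sqrt(var_y)
    return phi((hi - mu_y) / sigma) - phi((lo - mu_y) / sigma)

# Hypothetical inverting amplifier: gain g = -R2/R1, linearized at nominals,
# with assumed independent 1% manufacturing spreads on R1 and R2.
R1, R2 = 1e3, 10e3
mean = np.array([R1, R2])
cov = np.diag([(0.01 * R1) ** 2, (0.01 * R2) ** 2])
w = np.array([R2 / R1 ** 2, -1.0 / R1])   # first-order sensitivities of g
b = -R2 / R1 - w @ mean                   # so w.x + b linearizes g at nominal
mu_g, var_g = gaussian_forward(w, b, mean, cov)
print("gain ~ N(%.3f, %.3e)" % (mu_g, var_g))
print("yield for spec [-10.2, -9.8]: %.3f" % yield_estimate(mu_g, var_g, -10.2, -9.8))
```

A full factor graph chains many such nodes, and backward messages run the same Gaussian algebra in reverse to infer parameter statistics from observed outputs.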
43

Statistical Analysis of Longitudinal Data with a Case Study

Liu, Kai January 2015 (has links)
Preterm birth is the leading cause of neonatal mortality and long-term morbidity. Neonatologists can adjust the nutrition given to preterm neonates to control their weight gain so that the risk of long-term morbidity is minimized. Growth trajectories of preterm infants can be optimized by studying a cohort of selected healthy preterm infants whose weights were observed from day 1 to day 21. However, missing values in such data pose a serious challenge; indeed, missing data is a common problem faced by most applied researchers. Most statistical software deals with missing data by simply deleting subjects with missing items, and analyses carried out on such incomplete data yield biased estimates of the parameters of interest and consequently lead to misleading or invalid inference. Even though many statistical methods offer some robustness, it is better to impute the missing values with plausible ones and then carry out a suitable analysis on the completed data. In this thesis, several imputation methods are first introduced and discussed. Once the data are completed by any of these methods, the growth trajectories of this cohort of preterm infants can be presented as percentile growth curves, which can then serve as references for the population of preterm babies. To quantify the growth rate explicitly, we establish predictive models for weights at days 7, 14 and 21, using both univariate and multivariate linear models on the completed data. The resulting models can be used to calculate target weights at days 7, 14 and 21 for any other infant given the information at birth, so that neonatologists can adjust the amount of nutrition given to preterm infants and keep them from growing too fast or too slowly, thus avoiding later-life complications. / Thesis / Master of Science (MSc)
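As a concrete illustration of this workflow, the sketch below imputes missing daily weights with column means (one of many possible imputation methods), derives percentile growth curves, and fits a simple linear predictive model. The cohort size, weights, and missingness rate are synthetic assumptions, not data from the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: daily weights (grams) of 50 preterm infants, days 1..21.
n, days = 50, 21
birth = rng.normal(1400.0, 250.0, size=n)
growth = rng.normal(15.0, 3.0, size=n)               # grams/day, per infant
weights = birth[:, None] + growth[:, None] * np.arange(days)
weights[rng.random(weights.shape) < 0.10] = np.nan   # ~10% missing at random

# Simple column-mean imputation.
col_mean = np.nanmean(weights, axis=0)
filled = np.where(np.isnan(weights), col_mean, weights)

# Percentile growth curves for the completed cohort (reference trajectories).
p10, p50, p90 = np.percentile(filled, [10, 50, 90], axis=0)

# Univariate linear model: predict day-14 weight from day-1 (birth) weight.
X = np.column_stack([np.ones(n), filled[:, 0]])
coef, *_ = np.linalg.lstsq(X, filled[:, 13], rcond=None)
print("day-14 weight ~ %.1f + %.3f * birth weight" % (coef[0], coef[1]))
```

Multiple imputation or model-based methods would replace the mean-fill step; the downstream percentile curves and regression are unchanged.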
44

Investigation of Quantitative NMR by Statistical Analysis

Lao, Lydia Lai-Mui 03 1900 (has links)
Quantitation by nuclear magnetic resonance (NMR) has usually been done with integrals, and the use of peak heights has been thought to be unreliable. The aim of this thesis is to examine the reliability of the peak-height method for quantitative NMR measurements of small molecules such as water and sucrose. Isotope measurements have traditionally been done by isotope-ratio mass spectrometry, a highly sensitive technique for analyzing pure samples, but the analysis of mixtures is not as straightforward as that of pure samples. Ontario Hydro has encountered problems in measuring deuterium in DMSO/water mixtures. To solve the problem with NMR, an analytical method was established to measure the deuterium content in waters and in DMSO/water mixtures. This involved testing a linear model for analyzing waters enriched or depleted in deuterium, and applying the model to quantify DMSO/water mixtures. Both 1H and 2H NMR were employed, and satisfactory accuracy and precision were obtained. For quantitative 13C work, the peak-height method is often not recommended because of variations in signal width, which result from varied T2 values and nuclear Overhauser enhancement (NOE). Sucrose molecules in cane sugar and beet sugar have different 13C isotopic ratios because they are synthesized by different photosynthetic pathways. To assess the usefulness and limitations of the peak-height method, 13C spectra of sucrose were acquired and the carbon peaks were quantified. Good precision was achieved, but no predictable trend in the isotope difference could be found. / Thesis / Master of Science (MSc)
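To make the linear-model step concrete, here is a minimal calibration sketch in which deuterium content is estimated from a 2H peak height via a fitted straight line. All concentrations and heights are invented for illustration; they are not the thesis's measurements.

```python
import numpy as np

# Hypothetical calibration: 2H peak heights for water standards of known
# deuterium content (ppm); values are illustrative only.
deuterium_ppm = np.array([90.0, 120.0, 150.0, 180.0, 210.0])
peak_height = np.array([0.31, 0.42, 0.52, 0.61, 0.72])   # arbitrary units

# Fit the linear model: height = a * concentration + b.
a, b = np.polyfit(deuterium_ppm, peak_height, 1)

# Invert the model to quantify an unknown sample from its peak height.
unknown_height = 0.47
estimate = (unknown_height - b) / a
print("estimated deuterium content: %.1f ppm" % estimate)

# Residual scatter indicates the precision of the peak-height method.
residuals = peak_height - (a * deuterium_ppm + b)
print("RMS residual: %.4f" % np.sqrt(np.mean(residuals ** 2)))
```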
45

An Investigation of Variables Associated with Mortality in a Broiler Complex in Mississippi

Johnson, Leslie B 03 May 2019 (has links)
A southern Mississippi broiler complex in an area of high poultry density experienced persistently lower livability and growth performance compared with company averages for the state. It was hypothesized that circulating Infectious Bronchitis Virus (IBV) challenge, exacerbated by underlying Infectious Bursal Disease (IBD)-induced immune suppression, was the primary contributor to reduced livability and live production performance on certain farms, and that disease challenges were most prevalent on farms in areas of high bird density. A retrospective analysis of data from a three-year period (March 2014 through March 2017) was designed to investigate the role of disease, settlement, geographic, and weather variables in broiler mortality. A database was created comprising diagnostic variables (processing-age ELISA titers for IBD, IBV, Newcastle Disease Virus (NDV), and Reovirus (REO)), settlement variables (downtime, age at processing, average weight at processing, week 1 mortality, genetic line, year, and broiler vaccination programs), geographic variables (number of commercial chicken farms and houses within 1 km, 5 km, 10 km and 15 km radii), and weather variables (average temperature, average heat index, and average humidity for the first 7 days and last 14 days of grow-out), and analyzed using univariable and multivariable statistical analyses. First-week mortality, processing age, average processing weight, genetic line, NDV/IBV vaccination program, and heat index in the last 14 days of the grow-out period were found to be significantly associated with flock mortality in this broiler complex (P ≤ 0.05). The results of this study should guide future management and disease control strategies aimed at reducing broiler mortality. Future studies with more diagnostic data are needed to further investigate the relative contribution of diseases to broiler flock mortality.
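For readers unfamiliar with the multivariable step of such an analysis, here is a hedged sketch of a multivariable linear model fitted with statsmodels. The flock records, predictor set, and coefficients are synthetic stand-ins, not the study's data or its exact model family.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)

# Hypothetical flock-level records (synthetic, for illustration only).
n = 200
week1_mort = rng.normal(1.2, 0.4, n)     # % first-week mortality
proc_age = rng.normal(56.0, 3.0, n)      # days at processing
heat_index = rng.normal(85.0, 8.0, n)    # avg heat index, last 14 days
mortality = (2.0 + 1.5 * week1_mort + 0.05 * proc_age
             + 0.04 * heat_index + rng.normal(0, 0.5, n))

# Multivariable ordinary least squares: all predictors entered together.
X = sm.add_constant(np.column_stack([week1_mort, proc_age, heat_index]))
fit = sm.OLS(mortality, X).fit()
print(fit.summary(xname=["const", "week1_mort", "proc_age", "heat_index"]))
# Predictors with P <= 0.05 in the multivariable fit would be retained.
```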
46

Density estimation and some topics in multivariate analysis

Gaskins, Ray Allen 14 October 2005 (has links)
Part I, entitled "A test of goodness-of-fit for multivariate distributions with special emphasis on multinormality", investigates a modification of the Chi-squared goodness-of-fit statistic which eliminates certain objectionable properties of other multivariate goodness-of-fit tests. Special emphasis is given to the multinormal distribution, and computer simulation is used to generate an empirical distribution for this goodness-of-fit statistic for the standardized bivariate normal density. Attempts to fit a four-parameter generalized gamma density function to this empirical distribution were only partially successful. Part II, entitled "The centroid method of numerical integration", begins with a discussion of the often slighted midpoint method of numerical integration; then, using Taylor's theorem, generalized formulae for the centroid method of numerical integration of a function of several variables over a closed bounded region are developed. These formulae are in terms of the derivatives of the integrand and the moments of the region of integration with respect to its centroid. Since most nonpathological bounded regions can be well approximated by a finite set of simplexes, formulae are developed for the moments of general as well as special simplexes. Several numerical examples are given and a comparison is made between the midpoint and Gaussian quadrature methods. FORTRAN programs are included. Part III, entitled "Non-parametric density estimation", begins with an extensive literature review of non-parametric methods for estimating probability densities based on a sample of N observations and goes on to suggest a new method, which is to subtract a penalty for roughness from the log-likelihood before maximizing. The roughness penalty is a functional of the assumed density function, and the recommendation is to use a linear combination of the squares of the first and second derivatives of the square root of the density function. Many numerical examples and graphs are given and show that the estimated density function, for selected values of the coefficients in the linear expression, turns out to be very smooth even for very small sample sizes. Computer programs are not included but are available upon request. Part IV, entitled "On separation of product and error variability", surveys standard techniques of partitioning the total variance into product (or item) variance and error (or testing) variance when destructive testing makes replication over the same item impossible. The problem of negative variance estimates is also investigated. The factor-analysis model and related iterative techniques are suggested as an alternative method for dealing with this separation when three or more independent measurements per item are available. The problem of dependent measurements is discussed. Numerical examples are included. / Ph. D.
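As a quick illustration of the midpoint/centroid idea from Part II, the sketch below applies the rule in one and two dimensions: each cell contributes its measure times the integrand evaluated at the cell's centroid. The thesis develops the method for general simplexes; rectangular cells are an assumption made here for brevity.

```python
import numpy as np

def midpoint_integrate(f, lo, hi, n=200):
    """Composite midpoint rule for a function of one variable on [lo, hi]."""
    x = lo + (np.arange(n) + 0.5) * (hi - lo) / n
    return (hi - lo) / n * np.sum(f(x))

def centroid_integrate_2d(f, xlo, xhi, ylo, yhi, n=100):
    """Midpoint (centroid) rule on an n-by-n grid of rectangular cells:
    each cell contributes cell_area * f(cell centroid)."""
    hx, hy = (xhi - xlo) / n, (yhi - ylo) / n
    xc = xlo + (np.arange(n) + 0.5) * hx
    yc = ylo + (np.arange(n) + 0.5) * hy
    X, Y = np.meshgrid(xc, yc)
    return hx * hy * np.sum(f(X, Y))

# Example: integrate exp(-(x^2 + y^2)) over [-3, 3]^2.
# The exact value is pi * erf(3)^2, i.e. approximately 3.1414.
approx = centroid_integrate_2d(lambda x, y: np.exp(-(x**2 + y**2)), -3, 3, -3, 3)
print(approx)
```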
47

Automated Data Type Identification And Localization Using Statistical Analysis Data Identification

Moody, Sarah Jean 01 December 2008 (has links)
This research presents a new and unique technique, called SÁDI (statistical analysis data identification), for identifying the type and storage format of data on a digital device from the values of the bytes that represent the data being examined. This research incorporates the automation required for specialized data-identification tools to be useful and applicable in real-world applications. The SÁDI technique uses the byte values of the data stored on a digital storage device in such a way that the accuracy of the technique does not rely solely on potentially misleading metadata but rather on the values of the data itself. SÁDI thus provides the capability to identify what digitally stored data actually represents; identifying the relevancy of data often depends on identifying its type. Typical file-type identification is based on file extensions or magic keys. These techniques fail in many common forensic-analysis scenarios, such as dealing with embedded data (as in the case of Microsoft Word files) or file fragments, and they can also be easily circumvented, as individuals with nefarious purposes often do.
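A minimal sketch of byte-value statistics for type identification follows. The distance measure (nearest profile by Euclidean distance on byte histograms) and the two training profiles are assumptions for illustration, not the statistics SÁDI actually employs.

```python
import numpy as np

def byte_histogram(data: bytes) -> np.ndarray:
    """Normalized frequency of each of the 256 possible byte values."""
    counts = np.bincount(np.frombuffer(data, dtype=np.uint8), minlength=256)
    return counts / max(len(data), 1)

def classify(fragment: bytes, profiles: dict) -> str:
    """Assign the fragment to the known-type profile whose byte histogram
    is closest to the fragment's (no reliance on extensions or magic keys)."""
    h = byte_histogram(fragment)
    return min(profiles, key=lambda t: np.linalg.norm(h - profiles[t]))

# Hypothetical profiles built from labeled training data.
profiles = {
    "ascii_text": byte_histogram(b"the quick brown fox jumps over the lazy dog " * 100),
    "random_or_compressed": byte_histogram(np.random.default_rng(1).bytes(4096)),
}
print(classify(b"Plain English sentences look like this one.", profiles))
```

Because the decision depends on the bytes themselves, the same classification applies to file fragments and embedded data, where extension- and magic-key-based identification has nothing to work with.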
48

Testing the Significance of Summary Response Functions

Gray, B. M., Pilcher, J. R. January 1983 (has links)
A simple method of testing the statistical significance of the summary response function derived by Pilcher and Gray is given and applied to European oak data.
49

Mathematical modeling of the fermentation process for retamycin production by the filamentous microorganism Streptomyces olindensis

Lopes, Juliana Silva 05 June 2007 (has links)
The mathematical modeling of the production of the antitumor agent retamycin by the filamentous microorganism Streptomyces olindensis was studied for batch, fed-batch, and continuous cultivations. Mathematical modeling makes it possible to examine the behavior of the factors that affect the production of this secondary metabolite and thus to identify the best process conditions. Three different models were studied: a morphologically structured model, an unstructured model, and a hybrid model that combines mass-balance equations with artificial neural networks. The morphologically structured model refines a previously described model, while the unstructured model was developed as an attempt to simplify the description of the process by considering fewer variables and fewer adjustable parameters. The variables fitted by the models were the concentrations of biomass, glucose, retamycin, and dissolved oxygen in the medium. Simulation results were evaluated statistically against the experimental data, including model discrimination and tests of adequacy, and the models were also compared with one another by statistical analysis. Among the models studied, the hybrid model showed pronounced sensitivity to the initial conditions, and its capability of representing the experimental data was worse than that of the other models. The morphologically structured and unstructured models showed similar ability to represent the experimental data for the batch, fed-batch, and low-dilution-rate continuous runs.
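To give a flavor of what an unstructured fermentation model looks like, here is a sketch of Monod growth with Luedeking-Piret product kinetics for a batch run. The kinetic form and every parameter value are assumptions chosen for illustration, not those fitted in the thesis, and the dissolved-oxygen balance tracked in the thesis is omitted for brevity.

```python
import numpy as np
from scipy.integrate import odeint

# Illustrative parameters (not fitted values from the thesis).
mu_max, Ks = 0.25, 0.5     # max specific growth rate (1/h), Monod constant (g/L)
Yxs, ms = 0.5, 0.02        # biomass yield (gX/gS), maintenance (gS/gX/h)
alpha, beta = 0.05, 0.002  # growth- and non-growth-associated production

def batch(y, t):
    X, S, P = y                    # biomass, glucose, retamycin (g/L)
    S = max(S, 0.0)                # guard against numerical overshoot
    mu = mu_max * S / (Ks + S)     # Monod specific growth rate
    dX = mu * X
    dS = -(mu / Yxs + ms) * X if S > 0 else 0.0
    dP = alpha * mu * X + beta * X # Luedeking-Piret product kinetics
    return [dX, dS, dP]

t = np.linspace(0, 48, 200)                 # 48-hour batch
sol = odeint(batch, [0.1, 20.0, 0.0], t)    # X0, S0, P0 in g/L
print("final biomass %.2f g/L, retamycin %.3f g/L" % (sol[-1, 0], sol[-1, 2]))
```

A morphologically structured model would split X into hyphal compartments with transitions between them; the hybrid approach replaces the kinetic rate expressions with a trained neural network inside the same mass balances.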
50

Geoid analysis and blind source separation for the understanding of the vertical organization of mass anomalies within the mantle

Grabkowiak, Alexia 21 February 2017 (has links)
Since the early 1980s, technical progress has made it possible to estimate quantitatively the relations between the Earth's internal structures and its shape. Statistical methods applied to a tomographic model of the Mediterranean region extract three components that capture almost 70% of the variance of the tomographic data. The first corresponds to the reaction of the upper mantle to the presence of lithosphere at 660 km depth; the second records the reaction of the mantle to subvertical lithospheric slabs in the upper mantle; the third captures seismic velocity variations at the tops of the upper and lower mantle. The effect of these three dynamics on the geoid is modeled under three assumptions. (i) All structures recorded in the tomographic model are sources of geoid anomalies; the mantle component of the geoid is computed by applying gravito-visco-elasticity theory, taking into account the deflection of viscosity interfaces. This approach yields a smooth, low-amplitude mantle component and does not suffice to model the whole of it. (ii) The tomographic data register not only the sources but all readjustment phenomena; the mantle component is computed by integrating the mass anomalies of the tomographic model, which yields detailed structures but overestimates the mantle component relative to the regional geoid. (iii) Each statistical component isolates a mantle phenomenon tied to a specific source, to which gravito-visco-elasticity theory is applied individually; the resulting mantle component is detailed, with wavelength and amplitude compatible with the geoid. The presence of lithospheric caps at the base of the transition zone is thus likely to be visible in the mantle component of the geoid, whereas the geoid is not sensitive to subvertical slabs within the upper mantle.
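For orientation, the sketch below extracts variance-ranked components from a data matrix with an SVD-based principal component analysis. The thesis applies blind source separation to real tomographic data; PCA on a random stand-in matrix is used here only to illustrate the "fraction of variance captured" bookkeeping behind the three components.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a tomographic model: velocity anomalies sampled
# at 500 map points across 30 depth layers (the real data set differs).
data = rng.normal(size=(500, 30)) @ rng.normal(size=(30, 30))

# Principal component analysis via SVD of the centered data matrix.
centered = data - data.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)

print("variance captured by first 3 components: %.1f%%"
      % (100 * explained[:3].sum()))
# Each row of Vt[:3] is a depth profile; its map pattern is U[:, :3] * s[:3].
```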
