791 |
Degradation modeling and monitoring of engineering systems using functional data analysis
Zhou, Rensheng, 08 November 2012
In this thesis, we develop several novel degradation models based on techniques from functional data analysis. These models are suitable for characterizing different types of sensor-based degradation signals, whether they are censored at a certain fixed time point or truncated at the failure threshold. Our proposed models can also be easily extended to accommodate the effects of environmental conditions on degradation processes. Unlike many existing degradation models that rely on the existence of a historical sample of complete degradation signals, our modeling framework is well-suited for modeling complete as well as incomplete (sparse and fragmented) degradation signals. We utilize these models to predict and continuously update, in real time, the residual life distributions of partially degraded components. We assess and compare the performance of our proposed models and existing benchmark models using simulated signals and real-world data sets. The results indicate that our models provide a better characterization of the degradation signals and a more accurate prediction of a system's lifetime under different signal scenarios. Another major advantage of our models is their robustness to model misspecification, which is especially important for applications with incomplete degradation signals (sparse or fragmented).
|
792 |
Statistical Modeling of High-Dimensional Nonlinear Systems: A Projection Pursuit Solution
Swinson, Michael D., 28 November 2005
Despite recent advances in statistics, artificial neural network theory, and machine learning, nonlinear function estimation in high-dimensional space remains a nontrivial problem. As the response surface becomes more complicated and the dimensions of the input data increase, the dreaded "curse of dimensionality" takes hold, rendering the best of function approximation methods ineffective. This thesis takes a novel approach to solving the high-dimensional function estimation problem. In this work, we propose and develop two distinct parametric projection pursuit learning networks with wide-ranging applicability. Included in this work is a discussion of the choice of basis functions used as well as a description of the optimization schemes utilized to find the parameters that enable each network to best approximate a response surface.
The essence of these new modeling methodologies is to approximate functions via the superposition of a series of piecewise one-dimensional models that are fit to specific directions, called projection directions. The key to the effectiveness of each model lies in its ability to find efficient projections for reducing the dimensionality of the input space to best fit an underlying response surface. Moreover, each method is capable of effectively selecting appropriate projections from the input data in the presence of relatively high levels of noise. This is accomplished by rigorously examining the theoretical conditions for approximating each solution space and taking full advantage of the principles of optimization to construct a pair of algorithms, each capable of effectively modeling high-dimensional nonlinear response surfaces to a higher degree of accuracy than previously possible.
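The core iteration, finding a projection direction and fitting a one-dimensional model along it, can be made concrete with a short sketch. The following minimal greedy projection pursuit regression in Python illustrates the general technique, not the thesis's two networks: the polynomial smoother, the Nelder-Mead direction search, and the random restarts are all simplifying assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def fit_ridge_term(X, r, degree=5, n_starts=5):
    """Find one unit-norm projection direction w and a 1-D polynomial ridge
    function g so that g(X @ w) approximates the current residual r."""
    d = X.shape[1]

    def loss(w):
        w = w / (np.linalg.norm(w) + 1e-12)
        z = X @ w                              # 1-D projection of the inputs
        coef = np.polyfit(z, r, degree)        # cheap 1-D smoother
        return np.mean((r - np.polyval(coef, z)) ** 2)

    starts = np.random.default_rng(0).normal(size=(n_starts, d))
    best = min((minimize(loss, w0, method="Nelder-Mead") for w0 in starts),
               key=lambda res: res.fun)
    w = best.x / np.linalg.norm(best.x)
    return w, np.polyfit(X @ w, r, degree)

def projection_pursuit(X, y, n_terms=3):
    """Greedy PPR: y ~ mean(y) + sum_m g_m(X @ w_m)."""
    r, terms = y - y.mean(), []
    for _ in range(n_terms):
        w, coef = fit_ridge_term(X, r)
        r = r - np.polyval(coef, X @ w)        # peel off the fitted term
        terms.append((w, coef))
    return y.mean(), terms

# Example: a 10-D input whose response depends on only two directions.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 10))
y = np.sin(X[:, 0] + X[:, 1]) + 0.5 * (X[:, 2] - X[:, 3]) ** 2
mean, terms = projection_pursuit(X, y, n_terms=2)
```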
|
793 |
The influence of critical asset management facets on improving reliability in power systems
Perkel, Joshua, 04 November 2008
The objective of the proposed research is to develop statistical algorithms for controlling failure trends through targeted maintenance of at-risk components. The at-risk components are identified via their chronological failure history and diagnostic data, where available. Utility systems include many thousands (possibly millions) of components, many of which have already exceeded their design lives. Unfortunately, neither the budget nor the manufacturing resources exist to allow for the immediate replacement of all these components. On the other hand, the utility cannot tolerate a decrease in reliability or the associated increase in costs. To combat this problem, an overall maintenance model has been developed that utilizes all the available historical information (failure rates and population sizes) and diagnostic tools (real-time conditions of each component) to generate a maintenance plan. This plan must be capable of delivering the needed reliability improvements while remaining economical. It consists of three facets, each of which addresses one of the critical asset management issues:
* Failure Prediction Facet - Statistical algorithm for predicting future failure trends and estimating required numbers of corrective actions to alter these failure trends to desirable levels. Provides planning guidance and expected future performance of the system.
* Diagnostic Facet - Development of diagnostic data and techniques for assessing the accuracy and validity of that data. Provides the true effectiveness of the different diagnostic tools that are available.
* Economics Facet - Stochastic model of economic benefits that may be obtained from diagnostic directed maintenance programs. Provides the cost model that may be used for budgeting purposes.
These facets function together to generate a diagnostic directed maintenance plan whose goal is to provide the best available guidance for maximizing reliability gains within the budgetary limits that utility engineers must operate under.
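As a rough illustration of what the failure prediction facet computes, the sketch below fits a log-linear failure trend to invented yearly failure counts and back-solves for the number of corrective actions needed to reach a target failure level. The data, the trend form, and the assumption that each corrective action removes one component's hazard are all hypothetical; the thesis's actual algorithm is not reproduced here.

```python
import numpy as np

# Hypothetical historical data: yearly failures in a population of components.
years = np.array([0, 1, 2, 3, 4, 5])           # years since records began
failures = np.array([12, 15, 19, 24, 31, 38])  # observed failures per year
population = 10_000                            # components in service

# Fit a log-linear failure trend: rate(t) = exp(a + b*t) per component-year.
b, a = np.polyfit(years, np.log(failures / population), 1)

def projected_failures(t):
    return population * np.exp(a + b * t)

# Corrective actions needed next year to pull expected failures down to a
# target, assuming each action removes one component's hazard exp(a + b*t).
target = 20
t_next = years[-1] + 1
rate_next = np.exp(a + b * t_next)
excess = projected_failures(t_next) - target
actions_needed = max(0.0, excess / rate_next)
print(f"Projected failures: {projected_failures(t_next):.1f}, "
      f"corrective actions to reach {target}: {actions_needed:.0f}")
```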
|
794 |
Optimal design of mesostructured materials under uncertainty
Patel, Jiten, 24 August 2009
The main objective of topology optimization is to fulfill the objective function with the minimum amount of material. This reduces the overall cost of the structure and, because the final structure contains fewer parts, also reduces assembly, manufacturing, and maintenance costs. The concept of reliability analysis can be incorporated into the deterministic topology optimization method; this incorporated scheme is referred to as Reliability-based Topology Optimization (RBTO). In RBTO, the statistical nature of the constraints and the design problem is captured in the objective function and a probabilistic constraint, which specifies the required reliability level of the system. In practical applications, however, finding the global optimum in the presence of uncertainty is a difficult and computationally intensive task, since a full stochastic analysis must be performed for every candidate design to estimate the various statistical parameters. Efficient methodologies are therefore required for both the stochastic part and the optimization part of the design process.
This research explores a reliability-based synthesis method that estimates all the statistical parameters and finds the optimum while being less computationally intensive. The efficiency of the proposed method is achieved by combining topology optimization with stochastic approximation, which utilizes a sampling technique such as Latin Hypercube Sampling (LHS) and surrogate modeling techniques such as local regression and classification using artificial neural networks (ANNs). Local regression is comparatively inexpensive and produces good results when failure probabilities are low, whereas classification is particularly useful when the probability of failure must be estimated over disjoint failure domains. Because classification using an ANN is more computationally demanding than local regression, classification is used only when local regression fails to give the desired level of goodness of fit. Nevertheless, classification is an indispensable tool for estimating the probability of failure when the failure domain is discontinuous.
Representative examples are demonstrated in which the method is used to design customized meso-scale truss structures and a macro-scale hydrogen storage tank. The final deliverable from this research is a less computationally intensive and robust RBTO procedure that can be used for the design of truss structures with variable design parameters, forces, and boundary conditions.
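The sampling-plus-surrogate idea can be sketched in a few lines. The example below uses an invented two-variable limit state, a Latin Hypercube design from scipy, and a single global quadratic fit standing in for the thesis's local regression and ANN classification; it is a minimal illustration, not the proposed RBTO procedure.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical limit state: failure occurs when g(x) < 0.
def limit_state(x):
    return 3.0 - x[:, 0]**2 - 0.5 * x[:, 1]

# 1. Latin Hypercube sample of the random inputs (uniform on [-2, 2]^2).
sampler = qmc.LatinHypercube(d=2, seed=0)
X = qmc.scale(sampler.random(n=200), l_bounds=[-2, -2], u_bounds=[2, 2])
y = limit_state(X)  # the "expensive" evaluations happen only here

# 2. Cheap surrogate: global quadratic fit by least squares (a stand-in
#    for the thesis's local regression / ANN classification).
def features(X):
    return np.column_stack([np.ones(len(X)), X, X**2, X[:, :1] * X[:, 1:]])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# 3. Estimate the probability of failure by Monte Carlo on the surrogate.
rng = np.random.default_rng(1)
Xmc = rng.uniform(-2, 2, size=(100_000, 2))
p_f = np.mean(features(Xmc) @ coef < 0)
print(f"Estimated probability of failure: {p_f:.4f}")
```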
|
795 |
Characterization and impact of ambient air pollution measurement error in time-series epidemiologic studies
Goldman, Gretchen Tanner, 28 June 2011
Time-series studies of ambient air pollution and acute health outcomes utilize measurements from fixed outdoor monitoring sites to assess changes in pollution concentration relative to time-variable health outcome measures. These studies rely on measured concentrations as a surrogate for population exposure. The degree to which monitoring site measurements accurately represent true ambient concentrations is of interest from both an etiologic and a regulatory perspective, since associations observed in time-series studies are used to inform health-based ambient air quality standards. Air pollutant measurement errors associated with instrument precision and lack of spatial correlation between monitors have been shown to attenuate associations observed in health studies. Characterization of, and adjustment for, air pollution measurement error can therefore improve effect estimates in time-series studies. Measurement error was characterized for 12 ambient air pollutants in Atlanta. Simulations of instrument and spatial error were generated for each pollutant, added to a reference pollutant time-series, and used in a Poisson generalized linear model of air pollution and cardiovascular emergency department visits. This method allows for pollutant-specific quantification of the impact of measurement error on health effect estimates, both the assessed strength of association and its significance. To characterize the amount and type of error present in the Atlanta measurements, air pollutant concentrations were simulated over the 20-county metropolitan area for a 6-year period, incorporating several distribution characteristics observed in the measurement data. The simulated concentration fields were then used to characterize the amount and type of error due to spatial variability in ambient concentrations, as well as the impact of using different exposure metrics in a time-series epidemiologic study. Finally, the methodologies developed for the Atlanta area were applied to air pollution measurements in Dallas, Texas, with consideration of the impact of this error on an ongoing health study of the Dallas-Fort Worth region.
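The simulation logic described here, adding error to a reference series and refitting the health model, can be illustrated with a hedged sketch. All data below are synthetic, the association strength is invented, and only classical additive instrument-like error is shown; the actual study covers 12 pollutants and richer error structures.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_days = 2000

# Synthetic "true" ambient concentration series and daily ED visit counts
# generated with a known log-linear association beta_true.
z_true = np.exp(rng.normal(np.log(10), 0.4, n_days))   # reference series
beta_true = 0.02
counts = rng.poisson(np.exp(3.0 + beta_true * z_true)) # health outcome

def fit_beta(z):
    """Poisson GLM of daily counts on the pollutant series."""
    X = sm.add_constant(z)
    return sm.GLM(counts, X, family=sm.families.Poisson()).fit().params[1]

# Add increasing amounts of classical measurement error and observe the
# attenuation of the estimated health effect.
for sigma in [0.0, 2.0, 5.0]:
    z_obs = z_true + rng.normal(0, sigma, n_days)
    print(f"error sd={sigma}: beta_hat={fit_beta(z_obs):.4f} "
          f"(true {beta_true})")
```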
|
796 |
Use of geostatistics in developing drilling programs at the Cananea copper mine
Cervantes-Montoya, Jesús Alberto, January 1981
No description available.
|
797 |
Automatic extraction of lexico-semantic knowledge from electronic text corpora using minimal resources
Θανόπουλος, Αριστομένης, 25 June 2007
The research described in this dissertation concerns the automatic extraction of collocations and lexico-semantic word similarities from large text corpora. We follow an approach based on minimal linguistic resources in order to achieve unrestricted portability across languages and thematic domains. To evaluate the proposed methods we propose, evaluate, and apply methodologies based on English gold-standard lexical resources, such as WordNet. For the extraction of collocations we propose and test several novel measures for the identification of statistically significant bigrams and, more generally, n-grams, which exhibit strong performance. For the extraction of lexico-semantic similarities we follow a distributional, window-based approach. We study the contextual scope, the filtering of lexical co-occurrences, and the performance of similarity measures; we propose incorporating the number of common parameters into the latter, exploiting functional words, and a method for eliminating systematic errors. Moreover, we propose a novel approach to exploiting word-sequence similarities, common in technical texts, based on the cross-correlation of word sequences. We refine an approach for extracting word similarity from coordinations, and we propose a method for amalgamating lexico-semantic similarity databases extracted via different principles and methods. Finally, the extracted similarity knowledge is transformed into soft hierarchical semantic clusters and successfully incorporated into a machine-learning-based dialogue system, where it reinforces the performance of user plan recognition by estimating the semantic role of unknown words.
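As a point of reference for the collocation task, the sketch below scores bigrams with pointwise mutual information (PMI), a standard association measure; the thesis's own novel significance measures are not reproduced here.

```python
import math
from collections import Counter

def bigram_scores(tokens, min_count=2):
    """Score adjacent bigrams with PMI = log2( p(w1,w2) / (p(w1) p(w2)) )."""
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    n = len(tokens)
    scores = {}
    for (w1, w2), c in bigrams.items():
        if c < min_count:          # frequency filter for stability
            continue
        pmi = math.log2((c / n) / ((unigrams[w1] / n) * (unigrams[w2] / n)))
        scores[(w1, w2)] = pmi
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

text = "the new york times reported that new york is large".split()
for pair, s in bigram_scores(text):
    print(pair, round(s, 2))       # ('new', 'york') surfaces as a collocation
```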
|
798 |
Dynamic moment analysis of non-stationary temperature data in Alberta
Zhou, Qixuan, January 2010
Strong seasonality is observed in the volatile hourly Alberta temperature and in its low- and high-order statistical moments. We propose a time series model consisting of a linear combination of an annual sinusoidal model, a diurnal sinusoidal model and a fractional residual model, to study the characteristics of these spatial and time-dependent Alberta temperatures. Wavelet multi-resolution analysis is used to measure Hurst exponents of the temperature series. Our empirical results show that these Hurst exponents vary over various time scales, indicating the existence of multi-fractality in the temperatures. Such temperature models are important for the pricing and insurance of agricultural crops, tourist resorts, and the forms of energy extraction and generation central to Alberta's resource-based economy. Of particular interest are the observed extreme volatilities in the winters, caused by the unpredictable Chinook winds, which may be an important reason to introduce a Chinook insurance option.
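The wavelet-based Hurst estimation can be sketched with PyWavelets. The sketch below assumes a stationary, fGn-like residual series, for which the detail-coefficient variance scales as log2 Var(d_j) ~ (2H - 1) j + c; the wavelet choice and decomposition depth are illustrative, not those used in the thesis.

```python
import numpy as np
import pywt  # PyWavelets

def hurst_wavelet(x, wavelet="db4", max_level=8):
    """Estimate the Hurst exponent from the scaling of wavelet
    detail-coefficient variances across decomposition levels."""
    coeffs = pywt.wavedec(x, wavelet, level=max_level)
    details = coeffs[1:]                   # ordered coarsest -> finest
    levels = np.arange(max_level, 0, -1)   # level j for each detail band
    logvar = np.log2([np.var(d) for d in details])
    slope = np.polyfit(levels, logvar, 1)[0]
    return (slope + 1) / 2                 # fGn relation: slope = 2H - 1

# Sanity check on white noise, which has H = 0.5.
rng = np.random.default_rng(0)
print(hurst_wavelet(rng.normal(size=2**14)))  # ~0.5
```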
|
799 |
Estimation and analysis of measures of disease for HIV infection in childbearing women using serial seroprevalence data
Sewpaul, Ronel, January 2011
The prevalence and the incidence are two primary epidemiological parameters in infectious disease modelling. The incidence is also closely related to the force of infection, or, in survival analysis terms, the hazard of infection; the two measures carry the same information about a disease because both measure the rate at which new infections occur. The disease prevalence gives the proportion of infected individuals in the population at a given time, while the incidence is the rate of new infections.
The thesis discusses methods for estimating HIV prevalence, incidence rates and the force of infection, against age and time, using cross-sectional seroprevalence data for pregnant women attending antenatal clinics. The data were collected on women aged 12 to 47 in rural KwaZulu-Natal for each of the years 2001 to 2006.
The generalized linear model for a binomial response is used extensively. First, the logistic regression model is used to estimate annual HIV prevalence by age. It was found that the estimated prevalence for each year increases with age, to peaks of between 36% and 57% in the mid to late twenties, before declining steadily toward the forties. Fitted prevalence for 2001 is lower than for the other years across all ages.
Several models for estimating the force of infection are discussed and applied. The fitted force of infection rises with age to a peak of 0.074 at age 15, and then decreases toward higher ages. The force of infection measures the potential risk of infection per individual per unit time. A proportional hazards model of the age to infection is applied to the data, and shows that additional variables such as partner's age and the number of previous pregnancies have a significant effect on the infection hazard.
Studies for estimating incidence from multiple prevalence surveys are reviewed. The relative inclusion rate (RIR), which accounts for the fact that the probability of inclusion in a prevalence sample depends on the individual's HIV status, is discussed, together with its role in incidence estimation, as a possible future extension of the current work. / Thesis (M.Sc.), University of KwaZulu-Natal, Pietermaritzburg, 2011.
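The age-prevalence modelling step can be sketched with a binomial GLM. The grouped data below are invented (a prevalence curve peaking in the mid-twenties, loosely echoing the reported shape), and the quadratic age term is an assumption; this is not the thesis's fitted model.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical grouped antenatal data: per age, n tested and n HIV-positive.
age = np.arange(15, 45)
n = np.full_like(age, 200)
true_p = 0.45 * np.exp(-((age - 26) / 8.0) ** 2)   # invented prevalence curve
pos = np.random.default_rng(0).binomial(n, true_p)

# Binomial (logistic) GLM of prevalence on age and age^2.
X = sm.add_constant(np.column_stack([age, age**2]))
fit = sm.GLM(np.column_stack([pos, n - pos]), X,
             family=sm.families.Binomial()).fit()

prev_hat = fit.predict(X)   # fitted prevalence by age
print(fit.params)           # intercept, age, age^2 coefficients
```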
|
800 |
Statistical methods for reconstruction of entry, descent, and landing performance with application to vehicle design
Dutta, Soumyo, 13 January 2014
There is significant uncertainty in our knowledge of the Martian atmosphere and the aerodynamics of the Mars entry, descent, and landing (EDL) systems. These uncertainties result in conservatism in the design of the EDL vehicles leading to higher system masses and a broad range of performance predictions. Data from flight instrumentation onboard Mars EDL systems can be used to quantify these uncertainties, but the existing dataset is sparse and many parameters of interest have not been previously observable. Many past EDL reconstructions neither utilize statistical information about the uncertainty of the measured data nor quantify the uncertainty of the estimated parameters. Statistical estimation methods can blend together disparate data types to improve the reconstruction of parameters of interest for the vehicle. For example, integrating data obtained from aeroshell-mounted pressure transducers, inertial measurement unit, and radar altimeter can improve the estimates of the trajectory, atmospheric profile, and aerodynamic coefficients, while also quantifying the uncertainty in these estimates. These same statistical methods can be leveraged to improve current engineering models in order to reduce conservatism in future EDL vehicle design. The work in this thesis presents a comprehensive methodology for parameter reconstruction and uncertainty quantification while blending dissimilar Mars EDL datasets. Statistical estimation methods applied include the Extended Kalman Filter, Unscented Kalman Filter, and Adaptive Filter. The estimators are applied in a manner in which the observability of the parameters of interest is maximized while using the sparse, disparate EDL dataset. The methodology is validated with simulated data and then applied to estimate the EDL performance of the 2012 Mars Science Laboratory. The reconstruction methodology is also utilized as a tool for improving vehicle design and reducing design conservatism. A novel method of optimizing the design of future EDL atmospheric data systems is presented by leveraging the reconstruction methodology. The methodology identifies important design trends and the point of diminishing returns of atmospheric data sensors that are critical in improving the reconstruction performance for future EDL vehicles. The impact of the estimation methodology on aerodynamic and atmospheric engineering models is also studied and suggestions are made for future EDL instrumentation.
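For reference, a single predict/update cycle of the Extended Kalman Filter, the first of the estimators named above, is sketched below in generic form. This is the textbook algorithm, not the thesis's specific trajectory and atmosphere formulation; the dynamics f, measurement model h, and their Jacobians are left as user-supplied functions.

```python
import numpy as np

def ekf_step(x, P, u, z, f, h, F_jac, H_jac, Q, R):
    """One EKF cycle: propagate the state estimate x and covariance P
    through the nonlinear dynamics f, then correct with measurement z
    through the nonlinear measurement model h."""
    # Predict: propagate state and covariance with the linearized dynamics.
    x_pred = f(x, u)
    F = F_jac(x, u)
    P_pred = F @ P @ F.T + Q

    # Update: weigh the innovation by the Kalman gain.
    H = H_jac(x_pred)
    y = z - h(x_pred)                 # innovation
    S = H @ P_pred @ H.T + R          # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```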
|