  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

Observability and Economic aspects of Fault Detection and Diagnosis Using CUSUM based Multivariate Statistics

Bin Shams, Mohamed January 2010 (has links)
This project focuses on the fault observability problem and its impact on plant performance and profitability. The study was conducted along two main directions. First, a technique was developed to detect and diagnose faulty situations that could not be observed by previously reported methods. The technique is demonstrated on a subset of faults typically considered for the Tennessee Eastman Process (TEP), which were found unobservable in all previous studies. The proposed strategy combines the cumulative sum (CUSUM) of the process measurements with Principal Component Analysis (PCA). The CUSUM is used to enhance fault signatures under conditions of a small fault-to-noise ratio, while PCA facilitates the filtering of noise in the presence of highly correlated data. Multivariate indices, namely the T2 and Q statistics based on the cumulative sums of all available measurements, were used for observing these faults. The ARLo.c was proposed as a statistical metric to quantify fault observability. Following fault detection, the problem of fault isolation is treated. It is shown that, for the particular faults considered in the TEP problem, contribution plots are not able to properly isolate the faults under consideration. This motivates the use of the CUSUM-based PCA technique, previously used for detection, to unambiguously diagnose the faults. The diagnosis scheme constructs a family of CUSUM-based PCA models, one per fault, and then tests whether the statistical thresholds of a particular fault model are exceeded, indicating the occurrence or absence of the corresponding fault. Although the CUSUM-based techniques successfully detected abnormal situations and isolated the faults, long time intervals were required for both detection and diagnosis. The potential economic impact of these delays motivates the second main objective of this project.

More specifically, a methodology is developed to quantify the potential economic loss due to unobserved faults when standard statistical monitoring charts are used. Since most chemical and petrochemical plants operate under closed-loop control, the interaction with the control system is also explicitly considered. An optimization problem is formulated to search for the optimal tradeoff between fault observability and closed-loop performance. This problem is solved in the frequency domain using approximate closed-loop transfer function models, and in the time domain using a simulation-based approach. The time-domain optimization is applied to the TEP to solve for the optimal controller tuning parameters that minimize an economic cost of the process.
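The detection scheme described above — cumulative sums of the measurements, a PCA model fitted to the in-control CUSUM data, and T2/Q monitoring statistics — can be sketched as follows. This is a minimal illustration on synthetic data, not the TEP benchmark; the shift magnitude, number of retained components, and in-control window are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration (not TEP data): 10 correlated process variables,
# with a small persistent mean shift in one variable after sample 300.
n, p = 600, 10
mix = rng.normal(size=(p, p))
X = rng.normal(size=(n, p)) @ mix
X[300:, 3] += 0.3 * X[:, 3].std()    # a fault too small for per-sample charts

# Step 1: cumulative sums amplify small but persistent shifts.
Xc = np.cumsum(X - X[:300].mean(axis=0), axis=0)

# Step 2: PCA model fitted on the in-control portion of the CUSUM data.
mu, sd = Xc[:300].mean(0), Xc[:300].std(0)
Z = (Xc - mu) / sd
U, s, Vt = np.linalg.svd(Z[:300], full_matrices=False)
k = 3                                 # retained principal components
P = Vt[:k].T                          # loadings
lam = (s[:k] ** 2) / (300 - 1)        # variance captured by each PC

# Step 3: T2 (within-model) and Q (residual) monitoring statistics.
T = Z @ P
T2 = np.sum(T ** 2 / lam, axis=1)
E = Z - T @ P.T
Q = np.sum(E ** 2, axis=1)

# The fault should eventually push T2 and/or Q far beyond in-control levels.
print(T2[-1] > np.percentile(T2[:300], 99) or Q[-1] > np.percentile(Q[:300], 99))
```

The drift in the cumulative sum grows linearly with time while the in-control CUSUM wanders only on the order of the square root of time, which is why the shift becomes observable in the T2/Q charts even when it is invisible sample by sample.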
33

Model-based Learning: t-Families, Variable Selection, and Parameter Estimation

Andrews, Jeffrey Lambert 27 August 2012 (has links)
The phrase model-based learning describes the use of mixture models in machine learning problems. This thesis focuses on a number of issues surrounding the use of mixture models in statistical learning tasks, including clustering, classification, discriminant analysis, variable selection, and parameter estimation. After motivating the importance of statistical learning via mixture models, five papers are presented. For ease of consumption, the papers are organized into three parts: mixtures of multivariate t-families, variable selection, and parameter estimation. / Funded by the Natural Sciences and Engineering Research Council of Canada through a doctoral postgraduate scholarship.
34

Monitoring of an Antigen Manufacturing Process Using Fluorescence

Zavatti, Vanessa 12 June 2015 (has links)
Bordetella pertussis is one of two Gram-negative bacteria responsible for causing whooping cough in humans, a highly contagious disease that infects the human upper respiratory tract. Whole-cell and acellular vaccines have been developed but due to side-effects resulting from whole-cell vaccines, acellular vaccines are currently preferred to prevent this disease. A second bacterium known to cause whooping cough is Bordetella parapertussis, but since it causes less aggressive symptoms, only B. pertussis is utilized in the manufacture of the vaccine. One acellular vaccine is based on four virulence factors: pertussis toxin (PT), filamentous hemagglutinin (FHA), pertactin (PRN), and fimbriae (FIM). The focus of this thesis was to explore the use of spectrofluorometry for monitoring and forecasting the performance of the upstream and downstream operations in the PRN purification process at Sanofi Pasteur. The upstream fermentation process includes a series of reactors of increasing volume where the microorganism is grown under controlled conditions. The PRN purification process involves a series of sequential steps for separating this protein from other proteins for later use in the vaccine. The PRN is precipitated in three steps with ammonium sulphate at three different concentrations. The pellet is collected by centrifugation and dissolved in a buffer solution followed by chromatographic separation. The run-through is then ultra-filtered and diafiltered in two separate steps. The resulting concentrate is dissolved in water and subjected to another chromatographic step and diafiltration. The final filtration of PRN involves a pre-filtration and sterile filtration. Finally, the samples are collected for quality control. The objective of this work was to monitor the process at different steps of the upstream and downstream purification process by multi-wavelength fluorescence spectroscopy in combination with multi-variate statistical methods. 
From the spectra, it was possible to identify fluorescent compounds, such as amino acids and enzyme cofactors, without performing an additional pre-treatment or purification step. The identification of conformational changes in proteins and the formation of complexes, such as the NAD(P)H-enzyme complex, was also possible based on shifts in the emission peaks of the compounds identified. These results demonstrated the feasibility of using this tool for qualitative evaluation of the process. Multivariate methods, such as PCA and PLS, were used to extract relevant information from, and compress, the fluorescence data acquired. PCA was effective for correlating variability in the yield of pertactin with a particular fluorescence fingerprint. As a result of the study, it was concluded that a possible source of the observed variability in productivity might be a metabolic shift during the fermentation steps that leads to the accumulation of NAD(P)H (or NAD(P)H-enzyme complex), probably due to oxygen transfer limitations. This conclusion was reached after investigating changes in the dissolved oxygen, aeration, agitation, supplementation time, and key metabolite (lactate, glucose, glutamine) profiles. The correlation of these parameters with low productivity was not straightforward; however, some consistencies were observed, for example, high levels of glutamine in batches with low productivity. This fact might be related to the start of the supplementation time, which may in turn be related to the dissolved oxygen, since the supplement is added manually when an increase in dissolved oxygen is detected. It is believed that this factor is related to the low production of protein products such as pertactin. By means of PLS, it was possible to build regression models that allow the final concentration of pertactin to be predicted from the fluorescence measurements. 
The models were built using the new variables obtained from the data compression performed with PCA and the final pertactin concentration measured by a Kjeldahl test. With this method, two regressions were constructed: (i) between the NAD(P)H-enzyme complex spectra from the fermenters and the pertactin concentration, and (ii) between the pertactin fluorescence spectra from the last step of purification and the pertactin concentration. A third model was built using the protein content, the NAD(P)H-enzyme complex content in the fermenters, and the pertactin concentration. Attempts were made to identify the enzyme that may bind to NAD(P)H, assumed to be a dehydrogenase. Substrates for different enzymes were employed with the objective of measuring changes in the fluorescence of the characteristic peak for this binding (Ex/Em = 280/460 nm). Major changes were detected after addition of the substrates oxaloacetate, ubiquinone, and succinate. Since changes were detected with more than one substrate, it was not possible to unequivocally identify the enzyme; however, the results provide some insight into what may be happening at the metabolic level. The work carried out in this thesis involved both the analysis of samples provided or collected by the industrial sponsor and the analysis of samples prepared at the University of Waterloo for measurement, interpretation, and calibration. The proposed fluorescence-based method was found suitable for assessing protein quantity as well as for providing an indication of possible protein aggregation and conformational changes. Future work will be required to identify the exact source of variability in the production of pertactin, by monitoring the evolution of fermentation, NAD(P)H and ATP measurements, and oxidation-reduction potential assays.
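The PLS regression step described above — predicting an analyte concentration from fluorescence spectra — can be sketched with a small PLS1 (NIPALS) implementation. The spectra below are synthetic stand-ins with a single hypothetical "pertactin" peak, not Sanofi Pasteur data; the peak position, noise level, and number of components are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for fluorescence spectra: 30 batches x 50 wavelengths,
# where the analyte contributes one known spectral peak scaled by concentration.
wl = np.linspace(0.0, 1.0, 50)
peak = np.exp(-((wl - 0.6) ** 2) / 0.005)
conc = rng.uniform(1.0, 5.0, size=30)
spectra = np.outer(conc, peak) + 0.05 * rng.normal(size=(30, 50))

def pls1_fit(X, y, ncomp):
    """PLS1 via NIPALS: extract components maximizing X-y covariance,
    then return the regression vector B for centered data."""
    X = X - X.mean(0)
    y = y - y.mean()
    W, P, q = [], [], []
    for _ in range(ncomp):
        w = X.T @ y                        # weight: covariance direction
        w /= np.linalg.norm(w)
        t = X @ w                          # scores
        p = X.T @ t / (t @ t)              # X loadings
        q.append(y @ t / (t @ t))          # y loading
        X = X - np.outer(t, p)             # deflate X
        y = y - q[-1] * t                  # deflate y
        W.append(w)
        P.append(p)
    W, P = np.array(W).T, np.array(P).T
    return W @ np.linalg.solve(P.T @ W, np.array(q))

Xm, ym = spectra.mean(0), conc.mean()
B = pls1_fit(spectra, conc, ncomp=2)
pred = (spectra - Xm) @ B + ym
print(np.corrcoef(pred, conc)[0, 1])       # correlation of predicted vs. true
```

With highly collinear spectral channels, PLS compresses the wavelengths into a few latent variables before regressing, which is exactly why it suits fluorescence data better than ordinary least squares.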
35

On the development of control systems technology for fermentation processes

Loftus, John January 2017 (has links)
Fermentation processes play an integral role in the manufacture of pharmaceutical products. The Quality by Design initiative, combined with Process Analytical Technologies, aims to facilitate the consistent production of high-quality products in the most efficient and economical way. The ability to estimate and control product quality from these processes is essential in achieving this aim. Large historical datasets are commonplace in the pharmaceutical industry, and multivariate methods based on PCA and PLS have been successfully used in a wide range of applications to extract useful information from such datasets. This thesis has focused on the development and application of novel multivariate methods for the estimation and control of product quality in a number of processes. The document is divided into four main parts. First, the related literature and underlying mathematical techniques are summarised. Following this, the three main technical areas of work are presented. The first of these relates to the development of a novel method for estimating the quality of products from a proprietary process using PCA. The ability to estimate product quality is useful for identifying production steps that are potentially problematic, and it also increases process efficiency by ensuring that any defective products are detected before they undergo further processing. The proposed method is simple and robust and has been applied to two separate case studies, the results of which demonstrate the efficacy of the technique. The second area of work concentrates on the development of a novel method, based on PCA and associated statistics, for identifying the operational phases of batch fermentation processes. Knowledge of the operational phases of a process can be beneficial from a monitoring and control perspective and allows a process to be divided into phases that can each be approximated by a linear model. 
The devised methodology is applied to two separate fermentation processes, and the results show the capability of the proposed method. The third area of work evaluates the performance of two multivariate algorithms, PLS and EPLS, in controlling the end-point product yield of fermentation processes. Control of end-point product quality is of crucial importance in many manufacturing industries, such as the pharmaceutical industry. Developing a controller based on historical and identification process data is attractive because of the simplicity of modelling and the increasing availability of process data. The methodology is applied to two case studies and its performance evaluated. From both a prediction and a control perspective, EPLS is seen to outperform PLS, which is important when modelling data are limited.
36

Biogeochemistry and geochemical paleoceanography of the South Pacific Gyre

Dunlea, Ann G. 04 December 2016 (has links)
Pelagic clays cover nearly one half of the ocean floor, but are rarely used for paleoceanographic research because of their extremely slow sedimentation rates, post-depositional alteration(s), and the lack of biogenic material available to provide ages. My dissertation develops and applies approaches to study pelagic clays by targeting the largest marine sediment province in the world: the South Pacific Gyre (SPG). I present an unprecedented spatially and temporally extensive paleoceanographic history of the SPG and discuss authigenic processes in pelagic clays that are linked to changes in global seawater composition through the Cenozoic. My research was based on an extensive inorganic geochemical dataset I developed from samples gathered during Integrated Ocean Drilling Program Expedition 329. I applied multivariate statistical techniques (e.g., Q-mode factor analysis and constrained least squares multiple linear regression (CLS)) to the dataset in order to (a) identify the existence of six end-members in pelagic clay (namely, eolian dust, Fe/Mn-oxyhydroxides, apatite, excess Si, and two types of volcanic ash), (b) quantify their abundances, (c) determine their mass accumulation rates, and (d) infer major features in the paleoceanographic evolution of the SPG. Key parts of my research also developed improved MATLAB codes to facilitate and speed the search for best fitting end-member combinations in CLS modeling. Additionally, I expanded the natural gamma radiation instrumental capabilities on the D/V JOIDES Resolution to quantify concentrations of uranium, thorium, and potassium. I dated the pelagic clay at four of the IODP sites with a cobalt-based age model that I developed, and documented that the seawater behavior of cobalt determines the extent to which this method can be applied. 
Collectively, the results track the spatial extent of dust deposition in the SPG during the aridification of Australia, dispersed ash accumulation from episodes of Southern Hemisphere volcanism, and other features of Earth’s evolution during the Cenozoic. I further quantified two geochemically distinct types of authigenic ash alterations within the pelagic clay, indicating that altered ashes may be a significant and variable sink of magnesium in seawater over geologic timescales.
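The constrained least squares (CLS) unmixing step described above — expressing each bulk sediment sample as a non-negative mixture of end-member compositions — can be sketched as follows. The end-member compositions and element list below are entirely hypothetical, not values from the Expedition 329 dataset; only the non-negativity-constrained fitting technique is illustrated.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical end-member compositions (columns: four element concentrations),
# standing in for dust, Fe/Mn-oxyhydroxide, and ash end-members.
end_members = np.array([
    [8.0,  5.0, 30.0, 0.10],   # "dust"
    [0.5, 35.0,  2.0, 0.50],   # "Fe/Mn-oxyhydroxide"
    [7.0,  3.0, 28.0, 0.05],   # "ash"
])

# A bulk sediment sample constructed as a 60/30/10 mixture of the end-members.
true_frac = np.array([0.6, 0.3, 0.1])
sample = true_frac @ end_members

# CLS with a non-negativity constraint: solve sample ~= f @ end_members, f >= 0.
frac, resid = nnls(end_members.T, sample)
print(frac / frac.sum())       # recovered end-member fractions
```

In practice the candidate end-member set is varied and the combination with the best (lowest-residual, geologically plausible) fit is retained, which is the search the improved MATLAB codes mentioned above were built to accelerate.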
37

Strategies for applying multivariate statistical analysis in new product development

Silveira, Manoel Mendonça January 2010 (has links)
In the development of new products and services, understanding which demands the market requires leads to designs that offer better solutions to customers. In pursuit of this understanding, multivariate statistical techniques are used to help identify and value the requirements derived from these demands. In this context, the objective of this work is to present an approach for applying multivariate statistical techniques in the new product development (NPD) process. These techniques can assist companies in requirements management by helping them to: (i) collect and organize product requirements; (ii) identify the requirements considered most relevant; (iii) identify market segments based on the characteristics valued by the target audience; and (iv) verify associations between product requirements and certain characteristics of the target audience. 
This work presents an application example combining statistical techniques such as CHAID (Chi-squared Automatic Interaction Detector), factor analysis, conjoint analysis of attributes, and correspondence analysis. The use of these techniques is demonstrated in the development of a new household cleaning product designed with sustainability characteristics.
38

Application of environmental and hydrochemical analysis to characterize flow dynamics in the Sakumo Wetland, Ghana

Laar, Cynthia January 2018 (has links)
Philosophiae Doctor - PhD (Earth Science) / This research focused on understanding the current hydrogeology of the Sakumo wetland by developing a conceptual flow model and simulating the groundwater flow system. The purpose of the model is to aid understanding of the groundwater flow system and to quantify the water fluxes contributing to wetland water storage. The research adopted quantitative, qualitative, and mixed analyses to characterize water flow in the basin. This involved numerical modelling techniques to determine the zones of groundwater discharge to the wetland and the zones where wetland water is released as groundwater recharge. Field investigations were carried out to estimate the hydraulic parameters and to sample rainwater, wetland water, and groundwater. The Sakumo wetland aquifer is situated in a Quaternary unit consisting of sandy clay and weathered quartzite. The average annual precipitation in the study area from 1970 to 2016 was estimated at 760 mm/yr. The groundwater recharge rate was estimated as 5% of the mean annual rainfall, which provided an input to the numerical groundwater flow model. Evaporation from the wetland and evapotranspiration from the basin, estimated using the Hargreaves and Samani method, were 1341 mm/yr and 546 mm/yr, respectively. The hydrogeologic conceptual model was developed from the geology, borehole lithology, and groundwater and wetland water levels.
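The Hargreaves and Samani estimate mentioned above can be illustrated with a short sketch. The standard form of the equation is ET0 = 0.0023 · Ra · (Tmean + 17.8) · sqrt(Tmax − Tmin), with Ra expressed as equivalent evaporation in mm/day; the radiation and temperature values below are hypothetical round numbers, not site data from the Sakumo basin.

```python
# Hargreaves-Samani reference evapotranspiration (ET0): a temperature-based
# method needing only extraterrestrial radiation and daily temperatures.
def hargreaves_samani(ra_mm_day, t_mean, t_max, t_min):
    """ET0 in mm/day; ra_mm_day is extraterrestrial radiation expressed
    as equivalent evaporation (mm/day)."""
    return 0.0023 * ra_mm_day * (t_mean + 17.8) * (t_max - t_min) ** 0.5

# Hypothetical warm-climate values with a small diurnal temperature range.
et0_daily = hargreaves_samani(15.0, 27.0, 31.0, 23.0)
et0_annual = et0_daily * 365
print(round(et0_daily, 2), round(et0_annual))
```

The appeal of the method in data-sparse basins is visible here: only temperature records and tabulated extraterrestrial radiation are needed, with no humidity, wind, or sunshine data.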
40

Agricultural production performance of municipalities belonging to the rural development office of Andradina/SP

Carvalho, Jaqueline Bonfim de [UNESP] 27 September 2016 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) / Analysis at the level of Rural Development Offices (EDR) allows better allocation of available resources and an increase in the technical efficiency of agricultural production, from job creation to the production of raw materials for industry and the market. The objective of this work was to analyze the agricultural efficiency of thirteen municipalities (DMUs, Decision Making Units) belonging to the EDR of Andradina, State of São Paulo. Exploratory data analysis was performed initially with multivariate statistics, using non-hierarchical cluster analysis and principal component analysis. Data Envelopment Analysis (DEA) was then conducted with a variable-returns-to-scale model (BCC) and output orientation. Three inputs (land, labor, capital) and one output (production) were used, totaling four variables. The multivariate analysis yielded two groups, in which the variables land, labor, and production contributed to the separation of the main group, with the municipality of Valparaíso standing out. 
The DEA analysis shows that most DMUs operated inefficiently, with only 30.77% of the sample efficient under the chosen model (BCC), indicating different technological levels among the agricultural units analyzed.
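The output-oriented BCC model used above can be sketched as a linear program per DMU: maximize the output expansion factor phi subject to a convex combination of the observed DMUs using no more input and producing at least phi times the DMU's output. The data below are toy values (5 DMUs, 2 inputs, 1 output), not the thirteen municipalities.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data standing in for the municipalities: inputs (land, labor), one output.
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [5.0, 3.0], [10.0, 1.0]])
Y = np.array([[5.0], [7.0], [6.0], [3.0], [7.5]])
n, m = X.shape
s = Y.shape[1]

def bcc_output_efficiency(o):
    """Output-oriented BCC for DMU o: max phi s.t. X'lam <= X[o],
    Y'lam >= phi*Y[o], sum(lam) = 1, lam >= 0.  Vars: [phi, lam_1..lam_n]."""
    c = np.zeros(1 + n)
    c[0] = -1.0                                            # maximize phi
    A_ub, b_ub = [], []
    for i in range(m):                                     # input constraints
        A_ub.append(np.r_[0.0, X[:, i]]); b_ub.append(X[o, i])
    for r in range(s):                                     # output constraints
        A_ub.append(np.r_[Y[o, r], -Y[:, r]]); b_ub.append(0.0)
    A_eq = [np.r_[0.0, np.ones(n)]]                        # convexity (VRS)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]        # phi >= 1; phi == 1 means BCC-efficient

phi = [bcc_output_efficiency(o) for o in range(n)]
eff_share = float(np.mean(np.isclose(phi, 1.0, atol=1e-6)))
print([round(p, 3) for p in phi], eff_share)
```

A DMU with phi greater than 1 could proportionally expand its output by that factor while staying within the production frontier spanned by its peers, which is how the inefficient municipalities in the study are identified.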
