61

Hybrid and data-driven modeling and control approaches to batch and continuous processes

Ghosh, Debanjan January 2022 (has links)
The focus of this thesis is on building models by utilizing process information: from data, from knowledge of the physics, or both. The closer the model approximates reality, the better the expected performance in forecasting, soft sensing, process monitoring, optimization and advanced process control. In batch and continuous manufacturing, quality models help ensure tightly controlled product quality, safe and reliable operating conditions, and reduced production and operating costs. To this end, a parallel grey-box model combining a mechanistic model with a subspace identification model was first built for a batch poly(methyl methacrylate) (PMMA) polymerisation process. The efficacy of this parallel hybrid model was then illustrated in a control problem aimed at reducing the volume of fines. Real-time implementation often demands that the model be tractable and simple, so the parallel hybrid model was next adapted to a linear representation and used for control computations. While the parallel hybrid modelling strategy offers clear advantages in many applications, there are other ways of combining fundamental process knowledge with historical data. In one such approach, a way of adding mechanistic knowledge to improve the estimation ability of PLS models was proposed: the PLS predictor matrix was augmented with additional trajectory information drawn strategically from a mechanistic model. This augmented model was used as a soft sensor to estimate batch-end quality for a seeded batch crystallizer. In collaborative work with an industrial partner on estimating important variables of a hydroprocessing unit, an operational-data-based input-output model was chosen in the absence of mechanistic knowledge, and the usefulness of linear dynamic modeling tools for such applications was demonstrated. / Thesis / Doctor of Philosophy (PhD)
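Because the central idea above is a parallel grey-box structure, where a data-driven model learns only the mismatch that a mechanistic model leaves behind, a minimal sketch may help fix the concept. Everything here is illustrative: `mechanistic_model` is a placeholder rather than the thesis's PMMA polymerisation model, and a ridge regressor stands in for the subspace identification model actually used.

```python
# Minimal sketch of a parallel hybrid (grey-box) model: a simplified
# mechanistic model carries the known physics, and a data-driven model
# learns only the residual it leaves behind.  mechanistic_model() is a
# placeholder, not the PMMA polymerisation model of the thesis.
import numpy as np
from sklearn.linear_model import Ridge

def mechanistic_model(U):
    # Placeholder first-principles prediction from batch inputs U;
    # replace with the real mass/energy-balance model.
    return 0.8 * U[:, 0] - 0.1 * U[:, 1]

class ParallelHybridModel:
    def __init__(self):
        self.residual_model = Ridge(alpha=1.0)

    def fit(self, U, y):
        # The data-driven part is fitted on what the physics cannot explain.
        self.residual_model.fit(U, y - mechanistic_model(U))
        return self

    def predict(self, U):
        return mechanistic_model(U) + self.residual_model.predict(U)

# Usage on synthetic batch data
rng = np.random.default_rng(0)
U = rng.normal(size=(200, 2))                     # batch operating conditions
y = mechanistic_model(U) + 0.3 * np.sin(U[:, 0])  # plant with unmodelled dynamics
hybrid = ParallelHybridModel().fit(U, y)
print(hybrid.predict(U[:5]))
```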
62

Flow field analysis of batch and continuous mixing equipment

Yang, Haur-Horng January 1993 (has links)
No description available.
63

Monitoreo estadístico de procesos batch : aplicaciones a reactores de polimerización / Statistical monitoring of batch processes: applications to polymerization reactors

Álvarez Medina, Carlos Rodrigo 27 March 2009 (has links)
This thesis involves the development and implementation of new methodologies for multivariate statistical process control (MSPC), their application to monitoring the production of batch emulsion polymerization processes, and a performance comparison among different techniques devoted to the statistical monitoring of batch processes. Polymerization processes have distinctive features that make them attractive case studies for evaluating MSPC techniques. Because of the inherent complexity of these systems and the large number of variation sources that affect them, data-driven monitoring and control techniques are an attractive and feasible alternative. MSPC procedures comprise three essential stages: detection, identification and diagnosis. The process state is evaluated continuously to determine whether operation is normal. If an abnormal event is detected, the variables that signal this condition must be identified and the root cause of the anomaly diagnosed. The work in this thesis addresses the fault detection and identification stages. A new methodology, called OSS (Original Space Strategy), is developed to decompose the Hotelling statistic in the space of the measurements, allowing the influence of each variable on the statistic value to be evaluated. The main advantages of the proposed identification strategy are: process monitoring is accomplished using a single statistic; it significantly reduces the ambiguities in fault identification inherent to other existing techniques; it avoids the possible loss of information that may arise when data are projected onto a latent-variable space of inappropriate dimension; and it provides a clear understanding of the physical meaning of negative contributions to the statistic and determines a limit value for them. The OSS methodology is incorporated as the identification tool in the new batch-process monitoring procedure proposed here. The most widely used methods for batch process monitoring, based on projection techniques such as Principal Component Analysis (PCA) and Independent Component Analysis (ICA), are analyzed in detail, and their performance is compared with that of the proposed strategy through application to a polymerization reactor. The data set was obtained by simulation using a rigorous model of an emulsion polymerization reactor for the production of methyl methacrylate. The comparison is based on the results of applying the procedures to a set of known faults, in terms of the capacity of each technique to detect the fault and the accuracy with which the methodology indicates the suspect variables during the identification stage. The results for the proposed single-statistic strategy on this complex case study show very good performance in detection speed and identification accuracy when compared with methods based on PCA or ICA.
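The two ingredients highlighted above, monitoring with a single Hotelling statistic and attributing its value to individual variables in the original measurement space, can be illustrated with a short sketch. The split used below, c_i = (x − μ)_i [S⁻¹(x − μ)]_i, sums exactly to T² and can produce negative terms; it is a standard textbook decomposition, not necessarily the OSS formulation developed in the thesis.

```python
# Hedged sketch: a single Hotelling T^2 statistic with a per-variable
# contribution decomposition in the original measurement space.
import numpy as np

def fit_reference(X):
    """X: (n_batches, n_vars) array of normal-operation data."""
    mu = X.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(X, rowvar=False))
    return mu, S_inv

def t2_and_contributions(x, mu, S_inv):
    d = x - mu
    weighted = S_inv @ d
    contributions = d * weighted      # one (possibly negative) term per variable
    return float(d @ weighted), contributions

# Usage with synthetic data and a fault injected on variable 2
rng = np.random.default_rng(1)
X_normal = rng.normal(size=(500, 4))
mu, S_inv = fit_reference(X_normal)
x_fault = rng.normal(size=4)
x_fault[2] += 4.0
t2, contrib = t2_and_contributions(x_fault, mu, S_inv)
print(f"T2 = {t2:.2f}", "contributions:", np.round(contrib, 2))
```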
64

LRD and SRD Traffics: Review of Results and Open Issues for the Batch Renewal Process

Kouvatsos, Demetres D., Fretwell, Rod J. January 2002 (has links)
The batch renewal process is the least-biased choice of process given only the measures of count correlation and interval correlation at all lags. This paper reviews the batch renewal process, both for LRD (long-range-dependent) traffic and for SRD (short-range-dependent) traffic in the discrete space-discrete time domain, and in the wider context of general traffic in that domain. It shows some applications of the batch renewal process in simple queues and in queueing network models. The paper concludes with open research problems and issues arising from the discussion.
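For readers unfamiliar with the process, a toy discrete-time simulation may help: batches of arrivals occur at renewal epochs, with i.i.d. inter-batch intervals and i.i.d. batch sizes that are independent of them. The geometric distributions below are arbitrary placeholders; in practice the distributions would be chosen to reproduce the measured count and interval correlations, which this sketch does not attempt.

```python
# Toy discrete-time batch renewal arrival process: i.i.d. gaps between
# renewal epochs and i.i.d. batch sizes, independent of each other.
import numpy as np

def simulate_batch_renewal(n_slots, rng, p_gap=0.3, p_batch=0.5):
    counts = np.zeros(n_slots, dtype=int)
    t = 0
    while True:
        t += rng.geometric(p_gap)            # i.i.d. inter-batch interval (slots)
        if t >= n_slots:
            break
        counts[t] += rng.geometric(p_batch)  # i.i.d. batch size at the epoch
    return counts

rng = np.random.default_rng(42)
arrivals = simulate_batch_renewal(10_000, rng)
print("mean arrivals per slot:", arrivals.mean())
print("lag-1 count autocorrelation:",
      np.corrcoef(arrivals[:-1], arrivals[1:])[0, 1])
```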
65

Characterization of the Recombinant Human Factor VIII Expressed in the Milk of Transgenic Swine

Hodges, William Anderson 28 February 2001 (has links)
Factor VIII is a protein with therapeutic applications in the treatment of Hemophilia A. Its deficiency, either qualitative or quantitative, results in Hemophilia A, a disorder affecting approximately 1 in 10,000 males. FVIII replacement therapy currently uses FVIII derived from plasma or cell culture, at a cost in excess of $150,000 per patient per year, so more economical alternative sources are attractive. The present work focuses on the characterization of recombinant FVIII (rFVIII) made in the milk of transgenic pigs. Two-dimensional Western analysis of rFVIII obtained from pig whey showed a range of FVIII species having different isoelectric points (pI), consistent with diverse glycosylation patterns. The pIs of these diverse FVIII populations were accurately predicted using theoretical calculations based upon the primary protein structure together with variable biantennary glycosylation patterns carrying 0, 1, or 2 sialic acid groups. Kinetic limitations in the adsorption of rFVIII to anion-exchange media, due to the nature of the complex milk environment, were observed. rFVIII was purified quantitatively using batch equilibration of whey with DEAE Sepharose. This material showed proteolytic processing very similar to that of FVIII obtained from human plasma. Based upon these results, it was postulated that dissociation of the light (A3C1C2) and heavy (A1A2B) chains due to a lack of vWF may be responsible for the low FVIII activity. / Master of Science
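The pI prediction mentioned above can be illustrated with a rough, self-contained calculation: sum the Henderson-Hasselbalch charges of the ionisable groups and bisect for the pH of zero net charge, adding one extra acidic group per sialic acid. The pKa values are generic textbook numbers, the toy sequence is not FVIII, and this sketches the idea only, not the calculation actually used in the thesis.

```python
# Rough theoretical-pI sketch: net charge from textbook pKa values, with
# n_sialic extra carboxyl-like groups (pKa ~2.6) standing in for sialic acids.
PKA_ACIDIC = {"D": 3.65, "E": 4.25, "C": 8.3, "Y": 10.07, "cterm": 3.1}
PKA_BASIC = {"K": 10.53, "R": 12.48, "H": 6.0, "nterm": 8.0}
PKA_SIALIC = 2.6

def net_charge(seq, ph, n_sialic=0):
    counts = {aa: seq.count(aa) for aa in "DECYKRH"}
    charge = 0.0
    for aa, pka in PKA_ACIDIC.items():
        n = 1 if aa == "cterm" else counts.get(aa, 0)
        charge -= n / (1.0 + 10 ** (pka - ph))       # deprotonated fraction
    charge -= n_sialic / (1.0 + 10 ** (PKA_SIALIC - ph))
    for aa, pka in PKA_BASIC.items():
        n = 1 if aa == "nterm" else counts.get(aa, 0)
        charge += n / (1.0 + 10 ** (ph - pka))       # protonated fraction
    return charge

def isoelectric_point(seq, n_sialic=0, lo=0.0, hi=14.0):
    for _ in range(60):                              # bisection on net charge
        mid = (lo + hi) / 2.0
        if net_charge(seq, mid, n_sialic) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

peptide = "ACDEFGHIKLMNPQRSTVWY"                     # toy sequence, not FVIII
for n in (0, 1, 2):
    print(n, "sialic acid(s): pI =", round(isoelectric_point(peptide, n), 2))
```

Each added sialic acid lowers the computed pI, which is the qualitative trend consistent with the spread of species seen in the two-dimensional analysis above.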
66

Bayesian optimization with empirical constraints

Azimi, Javad 05 September 2012 (has links)
Bayesian Optimization (BO) methods are often used to optimize an unknown function f(·) that is costly to evaluate. They typically work in an iterative manner. In each iteration, given a set of observation points, BO algorithms select k ≥ 1 points to be evaluated. The results of those points are then added to the set of observations and the procedure is repeated until a stopping criterion is met. The goal is to optimize the function f(·) with a small number of experiment evaluations. While this problem has been extensively studied, most existing approaches ignore some real-world constraints frequently encountered in practical applications. In this thesis, we extend the BO framework in a number of important directions to incorporate some of these constraints. First, we introduce a constrained BO framework where, instead of selecting a precise point at each iteration, we request a constrained experiment that is characterized by a hyper-rectangle in the input space. We introduce efficient sequential and non-sequential algorithms to select a set of constrained experiments that best optimize f(·) within a given budget. Second, we introduce one of the first attempts at batch BO, where instead of selecting one experiment at each iteration, a set of k > 1 experiments is selected. This can significantly speed up the overall running time of BO. Third, we introduce scheduling algorithms for the BO framework when: 1) it is possible to run concurrent experiments; 2) the durations of experiments are stochastic, but with a known distribution; and 3) there is a limited number of experiments to run in a fixed amount of time. We propose both online and offline scheduling algorithms that effectively handle these constraints. Finally, we introduce a hybrid BO approach which switches between the sequential and batch modes. The proposed hybrid approach provides a substantial speedup over sequential policies without significant performance loss. / Graduation date: 2013
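A compact sketch of the batch-BO idea, picking k > 1 points per iteration from a Gaussian-process surrogate, is given below. It uses expected improvement with a greedy "constant liar" heuristic and is only a generic illustration; it is not the constrained-experiment, scheduling or hybrid policies proposed in the thesis.

```python
# Generic batch Bayesian optimization sketch (minimisation): GP surrogate,
# expected improvement, and a greedy "constant liar" loop to pick k points.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expected_improvement(gp, X_cand, y_best):
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def select_batch(X_obs, y_obs, X_cand, k=3):
    X_obs, y_obs = X_obs.copy(), y_obs.copy()
    chosen = []
    for _ in range(k):
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                                      normalize_y=True).fit(X_obs, y_obs)
        ei = expected_improvement(gp, X_cand, y_obs.min())
        i = int(np.argmax(ei))
        chosen.append(X_cand[i])
        # "Constant liar": pretend the point returned the current best value
        # so the next pick spreads out instead of clustering.
        X_obs = np.vstack([X_obs, X_cand[i]])
        y_obs = np.append(y_obs, y_obs.min())
    return np.array(chosen)

# Usage on a cheap stand-in for an expensive experiment
f = lambda x: np.sin(3 * x[:, 0]) + 0.1 * x[:, 0]
rng = np.random.default_rng(0)
X_obs = rng.uniform(0, 2, size=(5, 1))
y_obs = f(X_obs)
X_cand = np.linspace(0, 2, 200).reshape(-1, 1)
print(select_batch(X_obs, y_obs, X_cand, k=3).ravel())
```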
67

Run-to-run modelling and control of batch processes

Duran Villalobos, Carlos Alberto January 2016 (has links)
The University of Manchester. Carlos Alberto Duran Villalobos. Doctor of Philosophy in the Faculty of Engineering and Physical Sciences. December 2015. This thesis presents an innovative batch-to-batch optimisation technique that was able to improve the productivity of two benchmark fed-batch fermentation simulators: Saccharomyces cerevisiae and Penicillin production. In developing the proposed technique, several important challenges needed to be addressed. For example, the technique relied on a linear Multiway Partial Least Squares (MPLS) model that had to adapt from one operating region to another as productivity increased, in order to estimate the end-point quality of each batch accurately. The proposed optimisation technique utilises a Quadratic Programming (QP) formulation to calculate the Manipulated Variable Trajectory (MVT) from one batch to the next. The main advantage of the proposed optimisation technique over other published approaches was the increase in yield and the faster convergence to an optimal MVT. Validity constraints were also included in the batch-to-batch optimisation to restrict the QP calculations to the region described only by useful predictions of the MPLS model. The results of experiments on the two simulators showed that the validity constraints slowed the rate of convergence of the optimisation technique and in some cases resulted in a slight reduction in final yield; however, they did improve the consistency of the batch optimisation. Another important contribution of this thesis was a series of experiments combining a variety of smoothing techniques used in MPLS modelling with the proposed batch-to-batch optimisation technique. From the results of these experiments, it was clear that the MPLS model prediction accuracy did not improve significantly with these smoothing techniques; however, the batch-to-batch optimisation technique did show improvements when filtering was implemented.
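The batch-to-batch loop described above, an MPLS end-quality model plus a constrained optimisation that proposes the next manipulated-variable trajectory, can be sketched as follows. The toy plant, the box bounds and the use of SLSQP in place of the thesis's QP formulation with validity constraints are all simplifying assumptions made for illustration.

```python
# Hedged batch-to-batch optimisation sketch: a PLS model maps an unfolded
# manipulated-variable trajectory (MVT) to end-point yield, and a bounded
# optimisation proposes the MVT for the next batch.
import numpy as np
from scipy.optimize import minimize
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_batches, n_samples = 30, 20                  # historical batches, MVT length

def true_yield(mvt):                           # stand-in "plant", not a fermenter model
    target = np.linspace(0.2, 0.8, n_samples)
    return 10.0 - np.sum((mvt - target) ** 2)

X = rng.uniform(0, 1, size=(n_batches, n_samples))   # past MVTs, batch-wise unfolded
y = np.array([true_yield(x) for x in X])

pls = PLSRegression(n_components=3).fit(X, y)         # end-point quality model

def neg_predicted_yield(mvt):
    return -float(pls.predict(mvt.reshape(1, -1)).ravel()[0])

x0 = X[np.argmax(y)]                           # start from the best batch so far
res = minimize(neg_predicted_yield, x0, method="SLSQP",
               bounds=[(0.0, 1.0)] * n_samples)
next_mvt = res.x                               # trajectory proposed for the next batch
print("predicted yield of proposed MVT:", -res.fun)
```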
68

Diferentes parâmetros de produção de goma xantana pela fermentação de Xanthomonas campestris pv campestris / Different parameters of xanthan gum production by fermentation of Xanthomonas campestris pv campestris

Oliveira, Kassandra Sussi Mustafé [UNESP] 01 October 2009 (has links) (PDF)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) / Xanthan gum is a biopolymer produced by Xanthomonas campestris that is widely used as a thickening agent. The biopolymer can be synthesised under a variety of conditions, but the quality of the gum produced also changes. Some studies suggest that fed-batch xanthan gum production can give better performance in gum concentration, yield and productivity than simple batch operation. This work evaluated the influence of different sucrose-feeding strategies on the quantity and quality of xanthan gum produced, as well as on the productivity and yield of the process. The best strategies were tested in a 7.5 L bioreactor. Different biopolymer separation methods (with addition of salts to the broth or to the ethanol during gum extraction) were also evaluated, along with the influence of surfactants and salts on the viscosity of gum in solution. In an orbital-shaking incubator, the quantity and quality of xanthan gum produced at 25 °C were the most attractive. In the fermenter, xanthan production at 30 °C in a medium containing 2% sucrose could be shortened to 40 hours without affecting the concentration or viscosity of the gum obtained. The viscosity of the biopolymer produced at 25 °C in a medium containing 4% sucrose was higher (365.9 cP) than that of the biopolymer produced at 30 °C with 2% sucrose. The productivity and gum concentration obtained (0.34 g xanthan L-1 h-1 and 24 g xanthan L-1) are comparable to values reported in the literature. The use of salts in gum extraction reduces the amount of solvent required and gives a better-quality gum; NaCl at 0.01% and CaCl2 at 0.05% stood out. Heat treatment increases the quantity and viscosity of the biopolymer and eliminates the cells. The addition of 0.01% SDS or 0.001% Tween 80 to a xanthan gum solution increases its viscosity.
69

Diferentes parâmetros de produção de goma xantana pela fermentação de Xanthomonas campestris pv campestris / Different parameters of xanthan gum production by fermentation of Xanthomonas campestris pv campestris

Oliveira, Kassandra Sussi Mustafé. January 2009 (has links)
Advisor: Pedro de Oliva Neto / Committee member: Jonas Contiero / Committee member: Ranulfo Monte Alegre / Resumo and abstract: identical to record 68 above. / Master's
70

Avaliação de materiais argilosos da Formação Corumbataí para uso em liners compactados (CCL) / Evaluation of clay materials from the Corumbataí Formation for use in compacted clay liners (CCL)

Amanda Francieli de Almeida 18 December 2015 (has links)
Final waste disposal, carried out so as to minimize water contamination, is generally done in sanitary landfills, which must have a base of compacted clay layers (CCL), also known as liners. These barrier systems perform several functions, notably isolating the waste, reducing infiltration and minimizing the migration of contaminants (through filtering, sorption and other geochemical reactions) toward the groundwater. The objective of this work was to evaluate clay materials of the Corumbataí Formation in order to select those with the best characteristics for use in compacted liners. The aspects evaluated were the retention of contaminants, assessed through batch equilibrium tests and column percolation with a CuCl2.2H2O solution, and the unconfined compressive strength of the compacted soil, needed to support the loads exerted in a landfill. To calculate the adsorption parameters from the batch tests, the isotherms were constructed and linearized and, based on the coefficient of determination, the best fits were obtained with the linear and Freundlich models. The Freundlich isotherm gave the best fit for the cation in all samples, most notably AM-2 and AM-16 with R² of 0.9983 and 0.9978, respectively. In the column percolation tests, the retardation factors (Rd) for Cl- and Cu++ were determined from the breakthrough curves using the methods of Freeze and Cherry (1979) and Shackelford (1994). In unconfined compression, the most significant sample was AM-3, which resisted an average force of 992.1 N, corresponding to an average stress of 477.4 kPa. After an integrated analysis, the best-performing samples were AM-2 and AM-3, although AM-2 failed to qualify in only one scenario devised to assess unconfined compressive strength.
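As a small illustration of the isotherm fitting described above, the sketch below linearises the Freundlich model q = Kf·C^(1/n) and fits it by least squares, reporting R². The concentration and sorbed-mass values are invented for the example, not the thesis data.

```python
# Freundlich isotherm fit from batch-test data via the usual linearisation
# log q = log Kf + (1/n) log C; the numbers below are illustrative only.
import numpy as np

C_eq = np.array([2.0, 5.0, 10.0, 20.0, 40.0])   # equilibrium Cu concentration (mg/L)
q = np.array([1.1, 1.9, 2.8, 4.3, 6.5])         # sorbed mass per soil mass (mg/g)

logC, logq = np.log10(C_eq), np.log10(q)
slope, intercept = np.polyfit(logC, logq, 1)    # least-squares straight line
Kf, n = 10 ** intercept, 1.0 / slope

pred = intercept + slope * logC
r2 = 1 - np.sum((logq - pred) ** 2) / np.sum((logq - logq.mean()) ** 2)
print(f"Kf = {Kf:.2f}, n = {n:.2f}, R^2 = {r2:.4f}")
```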
