  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Computational and experimental investigation of the enzymatic hydrolysis of cellulose

Bansal, Prabuddha 25 August 2011 (has links)
The enzymatic hydrolysis of cellulose to glucose by cellulases is one of the major steps in the conversion of lignocellulosic biomass to biofuel. This hydrolysis by cellulases, a heterogeneous reaction, currently suffers from some major limitations, most importantly a dramatic rate slowdown at high degrees of conversion in the case of crystalline cellulose. Various rate-limiting factors were investigated employing experimental as well as computational studies. Cellulose accessibility and the hydrolysable fraction of accessible substrate (a previously undefined and unreported quantity) were shown to decrease steadily with conversion, while cellulose reactivity, defined in terms of hydrolytic activity per amount of actively adsorbed cellulase, remained constant. Faster restart rates were observed on partially converted cellulose as compared to uninterrupted hydrolysis rates, supporting the presence of an enzyme clogging phenomenon. Cellulose crystallinity is a major substrate property affecting the rates, but its quantification has suffered from lack of consistency and accuracy. Using multivariate statistical analysis of X-ray data from cellulose, a new method to determine the degree of crystallinity was developed. Cel7A CBD is a promising target for protein engineering as cellulose pretreated with Cel7A CBDs exhibits enhanced hydrolysis rates resulting from a reduction in crystallinity. However, for Cel7A CBD, a high throughput assay is unlikely to be developed. In the absence of a high throughput assay (required for directed evolution) and extensive knowledge of the role of specific protein residues (required for rational protein design), the mutations need to be picked wisely, to avoid the generation of inactive variants. To tackle this issue, a method utilizing the underlying patterns in the sequences of a protein family has been developed.
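The multivariate crystallinity idea can be illustrated with a toy sketch (this is not the dissertation's actual method): a measured diffractogram is expressed as a least-squares combination of an assumed crystalline reference profile and an assumed amorphous reference profile, and the crystalline fraction of the fit is reported. All peak positions, widths, and the decomposition itself are illustrative assumptions.

```python
import numpy as np

# Hypothetical illustration: estimate degree of crystallinity by expressing a
# measured X-ray diffractogram as a linear combination of a crystalline and an
# amorphous reference profile (a simplification of the multivariate
# statistical approach described in the abstract).
two_theta = np.linspace(5, 40, 500)

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Assumed reference profiles: sharp peaks for crystalline cellulose,
# one broad hump for the amorphous fraction (positions are approximate).
crystalline = gaussian(two_theta, 22.5, 0.8) + 0.5 * gaussian(two_theta, 15.0, 1.0)
amorphous = gaussian(two_theta, 18.5, 4.0)

def crystallinity(diffractogram):
    """Least-squares fit of the two reference profiles; returns the
    crystalline fraction of the fitted signal."""
    A = np.column_stack([crystalline, amorphous])
    coeffs, *_ = np.linalg.lstsq(A, diffractogram, rcond=None)
    coeffs = np.clip(coeffs, 0, None)  # crude non-negativity guard
    return coeffs[0] / coeffs.sum()

# Synthetic 70% crystalline sample with a little measurement noise.
rng = np.random.default_rng(0)
sample = 0.7 * crystalline + 0.3 * amorphous + rng.normal(0, 0.005, two_theta.size)
print(round(crystallinity(sample), 2))
```

The point of such a decomposition, as opposed to a single peak-height ratio, is that it uses the whole diffractogram, which is one way to reduce the inconsistency the abstract mentions.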
22

Staffing service centers under arrival-rate uncertainty

Zan, Jing, 1983- 13 July 2012 (has links)
We consider the problem of staffing large-scale service centers with multiple customer classes and agent types operating under quality-of-service (QoS) constraints. We introduce formulations for a class of staffing problems that minimize the cost of staffing while requiring that the long-run average QoS achieve a pre-specified level. The queueing models we use to define these staffing problems have random inter-arrival times and random service times; the models we study differ in whether the arrival rates are deterministic or stochastic. In the deterministic version of the problem, we assume that the customer arrival rates are known. Because this problem is computationally challenging to solve exactly, we provide an approximation and prove that its solution is asymptotically optimal, in the sense that the gap between the optimal value of the exact model and the objective value of the approximate solution shrinks to zero as the system grows large. We also focus on doubly stochastic service center systems, i.e., large-scale staffing problems in which the arrival rates are uncertain in addition to the inherent randomness of the system's inter-arrival and service times; this brings the modeling closer to reality. Building on the solution of the deterministic problem, we provide a solution procedure for doubly stochastic service center systems. We consider a decision-making scheme in which staffing levels must be selected before the arrival rates are observed, and we assume that the decision maker has distributional information about the arrival rates at the time of decision making.
In the presence of arrival-rate uncertainty, the decision maker's goal is to minimize the staffing cost while ensuring that the QoS achieves a given level. We show that as the system grows large, there is at most one key scenario under which the probability of waiting converges to a non-trivial value, i.e., a value strictly between 0 and 1; in every other scenario the system is either over- or under-loaded as the size of the system grows to infinity. Exploiting this result, we propose a two-step solution procedure for the staffing problem with random arrival rates. In the first step, we use the desired QoS level to identify the key scenario corresponding to the optimal staffing level. Once the key scenario is found, the random arrival-rate model reduces to a deterministic arrival-rate model. In the second step, we solve the resulting deterministic model using the approximation described above. The approximate optimal staffing level obtained by this procedure asymptotically converges to the true optimal staffing level for the random arrival-rate problem. The decision-making scheme sketched above assumes that the distribution of the random arrival rates is known at the time of decision making; in reality, this distribution must be estimated from historical data and experience and updated as new observations arrive. Another important issue in service center management is that the daily operational period is split into short decision periods, for example hourly periods, and staffing decisions must be made for every period; to achieve an overall optimal daily staffing policy, one must account for the interaction among staffing decisions in adjacent periods. Our work also builds a model that handles these two issues.
We build a two-stage stochastic model with recourse that provides the staffing decisions over two adjacent decision periods, i.e., two adjacent decision stages. The model minimizes the first-stage staffing cost plus the expected second-stage staffing cost while satisfying a service-quality constraint on the second-stage operation. A Bayesian update is used to obtain the second-stage arrival-rate distribution from the first-stage arrival-rate distribution and the arrivals observed in the first stage; this updated distribution enters the constraint on second-stage service quality. After reformulation, we show that our two-stage model can be expressed as a newsvendor model, albeit with a demand that is derived from the first-stage decision. We provide an algorithm that solves the two-stage staffing problem under the most commonly used QoS constraints. This work uses stochastic programming methods to solve problems arising in queueing networks; we hope the ideas put forward in this dissertation lead to further work on decision making under uncertainty for queueing systems that combines techniques from stochastic programming with analysis tools from queueing theory.
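The newsvendor reduction mentioned above can be sketched in a few lines. This is only an illustration of the critical-fractile logic, not the dissertation's formulation: the scenario values, probabilities, and over-/understaffing costs are invented.

```python
import numpy as np

# Illustrative sketch: once a staffing model reduces to a newsvendor problem,
# the optimal level is the critical-fractile quantile of the demand
# distribution. c_under = cost per unit of understaffing (QoS penalty),
# c_over = cost per unit of overstaffing (idle agents); both are assumptions.
c_under, c_over = 30.0, 10.0
critical_fractile = c_under / (c_under + c_over)  # 0.75

# Assumed discrete arrival-rate scenarios -> derived staffing demand.
demand_scenarios = np.array([80, 95, 100, 110, 130])
probabilities = np.array([0.1, 0.2, 0.4, 0.2, 0.1])

def newsvendor_staffing(demand, probs, fractile):
    """Smallest staffing level whose cumulative probability reaches the
    critical fractile."""
    order = np.argsort(demand)
    cdf = np.cumsum(probs[order])
    return demand[order][np.searchsorted(cdf, fractile)]

level = newsvendor_staffing(demand_scenarios, probabilities, critical_fractile)
print(level)
```

With these numbers the cumulative probabilities are 0.1, 0.3, 0.7, 0.9, 1.0, so the 0.75-quantile selects a staffing level of 110.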
23

Advanced Analytical Model for the Prognostic of Industrial Systems Subject to Fatigue

Abou Jaoudé, Abdo 07 December 2012 (has links) (PDF)
This thesis is devoted to the prognostic evaluation of dynamic systems. The work presented here aims at developing an advanced tool for prognostic evaluation, first in linear and nonlinear deterministic contexts and then in a stochastic context. Our purpose is to build a general prognostic tool capable of accurately predicting the remaining useful life (RUL) of a system from an analytical damage accumulation law, in either a deterministic or a stochastic context.
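As a point of reference for what an "analytical damage accumulation law" can look like, here is a minimal RUL sketch under the classic linear (Palmgren-Miner) rule. This is a textbook illustration, not the thesis's actual law; the S-N curve constants and load history are invented.

```python
# Illustrative sketch: remaining useful life (RUL) under linear damage
# accumulation (Palmgren-Miner). Damage per cycle at stress s is 1/N(s),
# where N(s) is an assumed S-N curve; failure is predicted at damage D = 1.
def cycles_to_failure(stress, A=1e12, m=3.0):
    """Assumed S-N curve: N = A * s**-m (A, m are hypothetical constants)."""
    return A * stress ** -m

def remaining_useful_life(load_history, future_stress):
    """Cycles remaining at a constant future stress level, given the damage
    already accumulated by the cycles in `load_history`."""
    damage = sum(1.0 / cycles_to_failure(s) for s in load_history)
    if damage >= 1.0:
        return 0.0
    return (1.0 - damage) * cycles_to_failure(future_stress)

# 40,000 past cycles at 200 MPa, future operation at 250 MPa (invented).
rul = remaining_useful_life([200.0] * 40_000, 250.0)
print(round(rul))
```

Here N(200) = 125,000 cycles, so the accumulated damage is 0.32, and the remaining 68% of life at 250 MPa (N = 64,000) gives about 43,520 cycles.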
24

Multi-Timescale Control of Energy Storage Enabling the Integration of Variable Generation

Zhu, Dinghuan 01 May 2014 (has links)
A two-level optimal coordination control approach for energy storage and conventional generation, consisting of advanced frequency control and stochastic optimal dispatch, is proposed to deal with the real-power balancing control problem introduced by variable renewable energy sources (RESs) in power systems. In the proposed approach, the power and energy constraints on energy storage are taken into account in addition to traditional power system operational constraints such as generator output limits and power network constraints. The advanced frequency control level, based on robust control theory and decentralized static output feedback design, is responsible for system frequency stabilization and restoration, whereas the stochastic optimal dispatch level, based on the concept of stochastic model predictive control (SMPC), determines the optimal dispatch of generation resources and energy storage under the uncertainties introduced by RESs as well as demand. In the advanced frequency control level, low-order decentralized robust frequency controllers for energy storage and conventional generation are designed simultaneously, based on a state-space structure-preserving model of the power system, and the optimal controller gains are obtained via an improved linear matrix inequality algorithm. In the stochastic optimal dispatch level, various optimization decomposition techniques, including both primal and dual decompositions together with two decomposition schemes (scenario-based and temporal-based), are investigated extensively in terms of convergence speed, because the resulting SMPC optimization problem is large-scale and computationally demanding. A two-stage mixed decomposition method is conceived to achieve the maximum speedup of the SMPC solution process. The control design philosophy underlying the entire work is the so-called time-scale matching principle: the conventional generators are mainly responsible for balancing the low-frequency components of the power variations, whereas the energy storage devices, thanks to their fast response capability, are employed to alleviate the relatively high-frequency components. The performance of the proposed approach is tested and evaluated by numerical simulations on both the WECC 9-bus system and the IEEE New England 39-bus system.
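The time-scale matching principle can be sketched with a simple filter split (illustrative only, not the dissertation's controller): conventional generation follows a low-pass-filtered version of the net load, and storage absorbs the fast residual. The smoothing constant and the signal are assumptions.

```python
import numpy as np

# Sketch of time-scale matching: split a fluctuating net-load signal so that
# the slow component goes to conventional generation (here an exponential
# moving average) and the fast residual goes to energy storage.
def split_by_timescale(net_load, alpha=0.05):
    """alpha is an assumed smoothing constant; smaller = slower generator."""
    generator = np.empty_like(net_load)
    generator[0] = net_load[0]
    for t in range(1, len(net_load)):
        generator[t] = (1 - alpha) * generator[t - 1] + alpha * net_load[t]
    storage = net_load - generator  # storage charges (<0) or discharges (>0)
    return generator, storage

rng = np.random.default_rng(1)
t = np.arange(500)
net_load = 100 + 10 * np.sin(2 * np.pi * t / 200) + rng.normal(0, 3, t.size)
gen, sto = split_by_timescale(net_load)

# Power balance holds at every step; the generator trace is much smoother.
print(np.allclose(gen + sto, net_load))
```

The exact balance `gen + sto == net_load` is the discrete analogue of the real-power balancing requirement; the filter bandwidth plays the role of the "matching" between component dynamics and assigned frequency band.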
25

Discrete-Time Stochastic Analysis of Land Combat

Eliiyi, Ugur 01 January 2004 (has links) (PDF)
In this study, we present the implementation and experimental analysis of a modeling approach for analyzing tactical level land combat to generate information for weapon and ammunition planning. The discrete-time stochastic model (DSM), which can handle small and moderately large force levels, is based on single shot kill probabilities. Forces are assumed to be heterogeneous on both sides, and both directed and area fire types are modeled by means of combinatorial analysis. DSM considers overkills and can handle noncombat loss and engagement processes, discrete reinforcements, force combinations and divisions. In addition to experimenting with DSM, we estimate attrition rate coefficients used in Lanchester combat models, such that the two models will yield similar figures for force levels throughout the combat.
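The link between single shot kill probabilities and Lanchester attrition coefficients can be sketched with an expected-value recursion (this is a deterministic toy in the spirit of the abstract, not the DSM itself; all force sizes and probabilities are invented).

```python
# Illustrative discrete-time, directed-fire attrition recursion. Attrition
# coefficients are formed from assumed single shot kill probabilities and
# firing rates, as in Lanchester's square law; overkill is capped.
def lanchester(blue, red, p_blue, p_red, shots_blue=1.0, shots_red=1.0, steps=100):
    """Expected force levels after `steps` periods of mutual directed fire.
    a = shots * kill probability = expected kills per shooter per period."""
    a_blue = shots_blue * p_blue  # blue's per-shooter effectiveness
    a_red = shots_red * p_red
    for _ in range(steps):
        losses_red = min(red, a_blue * blue)   # overkill capped at force size
        losses_blue = min(blue, a_red * red)
        blue, red = blue - losses_blue, red - losses_red
        if blue <= 0 or red <= 0:
            break
    return blue, red

# Square-law effect: the larger force prevails even against a more
# effective (higher kill probability) opponent.
blue, red = lanchester(blue=100, red=70, p_blue=0.02, p_red=0.03)
print(round(blue, 1), round(red, 1))
```

Fitting `a_blue` and `a_red` so that this recursion reproduces the force trajectories of a richer stochastic model is exactly the attrition-rate-coefficient estimation described in the abstract.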
26

Using the eddy covariance technique to measure gas exchanges in a beef cattle feedlot

Prajapati, Prajaya January 1900 (has links)
Doctor of Philosophy / Department of Agronomy / Eduardo Alvarez Santos / Measurements of methane (CH₄) emissions from livestock production could provide invaluable data to reduce uncertainties in the global CH₄ budget and to evaluate mitigation strategies to lower greenhouse gas (GHG) emissions. The eddy covariance (EC) technique has recently been applied as an alternative for measuring CH₄ emissions from livestock systems, but heterogeneities in the source area and fetch limitations impose challenges on EC measurements. The main objectives of this study were to: 1) assess the performance of a closed-path EC system for measuring CH₄, CO₂, and H₂O fluxes; 2) investigate the spatial variability of EC fluxes in a cattle feedlot using flux footprint analysis; 3) estimate CH₄ emission rates per animal (Fanimal) from a beef cattle feedlot using the EC technique combined with two footprint models: an analytical footprint model (KM01) and a parametrization of a Lagrangian dispersion model (FFP); and 4) compare CH₄ emissions obtained using the EC technique and footprint analysis with CH₄ emission estimates provided by a well-established backward-Lagrangian stochastic (bLS) model. A closed-path EC system was used to measure CH₄, CO₂, and H₂O fluxes. To evaluate the performance of this closed-path system, a well-established open-path EC system was also deployed on the flux tower to measure CO₂ and H₂O exchange. Methane concentration measurements and wind data provided by that system were used to estimate CH₄ emissions with the bLS model. The performance assessment, which included comparing gas cospectra and measured fluxes from the two EC systems, showed that the closed-path system was suitable for the EC measurements. Flux values were quite variable during the field experiment, and a one-dimensional flux footprint model was useful for interpreting some of the temporal and spatial flux dynamics.
A more comprehensive data analysis was then carried out using two-dimensional footprint models (FFP and KM01) to interpret fluxes and to scale fluxes from the landscape level to the animal level. The monthly average Fanimal, calculated using the footprint-weighted stocking density, ranged from 83 to 125 g animal⁻¹ d⁻¹ (KM01) and from 75 to 114 g animal⁻¹ d⁻¹ (FFP). These emission values are consistent with results from previous feedlot studies; however, our results also suggest that on some occasions the movement of animals in the pens could have affected CH₄ emission estimates. Comparisons between EC and bLS CH₄ emission estimates show good agreement between the two methods (concordance coefficient of 0.84). In addition, the precision of the EC estimates relative to the bLS estimates was improved by using a more rigorous fetch screening criterion. Overall, these results indicate that the eddy covariance technique can be used to accurately measure CH₄ emissions from feedlot cattle. However, further work is still needed to quantify the uncertainties in Fanimal caused by errors in flux footprint model estimates and by animal movement.
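The method agreement reported above is summarised by a concordance coefficient; a common choice for this is Lin's concordance correlation coefficient, sketched below. The emission values in the example are invented, not the study's data.

```python
import numpy as np

# Lin's concordance correlation coefficient: 1 = perfect agreement with the
# 1:1 line, 0 = no agreement. Unlike Pearson's r, it penalises both location
# and scale shifts between the two methods.
def lin_ccc(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

ec = np.array([90, 105, 110, 98, 120, 83])    # hypothetical g animal^-1 d^-1
bls = np.array([95, 100, 118, 102, 115, 90])  # hypothetical bLS estimates
print(round(lin_ccc(ec, bls), 2))
```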
27

Stochastic model of stored-product pressures for estimating the structural reliability of slender silos

Andrés Batista Cheung 24 August 2007 (has links)
Vertical silos are structures with a high incidence of excessive deformation and failure, caused mainly by limited knowledge of the variability of the pressures exerted by the stored product. The aim of this work is a theoretical, numerical and experimental study of the pressures exerted by granular stored products on the walls of slender silos, incorporating stochastic parameters into the pressure models reported in the international literature. The most relevant parameters of the pressure models were fitted to experimental data obtained from a pilot silo using maximum likelihood estimation (MLE), with genetic algorithms (GAs) as the optimization procedure. The experimental evaluation in the pilot silo was conducted with three bulk solids: soybean, maize and animal feed. Using the parameter variabilities found in the experiments, the structural reliability of cylindrical steel silos with corrugated walls and a flat bottom was evaluated by Monte Carlo simulation. The results showed that the pressure models of Janssen (1895) and Jenike et al. (1973) can be used to compute pressures with the parameter variabilities represented by lognormal distributions. The estimated probability of failure for this structural system is above internationally recommended limits, indicating that special attention should be given to the design of slender vertical silos.
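The reliability procedure described above can be sketched as follows: Janssen's (1895) horizontal wall pressure with lognormally distributed parameters, and a failure probability estimated by Monte Carlo simulation. All numbers (geometry, parameter statistics, wall resistance) are invented for illustration and are not the thesis's values.

```python
import numpy as np

# Janssen horizontal pressure p_h(z) = (gamma*R/mu) * (1 - exp(-K*mu*z/R));
# z depth (m), gamma bulk unit weight (kN/m^3), mu wall friction coefficient,
# K lateral pressure ratio, R hydraulic radius (m).
def janssen_horizontal_pressure(z, gamma, mu, K, R=1.5):
    return (gamma * R / mu) * (1.0 - np.exp(-K * mu * z / R))

rng = np.random.default_rng(42)
n = 100_000
# Lognormal parameter samples with medians typical of grain (assumptions).
gamma = rng.lognormal(np.log(8.0), 0.05, n)   # kN/m^3
mu = rng.lognormal(np.log(0.4), 0.10, n)
K = rng.lognormal(np.log(0.5), 0.10, n)

pressure = janssen_horizontal_pressure(20.0, gamma, mu, K)  # deep slender silo
resistance = 33.0   # assumed wall pressure capacity, kPa
p_fail = float(np.mean(pressure > resistance))
print(p_fail)
```

The Monte Carlo failure probability is simply the fraction of sampled parameter sets whose computed pressure exceeds the assumed wall capacity; the thesis compares such estimates against recommended reliability limits.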
28

Evaluation of stochastic models in GNSS positioning

Silva, Heloísa Alves da. January 2009 (has links)
Abstract: Nowadays GNSS, and especially GPS, is one of the most widely used positioning technologies. The functional models associated with GNSS observables are better known than the stochastic models, since the development of the latter is more complex. GNSS positioning normally employs a simplified stochastic model, the standard model, which assumes that all GNSS observables have the same variance and are statistically independent. Recently, however, GNSS stochastic models have been investigated in greater depth, for example by taking ionospheric scintillation effects into account. This effect can be considered in the stochastic modelling because current GNSS receivers allow the extraction of ionospheric scintillation parameters. A further type of stochastic modelling accounts for the variation of the satellite elevation angles during data tracking. This research developed and analysed these two cases of stochastic modelling for both relative and absolute (point) positioning. In relative positioning, the stochastic model based on ionospheric scintillation improved the results by about 93.0% compared with the standard model; the GPS data used in the processing and analysis were collected in northern Europe under ionospheric scintillation conditions. In relative positioning with the stochastic model based on satellite elevation angles, the improvement was about 89.2%. For point positioning, the improvements over the standard stochastic model reached approximately 45.1% and 42.1%, considering, respectively... (Full abstract: follow the electronic access link below) / Advisor: Paulo de Oliveira Camargo / Co-advisor: João Francisco Galera Monico / Committee: Mauricio Alfredo Gende / Committee: Silvio Jacks dos Anjos Garnés / Master's
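An elevation-angle-dependent stochastic model of the kind discussed above is commonly written as sigma²(el) = a² + b²/sin²(el), so low-elevation satellites get down-weighted. The sketch below is illustrative only; the coefficients a and b are assumptions, not values from this work.

```python
import numpy as np

# Elevation-dependent stochastic model for GNSS observables: observation
# variance grows at low elevation, sigma^2(el) = a^2 + b^2 / sin^2(el).
def elevation_weight_matrix(elevations_deg, a=0.003, b=0.003):
    """Diagonal weight matrix for least-squares positioning:
    weight = 1 / sigma^2(elevation). Units of a, b are assumed (m)."""
    el = np.radians(np.asarray(elevations_deg, float))
    variances = a ** 2 + b ** 2 / np.sin(el) ** 2
    return np.diag(1.0 / variances)

W = elevation_weight_matrix([15, 30, 60, 90])
weights = np.diag(W)
print(weights.round(0))
```

Replacing the identity-like weight matrix of the standard model with such a `W` in the least-squares adjustment is what distinguishes this stochastic model from the equal-variance assumption.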
29

Implementation of a process management model and inventory control to increase the level of service in the after-sales area of industrial equipment

Guzman, Pamela, Montalvo, Franklin, Carvallo, Edgardo, Raymundo, Carlos 01 January 2019 (has links)
Currently, many industrial companies incur substantial losses in their after-sales operations, caused by errors in inventory control and by mishandling of assets throughout their processes. This research therefore proposes a process management and inventory control model intended to reduce customer losses caused by the large number of existing complaints and thereby increase sales. The process management model contributes to the standardization, optimization and control of the company's critical processes; indicators, procedures and policies are proposed for each stage of the process in order to measure its efficiency and effectiveness and support sound decision making. For inventory control, the products are first segmented to identify those most important to the company; a demand forecast model is then designed for product families and later individualized according to their initial factors; finally, an inventory management model yields the optimal spare-parts order quantity for the equipment's technical service, reducing annual inventory-related costs.
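The inventory side of the model can be sketched with two standard building blocks: ABC segmentation by annual usage value, then a classic economic order quantity (EOQ) for the important items. Product data, costs and class thresholds below are invented for illustration; they are not from the paper.

```python
import math

# ABC segmentation: rank items by annual usage value (demand * unit cost)
# and assign classes by cumulative share of total value (thresholds assumed).
products = {  # name: (annual demand, unit cost)
    "pump seal": (1200, 40.0),
    "bearing":   (800, 25.0),
    "filter":    (300, 5.0),
    "gasket":    (100, 2.0),
}

def abc_classes(items, a_cut=0.8, b_cut=0.98):
    ranked = sorted(items.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
    total = sum(d * c for d, c in items.values())
    classes, running = {}, 0.0
    for name, (d, c) in ranked:
        running += d * c
        share = running / total
        classes[name] = "A" if share <= a_cut else ("B" if share <= b_cut else "C")
    return classes

def eoq(demand, order_cost, holding_cost):
    """Economic order quantity: sqrt(2 * D * S / H)."""
    return math.sqrt(2 * demand * order_cost / holding_cost)

print(abc_classes(products))
print(round(eoq(demand=1200, order_cost=50.0, holding_cost=6.0), 1))
```

Class-A items would then receive the individualized forecasts and tighter replenishment policies the abstract describes, while C items can be managed with simpler rules.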
30

Stochastic models for protein production: the impact of autoregulation, cell cycle and protein production interactions on gene expression

Dessalles, Renaud 11 January 2017 (has links)
The mechanism of protein production, to which the majority of a bacterium's resources are devoted, is highly stochastic: every biochemical reaction involved in this process results from random collisions between molecules, which are potentially present in low quantities. A good understanding of gene expression therefore requires stochastic models able to characterise the different origins of protein production variability as well as the biological devices that potentially control it. In this context, we analysed the variability of a protein produced under a negative autoregulation mechanism, i.e. when the protein is a repressor of its own gene. The goal is to clarify the effect of this feedback on the variance of the number of produced proteins. For the same average protein production, we compared the equilibrium variance of a protein produced with the autoregulation mechanism and of one produced in "open loop". By studying the model under a scaling regime, we were able to perform this comparison analytically. It appears that autoregulation does reduce the variance, but the reduction is limited: an asymptotic result shows that the variance cannot be reduced by more than 50%. Since the effect on the equilibrium variance is moderate, we searched for another possible effect of autoregulation: we observed that convergence to equilibrium is faster in a model with autoregulation.
Classical models of protein production usually consider a constant volume, without division or gene replication, and with constant concentrations of RNA-polymerases and ribosomes. Yet it has been suggested in the literature that the variations of these quantities during the cell cycle may contribute to protein variability. We propose a series of models of increasing complexity that aims to reach a realistic representation of gene expression. In a model whose volume changes along the cell cycle, we successively integrate the protein production mechanism (transcription and translation), the random segregation of compounds at division, and gene replication. The last model then integrates all the genes of the cell and takes into account their interactions in the production of the different proteins through a common pool of RNA-polymerases and ribosomes, available in limited quantities. For the models where it was possible, the mean and the variance of the concentration of each protein were determined analytically using the formalism of Marked Poisson Point Processes. In the more complex cases, we estimated the variance using stochastic simulations. It appears that, among all the mechanisms studied, the main source of variability is the protein production mechanism itself (so-called "intrinsic noise"). Among the other, "extrinsic" aspects, only the random segregation of compounds at division seems to have a potentially significant impact on the variance; the other aspects show only a limited effect on protein concentration. These results were confronted with experimental measurements and show a still unexplained discrepancy between the theoretical predictions and the biological data, which calls for new hypotheses about other possible sources of variability. To conclude, the processes studied have allowed a better understanding of biological phenomena by exploring hypotheses that are difficult to test experimentally. From the models studied we were able to identify certain trends theoretically, showing that stochastic modelling is an important tool for a good understanding of gene expression mechanisms.
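The kind of stochastic simulation used for the more complex models above can be illustrated with a minimal Gillespie-style birth-death model of protein number (production at rate lam, degradation/dilution at rate mu per molecule). This is only a toy sketch, not the thesis's models; a negative-autoregulation variant would replace `lam` by, e.g., `lam / (1 + n / K)`.

```python
import numpy as np

# Minimal Gillespie (stochastic simulation algorithm) run for a birth-death
# protein model; returns the time-averaged protein number, whose steady-state
# expectation is lam / mu.
def gillespie_birth_death(lam=10.0, mu=1.0, t_end=2000.0, seed=0):
    rng = np.random.default_rng(seed)
    t, n = 0.0, 0
    total_time, weighted_n = 0.0, 0.0
    while t < t_end:
        rate = lam + mu * n                 # total event rate
        dt = rng.exponential(1.0 / rate)    # time to next reaction
        total_time += dt
        weighted_n += n * dt                # time-weighted average accumulator
        n += 1 if rng.random() < lam / rate else -1  # birth vs death
        t += dt
    return weighted_n / total_time

mean_n = gillespie_birth_death()
print(round(mean_n, 1))  # steady-state mean is lam/mu = 10
```

Running the same loop with a feedback-dependent production rate, and comparing the resulting variances at equal means, is the basic numerical experiment behind the open-loop versus autoregulation comparison discussed above.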
