41

Study of power plant with carbon dioxide capture ability through modelling and simulation

Biliyok, Chechet
With an increased urgency for global action towards climate change mitigation, this research was undertaken with the aim of evaluating post-combustion CO2 capture as an emission abatement strategy for gas-fired power plants. A dynamic rate-based model of a capture plant with MEA solvent was built, with imposed chemical equilibrium, and validated at pilot scale under transient conditions. The model predicted plant behaviour under multiple process inputs and disturbances. The validated model was then used to analyse the process, and it was found that CO2 absorption is mass transfer limited. The model was subsequently improved by explicitly adding reaction rates to the model continuity equations, the first such dynamic model to be reported for the capture process. The improved model was again validated and observed to provide better predictions than the previous model. Next, high-fidelity models of a gas-fired power plant, a scaled-up capture plant and a compression train were built and integrated for 90% CO2 capture. Steam for solvent regeneration is extracted from the power plant IP/LP crossover pipe. Net efficiency drops from 59% to 49%, with increased cooling water demand. A 40% exhaust gas recirculation recovered 1% of efficiency, showing that enhanced mass transfer in the capture plant reduces the solvent regeneration energy demand. Economic analysis reveals that overnight cost increases by 58% with CO2 capture, and the cost of electricity by 30%. While this discourages deployment of capture technology, natural gas prices remain the largest driver of the cost of electricity. Other integration approaches (a dedicated boiler, and steam extraction from the LP steam drum) were explored for operational flexibility, and their net efficiencies were found to be 40% and 45%, respectively. Supplementary firing of exhaust gas may be a viable option for retrofit, as it is shown to minimise integrated plant output losses at a net efficiency of 43.5%. Areas identified for further study are solvent substitution, integrated plant part-load operation, flexible control, and the use of rotating packed beds for CO2 capture.
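
To make concrete what "explicitly adding reaction rates to the model continuity equations" amounts to, a minimal liquid-phase species balance for a rate-based absorber segment is sketched below; the notation and the second-order MEA-CO2 kinetics are generic textbook choices used only for illustration, not expressions taken from the thesis.

    \frac{\partial C_i^{L}}{\partial t}
      = -\frac{\partial \left( u_L\, C_i^{L} \right)}{\partial z}
        + N_i\, a_e + R_i,
    \qquad
    R_{\mathrm{CO_2}} = -k_2\, C_{\mathrm{CO_2}}^{L}\, C_{\mathrm{MEA}}^{L}

Here C_i^L is the liquid-phase concentration of species i, u_L the liquid velocity, N_i a_e the interfacial mass-transfer rate per unit packed volume, and R_i the reaction source term; under the earlier equilibrium assumption, R_i is instead eliminated through the equilibrium relations.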
42

Applications of Impulsive Differential Equations to the Control of Malaria Outbreaks and Introduction to Impulse Extension Equations: a General Framework to Study the Validity of Ordinary Differential Equation Models with Discontinuities in State

Church, Kevin, January 2014
Impulsive differential equations are often used in mathematical modelling to simplify complicated hybrid models. We propose an inverse framework inspired by impulsive differential equations, called impulse extension equations, which can be used as a tool to determine when these impulsive models are accurate. The linear theory is the primary focus, for which theorems analogous to those for ordinary and impulsive differential equations are derived. Results explicitly connecting the stability of impulsive differential equations to related impulse extension equations are proven in what we call time scale consistency theorems. Opportunities for future research in this direction are discussed. Following the work of Smith? and Hove-Musekwa on malaria vector control by impulsive insecticide spraying, we propose a novel autonomous vector control scheme based on human disease incidence. Existence and stability of periodic orbits are established. We compare the implementation cost of the incidence-based control to that of a fixed-time spraying schedule. Hybrid control strategies are discussed.
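
For orientation, the standard form of an impulsive differential equation couples a continuous flow with instantaneous state jumps at impulse times:

    \dot{x}(t) = f(t, x(t)), \qquad t \neq \tau_k,
    \qquad
    \Delta x(\tau_k) = x(\tau_k^{+}) - x(\tau_k^{-}) = I_k\!\left( x(\tau_k^{-}) \right), \qquad k = 1, 2, \ldots

On one reading of the abstract, an impulse extension equation replaces the instantaneous jump I_k with dynamics acting over a short interval of positive length, so that the idealized impulsive model can be checked against the less abrupt behaviour it abbreviates; this gloss is offered only as orientation, not as the thesis's own definition.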
43

Development of a First-principle Model of a Semi-batch Rhodium Dissolution Process

Nkoghe Eyeghe, Norbertin, January 2017
First-principle modelling of chemical processes and their unit operations has been of great interest to the chemical process industry, as well as the control and allied industries, over the past decades. This is because it offers the opportunity to develop virtual representations (models) of real process systems, which can be used to describe and predict the dynamic behaviour of those systems. These models are based on the fundamentals of the transport phenomena of fluid dynamics (involving momentum transfer), mass transfer, and energy transfer of the systems they describe. A first-principle model of a semi-batch rhodium dissolution chemical process has been developed. It describes the dynamic behaviour of two exothermic reactions occurring simultaneously in a semi-batch process: the dissolution of 29 kg of solid crude rhodium sponge (Rh) into 546 L of a hydrochloric acid solution (HCl(aq)) to produce aqueous rhodium(III) chloride (RhCl3.H2O), and the reaction of chlorine (Cl2(aq)) with water (H2O(l)) to produce additional HCl(aq) in the reactor. The model was formulated as a system of explicit ordinary differential equations (ODEs) and demonstrated good, stable qualitative tracking of the temperature and pressure data of the real reactor. The molar responses of all chemical species, as well as the heats of reaction, were consistent with the description of the process, and no negative values of those variables were generated. Estimates of the key parameters, namely the heat and mass transfer coefficients, Arrhenius constants, and activation energies of the reactions, were assumed and tuned to a satisfactory fit by trial and error, but not optimised: whenever those parameters were varied, large derivatives appeared in some model equations and the numerical solver would often fail to integrate them, stopping the simulation. Finally, the model was validated against a set of data from 45 batches. Across all simulations, the simulated temperature responses predicted the data better than the simulated pressure responses, with average percentage accuracies of 80% and 60%, respectively. / Dissertation (MSc)--University of Pretoria, 2017. / Anglo American Platinum / BluESP (Pty) Ltd / Chemical Engineering / MSc / Unrestricted
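
A minimal sketch of how such an explicit-ODE semi-batch model might be set up and integrated is given below; the stoichiometry, rate laws, and every numerical value are hypothetical placeholders, not the thesis model.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Placeholder kinetics: (1) Rh + 3 HCl -> RhCl3 (+ H2), (2) Cl2 + H2O -> HCl (+ HOCl),
    # with Cl2 fed continuously to the semi-batch reactor. All constants are assumed.
    k1 = 1e-7      # dissolution rate constant [1/(mol*s)], assumed
    k2 = 1e-3      # chlorine hydrolysis rate constant [1/s], assumed
    F_Cl2 = 0.02   # constant Cl2 feed rate [mol/s], assumed

    def rhs(t, y):
        n_Rh, n_HCl, n_Cl2, n_RhCl3 = y      # mole inventories in the reactor
        r1 = k1 * n_Rh * n_HCl               # dissolution rate [mol/s]
        r2 = k2 * n_Cl2                      # hydrolysis rate [mol/s]
        return [-r1,                         # Rh consumed
                -3.0 * r1 + r2,              # HCl consumed by (1), produced by (2)
                F_Cl2 - r2,                  # Cl2 fed and consumed
                r1]                          # RhCl3 produced

    # Rough initial charge: 29 kg Rh (~282 mol) and 546 L of acid at an assumed 5 mol/L.
    y0 = [29e3 / 102.91, 5.0 * 546.0, 0.0, 0.0]
    sol = solve_ivp(rhs, (0.0, 6 * 3600.0), y0, method="LSODA", max_step=60.0)
    print(sol.y[:, -1])                      # final mole inventories after 6 h

In the full model, an energy balance for the reactor temperature and a pressure relation for the head space would be appended to the same state vector.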
44

VALIDATION OF THE PSS/E MODEL FOR THE GOTLAND NETWORK

Hrag, Margossian, January 2010
The aim of the project is to revise the load flow and dynamic PSS/E models of the Gotland network and validate them against a set of measurements collected during a major disturbance, a three-phase short circuit in the 70 kV system. The main task in revising the model is to convert the induction machine models of the wind turbines into user and manufacturer wind turbine models. The validation of the model is divided into two phases. The first is to use the measurements, together with some assumptions on the wind power generation and load distribution at the time of the fault, to validate the dynamic behaviour of the system. The second is to use new measurements from a normal operation day. The latter would not be very helpful for illustrating the dynamic behaviour of the system, because no major fault drastically affects the system, but it would nevertheless be useful for validating the load flow with greater accuracy.
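
A minimal sketch of the kind of quantitative check behind such a validation is shown below: the simulated response is interpolated onto the measurement timestamps and an error metric is reported. The signals are synthetic placeholders, and extracting channels from PSS/E itself is not shown.

    import numpy as np

    # Hypothetical measured and simulated bus-voltage responses around the fault (per unit).
    t_meas = np.linspace(0.0, 5.0, 501)
    v_meas = 1.0 - 0.40 * np.exp(-3.0 * t_meas)
    t_sim = np.linspace(0.0, 5.0, 201)               # simulation output time points
    v_sim = 1.0 - 0.38 * np.exp(-2.8 * t_sim)

    v_sim_i = np.interp(t_meas, t_sim, v_sim)        # align the two time bases
    rmse = np.sqrt(np.mean((v_sim_i - v_meas) ** 2))
    fit = 100.0 * (1.0 - np.linalg.norm(v_sim_i - v_meas)
                   / np.linalg.norm(v_meas - v_meas.mean()))
    print(f"RMSE = {rmse:.4f} p.u., fit = {fit:.1f} %")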
45

Reprezentace kontinentality v regionálních klimatických modelech / Continentality representation in regional climate models

Hudek, Jakub, January 2021
Continentality of climate is one of the basic climate phenomena, describing the climate at a given location in terms of the annual variation of basic meteorological elements such as temperature, precipitation, etc. It is usually expressed by indices, determined either from observational data or from climate model simulations. The goal is typically to assess the ability of climate models to represent the present state of the climate and to analyse scenarios of its future evolution, with Europe as the study area. This diploma thesis briefly introduces continentality and its indices, global and regional climate models, the ERA-Interim reanalysis, and the EURO-CORDEX initiative. Individual simulations are then processed, analysed, and compared against observations from the E-OBS dataset.
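
As a concrete example of such an index, one widespread formulation, the Conrad continentality index, can be computed directly from monthly mean temperatures and latitude; the thesis may rely on other indices, and the formula below should be checked against it.

    import numpy as np

    def conrad_continentality(monthly_mean_temp_c, latitude_deg):
        # Conrad index: K = 1.7 * A / sin(latitude + 10 deg) - 14,
        # where A is the annual amplitude of monthly mean temperature (deg C).
        amplitude = max(monthly_mean_temp_c) - min(monthly_mean_temp_c)
        return 1.7 * amplitude / np.sin(np.radians(latitude_deg + 10.0)) - 14.0

    # Placeholder monthly means for a central European station (deg C).
    monthly = [-1.4, -0.4, 3.6, 8.8, 13.9, 17.0, 18.5, 18.2, 14.1, 9.1, 3.5, 0.0]
    print(conrad_continentality(monthly, 50.1))   # assumed latitude of about 50 deg N

Applied to a regional climate model, the same index would be evaluated on the simulated monthly temperature fields and compared, grid cell by grid cell, with the value derived from E-OBS observations.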
46

Meteorologische Einflüsse auf die Konzentrationen feiner und grober atmosphärischer Aerosolpartikel in Deutschland / Meteorological influences on the concentrations of fine and coarse atmospheric aerosol particles in Germany

Engler, Christa, 10 February 2014
Atmospheric aerosol particles are released into the atmosphere by a broad spectrum of natural and anthropogenic emissions at widely varying concentrations. They influence the radiation budget and thus the Earth's climate, and through their presence in the atmosphere they can also interact with humans and nature, i.e. the entire ecosystem. Since 2010, limit values for the PM10 daily mean concentration have been in force in the EU; however, they were already exceeded at many measurement stations within a few months of coming into effect. The aim of the present work was an objective assessment of the origin and state of the air mass arriving at a measurement site and of the associated pollutant levels. In the first part of the work, five years of PM10 measurement data in and around Leipzig, and likewise in five different regions across Germany, were examined with respect to exceedances of the PM10 limit values. Backward trajectories were used for a cluster analysis that distinguished specific weather situations, to which the individual measurement days and their associated pollutant concentrations were then assigned. This showed that, depending on the meteorological conditions, both locally and regionally emitted pollutants can accumulate near the ground or be dispersed spatially. In addition, a model validation was presented: the model system COSMO-MUSCAT/ext-M7 was used, and its results were compared with observational data. First the description of the meteorological conditions was compared, then the spatial distribution of PM10, the chemical particle composition, and physical aerosol parameters such as particle number, volume, and diameter. The results indicate that problems remain in the description of microphysical aerosol properties. The model can reproduce the orders of magnitude of the compared parameters, but model results must still be interpreted with caution, especially with regard to processes in which the particle number plays a role.
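
A minimal sketch of the back-trajectory cluster analysis described above: each backward trajectory is flattened into a feature vector of its hourly positions and the vectors are grouped with k-means. The trajectory data, cluster count, and PM10 values are synthetic placeholders, and the thesis may use a different clustering algorithm.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    n_days, n_hours = 365, 72                       # one 72-h back trajectory per measurement day
    # Hypothetical (lat, lon) positions per backward hour, random walks starting near Leipzig.
    traj = rng.normal(scale=0.3, size=(n_days, n_hours, 2)).cumsum(axis=1) + [51.3, 12.4]

    features = traj.reshape(n_days, -1)             # flatten each trajectory into one vector
    labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(features)

    # Assign each day's PM10 value to its trajectory cluster (placeholder PM10 data).
    pm10 = rng.lognormal(mean=3.0, sigma=0.5, size=n_days)
    for k in range(6):
        sel = labels == k
        print(f"cluster {k}: {sel.sum():3d} days, mean PM10 = {pm10[sel].mean():.1f} ug/m3")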
47

Empirical Validation and Comparison of the Hybrid Coordinate Ocean Model (HYCOM) Between the Gulf of Mexico and the Tongue of the Ocean

Cleveland, Cynthia A., 04 December 2018
Ocean models are increasingly able to synthesize a large temporal domain with fine spatial resolution. With this increase in functionality and availability, ocean models are in high demand by researchers, establishing a critical need for validating a model's ability to represent interior ocean dynamics. Satellite measurements are typically used for validation; however, these measurements are limited to the upper layers of the ocean, so sea surface height and sea surface temperature are the most frequently validated output parameters of three-dimensional ocean models. Unfortunately, there is a paucity of model validation studies for the interior ocean. This study fills that knowledge gap by contrasting model data from the Hybrid Coordinate Ocean Model (HYCOM) for the interior ocean in the Tongue of the Ocean (TOTO), Bahamas, and the Gulf of Mexico (GoM) against observational (i.e., in situ) data collected in both locations. Conductivity, temperature, and depth (CTD) data in the GoM were collected during five research cruises by the DEEPEND Consortium between May 2015 and May 2017; these data were collected as part of the investigation into the impact of oil spills on faunal communities in deep water of the GoM. CTD and expendable CTD (XCTD) data in the TOTO were collected by the Naval Undersea Warfare Center (NUWC) detachment Atlantic Undersea Test and Evaluation Center (AUTEC), in support of U.S. Navy acoustic testing between 1997 and 2017, to characterize the sound velocity profile of the water column. The global 1/12° HYCOM configuration (GLBu0.08) fit the TOTO in situ data in the upper 400 m for temperature and 250 m for salinity better than the GoM 1/25° HYCOM configuration (GOMI0.04) fit the GoM in situ data at the same depths. Conversely, the GoM 1/25° configuration fit the GoM data better at depths of 500 m and deeper for temperature and 300 m and deeper for salinity than the global configuration fit the TOTO data at the same depths. A comprehensive comparison of the vertical profiles between model and observational data for each region of interest provides insight into using HYCOM forecast data for future applications.
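
A minimal sketch of the kind of depth-resolved comparison described above: co-located casts and model profiles are placed on a common depth grid and per-depth error statistics are computed. All arrays are synthetic placeholders; matching casts to HYCOM grid cells and reading the model output are not shown.

    import numpy as np

    rng = np.random.default_rng(1)
    depths = np.arange(0.0, 1000.0, 10.0)                    # common depth grid [m]
    n_casts = 30

    # Hypothetical temperature profiles: observed CTD casts and co-located model profiles.
    ctd_T = 25.0 * np.exp(-depths / 300.0) + rng.normal(0.0, 0.3, size=(n_casts, depths.size))
    hycom_T = 25.0 * np.exp(-depths / 320.0) + rng.normal(0.0, 0.3, size=(n_casts, depths.size))

    bias = np.mean(hycom_T - ctd_T, axis=0)                  # per-depth mean bias [deg C]
    rmse = np.sqrt(np.mean((hycom_T - ctd_T) ** 2, axis=0))  # per-depth RMSE [deg C]
    for i in (0, 25, 50, 75):                                # report a few depth levels
        print(f"{depths[i]:6.0f} m: bias = {bias[i]:+.2f}, RMSE = {rmse[i]:.2f}")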
48

Validation and Evaluation of Emergency Response Plans through Agent-Based Modeling and Simulation

Helsing, Joseph
Biological emergency response planning plays a critical role in protecting the public from the possibly devastating results of sudden disease outbreaks. These plans describe the distribution of medical countermeasures across a region using limited resources within a restricted time window. Thus, the ability to determine that such a plan will be feasible, i.e. successfully provide service to affected populations within the time limit, is crucial. Many current efforts to validate plans take the form of live drills and training, but those may not test plan activation at the appropriate scale or with sufficient numbers of participants. This necessitates the use of computational resources to aid emergency managers and planners in developing and evaluating plans before they must be used. Current emergency response plan generation software packages, such as RE-PLAN or RealOpt, provide rate-based validation analyses. However, these types of analysis may neglect details of real-world traffic dynamics. Therefore, this dissertation presents Validating Emergency Response Plan Execution Through Simulation (VERPETS), a novel computational system for the agent-based simulation of biological emergency response plan activation. This system converts raw road network, population distribution, and emergency response plan data into a format suitable for simulation, and then performs these simulations using SUMO (Simulation of Urban MObility) to capture realistic traffic dynamics. Additionally, high performance computing methodologies were utilized to decrease agent load on simulations and improve performance. Further strategies, such as agent scaling and a time limit on simulation execution, were also examined. Experimental results indicate that the time to plan completion, i.e. the time when all individuals of the population have received medication, determined by VERPETS aligned well with current alternate methodologies. Traffic congestion at the point of dispensing (POD) itself was found to be one of the major factors affecting plan completion time, which in turn allowed plan completion times to be calculated more rapidly. Thus, this system provides not only a novel methodology for validating emergency response plans, but also a validation of other current plan-validation strategies.
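
For contrast with the agent-based approach, a minimal sketch of the rate-based style of feasibility check mentioned above is given here; the population, POD count, and throughput figures are placeholders, and this is not a reproduction of RE-PLAN or RealOpt.

    # Rate-based feasibility check: can the opened PODs together serve the affected
    # population within the response time window? (All numbers are placeholders.)
    population = 1_200_000          # people to be served
    n_pods = 24                     # points of dispensing (PODs) opened by the plan
    throughput_per_pod = 1_000      # people served per POD per hour, assumed
    time_window_h = 48              # required completion window in hours

    capacity = n_pods * throughput_per_pod * time_window_h
    print(f"capacity = {capacity:,} vs demand = {population:,}")
    print("feasible (rate-based)" if capacity >= population else "infeasible (rate-based)")
    # An agent-based simulation such as VERPETS additionally resolves travel times and
    # traffic congestion en route to and at each POD, which this static check ignores.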
49

Simulation of an Oxidizer-Cooled Hybrid Rocket Throat: Methodology Validation for Design of a Cooled Aerospike Nozzle

Brennen, Peter Alexander, 01 June 2009
A study was undertaken to create a finite element model of a cooled-throat converging/diverging rocket nozzle to be used as a tool in designing a cooled aerospike nozzle. Using ABAQUS, a simplified 2D axisymmetric model was created featuring only the copper throat and stainless steel support ring, which were brazed together for the experimental test firings. This analysis was a sequentially coupled thermal/mechanical model. The steady-state thermal data matched the experimental data closely. The subsequent mechanical model predicted a life of over 300 cycles using the Manson-Halford fatigue life criteria. A mesh convergence study was performed to establish solution mesh independence. This model was expanded by adding the remaining parts of the nozzle aft of the rocket motor in an attempt to match the transient nature of the experimental data. This model included variable hot-gas-side coefficients in the nozzle, calculated using the Bartz equation and mapped onto the surface of the model using a FORTRAN subroutine. Additionally, contact resistances were accounted for between the additional parts. The results from the preliminary run suggested the need to re-evaluate the cold-side gas condition parameters. Parametric studies were performed on contact resistance and cold-side film coefficient. These led to a final thermal contact conductance of k=0.005 BTU/s·in.·°R for contact between metals and k=0.001 BTU/s·in.·°R for contact between graphite and metal, and a cold-side film coefficient of h=0.03235 BTU/s·in.²·°R. The transient curves matched closely and the results were judged acceptable. Finally, a 3D sector model was created using parameters identical to the 2D model, except that a variable cold-side film condition was added. Instead of modeling a symmetric one- or two-inlet/one- or two-outlet cooling channel, this modeled a one-inlet/one-outlet nozzle in which the coolant traveled almost the full 360° around the cooling annulus. To simplify the initial simulation, the model was cut at the barrier between inlet and outlet to form one large sector, rather than accounting for thermal gradients across this barrier. This simplified nozzle produced the expected data, and a 3D full-nozzle model was created. The cold-side film coefficients were calculated from previous experimental data using a simplified 2D finite difference approach. The full-nozzle model was created in the same manner as the 2D full-nozzle model. A mesh convergence study was performed to establish solution mesh independence. The 3D model results matched the experimental data well, and the model was considered a useful tool for the design of an oxidizer-cooled aerospike nozzle.
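
A minimal sketch of the transient thermal problem underlying such a model, reduced to one-dimensional conduction through the copper throat wall with convective boundaries on the hot-gas and coolant sides; the film coefficients, temperatures, and geometry are placeholders rather than the thesis values, and the ABAQUS model is of course far more detailed.

    import numpy as np

    # Explicit 1D finite-difference conduction through a flat copper wall.
    L, n = 0.01, 51                        # wall thickness [m], grid points
    dx = L / (n - 1)
    k, rho, cp = 390.0, 8900.0, 385.0      # nominal copper properties (W/m.K, kg/m3, J/kg.K)
    alpha = k / (rho * cp)
    h_hot, T_hot = 5_000.0, 3200.0         # hot-gas film coefficient [W/m2.K] and gas temperature [K], assumed
    h_cool, T_cool = 60_000.0, 290.0       # coolant-side film coefficient and temperature, assumed

    dt = 0.4 * dx**2 / alpha               # explicit stability limit with margin
    T = np.full(n, 290.0)                  # initial wall temperature [K]
    for _ in range(int(5.0 / dt)):         # march 5 s of firing
        Tn = T.copy()
        Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
        # Convective boundary conditions via energy balances on the half-cells at each face.
        Tn[0] = T[0] + 2.0 * dt / (rho * cp * dx) * (h_hot * (T_hot - T[0]) + k * (T[1] - T[0]) / dx)
        Tn[-1] = T[-1] + 2.0 * dt / (rho * cp * dx) * (h_cool * (T_cool - T[-1]) + k * (T[-2] - T[-1]) / dx)
        T = Tn
    print(f"hot-side wall temperature after 5 s: {T[0]:.0f} K, coolant side: {T[-1]:.0f} K")

A sequentially coupled analysis of the kind described in the abstract would then feed a temperature field like this into a mechanical model to evaluate thermal stresses and fatigue life.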
50

The Parameter Signature Isolation Method and Applications

McCusker, James Richard, 13 May 2011
The aim of this research was to develop a method of system identification that draws inspiration from the approach taken by human experts for simulation model tuning and validation. Human experts are able to use their natural pattern recognition ability to identify the various shape attributes, or signatures, of a time series from simulation model outputs, and they can intelligently and effectively perform tasks ranging from system identification to model validation. However, the feature extraction approach they employ cannot be readily automated, because shape attributes are difficult to measure. To bridge the gap between the approach taken by human experts and traditional iterative approaches, a method to quantify shape attributes was devised. The method presented in this dissertation, the Parameter Signature Isolation Method (PARSIM), uses the continuous wavelet transform to characterize specific aspects of the time series shape through surfaces in the time-scale domain. A salient characteristic of these surfaces is their enhanced delineation of the model outputs and/or their sensitivities. One benefit of this enhanced delineation is the capacity to isolate regions of the time-scale plane, termed parameter signatures, wherein individual output sensitivities dominate all the others. The parameter signatures enable each model parameter's error to be estimated separately, making them directly applicable to parameter estimation. The proposed parameter estimation method has unique features, one of them being its capacity for noise suppression: because it relies entirely on the time-scale domain, noise can be compensated directly in that domain. Yet another use of parameter signatures is in measurement selection, whereby the existence of parameter signatures indicates the identifiability of model parameters through the various outputs. The effectiveness of PARSIM is demonstrated on an array of theoretical models, such as the Lorenz system and the Van der Pol oscillator, as well as on real-world simulation models of an injection molding process and a jet engine.
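
A minimal sketch of the core idea: transform each parameter's output sensitivity into the time-scale plane with a continuous wavelet transform and look for regions where one sensitivity dominates the others. The sensitivities below are synthetic toy signals, the dominance threshold is arbitrary, and PARSIM's actual signature-extraction criteria are more involved.

    import numpy as np
    import pywt

    t = np.linspace(0.0, 10.0, 1000)
    # Toy output sensitivities with respect to two parameters (placeholders).
    sens = {
        "theta1": np.exp(-t) * np.sin(2.0 * np.pi * 1.0 * t),
        "theta2": (1.0 - np.exp(-0.5 * t)) * np.sin(2.0 * np.pi * 0.2 * t),
    }

    scales = np.arange(1, 64)
    cwt_mag = {}
    for name, s in sens.items():
        coefs, _ = pywt.cwt(s, scales, "morl")   # continuous wavelet transform (Morlet)
        cwt_mag[name] = np.abs(coefs)            # magnitude over the time-scale plane

    # Crude "signature": time-scale points where theta1's sensitivity is at least
    # twice as large as theta2's (the factor of 2 is an arbitrary illustrative choice).
    dominance = cwt_mag["theta1"] > 2.0 * cwt_mag["theta2"]
    print(f"theta1 dominates on {dominance.mean():.1%} of the sampled time-scale plane")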
