231 |
A flexible suite of programs for modelling the cortex with a mean-field scheme. Chang, Yuan-Kuei, January 2007.
The cerebral cortex contains a vast number of neurons. A neuron is part of the nervous system; it receives and transmits electrical signals, and these signals underlie human behaviour. Because neurons carry charge, their activity produces electrical fields, so neural signals can be measured with scalp electrodes in electroencephalography (EEG). As long as the brain is alive, the spontaneous activity of neurons produces a continuous stream of EEG signals. Many models have been developed for simulating cortical signals, and each is typically focused on a different purpose or application. Often, different computer code has to be written for each application, which is inefficient. This project therefore aims to develop a software system for simulating cortical signals in which the underlying model can be changed easily. Furthermore, the system is required to be versatile and easy to use across many applications. The developed system is written in MATLAB, in response to a user requirement, and applies to essentially any model that uses a mean-field approach; only the model-specific inputs need to be modified to change the model. This thesis details how the system was developed. Its main limitation is computational resources, as with most cortical modelling. However, all the user requirements have been satisfied. The system can simulate the response of the neurons under arbitrary conditions and generate simulated EEG data for the user, who can then analyse the cortical activity with standard signal processing techniques such as the power spectrum. The software is particularly useful for research into sleep and anaesthesia.
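As a rough illustration of the post-processing step the abstract mentions, the sketch below computes a power spectrum from a simulated EEG trace. This is a minimal stand-in in Python (the thesis system itself is written in MATLAB), using a synthetic 10 Hz alpha-like rhythm in place of real simulator output.

```python
import numpy as np
from scipy.signal import welch

fs = 250.0                        # sampling rate of the simulated EEG, Hz
t = np.arange(0, 10, 1 / fs)      # ten seconds of data
rng = np.random.default_rng(0)
# Toy stand-in for simulator output: a 10 Hz rhythm plus background noise
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=1024)   # Welch power spectral density
peak = freqs[np.argmax(psd)]      # dominant frequency of the simulated signal
```

With real simulator output in place of the synthetic trace, the spectral peak would reveal dominant cortical rhythms (e.g. the alpha band during relaxed wakefulness).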
|
232 |
ADVANCED MODELLING OF EMULSION TERPOLYMERISATION FOR ONLINE OPTIMISATION AND CONTROL. Srour, Mourtada H, January 2008.
Doctor of Philosophy (PhD). Polymer manufacturing is a major worldwide industry, attracting the attention of numerous industrial units and research institutes. Increasing demands on polymer quality, process safety and cost reduction are the main reasons for growing interest in the design and control of emulsion polymerisation. Emulsion polymerisation by free-radical mechanisms has significant advantages over other processes: production of higher-molecular-weight polymer at high conversion rates, easier temperature control due to the low viscosity of the reaction medium, a high degree of selectivity, and environmental friendliness due to the use of an aqueous medium. It allows the production of particles with specially tailored properties, including size, composition, morphology and molecular weight. Introducing two or more different monomers into the polymerisation (multi-polymerisation) can lead to the synthesis of an almost unlimited number of new polymer types. Emulsion polymers are products-by-process: the manner in which the polymerisation is carried out is often more important than the raw materials in determining the form of the final product. This highlights the significance of a systematic approach to online process control, which requires a thorough understanding of the process phenomena as a prerequisite for developing a mathematical model of the process. Research observations make it evident that process control of emulsion terpolymerisation is a particularly difficult task because of the lack of validated models and of online measurements of most polymer properties of interest. A well-validated model is therefore crucial for optimising and controlling emulsion terpolymerisation operations and for designing the polymer product properties.
In this study, a framework for process design and control of emulsion terpolymerisation reactors was developed. The framework consists of three consecutive stages: dynamic modelling of the process, optimisation to find optimal operating strategies, and online tracking of the resulting optimal trajectories through multivariable constrained model predictive control. Within this framework, a comprehensive dynamic model was developed. A test case of emulsion terpolymerisation of styrene, methyl methacrylate and methyl acrylate was then investigated on state-of-the-art facilities for predicting, optimising and controlling end-use product properties, including global and individual conversions, terpolymer composition, average particle diameter and concentration, glass transition temperature, molecular weight distribution, the number- and weight-average molecular weights, and particle size distribution. The resulting model was exploited to understand emulsion terpolymerisation behaviour and to undertake model-based optimisation so as to readily develop optimal feeding recipes. The model equations include diffusion-controlled kinetics at high monomer conversion, where the transition from a 'zero-one' to a 'pseudo-bulk' regime occurs. Transport equations describe the system transients for batch and semi-batch processes. Particle evolution is described by population balance equations, which comprise a set of integro-partial differential and nonlinear algebraic equations. A backward finite difference approximation is used to discretise the population balance equations, converting them from partial differential equations into ordinary differential equations. The model equations were solved in the advanced simulation environment of the gPROMS package. The dynamic model was then used to determine optimal control policies for emulsion terpolymerisation in a semi-batch reactor using multiobjective dynamic optimisation.
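The backward-difference discretisation described above can be sketched for a deliberately simplified, growth-only population balance, dn/dt + G dn/dr = 0; the growth rate, size grid and initial distribution below are illustrative placeholders, not values from the thesis model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Method of lines: discretise particle radius r with backward differences,
# turning the population balance PDE into a system of ODEs in time.
G = 1.0e-9                              # constant particle growth rate, m/s
r = np.linspace(20e-9, 200e-9, 181)     # radius grid, 1 nm spacing
dr = r[1] - r[0]

def rhs(t, n):
    dndt = np.empty_like(n)
    dndt[0] = -G * n[0] / dr                 # zero inflow at the smallest size
    dndt[1:] = -G * (n[1:] - n[:-1]) / dr    # backward (upwind) difference
    return dndt

n0 = np.exp(-((r - 50e-9) / 10e-9) ** 2)     # initial Gaussian distribution
sol = solve_ivp(rhs, (0.0, 30.0), n0, method="BDF")
peak_radius = r[np.argmax(sol.y[:, -1])]     # distribution advects to ~80 nm
```

The backward (upwind) form is the stable choice for positive growth rates; the full thesis model couples such equations to nucleation, kinetics and algebraic constraints inside gPROMS.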
The approach allows the implementation of constrained optimisation procedures for systems described by the complex mathematical models of emulsion terpolymerisation reactors. The control vector parameterisation approach was adopted in this work. The styrene, MMA and MA monomer feed rates, the surfactant and initiator feed rates, and the reactor temperature were used as manipulated variables to produce terpolymers of desired composition, molecular weight distribution (MWD) and particle size distribution (PSD). The particle size polydispersity index (PSPI), the molecular weight polydispersity index (MWPI) and the overall terpolymer composition ratios were incorporated as the objective functions to optimise the PSD, MWD and terpolymer composition, respectively. The optimised operational policies were successfully validated with experiments in a stirred tank polymerisation reactor. Because key product attributes of emulsion terpolymerisation cannot be measured online, an inferential calorimetric soft sensor was developed based on temperature measurements: online measurements of the reactor temperature and of the jacket inlet and outlet temperatures are used to estimate the rate of polymerisation. The sensor model includes the mass and energy balance equations over the reactor and its peripherals; the energy balances account for the heat of reaction, internal and external heat transfer, and external heat losses. An online multivariable constrained model predictive controller was then formulated and implemented for the emulsion terpolymerisation process. To achieve this, a novel generic multilayer control architecture for real-time implementation of optimal control policies for particulate processes was developed.
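The calorimetric soft-sensor idea can be sketched in a few lines: infer the heat of reaction, and hence the polymerisation rate, from an energy balance over the reactor. All numbers below are made-up placeholders, not plant or thesis parameters.

```python
# Hypothetical energy-balance soft sensor:
#   Qr = m_cp * dT/dt + UA * (T - T_jacket) + Q_loss,  Rp = Qr / (-dH)
m_cp = 8.0e3      # reactor contents heat capacity, J/K (illustrative)
UA = 150.0        # jacket heat-transfer coefficient * area, W/K (illustrative)
Q_loss = 20.0     # external heat loss, W (illustrative)
dH = -6.0e4       # heat of polymerisation, J/mol (exothermic, illustrative)

def polymerisation_rate(T, T_jacket, dTdt):
    """Estimate the polymerisation rate Rp (mol/s) from temperatures."""
    Q_r = m_cp * dTdt + UA * (T - T_jacket) + Q_loss
    return Q_r / (-dH)

Rp = polymerisation_rate(T=333.0, T_jacket=328.0, dTdt=0.002)  # ~0.0131 mol/s
```

In practice the thesis sensor is richer (mass balances, inlet/outlet jacket temperatures, transport delays), but the core inference from temperature to reaction rate has this shape.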
This strategy implements the dynamic model of the emulsion terpolymerisation as a real-time soft sensor incorporated within the MPC. The methodology was successfully validated in six case studies of online control of terpolymerisation reactors, in which the terpolymer composition, PSD and Mn were controlled online subject to constraints on the conversion and the average particle radius. An advanced supervisory control architecture named ROBAS was used in this work; it provides a completely automated architecture for real-time advanced supervisory monitoring and control of complex systems. The real-time control strategy was developed within MATLAB, Simulink, gPROMS and Microsoft Excel and implemented online through the ROBAS architecture. The manipulated variables are measured online and connected to the DCS system through Honeywell. The measurements are sent to MATLAB and then to the dynamic model in gPROMS through an Excel spreadsheet interface; the dynamic model uses them to estimate the controlled variables for the MPC. The estimated controlled variables are sent to Simulink and fed through the DCS system to the MPC developed in MATLAB, which calculates optimal trajectories that are sent as set-point signals through the DCS system to the regulatory controller. The MPC formulation was found to be robust and to handle process disturbances. The results showed that the online multivariable constrained MPC controller was able to drive the composition and Mn to their specified set points with great accuracy and, under constrained conditions, to drive the PSD to the desired target. Although some offset was observed owing to a certain degree of model mismatch, the experimental results agreed well with the predictions.
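The receding-horizon loop at the heart of any MPC scheme like the one above can be sketched on a toy first-order process; the plant model, horizon and weights here are stand-ins, not the gPROMS terpolymerisation model or its tuning.

```python
import numpy as np
from scipy.optimize import minimize

# Toy plant x[k+1] = a*x[k] + b*u[k]; MPC repeatedly optimises a finite
# input sequence, applies only the first move, then re-optimises.
a, b = 0.9, 0.5
setpoint, horizon = 1.0, 10
u_min, u_max = -1.0, 1.0                  # input constraints (constrained MPC)

def cost(u_seq, x0):
    x, J = x0, 0.0
    for u in u_seq:
        x = a * x + b * u                 # predict forward with the model
        J += (x - setpoint) ** 2 + 0.01 * u ** 2
    return J

x = 0.0
for _ in range(20):                       # closed-loop simulation
    res = minimize(cost, np.zeros(horizon), args=(x,),
                   bounds=[(u_min, u_max)] * horizon)
    x = a * x + b * res.x[0]              # apply only the first optimal move
# x settles near the setpoint
```

In the thesis the prediction step is the full dynamic model running as a soft sensor, and the "plant" is the real reactor reached through the DCS, but the optimise-apply-repeat structure is the same.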
|
233 |
Thermal-hydraulic modelling of Forsmark 1 NPP in TRACE: Validation versus the 25th of July, 2006 plant transient. Bladh, Lisa, January 2010.
Thermal-hydraulic codes are in widespread use in the nuclear industry; they are used to analyse the transient and steady-state behaviour of nuclear power plants. The US Nuclear Regulatory Commission, which has long experience of developing such codes, is now incorporating the capabilities of its earlier codes into one modern simulation tool called TRACE. The code is under development, and validation work is required, especially for BWR applications; eventually it is expected to replace similar codes such as TRAC and RELAP5.
With this in mind, a TRACE model of Forsmark 1 has been set up to investigate how well it can simulate a plant transient. On the 25th of July, 2006 there was a disturbance at Forsmark 1 that caused the RPV water level and pressure to decrease. In this project, plant data acquired during the event are used to validate the model of Forsmark 1. The validation work focuses on comparing measured and calculated water levels and pressures in the RPV during the transient.
The results show qualitatively good agreement with the validation data; however, during one period of the simulations there are large discrepancies in the RPV pressure and water level. In total, 13 simulations are performed, studying the influence of parameters such as the simulation time-step size, the feed-water flow boundary conditions and the steam line isolation valve characteristics. Based on the results, a number of recommendations for further work are made.
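A code-versus-plant comparison of the kind described above typically reduces to simple discrepancy metrics between measured and simulated traces. The sketch below computes RMS and peak deviation for an RPV water-level transient; the two curves are made-up placeholders, not actual Forsmark 1 plant data or TRACE output.

```python
import numpy as np

# Hypothetical measured vs simulated RPV water level during a transient
t = np.linspace(0, 60, 121)                       # seconds into the event
measured = 4.0 - 0.5 * (1 - np.exp(-t / 20))      # level drop, metres
simulated = 4.0 - 0.55 * (1 - np.exp(-t / 18))    # slightly mismatched model

rmse = np.sqrt(np.mean((simulated - measured) ** 2))   # overall agreement
max_dev = np.max(np.abs(simulated - measured))         # worst-case mismatch
```

Repeating such metrics across the 13 sensitivity runs (time step, feed-water boundary conditions, valve characteristics) is one way to rank which parameters drive the discrepancies.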
|
234 |
Electromechanical Behaviour of Surface-Bonded Piezoelectric Sensors/Actuators with Imperfect Adhesive Layers. Jin, Congrui, 11 1900.
The performance of smart structures depends on the electromechanical behaviour of piezoelectric sensors/actuators and on the bonding condition along the interface that connects the sensor/actuator and the host structure. This thesis documents a theoretical study of the influence of the material parameters of an imperfect bonding layer on the coupled electromechanical characteristics of piezoelectric sensors/actuators. A one-dimensional sensor/actuator model with an imperfect bonding layer, which undergoes a shear deformation, is proposed. The emphasis of the study is on the local stress and strain fields near imperfectly bonded sensors/actuators and on the load transfer. Analytical solutions based on the integral equation method are provided. Detailed numerical simulation is conducted to evaluate the influence of the geometry and the material mismatch of the adhesive layer upon the sensing/actuating process. The interfacial debonding and its effect upon the strain/stress distribution and the overall performance of the integrated structure are evaluated in detail.
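The load-transfer mechanism through a shear-deforming adhesive is often illustrated with a classical one-dimensional shear-lag analysis, which gives an interfacial shear stress of the form tau(x) ~ sinh(kx)/cosh(kL) that concentrates at the actuator ends. The sketch below evaluates this (it is a textbook shear-lag form, not the thesis's integral-equation solution, and all material values are illustrative).

```python
import numpy as np

G_a = 1.0e9        # adhesive shear modulus, Pa (illustrative)
t_a = 50e-6        # adhesive thickness, m (illustrative)
E_p = 60e9         # actuator Young's modulus, Pa (illustrative)
h_p = 0.5e-3       # actuator thickness, m (illustrative)
L = 10e-3          # actuator half-length, m
eps0 = 1e-4        # free piezoelectric actuation strain

k = np.sqrt(G_a / (t_a * E_p * h_p))   # shear-lag decay parameter, 1/m
x = np.linspace(-L, L, 401)
# Antisymmetric interfacial shear stress, zero at the centre,
# peaking at the actuator edges where debonding tends to initiate
tau = E_p * h_p * eps0 * k * np.sinh(k * x) / np.cosh(k * L)
```

A softer or thicker (more imperfect) adhesive lowers k and spreads the transfer over a longer length, which is the qualitative trend the thesis quantifies with its integral-equation model.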
|
235 |
Measuring variation: an epistemological account of causality and causal modelling. Russo, Federica, 17 June 2005.
This doctoral dissertation deals with causal modelling in the social sciences. The specific question addressed here is: what is the notion, or the rationale, of causality involved in causal models? The answer to that epistemological query emerges from a careful analysis of the social science methodology, of a number of paradigmatic case studies and of the philosophical literature.
The main result is the development of the rationale of causality as the measure of variation. This rationale conveys the idea that to test – i.e. to confirm or disconfirm – causal hypotheses, social scientists test specific variations among variables of interest. The notion of variation is shown to be embedded in the scheme of reasoning of probabilistic theories of causality, in the logic of structural equation models and covariance structure models, and is also shown to be latent in many philosophical accounts.
Further, the rationale of causality as measure of variation leaves room for a number of epistemological consequences concerning the warrant of the causal interpretation of structural models, the levels of causation, and the interpretation of probability.
Firstly, it is argued that what guarantees the causal interpretation is the sophisticated apparatus of causal models, which is made of statistical, extra-statistical and causal assumptions, of a background context, of a conceptual hypothesis and of a hypothetico-deductive methodology. Next, a novel defence of twofold causality is provided, and a principle to connect population-level and individual-level causal claims is offered. Last, a Bayesian interpretation of probability is defended; in particular, it is argued that empirically-based Bayesianism is the interpretation that best fits the epistemology of causality presented here.
The rationale of variation is finally shown to be involved in, or at least consistent with, a number of alternative accounts of causality: notably, the mechanist and counterfactual approaches, agency-manipulability theories and epistemic causality, singularist accounts, and causal analysis by contingency tables.
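A toy rendering of the "measure of variation" rationale in the structural-equation setting the dissertation discusses: for y = b*x + e, the causal hypothesis is tested by measuring whether variation in x is systematically accompanied by variation in y, i.e. by estimating b. The data and coefficient below are simulated for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=2000)                        # hypothesised cause
y = 0.7 * x + rng.normal(scale=0.5, size=2000)   # effect; true variation b = 0.7

# Measured variation: b_hat = cov(x, y) / var(x)
b_hat = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
# A non-zero b_hat shows that variation in x is accompanied by variation
# in y; on the dissertation's account this confirms (without proving)
# the causal hypothesis, given the model's background assumptions.
```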
|
236 |
Microsimulating Residential Mobility and Location Choice Processes within an Integrated Land Use and Transportation Modelling System. Habib, Muhammad Ahsanul, 13 April 2010.
This research investigates motivational and procedural aspects of households' long-term residential location decisions. The main goal is to develop microbehavioural models of location processes in order to implement this critical land use component within a microsimulation-based model of Integrated Land Use, Transportation and Environment (ILUTE). The research takes a disaggregate and longitudinal approach to developing the models, consistent with the real-world decision-making of households concerning their moves from one residence to another over time. It identifies two sequential model components to represent households' relocation behaviour: (1) a model of residential mobility that determines whether a household decides to become active in the housing market, and (2) a (re)location choice model. Both components are empirically investigated using retrospective surveys of housing careers. For the residential mobility decision, the research tests continuous-time hazard duration models and discrete-time panel logit models, and attempts to capture heterogeneity effects due to repeated choices within both modelling techniques. A discrete-time random parameter model is selected for implementation within ILUTE since it incorporates time-varying covariates. Assuming a sequential decision process, this mobility decision model is linked to the (re)location choice model, which establishes preference orderings over the set of dwelling units each active household considers when relocating within the housing market. A unique feature of the (re)location model developed in this research is that it incorporates reference dependence, explicitly recognising the role of the status quo and capturing asymmetric responses towards gains and losses in location choice decisions.
The research then estimates an asking price model, which is used to generate base prices for active dwellings that interact with active households through a market clearing process within the microsimulation environment. A multilevel model that simultaneously accounts for both temporal and spatial heterogeneity is developed using multi-period property transaction data. Finally, the research simulates the evolution of households' location choices over a twenty-year period (1986-2006) and compares the results against observed location patterns.
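The discrete-time logit mobility decision described above can be sketched as a logistic hazard: each year a household becomes active in the housing market with probability p = 1/(1+exp(-(b0 + b1*duration))). The data are simulated and the coefficients hypothetical, not the ILUTE estimates, and the sketch omits the random-parameter heterogeneity used in the thesis.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
true_b = np.array([-2.0, 0.15])            # baseline and duration effect
duration = rng.integers(1, 20, size=5000)  # years at current dwelling
X = np.column_stack([np.ones(duration.size), duration.astype(float)])
p = 1.0 / (1.0 + np.exp(-X @ true_b))
moved = rng.random(duration.size) < p      # observed move / stay outcomes

def negloglik(b):
    # Negative log-likelihood of the logit model, written with logaddexp
    # for numerical stability at large |z|
    z = X @ b
    return np.sum(np.logaddexp(0.0, z) - moved * z)

b_hat = minimize(negloglik, np.zeros(2), method="BFGS").x
# b_hat recovers the simulated coefficients approximately
```

In the microsimulation, each household's fitted probability feeds a random draw that decides whether it enters the market and moves on to the (re)location choice stage.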
|
237 |
3D Magnetic Resonance Image-based Cardiac Computer Models of Cardiac Electrophysiology. Pop, Mihaela Paula, 22 February 2011.
There is a clear need for improved methods (e.g. computer modelling and imaging) to characterise the substrate of abnormal rhythms such as ventricular tachycardia (VT) developed by patients who have suffered a heart attack. Progress leading to improved disease management and treatment planning (based on predictive models), as well as outcomes assessment, will have an immediate impact on the quality of life of this large patient population. Prior to integration into clinical applications, the predictive models have to be properly validated using experimental techniques selected to reflect the electrophysiological phenomena at spatio-temporal scales similar to those considered in the simulations.
This thesis advances toward this goal by addressing the challenge of building more accurate models of electrophysiology for individual hearts. It describes a novel construction of a realistic 3D cardiac model from Magnetic Resonance Images (MRI), with the long-term aim of predicting propagation of the electrical impulse in normal and pathologic large hearts (translatable to human hearts) and the associated inducibility of VT. To parameterize the model, an original method for evaluating the electrophysiological (EP) characteristics of heart tissue was used. The method combined state-of-the-art experimental physiology tools, such as optical fluorescence imaging with voltage-sensitive dyes and a CARTO electro-anatomical system, with a cardiac computer model generated from high-resolution MR scans of explanted normal and pathologic porcine hearts. Several input model parameters (e.g., conductivity, anisotropy, restitution) were successfully adjusted using the ex-vivo measurements of action potential to yield close correspondence between model output and experiments. Moreover, a simple, fast, macroscopic mathematical model was used, with computation times of less than one hour, making it attractive for clinical EP applications.
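One widely used "simple, fast, macroscopic" cardiac model of the kind mentioned above is the two-variable Aliev-Panfilov model; the abstract does not name the thesis's exact formulation, so the single-cell sketch below is illustrative. Here u is the normalised transmembrane potential and v a recovery variable, with standard literature parameter values.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Standard Aliev-Panfilov parameters (dimensionless time units)
k, a, eps, mu1, mu2 = 8.0, 0.15, 0.002, 0.2, 0.3

def aliev_panfilov(t, y):
    u, v = y
    du = k * u * (u - a) * (1.0 - u) - u * v
    dv = (eps + mu1 * v / (mu2 + u)) * (-v - k * u * (u - a - 1.0))
    return [du, dv]

# Start just above the excitation threshold a to trigger one action potential
sol = solve_ivp(aliev_panfilov, (0.0, 100.0), [0.2, 0.0], max_step=0.5)
u = sol.y[0]
# u shows a fast upstroke toward 1 followed by repolarisation back to rest
```

In an organ-level model these cell kinetics are coupled through a diffusion term whose anisotropic conductivity tensor is exactly the kind of parameter the thesis tuned against optical-imaging and CARTO measurements.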
|