  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
241

Assessing the Regularity and Predictability of the Age-Trajectories of Healthcare Utilization

Turnbull, Margaret 20 August 2012 (has links)
This research examines the viability of a need-based approach that models the age-trajectories of healthcare utilization. We propose a fundamentally different way of treating age in modeling healthcare use: rather than treating age as a need indicator, we refocus modeling efforts on predicting the age-trajectories of healthcare use. Using inpatient hospital utilization data from the Discharge Abstract Database, we first model the age-trajectories of the rate of hospital use with a common functional form. Second, we assess variation in these age-trajectories using growth curve modeling. Third, we explain variation in these age-trajectories using census variables. Our analysis shows that regional variation in the age-trajectories of the rate of inpatient hospital use is sufficient to justify this method and can be partially explained by census variables. This indicates that modeling the age-trajectories of healthcare use is advantageous, and that the current need-based approach may benefit from this new modeling strategy.
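The "common functional form" idea in this abstract can be sketched as a simple parametric fit of utilization rates against age. The exponential form and the data below are illustrative assumptions, not the thesis's actual specification or data:

```python
import math

def fit_log_linear(ages, rates):
    """Fit rate(age) = exp(a + b*age) by ordinary least squares on log(rate)."""
    ys = [math.log(r) for r in rates]
    n = len(ages)
    mx = sum(ages) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in ages)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ages, ys))
    b = sxy / sxx          # growth rate of utilization with age
    a = my - b * mx        # intercept on the log scale
    return a, b

# Hypothetical regional hospital-use rates (admissions per 1,000) by age-group midpoint.
ages = [25, 35, 45, 55, 65, 75, 85]
rates = [40, 50, 70, 95, 140, 210, 320]
a, b = fit_log_linear(ages, rates)
predicted_75 = math.exp(a + b * 75)
```

Comparing fitted (a, b) pairs across regions, rather than treating age as one covariate among many, is the kind of refocusing the abstract describes.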
242

Hajj crowd management: Discovering superior performance with agent-based modeling and queueing theory

Khan, Imran 12 1900 (has links)
The thesis investigates how Agent-Based Modeling and Simulation (ABMS) and Queueing Theory (QT) techniques help manage mass gathering (MG) crowds. The techniques are applied to the Hajj, one of the most complex annual MGs, with a focus on its challenging Tawaf ritual. The objective is to develop a Tawaf Decision Support System (DSS) to better understand Tawaf crowd dynamics and discover decisions that lead to superior performance. TawafSIM, the ABMS model in the DSS, simulates macro-level Tawaf crowd dynamics through micro-level pilgrim modeling to explore the impact of crowd characteristics, facility layout, and management preferences on emergent crowd behaviours with respect to throughput, satisfaction, health, and safety. TawafQT, the QT model in the DSS, explores the impact of pilgrim arrival rate and Tawaf throughput on expected arrival, departure, and waiting times, along with average queue length, in the Tawaf waiting area. The thesis provides several contributions. First, it is the only Tawaf research to use a hybrid ABMS and QT approach. Second, TawafSIM is a comprehensive Tawaf simulator. It incorporates features for pilgrim characteristics, facility design, and management preferences. It calculates eight metrics of Tawaf performance: one for throughput, three for satisfaction, one for health, and three for safety. It is the only Tawaf simulator to estimate satisfaction and the spread of infectious disease. It conducts 42 simulation experiments in 12 categories and generates observations of emergent, tipping-point, expected, and counterintuitive behaviours. It recommends a default scenario as the best decision, along with a small subset of alternative scenarios that provide above-average Tawaf performance, and it generates a Tawaf Crowd Management Guide for understanding Tawaf crowd dynamics and pursuing above-average performance under different conditions.
Third, TawafQT is the only study of the Tawaf waiting area. It uses an accurate queueing model with a finite source, single service, and phase-type (PH) distribution, which is applicable not only to the Tawaf and other Hajj-related queueing systems but to any queueing system with finite-population and single-service characteristics.
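The finite-source, single-server structure of TawafQT can be illustrated with the classic machine-repair-style queue. The sketch below uses exponential service as a simplifying stand-in for the thesis's PH-type distribution, and the rates are invented for illustration:

```python
from math import factorial

def finite_source_mm1(N, lam, mu):
    """Steady-state distribution and mean number in system for a
    finite-source M/M/1 queue: N potential customers, each arriving at
    rate lam while outside the system, one server with rate mu.
    (Exponential service here; the thesis uses a PH-type distribution.)"""
    rho = lam / mu
    # p_n is proportional to N!/(N-n)! * rho^n for n = 0..N
    weights = [factorial(N) // factorial(N - n) * rho ** n for n in range(N + 1)]
    total = sum(weights)
    probs = [w / total for w in weights]
    mean_in_system = sum(n * p for n, p in enumerate(probs))
    return mean_in_system, probs

# Illustrative numbers: 10 pilgrims cycling through a single-entry waiting area.
L_sys, probs = finite_source_mm1(N=10, lam=0.5, mu=2.0)
```

Because the population is finite, the arrival rate falls as the queue fills, which is exactly why an infinite-source M/M/1 model would misestimate waiting in a bounded crowd.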
243

Computer-Based Interactive Medical Simulation

Cotin, Stéphane 11 July 2008 (has links) (PDF)
Interactive computer-based medical simulation is a transformative technology for improving the effectiveness of many medical procedures while reducing the level of risk for patients. Although aimed primarily at training, these simulations could, in the near future, be used to plan complex procedures or even to assist the practitioner/clinician in the operating room. This manuscript presents a detailed review of the multidisciplinary field of medical simulation and illustrates our various contributions to the field. After an overview, in Chapter I, of numerous applications of medical simulation, Chapter II describes our contributions on models, from anatomical modeling (to create realistic, potentially patient-specific representations of human anatomy) to biomechanical modeling (to determine the characteristics of soft tissues and define mathematical models describing their behaviour). Issues related to the modeling of medical devices (flexible instruments or imaging systems) and to physiological modeling (for computing blood flow, for example) are also addressed. Chapter III focuses on modeling the interactions between instruments and soft tissues, which play a very important part in any medical procedure. The various techniques required to model such interactions (collision detection, contact modeling, and haptic rendering) are described in this chapter. Chapter IV presents several contributions related to validation, whether for comparing deformable models or for evaluating training systems.
Chapter V is devoted to describing various simulator prototypes developed in the course of this research, and Chapter VI presents our recent work on the development of an Open Source platform dedicated to medical simulation. This platform, called SOFA, is the result of an international collaborative effort through which we hope to bring together many research teams. Finally, Chapter VII summarizes our contributions and presents a set of perspectives and challenges, particularly in the areas of simulation and planning on patient-specific data.
244

Religion and the Evolution of Democracy: A Revised Selectorate Model for the Arab Spring

Bagherpour, Amir K. 01 January 2012 (has links)
2011 was a seminal year in the history of the Middle East and North Africa (MENA). In what is popularly referred to as the Arab Spring, the region experienced a wave of revolutions and instability. Events in 2011 can be classified into three broad categories: uprisings that overthrew standing regimes, uprisings that failed to overthrow standing regimes, and states that experienced no popular revolt. In the first category, Libya, Egypt, Yemen, and Tunisia all experienced uprisings resulting in the respective departures of Muamar Gaddafi, Hosni Mubarak, Ali Abdullah Saleh, and Zine Al Abidine Ben Ali. In contrast, Syria and Bahrain have experienced uprisings that have not toppled their regimes thus far. Finally, countries such as Saudi Arabia and Iran experienced none of the instability observed in 2011 within the same time period. In tracking the evolution of selectorates, I identified the rise of actors within the newly developing coalitions whose Islamist preferences are unaccounted for in the standard Selectorate Model. As explained in detail later, Selectorate Theory is driven by the public-private goods argument: a leader's political survival is based on the mix of private payoffs he can provide to his selectorate and public goods provided to the general population. The once secular despots are either gone or on their way out, as evidenced by the removal of Hosni Mubarak, Zine Abidine Ben-Ali, Saddam Hussein, Muamar Gaddafi, and Ali Abdullah Saleh, and by the currently embattled Bashar Al-Assad. They have been, or are being, replaced by governments led by Islamic parties. Therefore, newly elected or appointed leaders must account for the role of religion in their calculus for political survival in a way that they did not before. This raises the question: what about the regimes in my case studies that have not been toppled, such as Saudi Arabia, Iran, and Bahrain?
Although these are highly autocratic governments, their leaders derive legitimacy from the implicit approval of their Islamist allies. This strengthens the argument that religion must be accounted for beyond the standard Selectorate Model rationale for political survival in MENA. In this context I provide a revised Selectorate Model that accounts for the role of religion. I conclude that standard Selectorate Theory is insufficient for MENA because it does not account for the role of religion. By testing the coalitional distribution and evolution of selectorates, I developed a revised Selectorate Model that includes the role of religion alongside the standard private payoffs versus public goods argument. The role of religion is expressed by the presence of religious stakeholders in the agent-based model, such as the clerics, shura councils, or Islamic parties present in all selectorates in MENA. I tracked the selectorates through a series of predictions made over the course of 2011 using the Senturion agent-based model, which serves as a powerful alternative to standard historical analysis and wisdom. Using agent-based modeling (ABM) applied to Selectorate Theory, I explain why certain regimes fell while others remained relatively stable, and why some governments experiencing similar instability remain in place.
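The revised calculus described above, private payoffs and public goods plus a religious-legitimacy term, can be caricatured in a few lines. Every name, weight, and number below is an illustrative assumption, not the thesis's Senturion model:

```python
def survival_score(resources, coalition_size, selectorate_size,
                   public_goods_share, religious_approval, w_religion=0.3):
    """Toy version of the revised selectorate calculus: a leader splits
    resources between private payoffs to the winning coalition and public
    goods for the wider population, and (the revision) also draws
    legitimacy from religious stakeholders. Illustrative only."""
    private = resources * (1 - public_goods_share) / coalition_size
    public = resources * public_goods_share / selectorate_size
    loyalty = coalition_size / selectorate_size   # small W/S -> cheap, loyal coalition
    material = private * loyalty + public
    return (1 - w_religion) * material + w_religion * religious_approval

# Same material payoffs, different clerical approval: the revised model
# separates two regimes that the standard model would score identically.
stable = survival_score(100, 10, 1000, 0.2, religious_approval=0.9)
fragile = survival_score(100, 10, 1000, 0.2, religious_approval=0.1)
```

The point of the sketch is the comparison at the end: with identical resource allocations, only the religious-approval term distinguishes the surviving autocracies from the embattled ones.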
246

Simulation, Design and Optimization of Membrane Gas Separation, Chemical Absorption and Hybrid Processes for CO2 Capture

Chowdhury, Mohammad Hassan Murad 14 December 2011 (has links)
Coal-fired power plants are the largest anthropogenic point sources of CO2 emissions worldwide. About 40% of the world's electricity comes from coal; approximately 49% of US electricity in 2008 and 23% of Canada's total electricity generation in 2000 came from coal-fired power plants (World Coal Association; Statistics Canada). Some form of CO2 regulation is likely in the near future, so CO2 capture will probably need to be implemented at many US and Canadian coal-fired power plants at some point. Several technologies are available for CO2 capture from coal-fired power plants. One option is to separate CO2 from the combustion products using a conventional approach such as chemical absorption/stripping with amine solvents, which is commercially available. Another potential alternative, membrane gas separation, involves no moving parts, is compact and modular with a small footprint, and is gaining more and more attention. Both technologies can be retrofitted to existing power plants, but they demand significant energy to capture, purify, and compress the CO2 for transport to sequestration sites. This thesis is a techno-economic evaluation of the two approaches mentioned above, along with a third, hybrid approach. The evaluation is based on recent advances in membrane materials and properties, and on the adoption of systematic design procedures and an optimization approach with the help of a commercial process simulator. Process performance is compared in the AspenPlus process simulation environment using a detailed multicomponent gas separation membrane model and several rigorous rate-based absorption/stripping models. Fifteen single- and multi-stage membrane process configurations, with and without recycle streams, are examined through simulation and design study for industrial-scale post-combustion CO2 capture.
It is found that only two process configurations are capable of satisfying the process specifications, i.e., 85% CO2 recovery and 98% CO2 purity for EOR. Power and membrane area requirements can be reduced by up to 13% and 8%, respectively, by optimizing the base design. A post-optimality sensitivity analysis reveals that changes in factors such as feed flow rate, feed CO2 concentration, permeate vacuum, and compression conditions have a great impact on plant performance, especially on power consumption and product recovery. Two absorption/stripping process configurations (conventional and the Fluor concept) with monoethanolamine (30 wt% MEA) solvent were simulated and designed using the same design basis as above with tray columns. Both rate-based and equilibrium-stage modeling approaches were adopted, and two kinetic models for the reactive absorption/stripping of CO2 with aqueous MEA solution were evaluated. Depending on the options used to account for mass transfer, chemical reactions in the liquid film/phase, film resistance, and film non-ideality, eight absorber/stripper models were categorized and investigated. From a parametric design study, the optimum CO2 lean solvent loading was determined with respect to minimum reboiler energy requirement by varying the lean solvent flow rate in a closed-loop simulation environment for each model. It was found that the success of modeling CO2 capture with MEA depends on how the film discretization is carried out: most of the CO2 reacts in the film, not in the bulk liquid, an insight that could not be obtained with traditional equilibrium-stage modeling. The optimum (minimum) lean solvent loading ranges from 0.29 to 0.40, and the reboiler energy from 3.3 to 5.1 GJ/ton of captured CO2, depending on the model considered.
Between the two process alternatives, the Fluor concept performs better in terms of operating cost (8.5% less energy) and capital cost (50% fewer strippers). The potential of hybrid processes, which combine membrane permeation with conventional gas absorption/stripping using MEA, was also examined for post-combustion CO2 capture in AspenPlus. It was found that the hybrid process may not be a promising alternative for post-combustion CO2 capture in terms of the energy required for capture and compression. A stand-alone membrane gas separation process, by contrast, showed the lowest energy demand for CO2 capture and compression, and could save 15 to 35% energy compared to the MEA capture process, depending on the absorption/stripping model used.
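The membrane side of this comparison rests on a standard single-stage balance. The sketch below is the textbook well-mixed binary (CO2/N2) stage from solution-diffusion theory, with illustrative selectivity and pressure-ratio values; it is not the thesis's detailed AspenPlus membrane model:

```python
import math

def permeate_fraction(x_co2, alpha, phi):
    """CO2 mole fraction on the permeate side of a single well-mixed
    membrane stage (binary CO2/N2 mixture), given retentate fraction
    x_co2, CO2/N2 selectivity alpha, and permeate-to-feed pressure
    ratio phi. Solves y/(1-y) = alpha*(x - phi*y)/((1-x) - phi*(1-y)),
    a quadratic in y."""
    a = phi * (1 - alpha)
    b = (1 - x_co2) + alpha * x_co2 + phi * (alpha - 1)
    c = -alpha * x_co2
    disc = math.sqrt(b * b - 4 * a * c)
    for y in ((-b + disc) / (2 * a), (-b - disc) / (2 * a)):
        if 0.0 <= y <= 1.0:
            return y
    raise ValueError("no physical root")

# Flue-gas-like feed: 13% CO2, selectivity 50, permeate at 10% of feed pressure.
y = permeate_fraction(x_co2=0.13, alpha=50.0, phi=0.1)
```

A single stage enriches the 13% feed to roughly 76% CO2 here, well short of 98% purity, which illustrates why multi-stage configurations with recycle streams are examined in the thesis.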
247

Development of a STEP Feature-Based Intelligent Process Planning System for Prismatic Parts

Amaitik, Saleh Mohammed 01 June 2005 (has links) (PDF)
Computer aided process planning (CAPP) is generally acknowledged as a significant activity for achieving Computer Integrated Manufacturing (CIM). To cope with the dynamic changes in the modern manufacturing environment, awareness of the need for integrated and intelligent CAPP systems has grown, in an attempt to achieve more successful implementations of intelligent manufacturing systems. The main objective of this research has been to develop an integrated intelligent process planning system that helps designers and process planners improve their design and planning in the early stages of the product life cycle. To achieve this goal, the following specific objectives were accomplished: (1) A STEP-based feature modeler was developed for building mechanical parts using high-level 3D solid features as the basic design entities. The modeler is capable of generating high-level product data in XML format according to the ISO 10303-AP224 standard; this file can be integrated into CAPP/CAM systems without a complex feature recognition process. (2) A STEP-based intelligent process planning system for prismatic parts was developed. The system maps the STEP AP224 XML data file to the corresponding machining operations and generates a digital process plan in XML format according to the ISO 14649 (STEP-NC) standard. A hybrid approach combining recent artificial intelligence techniques (neural networks, fuzzy logic, and rule-based reasoning) is used as the inference engine of the developed system. Three neural network models are implemented to select machining operations, cutting tools, and machine tools. Several fuzzy logic models are utilized to select machining parameters for different combinations of machining operation, work material, and tool material. Rule-based functions perform operation sequencing and setup planning. An object-oriented approach has been used in the definition and implementation of the developed system.
This approach offers the advantages of incremental system development and reusability. The developed system is integrated with AutoCAD using the ActiveX automation interface. Several examples are presented to demonstrate and verify the applicability of the developed system. This research contributes to the applicability of advanced artificial intelligence techniques in CAPP systems, and its implementation of STEP technology supports the integration of CAPP with other systems in a CIM environment. The developed system is intended to be an effective concurrent engineering tool that bridges the gap between design and manufacturing.
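The fuzzy-logic parameter selection mentioned above can be sketched as a minimal zero-order Sugeno inference. The membership breakpoints, rules, and output speeds below are invented for illustration and are not the thesis's tuned models:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def cutting_speed(hardness_hb):
    """Zero-order Sugeno fuzzy sketch: softer work material -> higher
    cutting speed. Two rules, defuzzified as a weighted average of
    crisp consequents (speeds in m/min, all values illustrative)."""
    soft = tri(hardness_hb, 0, 100, 250)     # membership in "soft material"
    hard = tri(hardness_hb, 150, 300, 450)   # membership in "hard material"
    rules = [(soft, 200.0), (hard, 60.0)]    # (firing strength, crisp speed)
    num = sum(w * v for w, v in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 120.0       # fallback mid-range speed

v_soft = cutting_speed(120)   # mostly "soft" material
v_hard = cutting_speed(280)   # mostly "hard" material
```

In the thesis this layer sits between neural-network operation/tool selection and the rule-based sequencing; the sketch only shows the inference step itself.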
248

Permeability Estimation in Turbidite Channels Constrained by Well-Testing

TAMIRES PEREIRA PINTO DA SILVA 08 January 2019 (has links)
The main objective of this work is to populate turbidite channels with a petrophysical property such as permeability. These channels are geometrically constrained by turbidite lobes, creating a simulated depositional system. Numerical simulations are used to fit the permeability field to a reference case through an objective function. A conventional finite difference simulator was used to compare the reference data to the simulations, obtaining close results.
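The fit-to-reference workflow can be sketched in miniature: a forward model predicts well-test data from a candidate permeability, and an objective function scores the mismatch. The 1-D Darcy stand-in and all numbers below are illustrative assumptions, not the thesis's finite-difference simulator:

```python
def simulate_pressure_drop(k, q, mu, L, A):
    """Steady 1-D Darcy pressure drop across a channel segment:
    dp = q * mu * L / (k * A). A toy forward model standing in for
    a full reservoir simulator."""
    return q * mu * L / (k * A)

def objective(k, observed_dp, q, mu, L, A):
    """Squared mismatch between simulated and reference (well-test) data."""
    return (simulate_pressure_drop(k, q, mu, L, A) - observed_dp) ** 2

# Reference case: a synthetic "true" permeability generates the observed data.
q, mu, L, A = 1e-4, 1e-3, 100.0, 10.0   # flow rate, viscosity, length, area (SI)
k_true = 5e-13                          # ~0.5 darcy, illustrative
dp_obs = simulate_pressure_drop(k_true, q, mu, L, A)

# Coarse grid search over candidate permeabilities minimizes the objective.
candidates = [1e-13 * m for m in range(1, 11)]
k_best = min(candidates, key=lambda k: objective(k, dp_obs, q, mu, L, A))
```

Replacing the grid search with a gradient-based or ensemble optimizer, and the toy forward model with the finite-difference simulator, recovers the structure of the actual history-matching loop.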
249

Graphical Content Generation

Marechal, Nicolas 07 July 2010 (has links)
The goal of this thesis is the search for new techniques for generating digital content for video game applications. The lack of variety in terrains, objects, and details strongly affects the realism of synthetic landscapes. In this context, a major bottleneck is the modeling of the graphical assets used to create the scenes. To simplify and accelerate this task, we present methods to automatically generate graphical content for creating large landscapes that are both complex and original.
Our first approach allows a vast variety of objects to be created and edited quickly from an initial model provided by a graphic artist, under the constraint of a low-polygon representation. We also present a procedural method for generating varieties of objects. Together, these methods can create varieties of natural forms as well as engineered structures such as roads, bridges, and tunnels that adapt automatically to the relief of a landscape. We propose another method, based on a physical and thermal simulation, to create winter landscapes that evolve over time according to weather conditions. This approach allows us to follow the evolution of the snowpack as well as the thickness of the ice that forms on the surface of a lake.
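The weather-driven snowpack evolution described above can be caricatured with a degree-day model: snow accumulates on freezing days and melts in proportion to positive temperature. This is a toy stand-in for the thesis's physically based thermal simulation, with made-up daily data:

```python
def evolve_snowpack(temps_c, precip_mm, melt_rate=3.0):
    """Daily snowpack depth (mm water equivalent) under a simple
    degree-day model: precipitation accumulates as snow when the air
    is at or below freezing, and the pack melts by melt_rate mm per
    positive degree-day otherwise. Illustrative only."""
    depth, history = 0.0, []
    for t, p in zip(temps_c, precip_mm):
        if t <= 0.0:
            depth += p                                  # snowfall
        else:
            depth = max(0.0, depth - melt_rate * t)     # degree-day melt
        history.append(depth)
    return history

# One illustrative week: a cold snowy spell followed by a thaw.
hist = evolve_snowpack(
    temps_c=[-5, -2, -1, 0, 2, 4, 6],
    precip_mm=[10, 5, 0, 8, 0, 0, 0],
)
```

Driving a per-cell version of this update over a terrain grid, with a similar energy balance for lake-ice thickness, gives a landscape that changes through the season as the abstract describes.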
250

Pulp mill heat and mass balance model: Exploring the benefits and possibilities of process modelling as an applied method in a case study

Mählkvist, Simon, Pontus, Netzell January 2018 (has links)
This thesis focused on the modelling of a pulping process. The purpose was to see whether an accurate model can be created from relatively simple premises and whether its errors can be identified and analysed. To this end, the authors conducted a literature study to identify the current state of the art regarding the chemical pulping process, and examined flow charts and sample data from a case study. Model assumptions were derived from the literature review and the case study. The model is divided into sixteen components; where mixing occurs, lumped conditions are assumed. The model has five validation points, four temperatures and one mass flow rate, which are reported as deviations from the measured values. In conclusion, the model could produce stable results over a narrow time frame, more so if the transition period at the start of the simulation is overlooked. Several new model assumptions are presented with the purpose of increasing accuracy, e.g., accounting for the components' ability to store mass.
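The lumped-mixing assumption mentioned above reduces each mixing component to a flow-weighted balance. A minimal sketch, assuming a shared constant heat capacity and steady state (the stream values are invented, not the thesis's plant data):

```python
def mix_streams(streams):
    """Lumped steady-state mass and energy balance for one mixing
    component: streams are (mass_flow kg/s, temperature C) pairs with a
    common constant heat capacity, so the heat capacity cancels and the
    outlet temperature is the flow-weighted mean. Illustrative sketch."""
    m_out = sum(m for m, _ in streams)
    t_out = sum(m * t for m, t in streams) / m_out
    return m_out, t_out

# E.g. hot stock diluted with cold water ahead of the next component.
m, t = mix_streams([(2.0, 80.0), (1.0, 20.0)])
```

Chaining sixteen such component balances, and comparing the computed temperatures and flow at the five validation points against measurements, gives the structure the abstract describes; adding mass-storage terms would turn each balance into a differential equation.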
