  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
231

Falhas de mercado e redes em políticas públicas: desafios e possibilidades ao Sistema Único de Saúde / Market failures and public policy networks: challenges and possibilities for the Brazilian Unified Health System

Pinheiro Filho, Francisco Percival 22 January 2015
Os princípios e as diretrizes organizativas do SUS, combinados ao modelo estipulado de financiamento da assistência à saúde, conduzem ao surgimento de falhas de mercado, que geram barreiras à apropriada gestão do sistema e, por conseguinte, impedem a concretização dos objetivos do SUS e, potencialmente, acarretam o aprofundamento das iniquidades em saúde na população. As características estruturais e a heterogeneidade de atores no âmbito da rede de serviços do SUS, tendo em vista as configurações diversas assumidas pelas sub-redes regionalmente definidas, cujos gestores apresentam certo grau de autonomia decisória e mantêm interações entre si segundo regras preestabelecidas, impõem dificuldades analíticas que não permitem que as características globais do sistema público de saúde brasileiro sejam apreendidas a partir de uma análise simplista da soma de seus agentes componentes, à medida que sua configuração global é resultado da evolução das interações ocorridas em seu interior. Propõe-se a utilização de métodos quantitativos baseados em dinâmica de agentes inseridos em redes sociais como alternativa ao emprego de métodos quantitativos retrospectivos baseados em estimativas parciais. A modelagem baseada em agentes apresenta significativas vantagens à compreensão da organização, heterogeneidade e dinâmica estrutural da rede de serviços do SUS, assim como possibilita uma busca pela minimização dos efeitos das falhas de mecanismos de mercado no sistema de saúde brasileiro / The principles and organizational guidelines of SUS, combined with the established model of health care financing, lead to the emergence of market failures that create barriers to the proper management of the system, thereby preventing the achievement of SUS's goals and potentially deepening health inequities in the population.
The structural characteristics and heterogeneity of actors within the SUS service network, given the diverse configurations assumed by its regionally defined sub-networks, whose managers have a certain degree of decision-making autonomy and interact with one another according to predetermined rules, impose analytical difficulties: the global characteristics of the Brazilian public health system cannot be grasped from a simplistic analysis of the sum of its component agents, since its overall configuration results from the evolution of the interactions occurring within it. The use of quantitative methods based on the dynamics of agents embedded in social networks is proposed as an alternative to retrospective quantitative methods based on partial estimates. Agent-based modeling offers significant advantages for understanding the organization, heterogeneity, and structural dynamics of the SUS service network, and makes it possible to seek to minimize the effects of market failures in the Brazilian health system.
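The agent-based network approach proposed above can be illustrated with a minimal sketch (this is a hypothetical toy, not the thesis's model; the random network, the averaging rule, and all parameters are assumptions): agents with decision autonomy repeatedly adjust their allocations toward randomly chosen neighbours under a fixed interaction rule, so the global configuration emerges from local interactions rather than from the simple sum of agents.

```python
import random

def simulate_allocations(n_agents=30, steps=2000, seed=1):
    """Toy sketch: managers on a random network repeatedly move their
    resource-allocation level halfway toward a random neighbour's."""
    rng = random.Random(seed)
    # random undirected network among regional managers (hypothetical)
    neighbours = {i: set() for i in range(n_agents)}
    for i in range(n_agents):
        for j in range(i + 1, n_agents):
            if rng.random() < 0.2:
                neighbours[i].add(j)
                neighbours[j].add(i)
    alloc = [rng.random() for _ in range(n_agents)]  # initial allocations
    for _ in range(steps):
        i = rng.randrange(n_agents)
        if neighbours[i]:
            j = rng.choice(sorted(neighbours[i]))
            # predetermined interaction rule: move halfway toward neighbour
            alloc[i] += 0.5 * (alloc[j] - alloc[i])
    return alloc
```

Running the dynamics shrinks the spread of allocations across the network, a simple emergent property no single agent encodes.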
232

Ensino do câncer com o uso de modelos baseados em agentes / Teaching Cancer using Agent-based Models

Santos, Anderson Josué Corrêa de Paula 07 October 2014
No presente trabalho foi desenvolvida uma ferramenta para tornar o ensino do câncer na graduação mais efetivo. Tal ferramenta foi criada utilizando simulações multi-agente na plataforma NetLogo e conceitos gerais do processo de formação do câncer. O modelo que serviu de base para a criação da ferramenta foi o de Hanahan & Weinberg (2011). Inicialmente, para mostrar que a ferramenta é adequada para o ensino do câncer, foram definidos conceitualmente os Sistemas Complexos e o câncer e, em seguida, foi mostrado como estes se relacionam. Para desenvolver o presente trabalho, foram utilizadas pesquisas em dados secundários, entrevistas em profundidade e participação em aulas, palestras e seminários. O resultado desse processo foi uma ferramenta com diversas aplicações capaz de ensinar sobre o câncer através de muita interatividade e experimentação. Foi elaborada uma discussão sobre a problemática do ensino no Brasil e como isso afeta a aceitação de novas metodologias de ensino conforme a que é apresentada neste trabalho. Discutiu-se sobre o ensino do câncer no Brasil e a utilidade de ferramentas da área de sistemas complexos para tal. Para finalizar o trabalho, foram sugeridas algumas formas interessantes de se estender o uso da ferramenta desenvolvida na pesquisa científica, na clínica médica, no melhoramento de exames diagnósticos, entre outras / In this work, a tool was developed to make the teaching of cancer in undergraduate courses more effective. The tool was built using multi-agent simulations on the NetLogo platform together with general concepts of the cancer formation process, and is based on the model of Hanahan & Weinberg (2011). Initially, in order to justify the tool's adequacy for teaching cancer, complex systems and cancer are defined conceptually, and it is then shown how the two relate to each other.
Secondary data research, in-depth interviews, and participation in classes, lectures, and seminars on relevant subjects were used to develop this work. The result is a teaching tool with many applications, able to teach students about cancer through a high degree of interactivity and experimentation. The work discusses the problems of education in Brazil and how they affect the acceptance of new teaching methodologies such as the one presented here, as well as the teaching of cancer in Brazil and the usefulness of tools from the field of complex systems for that purpose. Finally, some promising ways of extending the tool are suggested, including its use in scientific research, in clinical practice, and in the improvement of diagnostic tests.
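A NetLogo-style agent-based cancer model of the kind described above can be sketched in a few lines of Python (an illustrative toy, not the thesis's tool; the grid size, division probability, and single-clone setup are assumptions): one initiating cell divides into free neighbouring grid sites, producing clonal expansion.

```python
import random

def grow_tumour(steps=30, grid=25, p_divide=0.3, seed=7):
    """Minimal grid-based sketch of clonal expansion: a single seeded
    cell divides into empty von Neumann neighbours with probability
    p_divide at each step."""
    rng = random.Random(seed)
    occupied = {(grid // 2, grid // 2)}  # single initiating cell
    for _ in range(steps):
        for (x, y) in list(occupied):    # snapshot of this step's cells
            if rng.random() < p_divide:
                free = [(x + dx, y + dy)
                        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                        if 0 <= x + dx < grid and 0 <= y + dy < grid
                        and (x + dx, y + dy) not in occupied]
                if free:
                    occupied.add(rng.choice(free))
    return occupied
```

Hallmark-style extensions (mutation, apoptosis, angiogenesis) would each add a rule to the per-cell update, which is exactly the kind of incremental experimentation such tools aim to teach.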
233

Economic networks: communication, cooperation & complexity

Angus, Simon Douglas, Economics, Australian School of Business, UNSW January 2007
This thesis is concerned with the analysis of economic network formation. There are three novel sections to this thesis (Chapters 5, 6 and 8). In the first, the non-cooperative communication network formation model of Bala and Goyal (2000) (BG) is re-assessed under conditions of no inertia. It is found that the Strict Nash circle (or wheel) structure is still the equilibrium outcome for n = 3 under no inertia. However, a counter-example for n = 4 shows that with no inertia infinite cycles are possible, and hence the system does not converge. In fact, cycles are found to quickly dominate outcomes for n > 4 and further numerical simulations of conditions approximating no inertia (probability of updating > 0.8 to 1) indicate that cycles account for a dramatic slowing of convergence times. These results, together with the experimental evidence of Falk and Kosfeld (2003) (FK) motivate the second contribution of this thesis. A novel artificial agent model is constructed that allows for a vast strategy space (including the Best Response) and permits agents to learn from each other as was indicated by the FK results. After calibration, this model replicates many of the FK experimental results and finds that an externality exploiting ratio of benefits and costs (rather than the difference) combined with a simple altruism score is a good proxy for the human objective function. Furthermore, the inequity aversion results of FK are found to arise as an emergent property of the system. The third novel section of this thesis turns to the nature of network formation in a trust-based context. A modified Iterated Prisoners' Dilemma (IPD) model is developed which enables agents to play an additional and costly network forming action. Initially, canonical analytical results are obtained despite this modification under uniform (non-local) interactions. However, as agent network decisions are 'turned on' persistent cooperation is observed. 
Furthermore, in contrast to the vast majority of non-local, or static network models in the literature, it is found that a-periodic, complex dynamics result for the system in the long-run. Subsequent analysis of this regime indicates that the network dynamics have fingerprints of self-organized criticality (SOC). Whilst evidence for SOC is found in many physical systems, such dynamics have been seldom, if ever, reported in the strategic interaction literature.
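The modified IPD with a costly, additional network-forming action can be illustrated with a hedged sketch (the payoff table is the standard T=5, R=3, P=1, S=0 dilemma; the sponsor-pays link-cost rule and the agent set are assumptions, not the thesis's exact specification):

```python
def pd_payoff(a, b):
    """Standard prisoner's dilemma payoffs (T=5, R=3, P=1, S=0)."""
    table = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
             ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}
    return table[(a, b)]

def round_payoffs(links, actions, link_cost=0.5):
    """One round in which agents play the PD only along maintained links
    and the sponsoring agent pays link_cost per link (hypothetical rule)."""
    score = {i: 0.0 for i in actions}
    for i, j in links:
        pi, pj = pd_payoff(actions[i], actions[j])
        score[i] += pi - link_cost  # sponsor i pays for the link
        score[j] += pj
    return score

# three agents: 0 and 1 cooperate, 2 defects; agent 0 sponsors both links
links = [(0, 1), (0, 2)]
actions = {0: 'C', 1: 'C', 2: 'D'}
```

Because sponsoring a link to a defector yields 0 - 0.5 here, a payoff-sensitive agent 0 would sever the link to agent 2 in later rounds; it is this interplay between link decisions and play that sustains cooperation in the model.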
234

Physiologically based pharmacokinetic modeling in risk assessment - Development of Bayesian population methods

Jonsson, Fredrik January 2001
In the risk assessment of hazardous chemicals, variability in susceptibility across the population is an important aspect. The health hazard of a pollutant is related to the internal exposure to the chemical, i.e., the target dose, rather than the external exposure. The target dose may be calculated by physiologically based pharmacokinetic (PBPK) modeling. Furthermore, variability in target dose may be estimated by introducing variability in the physiological, anatomical, and biochemical parameters of the model. Data on these toxicokinetic model parameters may be found in the scientific literature. Since the early seventies, a large number of experimental inhalation studies of the kinetics of several volatiles in human volunteers have been performed at the National Institute for Working Life in Solna. To this day, only very limited analyses of these extensive data have been performed. A Bayesian analysis makes it possible to merge a priori knowledge from the literature with the information in experimental data. If combined with population PBPK modeling, the Bayesian approach may yield posterior estimates of the toxicokinetic parameters for each subject, as well as for the population. One way of producing these estimates is by so-called Markov chain Monte Carlo (MCMC) simulation. The aim of the thesis was to apply the MCMC technique to previously published experimental data. Another objective was to assess the reliability of PBPK models in general by combining the extensive data with Bayesian population techniques. The population kinetics of methyl chloride, dichloromethane, toluene and styrene were assessed. The calibrated model for dichloromethane was used to predict cancer risk in a simulated Swedish population. In some cases, the respiratory uptake of volatiles was found to be lower than predicted from reference values on alveolar ventilation. The perfusion of fat tissue was found to be a complex process that needs special attention in PBPK modeling.
These results provide a significant contribution to the field of PBPK modeling of hazardous chemicals. Appropriate statistical treatment of uncertainty and variability may increase confidence in model results and ultimately contribute to an improved scientific basis for the estimation of occupational health risks.
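The Bayesian approach described above rests on sampling posterior parameter distributions by MCMC. A minimal Metropolis sketch, assuming a toy one-compartment model C(t) = C0·exp(-k·t) rather than a full PBPK model, with a Gaussian likelihood (assumed sigma = 0.5) and a vague 1/k prior:

```python
import math, random

def metropolis_k(times, conc, c0=10.0, n_iter=3000, seed=3):
    """Toy Metropolis sampler for the elimination rate k of a
    one-compartment model C(t) = c0 * exp(-k t)."""
    rng = random.Random(seed)

    def log_post(k):
        if k <= 0:
            return -math.inf
        sse = sum((c - c0 * math.exp(-k * t)) ** 2
                  for t, c in zip(times, conc))
        return -0.5 * sse / 0.25 - math.log(k)  # sigma^2 = 0.25, 1/k prior

    k = 0.5                      # deliberately poor starting value
    lp = log_post(k)
    samples = []
    for _ in range(n_iter):
        prop = k + rng.gauss(0.0, 0.05)      # random-walk proposal
        lp_prop = log_post(prop)
        if lp_prop - lp > math.log(rng.random()):
            k, lp = prop, lp_prop            # accept
        samples.append(k)
    return samples

# synthetic concentration data generated with true k = 0.3
times = [0.5, 1, 2, 4, 8]
conc = [10.0 * math.exp(-0.3 * t) for t in times]
```

After discarding a burn-in, the chain concentrates near the data-generating rate; a population version would run linked chains with subject-level and population-level parameters.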
235

PDEModelica – A High-Level Language for Modeling with Partial Differential Equations

Saldamli, Levon January 2006
This thesis describes work on a new high-level mathematical modeling language and framework called PDEModelica for modeling with partial differential equations. It is an extension to the current Modelica modeling language for object-oriented, equation-based modeling based on differential and algebraic equations. The language extensions and the framework presented in this thesis are consistent with the concepts of Modelica while adding support for partial differential equations and space-distributed variables called fields. The specification of a partial differential equation problem consists of three parts: 1) the description of the definition domain, i.e., the geometric region where the equations are defined, 2) the initial and boundary conditions, and 3) the actual equations. The known and unknown distributed variables in the equation are represented by field variables in PDEModelica. Domains are defined by a geometric description of their boundaries. Equations may use the Modelica derivative operator extended with support for partial derivatives, or vector differential operators such as divergence and gradient, which can be defined for general curvilinear coordinates based on coordinate system definitions. The PDEModelica system also allows the partial differential equation models to be defined using a coefficient-based approach, where PDE models from a library are instantiated with different parameter values. Such a library contains both continuous and discrete representations of the PDE model. The user can instantiate the continuous parts and define the parameters, and the discrete parts containing the equations are automatically instantiated and used to solve the PDE problem numerically. Compared to most earlier work in the area of mathematical modeling languages supporting PDEs, this work provides a modern object-oriented component-based approach to modeling with PDEs, including general support for hierarchical modeling, and for general, complex geometries. 
It is possible to separate the geometry definition from the model definition, which allows geometries to be defined separately, collected into libraries, and reused in new models. It is also possible to separate the analytical continuous model description from the chosen discretization and numerical solution methods. This allows the model description to be reused, independent of different numerical solution approaches. The PDEModelica field concept allows general declaration of spatially distributed variables. Compared to most other approaches, the field concept described in this work affords a clearer abstraction and defines a new type of variable. Arrays of such field variables can be defined in the same way as arrays of regular, scalar variables. The PDEModelica language supports a clear, mathematical syntax that can be used both for equations referring to fields and explicit domain specifications, used for example to specify boundary conditions. Hierarchical modeling and decomposition is integrated with a general connection concept, which allows connections between ODE/DAE and PDE based models. The implementation of a Modelica library needed for PDEModelica and a prototype implementation of field variables are also described in the thesis. The PDEModelica library contains internal and external solver implementations, and uses external software for mesh generation, requisite for numerical solution of the PDEs. Finally, some examples modeled with PDEModelica and solved using these implementations are presented.
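The three-part PDE problem specification described above (domain, initial and boundary conditions, equations) can be mirrored in a plain finite-difference sketch; this is ordinary Python, not PDEModelica syntax, and the 1D heat equation, grid, and time step are illustrative assumptions chosen to satisfy the explicit-scheme stability limit:

```python
def solve_heat_1d(n=50, steps=500, alpha=1.0, dt=1e-4):
    """Explicit finite-difference sketch of u_t = alpha * u_xx on [0, 1],
    organised to mirror the three-part split: (1) domain, (2) initial
    and boundary conditions, (3) the equation itself."""
    dx = 1.0 / (n - 1)                       # (1) discretised domain [0, 1]
    u = [0.0] * n
    u[n // 2] = 1.0                          # (2) initial condition: spike
    for _ in range(steps):
        left, right = 0.0, 0.0               # (2) Dirichlet boundaries
        new = [left] + [
            # (3) the equation: forward Euler on the second difference
            u[i] + alpha * dt / dx ** 2 * (u[i + 1] - 2 * u[i] + u[i - 1])
            for i in range(1, n - 1)
        ] + [right]
        u = new
    return u
```

In PDEModelica the same three ingredients would be declared (domain, conditions, field equation) while discretisation and solution are left to the framework; the point of the sketch is only to show how cleanly the three parts separate.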
236

Simulation, Design and Optimization of Membrane Gas Separation, Chemical Absorption and Hybrid Processes for CO2 Capture

Chowdhury, Mohammad Hassan Murad 14 December 2011
Coal-fired power plants are the largest anthropogenic point sources of CO2 emissions worldwide. About 40% of the world's electricity comes from coal; approximately 49% of US electricity in 2008 and 23% of Canada's total electricity generation in 2000 came from coal-fired power plants (World Coal Association; Statistics Canada). Some form of CO2 regulation is likely in the near future, so CO2 capture will probably need to be implemented at many US and Canadian coal-fired power plants at some point. Several technologies are available for CO2 capture from coal-fired power plants. One option is to separate CO2 from the combustion products using a conventional approach such as chemical absorption/stripping with amine solvents, which is commercially available. Another potential alternative, membrane gas separation, involves no moving parts, is compact and modular with a small footprint, and is gaining increasing attention. Both technologies can be retrofitted to existing power plants, but both demand significant energy to capture, purify, and compress the CO2 for transport to the sequestration sites. This thesis is a techno-economic evaluation of the two approaches mentioned above, along with a third known as the hybrid approach. The evaluation is based on recent advances in membrane materials and properties, and on the adoption of systematic design procedures and an optimization approach with the help of a commercial process simulator. The process performance comparison is developed in the AspenPlus process simulation environment with a detailed multicomponent gas separation membrane model and several rigorous rate-based absorption/stripping models. Fifteen single- and multi-stage membrane process configurations, with or without recycle streams, are examined through simulation and design studies for industrial-scale post-combustion CO2 capture.
It is found that only two process configurations are capable of satisfying the process specifications, i.e., 85% CO2 recovery and 98% CO2 purity for EOR. Power and membrane area requirements can be reduced by up to 13% and 8%, respectively, by optimizing the base design. A post-optimality sensitivity analysis reveals that changes in factors such as feed flow rate, feed CO2 concentration, permeate vacuum, and compression conditions have a strong impact on plant performance, especially on power consumption and product recovery. Two absorption/stripping process configurations (conventional and the Fluor concept) with monoethanolamine (30 wt% MEA) solvent were simulated and designed using the same design basis as above, with tray columns. Both rate-based and equilibrium-stage modeling approaches were adopted, and two kinetic models for the reactive absorption/stripping of CO2 with aqueous MEA solution were evaluated. Depending on the options used to account for mass transfer, chemical reactions in the liquid film/phase, film resistance, and film non-ideality, eight different absorber/stripper models were categorized and investigated. From a parametric design study, the optimum CO2 lean solvent loading was determined with respect to the minimum reboiler energy requirement by varying the lean solvent flow rate in a closed-loop simulation environment for each model. It emerged that the success of modeling CO2 capture with MEA depends on how the film discretization is carried out: most of the CO2 reacts in the film, not in the bulk liquid, an insight that cannot be obtained with traditional equilibrium-stage modeling. The optimum (minimum) lean solvent loading ranges from 0.29 to 0.40 and the reboiler energy from 3.3 to 5.1 GJ/ton of captured CO2, depending on the model considered.
Between the two process alternatives, the Fluor concept performs better in terms of plant operating cost (8.5% less energy) and capital cost (half the number of strippers). The potential of hybrid processes combining membrane permeation with conventional gas absorption/stripping using MEA was also examined for post-combustion CO2 capture in AspenPlus®. The hybrid process may not be a promising alternative for post-combustion CO2 capture in terms of the energy required for capture and compression. A stand-alone membrane gas separation process, on the other hand, showed the lowest energy demand for CO2 capture and compression and could save 15 to 35% of the energy compared with the MEA capture process, depending on the absorption/stripping model used.
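The single-stage membrane balance underlying such simulations can be sketched with the classic perfectly mixed binary model (a textbook idealisation, not the detailed multicomponent AspenPlus model of the thesis; the CO2/N2 selectivity and pressure-ratio values below are assumptions): the permeate mole fraction y satisfies y/(1-y) = alpha·(x - phi·y)/((1-x) - phi·(1-y)), where x is the feed fraction and phi the permeate-to-feed pressure ratio.

```python
def permeate_fraction(x_feed, alpha, phi, iters=80):
    """Solve the perfectly mixed binary membrane-stage relation for the
    permeate mole fraction y by bisection on [0, 1]."""
    def f(y):
        # residual of y*((1-x) - phi*(1-y)) = alpha*(1-y)*(x - phi*y)
        return (y * ((1.0 - x_feed) - phi * (1.0 - y))
                - alpha * (1.0 - y) * (x_feed - phi * y))

    lo, hi = 0.0, 1.0            # f(0) < 0 and f(1) > 0 bracket the root
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For a 13 mol% CO2 flue gas, higher selectivity enriches the permeate, but even a very selective membrane cannot reach 98% purity in one stage at this pressure ratio, which is why multi-stage configurations with recycle are needed.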
237

Hybrid modeling and analysis of multiscale biochemical reaction networks

Wu, Jialiang 23 December 2011
This dissertation addresses the development of integrative modeling strategies capable of combining deterministic and stochastic, discrete and continuous, as well as multi-scale features. The first set of studies combines the purely deterministic modeling methodology of Biochemical Systems Theory (BST) with a hybrid approach using Functional Petri Nets, which permits accounting for discrete features or events, stochasticity, and different types of delays. The efficiency and significance of this combination are demonstrated with several examples, including generic biochemical networks with feedback controls, gene regulatory modules, and dopamine-based neuronal signal transduction. A study expanding the use of stochasticity toward systems with small numbers of molecules proposes a rather general strategy for converting a deterministic process model into a corresponding stochastic model. The strategy characterizes the mathematical connection between a stochastic framework and its deterministic analogue: the deterministic framework is assumed to be a generalized mass action system, and the stochastic analogue is formulated as a chemical master equation. The analysis identifies situations where internal noise affecting the system needs to be taken into account for a valid conversion from a deterministic to a stochastic model. The conversion procedure is illustrated with several representative examples, including elemental reactions, Michaelis-Menten enzyme kinetics, a genetic regulatory motif, and stochastic focusing. The last study establishes two novel particle-based methods for simulating biochemical diffusion-reaction systems within crowded environments. These simulation methods effectively simulate and quantify crowding effects, including reduced reaction volumes, reduced diffusion rates, and reduced accessibility between potentially reacting particles.
The proposed methods account for fractal-like kinetics, where the reaction rate depends on the local concentrations of the molecules undergoing the reaction. Rooted in an agent-based modeling framework, this aspect of the methods offers the capacity to address sophisticated intracellular spatial effects, such as macromolecular crowding, active transport along cytoskeletal structures, and reactions on heterogeneous surfaces, as well as in porous media. Taken together, the work in this dissertation develops theories and simulation methods that extend the deterministic, continuous framework of Biochemical Systems Theory to account for delays, stochasticity, discrete features or events, and spatial effects in the modeling of biological systems, which are hybrid and multiscale by nature.
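The deterministic-to-stochastic conversion described above is commonly exercised via Gillespie's stochastic simulation algorithm, which samples exact trajectories of the chemical master equation. A minimal sketch for a birth-death process, the stochastic counterpart of the deterministic ODE dn/dt = k_birth - k_death·n (the rate constants here are illustrative, not from the dissertation):

```python
import random

def gillespie_birth_death(k_birth=5.0, k_death=0.5, t_end=200.0, seed=11):
    """Gillespie SSA for 0 -> A (propensity k_birth) and A -> 0
    (propensity k_death * n); returns the molecule count at t_end."""
    rng = random.Random(seed)
    t, n = 0.0, 0
    while True:
        a_birth = k_birth
        a_death = k_death * n
        a_total = a_birth + a_death
        t += rng.expovariate(a_total)        # time to next reaction
        if t >= t_end:
            return n
        if rng.random() * a_total < a_birth:  # pick reaction by propensity
            n += 1
        else:
            n -= 1
```

At stationarity the count is Poisson-distributed with mean k_birth/k_death = 10, matching the deterministic steady state, while the fluctuations around it are exactly the internal noise the conversion analysis must account for.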
238

Enabling methods for the design and optimization of detection architectures

Payan, Alexia Paule Marie-Renee 08 April 2013
The surveillance of geographic borders and critical infrastructures using limited sensor capability has always been a challenging task in many homeland security applications. While geographic borders may be very long and may run through isolated areas, critical assets may be large and numerous and may be located in highly populated areas. As a result, it is virtually impossible to secure each and every mile of border around the country, and each and every critical infrastructure inside the country. Most often, a compromise must be made between the percentage of the border or critical assets covered by surveillance systems and the induced cost. Although threats to homeland security may take many forms, those involving illegal penetration of the air, land, and maritime domains under the cover of day-to-day activities have been identified as being of particular interest. For instance, the proliferation of drug smuggling, illegal immigration, international organized crime, resource exploitation, and, more recently, modern piracy requires strengthened land-border and maritime awareness in increasingly complex and challenging national security environments. The complexity and challenges associated with this mission and with the protection of the homeland explain why a methodology enabling the design and optimization of distributed detection system architectures, able to provide accurate scanning of the air, land, and maritime domains in a specific geographic and climatic environment, is a central concern for the defense and protection community. This thesis proposes a methodology aimed at addressing the aforementioned gaps and challenges. In particular, the methodology reformulates the problem in clear terms so as to facilitate the subsequent modeling and simulation of potential operational scenarios.
The needs and challenges involved in the proposed study are investigated and a detailed description of a multidisciplinary strategy for the design and optimization of detection architectures in terms of detection performance and cost is provided. This implies the creation of a framework for the modeling and simulation of notional scenarios, as well as the development of improved methods for accurate optimization of detection architectures. More precisely, the present thesis describes a new approach to determining detection architectures able to provide effective coverage of a given geographical environment at a minimum cost, by optimizing the appropriate number, types, and locations of surveillance and detection systems. The objective of the optimization is twofold. First, given the topography of the terrain under study, several promising locations are determined for each sensor system based on the percentage of terrain it is covering. Second, architectures of sensor systems able to effectively cover large percentages of the terrain at minimal costs are determined by optimizing the number, types and locations of each detection system in the architecture. To do so, a modified Genetic Algorithm and a modified Particle Swarm Optimization are investigated and their ability to provide consistent results is compared. Ultimately, the modified Particle Swarm Optimization algorithm is used to obtain a Pareto frontier of detection architectures able to satisfy varying customer preferences on coverage performance and related cost.
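The particle swarm approach used above for placing detection systems can be sketched in miniature (a toy single-sensor version with hypothetical target points, coverage radius, and PSO coefficients, far simpler than the thesis's multi-sensor, multi-objective architecture optimization):

```python
import random

def pso_place_sensor(points, radius=2.5, n_particles=20, iters=60, seed=5):
    """Toy particle swarm: find one sensor location (x, y) in a 10x10 area
    that covers as many target points as possible within `radius`."""
    rng = random.Random(seed)

    def covered(pos):
        x, y = pos
        return sum((x - px) ** 2 + (y - py) ** 2 <= radius ** 2
                   for px, py in points)

    # initialise particles, velocities, and personal/global bests
    parts = [[rng.uniform(0, 10), rng.uniform(0, 10)]
             for _ in range(n_particles)]
    vels = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in parts]
    gbest = max(pbest, key=covered)
    for _ in range(iters):
        for i, p in enumerate(parts):
            for d in range(2):
                # inertia + cognitive + social pull (standard PSO update)
                vels[i][d] = (0.7 * vels[i][d]
                              + 1.5 * rng.random() * (pbest[i][d] - p[d])
                              + 1.5 * rng.random() * (gbest[d] - p[d]))
                p[d] += vels[i][d]
            if covered(p) > covered(pbest[i]):
                pbest[i] = p[:]
        gbest = max(pbest, key=covered)
    return gbest, covered(gbest)

# two hypothetical clusters of targets; one sensor can cover at most one
points = [(1, 1), (1.5, 1.2), (2, 2), (8, 8), (8.5, 8.2)]
```

Extending the fitness to penalise sensor cost and searching over the number and types of sensors turns this into the architecture-level trade between coverage and cost that the thesis resolves as a Pareto frontier.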
239

A Methodology for Capability-Based Technology Evaluation for Systems-of-Systems

Biltgen, Patrick Thomas 26 March 2007
Post-Cold War military conflicts have highlighted the need for a flexible, agile joint force responsive to emerging crises around the globe. The 2005 Joint Capabilities Integration and Development System (JCIDS) acquisition policy document mandates a shift away from stove-piped threat-based acquisition to a capability-based model focused on the multiple ways and means of achieving an effect. This shift requires a greater emphasis on scenarios, tactics, and operational concepts during the conceptual phase of design, and structured processes for technology evaluation to support this transition are lacking. In this work, a methodology for quantitative technology evaluation for systems-of-systems is defined. Physics-based models of an aircraft system are exercised within a hierarchical, object-oriented constructive simulation to quantify technology potential in the context of a relevant scenario. A major technical challenge to this approach is the lack of resources to support real-time human-in-the-loop tactical decision making and technology analysis. Intelligent agents are therefore used to create a "Meta-General" capable of forecasting strategic and tactical decisions based on technology inputs. To demonstrate the synergy between new technologies and tactics, surrogate models are utilized to provide intelligence to individual agents within the framework and develop a set of tactics that appropriately exploit new technologies. To address the long run-times associated with constructive military simulations, neural network surrogate models are implemented around the forecasting environment to enable rapid trade studies. Probabilistic techniques are used to quantify uncertainty and richly populate the design space with technology-infused alternatives.
Since a large amount of data is produced in the analysis of systems-of-systems, dynamic, interactive visualization techniques are used to enable "what-if" games on assumptions, systems, technologies, tactics, and evolving threats. The methodology developed in this dissertation is applied to a notional Long Range Strike air vehicle and system architecture in the context of quantitative technology evaluation for the United States Air Force.
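The surrogate-modeling idea above, replacing expensive simulation runs with a cheap fitted response for rapid trade studies, can be illustrated with a least-squares quadratic fit in place of a neural network (the sample points and response values are hypothetical stand-ins for simulation output):

```python
def fit_quadratic(xs, ys):
    """Least-squares quadratic surrogate y ~ a + b*x + c*x^2, solved via
    the 3x3 normal equations with Gaussian elimination (no numpy)."""
    # normal-equation matrix sum(x^(i+j)) and right-hand side sum(y*x^i)
    A = [[sum(x ** (i + j) for x in xs) for j in range(3)]
         for i in range(3)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]
    for col in range(3):                      # forward elimination
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                       # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, 3))) / A[i][i]
    return coef  # [a, b, c]

# "expensive simulation" stand-in sampled at five design points
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [2.0 + 3.0 * x - 1.0 * x * x for x in xs]
```

Once fitted, the surrogate is evaluated in microseconds, which is what makes the interactive "what-if" exploration over technologies and tactics tractable; a neural network plays the same role when the response is high-dimensional and nonlinear.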
240

MobIS 2010 - Modellierung betrieblicher Informationssysteme, Modellgestütztes Management

10 December 2010
This volume contains contributions from the refereed “MobIS 2010” main program and selected papers from its tracks. The conference on information systems modeling was held in Dresden, September 15-17, 2010. The guiding theme of MobIS 2010 focused on modeling topics between model-based management and component and service engineering.
