  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
171

Communicative Competence: Computational Simulation Approach to Public Emergency Management

January 2012 (has links)
abstract: Public risk communication (i.e., public emergency warning) is an integral component of public emergency management. Its effectiveness largely depends on the extent to which it elicits appropriate public response and thereby minimizes losses from an emergency. While extensive studies have investigated individuals' responses to emergency risk information, the emergency management literature has been largely silent on whether and how emergency impacts can be mitigated through the effective use of information transmission channels for public risk communication. This dissertation addresses this question in the specific research context of the 2009 H1N1 influenza outbreak in Arizona. Methodologically, a prototype agent-based model is developed to examine the research question. Along with the disease spread dynamics, the model incorporates individual decision-making and response to emergency risk information. This simulation framework synthesizes knowledge from complexity theory, public emergency management, epidemiology, social network and social influence theory, and both quantitative and qualitative data from previous studies. It allows testing how emergency risk information should be issued to the public to bring about desirable social outcomes such as mitigated pandemic impacts. Simulation results generate several insightful propositions. First, in the research context, emergency managers can reduce pandemic impacts by increasing the percentage of the state population who use national TV to receive pandemic information up to 50%; further increases beyond 50% bring only marginal additional mitigation. Second, particular attention is needed when emergency managers attempt to increase the percentage of the state population who believe in the importance of information from local or national TV, or the frequency with which national TV is used to send pandemic information: such measures may reduce the pandemic impact in one dimension while increasing it in another. Third, no changes need to be made to the percentage of the state population who use local TV or radio to receive pandemic information, or to the frequency with which either channel is used for public risk communication. This dissertation sheds light on the underlying dynamics of human decision-making during an emergency. It also contributes to a better understanding of information exchange and communication dynamics during a public emergency, and to improving the effectiveness of public emergency management practices in a dynamic environment. / Dissertation/Thesis / Ph.D. Public Administration 2012
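The channel-based mitigation logic described above can be sketched as a toy agent-based simulation. Everything below — population size, contact rates, infection and recovery probabilities, and the assumption that a TV warning cuts an agent's daily contacts — is an illustrative stand-in for the dissertation's calibrated model, not a reproduction of it.

```python
import random

def simulate(pop=1000, days=120, tv_share=0.5, seed=1):
    """Toy SIR-style simulation: agents who received a TV warning
    reduce their daily contacts. All parameters are illustrative."""
    rng = random.Random(seed)
    state = ['S'] * pop                 # S: susceptible, I: infected, R: recovered
    informed = [rng.random() < tv_share for _ in range(pop)]
    state[0] = 'I'                      # seed one infection
    for _ in range(days):
        for i in [k for k, s in enumerate(state) if s == 'I']:
            contacts = 2 if informed[i] else 5    # warning reduces contacts
            for _ in range(contacts):
                j = rng.randrange(pop)
                if state[j] == 'S' and rng.random() < 0.05:
                    state[j] = 'I'
            if rng.random() < 0.2:      # daily recovery probability
                state[i] = 'R'
    return sum(s != 'S' for s in state) # total ever infected
```

Sweeping `tv_share` from 0 to 1 and comparing attack sizes reproduces the flavor of the "diminishing returns beyond 50%" proposition, though of course not its calibrated magnitudes.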
172

A Tour Level Stop Scheduling Framework and A Vehicle Type Choice Model System for Activity Based Travel Forecasting

January 2014 (has links)
abstract: This dissertation research contributes to the advancement of activity-based travel forecasting models along two lines of inquiry. First, the dissertation aims to introduce a continuous-time representation of activity participation into the tour-based model systems used in practice. Activity-based travel demand forecasting model systems in practice today are largely tour-based systems that simulate individual daily activity-travel patterns through the prediction of day-level and tour-level activity agendas. These tour-level activity-based models adopt a discrete time representation of activities and sequence the activities within tours using rule-based heuristics. An alternate stream of activity-based model systems, mostly confined to the research arena, comprises activity scheduling systems that adopt an evolutionary continuous-time approach to model activity participation subject to time-space prism constraints. In this research, a tour characterization framework capable of simulating and sequencing activities in tours along the continuous time dimension is developed and implemented using readily available travel survey data. The proposed framework includes components for modeling the multitude of secondary activities (stops) undertaken as part of the tour, the time allocated to the various activities in a tour, and the sequence in which the activities are pursued. Second, the dissertation focuses on the implementation of a vehicle fleet composition model component that can be used not only to simulate the mix of vehicle types owned by households but also to identify the specific vehicle that will be used on a given tour. Virtually all activity-based models in practice model only the choice of mode, without due consideration of the type of vehicle used on a tour. In this research effort, a comprehensive vehicle fleet composition model system is developed and implemented.
In addition, a primary driver allocation model and a tour-level vehicle type choice model are developed and estimated with a view to advancing the ability to track household vehicle usage through the course of a day within activity-based travel model systems. It is envisioned that these advances will enhance the fidelity of activity-based travel model systems in practice. / Dissertation/Thesis / Doctoral Dissertation Civil and Environmental Engineering 2014
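Tour-level vehicle type choice components of this kind are conventionally specified as discrete choice models; the multinomial logit sketch below uses made-up utilities for three hypothetical vehicle types and is not the dissertation's estimated specification.

```python
import math

def mnl_probabilities(utilities):
    """Multinomial logit: P(k) = exp(V_k) / sum_j exp(V_j)."""
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical systematic utilities for {car, SUV, pickup} on one tour
probs = mnl_probabilities([0.8, 0.3, -0.5])
```

In an activity-based system, the utilities would be functions of household fleet attributes and tour characteristics, estimated from survey data.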
173

Managing Distributed Information: Implications for Energy Infrastructure Co-production

January 2018 (has links)
abstract: The Internet and climate change are two forces that are poised both to cause and to enable changes in how we provide our energy infrastructure. The Internet has catalyzed enormous changes across many sectors by shifting the feedback and organizational structure of systems towards more decentralized users. Today's energy systems require colossal shifts toward a more sustainable future. However, energy systems face enormous socio-technical lock-in and, thus far, have been largely unaffected by these destabilizing forces. More distributed information offers not only the ability to craft new markets, but also the ability to accelerate learning processes that respond to emerging user- or prosumer-centered design needs. These may include values and needs such as local reliability, transparency and accountability, integration into the built environment, and reduction of local pollution challenges. The institutions (rules, norms and strategies) that dominated under the hierarchical infrastructure system of the twentieth century are unlikely to be a good fit if a more distributed infrastructure increases in dominance. As information is produced at more distributed points, it becomes more difficult to coordinate and manage as an interconnected system. This research examines several aspects of these historically dominant infrastructure provisioning strategies to understand the implications of managing more distributed information. The first chapter experimentally examines information search and sharing strategies under different information protection rules. The second and third chapters focus on strategies to model and compare the effects of distributed energy production on shared electricity grid infrastructure. Finally, the fourth chapter dives into the literature on co-production and explores connections between concepts in co-production and modularity (an engineering approach to information encapsulation) using the distributed energy resource regulations for San Diego, CA.
Each of these sections highlights different aspects of how information rules offer a design space to enable a more adaptive, innovative and sustainable energy system that can more easily react to the shocks of the twenty-first century. / Dissertation/Thesis / Doctoral Dissertation Sustainability 2018
174

Procedural reconstruction of buildings: towards large scale automatic 3D modeling of urban environments

Simon, Loïc 25 July 2011 (has links)
This thesis is devoted to 2D and 3D modeling of urban environments using structured representations and grammars.
Our approach introduces a semantic representation for buildings that encodes expected architectural constraints and is able to derive complex instances using fairly simple grammars. Furthermore, we propose two novel inference algorithms to parse images using such grammars. First, a steepest-ascent hill climbing concept is used to derive the grammar rules and the corresponding parameters from a single facade view. It combines the grammar constraints with the expected visual properties of the different architectural elements. Towards addressing more complex scenarios and incorporating 3D information, a second inference strategy based on evolutionary algorithms is adopted to optimize a two-component objective function introducing depth cues. The proposed framework was evaluated qualitatively and quantitatively on a benchmark of annotated facades, demonstrating robustness to challenging situations. Substantial improvement due to the strong grammatical context was shown in comparison to the performance of the same appearance models coupled with purely local priors. Our approach therefore provides powerful techniques in response to the increasing demand for large-scale 3D modeling of real environments through compact, structured and semantic representations, while opening new perspectives for image understanding.
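The steepest-ascent search over grammar parameters can be sketched generically. The toy score below, which prefers particular floor-height and window-spacing values, is an invented stand-in for the real image-based objective combining grammar constraints and appearance terms.

```python
def hill_climb(score, start, step=1, max_iter=100):
    """Steepest-ascent hill climbing over an integer parameter vector:
    evaluate all one-step neighbours, move to the best strictly-improving
    one, and stop at a local optimum."""
    current = list(start)
    for _ in range(max_iter):
        neighbours = []
        for i in range(len(current)):
            for d in (-step, step):
                n = list(current)
                n[i] += d
                neighbours.append(n)
        best = max(neighbours, key=score)
        if score(best) <= score(current):
            return current          # local optimum reached
        current = best
    return current

# Toy objective: prefer a floor height of 30 px and window spacing of 12 px
target = [30, 12]
score = lambda p: -sum((a - b) ** 2 for a, b in zip(p, target))
result = hill_climb(score, [20, 20])
```

In the thesis setting the parameter vector would encode facade split positions and element types, and the score would measure agreement with the image under the grammar's constraints.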
175

An agent-based forest sector modeling approach to analyzing the economic effects of natural disturbances

Schwab, Olaf Sebastian 05 1900 (has links)
This dissertation describes the development of CAMBIUM, an agent-based forest sector model for large-scale strategic analysis. The model is designed as a decision support tool for assessing the effect that changes in forest product demand and resource inventories can have on the structure and economic viability of the forest sector. CAMBIUM complements existing forest sector models by modeling aggregate product supply as an emergent property of individual companies' production decisions and stand-level ecological processes. Modeling the forest products sector as a group of interacting autonomous agents makes it possible to introduce production capacity dynamics and the potential for mill insolvencies as factors in modeling the effects of market- and forest-inventory-based disturbances. This thesis contains four main manuscripts. In the first manuscript I develop and test a dispersal algorithm that projects aggregated forest inventory information onto a lattice grid. This method can be used to generate ecologically and statistically consistent datasets where high-quality spatial inventory data is otherwise unavailable. The second manuscript utilizes this dataset in developing a provincial-level resource dynamics model for assessing the timber supply effects of introducing weevil-resistant spruce. This model employs a stand-level approach to simulating weevil infestation and the associated merchantable volume losses. Provincial-level impacts are determined by simulating harvest activities over a 350-year time horizon. In the third manuscript I shift the focus to interactions between forest companies. I analyze the effects of strategic decisions on sector structure by developing CAMBIUM as an agent-based model of competition and industry structure evolution. The forest sector is modeled as a group of autonomous, interacting agents that evolve and compete within the limitations posed by resource inventories and product demand.
In the final manuscript I calibrate CAMBIUM to current conditions in the British Columbia forest sector. Industry agents compete for roundwood inputs, as well as for profits in finished product markets for pulp, panel products, and lumber. To test the relevance and utility of this model, CAMBIUM is used to quantify the cumulative impacts of a market downturn for forest products and mountain pine beetle induced timber supply fluctuations on the structure of the forest sector. / Forestry, Faculty of / Graduate
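The emergent insolvency mechanism — sector structure arising from individual mills' balance sheets — can be illustrated with a deliberately minimal sketch. The costs, price, and exit rule below are invented for illustration and bear no relation to CAMBIUM's calibration.

```python
def run_market(mills, price, periods=10):
    """Each period, every mill books its margin (price - unit cost);
    mills whose cash falls to zero or below exit the sector."""
    for _ in range(periods):
        for m in mills:
            m['cash'] += price - m['cost']
        mills = [m for m in mills if m['cash'] > 0]   # insolvency exit
    return mills

# Two hypothetical mills facing the same product price
survivors = run_market(
    [{'cost': 80, 'cash': 50}, {'cost': 120, 'cash': 50}], price=100)
```

A demand shock would enter this sketch as a drop in `price`; a timber supply disturbance as a rise in `cost` — the surviving industry structure is then an emergent outcome rather than an input.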
176

Exploring Impacts of Project Overload on Creativity : An Agent-Based Modeling Approach

Motamediyan, Farnaz January 2012 (has links)
Project overload is an unpleasant phenomenon experienced by employees in organizations that try to make the most efficient use of their resources. The AE project inside Volvo CE (VCE) is an Advanced Engineering project that aims to be innovative but suffers from project overload. This research aims to help VCE move towards a creative organizational climate. To this end, the author used an agent-based modeling (ABM) approach to examine the current reality of VCE and the AE project, identifying the opportunities and challenges for reducing the risk of project overload and moving towards innovation. The results of this research allowed the researcher to define the gaps inside the AE project and draw up a list of recommendations. Among the findings: project overload can damage employees' focus and trigger psychological stress reactions, and creative actions are less likely to come from a team under a high level of project overload. On the other hand, motivation towards a suitably challenging goal is more likely to help individuals alleviate the negative aspects of a low level of project overload.
177

Anomaly detection in a ball mill using non-parametric models

López Salazar, Alejandro Hernán January 2013 (has links)
Ingeniero Civil Electricista / The mining industry has been exposed to an increasingly demanding market, which has made it necessary to establish clear strategies for improving production policies in order to meet the challenges it faces. One of the areas that has had to be strengthened is maintenance, where systems that optimize tasks such as fault/anomaly detection and prediction, within predictive maintenance strategies, play an important role. This is where the contribution of this thesis lies: its main objective is to develop a tool for anomaly detection in ball mills, which play an important part in the grinding of particulate material. The study considers a ball mill with a nominal power of 12 MW and a nominal speed of 100 rpm, one of whose main characteristics is its gearless drive, which operates as a large-scale synchronous motor. This has some disadvantages, such as exhibiting faults that other mills do not; consider, for example, the greater risk of its magnetic poles being burned. Precisely this failure mode motivates the work developed in this thesis, which focuses on detecting that anomaly through the generation of process models and the analysis of residuals. Similarity-Based Models (SBM) were used for residual generation. A procedure for this task was implemented using multivariate statistical tools, such as principal component analysis (PCA), obtaining relative estimation errors of 0.88% on average with respect to the measured data set, which is lower than a linear regression model, whose mean relative error was 3.69% higher. For anomaly detection, a system based on error histograms was used, which compares the residual vector obtained from a data set labeled as normal with possible anomalies. It rests on the study of distribution functions (Chi-squared) at a 95% significance level, achieving 100% effectiveness, since every data point labeled as anomalous was detected. This document is intended as a basis for future studies on anomaly detection in other machinery and on the analysis of other failure modes.
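As a rough illustration of the residual-based detection idea — PCA reconstruction residuals computed on data dominated by one healthy operating direction — the sketch below uses synthetic data standing in for the mill measurements; it is not the thesis's SBM or Chi-squared procedure.

```python
import numpy as np

def pca_residuals(train, test, k=1):
    """Fit a k-component PCA on data labeled normal and return the
    reconstruction residual norm for each test point."""
    mu = train.mean(axis=0)
    _, _, vt = np.linalg.svd(train - mu, full_matrices=False)
    basis = vt[:k]                                 # top-k principal directions
    proj = (test - mu) @ basis.T @ basis           # projection onto PCA subspace
    return np.linalg.norm((test - mu) - proj, axis=1)

rng = np.random.default_rng(0)
normal = rng.normal(size=(200, 3)) @ np.diag([3.0, 0.3, 0.3])  # one dominant mode
test = np.vstack([normal[:5], [[0.0, 5.0, 5.0]]])  # last row lies off the main axis
res = pca_residuals(normal, test, k=1)             # anomalous row has the largest residual
```

The thesis thresholds residuals via error histograms and a Chi-squared test at 95% significance; in this toy example any fixed cutoff above the normal residual range would flag the last row.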
178

A Family of Role-Based Languages

Kühn, Thomas 29 August 2017 (has links) (PDF)
Role-based modeling was proposed in 1977 by Charles W. Bachman as a means to model complex and dynamic domains, because roles are able to capture both the context-dependent and the collaborative behavior of objects. Consequently, roles were introduced in various fields of research, ranging from data modeling via conceptual modeling through to programming languages. More importantly, because current software systems are characterized by increased complexity and context-dependence, there is a strong demand for new concepts beyond object-oriented design. Although mainstream modeling languages, e.g., the Entity-Relationship Model and the Unified Modeling Language, are good at capturing a system's structure, they lack ways to model the system's behavior as it dynamically emerges through collaborating objects. In turn, roles are a natural concept for capturing the behavior of participants in a collaboration. Moreover, roles permit the specification of interactions independently of the interacting objects. Similarly, more recent approaches use roles to capture context-dependent properties of objects. The notion of roles can thus help to tame the increased complexity and context-dependence. Despite all that, these years of research have had almost no influence on current software development practice. To make things worse, there is still no common understanding of roles in the research community, and no approach fully incorporates both the context-dependent and the relational nature of roles. In this thesis, I devise a formal model for a family of role-based modeling languages to capture the various notions of roles. Together with a software product line of Role Modeling Editors, this in turn enables the generation of a role-based language family for Role-based Software Infrastructures (RoSI).
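As a loose illustration of the role concept only — not of any of the formal languages the thesis develops — a role can be sketched as a wrapper that adds context-dependent behavior while delegating core state to the object playing it:

```python
class Role:
    """A role wraps a player object and delegates unknown attributes to it,
    so the player keeps its identity while gaining contextual behavior."""
    def __init__(self, player):
        self.player = player
    def __getattr__(self, name):
        return getattr(self.player, name)   # delegate to the core object

class Person:
    def __init__(self, name):
        self.name = name

class Student(Role):
    """Role played by a Person within a university context."""
    def enroll(self, course):
        return f"{self.name} enrolled in {course}"

alice = Person("Alice")
msg = Student(alice).enroll("Logic")
```

This captures the separation between an object and the context-dependent behavior it acquires; the relational nature of roles (a role always exists within a collaboration) is what the thesis's formal language family adds on top of such ad-hoc patterns.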
179

Utilization of phenomena-based modeling in unit operation design

Kulju, T. (Timo) 09 December 2014 (has links)
Abstract In the design and development of unit operations in chemical engineering, experimental testing is often very expensive or even impossible to perform. In such situations, numerical simulation offers a good way to study process characteristics. Typically in chemical engineering, data-based modeling is applied to study the process. This requires many experiments for tuning the model parameters and validating the model. In a phenomena-based approach, the evolution of the system is dictated by fluid and particle transport equations. These equations are independent of the process and can therefore be applied to various systems. However, depending on the system, several aspects have to be taken into account in order to choose the correct model for the problem at hand. In this work, computational fluid dynamics (CFD) and discrete element method (DEM) modeling have been applied to different unit operations in the field of chemical engineering. CFD was applied to preventing sedimentation in a tube heat exchanger, estimating the cooling efficiency of a vertical water jet onto a hot metal plate, and studying the formation of the slag-free open-eye area on the steel ladle. For comparison, DEM was applied to the continuous high-shear granulation of pharmaceutical powder. The different models used in this work are reviewed, and the results are presented from the point of view of model and process development. The grid aspects in CFD simulations and the termination criteria for DEM and CFD simulations are also studied. Based on the results of this work, phenomena-based modeling can be considered an efficient tool for unit operation design. Together with experimental work, different modeling strategies offer a powerful tool for the design and development of unit operations.
180

The Role of Economics in the Philosophy of Science

Baier, Melanie 10 February 2017 (has links) (PDF)
The dissertation turns in particular to the role of economics at the meta-level of argumentation in the philosophy of science. Its aim is to clarify what explanatory power economic instruments can have in the philosophy of science. Since the mid-1990s, a branch of literature has formed under the label Economics of Scientific Knowledge (ESK) that pursues exactly this goal: investigating scientific coordination as an object of inquiry with the various methods and instruments of economics. It is shown that the analytical models of the ESK suffer from several inherent problems which can, in principle, be solved by new methods and instruments. Agent-based modeling (ABM) is identified as a suitable candidate, since it permits a more realistic representation of the actors and an open-ended modeling of their decisions and of the coordination process. Following the analysis of analytical and agent-based models attributable to the ESK, the second part of the dissertation develops an original ABM, Continuous Opinions of Satisficing Agents and Discrete Actions (COSDA), programmed in the multi-agent language NetLogo. In the heuristic ABM COSDA, central philosophical and economic premises identified as problematic in the first part of the work are abandoned. By modeling heterogeneous agent types that interact with one another, equipped with different preferences and behavioral heuristics, a possible micro-specification for the emergence of a macro-phenomenon is generated. The macro-phenomenon, i.e., the different outcomes of the scientific coordination process, can be explained by the self-reinforcing effects of interaction, but not predicted. The micro-specification can be interpreted as a relevant possibility, formulated through a coherent fiction, which, unlike the analytical models of the ESK, does not presuppose a rational decision calculus on the part of the agents.
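COSDA itself is a NetLogo model; as a rough stand-in for its continuous-opinion mechanics, a bounded-confidence update in the style of Deffuant et al. can be sketched as follows. The parameters are illustrative, and no claim is made that this matches COSDA's satisficing heuristics.

```python
import random

def bounded_confidence(n=50, steps=2000, eps=0.2, mu=0.5, seed=3):
    """Random pairwise encounters; agents move their opinions toward
    each other only if they already lie within tolerance eps."""
    rng = random.Random(seed)
    x = [rng.random() for _ in range(n)]      # opinions in [0, 1]
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if abs(x[i] - x[j]) < eps:
            x[i], x[j] = (x[i] + mu * (x[j] - x[i]),
                          x[j] + mu * (x[i] - x[j]))
    return x
```

Small `eps` typically leaves several persistent opinion clusters — a path-dependent macro-outcome of the interaction history — while `eps` near 1 drives the population to consensus, illustrating how a macro-phenomenon can be explained by self-reinforcing interaction yet not predicted from the micro-rules alone.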
