31

Modeling and simulation of photocatalytic degradation of organic components in wastewater

Eckert, Hagen 26 March 2021 (has links)
Organic pollutants are discharged into the water cycle at many stages of our daily lives. Conventional wastewater treatment is ineffective at removing some of them, especially pharmaceuticals. Photocatalytic degradation using catalytic nanosuspensions under ultraviolet irradiation is an efficient method to reduce these organic components in wastewater. While the general concept of photocatalytic water purification is well established, a descriptive and easy-to-use model of the essential degradation processes was missing. Such a model is critical for the systematic comparison of experimental results and supports process optimization. This work presents a modeling approach to simulate the involved kinetic processes based on the Langmuir–Hinshelwood mechanism. The fundamental model is extended to include the formation of intermediate organic components, using either an incremental degradation mechanism or a fragmentation-based mechanism that can account for excess bonds. The simulated concentration evolution of intermediates, as well as the evolution of the total organic carbon, is discussed for different model assumptions about their desorption rates from the photocatalyst surface. The model parameters were estimated by comparison with experimental findings. Basic experiments were performed using the antibiotic ciprofloxacin and the dye methylene blue as organic compounds, and titanium dioxide and zinc oxide as photocatalytic materials. Furthermore, the applicability of the model to more complex systems is demonstrated by the photocatalytic degradation of 14 pharmaceuticals in wastewater treatment plant effluent. Following successful evaluation, the model was implemented in an open-source software package to enable wider adoption and provide a sound foundation for further developments.
Contents: Abstract; Kurzfassung; Symbols; Constants; Abbreviations; 1. Motivation; 2. Introduction; 2.1. Modeling and simulation; 2.2. Heterogeneous photocatalysis; 2.2.1. History; 2.2.2. Semiconductor band structure; 2.2.3. Interface between a semiconductor and a redox electrolyte; 2.3. Photocatalytic material; 2.3.1. Overview; 2.3.2. Titanium dioxide; 2.3.3. Zinc oxide; 2.4. Light sources; 2.4.1. Solar; 2.4.2. Fluorescent tubes and mercury-vapor lamps; 2.4.3. Light-emitting diodes; 2.4.4. Organic light-emitting diodes; 3. Materials and methods; 3.1. Analytic methods; 3.1.1. Nanoparticle characterization; 3.1.2. Ultraviolet-visible absorption spectrometry; 3.1.3. SPE-HPLC-MS/MS; 3.1.4. Non-purgeable organic carbon; 3.2. Experimental investigations; 3.2.1. Model substances; 3.2.2. Adsorption-desorption; 3.2.3. Photocatalytic degradation; 3.2.4. Wastewater treatment plant effluent; 3.3. Modeling approach; 3.3.1. Single organic species model; 3.3.2. Multiple organic species model; 3.4. Model implementation; 3.4.1. Development objectives; 3.4.2. Molecule parameters; 3.4.3. Solving the differential equation system; 3.4.4. Fit to experimental results; 3.4.5. Availability; 4. Results and discussion; 4.1. Nanoparticle properties; 4.2. Adsorption-desorption; 4.3. Photocatalytic degradation; 4.3.1. Single organic species model; 4.3.2. Multi organic species model; 4.4. Wastewater treatment plant effluent; 4.4.1. Influence of effluent; 4.4.2. Degradation of pharmaceuticals in the effluent; 5. Conclusions; Appendix A. Analytical solution; B. Effluent pharmaceuticals concentrations; C. pdom handbook; List of figures; List of tables; Bibliography
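As a point of reference for the kinetics described above, the following is a minimal, hedged sketch of Langmuir–Hinshelwood degradation for a single organic species; it is not taken from the thesis or its software package, and the rate constant k, adsorption constant K, and initial concentration C0 are illustrative assumptions.

```python
# Sketch of single-species Langmuir-Hinshelwood kinetics: dC/dt = -k*K*C / (1 + K*C).
# k, K, and C0 are illustrative placeholders, not values from the thesis.
import numpy as np
from scipy.integrate import solve_ivp

k = 0.05   # surface reaction rate constant, assumed (mmol L^-1 min^-1)
K = 2.0    # adsorption equilibrium constant, assumed (L mmol^-1)
C0 = 0.10  # initial pollutant concentration, assumed (mmol L^-1)

def lh_rate(t, C):
    # Langmuir-Hinshelwood rate law for a single adsorbing species
    return -k * K * C / (1.0 + K * C)

sol = solve_ivp(lh_rate, (0.0, 240.0), [C0], t_eval=np.linspace(0.0, 240.0, 13))
for t, c in zip(sol.t, sol.y[0]):
    print(f"t = {t:5.0f} min   C = {c:.4f} mmol/L")
```

Extending this toward the thesis' multi-species variants would mean adding coupled equations for intermediates that share the adsorption denominator; the open-source package mentioned in the abstract (pdom, per the table of contents) presumably handles that bookkeeping.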
32

HEAVY-DUTY TRUCK PLATOONING ON HILLY TERRAIN: METHODS FOR ASSESSMENT AND IMPROVEMENT

Miles J Droege (11128536) 22 July 2021 (has links)
Class 8 heavy-duty truck platooning has demonstrated significant fuel economy benefits in the literature on routes with road grade less than ±2%, but there is little to no platooning research on routes with road grade greater than ±2%, which make up a significant portion of U.S. highways. Therefore, the effort described in this thesis is aimed at assessing currently available two-truck platoon control strategies as well as developing new strategies to improve platoon performance on hilly terrain. Specifically, the strategies tested in this work include four types of lead truck speed control strategies and two types of platoon transmission shifting strategies. These strategies are tested using two experimentally validated heavy-duty, two-truck platoon simulation approaches, each with its own advantages and disadvantages. The trends observed from these two simulation approaches indicate that the lead truck speed control and transmission shifting strategies have a significant effect on platoon fuel economy and gap control performance when the platoon operates on a hilly terrain route.
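For intuition only, here is a toy two-truck gap-control sketch on a sinusoidal grade; it is not one of the thesis' validated simulation approaches, and the grade profile, controller gains, and acceleration limits are all assumptions.

```python
# Toy platoon sketch: lead truck holds a cruise set-point against road grade,
# follower regulates the inter-truck gap with a proportional controller.
# Grade profile, gains, and limits are illustrative assumptions.
import math

dt = 0.1                                   # time step, s
g = 9.81                                   # gravity, m/s^2
v_lead, v_follow, gap = 25.0, 25.0, 30.0   # speeds (m/s) and gap (m)
v_set, gap_ref, kp_gap = 25.0, 20.0, 0.05  # set speed, desired gap, follower gain
x_lead = 0.0

def grade(x):
    """Assumed road grade (rise/run) as a function of position in metres."""
    return 0.03 * math.sin(x / 500.0)

for step in range(6001):                   # ten minutes of simulated driving
    a_lead = max(min(0.5 * (v_set - v_lead), 0.5), -1.0) - g * grade(x_lead)
    a_follow = kp_gap * (gap - gap_ref) + 0.5 * (v_lead - v_follow)
    v_lead += a_lead * dt
    v_follow += a_follow * dt
    x_lead += v_lead * dt
    gap += (v_lead - v_follow) * dt
    if step % 1200 == 0:
        print(f"t={step*dt:5.0f}s  v_lead={v_lead:5.2f} m/s  gap={gap:5.1f} m")
```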
33

Návrh a implementace modelů efektů v simulačním systému OneSAF Testbed Baseline / Design and Implementation of Effect Model in Simulation System OneSAF Testbed Baseline

Hubeňák, Marian January 2009 (has links)
This project deals with modelling and simulation, their basic characteristics, and their respective branches. The majority of the project is aimed at modelling and simulation in a military environment, describing the benefits in this area while considering different uses of these simulations. I also outline the development of modelling and simulation in the Army of the Czech Republic. The project further deals with constructive simulation and introduces the OneSAF Testbed Baseline simulation system. Finally, I propose concepts of models for this particular simulation system: a model of psychological effects and a model of rubble generation during artillery operations.
34

Constraint-Based Soft Tissue Simulation for Virtual Surgical Training

Tang, W., Wan, Tao Ruan January 2014 (has links)
Most surgical simulators employ a linear elastic model to simulate soft tissue material properties because of its computational efficiency and simplicity. However, soft tissues often have elaborate nonlinear material characteristics. Most prominently, soft tissues are compliant at small strains, but after the initial deformation they become very resistant to further deformation even under large forces. Such material behavior, referred to as nonlinear material incompliance, is computationally expensive and numerically difficult to simulate. This paper presents a constraint-based finite-element algorithm to simulate nonlinear incompliant tissue materials efficiently for interactive simulation applications such as virtual surgery. Firstly, the proposed algorithm models the material stiffness behavior of soft tissues with a set of 3-D strain limit constraints on deformation strain tensors. By enforcing a large number of geometric constraints to achieve the material stiffness, the algorithm reduces the task of solving stiff equations of motion with a general numerical solver to iteratively resolving a set of constraints with a nonlinear Gauss–Seidel iterative process. Secondly, because a Gauss–Seidel method processes constraints individually, a multiresolution hierarchy structure is used to speed up the global convergence of the large constrained system, accelerating the computation significantly and making interactive simulation possible at a high level of detail. Finally, this paper also presents a simple-to-build data acquisition system to validate the simulation results against ex vivo tissue measurements. An interactive virtual reality-based simulation system is also demonstrated.
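The constraint-projection idea can be illustrated with a very small sketch: strain limits enforced edge by edge in Gauss–Seidel fashion on a particle mesh. This is an assumed simplification to 1-D edge strains, not the paper's 3-D strain-tensor constraints or its multiresolution hierarchy.

```python
# Gauss-Seidel strain limiting on edge lengths: each constraint is resolved in
# turn, projecting the edge stretch back into +/- s_max of its rest length.
# Particle positions, connectivity, and the 10% limit are assumptions.
import numpy as np

positions = np.array([[0.0, 0.0], [1.3, 0.0], [2.1, 0.4]])  # toy 2-D particles
edges = [(0, 1), (1, 2)]
rest_len = [1.0, 1.0]
s_max = 0.10                                                 # strain limit

for sweep in range(20):                    # nonlinear Gauss-Seidel sweeps
    for (i, j), l0 in zip(edges, rest_len):
        d = positions[j] - positions[i]
        length = np.linalg.norm(d)
        target = min(max(length, l0 * (1 - s_max)), l0 * (1 + s_max))
        if length > 1e-9 and abs(target - length) > 1e-9:
            corr = 0.5 * (length - target) * d / length
            positions[i] += corr           # symmetric correction of both ends
            positions[j] -= corr

print(np.round(positions, 3))
```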
35

Distributed Situation Awareness Framework to Assess and Design Complex Systems

Alhaider, Abdulrahman Abdulqader 20 January 2023 (has links)
Communication and coordination in complex sociotechnical systems require continuous assessment of their artefacts and how those artefacts are utilized to improve system performance. Situation Awareness (SA) is a fundamental concept in designing and understanding interactions between human and non-human agents (i.e., information systems) that impact system performance. Interaction efficiency is partly determined by the quality of information, or SA, distributed across agents to ensure accurate decision making and resource allocation. Disrupting the distribution of SA between agents can significantly affect system operations, with financial and safety consequences. This research applied Distributed Situation Awareness (DSA) theory to study and improve patient flow management. The main objective of this research was to advance methodology in the DSA literature for (1) deriving design implications from DSA models, and (2) developing quantitative DSA models to formally compare system designs. The DSA research was situated in the domain of patient flow management. Data were collected using a three-part method of data elicitation, extraction, and representation to investigate DSA at a patient flow command-and-control center at Carilion Clinic in Roanoke, VA. The data were elicited from observations and interviews on workers' daily activities and from an available historical database (i.e., TeleTracking). The data were then represented as a combined network highlighting the social, task, and knowledge elements in patient flow, for studying and assessing patient flow management. The influence of DSA on complex systems was examined qualitatively and quantitatively. The combined DSA network qualitatively characterized patient flow management and identified deficiencies in the command-and-control center functions. The network characterized the admission, clinical (inside-hospital) transportation, discharge, and environmental services functions managed by the Carilion Transfer and Communications Center (CTaC). These characterizations led to the identification of design principles on job roles, tasks performed, and SA transactions and distribution adopted by this state-of-the-art patient flow management facility. In addition, the network representing the current operation of CTaC illustrated the connections between functional groups, arbitration of resources, and job roles that could become bottlenecks in transmitting SA. The network also helped identify inefficient task loops, which resulted in delays due to missing or poor SA, and task orders that could be modified to improve patient flow and thus reduce the likelihood of delay. The qualitative (combined network) model was partially translated into a quantitative model based on discrete event simulation (DES) and agent-based modeling (ABM) to simulate patient transportation inside the hospital. The simulation model consisted of 28 patient origins, 29 equipment origins, 12 destinations, and more than 200 entities (i.e., simulation objects). The model was validated by the lack of significant differences on various outcome metrics between 100 simulation replications and historical data, using one-way t-tests. The simulation model captured the distribution and transactions of knowledge elements between agents within the modeled processes. Further, the model verified the deficiencies in the existing system (i.e., delay and cancellation), attributing various instances of deficiency to be either SA-related or non-SA-related.
The simulation model tested two interventions for eliminating SA deficiencies revealed by the qualitative model: (1) updating the ward nurse before picking up patients from the inpatient floor, and (2) updating the X-ray nurse/team before arriving with the patient. Both interventions involved updates from the transporters to nurses, transmitting SA on the estimated time of arrival and patient information so the nurse becomes aware of the transport status. The simulation was run for 1500 replications to obtain results on transport time and cancellation rate for these two interventions. One-way t-tests revealed that the intervention of updating the ward nurse resulted in significant reductions in mean transport time and cancellation rate compared to historical data (i.e., TeleTracking), yielding a 0.42 to 1.24 minute reduction in transport time and 2% to 5% fewer cancellations. However, the second intervention resulted in a significant increase in transport time and thus was ineffective. DES and ABM supplemented the qualitative modeling with quantitative evidence on DSA concepts and with assessment of potential interventions for improving DSA in patient flow management. Specifically, DES and ABM enabled comparison and prediction of the performance outcomes of recommended changes to communication protocols. These findings indicate that DSA is a promising framework for analyzing communication and coordination in complex systems and for quantitatively assessing improvements to SA design. / Doctor of Philosophy / Hospitals aim to provide care and treatment to patients in a timely and effective manner, but their operations can be hindered by delays and long treatment times, resulting in high operating costs. A potential cause of this problem is the complexity of the patient treatment process and the number of agents involved. Treatment is provided by nurses, doctors, and technicians who work together to deliver care to patients. They all rely on information from various medical devices and communicate patient-related information continuously to streamline patient movement toward different locations for different treatments in the hospital. Thus, studying and improving communication and coordination between the medical staff could improve patient flow, which in turn reduces idle resources and patient time in hospital, thereby lowering healthcare costs without sacrificing quality of care. This dissertation investigated patient movement in the hospital using Distributed Situation Awareness (DSA). DSA models communication and coordination by examining the distribution of situation awareness (SA) between human and machine agents for a given task. The primary objective of this dissertation was to investigate how to develop DSA models for studying and improving patient flow management. The dissertation showed how to build a network model illustrating how SA is generated and distributed amongst hospital staff and information systems to manage admission, discharge, room cleaning, and patient transportation. The network model also helped identify the deficiencies associated with different activities. Further, simulation was used to study the processes and the SA communicated quantitatively. The simulation model showed how one recommended change to the communication protocol could reduce patient transport time and cancellations for one route but not another. This dissertation enhanced DSA methodology for real-world applications and demonstrated a method to quantify SA distribution and transactions in a complex system.
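To make the replication-and-t-test step concrete, here is a hedged Monte Carlo sketch of a single transport leg with an assumed "notify the ward nurse" intervention; the delay distributions, effect size, and replication counts are invented for illustration and are not from the dissertation's DES/ABM model.

```python
# Replicated toy transport simulation: notifying the ward nurse is assumed to
# shave part of the pickup delay; replications are compared with a t-test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def mean_transport_time(notify_nurse, n_transports=200):
    walk = rng.normal(8.0, 2.0, n_transports)           # minutes, assumed
    pickup_delay = rng.exponential(3.0, n_transports)   # waiting at the ward, assumed
    if notify_nurse:
        pickup_delay *= 0.6                              # assumed SA benefit
    return float(np.mean(walk + pickup_delay))

baseline = [mean_transport_time(False) for _ in range(100)]
intervention = [mean_transport_time(True) for _ in range(100)]
t_stat, p_value = stats.ttest_ind(intervention, baseline)
print(f"mean change = {np.mean(intervention) - np.mean(baseline):+.2f} min, p = {p_value:.3g}")
```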
36

Influence of meteorological network density on hydrological modeling using input from the Canadian Precipitation Analysis (CaPA)

Abbasnezhadi, Kian 31 March 2017 (has links)
The Canadian Precipitation Analysis (CaPA) system has been developed by Environment and Climate Change Canada (ECCC) to optimally combine different sources of information to estimate precipitation accumulation across Canada. The system combines observations from different networks of weather stations and radar measurements with the background information generated by ECCC's Regional Deterministic Prediction System (RDPS), derived from the Global Environmental Multiscale (GEM) model. The main scope of this study is to assess the importance of weather stations, when combined with the background information, for hydrological modeling. A new approach to meteorological network design, framed as a stochastic hydro-geostatistical scheme, is proposed and investigated; it is particularly useful for augmenting data-sparse networks. The approach stands out from similar approaches in that it includes a data assimilation component based on the paradigm of an Observing System Simulation Experiment (OSSE), a technique used to simulate data assimilation systems in order to evaluate the sensitivity of the analysis to a new observation network. The proposed OSSE-based algorithm develops gridded stochastic precipitation and temperature models to generate synthetic time series assumed to represent the 'reference' atmosphere over the basin. The precipitation realizations are used to simulate synthetic observations, associated with hypothetical station networks of various densities, and synthetic background data, which in turn are assimilated in CaPA to produce various pseudo-analyses. The reference atmosphere and the pseudo-analyses are then compared through hydrological modeling in WATFLOOD. By comparing the simulated flow rates, the relative performance of each pseudo-analysis associated with a specific network density is assessed. The simulations show that as the network density increases, the accuracy of the hydrological signature of the CaPA precipitation products improves hyperbolically up to a limit beyond which adding more stations to the network yields no further gains in accuracy. This study identifies an observation network density that can satisfy the hydrological criteria, as well as the threshold at which the assimilated products outperform numerical weather prediction outputs. It also underlines the importance of augmenting observation networks in small river basins to better resolve mesoscale weather patterns and thus improve the predictive accuracy of streamflow simulation. / May 2017
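The OSSE logic, stripped to its bones, can be sketched as follows: a synthetic reference field is sampled by station networks of increasing density, each pseudo-analysis blends a biased background with those observations, and the error is tracked as density grows. The 1-D field, error magnitudes, and the simple innovation interpolation are assumptions standing in for CaPA and WATFLOOD.

```python
# Minimal OSSE-style density experiment (not CaPA): analysis error versus the
# number of synthetic stations, with diminishing returns at high density.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 500)
reference = 5 + 3 * np.sin(8 * np.pi * x) + rng.normal(0, 0.3, x.size)   # "truth", mm/day
background = reference + 1.5 + rng.normal(0, 1.0, x.size)                # biased model field

for n_stations in (2, 5, 10, 20, 50, 100):
    idx = np.sort(rng.choice(x.size, n_stations, replace=False))
    obs = reference[idx] + rng.normal(0, 0.2, n_stations)        # synthetic observations
    innovation = np.interp(x, x[idx], obs - background[idx])     # spread obs-minus-background
    analysis = background + innovation                           # crude pseudo-analysis
    rmse = np.sqrt(np.mean((analysis - reference) ** 2))
    print(f"{n_stations:3d} stations -> analysis RMSE {rmse:.2f} mm/day")
```

In the thesis, the corresponding comparison is made on streamflow simulated by WATFLOOD rather than on the precipitation field itself.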
37

Time series methods for the simulation of wind speed fields across Great Britain

Edwards, Gruffudd January 2014 (has links)
This thesis presents the development of a time series model and associated algorithms capable of generating synthetic time-series datasets representing the hourly-averaged wind-speed field across Great Britain, as represented by a set of 20 points. This field is of interest as the energy resource available to wind generators connected to the Great Britain (GB) electricity networks. A wind power output dataset was also generated for an example distribution of wind generation capacities. The datasets generated are suitable for use in sequential Monte Carlo simulations of the GB electricity system, either the present system or future scenarios, potentially with full consideration of network constraints. Accurate representation of the spatio-temporal behaviour of renewable resources is an essential aspect of such simulations, along with their relationship to demand, with rarely occurring extreme events of particular interest. Therefore, variability in the resource occurring on all timescales, from turbulence to climatic shifts between decades, must be represented. The synthetic data are time-stamped with time of day and day of year, so care was taken to ensure that all relevant deterministic and stochastic patterns are accurately reproduced. A major component of the research project was identifying the optimum level of complexity for various aspects of the model structure, and the associated computational expense of generating the series, particularly given the high dimensionality of the problem. The final choice of wind speed model was a 2-factor VGARMA-APARCH model, combined with several deterministic transformations.
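A univariate sketch of the AR plus APARCH structure named in the abstract is given below; the thesis' actual model is a 20-site, 2-factor VGARMA-APARCH with deterministic transformations, and every parameter value here is an assumption.

```python
# AR(1) mean equation with an APARCH(1,1) variance equation, simulated hourly.
# sigma_t^delta = omega + alpha*(|e_{t-1}| - gamma*e_{t-1})^delta + beta*sigma_{t-1}^delta
# All parameters and the diurnal mean are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 24 * 365
phi = 0.95                                         # AR(1) persistence
omega, alpha, gamma, beta, delta = 0.05, 0.10, 0.30, 0.85, 1.5

y = np.zeros(n)                                    # stochastic anomaly
eps = np.zeros(n)
sigma_d = np.full(n, omega / (1.0 - beta))         # sigma^delta state

for t in range(1, n):
    sigma_d[t] = (omega
                  + alpha * (abs(eps[t - 1]) - gamma * eps[t - 1]) ** delta
                  + beta * sigma_d[t - 1])
    eps[t] = sigma_d[t] ** (1.0 / delta) * rng.standard_normal()
    y[t] = phi * y[t - 1] + eps[t]

hours = np.arange(n)
wind = 8.0 + 2.0 * np.sin(2 * np.pi * hours / 24.0) + y   # add an assumed diurnal mean
print(np.round(wind[:24], 2))
```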
38

Design Of A Secondary Packaging Robotic System

Sahin, Hakan 01 December 2005 (has links) (PDF)
The use of robotic systems in the consumer goods industry has increased over recent years. However, the food industry has not embraced robotics technology with the same enthusiasm as other industries, for technical and commercial reasons. Difficulties in matching human speed and flexibility, the variable nature of food products, high production volume rates, the lack of appropriate end-effectors, the high initial investment cost of such systems, and low margins on food products still limit the use of robotics in the food industry. In this thesis study, as a contribution to the use of robotic systems in the food industry, a secondary packaging robotic system is designed. The system is composed of two basic subsystems: a dual-axis controlled robotic arm and a special-purpose gripper. The mechanical and control system designs of both subsystems are carried out within the scope of the study. During the design process, instead of classical design methods, modern computer-aided design and engineering tools are utilized.
39

Conversão de uso das terras e alterações das condições hidrológicas = o caso das bacias do Atibainha e do Cachoeira / Land use conversion and hydrological changes : the case of Atibainha and Cachoeira basins

Pereira, Vânia Rosa, 1980- 07 November 2012 (has links)
Advisor: José Teixeira Filho / Doctoral thesis (Tese de doutorado) - Universidade Estadual de Campinas, Instituto de Geociências / Abstract: The objective of this thesis was to investigate the main driving factor behind the changes in river discharge during the dry period. The study area, the Atibainha and Cachoeira river basins, is located in southeast Brazil. The results did not show evidence that land use changes directly affected discharge during the dry period. The changes consisted of a transition from secondary forest in its initial restoration stage to silviculture with Eucalyptus urophylla. The silviculture supplies charcoal and firewood demand, so the harvest rotation is 4 to 5 years. The majority of the land use conversion occurred in hilly upland areas. Less than 20% of the area of the two studied basins was changed during the period, and the conversion was diluted over the years. Analysis of the precipitation and discharge time series over the period coinciding with the land use changes showed significant decreases. The negative precipitation trends (p-value < 0.001) occurred at both the local and regional scales and were concentrated in summer, i.e., the recharge period. Discharge showed negative trends (p-value < 0.001) in all seasons and in the annual analysis. The precipitation time series analysis showed evidence of decreasing intensity; no changes, however, were found in the concentration or distribution of precipitation events. Two land use scenarios were analyzed with the deterministic hydrological simulation model HSPF under the same precipitation regime: a real land use scenario following the evolution from 1985 to 2005, and a hypothetical extreme scenario with the entire basin under a single land use. The real-scenario simulation suggests that the land use change, affecting less than 20% of the basin, was not sufficient to modify the annual and seasonal discharge. The hypothetical extreme scenario simulations showed that changes or losses in evapotranspiration would be the main driving factor impacting river discharge, while runoff would be secondary. The results indicated that land use change was not the main driving factor explaining the negative discharge trends; the results, however, could be considered inconclusive. The daily time series discretization and the short interval considered here, i.e., a 20-year analysis, were not enough to understand the basin's precipitation/runoff/discharge processes. The cumulative effects of silviculture practices on soils should be considered in future studies, and the consequences of harvesting with fire and of the short rotation and regrowth cycle must be better understood. / Doutorado / Análise Ambiental e Dinâmica Territorial / Doutora em Geografia
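The trend statistics quoted above can be reproduced in spirit with a standard nonparametric test; the sketch below applies Kendall's tau (the core of a Mann-Kendall test) and a Sen-slope estimate to synthetic annual discharge, since the thesis' exact test and data are not reproduced here.

```python
# Mann-Kendall-style trend check on synthetic annual discharge (illustrative data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
years = np.arange(1985, 2006)
discharge = 20.0 - 0.15 * (years - 1985) + rng.normal(0, 0.8, years.size)  # m^3/s, synthetic

tau, p_value = stats.kendalltau(years, discharge)       # monotonic trend test
sen_slope = stats.theilslopes(discharge, years)[0]      # robust slope estimate
print(f"tau = {tau:.2f}, p = {p_value:.4f}, Sen slope = {sen_slope:.3f} m^3/s per year")
```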
40

A Generalized Framework for Representing Complex Networks

Viplove Arora (8086250) 06 December 2019 (has links)
Complex systems are often characterized by a large collection of components interacting in nontrivial ways. Self-organization among these individual components often leads to emergence of a macroscopic structure that is neither completely regular nor completely random. In order to understand what we observe at a macroscopic scale, conceptual, mathematical, and computational tools are required for modeling and analyzing these interactions. A principled approach to understand these complex systems (and the processes that give rise to them) is to formulate generative models and infer their parameters from given data that is typically stored in the form of networks (or graphs). The increasing availability of network data from a wide variety of sources, such as the Internet, online social networks, collaboration networks, biological networks, etc., has fueled the rapid development of network science.

A variety of generative models have been designed to synthesize networks having specific properties (such as power law degree distributions, small-worldness, etc.), but the structural richness of real-world network data calls for researchers to posit new models that are capable of keeping pace with the empirical observations about the topological properties of real networks. The mechanistic approach to modeling networks aims to identify putative mechanisms that can explain the dependence, diversity, and heterogeneity in the interactions responsible for creating the topology of an observed network. A successful mechanistic model can highlight the principles by which a network is organized and potentially uncover the mechanisms by which it grows and develops. While it is difficult to intuit appropriate mechanisms for network formation, machine learning and evolutionary algorithms can be used to automatically infer appropriate network generation mechanisms from the observed network structure.

Building on these philosophical foundations and a series of (not new) observations based on first principles, we extrapolate an action-based framework that creates a compact probabilistic model for synthesizing real-world networks. Our action-based perspective assumes that the generative process is composed of two main components: (1) a set of actions that expresses link formation potential using different strategies capturing the collective behavior of nodes, and (2) an algorithmic environment that provides opportunities for nodes to create links. Optimization and machine learning methods are used to learn an appropriate low-dimensional action-based representation for an observed network in the form of a row stochastic matrix, which can subsequently be used for simulating the system at various scales. We also show that in addition to being practically relevant, the proposed model is relatively exchangeable up to relabeling of the node-types.

Such a model can facilitate handling many of the challenges of understanding real data, including accounting for noise and missing values, and connecting theory with data by providing interpretable results. To demonstrate the practicality of the action-based model, we decided to utilize the model within domain-specific contexts. We used the model as a centralized approach for designing resilient supply chain networks while incorporating appropriate constraints, a rare feature of most network models. Similarly, a new variant of the action-based model was used for understanding the relationship between the structural organization of human brains and the cognitive ability of subjects. Finally, our analysis of the ability of state-of-the-art network models to replicate the expected topological variations in network populations highlighted the need for rethinking the way we evaluate the goodness-of-fit of new and existing network models, thus exposing significant gaps in the literature.
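A toy reading of the action-based generative process is sketched below: each node type carries a row-stochastic mix over link-formation actions (uniform, degree-preferential, triadic closure), and new nodes draw actions from their row. The specific actions, the matrix values, and the growth schedule are assumptions, not the dissertation's learned representation.

```python
# Toy action-based network generator: a row-stochastic matrix per node type
# mixes three link-formation actions; values and schedule are assumptions.
import random
import networkx as nx

random.seed(3)
actions = ["uniform", "preferential", "triadic"]
action_mix = {0: [0.6, 0.3, 0.1],          # row-stochastic rows, one per node type
              1: [0.1, 0.7, 0.2]}

G = nx.complete_graph(4)                   # small seed graph
for new_node in range(4, 200):
    node_type = random.choice([0, 1])
    G.add_node(new_node)
    for _ in range(3):                     # three link opportunities per new node
        act = random.choices(actions, weights=action_mix[node_type])[0]
        candidates = [v for v in G.nodes if v != new_node and not G.has_edge(new_node, v)]
        if not candidates:
            break
        if act == "uniform":
            target = random.choice(candidates)
        elif act == "preferential":
            target = random.choices(candidates,
                                    weights=[G.degree(v) + 1 for v in candidates])[0]
        else:  # triadic closure via a neighbour-of-neighbour, if one exists
            nbrs = list(G.neighbors(new_node))
            two_hop = {w for v in nbrs for w in G.neighbors(v)} - set(nbrs) - {new_node}
            target = random.choice(sorted(two_hop)) if two_hop else random.choice(candidates)
        G.add_edge(new_node, target)

print(G.number_of_nodes(), "nodes,", G.number_of_edges(), "edges")
```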
