1

Theory and application of joint interpretation of multimethod geophysical data

Kozlovskaya, E. (Elena) 12 April 2001 (has links)
Abstract This work is devoted to the theory of joint interpretation of multimethod geophysical data and its application to the solution of real geophysical inverse problems. The targets of such joint interpretation can be geological bodies with an established dependence between various physical properties that cause anomalies in several geophysical fields (geophysical multiresponse). Establishing the relationship connecting the various physical properties is therefore a necessary first step in any joint interpretation procedure. Bodies for which the established relationship between physical properties is violated (single-response bodies) can be targets of separate interpretations. The probabilistic (Bayesian) approach provides the necessary formalism for addressing the problem of the joint inversion of multimethod geophysical data, which can be non-linear and have a non-unique solution. Analysis of the lower limit of resolution of the non-linear joint inversion problem using the definition of ε-entropy demonstrates that joint inversion of multimethod geophysical data can reduce non-uniqueness in real geophysical inverse problems. The joint inversion can be formulated as a multiobjective optimisation problem (MOP), enabling the numerical methods of that theory to be employed for geophysical data inversion and for developing computer algorithms capable of solving highly non-linear problems. An example of such a problem is magnetotelluric impedance tensor inversion with the aim of obtaining a 3-D resistivity distribution. An additional area of application for multiobjective optimisation is the combination of various types of uncertain information (probabilistic and non-probabilistic) in a common inversion scheme applicable to geophysical inverse problems. It is demonstrated how the relationship between seismic velocity and density can be used to construct an algorithm for the joint interpretation of gravity and seismic wide-angle reflection and refraction data. The relationship between the elastic and electrical properties of rocks, which is a necessary condition for the joint inversion of data obtained by seismic and electromagnetic methods, can be established for solid-liquid rock mixtures using theoretical modelling of the elastic and electrical properties of rocks with a fractal microstructure and from analyses of petrophysical data and borehole log data.
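The multiobjective formulation described above lends itself to a compact illustration. The following minimal sketch poses a joint inversion of two toy datasets (seismic and gravity) coupled through an assumed Gardner-type velocity-density relation, and traces an approximate Pareto front by sweeping a scalarization weight; the forward models, layer count, and coefficients are invented placeholders, not the algorithms developed in the thesis.

```python
# Sketch: joint inversion of two datasets as a weighted multiobjective problem.
# The forward models and the Gardner-type rho(v) coupling below are illustrative
# stand-ins, not the thesis's actual operators.
import numpy as np
from scipy.optimize import minimize

def density_from_velocity(v_kms):
    # Gardner-type empirical relation rho [g/cc] = a * Vp^b (assumed coupling)
    return 1.74 * v_kms**0.25

# Synthetic "observed" data from a true 3-layer velocity model
v_true = np.array([2.0, 3.5, 5.0])                   # km/s
d_seis_obs = 1.0 / v_true                            # toy travel-time response
d_grav_obs = density_from_velocity(v_true).cumsum()  # toy gravity response

def misfit_seismic(v):
    return np.sum((1.0 / v - d_seis_obs) ** 2)

def misfit_gravity(v):
    return np.sum((density_from_velocity(v).cumsum() - d_grav_obs) ** 2)

def joint_objective(v, w=0.5):
    # Weighted-sum scalarization: each weight w yields one point
    # on the Pareto front of the two misfits.
    return w * misfit_seismic(v) + (1 - w) * misfit_gravity(v)

# Trace an approximate Pareto front by sweeping the weight
for w in (0.1, 0.5, 0.9):
    res = minimize(joint_objective, x0=np.full(3, 3.0), args=(w,),
                   bounds=[(1.0, 7.0)] * 3)
    print(f"w={w:.1f}  v={res.x.round(2)}  "
          f"seis={misfit_seismic(res.x):.2e}  grav={misfit_gravity(res.x):.2e}")
```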
2

Improving hydrological post-processing for assessing the conditional predictive uncertainty of monthly streamflows

Romero Cuellar, Jonathan 07 January 2020 (has links)
Quantifying predictive uncertainty is crucial for producing reliable hydrological predictions that support decision-making in water resources management. Hydrological post-processing methods are suitable tools to estimate the predictive uncertainty of deterministic streamflow predictions (hydrological model outputs). This thesis focuses on improving hydrological post-processing methods for assessing the conditional predictive uncertainty of monthly streamflows, and it deals with two issues of the hydrological post-processing scheme: i) the heteroscedasticity problem and ii) the intractable likelihood problem. The thesis has three specific aims. First, in relation to the heteroscedasticity problem, we develop and evaluate a new post-processing approach, called the GMM post-processor, which combines the Bayesian joint probability modelling approach with Gaussian mixture models, and we compare its performance with well-known existing post-processors for monthly streamflows across 12 MOPEX catchments. From this aim (chapter 2), we find that the GMM post-processor is the best suited for estimating the conditional predictive uncertainty of monthly streamflows, especially for dry catchments. Second, we introduce a method to quantify the conditional predictive uncertainty in hydrological post-processing contexts where the likelihood is cumbersome to calculate (intractable likelihood), as can happen with complex models or ungauged catchments. We therefore propose the ABC post-processor, which replaces the calculation of the likelihood function with the use of sufficient summary statistics and synthetic datasets. With this aim in mind (chapter 3), we show that the conditional predictive distributions produced by the exact method (MCMC post-processor) and by the approximate method (ABC post-processor) are qualitatively similar. This finding is significant because dealing with scarce information is a common condition in hydrological studies. Finally, we apply the ABC post-processing method to estimate the uncertainty of streamflow statistics obtained from climate change projections, as a particular case of the intractable likelihood problem. From this specific objective (chapter 4), we find that the ABC post-processor 1) offers more reliable projections than the 14 climate models (without post-processing) and 2) produces more realistic uncertainty bands for streamflow statistics than the classical multi-model ensemble approach, with respect to the best climate models during the baseline period. / I would like to thank the Gobernación del Huila Scholarship Program No. 677 (Colombia) for providing the financial support for my PhD research. / Romero Cuellar, J. (2019). Improving hydrological post-processing for assessing the conditional predictive uncertainty of monthly streamflows [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/133999 / TESIS
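The core mechanic of the ABC post-processor described above, replacing an intractable likelihood with summary statistics computed on simulated data, can be sketched in a few lines. The toy multiplicative error model, the choice of summary statistics, and the tolerance below are all assumptions for illustration, not the thesis's formulation.

```python
# Sketch: ABC rejection sampling as a stand-in for an intractable likelihood.
# A toy multiplicative error model links simulated to "observed" streamflows;
# its parameters are recovered by matching summary statistics. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)

q_sim = rng.gamma(shape=2.0, scale=50.0, size=120)   # deterministic model output (toy)
true_bias, true_sigma = 1.2, 0.3
q_obs = q_sim * true_bias * np.exp(rng.normal(0, true_sigma, q_sim.size))

def summaries(q):
    # Summary statistics replacing the likelihood: mean and spread of log-flows
    return np.array([np.log(q).mean(), np.log(q).std()])

s_obs = summaries(q_obs)
accepted = []
for _ in range(20000):
    bias = rng.uniform(0.5, 2.0)                     # draws from flat priors
    sigma = rng.uniform(0.05, 1.0)
    q_rep = q_sim * bias * np.exp(rng.normal(0, sigma, q_sim.size))
    if np.linalg.norm(summaries(q_rep) - s_obs) < 0.1:   # tolerance (assumed)
        accepted.append((bias, sigma))

post = np.array(accepted)
print(f"accepted {len(post)} draws: bias≈{post[:, 0].mean():.2f}, "
      f"sigma≈{post[:, 1].mean():.2f} (true 1.20, 0.30)")
```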
3

Development of numerical and experimental methods for the dynamic certification of integrally bladed disks

Cazenove, Jean de 25 June 2014 (has links)
Under operating conditions, turbomachinery blisks are subject to static and dynamic loads which may result in high-cycle fatigue situations when excited in the neighbourhood of resonant frequencies. Random mistuning, which affects blisks due to machining deviations, makes this issue even more critical. The objective of the current study is to introduce a combined numerical-experimental strategy allowing the dynamic characterization of an experimental bladed disk with regard to the statistics representing the simulated behaviour of a population of operating blisks, with a view to vibratory certification. A high-fidelity numerical model based on the optical acquisition of an experimental blisk has been set up; test series performed in lab conditions allowed its representativeness to be verified. The comparison of response amplitudes measured under operating conditions with the model predictions revealed an acceptable match between test and simulation data. Finally, a non-parametric probabilistic approach has been used to predict the theoretical maximal amplification factor, which was compared with the amplification factor of the test specimen. The strategy proposed in this study allows maximum amplification factor predictions for a population of in-service blisks.
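The effect this study quantifies, amplification of forced response by random mistuning, can be illustrated with a toy cyclic spring-mass chain with one degree of freedom per blade. This is a minimal sketch: the parameter values, engine order, and the classical Whitehead reference bound are illustrative assumptions, not the thesis's high-fidelity model.

```python
# Sketch: Monte Carlo estimate of the forced-response amplification factor for
# a mistuned bladed disk, using a toy cyclic spring-mass chain (one DOF per
# blade). Model and parameters are illustrative, not the thesis's blisk model.
import numpy as np

rng = np.random.default_rng(2)
N = 24                                   # blades
k, kc, m, c = 1.0, 0.1, 1.0, 0.002       # blade/coupling stiffness, mass, damping

def forced_response_max(delta, engine_order=3):
    # Assemble the cyclic stiffness matrix with per-blade mistuning delta
    K = np.zeros((N, N))
    for i in range(N):
        K[i, i] = k * (1 + delta[i]) + 2 * kc
        K[i, (i + 1) % N] -= kc
        K[i, (i - 1) % N] -= kc
    f = np.exp(1j * 2 * np.pi * engine_order * np.arange(N) / N)  # EO forcing
    # Sweep excitation frequency around the tuned blade-alone frequency
    amps = []
    for w in np.linspace(0.9, 1.2, 200):
        Z = K - (w**2) * m * np.eye(N) + 1j * w * c * np.eye(N)
        amps.append(np.abs(np.linalg.solve(Z, f)).max())
    return max(amps)

a_tuned = forced_response_max(np.zeros(N))
amp = [forced_response_max(rng.normal(0, 0.02, N)) / a_tuned
       for _ in range(200)]
print(f"max amplification over 200 mistuned disks: {max(amp):.2f} "
      f"(Whitehead bound (1+sqrt(N))/2 = {(1 + np.sqrt(N)) / 2:.2f})")
```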
4

Combined Fuzzy and Probabilistic Simulation for Construction Management

Sadeghi, Naimeh 11 1900 (has links)
Simulation has been used extensively for addressing probabilistic uncertainty in range estimating for construction projects. However, subjective and linguistically expressed information results in added non-probabilistic uncertainty in construction management. Fuzzy logic has been used successfully for representing such uncertainties in construction projects. In practice, an approach that can handle both random and fuzzy uncertainties in a risk assessment model is necessary. In this thesis, first, a Fuzzy Monte Carlo Simulation (FMCS) framework is proposed for risk analysis of construction projects. To verify the feasibility of the FMCS framework and demonstrate its main features, a cost range estimating template is developed and employed to estimate the cost of a highway overpass project. Second, a hybrid framework that considers both fuzzy and probabilistic uncertainty for discrete event simulation of construction projects is suggested. The application of the proposed framework is discussed using a real case study of a pipe spool fabrication shop. / Construction Engineering and Management
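The combination this thesis proposes, random sampling for probabilistic items together with alpha-cut processing of fuzzy items, can be illustrated with a toy cost estimate. This is a minimal sketch: the cost items, the triangular membership function, and the 80th-percentile target are invented for illustration and are not taken from the FMCS template or the case studies.

```python
# Sketch: fuzzy Monte Carlo cost estimating. Probabilistic items are sampled;
# a linguistically assessed item is a triangular fuzzy number processed by
# alpha-cuts, yielding fuzzy bounds on a cost percentile. Values invented.
import numpy as np

rng = np.random.default_rng(3)
n = 10_000

# Probabilistic cost items (e.g., materials, labour) -- random sampling
materials = rng.normal(500_000, 40_000, n)
labour = rng.triangular(200_000, 260_000, 350_000, n)

# Fuzzy cost item ("site-conditions risk: roughly 50k to 120k, most
# likely 80k") as a triangular fuzzy number (a, b, c)
a, b, c = 50_000, 80_000, 120_000

for alpha in (0.0, 0.5, 1.0):
    # Alpha-cut interval of the fuzzy item
    lo = a + alpha * (b - a)
    hi = c - alpha * (c - b)
    # Propagate both interval endpoints through the Monte Carlo totals
    # (valid here because the total is monotone in the fuzzy item)
    p80_lo = np.percentile(materials + labour + lo, 80)
    p80_hi = np.percentile(materials + labour + hi, 80)
    print(f"alpha={alpha:.1f}: 80th-percentile cost in "
          f"[{p80_lo:,.0f}, {p80_hi:,.0f}]")
```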
5

Modeling of complex network, application to road and cultural networks

Jiang, Jian 12 September 2011 (has links) (PDF)
Many complex systems arising from nature and human society can be described as complex networks. In this dissertation, on the basis of complex network theory, we pay attention to the topological structure of complex networks and the dynamics on them. We established models to investigate the influence of structure on network dynamics and to shed light on some peculiar properties of complex systems. This dissertation includes four parts. In the first part, the empirical properties (degree distribution, clustering coefficient, diameter, and characteristic path length) of the urban road network of Le Mans city in France are studied. The degree distribution shows a double power law, which we studied in detail. In the second part, we propose two models to investigate the possible mechanisms leading to the deviation from a simple power law. In the first model, probabilistic addition of nodes and links, and rewiring of links, are considered; in the second, only random and preferential link growth is included. The simulation results of the models are compared with the real data. In the third part, the probabilistic uncertainty behavior of the double power-law distribution is investigated. Network optimization and the optimal design of scale-free networks against random failures are discussed from the viewpoint of entropy maximization. We defined equilibrium network ensembles as stationary ensembles of graphs, using thermodynamics-like notions such as "energy", "temperature", and "free energy" for networks. In the fourth part, a union-division model is established to investigate the time evolution of certain networks, such as cultural or economic networks. In this model, the nodes represent, for example, cultures. Several quantities such as richness, age, identity, and ingredients are used to parameterize the probabilistic evolution of the network. The model offers a long-term view of the apparently periodic dynamics of an ensemble of cultural or economic entities in interaction.
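The second model's mechanism, link growth that is partly uniform-random and partly preferential, is easy to sketch; mixing the two is one plausible way to bend a pure power-law degree distribution, as studied in the dissertation. The parameter values and the repeated-endpoint implementation of preferential attachment below are illustrative choices, not the fitted Le Mans models.

```python
# Sketch: link growth that is partly random, partly preferential (degree-
# proportional). Each new node adds m links; each link target is chosen
# uniformly with probability p_random, else in proportion to degree.
import numpy as np
from collections import Counter

rng = np.random.default_rng(4)
N, m, p_random = 5000, 2, 0.4            # nodes, links per step, mix ratio
degree = np.zeros(N, dtype=int)
degree[:3] = [2, 2, 2]                   # small seed triangle
nodes = 3

targets_pool = [0, 1, 1, 2, 2, 0]        # one entry per link endpoint
while nodes < N:
    new = nodes
    new_entries = []
    for _ in range(m):
        if rng.random() < p_random:
            t = rng.integers(0, nodes)                         # uniform target
        else:
            t = targets_pool[rng.integers(len(targets_pool))]  # degree-proportional
        degree[new] += 1
        degree[t] += 1
        new_entries += [new, t]
    targets_pool += new_entries          # update pool only after the full step
    nodes += 1

hist = Counter(degree)
for k in sorted(hist)[:10]:              # low-degree end of the distribution
    print(f"degree {k}: {hist[k]} nodes")
```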
6

Uncertainty Based Damage Identification and Prediction of Long-Time Deformation in Concrete Structures

Biswal, Suryakanta January 2016 (has links) (PDF)
Uncertainties are present in the inverse analysis of damage identification with respect to the given measurements: mainly modelling uncertainties and measurement uncertainties. Modelling uncertainties arise from constructing a representative model of the real structure through finite element modelling and from representing damage in the real structure through changes in the material parameters of the finite element model (assuming a smeared-crack approach). Measurement uncertainties are always present, regardless of the accuracy of the measurements or the precision of the instruments used. The modelling errors in the finite element model are assumed to be encompassed in the updated uncertain parameters of the model, given the uncertainties in the measurements and the prior uncertainties of the parameters. Empirical models from codal provisions and standard recommendations are normally used for prediction of long-time deformations in concrete structures. Uncertainties are also present in the creep and shrinkage models, in the parameters of these models, in the shrinkage and creep mechanisms, in the environmental conditions, and in the in-situ measurements. All these uncertainties need to be considered in damage identification and in the prediction of long-time deformations in concrete structures. In the context of modelling uncertainty, uncertainties can be categorized as aleatory or epistemic. Aleatory uncertainty deals with the irresolvable indeterminacy in how an uncertain variable will evolve over time, whereas epistemic uncertainty deals with lack of knowledge. In the field of damage detection and prediction of long-time deformations, aleatory uncertainty is modeled through probabilistic analysis, whereas epistemic uncertainty can be modeled through (1) interval analysis, (2) ellipsoidal modeling, (3) fuzzy analysis, (4) Dempster-Shafer evidence theory, or (5) imprecise probability. It is often difficult to determine whether a particular uncertainty should be considered aleatory or epistemic, and the model builder makes the distinction, based on the general state of scientific knowledge, on the practical need to limit model sophistication to what is of engineering importance, and on the errors associated with the measurements. Measurement uncertainty can be stated as the dispersion of real data resulting from systematic error (instrumental error, environmental error, observational error, human error, drift in measurement, measurement of the wrong quantity) and random error (all errors apart from systematic errors). Most instrumental errors quoted by manufacturers are given as plus-or-minus ranges and are better represented through interval bounds, as is the vagueness involved in representing human error, observational error, and drift in measurement. Deliberate measurement of the wrong quantity using cheaper, more convenient measurement units can lead to bad-quality data. Data quality can be handled through interval analysis, with good-quality data having narrow interval bounds and bad-quality data having wide ones.
The environmental error, the electronic noise arising from transmitting the data, and the random errors can be represented through probability distribution functions. A major part of the measurement uncertainties is thus better represented through interval bounds, and the remainder through probability distributions. The uncertainties in the direct measurement data are propagated to the estimated output data (in damage identification, the damage parameters; in long-time deformation, the uncertain parameters of the deformation models, which are then used for prediction). Uncertainty-based damage identification techniques and prediction of long-time deformations in concrete structures require further study when the measurement uncertainties are expressed through interval bounds only, or through both intervals and probabilities using imprecise-probability techniques. The thesis is divided into six chapters. Chapter 1 provides a review of existing literature on uncertainty-based techniques for damage identification and prediction of long-time deformations in concrete structures. A brief review of uncertainty-based methods for engineering applications is made, with special emphasis on the need for interval analysis and imprecise probability in modeling uncertainties in damage identification techniques. The review identifies that the available damage identification techniques in which the uncertainties in the measurements and in the structural and material parameters are expressed as interval bounds lack efficiency when the damage parameter vector is large. Estimating the uncertainties in the damage parameters when the uncertainties in the measurements are expressed through imprecise-probability analysis is also identified as a problem to be considered in this thesis. The review further notes the need for estimating the short-time period, which in turn helps in accurate prediction of long-time deformations in concrete structures, along with a cost-effective and easy-to-use system for measuring the existing prestress forces at various time instants within that period. The review identifies that most modelers and analysts have been inclined to select a single simulation model for the long-time deformations resulting from creep, shrinkage and relaxation, rather than taking all the possibilities into consideration; such selection rests on the hardly realistic assumption that a correct model can be identified with certainty, and the lack of confidence associated with model selection brings about uncertainty residing in the given model set. Developing a single best model out of all the available deformation models, when uncertainties are present in the models, in the measurements and in the parameters of each model, is also identified as a problem to be considered in this thesis. In Chapter 2, an algorithm is proposed, adapting the existing modified Metropolis-Hastings algorithm, for estimating the posterior probability of the damage indices as well as the posterior probability of the bounds of the interval parameters when the measurements are given in terms of interval bounds. A damage index is defined for each element of the finite element model, treating the parameters of each element as intervals. Methods are developed for evaluating response bounds in the finite element software ABAQUS when the parameters of the finite element model are intervals.
Illustrative examples include reinforced concrete beams tested in our laboratory with three damage scenarios: (i) loss of stiffness, (ii) loss of mass, and (iii) loss of bond between concrete and reinforcing steel. Comparison of the predictions from the proposed method with those obtained from Bayesian analysis and an interval optimization technique shows improved accuracy and computational efficiency, in addition to better representation of measurement uncertainties through interval bounds. Imprecise-probability-based methods are developed in Chapter 3 for damage identification using finite element model updating in concrete structures, when the uncertainties in the measurements and parameters are imprecisely defined. Bayesian analysis using the Metropolis-Hastings algorithm for parameter estimation is generalized to incorporate the imprecision present in the prior distribution, in the likelihood function, and in the measured responses. Three cases are considered: (i) imprecision in the prior distribution and in the measurements only, (ii) imprecision in the parameters of the finite element model and in the measurements only, and (iii) imprecision in the prior distribution, in the parameters of the finite element model, and in the measurements. Illustrative examples include reinforced concrete beams and prestressed concrete beams tested in our laboratory. In Chapter 4, a steel frame is designed to measure the existing prestressing force in concrete beams and slabs when embedded inside the concrete members. The frame works on the principles of a vibrating wire strain gauge and is referred to as a vibrating beam strain gauge (VBSG). The existing strain in the VBSG is evaluated using both frequency data on the stretched member and the static strain corresponding to a fixed static load, measured using electrical strain gauges. The crack reopening load method is used to compute the existing prestressing force in the concrete members, which is then compared with the prestressing force obtained from the VBSG at that section. Surface deformations from digital image correlation and the change in neutral axis, monitored with electrical strain gauges across the cross-section, are used to compute the crack reopening load accurately. Long-time deformations in concrete structures are estimated in Chapter 5 using short-time measurements of deformation responses when uncertainties are present in the measurements, in the deformation models and in the parameters of the deformation models. The short-time period is defined as the least time for which measurements must be available in order to estimate the parameters of the deformation models for predicting the long-time deformations; it is evaluated using stochastic simulations in which all the parameters of the deformation models are defined as random variables. The existing deformation models are empirical in nature, each developed from an arbitrary selection of experimental data sets among all those available, and each contains some information about the deformation patterns in concrete structures. Uncertainty-based model averaging is therefore performed to obtain a single best model for predicting the long-time deformations in concrete structures. Three types of uncertainty models are considered: probability models, interval models and imprecise-probability models.
Illustrative examples consider experiments from the Northwestern University database available in the literature, together with prestressed concrete beams and slabs cast in our laboratory, for prediction of long-time prestress losses. A summary of the contributions made in this thesis, together with a few suggestions for future research, is presented in Chapter 6. Finally, the references that were studied are listed.
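The interval-measurement updating idea of Chapter 2 can be sketched compactly as a Metropolis-Hastings sampler whose likelihood scores how well a model response falls within measured interval bounds. The one-degree-of-freedom "beam" model, the soft interval likelihood, and every numerical value below are illustrative assumptions, not the thesis's algorithm or data.

```python
# Sketch: Metropolis-Hastings updating of a stiffness-reduction index when
# the measurement (a natural frequency) is known only as an interval.
# The one-DOF model and the soft interval likelihood are illustrative.
import numpy as np

rng = np.random.default_rng(5)

k0, m0 = 4.0e6, 100.0                  # nominal stiffness [N/m], mass [kg]

def freq(damage):
    # damage in [0, 1): fractional stiffness loss of the toy structure
    return np.sqrt(k0 * (1 - damage) / m0) / (2 * np.pi)

f_lo, f_hi = 28.0, 29.5                # measured frequency interval [Hz]

def log_like(damage):
    # Soft indicator of the interval: flat inside, Gaussian decay outside
    f = freq(damage)
    d = max(0.0, f_lo - f, f - f_hi)
    return -0.5 * (d / 0.1) ** 2

chain, x = [], 0.1                     # uniform prior on [0, 1)
for _ in range(20000):
    y = x + rng.normal(0, 0.05)        # random-walk proposal
    if 0.0 <= y < 1.0 and np.log(rng.random()) < log_like(y) - log_like(x):
        x = y
    chain.append(x)

post = np.array(chain[5000:])          # discard burn-in
print(f"damage index: mean={post.mean():.3f}, "
      f"95% CI=({np.percentile(post, 2.5):.3f}, {np.percentile(post, 97.5):.3f})")
```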
