31

Variational convergences for functionals and differential operators depending on vector fields

Maione, Alberto 09 December 2020 (has links)
In this Ph.D. thesis we discuss results concerning variational convergences for functionals and differential operators depending on Lipschitz continuous vector fields. The convergences taken into account are Γ-convergence (for functionals) and H-convergence (for differential operators).
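For reference, Γ-convergence of functionals F_n to F on a metric space X is standardly defined by two conditions (stated here in textbook form as background, not quoted from the thesis, which works in the vector-field setting above):

\[
F(x) \le \liminf_{n\to\infty} F_n(x_n) \ \text{ for every } x_n \to x,
\qquad
\exists\, x_n \to x \ \text{ such that } \limsup_{n\to\infty} F_n(x_n) \le F(x).
\]

Together with equicoercivity, these two inequalities guarantee convergence of minimum values and of minimizers, which is what makes Γ-convergence the natural variational notion for functionals.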
32

Some optimization problems in electromagnetism

Caselli, Gabriele 17 May 2022 (has links)
Electromagnetism and optimal control stand out as topics that feature impactful applications in modern engineering, as well as challenging theoretical aspects of mathematical analysis. Within this context, a major role is played by the search for necessary and sufficient conditions characterizing optimal solutions, as they underpin numerical algorithms aiming to approximate such solutions. In this thesis, three standalone topics in optimization sharing the underlying framework of Maxwell-related PDEs are discussed. First, I present an optimal control problem driven by a quasi-linear magneto-static obstacle problem featuring first-order differential state constraints. The non-linearity makes it possible to suitably model electromagnetic waves in the presence of ferromagnetic materials, while the first-order obstacle is relevant for applications in the field of magnetic shielding. Existence theory and the derivation of an optimality system are addressed with an approximation technique based on a relaxation-penalization of the variational inequality. Second, I analyze an eddy current problem controlled through a dipole-type source, i.e. a Dirac mass with fixed position and variable intensity: the well-posedness of the state equation through a fundamental solution (of a curl curl - Id operator) approach and first-order conditions are dealt with. To conclude, I discuss the computation of the topological derivative for shape functionals constrained to low-frequency electromagnetic problems (closely related to the eddy current model), with respect to the inclusion/removal of conducting material; the results are obtained using a Lagrangian approach, in particular the so-called averaged adjoint method. This approach requires the study of the asymptotic behavior of the solutions of some problems defined in the whole space, and the introduction and consequent analysis of appropriate function spaces.
33

From data to mathematical analysis and simulation in models in epidemiology and ecology

Clamer, Valentina January 2016 (has links)
This dissertation is divided into three parts. In the first part we analyse data collected on the occurrence of influenza-like illness (ILI) symptoms during the 2009 influenza A/H1N1 pandemic in two primary schools of Trento, Italy. These data were used to calibrate a discrete-time SIR model, designed to estimate the probabilities of influenza transmission within classes, grades and schools using Markov Chain Monte Carlo (MCMC) methods. We found that the virus was mainly transmitted within classes, with lower levels of transmission between students in the same grade and even lower, though not significantly so, between different grades within the schools. From the epidemic curves we estimated median values of R0 of 1.16 and 1.40 in the two schools; on the other hand, we estimated the average number of students infected by the first school case to be 0.85 and 1.09 in the two schools. This discrepancy suggests that household and community transmission played an important role in sustaining the school epidemics. The high probability of infection between students in the same class confirms that targeting within-class transmission is key to controlling the spread of influenza in school settings and, as a consequence, in the general population. In the second part, starting from a basic host-parasitoid model, we study the dynamics of a two host-one parasitoid model, assuming for simplicity that larval stages have a fixed duration. If each host is subject to density-dependent mortality in its larval stage, we obtain explicit conditions for the coexistence of both hosts, as long as each single host-parasitoid system tends to an equilibrium point. If mortality is instead density-independent, under the same conditions host coexistence is impossible. On the other hand, if at least one of the single host-parasitoid systems has oscillatory dynamics (which happens for some parameter values), we found, through numerical bifurcation analysis, that coexistence is favoured; coexistence of the two hosts may then occur even without density dependence. The analysis of this case is based on approximating the dominant characteristic multipliers of the monodromy operator using a recent method introduced by Breda et al. Models of this type may be relevant for designing control strategies for Drosophila suzukii, a recently introduced fruit fly that has caused severe production losses, based on native parasitoids of indigenous fruit flies. In the third part, we present a starting point for analysing raw data collected by Stacconi et al. in the province of Trento, Italy. We present an extension of the model of Part 2 with two hosts and two parasitoids. Since its analysis is complicated, we begin with a simpler one host-one parasitoid model to better understand the possible impact of parasitoids on a host population. We start by assuming that the host population is at the parasitoid-free equilibrium, and parasitoids are then introduced at different percentages of the initial adult hosts. Comparing the times needed by parasitoids to halve host pupae, we found that the best choice is 10%. We therefore fix this percentage of parasitoid introduction and analyse what happens if parasitoids are introduced when the host population is not at equilibrium, either by always introducing the same percentage or the same absolute number of parasitoids. In this case, even if the attack rate is at 1/10 of its maximum value, parasitoids have a strong effect on the host population, shifting it to an oscillatory regime. However, we found that this effect requires more than 100 days, and that it can be faster if parasitoids are introduced before the host population has reached the parasitoid-free equilibrium; releases could thus be made when the host population is low. Last, we also investigate what happens if, in nature, the mortality rates of these species increase, and we found no big difference with respect to the results obtained using laboratory data.
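For orientation, a minimal sketch of a stochastic discrete-time SIR update of the kind calibrated here (a chain-binomial formulation; the class size, transmission and recovery parameters below are illustrative assumptions, not the thesis's estimates):

```python
import numpy as np

def discrete_sir_step(S, I, R, beta, gamma, rng):
    """One step of a stochastic discrete-time SIR (chain-binomial) model.

    S, I, R: current counts of susceptible, infectious, recovered individuals.
    beta: per-step transmission scale; gamma: per-step recovery probability.
    """
    N = S + I + R
    # Probability that a susceptible escapes infection from all I infectives.
    p_inf = 1.0 - (1.0 - beta / N) ** I
    new_inf = rng.binomial(S, p_inf)   # new infections this step
    new_rec = rng.binomial(I, gamma)   # recoveries this step
    return S - new_inf, I + new_inf - new_rec, R + new_rec

rng = np.random.default_rng(0)
S, I, R = 24, 1, 0   # a single class of 25 students with one index case (illustrative)
for day in range(30):
    S, I, R = discrete_sir_step(S, I, R, beta=0.4, gamma=0.25, rng=rng)
print(S, I, R)
```

In the thesis the transmission probability is further split by contact level (class, grade, school), and the resulting likelihood is explored by MCMC; the sketch above only shows the underlying epidemic update.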
34

Mathematical models for vector-borne disease: effects of periodic environmental variations.

Moschini, Pamela Mariangela January 2015 (has links)
Firstly, I proposed a very simple SIS/SIR model for general vector-borne disease transmission, considering constant population sizes over the season, where contact between the host and the vector responsible for the transmission is assumed to occur only during the summer of each year. I discussed two different types of threshold for pathogen persistence, both of which I computed explicitly: a "short-term threshold" and a "long-term threshold". Later, I took into account the seasonality of the populations involved in the transmission. For a single season, the model consists of a system of nonlinear differential equations describing the various stages of infection transmission between the vector and the host populations. Assuming overwintering in the mosquito population, I simulated the model over several years. Finally, I studied the spatial spread of a vector-borne disease through an impulsive reaction-diffusion model and showed some simulations.
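As background, a standard single-season host-vector skeleton of the kind described (an SIR host coupled to an SI vector; the notation below is a generic assumption, not taken from the thesis) reads:

\[
\begin{aligned}
\dot S_h &= -\,b\,\beta_{vh}\,\frac{I_v}{N_h}\,S_h, &
\dot I_h &= b\,\beta_{vh}\,\frac{I_v}{N_h}\,S_h - \gamma I_h, &
\dot R_h &= \gamma I_h,\\
\dot S_v &= \mu_v N_v - b\,\beta_{hv}\,\frac{I_h}{N_h}\,S_v - \mu_v S_v, &
\dot I_v &= b\,\beta_{hv}\,\frac{I_h}{N_h}\,S_v - \mu_v I_v,
\end{aligned}
\]

with b the biting rate, \beta_{vh} and \beta_{hv} the per-bite transmission probabilities, \gamma the host recovery rate and \mu_v the vector mortality rate; seasonality enters by switching the contact terms on only during summer.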
35

Mathematical modelling of emerging and re-emerging infectious diseases in human and animal populations

Dorigatti, Ilaria January 2011 (has links)
The works presented in this thesis are very different from one another, but they all deal with the mathematical modelling of emerging infectious diseases which, beyond being the leitmotiv of this thesis, is an important research area in the field of epidemiology and public health. A minor but significant part of the thesis has a theoretical flavour. This part is dedicated to the mathematical analysis of the competition model between two HIV subtypes in the presence of vaccination and cross-immunity proposed by Porco and Blower (1998). We find the sharp conditions under which vaccination leads to the coexistence of the strains and, using arguments from bifurcation theory, draw conclusions on the stability of the equilibria, finding that a rather unusual behaviour of hysteresis type might emerge after repeated variations of the vaccination rate within a certain range. Most of this thesis has been inspired by real outbreaks that occurred in Italy over the last 10 years and concerns the modelling of the 1999-2000 H7N1 avian influenza outbreak and of the 2009-2010 H1N1 pandemic influenza. From an applied perspective, parameter estimation is a key part of the modelling process, and in this thesis statistical inference has been performed both within a classical framework (i.e. by maximum likelihood and least squares methods) and in a Bayesian setting (i.e. by Markov Chain Monte Carlo techniques). However, my contribution goes beyond the application of inferential techniques to specific case studies. The stochastic, spatially explicit, between-farm transmission model developed for the transmission of the H7N1 virus has indeed been used to simulate different control strategies and assess their relative effectiveness. The modelling framework presented here for the H1N1 pandemic in Italy constitutes a novel approach that can be applied to a variety of different infections detected by surveillance systems in many countries. We have coupled a deterministic compartmental model with a statistical description of the reporting process and have accounted for the presence of stochasticity in the surveillance system. We thus tackled some challenging statistical issues (such as the estimation of the fraction of H1N1 cases reporting influenza-like-illness symptoms) that had not been addressed before. Last, we apply different estimation methods usually adopted in epidemiology to real and simulated school outbreaks, in an attempt to explore the suitability of a specific individual-based model at reproducing empirically observed epidemics in specific social contexts.
36

The influence of the population contact network on the dynamics of epidemics transmission

Ottaviano, Stefania January 2016 (has links)
In this thesis we analyze the relationship between epidemiology and network theory, starting from the observation that viral propagation between interacting agents is determined by intrinsic characteristics of the population contact network. We aim to investigate how a particular network structure can impact the long-term behavior of epidemics. This field is far too large to be fully discussed; we limit ourselves to networks that are partitioned into local communities, in order to incorporate realistic contact structures into the model. The gross structure of hierarchical networks of this kind can be described by a quotient graph. The rationale of this approach is that individuals infect those belonging to the same community with higher probability than individuals in other communities. We describe the epidemic process as a continuous-time individual-based susceptible–infected–susceptible (SIS) model using a first-order mean-field approximation, both in a homogeneous and in a heterogeneous setting. For this mean-field model we show that the spectral radius of the smaller quotient graph, in connection with the infection and curing rates, is related to the epidemic threshold, and it gives conditions for deciding whether the overall healthy state is a globally asymptotically stable or an unstable equilibrium. Moreover, we show that above the threshold another steady state exists, which can be computed using a lower-dimensional dynamical system associated with the evolution of the process on the quotient graph. Our investigations are based on the graph-theoretical notion of equitable partition and its recent and rather flexible generalization, the almost equitable partition. We also consider the important issue of the control of the infectious disease. Taking into account the connectivity of the network, we provide a cost-optimal distribution of resources to prevent the disease from persisting indefinitely in the population; for a particular case of the two-level immunization problem we report on the construction of a polynomial-time algorithm. In the second part of the thesis we include stochasticity in the model, considering the infection rates in the form of independent stochastic processes. This allows us to derive stochastic differential equations for the probability of infection of each node. We report on the existence of the solution for all times. Moreover, we show that there exist two regions, given in terms of the coefficients of the model, one where the system goes to extinction almost surely, and one where it is stochastically permanent.
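For orientation, the first-order (N-intertwined) mean-field SIS approximation on a contact graph with adjacency matrix A = (a_{ij}), infection rate \beta and curing rate \delta reads (standard form, stated here as background rather than quoted from the thesis):

\[
\frac{dp_i}{dt} \;=\; \beta\,(1-p_i)\sum_{j} a_{ij}\,p_j \;-\; \delta\,p_i ,
\]

where p_i(t) is the probability that node i is infected. The healthy state p \equiv 0 is asymptotically stable when the effective infection rate \beta/\delta lies below 1/\lambda_{\max}(A); this is the spectral-radius threshold referred to above, applied in the thesis to the quotient graph induced by the community partition.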
37

Computational inverse scattering via qualitative methods

Aramini, Riccardo January 2011 (has links)
This Ph.D. thesis presents a threefold revisitation and reformulation of the linear sampling method (LSM) for the qualitative solution of inverse scattering problems (in the resonance region and in time-harmonic regime): 1) from the viewpoint of its implementation (in a 3D setting), the LSM is recast in appropriate Hilbert spaces, whereby the set of algebraic systems arising from an angular discretization of the far-field equation (written for each sampling point of the numerical grid covering the investigation domain and for each sampling polarization) is replaced by a single functional equation. As a consequence, this 'no-sampling' LSM requires a single regularization procedure, thus resulting in an extremely fast algorithm: complex 3D objects are visualized in around one minute without loss of quality compared to the traditional implementation; 2) from the viewpoint of its application (in a 2D setting), the LSM is coupled with the reciprocity gap functional in such a way that the influence of scatterers outside the array of receiving antennas is excluded and an inhomogeneous background inside them can be allowed for: the resulting 'no-sampling' algorithm then proves able to detect tumoural masses inside numerical (but rather realistic) phantoms of the female breast by inverting the data of an appropriate microwave scattering experiment; 3) from the viewpoint of its theoretical foundation, the LSM is physically interpreted as a consequence of the principle of energy conservation (in a lossless background). More precisely, it is shown that the far-field equation at the basis of the LSM (which does not follow from physical laws) can be regarded as a constraint on the power flux of the scattered wave in the far-field region: if the flow lines of the Poynting vector carrying this flux satisfy some regularity properties (as suggested by numerical simulations), the information contained in the far-field constraint is back-propagated to each point of the background up to the near-field region, and the (approximate) fulfilment of such constraint forces the L^2-norm of any (approximate) solution of the far-field equation to behave as a good indicator function for the unknown scatterer, i.e., to be 'small' inside the scatterer itself and 'large' outside.
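As background, in the scalar (acoustic) setting the far-field equation at the core of the LSM is the first-kind integral equation (stated here in its textbook form; the thesis works with its electromagnetic and 'no-sampling' analogues):

\[
\int_{S^2} u_\infty(\hat{x}, d)\, g_z(d)\, ds(d) \;=\; \frac{1}{4\pi}\, e^{-ik\,\hat{x}\cdot z}, \qquad \hat{x}\in S^2,
\]

to be solved, via regularization, for each sampling point z; the norm of the regularized solution g_z then serves as the indicator function that stays small for z inside the scatterer and blows up outside.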
38

Prime Numbers and Polynomials

Goldoni, Luca January 2010 (has links)
This thesis deals with the classical problem of prime numbers represented by polynomials. It consists of three parts. In the first part I collect many results about the problem. Some of them are quite recent, so this part can be considered a survey of the state of the art of the subject. In the second part I present two results due to P. Pleasants about cubic polynomials with integer coefficients in several variables. The aim of this part is to simplify the work of Pleasants and modernize the notation employed. In this way these important theorems are now in a more readable form. In the third part I present some original results related to some algebraic invariants which are the key tools in the works of Pleasants. The hidden diophantine nature of these invariants makes them very difficult to study. Nevertheless, some results are proved, and they make the results of Pleasants somewhat more effective.
39

Selected Topics in Analysis in Metric Measure Spaces

Capolli, Marco 02 February 2021 (has links)
The thesis is composed of three sections, each devoted to the study of a specific problem in the setting of PI spaces. The problems analyzed are: a C^m Lusin approximation result for horizontal curves in the Heisenberg group, a limit result in the spirit of Bourgain-Brezis-Mironescu for Orlicz-Sobolev spaces in Carnot groups, and the differentiability of Lipschitz functions in Laakso spaces.
40

Theoretical and Algorithmic Solutions for Null models in Network Theory

Gobbi, Andrea January 2013 (has links)
The graph-theoretical formulation of the data-driven structure and dynamics of complex systems is rapidly emerging as the paramount paradigm [1] across a variety of disciplines, from economics to neuroscience, with biological -omics as a major example. In this framework, the concept of null model, borrowed from the statistical sciences, is the elective strategy for obtaining a baseline for modelling comparisons [2]. Hereafter, a null model is a graph which matches one specific graph in terms of some structural features, but which is otherwise taken to be generated as an instance of a random network. In this view, the network model introduced by Erdos & Renyi [3], where random edges are generated as independently and identically distributed Bernoulli trials, can be considered the simplest possible null model. In the following years, other null models have been developed in the framework of graph theory, with the detection of community structure as one of the most important targets [4]. In particular, the model described in [5] introduces the concept of a randomized version of the original graph: edges are rewired at random, with each expected vertex degree matching the degree of the vertex in the original graph. Although aimed at building a reference for community detection, this approach plays a key role in one of the models considered in this thesis. Note that, although it was the first problem to be considered, designing null models for community structure detection is still an open problem [6, 7]. Real-world applications of null models in graph theory have also gained popularity in many different scientific areas, with ecology as the first example: see [8] for a comprehensive overview. More recently, interest in network null models has also arisen in computational biology [9, 10], geosciences [11] and economics [12, 13], just to name a few. In the present work the theoretical design and the practical implementation of a series of algorithms for the construction of null models are introduced, with applications ranging from functional genomics to game theory for social studies. The four chapters devoted to the presentation of examples of null models are preceded by an introductory chapter including a quick overview of graph theory, together with all the required notation. The first null model is the topic of the second chapter, where a suite of novel algorithms is presented, aimed at the efficient generation of complex networks under different constraints on the node degrees. Although not the most important example in the thesis, the prominent position dedicated to this topic is due to its close relationship with the aforementioned classical null models for random graph construction. Together with the definition of the algorithms and examples, a thorough theoretical analysis of the proposed solutions is given, highlighting the improvements with respect to the state of the art and the remaining limitations. Apart from their intrinsic mathematical value, the interest in these algorithms from the systems biology community lies in the need for benchmark graphs resembling real biological networks. They are in fact of utmost importance when testing novel inference methods, and as testbeds for network reconstruction challenges such as the DREAM series [14, 15, 16]. Chapter three includes the most complex application of null models presented in this thesis.
The scientific field is again functional genomics, namely the combinatorial approach to the modelling of patterns of mutations in cancer as detected by Next Generation Sequencing exome data. This problem has a natural mathematical representation in terms of rewiring of bipartite networks and mutually exclusively mutated modules [17, 18], to which Markov chain updates (switching steps) are applied through a switching algorithm (SA). Here we show some crucial improvements to the SA, we analytically derive an approximate lower bound for the number of steps required, we introduce BiRewire, an R package implementing the improved SA, and we demonstrate the effectiveness of the novel solution on a breast cancer dataset. A novel threshold-selection method for the construction of co-expression networks based on the Pearson coefficient is the third and last biological example of a null model, and it is outlined in Chapter four. Gene co-expression networks inferred by correlation from high-throughput profiling such as microarray data represent a simple but effective technique for discovering and interpreting linear gene relationships. In recent years several approaches have been proposed to tackle the problem of deciding when the resulting correlation values are statistically significant. This is especially crucial when the number of samples is small, yielding a non-negligible chance that even high correlation values are due to random effects. Here we introduce a novel hard thresholding solution based on the assumption that a co-expression network inferred from randomly generated data is expected to be empty. The theoretical derivation of the new bound by geometrical methods is shown together with two applications in oncogenomics. The last two chapters of the thesis are devoted to the presentation of null models in non-biological contexts. In Chapter 5 a novel dynamic simulation model is introduced, mimicking a random market in which sellers and buyers follow different price distributions and matching functions. The random market is mathematically formulated as a dynamic bipartite graph, and the analytical formula for the evolution in time of the mean exchange price is derived, together with a global likelihood function for retrieving the initial parameters under different assumptions. Finally, in Chapter 6 we describe how graph tools can be used to model abstraction and strategy (see [19, 20, 21]) for a class of games, in particular the TTT solitaire. We show that in this solitaire it is not possible to build an optimal (in the sense of minimum number of moves) strategy by dividing the big problem into smaller subproblems. Nevertheless, we find some subproblems and strategies for solving the TTT solitaire with a negligible increment in the number of moves. Although quite simple and far from simulating highly complex real-world situations of decision making, the TTT solitaire is an important tool for starting the exploration of the social analysis of the trajectories of the implementation of winning strategies through different learning procedures [22].
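For orientation, a minimal sketch of the degree-preserving switching step on which null models of this kind are built (shown here for a simple undirected graph; the bipartite, mutation-matrix version used in BiRewire preserves row and column sums in the same spirit, and the function and parameter names below are illustrative):

```python
import random

def switching_step(edges, edge_set):
    """Attempt one degree-preserving edge swap: (a,b),(c,d) -> (a,d),(c,b).

    edges: list of undirected edges as sorted tuples; edge_set: set view of it.
    The swap is rejected if it would create a self-loop or a duplicate edge.
    """
    i, j = random.sample(range(len(edges)), 2)
    (a, b), (c, d) = edges[i], edges[j]
    new1, new2 = tuple(sorted((a, d))), tuple(sorted((c, b)))
    if a == d or c == b or new1 in edge_set or new2 in edge_set:
        return False                      # rejected; degree sequence is preserved either way
    edge_set.difference_update({edges[i], edges[j]})
    edge_set.update({new1, new2})
    edges[i], edges[j] = new1, new2
    return True

def rewire(edges, n_steps, seed=0):
    """Randomize a graph by repeated switching steps (null model with fixed degrees)."""
    random.seed(seed)
    edges = [tuple(sorted(e)) for e in edges]
    edge_set = set(edges)
    for _ in range(n_steps):
        switching_step(edges, edge_set)
    return edges

print(rewire([(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)], n_steps=100))
```

The number of switching steps needed for adequate mixing is exactly the quantity for which the thesis derives an approximate lower bound.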
