  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
71

Impatto e dialogo delle strade nazionali con l'ambiente : evoluzione lungo i trent'anni di realizzazione nel territorio ticinese / Impact and dialogue of the national roads with the environment: evolution over the thirty years of construction in the Ticino territory /

Sailer, Giorgio. January 1990
Diss. rer. pol. Bern, 1989. / Bibliogr.
72

Theoretical and Algorithmic Solutions for Null models in Network Theory

Gobbi, Andrea January 2013
The graph-theoretical formulation for representing the data-driven structure and dynamics of complex systems is rapidly emerging as the dominant paradigm [1] across a variety of disciplines, from economics to neuroscience, with biological -omics as a major example. In this framework, the concept of a null model, borrowed from the statistical sciences, identifies the elective strategy for obtaining a baseline point of comparison for modelling [2]. Hereafter, a null model is a graph that matches a specific graph in terms of some structural features, but is otherwise taken to be generated as an instance of a random network. In this view, the network model introduced by Erdős and Rényi [3], where random edges are generated as independent and identically distributed Bernoulli trials, can be considered the simplest possible null model. In the following years, other null models were developed in the framework of graph theory, with the detection of community structure as one of the most important targets [4]. In particular, the model described in [5] introduces the concept of a randomized version of the original graph: edges are rewired at random, with each expected vertex degree matching the degree of the vertex in the original graph. Although aimed at building a reference for community detection, this approach plays a key role in one of the models considered in this thesis. Note that, although it was the first problem to be considered, designing null models for community structure detection is still an open problem [6, 7]. Real-world applications of null models in graph theory have also gained popularity in many different scientific areas, with ecology as the first example: see [8] for a comprehensive overview. More recently, interest in network null models has also arisen in computational biology [9, 10], geosciences [11] and economics [12, 13], to name a few.
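The Erdős–Rényi construction mentioned above is simple enough to sketch directly. The following is a minimal illustration (the function name and parameters are ours, not from the thesis), generating each possible edge as an independent Bernoulli trial:

```python
import random

def erdos_renyi(n, p, seed=None):
    """Sample an undirected Erdos-Renyi G(n, p) graph: each of the
    n*(n-1)/2 possible edges is kept as an independent and identically
    distributed Bernoulli(p) trial."""
    rng = random.Random(seed)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p]

# A null-model instance with 100 nodes and edge probability 0.05;
# the expected edge count is p * n * (n - 1) / 2 = 247.5.
edges = erdos_renyi(100, 0.05, seed=42)
```

Matching only the expected number of edges, this is the weakest structural constraint a null model can impose; the degree-matching models discussed next are strictly more faithful to the original graph.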
In the present work the theoretical design and practical implementation of a series of algorithms for the construction of null models are introduced, with applications ranging from functional genomics to game theory for social studies. The four chapters devoted to the examples of null models are preceded by an introductory chapter with a quick overview of graph theory, together with all the required notation. The first null model is the topic of the second chapter, where a suite of novel algorithms is presented, aimed at the efficient generation of complex networks under different constraints on the node degrees. Although not the most important example in the thesis, the prominent position dedicated to this topic is due to its close kinship with the aforementioned classical null models for random graph construction. Together with the definition of the algorithms and examples, a thorough theoretical analysis of the proposed solutions is given, highlighting the improvements with respect to the state of the art and the remaining limitations. Apart from their intrinsic mathematical value, the interest in these algorithms within the systems biology community lies in the need for benchmark graphs resembling real biological networks. They are in fact of utmost importance when testing novel inference methods, and as testbeds for network reconstruction challenges such as the DREAM series [14, 15, 16]. Chapter three includes the most complex application of null models presented in this thesis. The scientific field is again functional genomics, namely the combinatorial approach to modelling patterns of mutations in cancer as detected by Next Generation Sequencing exome data. This problem has a natural mathematical representation in terms of the rewiring of bipartite networks and mutually exclusively mutated modules [17, 18], to which Markov chain updates (switching steps) are applied through a Switching Algorithm (SA).
Here we present some crucial improvements to the SA, analytically derive an approximate lower bound for the number of steps required, introduce BiRewire, an R package implementing the improved SA, and demonstrate the effectiveness of the novel solution on a breast cancer dataset. A novel threshold-selection method for the construction of co-expression networks based on the Pearson coefficient is the third and last biological example of a null model, and it is outlined in Chapter four. Gene co-expression networks inferred by correlation from high-throughput profiling such as microarray data represent a simple but effective technique for discovering and interpreting linear gene relationships. In recent years several approaches have been proposed to tackle the problem of deciding when the resulting correlation values are statistically significant. This is most crucial when the number of samples is small, yielding a non-negligible chance that even high correlation values are due to random effects. Here we introduce a novel hard-thresholding solution based on the assumption that a co-expression network inferred from randomly generated data is expected to be empty. The theoretical derivation of the new bound by geometrical methods is presented, together with two applications in oncogenomics. The last two chapters of the thesis are devoted to null models in non-biological contexts. In Chapter 5 a novel dynamic simulation model is introduced, mimicking a random market in which sellers and buyers follow different price distributions and matching functions. The random market is mathematically formulated as a dynamic bipartite graph, and the analytical formula for the evolution over time of the mean exchange price is derived, together with a global likelihood function for retrieving the initial parameters under different assumptions.
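The empty-null assumption behind such thresholding can be illustrated with a small Monte Carlo stand-in (the thesis derives its bound analytically by geometric methods; this sketch, with names of our choosing, only demonstrates the principle of calibrating a hard threshold on random data):

```python
import math
import random

def pearson(x, y):
    """Plain Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def empirical_null_threshold(n_genes, n_samples, n_trials=100, seed=0):
    """Largest absolute pairwise correlation observed in purely random
    (uncorrelated Gaussian) data: a hard threshold set at this level
    keeps the co-expression network inferred from random data empty."""
    rng = random.Random(seed)
    worst = 0.0
    for _ in range(n_trials):
        data = [[rng.gauss(0.0, 1.0) for _ in range(n_samples)]
                for _ in range(n_genes)]
        for i in range(n_genes):
            for j in range(i + 1, n_genes):
                worst = max(worst, abs(pearson(data[i], data[j])))
    return worst
```

With few samples the threshold typically comes out high, which is exactly the small-sample effect the abstract describes: even large correlation values can arise by chance.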
Finally, in Chapter 6 we describe how graph tools can be used to model abstraction and strategy (see [19, 20, 21]) for a class of games, in particular the TTT solitaire. We show that in this solitaire it is not possible to build an optimal strategy (in the sense of a minimum number of moves) by dividing the big problem into smaller subproblems. Nevertheless, we find some subproblems and strategies for solving the TTT solitaire with a negligible increase in the number of moves. Although quite simple and far from simulating highly complex real-world decision-making situations, the TTT solitaire is an important tool for starting the exploration of the social analysis of the trajectories of the implementation of winning strategies through different learning procedures [22].
73

Modeling the interaction of light with photonic structures by direct numerical solution of Maxwell's equations

Vaccari, Alessandro January 2015
The present work analyzes and describes a method for the direct numerical solution of Maxwell's equations of classical electromagnetism: the FDTD (Finite-Difference Time-Domain) method, along with its implementation in an "in-house" computing code for large parallelized simulations. Both are then applied to the modelling of photonic and plasmonic structures interacting with light. These systems are often too complex, both geometrically and materially, to be mathematically tractable, and an exact analytic solution, in closed form or as a series expansion, cannot be obtained. The only way to gain insight into their physical behavior is thus to seek an approximate, although convergent, numerical solution. This is a current trend in modern physics because, apart from perturbative methods and asymptotic analysis, which represent, where applicable, the typical instruments for dealing with complex physico-mathematical problems, the only general way to approach such problems is the direct approximate numerical solution of the governing equations. Today this choice is made possible by the enormous and widespread computational capabilities offered by modern computers, in particular High Performance Computing (HPC) on parallel machines with a large number of CPUs working concurrently. Computer simulations are now a sort of virtual laboratory, which can be rapidly and cheaply set up to investigate various physical phenomena; computational physics has thus become a sort of third way between the experimental and theoretical branches. The plasmonics application of the present work concerns the scattering and absorption analysis of single and arrayed metal nanoparticles, when surface plasmons are excited by an impinging beam of light, in order to study the radiation distribution inside a silicon substrate behind them. This has potential applications in improving the efficiency of photovoltaic cells.
The photonics application of the present work concerns the analysis of the optical reflectance and transmittance properties of an opal crystal. This is a regular, ordered lattice of macroscopic particles which can stop light propagation in certain wavelength bands, and whose study has potential applications in the realization of low-threshold lasers, optical waveguides and sensors. For the latter, in fact, the crystal response is tuned to its structural parameters and symmetry, and varies as they vary. The present work on the FDTD method represents an enhancement of a previous one made for my MSc Degree Thesis in Physics, now also geared toward the visible and neighboring parts of the electromagnetic spectrum. It is organized in the following fashion. Part I provides an exposition of the basic concepts of electromagnetism which constitute the minimal, although partial, theoretical background needed to formulate the physics of the systems analyzed here, or to be analyzed in possible further developments of the work. It summarizes Maxwell's equations in matter and the time-domain description of temporally dispersive media. It also addresses the plane-wave representation of an electromagnetic field distribution, mainly the far field. The Kirchhoff formula is described and derived, in order to calculate the angular radiation distribution around a scatterer. Gaussian beams in the paraxial approximation are also briefly treated, along with their focalization by means of an approximate diffraction formula useful for their numerical FDTD representation. Finally, a thorough description of planarly multilayered media is included, which can play an important ancillary role in the homogenization procedure for a photonic crystal, as described in Part III, but also in other optical analyses. Part II properly concerns the description and implementation of the FDTD numerical method.
Various aspects of the method are treated, which globally contribute to a working and robust overall algorithm. Particular emphasis is given to those parts representing an enhancement of previous work. These are: the analysis from the existing literature of a new class of absorbing boundary conditions, the so-called Convolutional Perfectly Matched Layer, and their implementation; the analysis from the existing literature and the implementation of the Auxiliary Differential Equation Method for the inclusion of media with frequency-dependent electric permittivity, according to various general polarization models; the description and implementation of a "plane wave injector" for representing impinging beams of light propagating in an arbitrary direction, which can also be used to represent, by superposition, focalized beams; and the parallelization of the FDTD numerical method by means of the Message Passing Interface (MPI) which, by using the suitable user-defined MPI data structures proposed here, results in a robust and scalable code, running on massively parallel High Performance Computing machines such as the IBM BlueGene/Q with a core count of order 2×10^5. Finally, Part III gives the details of the specific plasmonics and photonics applications made with the "in-house" FDTD code, to demonstrate its effectiveness. After Chapter 10, devoted to the validation of the FDTD code implementation against a known solution, Chapter 11 is about plasmonics, with the analytical and numerical study of single and arrayed metal nanoparticles of different shapes and sizes, when surface plasmons are excited on them by a light beam. The presence of a passivating embedding silica layer and a silicon substrate is also included. Chapter 12 is about the FDTD modelling of a face-centered cubic (FCC) opal photonic crystal sample, with a comparison between the numerical and experimental transmittance/reflectance behavior.
A homogenization procedure for the discontinuous lattice structure of the crystal is proposed, by means of an averaging procedure and a planarly multilayered media analysis, through which the reflecting characteristics of the crystal sample can be better understood. Finally, a procedure for the numerical reconstruction of the crystal's banded omega-k dispersion curve inside the first Brillouin zone is proposed. Three appendices providing details on specific topics dealt with during the exposition conclude the work.
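As a rough illustration of the FDTD scheme at the heart of this work, here is a minimal one-dimensional vacuum Yee update (a textbook sketch, unrelated to the "in-house" parallel code; all names and parameter values are illustrative):

```python
import math

def fdtd_1d(n_cells=200, n_steps=150, source_cell=100):
    """Minimal 1D vacuum FDTD (Yee) sketch: E and H live on staggered
    grids and leapfrog in time. Units are normalized so that the Courant
    factor S = c*dt/dx equals 1, the 1D stability limit."""
    ez = [0.0] * n_cells  # electric field at integer grid points
    hy = [0.0] * n_cells  # magnetic field at half-integer grid points
    for t in range(n_steps):
        for k in range(n_cells - 1):      # H update from the curl of E
            hy[k] += ez[k + 1] - ez[k]
        for k in range(1, n_cells):       # E update from the curl of H
            ez[k] += hy[k] - hy[k - 1]
        # Soft Gaussian source injected at the center of the grid.
        ez[source_cell] += math.exp(-((t - 30.0) ** 2) / 100.0)
    return ez
```

The real code adds, among much else, CPML absorbing boundaries, dispersive-media updates and MPI domain decomposition; in this sketch the grid edges simply reflect.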
74

Exponential integrators: tensor structured problems and applications

Cassini, Fabio 21 April 2023
The solution of stiff systems of Ordinary Differential Equations (ODEs), which typically arise after spatial discretization of many important evolutionary Partial Differential Equations (PDEs), constitutes a topic of wide interest in numerical analysis. A prominent way to numerically integrate such systems is to use exponential integrators. In general, these schemes do not require the solution of (non)linear systems but rather the action of the matrix exponential and of some specific exponential-like functions (known in the literature as φ-functions). In this PhD thesis we present efficient tensor-based tools to approximate such actions, both from a theoretical and from a practical point of view, when the problem has an underlying Kronecker sum structure. Moreover, we investigate the application of exponential integrators to the numerical solution of important equations in various fields, such as plasma physics, mean-field optimal control and computational chemistry. In all cases, we provide several numerical examples and perform extensive simulations, exploiting modern hardware architectures such as multi-core Central Processing Units (CPUs) and Graphics Processing Units (GPUs). The results globally show the effectiveness and superiority of the proposed approaches.
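To illustrate what an exponential integrator looks like in the scalar case, here is a sketch of the first-order exponential (ETD1) Euler scheme built on the φ1 function, applied to a classic stiff test problem (the function names and the test problem are our choices, not the thesis's):

```python
import math

def phi1(z):
    """phi_1(z) = (exp(z) - 1)/z, the first of the exponential-like
    phi-functions, with a Taylor fallback near z = 0."""
    if abs(z) < 1e-8:
        return 1.0 + z / 2.0
    return (math.exp(z) - 1.0) / z

def exponential_euler(lam, g, u0, t0, t1, n):
    """Exponential (ETD1) Euler for u' = lam*u + g(t): the stiff linear
    part is propagated exactly through exp and phi_1; only g is frozen
    over each step, so lam imposes no stability restriction on h."""
    h = (t1 - t0) / n
    z = lam * h
    u, t = u0, t0
    for _ in range(n):
        u = math.exp(z) * u + h * phi1(z) * g(t)
        t += h
    return u

# Classic stiff test: u' = lam*(u - cos t) - sin t with u(0) = 1,
# whose exact solution is u(t) = cos t.
lam = -1000.0
g = lambda t: -lam * math.cos(t) - math.sin(t)
u1 = exponential_euler(lam, g, 1.0, 0.0, 1.0, 50)
# u1 stays close to cos(1) even though h*|lam| = 20, a regime where
# the explicit Euler method would be violently unstable.
```

In the matrix case the expensive part is precisely the action of exp(hA) and φ1(hA) on vectors, which is where the Kronecker-sum structure exploited in the thesis comes in.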
75

Mathematical models for host-parasitoid interactions and biological control of Drosophila suzukii

Pfab, Ferdinand January 2017
This thesis treats mathematical models for host-parasitoid interactions. It is composed of three parts. In the first part, a class of such models is analyzed theoretically, focusing on the phenomenon of multiple coexistence equilibria of competing parasitoid species. The second part concerns a model for determining how a parasitoid release should be timed to optimally control the invasive fruit fly Drosophila suzukii. The third part analyzes an experiment on releasing parasitoids in a greenhouse infested by D. suzukii. The models presented are used to discuss how to improve such biological control strategies.
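As a minimal example of the host-parasitoid model class, the classic Nicholson–Bailey map can serve as a textbook stand-in (the thesis's models are more elaborate; parameter values here are illustrative):

```python
import math

def nicholson_bailey(h0, p0, r=2.0, a=0.05, c=1.0, n_gen=20):
    """Classic Nicholson-Bailey host-parasitoid map:
        H_{t+1} = r * H_t * exp(-a * P_t)        (hosts escaping parasitism)
        P_{t+1} = c * H_t * (1 - exp(-a * P_t))  (parasitized hosts)
    Returns the trajectory [(H_0, P_0), ..., (H_n, P_n)]."""
    h, p = float(h0), float(p0)
    traj = [(h, p)]
    for _ in range(n_gen):
        escaping = math.exp(-a * p)  # fraction of hosts not parasitized
        h, p = r * h * escaping, c * h * (1.0 - escaping)
        traj.append((h, p))
    return traj
```

The bare map produces diverging oscillations; density dependence and the release timing studied in the thesis are what make such models usable for biological control.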
76

Numerical Methods for Optimal Control Problems with Application to Autonomous Vehicles

Frego, Marco January 2014
In the present PhD thesis a suite of optimal control problems is proposed as a benchmark for testing numerical solvers. The problems are divided into four categories: classic, singular, constrained and hard problems. Apart from the hard problems, for which only some details can be given rather than an analytical solution, all other problems are supplied with the derivation of their solution. The exact solution allows a precise comparison of the performance of the considered software. All of the proposed problems were taken from published papers or books, but it turned out that an analytic exact solution was only rarely provided; thus a true and reliable comparison among numerical solvers could not be done before. A typical wrong conclusion, when a solver obtains a lower value of the target functional than other solvers, is to claim it better than the others, without recognizing that it has only underestimated the true value. In this thesis, a cutting-edge application of optimal control to vehicles is also shown: the optimization of the lap time on a race circuit track considering a number of realistic constraints. A new path-planning algorithm is described in full for the construction of a quasi-G2 fitting of GPS data with a clothoid spline in terms of the G1 Hermite interpolation problem. The present algorithm is shown to outperform state-of-the-art algorithms in both efficiency and precision.
77

The importance of climatic and ecological factors for vector-borne infections: Culex pipiens and West Nile virus

Marini, Giovanni January 2017
About three quarters of human emerging infectious diseases are caused by zoonotic pathogens, and many of them are spread by vectors such as mosquitoes. Mathematical models nowadays represent very powerful tools for investigating and making predictions about biological dynamical systems, providing insights that can be extremely valuable for several purposes. In this thesis, we focus on a particular mosquito-borne zoonosis, West Nile virus (WNV), a flavivirus of emerging public health relevance in Europe and North America, and its main European vector, the mosquito Culex pipiens. As the transmission of mosquito-borne diseases is largely driven by the abundance of the vector, understanding the population dynamics of existing vector populations, and how they depend on biotic and environmental factors, is crucial for designing appropriate control strategies. This thesis presents new mathematical models that provide insights into several aspects of mosquito population dynamics, using different statistical and computational approaches, including linear models and Markov chain Monte Carlo techniques. Specifically, they study the effect of biotic and abiotic factors on Cx. pipiens dynamics by using adult mosquito trapping data, gathered over several years in Northern Italy, to feed theoretical models. Furthermore, the effects of host competition and vector feeding preferences on the dynamics of a vector-borne infection (such as WNV) are investigated through a more theoretical study.
78

Some optimal visiting problems: from a single player to a mean-field type model

Marzufero, Luciano 19 July 2022
In an optimal visiting problem, we want to control a trajectory that has to pass as close as possible to a collection of target points or regions. We introduce a hybrid control-based approach to the classic problem, in which the trajectory can switch between a group of discrete states related to the targets of the problem. The model is subsequently adapted to a mean-field game framework, that is, when a huge population of agents plays the optimal visiting problem with controlled dynamics and with costs also depending on the distribution of the population. In particular, we investigate a single continuity equation with possible sinks and sources, where the field may depend on the mass of the agents. The same problem is also studied in a network framework. More precisely, we study a mean-field game model by proving the existence of a suitably defined approximate mean-field equilibrium and then addressing the passage to the limit.
79

CARATTERIZZAZIONE DELLA MICOFLORA ASSOCIATA AI PRODOTTI CARNEI STAGIONATI SUINI CON PARTICOLARE RIFERIMENTO ALLA PRESENZA DI PENICILLIUM NORDICUM ED AL SUO BIOCONTROLLO / CHARACTERIZATION OF THE MYCOFLORA ASSOCIATED TO DRY CURED PORK MEAT PRODUCTS WITH FOCUS ON PENICILLIUM NORDICUM AND ITS BIOCONTROL

SPADOLA, GIORGIO 19 February 2014
Penicillium nordicum è un importante contaminante di salumi, rappresentando il 10% e il 26% della popolazione di Penicillium spp. isolati, rispettivamente, dall'aria e dai prodotti carnei stagionati in un'indagine gestita in Italia (Battilani et al., 2007). Diverse colonie di P. nordicum isolate dai salumi hanno dimostrato di essere importanti produttori di ocratossina A, OTA (Sansom e Frisvad, 2004; Pietri et al., 2006; Battilani et al., 2010). Attualmente, l'impostazione appropriata delle condizioni ambientali (temperatura, umidità relativa e circolazione dell'aria) è l'unico strumento accettato per impedire la crescita incontrollata di P. nordicum all'interno degli impianti di stagionatura, attraverso un'accurata analisi dei punti critici di controllo e l'ideazione di un relativo piano HACCP (Hazard Analysis and Critical Control Points) ben strutturato (Asefa et al., 2011; Virgili et al., 2012). Anche se il sistema HACCP è stato applicato con successo nel settore alimentare, ci sono rischi per la sicurezza alimentare non attentamente considerati. Questo è particolarmente vero per quanto riguarda i rischi micotossigeni associati ai prodotti alimentari di origine animale. Il termine "rischi micotossigeni" è utilizzato da Asefa et al. (2011) per descrivere lieviti patogeni e metaboliti secondari tossici prodotti da specie fungine tossigene che contaminano i prodotti alimentari e incidono sulla sicurezza alimentare. La maggior parte dei piani HACCP nelle attività di trasformazione alimentare, come ad esempio la produzione di formaggi e di prodotti carnei stagionati, tiene in considerazione principalmente il rischio derivante da agenti batterici (Arvanitoyannis e Mavropoulos, 2000; Barbuti e Parolari, 2002), anche se tali prodotti alimentari vengono spesso contaminati da funghi micotossigeni e dai loro metaboliti (Spotti et al., 1989; Spotti et al., 2001a; Battilani et al., 2007).
Pertanto, dovrebbe essere cruciale definire un piano HACCP specificamente incentrato sui rischi micotossigeni. L'identificazione, il controllo e la standardizzazione della micoflora di superficie dei salumi sono fondamentali per preservare la sicurezza delle produzioni e la salute dei consumatori. Questo è il contesto in cui devono essere valutate l'efficacia e l'affidabilità dei metodi di identificazione delle popolazioni di Penicillium spp. di interesse per la produzione alimentare. In questo contesto, il progetto di ricerca di questa tesi di dottorato ha cercato di approfondire le conoscenze su tali tematiche con l'intento di limitare il rischio micotossigeno nella catena di produzione dei prodotti carnei stagionati. Sono stati affrontati i seguenti argomenti: 1. studio della composizione e della dinamica della microflora fungina presente sulla superficie dei salumi (prodotto testato: salame) e nell'aria degli ambienti di stagionatura, tenendo conto dell'influenza di alcuni parametri di processo (inoculo starter, temperatura, fase produttiva). 2. sviluppo di un metodo MALDI-TOF MS per l'identificazione di Penicillium a livello di specie, in prospettiva di futuri screening diretti della microflora presente sui salumi. 3. confronto e integrazione di diverse tecniche, come l'analisi morfologica, l'analisi molecolare e l'analisi tramite spettrometria di massa, per l'identificazione delle specie di Penicillium presenti nei salumi. 4. valutazione di lieviti selezionati, isolati dalla superficie di prosciutto crudo, per competere con P. nordicum e inibire l'accumulo di OTA, nella prospettiva del loro uso come starter superficiali con funzione di agenti di biocontrollo. / Penicillium nordicum is an important contaminant of cured meat products, representing 10% and 26% of the Penicillium spp. isolated, respectively, from the air and the products in a survey managed in Italy (Battilani et al., 2007). Several P.
nordicum isolates from cured meat proved to be important producers of ochratoxin A (OTA) (Sansom and Frisvad, 2004; Pietri et al., 2006; Battilani et al., 2010). Currently, the appropriate setting of environmental conditions (temperature, relative humidity and air circulation) is the only accepted tool to prevent the uncontrolled growth of P. nordicum inside dry-curing plants, through a carefully structured Hazard Analysis and Critical Control Points (HACCP) plan (Asefa et al., 2011; Virgili et al., 2012). Even though the HACCP system has been successfully applied in the food industry, some food safety hazards are not carefully considered. This is especially true with regard to mycotoxigenic hazards associated with animal food products. The term "mycotoxigenic hazards" is used by Asefa et al. (2011) to describe pathogenic yeasts and toxic secondary metabolites of toxigenic moulds that contaminate food products and affect food safety. Most HACCP plans in food processing activities, such as the production of cheese and dry-cured meat products, consider mainly bacterial agents (Arvanitoyannis and Mavropoulos, 2000; Barbuti and Parolari, 2002), even though such food products often get contaminated with mycotoxigenic fungi and their metabolites (Spotti et al., 1989; Spotti et al., 2001a; Battilani et al., 2007). Therefore, it is crucial to define a HACCP plan specifically focused on mycotoxigenic hazards. The identification, control and standardization of the surface mycoflora of cured meat products are mandatory to preserve the safety of the products and the health of consumers. This is the context in which the effectiveness and reliability of identification methods for Penicillium spp. of interest for food production must be evaluated. Against this background, the research project of this PhD thesis aimed to fill some knowledge gaps in an attempt to limit the mycotoxigenic risk in the cured meat products chain. The following topics were addressed: 1.
study of the composition and dynamics of the fungal microflora present on the surface of cured meat products (salami) and in the air of the curing environments, taking into account the influence of some process parameters (starter inoculum, curing temperature, stage of seasoning). 2. development of a MALDI-TOF MS method for the identification of Penicillium at species level, with a view to future direct screening of the microflora present on cured meat products. 3. comparison and integration of different techniques, such as morphological, molecular and mass-spectrometry analyses, for the identification of Penicillium species in cured meat products. 4. evaluation of selected yeasts, isolated from the surface of dry-cured ham, for their ability to compete with P. nordicum and to inhibit OTA accumulation, in the perspective of their use as surface starter biocontrol agents.
80

La distanza conta: Tre elaborati in Economia Spaziale / DISTANCE MATTERS: THREE ESSAYS IN SPATIAL ECONOMIC ANALYSIS

CALEGARI, ELENA 27 May 2016
Waldo Tobler, con la sua prima legge della geografia, afferma: "Ogni cosa è correlata con qualsiasi altra, ma le cose vicine sono più relazionate di quelle lontane" (Tobler, 1970). Se questo era certamente vero nel 1970, tale convinzione è stata messa in discussione con l'avvento delle Tecnologie dell'Informazione e della Comunicazione (ICT). Nel dibattito riguardo al processo di globalizzazione molti studiosi e giornalisti sostengono infatti che, con la velocizzazione delle telecomunicazioni, la distanza fisica sia destinata a perdere il proprio potere esplicativo relativamente a molti fenomeni socio-economici (Cairncross, 2001; Friedman, 2005). Questa dissertazione vuole contribuire al dibattito rispondendo, seppure parzialmente, alla domanda "La distanza importa ancora?" e definire alcune possibili implicazioni di policy. L'obiettivo è mostrare il ruolo della distanza geografica in tre diversi contesti economici caratterizzati da differenti dimensioni dell'unità di analisi. I risultati suggeriscono che, anche se su scala globale lo sviluppo delle nuove tecnologie ha modificato la percezione individuale della distanza come deterrente alle interazioni, lo spazio geografico mantiene ancora la propria rilevanza nel definire le relazioni socio-economiche locali, aumentando il ruolo di città e regioni quali centri della maggior parte delle attività economiche. / Waldo Tobler, with his first law of geography, stated: "Everything is related to everything else, but near things are more related than distant things" (Tobler, 1970). While this was certainly true in 1970, the belief has been called into question in the era of Information and Communication Technologies (ICTs). In the debate over globalization, several scholars and journalists indeed argue that, with the increasing speed of telecommunications, physical distance is losing its explanatory power as a determinant of socio-economic relationships (Cairncross, 2001; Friedman, 2005).
This dissertation aims to contribute to this debate, partially answering the broad question "Does distance still matter?" and drawing possible policy implications. The purpose is to show the role of geographical distance in three different economic environments, characterized by different sizes of the unit of analysis. The results suggest that, even if at the global scale improvements in ICTs have changed the individual perception of distance as a deterrent to interaction, geographical space still maintains its relevance in defining local socio-economic relationships, increasing the role of cities and regions as the core of most economic activities.
