1

Optimization in graphs under degree constraints. Application to telecommunication networks

Sau Valls, Ignasi 16 October 2009 (has links)
The first part of this thesis deals with traffic grooming in telecommunication networks. Traffic grooming is the aggregation of low-rate traffic flows into pipes of higher rate. However, each insertion or extraction of traffic on a wavelength requires placing an add-drop multiplexer (ADM) at the corresponding network node, and one ADM is needed for each wavelength used at the node, which represents a significant equipment cost. The objectives of traffic grooming are, on the one hand, the efficient sharing of the bandwidth and, on the other hand, the reduction of the cost of the routing equipment. We present inapproximability results, approximation algorithms, a new model that allows the network to route any request graph of bounded degree, as well as optimal solutions for two scenarios with all-to-all traffic: the bidirectional ring and the unidirectional ring with a grooming factor that changes dynamically. The second part of the thesis deals with problems of finding subgraphs under degree constraints. This class of problems is more general than traffic grooming, which is a particular case. The goal is to find subgraphs of a given graph with constraints on the degree, while optimizing some graph parameter (very often, the number of vertices or edges). We present approximation algorithms, inapproximability results, parameterized complexity studies, exact algorithms for planar graphs, as well as a general methodology that efficiently solves this class of problems (and, more generally, the class of problems whose solutions can be encoded by a partition of a subset of the vertices) for graphs embedded in a surface.
Finally, several appendices present results on related problems.
2

Random combinatorial structures with low dependencies : existence and enumeration

Perarnau Llobet, Guillem 01 October 2013 (has links)
In this thesis we study different problems in combinatorics and graph theory by means of the probabilistic method. This method, introduced by Erdős, has become an extremely powerful tool for providing existential proofs for certain problems in different mathematical branches where other methods had failed. One of its main concerns is to study the behavior of random variables. In particular, one common situation arises when these random variables count the number of bad events that occur in a combinatorial structure.
The idea of the Poisson Paradigm is to estimate the probability that none of these bad events happens when the dependencies among them are weak or rare. If this is the case, this probability should behave similarly to the case where all the events are mutually independent. This idea is reflected in several well-known tools, such as the Lovász Local Lemma or Suen's inequality. The goal of this thesis is to study these techniques, either by setting out new versions, by refining the existing ones for particular cases, or by providing new applications of them to different problems in combinatorics and graph theory. Next, we enumerate the main contributions of this thesis. The first part of this thesis extends a result of Erdős and Spencer on Latin transversals [1]: an integer matrix in which no number appears too many times admits a Latin transversal, that is, a transversal whose entries are pairwise distinct. This is equivalent to studying rainbow matchings of edge-colored complete bipartite graphs. Under the same hypothesis, we provide enumerating results on such rainbow matchings; the techniques used are based on the strategy developed by Lu and Székely. The second part of the thesis deals with identifying codes: sets of vertices such that every vertex of the graph has a distinct neighborhood within the code. We provide bounds on the size of a minimal identifying code in terms of the degree parameters and partially answer a question of Foucaud et al. In another chapter of the thesis, we show that any dense enough graph has a very large spanning subgraph that admits a small identifying code. In some cases, proving the existence of a certain object is trivial; however, the same techniques allow us to obtain enumerative results. The study of permutation patterns is a good example of that. In the third part of the thesis we devise a new approach to estimate how many permutations of a given length avoid consecutive copies of a given pattern. In particular, we provide upper and lower bounds for this number.
One of the consequences derived from our approach is a proof of the CMP conjecture stated by Elizalde and Noy, as well as some new results on the behavior of most patterns. In the last part of this thesis, we focus on the Lonely Runner Conjecture, posed independently by Wills and Cusick, which has multiple applications in different mathematical fields. This well-known conjecture states that for any set of runners running along the unit circle with pairwise distinct constant speeds and starting from the same point, there is a moment at which all of them are far enough from the origin. We improve the result of Chen on the gap of loneliness by studying the time when two runners are close to the origin. We also show an invisible-runner-type result, extending a theorem of Czerwiński and Grytczuk.
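For context (a standard statement, not part of the abstract), the symmetric form of the Lovász Local Lemma mentioned above reads: if each of the bad events $A_1, \dots, A_n$ occurs with probability at most $p$, each depends on at most $d$ of the others, and $e\,p\,(d+1) \le 1$, then

$$\Pr\Big[\bigcap_{i=1}^{n} \overline{A_i}\Big] \;\ge\; \Big(1 - \frac{1}{d+1}\Big)^{n} \;>\; 0,$$

so with positive (though possibly exponentially small) probability none of the bad events occurs.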
3

Combinatorial dynamics of strip patterns of quasiperiodic skew products in the cylinder

Morales López, Leopoldo 26 March 2016 (has links)
The thesis consists of two parts. In the first, we extend the results and techniques of Fabbri et al. (2005) to study the combinatorial dynamics, the forcing relation, and the topological entropy of certain quasiperiodically forced skew products on the cylinder. As a corollary, this theory yields a more structured proof of the Sharkovsky Theorem for this case, first proved in Fabbri et al. (2005). Concerning entropy, we define the notion of a horseshoe in this context and show, as in the interval case, that if one of these maps has an s-horseshoe then its topological entropy is at least log s. From this we deduce lower bounds on the entropy in terms of the periods of the periodic orbits, extending the analogous interval results to this setting. In the above context the following question arises naturally: does the Sharkovsky Theorem hold when restricted to curves instead of general strips? The aim of the second part of the thesis is to answer this question in the negative by means of a counterexample: we construct a map that has a periodic orbit of period 2 consisting of curves (which are, in fact, the upper and lower circles of the cylinder) but no invariant curve (it only has an invariant pseudo-curve). In particular, this shows that there exist quasiperiodically forced skew products on the cylinder without invariant curves. This is the first analytic result of this kind in the literature, despite previous numerical evidence in this direction. These results are only a first stage in the analytic/topological understanding of the dynamics of these maps, which opens a line of future work.
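As background (classical interval-map theory, not part of the abstract), the Sharkovsky ordering of the positive integers is

$$3 \,\triangleright\, 5 \,\triangleright\, 7 \,\triangleright\, \cdots \,\triangleright\, 2\cdot 3 \,\triangleright\, 2\cdot 5 \,\triangleright\, \cdots \,\triangleright\, 2^2\cdot 3 \,\triangleright\, 2^2\cdot 5 \,\triangleright\, \cdots \,\triangleright\, 2^3 \,\triangleright\, 2^2 \,\triangleright\, 2 \,\triangleright\, 1,$$

and Sharkovsky's Theorem states that if a continuous map of the interval has a periodic orbit of period $m$ and $m \triangleright n$, then it also has a periodic orbit of period $n$.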
4

On the structure of graphs without short cycles

Salas Piñón, Julián 20 December 2012 (has links)
The objective of this thesis is to study cages: constructions and properties of such families of graphs. To this end, the study of graphs without short cycles plays a fundamental role in developing some knowledge of their structure, so that we can later deal with problems on cages. Cages were introduced by Tutte in 1947. In 1963, Erdős and Sachs proved that (k, g)-cages exist for any given values of k and g. Since then, a large amount of research on cages has been devoted to their construction. In this work we study structural properties such as the connectivity, diameter, and degree regularity of graphs without short cycles. In some sense, connectivity is a measure of the reliability of a network. Two graphs with the same edge-connectivity may nevertheless be considered to have different reliabilities; as a more refined index than the edge-connectivity, the edge-superconnectivity is proposed, together with some other parameters called restricted connectivities. By relaxing the conditions imposed on graphs to be cages, we can establish more refined connectivity properties for these families, and we also gain an approach to the structural properties of the family of graphs with more restrictions (i.e., the cages). Our aim, by studying such structural properties of cages, is to get a deeper insight into their structure so that we can attack the problem of their construction. By way of example, we studied a condition on the diameter in relation to the girth pair of a graph, and as a corollary we obtained a result guaranteeing restricted connectivity for a special family of graphs arising from geometry, such as polarity graphs. We also obtained a result proving the edge-superconnectivity of semiregular cages. Based on these studies it was possible to further develop the study of cages, obtaining a relevant result with respect to the connectivity of cages, namely that cages are k/2-connected.
Moreover, arising from the previous work on girth pairs, we obtained constructions for girth pair cages that prove a bound conjectured by Harary and Kovács relating the order of girth pair cages to that of cages. Concerning the degree and the diameter, there is the concept of a Moore graph, introduced by Hoffman and Singleton and named after Edward F. Moore, who posed the question of describing and classifying these graphs. As well as having the maximum possible number of vertices for a given combination of degree and diameter, Moore graphs have the minimum possible number of vertices for a regular graph with given degree and girth; that is, any Moore graph is a cage. The formula for the number of vertices in a Moore graph can be generalized to allow a definition of Moore graphs with even girth (bipartite Moore graphs) as well as odd girth, and again these graphs are cages. Thus, Moore graphs give a lower bound on the order of cages, but they are known to exist only for very specific values of k, so it is interesting to study how far a cage is from this bound; this value is called the excess of a cage. We studied the excess of graphs and made a contribution, in the spirit of the work of Biggs and Ito, relating the bipartition of girth 6 cages to their orders. Entire families of cages can be obtained from finite geometries: for example, the incidence graphs of projective planes of order q, with q a prime power, are (q+1, 6)-cages. Also, by using other incidence structures such as generalized quadrangles or generalized hexagons, families of cages of girths 8 and 12 can be obtained. In this thesis, we present a construction of an entire family of girth 7 cages that arises from some combinatorial properties of the incidence graphs of generalized quadrangles of order (q, q).
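For reference (a standard fact rather than a result of the thesis), the Moore bound mentioned above gives the minimum possible order $n_0(k,g)$ of a $k$-regular graph of girth $g$:

$$n_0(k,g) \;=\; \begin{cases} 1 + k \displaystyle\sum_{i=0}^{(g-3)/2} (k-1)^i, & g \text{ odd}, \\[2ex] 2 \displaystyle\sum_{i=0}^{g/2-1} (k-1)^i, & g \text{ even}, \end{cases}$$

and the excess of a $(k,g)$-cage is the difference between its order and $n_0(k,g)$.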
5

Collected results on semigroups, graphs and codes

Vico Oton, Albert 22 October 2012 (has links)
In this thesis we present a compendium of five works in which discrete mathematics plays a key role. The first three describe different developments and applications of semigroup theory, while the other two have more independent topics. First we present a result on semigroups and code efficiency, where we introduce our results on the so-called Geil-Matsumoto bound and Lewittes' bound for algebraic geometry codes. Following that, we work on semigroup ideals and their relation with the Feng-Rao numbers; those numbers, in turn, are used to describe the Hamming weights, which are used in a broad spectrum of applications, e.g. the wire-tap channel of type II or the t-resilient functions used in cryptography. The third work describes non-homogeneous patterns for semigroups, explains three different scenarios where these patterns arise, and gives some results on their admissibility. The last two works are not as related as the first three but still use discrete mathematics. One of them is a work on the applications of coding theory to fingerprinting, where we give results on the traitor-tracing problem and bound the number of colluders in a colluder set trying to hack a fingerprinting mark made with a Reed-Solomon code. Finally, in the last work we present our results on scientometrics and graphs, modeling the scientific community as a cocitation graph, where nodes represent authors and two nodes are connected if there is a paper citing both authors simultaneously. We use it to present three new indices to evaluate an author's impact in the community.
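As an illustrative sketch (not code from the thesis), the cocitation graph just described can be built from a list of papers, each represented by the set of authors it cites; the author names below are hypothetical toy data:

```python
from collections import defaultdict
from itertools import combinations

def cocitation_graph(papers):
    """Authors are nodes; two authors are joined by an edge whenever
    at least one paper cites both of them simultaneously."""
    adj = defaultdict(set)
    for cited_authors in papers:
        for a, b in combinations(sorted(cited_authors), 2):
            adj[a].add(b)
            adj[b].add(a)
    return dict(adj)

# Hypothetical data: each set lists the authors cited by one paper.
papers = [{"Erdos", "Renyi"}, {"Erdos", "Lovasz"}, {"Erdos", "Lovasz", "Renyi"}]
graph = cocitation_graph(papers)
print(sorted(graph["Erdos"]))  # → ['Lovasz', 'Renyi']
```

Impact indices such as the three proposed in the thesis would then be computed from this adjacency structure (for instance from degrees or neighborhood overlaps).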
6

Propagació d'informació en grafs i digrafs que modelen xarxes d'interconnexió simètriques

Mitjana, Margarida 11 March 1999 (has links)
The aim of this thesis is to deepen the study of a certain family of digraphs, the cycle prefix digraphs, giving new details about their structure, new ways of approaching their study, and designing good communication schemes. This completes the study initiated by other authors and reinforces the interest of these digraphs as a good model of interconnection network.
7

Mètodes gràfics i estadístics per a la detecció de valors extrems

Padilla Cozar, Maria 05 February 2016 (has links)
Extreme value theory (EVT) is the only statistical discipline that develops techniques and models to describe unusual rather than usual behavior; its main objective is the estimation of quantiles corresponding to very extreme events. In many areas of application, a typical requirement is to estimate the value at risk at a certain level (VaR), high enough that the probability of exceeding it is smaller than a given amount. The probability theory associated with EVT is well established, thanks to the results of Fisher and Tippett (1923), Balkema and de Haan (1974), and Pickands (1975). Two main approaches were developed: the block maxima method and the method of excesses over a threshold, whose application presents difficulties when statistical tools are applied, Diebold et al. (1998). The determination of the threshold from which the limit distribution can be used, and of its behavior, are the main problems to deal with. To distinguish the general data from those under study, we will use the concept of tail, which refers to the values that lie above a sufficiently high value. For excesses over a threshold, the asymptotic distribution that characterizes the behavior of the tail is the Generalized Pareto distribution (GPD); its shape parameter, called the extreme value index, classifies tails as heavy, exponential, or light. The application of the GPD model is extensively detailed in McNeil et al. (2005) and Embrechts et al. (1997), but there are limitations, for example when not enough moments exist, or the subjectivity that arises when graphical methods are used. The aim of this thesis is to present new tools for EVT that can be used for threshold selection and for the estimation of the extreme value index, and that solve some of the existing problems. Chapter 1 is a review of the statistical theory of extreme values. It recalls the most widely used graphical methods, the mean excess plot and the Hill plot, as well as the estimation methods available for the GPD.
Finally, a new graphical method called the CV plot, Castillo et al. (2014), and a recently introduced approach for heavy and exponential tails, Castillo et al. (2013), are presented. In Chapter 2, the fact that the residual coefficient of variation characterizes distributions, Gupta and Kirmani (2000), is used to find the theoretical CV plot for some distributions and to apply the asymptotic theory to the case of the GPD, provided that moments of order four exist. Thanks to a transformation, presented in Chapter 3, the CV plot can be applied in any situation. Chapter 4 presents statistics that allow us to estimate the extreme value index and to test the GPD hypothesis, and that are used in an automatic threshold selection algorithm. This third point is a major breakthrough for the application of the Peak over Threshold method, which requires estimating the threshold with a graphical method and estimating the extreme value index by maximum likelihood, Coles (2001), since the researcher's subjectivity disappears when the threshold is estimated. Chapter 5 is dedicated to the study of 16 data sets from embedded computer systems. The CV plot and the Tm statistics have been used, obtaining good results in most cases. In Chapter 6, we again apply the new tools to the Danish fire insurance data, McNeil (1997), and to the financial data analyzed in Gomes and Pestana (2007). Finally, Chapter 7 presents the conclusions of this work and sets out new lines of research that can be followed.
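A minimal peaks-over-threshold sketch (illustrative only and not the thesis code: it uses textbook method-of-moments GPD estimates rather than the CV-based statistics of Chapter 4, and a hand-picked threshold rather than the automatic selection algorithm):

```python
import random
from statistics import mean, variance

def peaks_over_threshold(data, u):
    """Excesses of the sample above the threshold u."""
    return [x - u for x in data if x > u]

def gpd_fit_mom(excesses):
    """Method-of-moments estimates (xi, sigma) of the GPD shape and scale,
    valid when the second moment of the excesses exists (xi < 1/2)."""
    m, s2 = mean(excesses), variance(excesses)
    xi = 0.5 * (1.0 - m * m / s2)
    sigma = 0.5 * m * (m * m / s2 + 1.0)
    return xi, sigma

random.seed(42)
# Exponential data: by memorylessness, its excesses over any threshold are
# again Exp(1), i.e. a GPD with xi = 0 (exponential tail) and sigma = 1.
data = [random.expovariate(1.0) for _ in range(20000)]
xi, sigma = gpd_fit_mom(peaks_over_threshold(data, u=1.0))
print(round(xi, 2), round(sigma, 2))  # estimates should be near 0 and 1
```

The extreme value index estimate `xi` then classifies the tail: positive for heavy tails, near zero for exponential tails, negative for light tails.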
8

Hybrid methodologies for symmetric and asymmetric vehicle routing problems

Herrero Antón, Rosa 25 January 2016 (has links)
Over the last decades, globalization has driven the adaptation of the transport and logistics sector to new social demands. At the same time, transport has been the backbone of globalization. This social need creates ambitious consumers who want their products quickly and at an affordable price, often unaware of their origin, transport mode, or environmental aspects, among other factors. Nevertheless, to satisfy customer demands it is necessary to find the cheapest transport mode, which in turn means improving the transport logistics of these products. Therefore, these demands require an increasingly flexible service to meet customer requirements and, in addition, companies want efficient and productive transport.
The so-called Traveling Salesman Problems (TSP) and Vehicle Routing Problems (VRP) provide the theoretical framework for approaching this class of logistic problems, associated with the physical distribution of goods from a central depot to customers. They are two of the most challenging and researched problems because of their complexity and applicability. The main goal of this PhD thesis is to introduce hybrid methodologies that integrate several techniques to efficiently solve rich VRPs with realistic constraints. It starts with theoretical problems and evolves towards more realistic scenarios, tackling six combinatorial problems related to road transport. A metaheuristic named the Tailored Lagrangian Metaheuristic (TLM) has been developed to tackle the TSP. It is based on Lagrangian relaxation, which is used to exploit the structure of the problem, reducing its complexity considerably by moving hard-to-satisfy constraints into the objective function and associating a penalty in case the constraints are not satisfied. The metaheuristic developed for the TSP has been integrated into two hybrid methodologies combined with Constraint Programming to tackle more complex problems. First, the Capacitated Vehicle Routing Problem (CVRP), in which vehicles have a limited loading capacity for the goods that must be delivered, is addressed. Secondly, a real problem has been addressed: the Home Health Care (HHC) service in the municipality of Ferrara, Italy, which consists of assigning patients' treatments to nurses who travel to each patient's home. Theoretical research typically assumes the symmetry of the distance-based costs of traveling from one place to another, as well as the existence of a homogeneous fleet of vehicles with the same capacity. This thesis studies different variants focusing on the impact caused by the asymmetry of the costs and the heterogeneity of the fleet.
For these purpose, the Asymmetric Traveling Salesman Problem (ATSP), the Asymmetric Capacitated Vehicle Routing Problem (ACVRP) and the Asymmetric and Heterogeneous Vehicle Routing Problem (AHVRP) are addressed.
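The core mechanism behind TLM, Lagrangian relaxation, moves a hard constraint into the objective with a penalty multiplier and then adjusts that multiplier iteratively. It can be sketched on a toy problem; this is not the thesis's TLM algorithm, and the objective, constraint, step size, and update rule below are illustrative assumptions:

```python
def lagrangian_subgradient(steps=500, step=0.1):
    """Toy Lagrangian relaxation: minimize (x - 3)**2 subject to x <= 1.

    The hard constraint is moved into the objective with multiplier lam:
        L(x, lam) = (x - 3)**2 + lam * (x - 1),
    penalizing solutions that violate x <= 1. The inner minimization over
    x is analytic here; lam is raised by projected subgradient ascent
    whenever the constraint is violated.
    """
    lam = 0.0
    x = 3.0
    for _ in range(steps):
        x = 3.0 - lam / 2.0                      # argmin_x L(x, lam)
        lam = max(0.0, lam + step * (x - 1.0))   # ascend on the violation
    return x, lam

x, lam = lagrangian_subgradient()
print(round(x, 6), round(lam, 6))  # converges to the constrained optimum x = 1
```

In TLM-like settings the inner minimization is itself a combinatorial subproblem (e.g. an assignment or spanning-tree relaxation of the TSP) rather than a one-line formula, but the multiplier update follows the same pattern.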
9

Teoremes de No Free Lunch en el continu i el model del Pont Brownià en optimització

Caballero Monteso, Ricard Josep 03 February 2016 (has links)
This PhD thesis consists mainly of two parts. In the first part, a famous theorem in theoretical optimisation is generalised to the continuous-variable setting: the No-Free-Lunch Theorem, which states that all algorithms are equally good when their efficiency is averaged over all possible functions to be optimised. This generalisation uses the theory of stochastic processes in a very natural way, and concludes that there are no No-Free-Lunch theorems in the continuum, except for certain extreme cases of little practical significance. In the second part, the Brownian bridge is considered as a probabilistic model for "black-box" optimisation problems, in which no analytical form of the function is available: the function can only be evaluated at a certain number of points and, moreover, each evaluation is very expensive, so that only a few of them will be made. The probabilistic model assumes that the function to be optimised is a path of a stochastic process with a given probability law. From the point of view of computational complexity, this corresponds to studying the "average performance" of algorithms, as opposed to the usual "worst-case performance". However, such studies have always been carried out from the asymptotic standpoint, as the number of evaluations tends to infinity, and one of the goals of this thesis is to improve the estimation of the optimal value when the function can only be evaluated at a few points.
In this regard, in a study that was missing from the literature, several adaptive and non-adaptive heuristics are compared and analysed, concluding which one is most efficient. Moreover, the work with the Brownian bridge has yielded two formulae not given explicitly before: the density of the argmin of the Brownian bridge (in the general setting) and the joint density of the bridge and its minimum. In addition, many simulation experiments have been performed in this thesis to compute quantities that are very difficult, expensive, or impossible to obtain analytically. Following a philosophy of practicality, routines have been programmed --for example, the histogram of the argmin of a Brownian bridge conditioned on n points-- that obtain a reasonable probabilistic estimate of the error incurred.
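As a concrete illustration of the kind of routine mentioned above, here is a Monte-Carlo sketch of the histogram of the argmin of a discretized standard Brownian bridge. It is a simplified stand-in for the thesis routine (which conditions the bridge on n observed points); the grid size, sample count, and bin count are illustrative assumptions. A classical fact makes the output easy to check: the argmin of a Brownian bridge is uniformly distributed on [0, 1].

```python
import math
import random

def bridge_path(n, rng):
    """Sample a standard Brownian bridge at n + 1 equispaced points of
    [0, 1]: simulate a Brownian motion W and set B(t) = W(t) - t * W(1)."""
    dt = 1.0 / n
    w = [0.0]
    for _ in range(n):
        w.append(w[-1] + rng.gauss(0.0, math.sqrt(dt)))
    w1 = w[-1]
    return [w[k] - (k * dt) * w1 for k in range(n + 1)]

def argmin_histogram(n_paths, n, bins, rng):
    """Histogram of the location t* in [0, 1] of the bridge's minimum."""
    counts = [0] * bins
    for _ in range(n_paths):
        b = bridge_path(n, rng)
        k = min(range(n + 1), key=b.__getitem__)   # index of the minimum
        counts[min(int(k / n * bins), bins - 1)] += 1
    return counts

rng = random.Random(42)
hist = argmin_histogram(2000, 200, 10, rng)
print(hist)  # roughly 200 counts per bin: the argmin is uniform on [0, 1]
```

Conditioning on observed points would replace `bridge_path` with a sampler of the bridge's law given those n values; the histogram logic is unchanged.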
10

Aportaciones al estudio de los sistemas electorales

Sales i Inglès, Vicenç 28 January 2016 (has links)
An important question that modern societies must decide is the election of the people who represent them and who can also make certain decisions. The mechanisms for doing so are called electoral systems. In practice there are many of them, and they are very different. In this work, we present some ideas for a mathematical analysis of electoral systems. The first chapter is a global analysis of electoral systems. It begins with the definition of an electoral system --with the help of probability theory-- and the introduction of the most important simple examples found in practice. Next, we define two operations on electoral systems and, in particular, obtain two different generalizations of each of the examples. We also introduce some important properties that electoral systems may have --superadditivity, monotonicity, growth, and stability-- and we study which examples enjoy them and how these properties behave with respect to the operations defined before. Finally, stability gives us the possibility of defining majority and proportional electoral systems. In the second chapter we study electoral systems individually. In this sense, we introduce the electoral expectations, obtained by fixing the vote vector of an arbitrary list of candidates. We then study their connection with the operations defined above, and we close the chapter by introducing some individual properties that electoral systems may have and analysing what happens when the operations are considered again. The third chapter concerns a parameter introduced to assess whether an arbitrary list of candidates benefits from an electoral system or not. This is done by considering the expected value of the electoral expectations introduced in the second chapter, which yields the concept of average electoral expectation. The chapter ends by studying the behaviour of this concept with respect to the operations and examples introduced above.
In the fourth chapter we analyse three questions about weighted majority games that are used in the fifth chapter: an alternative way of defining them, a new operation between them, and their convergence. Finally, in the fifth chapter we replace the number of seats of each list of candidates by its Shapley-Shubik power index and study electoral systems using this new indicator. In this way, we obtain the concept of power system and, analogously, the concepts of power expectation and average power expectation. We finish by introducing some new properties of each of these concepts and analysing their relation with the analogous properties of electoral systems.
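The Shapley-Shubik power index used in the fifth chapter admits a direct computation from its definition: the fraction of voter orderings in which a voter is pivotal, i.e., the one whose arrival first pushes the running weight total to the quota. The enumeration below is exponential in the number of voters, and the example weights and quotas are chosen purely for illustration:

```python
from fractions import Fraction
from itertools import permutations

def shapley_shubik(weights, quota):
    """Shapley-Shubik power index of the weighted majority game
    [quota; weights]: for every ordering of the voters, credit the
    pivotal voter, then normalize by the number of orderings."""
    n = len(weights)
    pivots = [0] * n
    for order in permutations(range(n)):
        total = 0
        for voter in order:
            total += weights[voter]
            if total >= quota:       # this voter turns the coalition winning
                pivots[voter] += 1
                break
    n_orders = sum(pivots)           # exactly one pivot per ordering
    return [Fraction(p, n_orders) for p in pivots]

# With weights [4, 2, 1] and quota 4, the first voter is pivotal in every
# ordering (2 + 1 < 4), so it concentrates all the power: index (1, 0, 0).
print(shapley_shubik([4, 2, 1], 4))
# With weights [3, 2, 2] and quota 4, all three voters are equally
# powerful despite the unequal weights: index (1/3, 1/3, 1/3).
print(shapley_shubik([3, 2, 2], 4))
```

The second example shows why the thesis's substitution of seat counts by power indices matters: seat shares and actual voting power can differ sharply.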
