141

Complexity Bounds for Search Problems

Nicholas Joseph Recker (18390417) 18 April 2024 (has links)
<p dir="ltr">We analyze the query complexity of multiple search problems.</p><p dir="ltr">Firstly, we provide lower bounds on the complexity of "Local Search". In local search we are given a graph G and oracle access to a function f mapping the vertices to numbers, and seek a local minimum of f; i.e. a vertex v such that f(v) <= f(u) for all neighbors u of v. We provide separate lower bounds in terms of several graph parameters, including congestion, expansion, separation number, mixing time of a random walk, and spectral gap. To aid in showing these bounds, we design and use an improved relational adversary method for classical algorithms, building on the prior work of Scott Aaronson. We also obtain some quantum bounds using the traditional strong weighted adversary method.</p><p dir="ltr">Secondly, we show a multiplicative duality gap for Yao's minimax lemma by studying unordered search. We then go on to give tighter than asymptotic bounds for unordered and ordered search in rounds. Inspired by a connection through sorting with rank queries, we also provide tight asymptotic bounds for proportional cake cutting in rounds.</p>
142

Estimación y acotación del error de discretización en el modelado de grietas mediante el método extendido de los elementos finitos

González Estrada, Octavio Andrés 19 February 2010 (has links)
The Finite Element Method (FEM) has established itself over recent decades as one of the most widely used numerical techniques for solving a great variety of problems in different areas of engineering, such as structural analysis, thermal and fluid analyses, manufacturing processes, etc. One of the applications where the method is of greatest interest is the analysis of problems in Fracture Mechanics, facilitating the study and assessment of the structural integrity of mechanical components, their reliability, and the detection and control of cracks. Recently, the development of new techniques such as the eXtended Finite Element Method (XFEM) has further increased the potential of the FEM. These techniques improve the description of problems with singularities, discontinuities, etc., by adding special functions that enrich the space of the conventional finite element approximation. However, whenever a problem is approximated by numerical techniques, the solution obtained shows discrepancies with respect to the system it represents. In techniques based on a discrete representation of the domain by finite elements (FEM, XFEM, ...), the quantity of interest to control is the so-called discretization error. Numerous references to techniques for quantifying the error in conventional finite element formulations can be found in the literature. However, since the XFEM is a relatively recent method, error estimation techniques for enriched finite element approximations have not yet been sufficiently developed. The objective of this Thesis is to quantify the discretization error when XFEM-type enriched approximations are used to represent problems of Linear Elastic Fracture Mechanics (LEFM), such as the modelling of a crack. / González Estrada, OA. (2010). Estimación y acotación del error de discretización en el modelado de grietas mediante el método extendido de los elementos finitos [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/7203
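For context, the enrichment idea mentioned in the abstract can be written out. A typical XFEM approximation for a cracked domain in LEFM (a standard textbook form, not reproduced from the thesis) augments the usual shape functions N_i with a Heaviside function H across the crack faces and crack-tip branch functions F_alpha:

```latex
u^h(\mathbf{x}) \;=\; \sum_{i \in I} N_i(\mathbf{x})\,\mathbf{u}_i
 \;+\; \sum_{j \in J} N_j(\mathbf{x})\,H(\mathbf{x})\,\mathbf{a}_j
 \;+\; \sum_{k \in K} N_k(\mathbf{x}) \sum_{\alpha=1}^{4} F_\alpha(\mathbf{x})\,\mathbf{b}_{k\alpha},
\qquad
\{F_\alpha\} = \sqrt{r}\,\bigl\{\sin\tfrac{\theta}{2},\ \cos\tfrac{\theta}{2},\ \sin\tfrac{\theta}{2}\sin\theta,\ \cos\tfrac{\theta}{2}\sin\theta\bigr\}.
```

The discretization error studied in the thesis is the discrepancy between such an enriched approximation u^h and the exact elasticity solution.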
143

Civility: Its Distinctness and Significance

Love, Christopher William 26 October 2017 (has links)
Civility has many critics. Some challenge its distinctness as a virtue, others its moral significance. In this essay, I attempt to meet both challenges by offering an account of civility that stands distinct from other concepts and holds great value, both intrinsically and instrumentally. I claim that we show civility toward others when we dispute their ideas in ways that respect those persons' intrinsic worth. My account pays particular attention to the connections between civility, clarity and reconciliation--connections that make civility vital for modern pluralistic societies. I then consider a recent alternative to this conception of civility advanced by Calhoun (2000), arguing that it secures distinctness at the cost of moral significance. / Master of Arts / In the midst of our contemporary polarization and hostile exchanges, one often hears calls for greater civility. Yet what is civility? Does it name a distinct concept, or is it merely synonymous with other virtues? And why, if at all, should we value it? In this essay, I attempt to answer these questions. Cheshire Calhoun notes that the latter two questions pose serious challenges for would-be defenders of civility, challenges of distinctness and significance. I aim to offer an account of civility that meets both of these challenges. I argue that we show civility toward others when we dispute their ideas in ways that respect those persons' intrinsic worth. My account pays particular attention to the connections between civility, clarity and reconciliation--connections that make civility particularly important for modern pluralistic societies. I then consider Calhoun's own conception of civility, arguing that it secures distinctness at the cost of moral significance.
144

For better or for worse : A study on the impact of exchange rate volatility on trade

Hillgren, Jonathan, Magnusson, Emma January 2017 (has links)
Sammanfattning Examensarbete i finansiering, Civilekonomprogrammet Ekonomihögskolan vid Linnéuniversitetet, VT-2017 Författare: Emma Magnusson & Jonathan Hillgren Handledare: Håkan Locking Examinator: Andreas Stephan Titel: For better or for worse – A study on the impact of exchange rate volatility on trade   Bakgrund: Växelkurssvängningar har studerats av flertalet forskare då detta anses vara en osäkerhet vars effekt inte är säkerställd. Då internationell handel är en viktig faktor för tillväxt och välstånd i en nation är dess samband med volatiliteten betydelsefullt att fastställa för att identifiera huruvida inverkan på landet är positiv eller negativ.   Problemformulering: Har volatiliteten i eurons växelkurs mot rörliga valutor någon påverkan på den bilaterala handeln mellan eurozonen och andra europeiska länder?   Syfte: Att kunna urskilja effekten av växelkursvolatilitetens påverkan på export och import vilket kan gynna företag i dess handelsbeslut, strategier och framtidsprognostisering.   Metod: Undersökningens tillvägagångssätt grundar sig i en tidsserieanalys där beräkningar för volatiliteten ligger som grund till förklaringsvariabeln i modellen för att studera dess effekt på handeln, vilket skattas genom en noga utvald ARDL-metod. Regressionerna ger både ett lång- och kortsiktigt samband för att visa skillnader i influenser från volatiliteten på export och import för Sverige och Norge som studeras i rapporten.   Slutsatser: De erhållna resultaten för både Sveriges och Norges export visar att ingen påverkan alls kan urskiljas från växelkurssvängningar vilket innebär att exporten fortskrider oavsett grad av volatilitet vilket kan förklaras genom dess förmodade likheter i handelsmönster, varukategorier och exponering mot euroländerna. En möjlig slutsats är även att det inte är volatiliteten i sig som påverkar handeln, utan underliggande faktorer som inte kontrollerats för, vilket åskådliggjordes när oljeprisindex inkluderades och eliminerade volatilitetens effekt på Norges export. Importen visade en långsiktig negativ effekt av volatiliteten för Norge och en kortsiktig positiv påverkan för Sverige. Skillnader i importen antas bero på olikheter i valutasäkring, trögheter i ekonomin och relationen till EU. / Abstract   Master Thesis in finance, Business and Administration School of Business and Economics at Linnaeus University, VT-2017 Authors: Emma Magnusson & Jonathan Hillgren Advisor: Håkan Locking Examiner: Andreas Stephan Title: For better or for worse – A study on the impact of exchange rate volatility on trade   Background: Exchange rate fluctuations have been studied by numerous researchers, since they are thought of as a source of uncertainty whose effect is not settled. Because international trade is an important factor for growth and wealth in a country, its connection to volatility is important to establish in order to identify whether the influence on the nation is positive or negative.   Problem: Does the volatility in the exchange rate between the euro and floating currencies affect bilateral trade between the euro area and other European countries?   Purpose: The purpose of the study is to distinguish the effect of exchange rate volatility on exports and imports, which can help companies in their trade decisions and strategies.   Method: The approach of the study is built on a time series analysis where estimates of volatility underlie the explanatory variable used to find its effect on trade, which is estimated with a carefully selected ARDL method. 
The regressions obtain both long-term and short-term relationships to show differences in the effect of the volatility on exports and imports for Sweden and Norway, the countries studied in this report.   Conclusions: The results for the exports of both Sweden and Norway do not show any impact from the exchange rate fluctuations, which means that exports continue regardless of the level of volatility. This can be explained by their similarities in the pattern of trade, products and exposure to the euro countries. Another possible conclusion is that it is not the volatility itself that affects trade but underlying factors not being controlled for, which was shown when the oil price index was included and eliminated the effect of the volatility on Norwegian exports. Imports showed a long-term negative effect of the volatility for Norway and a short-term positive effect for Sweden. The disparities are assumed to be due to differences in the use of hedging, inertia in the economy and the relationship with the European Union.
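To illustrate the kind of estimation described above, the following schematic ARDL(1,1) regression runs on synthetic data; the series, the 12-month rolling-standard-deviation volatility proxy, and all variable names are placeholders rather than the thesis' data or exact specification. The short-run effect is the coefficient on current volatility, and the long-run effect follows from the lag structure.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in data; the thesis uses actual Swedish/Norwegian trade and
# EUR exchange-rate series, which are not reproduced here.
rng = np.random.default_rng(0)
n = 240                                                       # ~20 years, monthly
eur_rate = 9.5 * np.exp(np.cumsum(rng.normal(0, 0.01, n)))    # random-walk FX rate
log_exports = np.cumsum(rng.normal(0.002, 0.02, n)) + 10.0

# Volatility proxy: rolling std of log exchange-rate returns (one common choice;
# the thesis' exact volatility measure may differ).
volatility = pd.Series(np.log(eur_rate)).diff().rolling(12).std()

# Hand-rolled ARDL(1,1): trade regressed on its own lag and on current and
# lagged volatility, so both short- and long-run effects can be read off.
data = pd.DataFrame({
    "y": log_exports,
    "y_lag1": pd.Series(log_exports).shift(1),
    "x": volatility,
    "x_lag1": volatility.shift(1),
}).dropna()

fit = sm.OLS(data["y"], sm.add_constant(data[["y_lag1", "x", "x_lag1"]])).fit()
short_run = fit.params["x"]                                   # impact effect
long_run = (fit.params["x"] + fit.params["x_lag1"]) / (1.0 - fit.params["y_lag1"])
print(fit.summary())
print("short-run effect:", short_run, "long-run effect:", long_run)
```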
145

Mortality linked derivatives and their pricing

Bahl, Raj Kumari January 2017 (has links)
This thesis addresses the absence of explicit pricing formulae and the complexity of the proposed models (an incomplete-markets framework) in the area of mortality risk management, which requires the application of advanced techniques from Financial Mathematics and Actuarial Science. It is a multi-essay dissertation contributing to the design and pricing of mortality-linked derivatives and offering state-of-the-art solutions for managing longevity risk. The first essay investigates the valuation of Catastrophic Mortality Bonds and, in particular, the Swiss Re Mortality Bond 2003 as a primary example of this class of assets. This bond was the first Catastrophic Mortality Bond to be launched in the market and uses the behaviour of a well-defined mortality index to generate payoffs for bondholders. Pricing this type of bond is a challenging task and no closed-form solution exists in the literature. In my approach, I express the payoff of such a bond in terms of the payoff of an Asian put option and present a new methodology to derive model-independent bounds for catastrophic mortality bonds by exploiting the theory of comonotonicity. While managing catastrophic mortality risk is an uphill task for insurers and re-insurers, the insurance industry is facing an even bigger challenge: coping with increased life expectancy. Recent years have witnessed unprecedented changes in mortality rates. As a result, academics and practitioners have started treating mortality in a stochastic manner. Moreover, the assumption of independence between mortality and interest rates has been replaced by the observation that there is indeed a correlation between the two. Therefore, my second essay studies the valuation of Guaranteed Annuity Options (GAOs) under the most general modelling framework, where both interest rate and mortality risk are stochastic and correlated. Pricing these types of options in the correlated environment is an arduous task and a closed-form solution is non-existent. In my approach, I employ doubly stochastic stopping times to incorporate the randomness about the time of death and a suitable change of measure to facilitate the valuation of the survival benefit, thereby adapting the payoff of the GAO in terms of the payoff of a basket call option. I then derive general price bounds for GAOs by employing the theory of comonotonicity and the Rogers-Shi (Rogers and Shi, 1995) approach. Moreover, I suggest some 'model-robust' tight bounds based on the moment generating function (m.g.f.) and characteristic function (c.f.) under the affine set-up. The strength of these bounds is their computational speed, which makes them indispensable for annuity providers who rely heavily on Monte Carlo simulations to calculate the fair market value of Guaranteed Annuity Options. In fact, apart from Monte Carlo simulation, the academic literature does not offer any solution for the pricing of GAOs. I illustrate the performance of the bounds for a variety of affine processes governing the evolution of mortality and the interest rate by comparing them with benchmark Monte Carlo estimates. Through my work, I have been able to express the payoffs of two well-known modern mortality products in terms of payoffs of financial derivatives, thereby filling gaps in the literature and offering state-of-the-art techniques for pricing these sophisticated instruments.
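As a rough illustration of the kind of Monte Carlo benchmark the bounds are compared against, the sketch below prices an Asian put, the option whose payoff the catastrophic mortality bond is adapted to in the first essay. All dynamics and parameters here are illustrative placeholders, not the Swiss Re bond's actual index model.

```python
import numpy as np

# Illustrative-only parameters; the bond's actual index dynamics and threshold
# structure are specified in the thesis, not here.
rng = np.random.default_rng(1)
s0, r, sigma, strike = 100.0, 0.02, 0.2, 95.0
n_paths, n_steps, T = 100_000, 12, 1.0
dt = T / n_steps

# Simulate geometric Brownian motion paths for the underlying index.
z = rng.standard_normal((n_paths, n_steps))
log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
paths = s0 * np.exp(log_paths)

# Asian put payoff: the option pays when the *average* index level ends up below
# the strike, mirroring a bond whose principal is reduced when an averaged
# mortality index breaches a threshold.
avg = paths.mean(axis=1)
payoff = np.maximum(strike - avg, 0.0)
price = np.exp(-r * T) * payoff.mean()
print("Monte Carlo Asian put estimate:", price)
```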
146

Heat kernel estimates based on Ricci curvature integral bounds

Rose, Christian 22 August 2017 (has links)
Any Riemannian manifold possesses a minimal solution of the heat equation for the Dirichlet Laplacian, called the heat kernel. During the last decades, many authors have investigated geometric properties of the manifold under which its heat kernel fulfills a so-called Gaussian upper bound. In particular, compact and non-compact manifolds with Ricci curvature bounded from below have been examined and admit such Gaussian estimates. In the compact case, even integral Ricci curvature assumptions turned out to suffice. The important techniques for obtaining Gaussian bounds are the symmetrization procedure for compact manifolds and relative Faber-Krahn estimates or gradient estimates for the heat equation, the first two of which are based on isoperimetric properties of certain sets. In this thesis, we generalize the existing results as follows. Locally uniform integral bounds on the negative part of the Ricci curvature lead to Gaussian upper bounds for the heat kernel, no matter whether the manifold is compact or not. To this end, we show local isoperimetric inequalities under this condition and use relative Faber-Krahn estimates to derive explicit Gaussian upper bounds. If the manifold is compact, we can generalize the integral curvature condition further to the case where the negative part of the Ricci curvature lies in the so-called Kato class; in that case we obtain uniform Gaussian upper bounds using gradient estimate techniques. Apart from these geometric generalizations, we use the resulting estimates to generalize Bochner's theorem. More precisely, the heat kernel estimates obtained above lead to ultracontractive estimates for the heat semigroup and for the semigroup generated by the Hodge Laplacian. In turn, we can formulate rigidity results for the triviality of the first cohomology group if the amount of curvature lying below a certain positive threshold is small in a suitable sense. If we can only assume such smallness of the negative part of the Ricci curvature, we can still bound the first Betti number uniformly by explicit terms depending on the generalized curvature assumptions, generalizing certain existing results from the cited literature. / Jede Riemannsche Mannigfaltigkeit besitzt eine minimale Lösung für die Wärmeleitungsgleichung des zur Mannigfaltigkeit gehörigen Dirichlet-Laplaceoperators, den Wärmeleitungskern. Während der letzten Jahrzehnte fanden viele Autoren geometrische Eigenschaften der Mannigfaltigkeiten, unter welchen der Wärmeleitungskern eine sogenannte Gaußsche obere Abschätzung besitzt. Insbesondere besitzen sowohl kompakte als auch nichtkompakte Mannigfaltigkeiten mit nach unten beschränkter Ricci-Krümmung solche Gaußschen Abschätzungen. Im kompakten Fall reichten bisher sogar Integralbedingungen an die Ricci-Krümmung aus. Die wichtigen Techniken, um Gaußsche Abschätzungen zu erhalten, sind die Symmetrisierung für kompakte Mannigfaltigkeiten und relative Faber-Krahn- und Gradientenabschätzungen für die Wärmeleitungsgleichung, wobei die ersten beiden auf isoperimetrischen Eigenschaften gewisser Mengen beruhen. In dieser Arbeit verallgemeinern wir die bestehenden Resultate im folgenden Sinne. Lokal gleichmäßig beschränkte Integralschranken an den Negativteil der Ricci-Krümmung ergeben Gaußsche obere Abschätzungen sowohl im kompakten als auch nichtkompakten Fall. Dafür zeigen wir lokale isoperimetrische Ungleichungen unter dieser Voraussetzung und nutzen die relativen Faber-Krahn-Abschätzungen für eine explizite Gaußsche Schranke. 
Für kompakte Mannigfaltigkeiten können wir sogar die Integralschranken an den Negativteil der Ricci-Krümmung durch die sogenannte Kato-Bedingung ersetzen. In diesem Fall erhalten wir gleichmäßige Gaußsche Abschätzungen mit einer Gradientenabschätzung. Neben den geometrischen Verallgemeinerungen für Gaußsche Schranken nutzen wir unsere Ergebnisse, um Bochners Theorem zu verallgemeinern. Wärmeleitungskernabschätzungen ergeben ultrakontraktive Schranken für die Wärmeleitungshalbgruppe und die Halbgruppe, die durch den Hodge-Operator erzeugt wird. Damit können wir Starrheitseigenschaften für die erste Kohomologiegruppe zeigen, wenn der Teil der Ricci-Krümmung, welcher unter einem positiven Level liegt, in einem bestimmten Sinne klein genug ist. Wenn der Negativteil der Ricci-Krümmung nicht zu groß ist, können wir die erste Betti-Zahl noch immer explizit uniform abschätzen.
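For orientation, a Gaussian upper bound for the heat kernel is an estimate of roughly the following form, where the constants and the admissible time range depend on the curvature assumptions; this generic shape is standard background rather than the thesis' precise statement.

```latex
p_t(x,y) \;\le\; \frac{C_1}{\operatorname{vol}\!\bigl(B(x,\sqrt{t}\,)\bigr)}\,
\exp\!\left(-\frac{d(x,y)^2}{C_2\,t}\right).
```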
147

Universal Constraints on 2D CFTs and 3D Gravity

Qualls, Joshua D 01 January 2014 (has links)
We study constraints imposed on a general unitary two-dimensional conformal field theory by modular invariance. We begin with a review of previous bounds on the conformal dimension Delta_1 of the lowest primary operator assuming unitarity, a discrete spectrum, modular invariance, c_L, c_R > 1, and no extended chiral algebra. We then obtain bounds on the conformal dimensions Delta_2, Delta_3 using no additional assumptions. We also show that in order to find a bound for Delta_4 or higher Delta_n, we need to assume a larger minimum value for c_tot that grows logarithmically with n. We next extend the previous results to remove the requirement that our two-dimensional conformal field theories have no extended chiral algebra. We then show that modular invariance also implies an upper bound on the total number of states of positive energy less than c_tot/24 (or equivalently, states of conformal dimension between c_tot/24 and c_tot/12), in terms of the number of negative energy states. Finally, we consider the case where the CFT has a gravitational dual and investigate the gravitational interpretation of our results. Using the AdS3/CFT2 correspondence, we obtain an upper bound on the masses of the lightest few massive excitations (both with and without the constraint of no chiral primary operators) in a theory of 3D matter and gravity with Lambda < 0. We show our results are consistent with facts and expectations about the spectrum of BTZ black holes in 2+1 gravity. We then discuss the upper and lower bounds on the number of states and primary operators in the dual gravitational theory, focusing on the case of AdS3 pure gravity.
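The modular invariance constraint underlying these bounds can be stated compactly in standard conventions (not specific to this dissertation): the torus partition function, written in terms of the Virasoro zero modes and central charges, must be invariant under the S-transformation of the modular parameter.

```latex
Z(\tau,\bar{\tau}) \;=\; \operatorname{Tr}\, q^{\,L_0 - c_L/24}\,\bar{q}^{\,\bar{L}_0 - c_R/24},
\qquad q = e^{2\pi i \tau},
\qquad Z(-1/\tau,\,-1/\bar{\tau}) \;=\; Z(\tau,\bar{\tau}).
```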
148

Individual treatment effect heterogeneity in multiple time points trials

Ndum, Edwin Andong January 1900 (has links)
Doctor of Philosophy / Department of Statistics / Gary Gadbury / In biomedical studies, the treatment main effect is often expressed in terms of an “average difference.” A treatment that appears superior based on the average effect may not be superior for all subjects in a population if there is substantial “subject-treatment interaction.” A parameter quantifying subject-treatment interaction is inestimable in two-sample completely randomized designs. Crossover designs have been suggested as a way to estimate the variability in individual treatment effects since an “individual treatment effect” can be measured. However, variability in these observed individual effects may include variability due to the treatment plus inherent variability of a response over time. We use the “Neyman-Rubin Model of Causal Inference” (Neyman, 1923; Rubin, 1974) for analyses. This dissertation consists of two parts: the quantitative and the qualitative response analyses. The quantitative part focuses on disentangling the variability due to treatment effects from variability due to time effects using suitable crossover designs. Next, we propose a variable that defines the variance of a true individual treatment effect in two crossover designs and show that it is not directly estimable, although the mean effect is estimable. Furthermore, we show that the variance of individual treatment effects is biased under both designs. The bias depends on time effects. Under certain design considerations, linear combinations of time effects can be estimated, making it possible to separate the variability due to time from that due to treatment. The qualitative section involves a binary response and is centered on estimating the average treatment effect and bounding the probability of a negative effect, a parameter which relates to the individual treatment effect variability. Using a stated joint probability distribution of potential outcomes, we express the probability of the observed outcomes under a two-treatment, two-period crossover design. Maximum likelihood estimates of these probabilities are found using an iterative numerical method. From these, we propose bounds for the inestimable probability of a negative effect. Tighter bounds are obtained with information from subjects that receive the same treatments over the two periods. Finally, we simulate an example of observed count data to illustrate estimation of the bounds.
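A small simulation makes the confounding concrete (synthetic data and notation of my own, not the dissertation's): in an AB/BA crossover, the within-subject difference mixes the individual treatment effect with a subject-specific time effect, so its variance over-states the variance of the true effects even though the mean effect is recovered by pooling the two sequences.

```python
import numpy as np

# Toy illustration: tau are true individual treatment effects, gamma are
# subject-specific period-2 (time) effects, eps is measurement noise.
rng = np.random.default_rng(2)
n = 100_000
tau = rng.normal(1.0, 0.5, n)
gamma = rng.normal(0.3, 0.4, n)
eps = lambda: rng.normal(0.0, 0.2, n)

# Sequence AB (treatment, then control) and BA (control, then treatment).
diff_ab = (tau + eps()) - (gamma + eps())     # ~ tau - gamma
diff_ba = (tau + gamma + eps()) - eps()       # ~ tau + gamma

print("var of true effects:       ", tau.var())
print("var of observed AB diffs:  ", diff_ab.var())   # inflated by time effects
print("mean effect (AB/BA pooled):", 0.5 * (diff_ab.mean() + diff_ba.mean()))
```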
149

Význam struktury daňových systémů pro ekonomický růst v zemích OECD - Extreme Bounds Analysis / The importance of tax system structure for economic growth in OECD countries - Extreme Bounds Analysis

Choutka, Petr January 2015 (has links)
The importance of tax system structure for economic growth in OECD countries - Extreme Bounds Analysis Abstract The thesis examines the importance of tax system structures for economic growth in OECD countries. It aims to find out whether a revenue-neutral tax reform can promote economic growth. In other words, its objective is to identify the taxes which are most harmful to economic growth and to suggest tax policy implications accordingly. Extreme bounds analysis is employed to examine the robustness of the relationship between particular taxes and the growth rate. This method consists of running a number of regressions and observing how the coefficients respond to various model alterations. The results suggest that taxes levied on personal income have a robust negative impact on economic growth. On the other hand, consumption and property taxes appear to be non-significant predictors of economic growth. The policy implication is drawn that a revenue-neutral tax reform shifting the tax burden from personal income towards consumption and property is likely to boost the economy. JEL classification: H21, H24, H27, O11, O47 Key words: tax system structure, economic growth, extreme bounds analysis, tax reform
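The mechanics of extreme bounds analysis can be sketched in a few lines (synthetic data and placeholder variable names, not the OECD panel used in the thesis): the focus coefficient is re-estimated over every subset of control variables, and the variable is deemed robust only if the extreme lower and upper bounds share the same sign.

```python
import itertools
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Schematic extreme bounds analysis on synthetic data.
rng = np.random.default_rng(3)
n = 200
df = pd.DataFrame(rng.normal(size=(n, 5)),
                  columns=["growth", "income_tax", "invest", "educ", "openness"])
df["growth"] = -0.4 * df["income_tax"] + 0.3 * df["invest"] + rng.normal(0, 1, n)

focus, controls = "income_tax", ["invest", "educ", "openness"]
coefs = []
# Re-estimate the growth regression for every subset of control variables and
# record the coefficient on the focus variable (the tax of interest).
for k in range(len(controls) + 1):
    for subset in itertools.combinations(controls, k):
        X = sm.add_constant(df[[focus, *subset]])
        fit = sm.OLS(df["growth"], X).fit()
        coefs.append((fit.params[focus], fit.bse[focus]))

lower = min(b - 2 * se for b, se in coefs)   # extreme lower bound
upper = max(b + 2 * se for b, se in coefs)   # extreme upper bound
print("extreme bounds:", lower, upper,
      "-> 'robust' only if both bounds share the same sign")
```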
150

Untere Schranken für Steinerbaumalgorithmen und die Konstruktion von Bicliquen in dichten Graphen

Kirchner, Stefan 02 September 2008 (has links)
Die vorliegende Arbeit besteht aus zwei Teilen. Der erste Teil der Arbeit befasst sich mit unteren Schranken für approximative Steinerbaumalgorithmen. Ein Steinerbaum ist ein kürzester Teilgraph, der eine gegebene Teilmenge der Knoten eines Graphen spannt. Das Berechnen eines Steinerbaumes ist ein klassisches NP-schweres Problem, und es existieren mehrere Approximationsalgorithmen, wobei bei den meisten Algorithmen die Approximationsgüte nur durch untere und obere Schranken eingegrenzt werden kann. Für einige dieser Algorithmen werden in dieser Arbeit Instanzen vorgestellt, welche die unteren Schranken verbessern. Für den Relativen Greedy Algorithmus wird außerdem ein Verfahren vorgestellt, mit dem die Güte des Algorithmus eingeschränkt auf die Graphenklasse mit k Terminalen auf einen beliebigen Faktor genau bestimmt werden kann. Der zweite Teil der Arbeit widmet sich vollständig bipartiten Subgraphen mit gleicher Partitionsgröße, sogenannten balancierten Bicliquen. Seit langem ist bekannt, dass in dichten bipartiten Graphen balancierte Bicliquen mit Omega(log(n)) Knoten existieren, aber es ist unbekannt, ob und wie diese in polynomieller Zeit konstruiert werden können. Der zweite Teil liefert dazu einen Beitrag, indem ein polynomieller Algorithmus vorgestellt wird, der in hinreichend großen dichten bipartiten Graphen eine balancierte Biclique mit Omega(sqrt(log(n))) Knoten konstruiert. / This thesis consists of two parts. The first part is concerned with lower bounds for approximating Steiner trees. The Steiner tree problem is to find a shortest subgraph that spans a given set of vertices in a graph and is a classical NP-hard problem. Several approximation algorithms exist, but for most algorithms only lower and upper bounds for the approximation ratio are known. For some of these algorithms we present instances which improve the lower bounds. When the problem is restricted to the class of graphs with k terminals, we also present a method which can be used to determine the approximation ratio of the Relative Greedy Algorithm with arbitrary precision. The second part is about balanced bicliques, i.e. complete bipartite subgraphs with equal partition sizes. It has been known for a long time that every dense bipartite graph contains a balanced biclique of size Omega(log(n)), but whether and how such a biclique can be constructed in polynomial time is still unknown. Our contribution to this problem is a polynomial time algorithm which constructs a balanced biclique of size Omega(sqrt(log(n))) in sufficiently large and dense bipartite graphs.
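For readers unfamiliar with Steiner tree approximation, the classic metric-closure/MST 2-approximation is sketched below; this is a textbook algorithm shown only to fix ideas, not one of the algorithms analysed in the thesis, and the full Kou-Markowsky-Berman method additionally re-runs an MST on the returned subgraph and prunes non-terminal leaves.

```python
import itertools
import networkx as nx

def steiner_2approx(G, terminals):
    """Metric-closure / MST 2-approximation sketch for the Steiner tree problem."""
    # Shortest-path distances and paths between all pairs of vertices.
    info = dict(nx.all_pairs_dijkstra(G, weight="weight"))
    # 1. Metric closure restricted to the terminals.
    closure = nx.Graph()
    for u, v in itertools.combinations(terminals, 2):
        closure.add_edge(u, v, weight=info[u][0][v])
    # 2. MST of the closure, expanded back into shortest paths of G.
    edges = set()
    for u, v in nx.minimum_spanning_tree(closure, weight="weight").edges():
        path = info[u][1][v]
        edges.update(zip(path, path[1:]))
    return edges

# Tiny usage example on a weighted graph with terminals a, c, d.
G = nx.Graph()
G.add_weighted_edges_from([("a", "b", 1), ("b", "c", 1), ("a", "c", 3),
                           ("c", "d", 1), ("b", "d", 4)])
print(steiner_2approx(G, ["a", "c", "d"]))
```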
