51 |
Essays on the Macroeconomic Implications of Information Asymmetries. Malherbe, Frédéric. 02 September 2010.
Throughout this dissertation, I walk the reader through several macroeconomic
implications of information asymmetries, with a special focus on financial
issues. The exercise is mainly theoretical: I develop stylized models that aim
to capture macroeconomic phenomena such as self-fulfilling liquidity dry-ups,
the rise and fall of securitization markets, and the creation of systemic risk.
The dissertation consists of three chapters. The first proposes an explanation
for self-fulfilling liquidity dry-ups. The second proposes a formalization
of the concept of market discipline and an application to securitization
markets as risk-sharing mechanisms. The third complements the second by
presenting the rise of securitization as the banker's optimal response to
strict capital constraints.
Two concepts that lack a single accepted definition in economics play a central
role in these models: liquidity and market discipline.
The liquidity of an asset refers to the ability of its owner to transform it into
current consumption goods. Secondary markets for long-term assets thus play
an important role in this respect. However, such markets may be illiquid due
to adverse selection.
In the first chapter, I show that: (1) when agents expect a liquidity dry-up
on such markets, they optimally choose to self-insure through the hoarding of
non-productive but liquid assets; (2) this hoarding behavior worsens adverse selection and dries up market liquidity; (3) such liquidity dry-ups are Pareto inefficient
equilibria; (4) the government can rule them out. Additionally, I show
that idiosyncratic liquidity shocks à la Diamond and Dybvig have stabilizing effects,
which is at odds with the banking literature. The main contribution of the
chapter is to show that market breakdowns due to adverse selection are highly
endogenous to past balance-sheet decisions.
I consider agents to be under market discipline when their current behavior
is influenced by future market outcomes. A key ingredient for market discipline
to operate is that the market outcome depends on information that is observable
but not verifiable (that is, information that cannot be proved in court and
upon which, consequently, enforceable contracts cannot be based).
In the second chapter, after introducing this novel formalization of market
discipline, I ask whether securitization really contributes to better risk-sharing:
I compare it with other mechanisms that differ in the timing of risk transfer. I
find that for securitization to be an efficient risk-sharing mechanism, market
discipline must be strong and adverse selection must not be severe. This seems
to restrict seriously the set of assets that should be securitized for risk-sharing
motives.
Additionally, I show how ex-ante leverage may mitigate interim adverse selection
in securitization markets and therefore enhance ex-post risk-sharing. This
is interesting because high leverage is usually associated with “excessive” risk-taking.
In the third chapter, I consider risk-neutral bankers facing strict capital
constraints: their capital is required to cover worst-case-scenario losses.
In such a set-up, I find that: (1) the banker's optimal autarky response is to
diversify lower-tail risk and maximize leverage; (2) securitization helps to free
up capital and increase leverage, but distorts incentives to screen loan applicants
properly; (3) market discipline mitigates this problem, but if the supervisor
overestimates it, excess leverage results, which creates systemic risk. Finally,
I consider opaque securitization and show that the supervisor (4) faces
uncertainty about the trade-off between the size of the economy and the
probability and severity of a systemic crisis, and (5) generally cannot set
capital constraints at the socially efficient level.
|
52 |
Effective Resource Allocation for Non-cooperative Spectrum Sharing. Jacob-David, Dany D. 13 October 2011.
Spectrum access protocols have been proposed recently to provide flexible and efficient use
of the available bandwidth. Game theory has been applied to the analysis of the problem
to determine the most effective allocation of the users’ power over the bandwidth. However,
prior analysis has focussed on Shannon capacity as the utility function, even though it is
known that real signals do not, in general, meet the Gaussian distribution assumptions of that metric. In a non-cooperative spectrum sharing environment, the Shannon capacity utility function results in a water-filling solution. In this thesis, the suitability of the water-filling solution is evaluated when using non-Gaussian signalling first in a frequency non-selective environment to focus on the resource allocation problem and its outcomes. It is then extended to a frequency selective environment to examine the proposed algorithm in a more realistic wireless environment. It is shown in both scenarios that more effective resource allocation can be achieved when the utility function takes into account the actual signal characteristics.
Further, it is demonstrated that higher rates can be achieved with lower transmitted power,
resulting in a smaller spectral footprint, which allows more efficient use of the spectrum
overall. Finally, future spectrum management is discussed where the waveform adaptation
is examined as an additional option to the well-known spectrum agility, rate and transmit
power adaptation when performing spectrum sharing.
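
For reference, the water-filling benchmark that the thesis evaluates can be sketched as follows. This is a minimal illustration of the classical allocation for parallel Gaussian channels, not code from the thesis; the function and parameter names are my own, and the per-channel inputs are assumed to be noise-to-gain ratios.

import numpy as np

def water_filling(noise_over_gain, total_power):
    # noise_over_gain: per-channel ratios N_k / |H_k|^2; total_power: budget P.
    # Allocates p_k = max(mu - n_k, 0) with the water level mu chosen so
    # that the allocations sum to P.
    n = np.asarray(noise_over_gain, dtype=float)
    sorted_n = np.sort(n)
    k = len(n)
    mu = 0.0
    while k > 0:
        mu = (total_power + sorted_n[:k].sum()) / k  # candidate water level
        if mu > sorted_n[k - 1]:
            break      # all k selected channels receive positive power
        k -= 1         # worst remaining channel would go negative: drop it
    return np.maximum(mu - n, 0.0)

# Example: 4 sub-channels, unit power budget -> [0.7, 0.3, 0.0, 0.0]
print(water_filling([0.1, 0.5, 1.0, 2.0], total_power=1.0))

The thesis's point is precisely that this allocation is only optimal under the Gaussian-signalling assumption behind the Shannon-capacity utility; with real, non-Gaussian signals a utility reflecting the actual signal characteristics allocates power more effectively.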
|
54 |
Optimal Path Searching through Specified Routes using different Algorithms. Farooq, Farhan. January 2009.
Connecting different electrical, network, and data devices at minimum cost along the shortest path is a complex job. In huge buildings, where the devices are placed at different locations on different floors and only some specific routes are available for the cables and buses, the shortest-path search becomes even more complex. The aim of this thesis project is to develop an application that identifies the best path connecting all objects or devices while following the specified routes. To address this issue we adopted three algorithms, greedy search, simulated annealing, and exhaustive search, and analyzed their results. The given problem is similar to the Travelling Salesman Problem. Exhaustive search solves it exactly, since it checks every possibility, but it is impractical because of its running time: once the number of objects grows beyond 12, finding the shortest path takes hours. Simulated annealing emerged with promising results at a lower time cost; because of its probabilistic nature it may miss the optimum, but it delivers a near-optimal solution in a reasonable duration. The greedy algorithm is not a good choice for this problem. Simulated annealing therefore proved to be the best algorithm for this problem; a sketch of the approach follows below. The project has been implemented in C; it takes its input from, and stores its output in, an Excel workbook.
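
A minimal sketch of the simulated-annealing search described above, written in Python rather than the project's C. The 2-opt neighborhood, cooling schedule, and parameter values are illustrative assumptions, not the thesis's actual settings.

import math, random

def tour_length(tour, dist):
    # Total cost of visiting all devices in order and returning to the start.
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def anneal(dist, t0=10.0, cooling=0.995, steps=20000):
    # dist: symmetric matrix of route costs between devices.
    n = len(dist)
    tour = list(range(n))
    random.shuffle(tour)
    cur = tour_length(tour, dist)
    best_tour, best = tour[:], cur
    t = t0
    for _ in range(steps):
        i, j = sorted(random.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # 2-opt reversal
        delta = tour_length(cand, dist) - cur
        # Always accept improvements; accept worsenings with probability
        # exp(-delta / t), which shrinks as the temperature cools.
        if delta < 0 or random.random() < math.exp(-delta / t):
            tour, cur = cand, cur + delta
            if cur < best:
                best_tour, best = tour[:], cur
        t *= cooling
    return best_tour, best

# Toy run: 4 devices with symmetric route costs.
d = [[0, 2, 9, 4],
     [2, 0, 6, 3],
     [9, 6, 0, 5],
     [4, 3, 5, 0]]
print(anneal(d, steps=5000))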
|
55 |
The Trefftz Method using Fundamental Solutions for Biharmonic Equations. Ting-chun, Daniel. 30 June 2008.
In this thesis, the analysis of the method of fundamental solutions (MFS) is extended to biharmonic equations. Error bounds are derived for the traditional and the Almansi approaches in bounded, simply connected domains. Exponential and polynomial convergence rates are obtained for highly smooth and finitely smooth solutions, respectively. Bounds on the condition number are also derived for disk domains, showing exponential growth rates. This thesis provides the first rigorous analysis of the collocation Trefftz method (CTM) for biharmonic equations, and the intrinsic nature of accuracy and stability is similar to that of Laplace's equation.
Numerical experiments are carried out for both smooth and singular problems. The numerical results coincide with the theoretical analysis. When particular solutions satisfying the biharmonic equation can be found, the method of particular solutions (MPS) is always superior to the MFS, as supported by numerical examples. However, if such singular particular solutions near the singular points cannot be found, local refinement of collocation nodes and greedy adaptive techniques can be used; the greedy adaptive techniques seem to provide a better solution for singularity problems. Besides, the numerical solutions by Almansi's approach are slightly better in accuracy and stability than those by the traditional fundamental solutions. Hence, the MFS with Almansi's approach is recommended, owing to its simple analysis, which can be obtained directly from the analysis of the MFS for Laplace's equation.
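
For concreteness, in the standard 2D setting the two MFS bases compared above take the following forms (my paraphrase of the usual formulations in this literature, not notation from the thesis). The traditional approach expands in the biharmonic fundamental solution plus a harmonic term,
\[
u_N(\mathbf{x}) \;=\; \sum_{j=1}^{N} c_j\,|\mathbf{x}-\mathbf{y}_j|^{2}\ln|\mathbf{x}-\mathbf{y}_j| \;+\; \sum_{j=1}^{N} d_j\,\ln|\mathbf{x}-\mathbf{y}_j|,
\]
while Almansi's representation $u = v + |\mathbf{x}|^{2} w$ with $v, w$ harmonic leads to
\[
u_N(\mathbf{x}) \;=\; \sum_{j=1}^{N} a_j\,\ln|\mathbf{x}-\mathbf{y}_j| \;+\; |\mathbf{x}|^{2}\sum_{j=1}^{N} b_j\,\ln|\mathbf{x}-\mathbf{y}_j|,
\]
where the source points $\mathbf{y}_j$ lie on a circle outside the solution domain and the coefficients are fitted by collocating the two boundary conditions. The second expansion only involves the harmonic kernel $\ln r$, which is why its analysis reduces to that of the MFS for Laplace's equation.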
|
56 |
Greedy structure learning of Markov Random Fields. Johnson, Christopher Carroll. 04 November 2011.
Probabilistic graphical models are used in a variety of domains to capture and represent general dependencies in joint probability distributions. In this document we examine the problem of learning the structure of an undirected graphical model, also called a Markov Random Field (MRF), given a set of independent and identically distributed (i.i.d.) samples. Specifically, we introduce an adaptive forward-backward greedy algorithm for learning the structure of a discrete, pairwise MRF given a high dimensional set of i.i.d. samples. The algorithm works by greedily estimating the neighborhood of each node independently through a series of forward and backward steps. By imposing a restricted strong convexity condition on the structure of the learned graph we show that the structure can be fully learned with high probability given $n=\Omega(d\log p)$ samples where $d$ is the dimension of the graph and $p$ is the number of nodes. This is a significant improvement over existing convex-optimization based algorithms that require a sample complexity of $n=\Omega(d^2\log p)$ and a stronger irrepresentability condition. We further support these claims with an empirical comparison of the greedy algorithm to node-wise $\ell_1$-regularized logistic regression as well as provide a real data analysis of the greedy algorithm using the Audioscrobbler music listener dataset. The results of this document provide an additional representation of work submitted by A. Jalali, C. Johnson, and P. Ravikumar to NIPS 2011.
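
A sketch of the per-node forward-backward greedy step is below. This is an illustrative skeleton under my own assumptions: `loss` stands for the empirical node-conditional loss (e.g., logistic loss of the target node regressed on a candidate neighbor set), and the backward criterion of "remove if the loss increase is below nu times the last forward gain" is one common variant, not necessarily the exact bookkeeping of the thesis.

def fb_greedy(loss, p, eps, nu=0.5):
    # loss(S): empirical node-conditional loss using neighbor set S (a set).
    # p: number of candidate variables; eps: forward stopping threshold;
    # nu: backward slack factor in (0, 1).
    S = set()
    while True:
        # Forward step: add the variable giving the largest loss decrease.
        base = loss(S)
        best_j, best_gain = None, eps
        for j in range(p):
            if j in S:
                continue
            gain = base - loss(S | {j})
            if gain > best_gain:
                best_j, best_gain = j, gain
        if best_j is None:
            return S          # no candidate clears the threshold: stop
        S.add(best_j)
        # Backward steps: prune variables made redundant by later additions.
        while S:
            cur = loss(S)
            j_min = min(S, key=lambda j: loss(S - {j}))
            if loss(S - {j_min}) - cur < nu * best_gain:
                S.remove(j_min)
            else:
                break

In neighborhood-based structure learning of this kind, the full graph is then typically assembled by running the routine once per node and combining the estimated neighborhoods (e.g., by AND- or OR-ing the edge indicators).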
|
57 |
"Route Record Distance Vector Protocol for Minimization of Intra-Flow Interference"Seibel, Roman 24 October 2013 (has links)
No description available.
|
58 |
Characterizing problems for realizing policies in self-adaptive and self-managing systems. Balasubramanian, Sowmya. 15 March 2013.
Self-adaptive and self-managing systems optimize their own behaviour according to high-level objectives and constraints. One way for human administrators to effectively specify goals for such optimization problems is using policies. Over the past decade, researchers produced various approaches, models and techniques for policy specification in different areas including distributed systems, communication networks, web services, autonomic computing, and cloud computing. Research challenges range from characterizing policies for ease of specification in particular application domains to categorizing policies for achieving good solution qualities for particular algorithmic techniques.
The contributions of this thesis are threefold. Firstly, we give a mathematical formulation for each of the three policy types, action, goal and utility function policies, introduced in the policy framework by Kephart and Walsh. In particular, we introduce a first precise characterization of goal policies for optimization problems. Secondly, this thesis introduces a mathematical framework that adds structure to the underlying optimization problem for different types of policies. Structure is added either to the objective function or the constraints of the optimization problem. These mathematical structures, imposed on the underlying problem, progressively increase the quality of the solutions obtained when using the greedy optimization technique. Thirdly, we show the applicability of our framework through case studies by analyzing several optimization problems encountered in self-adaptive and self-managing systems, such as resource allocation, quality of service management, and Service Level Agreement (SLA) profit optimization to provide quality guarantees for their solutions.
Our approach combines the algorithmic results of Edmonds, Fisher et al., and Mestre with the policy framework of Kephart and Walsh. Our characterization and approach will help designers of self-adaptive and self-managing systems formulate optimization problems, decide on algorithmic strategies based on policy requirements, and reason about solution qualities.
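
For context, the classical guarantees behind these references are (standard statements, assuming these are the results the thesis invokes): greedy is exactly optimal when maximizing a linear objective over a matroid (Edmonds); for a monotone submodular objective $f$ over a matroid it satisfies
\[
f(S_{\mathrm{greedy}}) \;\ge\; \tfrac{1}{2}\, f(S^{*}),
\]
where $S^{*}$ is the optimal solution (Fisher et al.); and over a $k$-extendible system it achieves a $1/k$ fraction of the optimum (Mestre). Imposing structure on a policy's objective function or constraints thus moves the underlying optimization problem up this ladder of greedy solution-quality guarantees.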
|
60 |
Geographisches Routing: Grundlagen und Basisalgorithmen [Geographic Routing: Foundations and Basic Algorithms]. Frey, Hannes. January 2006.
Universität Trier, Diss., 2006.
|