61

Incertezas de modelo na análise de torres metálicas treliçadas de linhas de transmissão / Model uncertainties in the transmission lines latticed steel towers analysis

Kaminski Junior, Joao January 2007
Model uncertainties pervade all stages of a structural reliability analysis, from the description of the loads and of the structural system itself to the process by which the effect of the loads on the system is evaluated. In this study, attention is focused on the last issue, specifically on the evaluation of model uncertainties in transmission line (TL) latticed steel towers, a topic that has remained largely ignored in previous structural reliability work, in part because of its elusive nature. In essence, the problem consists of evaluating the uncertainty in response predictions once all parameters that define the external actions and the system itself have been unequivocally prescribed.
The main motivation of this thesis was a study conducted by CIGRÉ on TL latticed steel towers subjected to static loads, among other exploratory assessments, which suggests that model uncertainty in this type of structure is relevant, cannot be disregarded, and may significantly influence the outcome of reliability assessments. Herein, different mechanical models of self-supporting TL towers subjected to static loads are evaluated, as well as models of towers and TL segments subjected to the dynamic load caused by cable rupture, adopted because it is a "well-defined" dynamic loading. In the static analysis, models ranging from the simplified self-supporting tower models adopted in usual design practice to more refined models are studied. The dispersion of the numerical results among the models is used to quantify the uncertainty related to the mechanical model, and available results of static prototype tests are used to identify the models whose response is closest to the experimental values. The dynamic response of latticed TL steel towers subjected to cable rupture is compared across models with different degrees of sophistication and detail, ranging from the model usually adopted in the analysis and design of towers for this type of loading, through relatively simple models of a single tower subjected to a time-varying load that simulates the effect of a cable rupture, to more complex models of TL segments that include several towers, cables, and insulator strings. Several sources of uncertainty are evaluated, considering the influence of relevant factors such as the discretization of the cable elements, the boundary conditions of the end cable elements, the constitutive laws of the cable and tower members, and the structural damping. Finally, possible ways to explicitly account for model uncertainty in reliability assessments and in design codes for transmission line structures are discussed.
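As an illustration of how mechanical-model uncertainty can be quantified from the dispersion described above, the short Python sketch below computes a bias factor (measured response divided by each model's prediction) and its coefficient of variation. The member forces and test value are hypothetical placeholders, not results from the thesis.

```python
import numpy as np

# Hypothetical axial forces (kN) predicted for the same tower member by four
# alternative mechanical models, plus one prototype-test value.
# All numbers are illustrative placeholders, not data from the thesis.
model_predictions = np.array([152.0, 148.5, 160.2, 143.8])  # kN
test_value = 150.0                                          # kN

# Model-uncertainty (bias) factor per model: measured / predicted response.
phi = test_value / model_predictions

mean_phi = phi.mean()
cov_phi = phi.std(ddof=1) / mean_phi  # dispersion among models

print(f"mean bias factor         = {mean_phi:.3f}")
print(f"coefficient of variation = {cov_phi:.3f}")
```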
62

Mají devizové rezervy centrálních bank dopad na inflaci? / Do Central Bank FX Reserves Matter for Inflation?

Keblúšek, Martin January 2020
Foreign exchange reserves are a useful tool and a buffer, but maintaining an amount that is too large can be costly to the economy. The recent accumulation of these reserves points to the importance of this topic. This thesis focuses on one specific part of the effect of FX reserves on the economy: inflation. I use panel data for 74 countries from 1996 to 2017. There is a certain degree of model uncertainty, which this thesis accounts for by using the Bayesian model averaging (BMA) estimation technique. The findings from my model-averaging estimations show that FX reserves are not important for inflation determination, with close to no change when altering lags or variables, when limiting the sample to fixed FX regimes, or when limiting the sample to inflation-targeting regimes. The most important variables are estimated to be a central bank financial strength proxy, exchange rate depreciation, money supply, inflation targeting, and capital account openness. These results are robust to lag changes and prior changes, and for the most part remain the same when pooled OLS is used.
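The Bayesian model averaging mentioned above can be sketched in a few lines: each candidate specification is weighted by an approximation of its marginal likelihood, and a variable's importance is its posterior inclusion probability. The sketch below uses a BIC approximation and synthetic data; the variable names and the data-generating process are illustrative assumptions, not the thesis's actual specification.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: "inflation" regressed on a few candidate determinants.
n = 400
names = ["fx_reserves", "money_growth", "depreciation", "openness"]
X = rng.normal(size=(n, len(names)))
y = 0.8 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(size=n)

def bic(y, X):
    """BIC of an OLS fit with intercept (Gaussian likelihood)."""
    Z = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    return len(y) * np.log(resid @ resid / len(y)) + Z.shape[1] * np.log(len(y))

# Enumerate all subsets of regressors; weight each model by exp(-BIC / 2).
models, scores = [], []
for k in range(len(names) + 1):
    for subset in itertools.combinations(range(len(names)), k):
        models.append(subset)
        scores.append(-0.5 * bic(y, X[:, list(subset)]))
scores = np.asarray(scores)
weights = np.exp(scores - scores.max())
weights /= weights.sum()

# Posterior inclusion probability: total weight of the models containing a variable.
for j, name in enumerate(names):
    pip = sum(w for m, w in zip(models, weights) if j in m)
    print(f"PIP({name}) = {pip:.2f}")
```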
63

Odhad životnosti železobetonových mostů / Life-cycle analysis of reinforced concrete bridges

Doležel, Jiří January 2016
With the increasing age of concrete road bridges, determining their reliability and the load-bearing capacity required for the residual life of the structure is a highly topical question. This doctoral thesis presents a comprehensive methodology for assessing the reliability of reinforced and prestressed concrete bridges, based on virtual damage and failure simulations with the non-linear finite element method at both the deterministic and the stochastic level. Load-bearing capacity values are obtained either by estimating the structure's design load capacity with global safety factor methods or from a fully probabilistic load capacity analysis using direct estimation of the resistance. For the fully probabilistic calculations, the Latin Hypercube Sampling simulation technique is used.
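As a rough illustration of the Latin Hypercube Sampling technique mentioned above, the sketch below draws stratified samples of two resistance variables and estimates a failure probability. The distributions, resistance model, and load effect are invented for the example and are not taken from the thesis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 1000  # number of simulations

def latin_hypercube(n, dims, rng):
    """One uniform sample per equiprobable interval, shuffled per dimension."""
    u = (rng.random((n, dims)) + np.arange(n)[:, None]) / n
    for d in range(dims):
        u[:, d] = rng.permutation(u[:, d])
    return u

u = latin_hypercube(n, 2, rng)

# Illustrative basic variables: concrete strength and reinforcement yield stress.
f_c = stats.lognorm(s=0.15, scale=30.0).ppf(u[:, 0])  # MPa
f_y = stats.norm(loc=500.0, scale=30.0).ppf(u[:, 1])  # MPa

# Toy resistance model and deterministic load effect (illustrative only).
resistance = f_y + f_c
load_effect = 470.0
p_f = np.mean(resistance < load_effect)
print(f"estimated failure probability = {p_f:.4f}")
```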
64

A LIFE CYCLE ANALYSIS OF FOREST MANAGEMENT DECISIONS ON HARDWOODS PLANTATIONS

Sayon Ghosh 26 April 2023
In the Central Hardwood Region, the quantity and quality of hardwood timber critically depend on forest management decisions made by private landowners, since they hold the largest share of woodlands, some of which are plantations. These plantations are in a unique and critical position to provide much-needed hardwood resources. However, there is a lack of research and tools enabling rigorous assessments of the profitability of long-term investments in hardwood plantations. Partly for this reason, the majority of these privately held plantations remain unmanaged.

This study aims at providing scientific evidence and tools to help promote forest management on hardwood plantations held by private landowners. To this end, I demonstrate in Chapter 1 an economic-modeling approach that minimizes establishment costs while ensuring free-to-grow status by year 5 and crown closure by year 10. Using temperate hardwoods such as black walnut and red oak as focal species, I find that a black walnut plantation can attain crown closure in year six at the lowest cost ($4,540/ha) with 6 feet x 7 feet spacing, herbicide application for the first year, and fencing. For red oak, the minimum-cost option ($5,371/ha), which achieves crown closure in year 10, requires a planting density of 6 feet x 7 feet, herbicide application for the first three years, and fencing. Modeling uncertainty in growth and mortality in a stochastic counterpart shifts the optimal solutions to denser plantings for black walnut; planting more trees is thus risk-mitigating. Based on these outcomes, I identify the tradeoffs between the efficacy of establishment treatments vis-à-vis their relative costs, which serve as a solid foundation for the assessment of subsequent management strategies.

Next, in Chapter 2, I first calibrate growth, yield, and crown-width models for black walnut trees using existing and new tree measurements on selected Hardwood Tree Improvement and Regeneration Center (HTIRC) plots. Using spatial information on trees, I develop an individual-tree-level thinning model and simulate post-thinning growth and yield. Significant predictors of annual diameter growth between years 10 and 18 include the initial tree DBH, forest edge effects, distance-dependent neighborhood competition, and tree age. Significant edge effects exist up to 3 rows and 3 trees from the non-forested edge; a tree on the perimeter rows grows 0.30 cm (0.12 in.) more in DBH per year than the interior trees between years 10 and 18. Next, I dovetail the results of the spatially explicit thinning model with the USFS Forest Vegetation Simulator (FVS) to examine the impacts of different planting densities, site productivities, thinning treatments, and expected yields of veneer sawlogs (as a percentage of total volume) on growth and profitability from mid-rotation until the final harvest. To support the attendant financial analyses, I incorporate risk into these projections by simulating stochastic windthrows under certain assumptions. My projections suggest that, without the threat of windthrow damage, the net present value (NPV) could exceed $4,900 per acre on the highest-quality sites (SI = 100) with high planting densities (6 feet x 6 feet), assuming 10% or more of the final volume is veneer and using a 3% discount rate. In contrast, under simulations of probable windthrow disturbances from mid-rotation to final harvest, the chance that the standing timber value at harvest exceeds $5,000 per acre is 43.13% for 96- and 90-year rotations, and increases to 45.48% for a 75-year rotation and further to 56.04% for a 60-year rotation.
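To make the financial reasoning concrete, here is a minimal sketch of a discounted-revenue calculation with a stochastic windthrow event, in the spirit of the projections above. The annual damage probability, salvage fraction, harvest value, and rotation length are assumed for illustration and are not the estimates reported in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

discount_rate = 0.03            # matches the discount rate used above
rotation_years = 60             # illustrative rotation length
harvest_value = 6000.0          # assumed $/acre of standing timber at harvest
salvage_fraction = 0.35         # assumed fraction recovered after windthrow
annual_windthrow_prob = 0.005   # assumed annual probability from mid-rotation on

def simulate_once():
    """Return (value realized, year realized) for one simulated rotation."""
    for year in range(rotation_years // 2, rotation_years):
        if rng.random() < annual_windthrow_prob:
            return salvage_fraction * harvest_value, year
    return harvest_value, rotation_years

sims = [simulate_once() for _ in range(20_000)]
npv = np.array([v / (1 + discount_rate) ** t for v, t in sims])
value_realized = np.array([v for v, _ in sims])

print(f"mean discounted revenue        = ${npv.mean():,.0f}/acre")
print(f"P(timber value >= $5,000/acre) = {np.mean(value_realized >= 5000):.2%}")
```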
65

Development of Visual Tools for Analyzing Ensemble Error and Uncertainty

Anreddy, Sujan Ranjan Reddy 04 May 2018
Climate analysts use Coupled Model Intercomparison Project Phase 5 (CMIP5) simulations to make sense of model performance in predicting extreme events such as heavy precipitation. Similarly, weather analysts use numerical weather prediction (NWP) models to simulate weather conditions either by perturbing initial conditions or by changing multiple input parameterization schemes, e.g., cumulus and microphysics schemes. These simulations are used in operational weather forecasting and for studying the role of parameterization schemes in synoptic weather events such as storms. This work addresses the need for visualizing the differences in both CMIP5 and NWP model output. It proposes three glyph designs for communicating CMIP5 model error and describes the Ensemble Visual eXplorer tool, which provides multiple ways of visualizing NWP model output and the related input parameter space. The proposed interactive dendrogram offers an effective way to relate multiple input parameterization schemes to the spatial characteristics of model uncertainty features. The glyphs designed to communicate CMIP5 model error are extended to encode both parameterization schemes and graduated uncertainty, providing related insights at specific locations such as the storm center and the surrounding areas. The work analyzes different ways of using glyphs to represent parametric uncertainty through visual variables such as color and size, in conjunction with Gestalt visual properties, and demonstrates the use of visual analytics in resolving issues such as visual scalability. As part of this dissertation, we evaluated the three glyph designs using average precipitation rates predicted by CMIP5 simulations, and the Ensemble Visual eXplorer tool using the WRF March 4, 1999 North American storm track dataset.
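A tiny sketch of the per-location quantities such glyphs typically encode (ensemble-mean error and ensemble spread) is shown below; the grid, ensemble size, and fields are synthetic stand-ins rather than CMIP5 or WRF data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-in for an ensemble of precipitation fields on a small grid.
n_members, ny, nx = 8, 10, 12
ensemble = rng.gamma(shape=2.0, scale=1.5, size=(n_members, ny, nx))  # mm/day
observation = rng.gamma(shape=2.0, scale=1.5, size=(ny, nx))          # mm/day

# Per-grid-cell quantities a glyph could encode:
ensemble_mean = ensemble.mean(axis=0)
mean_error = ensemble_mean - observation       # error of the ensemble mean
spread = ensemble.std(axis=0, ddof=1)          # ensemble spread (uncertainty)

# e.g., map |mean_error| to glyph size and spread to glyph color saturation.
print("max |mean error| :", np.abs(mean_error).max())
print("max spread       :", spread.max())
```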
66

Reversible Jump Markov Chain Monte Carlo

Neuhoff, Daniel 15 March 2016
The four studies of this thesis are concerned predominantly with the dynamics of macroeconomic time series, both in the context of a simple DSGE model and from a pure time series modeling perspective.
67

Matching DSGE models to data with applications to fiscal and robust monetary policy

Kriwoluzky, Alexander 01 December 2009
This thesis is concerned with three questions. First, how can the effects of macroeconomic policy on the economy be estimated in general? Second, what are the effects of a pre-announced increase in government expenditures? Third, how should monetary policy be conducted if the policymaker faces uncertainty about the economic environment?

In the first chapter I suggest estimating the effects of an exogenous disturbance on the economy by considering the parameter distributions of a vector autoregression (VAR) model and a dynamic stochastic general equilibrium (DSGE) model jointly. This makes it possible to resolve the major issues a researcher faces when working with a VAR model and a DSGE model: the identification of the VAR model and the potential misspecification of the DSGE model.

The second chapter applies the methodology of the first chapter to investigate the effects of a pre-announced change in government expenditure on private consumption and real wages. The shock is identified by exploiting its pre-announced nature, i.e., the different signs of the responses of endogenous variables during the announcement period and after the realization of the shock. Private consumption is found to respond negatively during the announcement period and positively after the realization. The reaction of real wages is positive on impact and remains positive for two quarters after the realization.

In the last chapter, "Optimal Policy Under Model Uncertainty: A Structural-Bayesian Estimation Approach", I investigate jointly with Christian Stoltenberg how policy should optimally be conducted when the policymaker faces uncertainty about the economic environment. The standard procedure is to specify a prior over the parameter space while ignoring the status of some sub-models. We propose a procedure that ensures that the specified set of sub-models is not discarded too easily. We find that optimal policy based on our procedure leads to welfare gains compared with the standard practice.
68

Robust stochastic analysis with applications

Prömel, David Johannes 02 December 2015
In this thesis, new robust integration techniques suitable for various problems from stochastic analysis and mathematical finance are presented, together with some applications. We begin with two different approaches to stochastic integration in robust financial mathematics. The first is inspired by Ito integration and based on a certain topology induced by an outer measure corresponding to a minimal superhedging price. The second approach relies on the controlled rough path integral. We prove that this integral is the limit of non-anticipating Riemann sums and that every "typical price path" has an associated Ito rough path. For one-dimensional "typical price paths" it is further shown that they possess Hölder continuous local times. Additionally, we provide various generalizations of Föllmer's pathwise Ito formula. Recalling that rough path theory can be developed using the concept of controlled paths and a topology that includes the information of Lévy's area, we give sufficient conditions for the pathwise existence of Lévy's area in terms of being controlled. This leads us to study Föllmer's pathwise Ito formula from the perspective of controlled paths. A multi-parameter extension of rough path theory is the paracontrolled distribution approach, recently introduced by Gubinelli, Imkeller and Perkowski. We generalize their approach from Hölder spaces to Besov spaces in order to solve rough differential equations, and as an application we deal with stochastic differential equations driven by random functions.

Finally, considering strongly coupled systems of forward and backward stochastic differential equations (FBSDEs), we extend the existence, uniqueness and regularity theory of so-called decoupling fields to Markovian FBSDEs with locally Lipschitz continuous coefficients. These results allow us to solve the Skorokhod embedding problem for a class of Gaussian processes with non-linear drift.
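For orientation, Föllmer's pathwise Ito formula, which the thesis generalizes, can be stated in its standard textbook form (quoted here as background rather than from the thesis): for a path X admitting a quadratic variation [X] along a fixed sequence of partitions and for F twice continuously differentiable,

```latex
F(X_t) = F(X_0) + \int_0^t F'(X_s)\,\mathrm{d}X_s
       + \frac{1}{2}\int_0^t F''(X_s)\,\mathrm{d}[X]_s ,
```

where the first integral is defined pathwise as the limit of left-point Riemann sums along the same partition sequence.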
69

Robust utility maximization, f-projections, and risk constraints

Gundel, Anne 01 June 2006
Finding payoff profiles that maximize the expected utility of an agent under a budget constraint is a key issue in financial mathematics. We characterize optimal contingent claims for an agent who is uncertain about the market model. The dual approach that we use leads to a minimization problem for a certain convex functional over two sets of measures, which we first have to solve. Finally, we incorporate a second constraint that limits the risk the agent is allowed to take. We proceed as follows.

Chapter 1. Given a convex function f, we consider the problem of minimizing the f-divergence f(P|Q) over these two sets of measures. We show that, if the first set is closed and the second set is weakly compact, a minimizer exists if f(infinity)/infinity = infinity. Furthermore, we show that if the second set of measures is weakly compact and f(infinity)/infinity = 0, then there is a minimizer in a class of extended martingale measures.

Chapter 2. The existence results of Chapter 1 lead to the existence of a contingent claim that maximizes the robust utility functional inf E_Q[u(X)] over a set of affordable contingent claims, where the infimum is taken over a set of subjective or model measures. The key idea is to identify the minimizing measures from the first chapter as certain worst-case measures.

Chapter 3. Finally, we require the risk of the contingent claims to be bounded. We solve the robust problem in an incomplete market for a utility function that is only defined on the positive half-line. In an example we compare the optimal claim under this risk constraint with the optimal claims without a risk constraint and under a value-at-risk constraint.
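In the notation of the abstract, the two central objects are the f-divergence and the robust utility maximization problem. A standard formulation reads as follows (stated here as background; the set of model measures Q, the set of pricing measures P, and the budget x are generic placeholders):

```latex
f(P \mid Q) = \int f\!\left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right)\mathrm{d}Q
\quad (P \ll Q),
\qquad
\max_{X}\; \inf_{Q \in \mathcal{Q}} E_Q[u(X)]
\quad \text{s.t.} \quad \sup_{P^* \in \mathcal{P}} E_{P^*}[X] \le x .
```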
70

Revisiting the Effects of IMF Programs on Poverty and Inequality

Oberdabernig, Doris Anita 20 August 2012
Investigating how lending programs of the International Monetary Fund (IMF) affect poverty and inequality, we explicitly address model uncertainty. We control for endogenous selection into IMF programs using data on 86 low- and middle-income countries for the 1982-2009 period and analyze program effects on various poverty and inequality measures. The results rely on averaging over 90 specifications of treatment-effect models and indicate adverse short-run effects of IMF agreements on poverty and inequality for the whole sample, while for a 2000-2009 subsample the results are reversed. There is evidence that significant short-run effects might disappear in the long run. (author's abstract) / Series: Department of Economics Working Paper Series
