  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
451

Tradeoff between Investments in Infrastructure and Forecasting when Facing Natural Disaster Risk

Kim, Seong D. May 2009
Hurricane Katrina in 2005 caused at least 81 billion dollars of property damage. In planning for such emergencies, society must decide whether to invest in the ability to evacuate more speedily or in improved forecasting technology that better predicts the timing and intensity of the critical event. To address this need, we use dynamic programming and Markov processes to model the interaction between the emergency response system and the emergency forecasting system. Simulating changes in the speed of evacuation and in the accuracy of forecasting allows the determination of an optimal mix of these two investments. The model shows that improvements in evacuation and in forecasting yield distinctly different patterns of benefit. It also shows that the optimal investment decision changes with the budget and with the feasible range of improvement.
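The tradeoff the abstract describes can be sketched numerically. The snippet below is a toy allocation search, not the dissertation's dynamic program: the loss function, the returns to each investment, and the budget are all hypothetical, chosen only to show how an optimal split between evacuation and forecasting investment emerges.

```python
def expected_loss(evac_invest, fcst_invest):
    # Hypothetical response curves: faster evacuation shrinks exposure,
    # better forecasts (with diminishing returns) shrink surprise losses.
    evac_speed = 1.0 + 0.8 * evac_invest
    fcst_accuracy = 0.5 + 0.4 * fcst_invest / (1.0 + fcst_invest)
    exposure = 100.0 / evac_speed
    surprise = 1.0 - fcst_accuracy
    return exposure * (0.3 + surprise)

def best_split(budget, step=0.01):
    # Enumerate budget allocations; keep the one with minimal expected loss.
    splits = [(i * step, budget - i * step) for i in range(int(budget / step) + 1)]
    return min(splits, key=lambda s: expected_loss(*s))

evac, fcst = best_split(2.0)
print(evac, fcst, expected_loss(evac, fcst))
```

Because the two investments enter the loss through different curvatures, the optimal mix is interior rather than a corner solution, mirroring the abstract's point that the two improvements yield different patterns of benefit.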
452

Analysis of flood hazard under consideration of dike breaches

Vorogushyn, Sergiy January 2008
River reaches protected by dikes exhibit high damage potential due to the strong accumulation of values in the hinterland areas. While providing efficient protection against low-magnitude flood events, dikes may fail under the load of extreme water levels and long flood durations. Hazard and risk assessments for river reaches protected by dikes have so far not adequately considered the fluvial inundation processes. In particular, the processes of dike failure and their influence on hinterland inundation and flood wave propagation lack comprehensive consideration. This study focuses on the development and application of a new modelling system that allows a comprehensive flood hazard assessment along diked river reaches under consideration of dike failures. The proposed Inundation Hazard Assessment Model (IHAM) is a hybrid probabilistic-deterministic model. It comprises three models interactively coupled at runtime: (1) a 1D unsteady hydrodynamic model of river channel and floodplain flow between the dikes, (2) a probabilistic dike breach model that determines possible dike breach locations, breach widths, and breach outflow discharges, and (3) a 2D raster-based diffusion wave storage cell model of the hinterland areas behind the dikes. Due to the unsteady nature of the coupled 1D and 2D models, the dependence between hydraulic loads at various locations along the reach is explicitly considered. The probabilistic dike breach model describes dike failures due to three failure mechanisms: overtopping, piping, and slope instability caused by seepage flow through the dike core (micro-instability). The 2D storage cell model, driven by the breach outflow boundary conditions, computes an extended spectrum of flood intensity indicators such as water depth, flow velocity, impulse, inundation duration, and rate of water rise.
IHAM is embedded in a Monte Carlo simulation in order to account for the natural variability of the flood generation processes, reflected in the form of the input hydrographs, and for the randomness of dike failures, given by breach locations, times, and widths. The model was developed and tested on an approximately 91 km, heavily diked reach of the German Elbe between the gauges Torgau and Vockerode. The reach is characterised by a low slope and fairly flat, extended hinterland areas. Scenario calculations with synthetic input hydrographs for the main river and its tributary were carried out for floods with return periods of T = 100, 200, 500, and 1000 years. Based on the modelling results, probabilistic dike hazard maps could be generated that indicate the failure probability of each discretised dike section for every scenario magnitude. In the disaggregated display mode, the dike hazard maps indicate the failure probabilities for each considered breach mechanism. Besides binary inundation patterns that indicate the probability of raster cells being inundated, IHAM generates probabilistic flood hazard maps. These maps display spatial patterns of the considered flood intensity indicators and their associated return periods. Finally, scenarios of polder deployment for the extreme floods with T = 200, 500, and 1000 years were simulated with IHAM. The developed IHAM simulation system represents a new scientific tool for studying fluvial inundation dynamics under extreme conditions, incorporating the effects of technical flood protection measures. With its major outputs in the form of novel probabilistic inundation and dike hazard maps, the IHAM system has high practical value for decision support in flood management.
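The probabilistic breach component can be illustrated with a small Monte Carlo sketch. The fragility curves, water levels, and all parameters below are invented for illustration and are not IHAM's calibrated failure models:

```python
import math
import random

def failure_prob(h, mechanism):
    # Hypothetical logistic fragility curves per breach mechanism;
    # h is the peak water level in metres above the dike toe.
    mid, spread = {"overtopping": (6.0, 0.3),
                   "piping": (5.0, 0.8),
                   "micro-instability": (5.5, 0.6)}[mechanism]
    return 1.0 / (1.0 + math.exp(-(h - mid) / spread))

def breach_probabilities(levels, n_runs=5000, seed=1):
    # Monte Carlo over discretised dike sections: a section breaches in a
    # run if any of the three mechanisms is triggered.
    rng = random.Random(seed)
    mechanisms = ("overtopping", "piping", "micro-instability")
    counts = [0] * len(levels)
    for _ in range(n_runs):
        for i, h in enumerate(levels):
            if any(rng.random() < failure_prob(h, m) for m in mechanisms):
                counts[i] += 1
    return [c / n_runs for c in counts]

print(breach_probabilities([4.0, 5.5, 6.5]))  # breach probability per section
```

Mapping the per-section probabilities back onto the reach is, in essence, what the probabilistic dike hazard maps display for each scenario magnitude.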
453

NONPARAMETRIC INFERENCES FOR THE HAZARD FUNCTION WITH RIGHT TRUNCATION

Akcin, Haci Mustafa 03 May 2013
Incompleteness is a major feature of time-to-event data. As one type of incompleteness, truncation refers to the unobservability of the time-to-event variable because it is smaller (or greater) than the truncation variable. A truncated sample always involves left and right truncation. Left truncation has been studied extensively, while right truncation has not received the same level of attention. In one of the earliest studies on right truncation, Lagakos et al. (1988) proposed transforming a right-truncated variable into a left-truncated variable and then applying existing methods to the transformed variable. The reverse-time hazard function is introduced through this transformation; however, this quantity does not have a natural interpretation. Gaps remain in the inferences for the regular forward-time hazard function with right-truncated data. This dissertation discusses variance estimation of the cumulative hazard estimator, a one-sample log-rank test, and the comparison of hazard rate functions among finite independent samples in the context of right truncation. First, the relation between the reverse- and forward-time cumulative hazard functions is clarified. This relation leads to nonparametric inference for the cumulative hazard function. Jiang (2010) recently conducted research in this direction and proposed two variance estimators of the cumulative hazard estimator. Some revision to these variance estimators is suggested in this dissertation and evaluated in a Monte Carlo study. Second, this dissertation studies hypothesis testing for right-truncated data. A series of tests is developed with the hazard rate function as the target quantity. A one-sample log-rank test is discussed first, followed by a family of weighted tests for comparisons among finite $K$ samples. Particular weight functions lead to the log-rank, Gehan, and Tarone-Ware tests, and these three tests are evaluated in a Monte Carlo study.
Finally, this dissertation studies nonparametric inference for the hazard rate function with right-truncated data. The kernel smoothing technique is utilized in estimating the hazard rate function. A Monte Carlo study investigates the uniform kernel smoothed estimator and its variance estimator. The uniform, Epanechnikov, and biweight kernel estimators are implemented on the blood-transfusion-related AIDS data.
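The kernel smoothing step can be sketched generically. The function below smooths Nelson-Aalen-type hazard increments with a uniform kernel; the event times, counts, and risk-set sizes are toy inputs, and the treatment of right truncation (which determines the risk sets) is deliberately left outside this sketch:

```python
def smoothed_hazard(t, times, d, n, b=1.0):
    # Kernel-smoothed hazard at time t from Nelson-Aalen increments d_j/n_j,
    # using a uniform kernel K(u) = 1/2 on [-1, 1] with bandwidth b.
    total = 0.0
    for tj, dj, nj in zip(times, d, n):
        u = (t - tj) / b
        if abs(u) <= 1.0:
            total += 0.5 * dj / nj
    return total / b

times = [1.0, 2.0, 3.0, 4.0]   # distinct event times
d = [1, 2, 1, 1]               # event counts at those times
n = [10, 8, 5, 3]              # risk-set sizes at those times
print(smoothed_hazard(2.5, times, d, n))
```

Replacing the indicator-style uniform kernel with an Epanechnikov or biweight weight function gives the other estimators mentioned in the abstract, without changing the structure of the sum.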
455

Unemployment Insurance Eligibility and the Dynamics of the Labor Market

Zhang, Min 23 February 2010
This thesis examines a number of issues regarding the empirical performance of the Mortensen-Pissarides search and matching model. Chapter 1 documents the volatility puzzle with Canadian data. The combined data from Canada and the United States present an additional difficulty: even if the unobserved value of leisure is allowed to be as high as required to fit the business cycle in the United States or in Canada, the model cannot reconcile the similar labor cycles with the large policy differences in UI benefits and income taxes between the two countries when the value of leisure is assumed to be the same in both. Chapter 2 takes into account realistic institutional features of the UI system and investigates the impact of UI benefits on labor market outcomes. If entitlement to UI benefits must be earned with employment, generous UI is an additional benefit to an employment relationship, so it promotes job creation. If individuals are risk neutral, UI is fairly priced, and the UI system prevents moral hazard among unemployed workers, then the generosity of UI has no effect on unemployment. Chapter 3 shows that the Mortensen-Pissarides model can be parameterized to generate the observed large cyclical fluctuations in unemployment together with modest responses of unemployment to changes in UI benefits. The key features behind this success are endogenous eligibility for UI benefits and worker heterogeneity. With the linear utilities commonly assumed in the Mortensen-Pissarides model, a fully rated UI system designed to prevent moral hazard has no effect on unemployment. However, the UI system in the United States is neither fully rated nor able to prevent workers with low productivity from quitting their jobs or rejecting employment offers to collect benefits. As a result, an increase in UI generosity has a positive, but realistically small, effect on unemployment. This chapter answers the Costain and Reiter (2008) criticism within the Mortensen-Pissarides framework.
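The steady-state mechanics of the Mortensen-Pissarides framework can be sketched in a few lines. The matching efficiency, elasticity, and separation rate below are illustrative placeholders, not the thesis's calibration:

```python
def job_finding_rate(theta, A=0.6, alpha=0.5):
    # Cobb-Douglas matching: f(theta) = A * theta^(1 - alpha),
    # where theta = vacancies / unemployment (market tightness).
    return A * theta ** (1.0 - alpha)

def steady_state_unemployment(theta, s=0.03):
    # Flow balance: s * (1 - u) = f(theta) * u  =>  u = s / (s + f(theta)).
    f = job_finding_rate(theta)
    return s / (s + f)

for theta in (0.5, 1.0, 2.0):
    print(theta, steady_state_unemployment(theta))
```

The volatility puzzle concerns how much theta, and hence u, moves over the cycle for plausible shocks; policies such as UI benefits enter through the surplus that determines equilibrium tightness.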
456

Weather-related geo-hazard assessment model for railway embankment stability

Gitirana Jr., Gilson 01 June 2005
The primary objective of this thesis is to develop a model for the quantification of weather-related railway embankment hazards. This model constitutes an essential component of a decision support system required for the management of railway embankment hazards. A model for the deterministic and probabilistic assessment of weather-related geo-hazards (the W-GHA model) is proposed based on concepts of unsaturated soil mechanics and hydrology. The model combines a system of two-dimensional partial differential equations governing the thermo-hydro-mechanical behaviour of saturated/unsaturated soils with soil-atmosphere coupling equations. A dynamic programming algorithm for slope stability analysis (Safe-DP) was developed and incorporated into the W-GHA model. Finally, an efficient probabilistic and sensitivity analysis framework based on an alternative point estimate method is proposed. Within the W-GHA framework, railway embankment hazards are assessed using factors of safety and probabilities of failure computed from soil property variability and case scenarios. A comprehensive study of unsaturated soil property variability is presented. A methodology for the characterization and assessment of unsaturated soil property variability is proposed. Appropriate fitting equations and parameters were selected. Probability density functions adequate for representing the unsaturated soil parameters studied were determined. Typical central tendency measures, variability measures, and correlation coefficients were established for the unsaturated soil parameters. The inherent variability of unsaturated soil properties can be addressed using the probabilistic analysis framework proposed herein. A large number of hypothetical railway embankments were analysed using the proposed model.
These analyses were undertaken to demonstrate the application of the proposed model and to determine the sensitivity of the factor of safety to uncertainty in several input variables. The conclusions drawn from the sensitivity study resulted in important simplifications of the W-GHA model. It was shown how unsaturated soil mechanics can be applied to the assessment of near-ground-surface stability hazards. The approach proposed in this thesis forms a protocol for the application of unsaturated soil mechanics in geotechnical engineering practice, based on predicted unsaturated soil properties and on the use of case scenarios to address soil property uncertainty. Other classes of unsaturated soil problems will benefit from the protocol presented in this thesis.
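The alternative point estimate method can be illustrated with a Rosenblueth-style two-point scheme. The limit function and the moments of the soil properties below are hypothetical stand-ins, not the calibrated W-GHA inputs:

```python
from itertools import product

def point_estimate(g, means, sds):
    # Evaluate g at every (mu_i +/- sigma_i) corner, equally weighted,
    # and return the implied mean and standard deviation of g.
    corners = product((-1.0, 1.0), repeat=len(means))
    vals = [g([m + s * e for m, s, e in zip(means, sds, signs)])
            for signs in corners]
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    return mean, var ** 0.5

def factor_of_safety(x):
    # Toy limit function: cohesion and friction stabilize; pore pressure does not.
    cohesion, friction, pore = x
    return (cohesion + 40.0 * friction) / (50.0 + pore)

mean_fs, sd_fs = point_estimate(factor_of_safety, [25.0, 0.6, 5.0], [5.0, 0.1, 2.0])
print(mean_fs, sd_fs)
```

With n uncertain properties the scheme needs 2^n evaluations of the stability model, which is why sensitivity-driven simplifications of the input set matter in practice.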
458

Accounting Conservatism and the Prediction of Corporate Bankruptcy

Perkins, Alexander H 01 January 2013
This paper examines the relationship between the accounting conservatism construct and the prediction of corporate bankruptcy. Prior research has explored the link between accounting quality and bankruptcy prediction, but it has not examined the relationship between accounting conservatism and bankruptcy prediction. This study hypothesizes that the inclusion of conservatism metrics in the bankruptcy hazard model estimation process should have an incremental effect on the predictive ability of bankruptcy hazard models. This paper finds that the inclusion of conservatism metrics does enhance the predictive power of bankruptcy hazard models for certain subgroups of a population partitioned on the basis of accounting conservatism metrics.
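A discrete-time bankruptcy hazard model of the kind discussed here is essentially a logit on firm-year observations. The sketch below fits such a logit by gradient descent on simulated data with a conservatism score as an extra covariate; the covariates, coefficients, and sample are invented for illustration, not the paper's data or estimates:

```python
import math
import random

def fit_logit(X, y, lr=0.5, steps=2000):
    # Plain gradient descent on the logistic log-likelihood;
    # w[0] is the intercept, w[1:] are the slopes.
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(steps):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            grad[0] += p - yi
            for j, xj in enumerate(xi):
                grad[j + 1] += (p - yi) * xj
        w = [wj - lr * g / len(y) for wj, g in zip(w, grad)]
    return w

# Simulated firm-years: covariates are [leverage, conservatism]; failure is
# made more likely by high leverage and less likely by high conservatism.
random.seed(0)
X, y = [], []
for _ in range(400):
    lev, cons = random.random(), random.random()
    p_true = 1.0 / (1.0 + math.exp(-(3.0 * lev - 2.0 * cons - 1.0)))
    X.append([lev, cons])
    y.append(1 if random.random() < p_true else 0)

w = fit_logit(X, y)
print(w)  # expect a positive leverage slope and a negative conservatism slope
```

Comparing out-of-sample discrimination of the model with and without the conservatism covariate is the kind of incremental-predictive-power test the paper's hypothesis calls for.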
459

The Occurrence and Behavior of Rainfall-Triggered Landslides in Coastal British Columbia

Guthrie, Richard 05 June 2009
This thesis analyzes the occurrence and behavior of rainfall-triggered landslides in coastal British Columbia. In particular, it focuses on the temporal and spatial distribution of landslide occurrence and on landslide magnitudes, and it considers the major factors that influence regional landslide behavior. Implicit in the research is the understanding that the landscape of coastal BC is managed, and that landslides, in addition to occurring naturally, may be caused by, and certainly impact, resources that are important to humankind. Underlying each chapter is the rationale that by better understanding the causes of, and controls on, landslide occurrence and magnitude, we can reduce the impacts and lower the associated risk. Statistical magnitude-frequency relationships are examined for coastal BC. Observations suggest that landslides in coastal British Columbia tend toward larger sizes up to about 10,000 m² in total area; beyond this point, larger landslides are limited by landscape controls according to a power law. Probabilistic regional hazard analysis is one logical outcome of magnitude-frequency analysis, and a regional mass movement hazard map for Vancouver Island is presented. Physiographic controls on statistical magnitude-frequency distributions are examined using a cellular automaton based model, and the results compare favorably to actual landslide behavior: modeled landslides bifurcate at local elevation highs, deposit mass preferentially where local slopes decrease, find routes in confined valley or channel networks, and, when sufficiently large, overwhelm the local topography. The magnitude-frequency distributions of both the actual landslides and the cellular automaton model follow a power law for magnitudes above 10,000-20,000 m² and show a flattening of the slope for smaller magnitudes.
The results provide strong corroborative evidence for physiographic limitations related to slope, slope distance, and the distribution of mass within landslides. The physiographic controls on landslide magnitude, debris flow mobility, and runout behavior are examined using detailed field and air photograph analysis. The role of slope in deposition and scour is investigated, and a practical method for estimating both entrainment and runout in the field, as well as in a GIS environment, is presented. Further controls on landslide mobility, including the role of gullies and stream channels, roads and benches, and intact forests, are considered. The role of landslides in controlling landscape physiography is also examined. In particular, it is determined that moderate-sized landslides do the most work transporting material on hillslopes, defined by a work peak, and that this magnitude varies with local physiography and climate. Landslides that form the work peak are distinct from catastrophic landslides, which are themselves formative and system resetting. The persistence time for debris slides/debris flows and rock slides/rock avalanches is calculated over six orders of magnitude, and an event is considered catastrophic when it persists in the landscape ten times longer than the population of landslides that forms the work peak. A detailed case study examines meteorological controls on landslide occurrence, and the role of extreme weather is considered. A critical landslide-triggering rainfall intensity is determined to lie between 80 mm and 100 mm in 24 hours, and wind is found to increase local precipitation. The role of rain-on-snow is also evaluated and determined to be crucial to landslide occurrence. Finally, a conceptual model of landslide-induced denudation for coastal mountain watersheds spanning 10,000 years of environmental change is presented. Recent human impacts on landslide frequencies are calculated over the 20th century.
The impact of logging during the last 100 years is unambiguous: logging-induced landslides almost double the landslide frequency of the wettest millennia in the last 10,000 years. This suggests that the impact of logging outpaces that of climatic change. Debris slides and debris flows are estimated to have lowered the landscape by 0.7 m across Vancouver Island during the last 10,000 years.
460

Reinsurance Contracting with Adverse Selection and Moral Hazard: Theory and Evidence

Yan, Zhiqiang 03 September 2009
This dissertation includes two essays on adverse selection and moral hazard problems in reinsurance markets. The first essay builds a competitive principal-agent model that considers adverse selection and moral hazard jointly and graphically characterizes various forms of separating Nash equilibria. In the second essay, we use panel data on U.S. property-liability reinsurance for the period 1995-2000 to test for the existence of adverse selection and moral hazard. We find that (1) adverse selection is present in the private passenger auto liability and homeowners reinsurance markets, but not in the product liability reinsurance market; (2) residual moral hazard is absent in each of the three largest lines of reinsurance, but is present in the overall reinsurance market; and (3) moral hazard is present in the product liability reinsurance market, but not in the other two lines.
