41

The use of absorbing boundaries in the analysis of bankruptcy

Hildebrand, Paul 11 1900 (has links)
An explicit solution is given for the value of a risk-neutral firm with stochastic revenue facing the possibility of bankruptcy. The analysis is conducted in continuous time. Uncertainty is modeled using an Ito process, and bankruptcy is modeled as an absorbing boundary. The analysis yields an ordinary differential equation with a closed-form solution. The value function is used to calculate the firm's demand for high-interest-rate loans, showing a positive demand at interest rates that intuitively appear excessive. A value function is also derived for a risk-neutral lender advancing funds to the firm. The borrowing and lending value functions are then used to examine various aspects of lender-borrower transactions under different bargaining structures. In a competitive lending market, the model shows that credit rationing inevitably occurs. In a monopoly lending market, the lender sets interest rates and maximum loan levels that reduce the borrower to zero profit. When a second borrower is introduced, the lender must allocate limited funds between two borrowers. A lender is shown to squeeze the smaller, "riskier" borrower out of the market when the lender's overall credit constraint is tight. Under each bargaining structure, the model is also used to examine changes in the respective "salvage" recoveries of the lender and borrower on bankruptcy. Accepted: / Arts, Faculty of / Vancouver School of Economics / Graduate
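The setup described above can be illustrated numerically. The sketch below is a minimal Monte Carlo approximation, assuming geometric Brownian motion for revenue; it is not the thesis's closed-form solution, and all parameter values (`mu`, `sigma`, `r`, the bankruptcy barrier) are illustrative placeholders:

```python
import math
import random

def firm_value_mc(x0, barrier, mu=0.02, sigma=0.3, r=0.05,
                  dt=0.02, horizon=20.0, n_paths=500, seed=1):
    """Monte Carlo estimate of a risk-neutral firm's value when revenue
    follows an Ito process (geometric Brownian motion here) and bankruptcy
    is an absorbing boundary at `barrier`. The thesis derives this value
    in closed form; the simulation only illustrates the mechanism."""
    rng = random.Random(seed)
    steps = int(horizon / dt)
    total = 0.0
    for _ in range(n_paths):
        x, pv = x0, 0.0
        for k in range(steps):
            if x <= barrier:                      # absorbed: bankruptcy, value stops accruing
                break
            pv += math.exp(-r * k * dt) * x * dt  # discounted revenue flow
            # Euler step of dX = mu*X dt + sigma*X dW
            x += mu * x * dt + sigma * x * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        total += pv
    return total / n_paths
```

Raising the barrier toward the initial revenue level shortens the expected time to absorption, which is the channel through which bankruptcy risk depresses firm value in the model.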
42

A Multivariate Modeling Approach for Generating Ensemble Climatology Forcing for Hydrologic Applications

Khajehei, Sepideh 21 July 2015 (has links)
Reliability and accuracy of the forcing data play a vital role in hydrological streamflow prediction: reliable forcing leads to accurate predictions and, ultimately, reduced uncertainty. Currently, Numerical Weather Prediction (NWP) models produce ensemble forecasts at various temporal and spatial scales. However, the raw products of NWP models can be biased at the basin scale, unlike at the model grid scale, depending on the size of the catchment. Due to the large space-time variability of precipitation, bias-correcting ensemble forecasts has proven to be a challenging task. In recent years, Ensemble Pre-Processing (EPP), a statistical approach, has proven helpful in reducing bias and generating reliable forecasts. The procedure is based on the bivariate probability distribution between observations and single-value precipitation forecasts. In the current work, we have applied and evaluated a Bayesian approach, based on copula density functions, to develop ensemble precipitation forecasts from the conditional distribution of the single-value precipitation forecast. Copula functions join univariate marginal distributions into a multivariate joint distribution and can model the joint distribution of two variables at any level of correlation and dependency. The advantages of using copulas include, among others, the capability of modeling the joint distribution independently of the type of marginal distribution. In the present study, we have evaluated the capability of copula-based functions in EPP and compared them against an existing and commonly used procedure, the meta-Gaussian distribution.
Monthly precipitation forecasts from the Climate Forecast System (CFS) and gridded observations from the Parameter-elevation Relationships on Independent Slopes Model (PRISM) were used to create ensemble pre-processed precipitation over three sub-basins in the western USA at 0.5-degree spatial resolution. The comparison was made using both deterministic and probabilistic evaluation frameworks. Across all sub-basins and evaluation techniques, the copula-based technique shows more reliability and robustness than the meta-Gaussian approach.
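The conditioning step at the heart of EPP can be sketched in a few lines. This is a minimal illustration assuming a Gaussian copula with empirical marginals; it is not the specific Bayesian fitting procedure of the thesis, and the plotting-position marginals stand in for the fitted distributions an operational system would use:

```python
import numpy as np
from statistics import NormalDist

def copula_ensemble(obs, fcst, new_fcst, n_members=20, seed=0):
    """Draw an ensemble for `new_fcst` from the conditional distribution
    of the observation given the single-value forecast, via a Gaussian
    copula linking the two empirical marginals."""
    nd = NormalDist()
    rng = np.random.default_rng(seed)
    obs = np.asarray(obs, float)
    fcst = np.asarray(fcst, float)
    n = len(obs)

    def to_z(sample, x):
        # empirical CDF (Weibull plotting position), then normal quantile
        rank = np.searchsorted(np.sort(sample), x, side="right")
        u = np.clip(rank / (n + 1.0), 1e-6, 1.0 - 1e-6)
        return np.array([nd.inv_cdf(v) for v in np.atleast_1d(u)])

    z_o = to_z(obs, obs)
    z_f = to_z(fcst, fcst)
    rho = np.corrcoef(z_o, z_f)[0, 1]   # dependence in Gaussian space
    z = to_z(fcst, new_fcst)[0]
    # conditional Gaussian: Z_obs | Z_fcst = z  ~  N(rho*z, 1 - rho^2)
    draws = rng.normal(rho * z, np.sqrt(1.0 - rho**2), n_members)
    u = np.array([nd.cdf(d) for d in draws])
    return np.quantile(obs, u)          # map back to observation space
```

Because the dependence is modeled entirely in Gaussian score space, the marginals can be of any type, which is the property the abstract highlights as the main advantage of copulas over the meta-Gaussian approach.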
43

Synoptic-scale water budgets for quantitative precipitation diagnosis and forecasting

Domm, Geoffrey Shepherd January 1980 (has links)
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Meteorology, 1980. / Microfiche copy available in Archives and Science. / Bibliography: leaf 134. / by Geoffrey Shepherd Domm. / M.S.
44

Synoptic and diagnostic analyses of CASP storm #14

Jean, Michel, 1959 Sept. 29- January 1987 (has links)
No description available.
45

The use of neural networks in the combining of time series forecasts with differential penalty costs

Kohers, Gerald 21 October 2005 (has links)
The need for accurate forecasting and its potential benefits are well established in the literature. Virtually all individuals and organizations have at one time or another made decisions based on forecasts of future events. This widespread need for accurate predictions has resulted in considerable growth in the science of forecasting. To a large degree, practitioners are heavily dependent on academicians for generating new and improved forecasting techniques. In response to an increasingly dynamic environment, diverse and complex forecasting methods have been proposed to more accurately predict future events. These methods, which focus on different characteristics of historical data, range in complexity from simplistic calculations to very sophisticated mathematical computations requiring a high level of expertise. By combining individual techniques into composite forecasts to improve forecasting accuracy, researchers have taken advantage of the various strengths of these techniques. A number of combining methods have proven to yield better forecasts than individual methods, with the complexity of the various combining methods ranging from a simple average to quite complex weighting schemes. The focus of this study is to examine the usefulness of neural networks in composite forecasting. Emphasis is placed on the effectiveness of two neural networks (i.e., a backpropagation neural network and a modular neural network) relative to three traditional composite models (i.e., a simple average, a constrained mathematical programming model, and an unconstrained mathematical programming model) in the presence of four penalty cost functions for forecasting errors. Specifically, the overall objective of this study is to compare the short-term predictive ability of each of the five composite forecasting techniques on various first-order autoregressive models, taking into account penalty cost functions representing four different situations.
The results of this research suggest that in the vast majority of scenarios examined in this study, the neural network model clearly outperformed the other composite models. / Ph. D.
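Two of the traditional composite models named above can be sketched compactly. Treating the unconstrained mathematical programming model as an ordinary least-squares weighting (an assumption about that model, not a detail stated in the abstract), the combination step looks like this:

```python
import numpy as np

def combine_forecasts(F, y):
    """Unconstrained least-squares combining weights: each column of F is
    one individual forecast series, y is the series of actuals. Returns
    the weight vector minimizing squared combined-forecast error."""
    w, *_ = np.linalg.lstsq(F, y, rcond=None)
    return w

def simple_average(F):
    """The simple-average composite: equal weight on every forecast."""
    return F.mean(axis=1)
```

A neural-network combiner generalizes this by letting the mapping from individual forecasts to the composite be nonlinear, which is what the study's backpropagation and modular networks exploit under asymmetric penalty costs.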
46

Evaluation of a financial distress model for Department of Defense hardware contractors

Collins, Richard B. 10 October 2009 (has links)
This thesis investigates the accuracy of a model that the Department of the Navy uses to predict the financial health of major defense hardware contractors. The inputs to this model are six financial ratios derived from a firm's income statement and balance sheet. The output of this model is a single Z-score that indicates the health of a firm. Depending on the score, a firm's financial standing is classified as healthy, distressed, or uncertain. The model is tested using a database compiled for this thesis that includes financial information for a total of 72 defense and non-defense firms. The test database is unique relative to the underlying model database; it reflects a more recent timeframe and a greater number of firms, none of which were used to develop the model. Model accuracy was computed by measuring how often the model correctly classified a bankrupt firm as distressed and a non-bankrupt firm as healthy. Then, model accuracy was evaluated by comparing these test results (i.e., percent correctly classified) to results published by the model's developers. This comparison produced mixed results. In light of this fact, the thesis concludes that the model should be improved and recommends a course of action. / Master of Arts
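The three-way classification the thesis tests has the shape of a discriminant-score model. The sketch below shows that shape only; the weights and cutoffs are illustrative placeholders, not the Navy model's actual parameters:

```python
def classify_firm(ratios, weights, lower=1.8, upper=3.0):
    """Generic Z-score classifier of the kind the thesis evaluates:
    Z = sum(w_i * ratio_i), with two cutoffs splitting firms into
    distressed / uncertain / healthy bands. `weights`, `lower`, and
    `upper` are hypothetical values for illustration."""
    z = sum(w * r for w, r in zip(weights, ratios))
    if z < lower:
        return "distressed"
    if z > upper:
        return "healthy"
    return "uncertain"
```

Testing such a model amounts to counting how often bankrupt firms land in the "distressed" band and non-bankrupt firms in the "healthy" band, which is exactly the accuracy measure the thesis computes.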
47

Land use forecasting in regional air quality modeling

Song, Ji Hee 28 August 2008 (has links)
Not available / text
48

Land use forecasting in regional air quality modeling

Song, Ji Hee, 1980- 18 August 2011 (has links)
Not available / text
49

Simulated forecasting of yellow perch (Perca flavescens) relative population density for Indiana waters of Lake Michigan : responses to varying harvest and alewife density

Cwalinski, Tim A. January 1996 (has links)
The yellow perch (Perca flavescens) is an important commercial and sport fish in Indiana waters of Lake Michigan. The population is currently managed by temporary restrictions of commercial harvest. A computer simulation model was developed to examine the effects of various constant harvest quotas and alewife densities on yellow perch relative numbers. Model design is based on the SLAM II simulation language incorporating a FORTRAN biological subroutine. The age-structured population model includes measured or predicted biological characteristics of the dynamic pool model. Recruitment is based on a pre-established three-dimensional Ricker stock-recruitment function that includes alewife (Alosa pseudoharengus) species interaction as a constant or stochastic factor. Sex-specific natural mortality rates were established through life history parameter analysis and the von Bertalanffy growth factors. Density-dependent growth is incorporated into each year of a model run and fluctuates with the simultaneous density of fish. Constant levels of commercial harvest ranging from 0 to 700,000 kg were used in 20-year forecasts. Initial conditions for model runs were 1984 and 1994 trawl CPUE levels, when yellow perch were at high and low levels, respectively, according to standardized sampling. Response variables were examined as mean catches over each forecast length and included: age 2 fish, spawning stock (≥ 190 mm), and total catch > age 1. Alewife densities had a tremendous impact on mean catches of the response variables. The highest catches under any forecast period occurred when alewife was considered absent from the system. Catches declined as alewife density was increased as a 20-year constant under each harvest regimen. Catches of spawning-size fish were maintained at the highest levels for all forecast periods when harvest was set to zero. Catches of young fish were moderate with this harvest regimen if initial catch conditions were high, such as in 1984.
Catches of young fish were always higher in the absence of a commercial fishery if initial catch conditions were low such as in 1994. Low to moderate harvest quotas could maintain moderate levels of young fish for the forecast length if initial model conditions were high. However, these quota levels for the 1984-2004 forecast length resulted in lower mean catches of spawning size fish as compared to the no commercial fishery regimen. The best case scenario for all response variables when initial catch conditions were low was under a no commercial harvest regimen. / Department of Biology
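The recruitment core of the model above, a Ricker stock-recruitment function with an alewife interaction term, can be sketched as follows. The functional form of the alewife effect and all parameter values are illustrative assumptions, not the coefficients fitted for Lake Michigan:

```python
import math

def ricker_recruits(spawners, alpha=2.0, beta=1e-3, gamma=0.5, alewife=0.0):
    """Ricker stock-recruitment with a multiplicative alewife term:
    R = alpha * S * exp(-beta * S - gamma * A). Higher alewife density A
    depresses recruitment, mirroring the species interaction the model
    treats as a constant or stochastic factor."""
    return alpha * spawners * math.exp(-beta * spawners - gamma * alewife)
```

Running this function inside a 20-year age-structured loop, with harvest removed each year, reproduces the qualitative behavior reported above: catches fall monotonically as the constant alewife density is raised.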
50

The statistical tests on mean reversion properties in financial markets

Wong, Chun-mei, May., 王春美 January 1994 (has links)
published_or_final_version / Statistics / Master / Master of Philosophy
