  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
941

The Use of Asset Pricing Models and The Forecast of Investment Risk on Financial Distress Firms

Shu, Hung-Chieh 25 August 2005 (has links)
No abstract available.
942

Application of Optimal Approach in Load Forecasting and Unit Commitment Problems

Liao, Gwo-Ching 25 October 2005 (has links)
An integrated Chaos Search Genetic Algorithm/Fuzzy System (CGAFS), Tabu Search (TS), and Neural Fuzzy Network (NFN) method for load forecasting is presented. A Fuzzy Hyper-Rectangular Composite Neural Network (FHRCNN) was used for the initial load forecast. The CGAFS and TS were then used, instead of Back-Propagation (BP), to find the optimal values of the FHRCNN parameters. First the CGAFS generates a set of feasible parameter solutions, which are then passed to the TS. The CGAFS has good global search capability but poor local search capability, whereas the TS has good local search capability; combining the two methods captures the advantages of both and eliminates the drawback of traditional ANN training by BP.

The thesis also presents a hybrid Chaos Search Immune Algorithm (IA)/Genetic Algorithm (GA) and Fuzzy System (FS) method (CIGAFS) for solving short-term thermal unit commitment (UC) problems. The UC problem involves determining the start-up and shutdown schedules of generating units so that the forecasted demand is met at minimum cost. The commitment schedule must also satisfy constraints such as the generating limits of each unit, reserve requirements, and individual unit constraints. The IA and GA were combined, the chaos search and fuzzy system approaches were added, and the resulting hybrid was used to solve the UC problem. Numerical simulations were carried out on four test cases, including power systems with ten, twenty, and thirty thermal units, over a 24-hour period.
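As an illustration only (not code from the thesis), the sketch below shows the kind of hybrid global/local search the abstract describes: a genetic algorithm proposes candidate parameter vectors and a tabu search refines the best one. The objective function, dimensions, and tuning values are placeholders standing in for the FHRCNN training problem.

```python
# Hypothetical sketch (not code from the thesis): a genetic algorithm proposes
# candidate parameter vectors and a tabu search refines the best one.
# The objective is a placeholder for the FHRCNN training error.
import random

def fitness(params):
    # Placeholder objective; in the thesis this would be the load-forecast
    # error of the FHRCNN evaluated with the candidate parameters.
    return sum((p - 0.5) ** 2 for p in params)

def tabu_refine(params, steps=50, tabu_size=10, step=0.05):
    """Local refinement: move to the best non-tabu neighbour at each step."""
    best, best_f = list(params), fitness(params)
    current, tabu = list(params), []
    for _ in range(steps):
        neighbours = []
        for i in range(len(current)):
            cand = list(current)
            cand[i] += random.uniform(-step, step)
            key = (i, round(cand[i], 3))
            if key not in tabu:
                neighbours.append((fitness(cand), cand, key))
        if not neighbours:
            break
        f, cand, key = min(neighbours)
        current = cand
        tabu = (tabu + [key])[-tabu_size:]   # fixed-length tabu list
        if f < best_f:
            best, best_f = cand, f           # keep the overall best solution
    return best, best_f

def hybrid_search(dim=4, pop_size=20, generations=30):
    """Global search with a simple GA, then tabu refinement of the best individual."""
    pop = [[random.random() for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # Truncation selection, midpoint crossover, small Gaussian mutation.
        parents = sorted(pop, key=fitness)[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            children.append([(x + y) / 2 + random.gauss(0, 0.02)
                             for x, y in zip(a, b)])
        pop = parents + children
    return tabu_refine(min(pop, key=fitness))

if __name__ == "__main__":
    best, best_f = hybrid_search()
    print("best parameters:", [round(p, 3) for p in best], "objective:", round(best_f, 6))
```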
943

Essays in monetary policy conduction and its effectiveness: monetary policy rules, probability forecasting, central bank accountability, and the sacrifice ratio

Gabriel, Casillas Olvera 15 November 2004 (has links)
Monetary policy has been credited either with too many positive attributes or, in contrast, with only economy-disturbing features. Central banks must take into account a wide variety of factors to achieve a proper characterization of modern economies for the optimal implementation of monetary policy. Such is the case of central bank accountability and monetary policy effectiveness. The objective of this dissertation is to examine these two concerns, both relevant to the current macroeconomic debate. The analyses are carried out using an innovative set of tools to extract presumably important information from historical data on selected macroeconomic indicators. The dissertation consists of three essays. The first essay explores the causality among the elements of the "celebrated" Taylor rule, using a Structural Vector Autoregression approach on US data. Directed acyclical graph techniques and Bayesian search models are used to identify the contemporaneous causal structure in the construction of impulse-response functions. Further analysis evaluates the implications of performing standard innovation-accounting procedures, derived from a Structural Vector Autoregression on interest rates, inflation, and unemployment, when a causal structure is imposed versus when it is observed in the data. We find that the interest rate causes inflation and unemployment, which suggests that the Fed has not followed a Taylor rule in either of the two periods under study. This result differs significantly from the case in which the causal structure is imposed. The second essay presents an incentive-compatible approach based on proper scoring rules for evaluating density forecasts, aimed at reducing the central banks' accountability problem. Our results indicate that the surveyed forecasters have done a "better" job than the Monetary Policy Committee (MPC). The third essay analyzes the causal structure of the factors presumed to influence the effectiveness of monetary policy, represented by the sacrifice ratio. Directed acyclical graph methods are used to identify the causal flow between these determinants and the sacrifice ratio. We find evidence that, while wage rigidities and central bank independence are the two major determinants of the sacrifice ratio, the degree of openness has no direct effect on it.
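As a hedged illustration of the second essay's central tool (not taken from the dissertation), the sketch below scores discrete density forecasts with the logarithmic scoring rule, which is strictly proper and therefore incentive-compatible: a forecaster maximizes expected score only by reporting honest probabilities. The bins and probabilities are invented.

```python
# Illustrative sketch (not from the dissertation): the logarithmic scoring rule
# applied to a probability forecast over discrete inflation bins. The rule is
# strictly proper, so a forecaster maximizes expected score only by reporting
# honest beliefs. All probabilities and bins below are invented.
import math

def log_score(forecast_probs, realized_bin):
    """Log score of a discrete density forecast; higher (less negative) is better."""
    assert abs(sum(forecast_probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return math.log(forecast_probs[realized_bin])

# Hypothetical example: two forecasters assign probabilities to inflation bins
# (<1%, 1-2%, 2-3%, >3%); inflation turned out to fall in the 2-3% bin.
mpc_forecast    = [0.10, 0.40, 0.35, 0.15]
survey_forecast = [0.05, 0.30, 0.50, 0.15]
print("MPC log score:   ", round(log_score(mpc_forecast, 2), 4))
print("Survey log score:", round(log_score(survey_forecast, 2), 4))
```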
944

Essays on empirical time series modeling with causality and structural change

Kim, Jin Woong 30 October 2006 (has links)
In this dissertation, three related issues in building empirical time series models for financial markets are investigated with respect to contemporaneous causality, dynamics, and structural change. In the first essay, nation-wide industry information transmission among the stock returns of ten sectors of the U.S. economy is examined through a Directed Acyclical Graph (DAG) for contemporaneous causality and a Bernanke decomposition for dynamics. The evidence shows that the information technology sector is the principal root-cause sector. Test results show that the DAG from ex ante forecast innovations is consistent with the DAG from ex post fit innovations, which supports innovation accounting based on DAGs using ex post innovations. In the second essay, the contemporaneous and dynamic behaviors of real estate and stock returns are investigated. Selected macroeconomic variables are included in the model to explain recent movements in both returns. During 1971-2004, there was a single structural break, in October 1980, and a distinct difference in contemporaneous causal structure before and after the break is found. DAG results show that REITs take the role of a causal parent after the break. Innovation accounting shows significantly positive responses of real estate returns to an initial shock in default risk but insignificant responses of stock returns. Also, a shock in short-run interest rates affects real estate returns negatively and significantly but does not affect stock returns. In the third essay, structural change in the volatility of five Asian stock markets and the U.S. stock market is examined during the post-liberalization period (1990-2005) in the Asian financial markets, using the Sup LM test. Four Asian financial markets (Hong Kong, Japan, Korea, and Singapore) experienced structural changes, whereas the test results do not support the existence of a structural change in volatility for Thailand and the U.S. The results also show that the Generalized Autoregressive Conditional Heteroskedasticity (GARCH) persistence coefficient increases while the Autoregressive Conditional Heteroskedasticity (ARCH) impact coefficient, which reflects short-run adjustment, decreases in the Asian markets. In conclusion, when an econometric model is set up, it is necessary to consider contemporaneous causality and possible structural breaks. The dissertation emphasizes causal inference and structural consistency in econometric modeling and highlights their importance in discovering contemporaneous and dynamic causal relationships among variables. These characteristics will likely be helpful in generating accurate forecasts.
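For readers unfamiliar with the terminology in the third essay, the sketch below (illustrative only, with made-up parameter values) shows a GARCH(1,1) variance recursion, where alpha is the ARCH coefficient governing the short-run impact of shocks and beta is the GARCH coefficient governing persistence.

```python
# Illustrative GARCH(1,1) recursion with made-up parameter values: alpha is the
# ARCH coefficient (short-run impact of shocks) and beta is the GARCH
# coefficient (volatility persistence) discussed in the third essay.
def garch_variances(returns, omega=1e-5, alpha=0.08, beta=0.90):
    """Return the conditional variance h_t for each period of a return series."""
    h = [omega / (1.0 - alpha - beta)]      # start at the unconditional variance
    for r in returns[:-1]:
        h.append(omega + alpha * r ** 2 + beta * h[-1])
    return h

returns = [0.010, -0.020, 0.015, -0.030, 0.005]
for t, (r, ht) in enumerate(zip(returns, garch_variances(returns))):
    print(f"t={t}  return={r:+.3f}  conditional variance={ht:.6f}")
```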
945

Forecasting project progress and early warning of project overruns with probabilistic methods

Kim, Byung Cheol 10 October 2008 (has links)
Forecasting is a critical component of project management. Project managers must be able to make reliable predictions about the final duration and cost of projects starting from project inception. Such predictions need to be revised and compared with the project's objectives to obtain early warnings against potential problems. Therefore, the effectiveness of project controls relies on the capability of project managers to make reliable forecasts in a timely manner. This dissertation focuses on forecasting project schedule progress with probabilistic methods. Currently available methods, for example the critical path method (CPM) and earned value management (EVM), are deterministic and fail to account for the inherent uncertainty in forecasting and project performance. The objective of this dissertation is to improve the predictive capabilities of project managers by developing probabilistic forecasting methods that integrate all relevant information and uncertainties into consistent forecasts through a mathematically sound procedure usable in practice. Two probabilistic methods, the Kalman filter forecasting method (KFFM) and the Bayesian adaptive forecasting method (BAFM), were developed. The KFFM and the BAFM have the following advantages over conventional methods: (1) they are probabilistic methods that provide uncertainty bounds on their predictions; (2) they are integrative methods that make better use of the prior performance information available from standard construction management practices and theories; and (3) they provide a systematic way of incorporating measurement errors into forecasting. The accuracy and early warning capacity of the KFFM and the BAFM were also evaluated and compared against the CPM and a state-of-the-art EVM schedule forecasting method. The major conclusions of this research are: (1) the state-of-the-art EVM schedule forecasting method can be used to obtain reliable warnings only after project performance has stabilized; (2) the CPM is not capable of providing early warnings due to its retrospective nature; (3) the KFFM and the BAFM can and should be used to forecast progress and obtain reliable early warnings on all projects; and (4) the early warning capacity of forecasting methods should be evaluated and compared in terms of the timeliness and reliability of warnings within formal early warning systems.
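The following is a minimal sketch, not the dissertation's KFFM, of how a Kalman filter can track noisy progress reports with a constant-rate model and produce a completion forecast with an explicit uncertainty bound. The state definition, noise levels, and progress data are assumptions made for illustration.

```python
# Hedged sketch, not the dissertation's KFFM: a two-state Kalman filter tracks
# cumulative project progress (state = [percent complete, percent per period])
# from noisy progress reports and extrapolates time-to-completion with an
# uncertainty bound. All numbers are illustrative assumptions.
import math

def kalman_progress(measurements, dt=1.0, q=0.01, r=4.0):
    x = [0.0, 5.0]                                 # initial state guess
    P = [[100.0, 0.0], [0.0, 25.0]]                # initial state covariance
    for z in measurements:
        # Predict with a constant-rate model: progress += rate * dt.
        x = [x[0] + dt * x[1], x[1]]
        P = [[P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1],
              P[1][1] + q]]
        # Update with the measured percent complete z (observation H = [1, 0]).
        s = P[0][0] + r
        k = [P[0][0] / s, P[1][0] / s]
        y = z - x[0]
        x = [x[0] + k[0] * y, x[1] + k[1] * y]
        P = [[(1 - k[0]) * P[0][0], (1 - k[0]) * P[0][1]],
             [P[1][0] - k[1] * P[0][0], P[1][1] - k[1] * P[0][1]]]
    return x, P

progress_reports = [4.0, 9.5, 13.0, 18.5, 24.0]    # % complete at periods 1..5
x, P = kalman_progress(progress_reports)
periods_left = (100.0 - x[0]) / x[1]
print(f"estimated rate {x[1]:.2f}%/period, about {periods_left:.1f} periods remaining")
print(f"progress estimate {x[0]:.1f}% +/- {1.96 * math.sqrt(P[0][0]):.1f}% (95% bound)")
```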
946

Comparative analyses of the January 2004 cold air outbreak

Hornberger, Kelli Lynne 21 May 2010 (has links)
Cold air outbreaks (CAOs) occur when large-scale atmospheric circulations allow the incursion of polar air masses into middle and lower latitudes, influencing wintertime temperatures regionally. The January 2004 CAO is identified as a major CAO in the Deep South of the United States, whether judged by wind chill equivalent temperature or by a temperature-only criterion. Surface air temperature, horizontal winds, specific humidity, and Ertel potential vorticity are analyzed for this event using several reanalysis products: the National Aeronautics and Space Administration Modern-Era Retrospective analysis for Research and Applications (MERRA), the National Centers for Environmental Prediction-National Center for Atmospheric Research (NCEP-NCAR) reanalysis, and the National Centers for Environmental Prediction North American Regional Reanalysis (NARR). We perform an intercomparison of the reanalysis products and parallel surface station observations during the synoptic evolution of the leading cold front associated with CAO onset. The key synoptic, mesoscale, and dynamical features associated with onset are studied to determine the relative accuracy of the respective reanalysis products in representing them. The comparative evaluation reveals pronounced temperature and moisture biases in the NCEP-NCAR reanalysis that limit its utility in portraying the synoptic features characteristic of CAO onset. Conversely, both MERRA and NARR accurately represent the detailed thermodynamic and moisture structural evolution associated with CAO onset, indicating their utility in future observationally based studies of CAO events. Ertel potential vorticity analyses indicate that the onset of the 2004 CAO is strongly linked to an incipient tropopause fold that developed over the Great Lakes region.
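For reference, the Ertel potential vorticity referred to in the abstract follows the standard textbook definition (the expression below is not reproduced from the thesis):

```latex
% Ertel potential vorticity q: \rho is density, \mathbf{u} the wind velocity,
% \boldsymbol{\Omega} the Earth's rotation vector, and \theta the potential
% temperature. Anomalously high upper-tropospheric q marks stratospheric air
% drawn down in a tropopause fold.
q = \frac{1}{\rho}\left(\nabla \times \mathbf{u} + 2\boldsymbol{\Omega}\right)\cdot \nabla\theta
```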
947

Evaluation of electronic commerce forecasts and identification of problems affecting their evaluation

Knutsson, Mats January 1999 (has links)
Businesses use forecasts to gather information about phenomena that are important to them. Since electronic commerce has grown in importance for businesses, forecasts concerning this area are becoming increasingly important. The work presented in this report aims at gathering information useful for improving forecast quality. In practice, the report presents a collection of forecasts concerning business-to-consumer electronic commerce and a collection of factors affecting electronic commerce forecast outcomes. The collected forecasts are categorised and evaluated by comparing the forecasts in each category with the actual outcomes. Problems that occur during the evaluation process, such as problems with forecast wording and scope, are described, and suggestions for avoiding them are provided. Structured methods for categorising and evaluating the forecasts are also presented. Finally, the outcome of the evaluation is analysed using the compiled factors, and indications are given of how to use the results to improve future forecasting.
948

Three essays on the prediction and identification of currency crises

Kennedy, Pauline. January 2003 (has links)
Thesis (Ph.D.), University of California, San Diego, 2003. Includes vita and bibliographical references (leaves 106-110).
949

Comprehensibility, overfitting and co-evolution in genetic programming for technical trading rules

Seshadri, Mukund. January 2003 (has links)
Thesis (M.S.), Worcester Polytechnic Institute. Keywords: comprehensibility; technical analysis; genetic programming; overfitting; cooperative coevolution. Includes bibliographical references (p. 82-87).
950

Modeling and forecast of Brazilian reservoir inflows via dynamic linear models under climate change scenarios

Lima, Luana Medeiros Marangon 06 February 2012 (has links)
The hydrothermal scheduling problem aims to determine an operation strategy that produces generation targets for each power plant at each stage of the planning horizon. The strategy minimizes the expected operation cost over the planning horizon, composed of the fuel costs of the thermal plants plus penalties for failures in load supply. The system state at each stage depends heavily on the water inflow at each hydropower reservoir. This work focuses on developing a probabilistic model for the inflows that is suitable for a multistage stochastic algorithm that solves the hydrothermal scheduling problem. The probabilistic model governing the inflows is based on a dynamic linear model. Because of the cyclical behavior of the inflows, the model incorporates seasonal and regression components. We also incorporate climate variables, such as precipitation, El Niño, and other ocean indices, as predictive variables when relevant. The model is tested on the Brazilian power generation system, with about 140 hydro plants that are responsible for more than 80% of the electricity generated in the country. These plants are first grouped by basin and classified into 15 groups; each group has a different probabilistic model that describes its seasonality and specific characteristics. The inflow forecast derived from the probabilistic model at each stage of the planning horizon is a continuous distribution rather than a single point forecast. We describe an algorithm that forms a finite scenario tree by sampling from the inflow forecasting distribution with interstage dependency; that is, the inflow realization at a given stage depends on the inflow realizations of previous stages.
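As a rough illustration of the scenario-tree construction described above (not the thesis model itself), the sketch below samples a finite inflow tree with interstage dependency: each stage's inflow is drawn conditional on its parent node via a mean-reverting log-normal step around a seasonal mean. The seasonal means, AR coefficient, and noise level are made-up numbers.

```python
# Rough illustration (not the thesis model): sampling a finite scenario tree
# with interstage dependency, where each stage's inflow is drawn conditional
# on its parent node via a mean-reverting log-normal step around a seasonal
# mean. Seasonal means, the AR coefficient phi, and sigma are made-up numbers.
import math
import random

SEASONAL_MEAN = [900, 850, 700, 500, 400, 350,     # illustrative monthly mean
                 300, 320, 450, 600, 750, 880]     # inflows (m^3/s)

def sample_child(parent_inflow, month, phi=0.6, sigma=0.15):
    """Draw one child inflow conditional on the parent node's inflow."""
    mu = SEASONAL_MEAN[month % 12]
    log_mean = (1 - phi) * math.log(mu) + phi * math.log(parent_inflow)
    return math.exp(random.gauss(log_mean, sigma))

def build_tree(root_inflow, start_month, stages=3, branches=3):
    """Return a nested dict: each node holds its inflow and its child nodes."""
    def grow(inflow, month, depth):
        node = {"inflow": inflow, "children": []}
        if depth < stages:
            node["children"] = [grow(sample_child(inflow, month + 1), month + 1, depth + 1)
                                for _ in range(branches)]
        return node
    return grow(root_inflow, start_month, 0)

tree = build_tree(root_inflow=820.0, start_month=0)
print("stage-1 inflow scenarios:",
      [round(child["inflow"], 1) for child in tree["children"]])
```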
