61

Price Based Unit Commitment With Reserve Considerations

Okuslug, Ali 01 January 2013
In the electricity markets of modern electric power systems, many generation companies, as major market participants, aim to maximize their profits by supplying the electrical load in a competitive manner. This thesis investigates the price-based unit commitment problem, which is used to optimize the generation schedules of such companies in deregulated electricity markets. The solution algorithm developed is based on Dynamic Programming and Lagrangian Relaxation methods and solves the optimization problem for a generation company with many generating units of different cost characteristics. Moreover, unit constraints, including ramp-rate limits, minimum ON/OFF times and the generation capacities of individual units, as well as system constraints such as total energy limits and reserve requirements, are taken into account in the problem formulation. The algorithm is verified by comparing the results of some sample cases with those in the literature, and its effectiveness is tested on several test systems. Finally, the possible use of the method by a generation company in the Turkish Electricity Market to develop bidding strategies is examined through case studies.
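
As an illustration of the price-based self-scheduling idea only (not the thesis algorithm, which also handles ramp rates, energy limits and reserve offers), a minimal dynamic program over ON/OFF states for a single unit with a linear cost curve and minimum up/down times might look like the sketch below; all prices, costs and limits are hypothetical.

```python
# Minimal sketch, assuming a single unit with linear cost a + b*p when ON.
# Hypothetical data; real price-based unit commitment adds ramp rates,
# energy limits and reserve considerations.
from math import inf

def schedule_unit(prices, p_min, p_max, a, b, min_up, min_down, start_cost):
    """Maximize total profit sum_t (price_t * p_t - cost(p_t)) by DP over ON/OFF states."""
    states = {("OFF", min_down): 0.0}          # start OFF and free to start up
    for price in prices:
        p = p_max if price >= b else p_min     # economic output for a linear cost curve
        profit = price * p - (a + b * p)       # hourly profit if the unit is ON
        nxt = {}
        def push(state, value):
            if value > nxt.get(state, -inf):
                nxt[state] = value
        for (status, dur), value in states.items():
            if status == "ON":
                push(("ON", min(dur + 1, min_up)), value + profit)     # stay ON
                if dur >= min_up:
                    push(("OFF", 1), value)                            # shut down
            else:
                push(("OFF", min(dur + 1, min_down)), value)           # stay OFF
                if dur >= min_down:
                    push(("ON", 1), value + profit - start_cost)       # start up
        states = nxt
    return max(states.values())

# Hypothetical 6-hour price curve (currency/MWh) and unit data.
print(schedule_unit([30, 55, 80, 75, 40, 25],
                    p_min=50, p_max=200, a=500, b=45,
                    min_up=2, min_down=2, start_cost=800))
```
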
62

Estimating The Neutral Real Interest Rate For Turkey By Using An Unobserved Components Model

Ogunc, Fethi 01 July 2006
In this study, the neutral real interest rate gap and the output gap are estimated jointly under two different multivariate unobserved components models, with the motivation of providing empirical measures that can be used to analyze the amount of stimulus that monetary policy is passing on to the economy and to understand historical macroeconomic developments. In the analyses, the Kalman filter technique is applied to a small-scale macroeconomic model of the Turkish economy to estimate the unobserved variables for the period 1989-2005. Two alternative specifications for the neutral real interest rate are used: the first model assumes a random walk for the neutral real interest rate, whereas the second employs a more structural specification that links the neutral real rate to the trend growth rate and the long-term course of the risk premium. Comparison of the models using various performance criteria clearly favors the more structural specification over the random walk specification. Results suggest that although the uncertainty surrounding the neutral real interest rate estimates is too high to use them directly in the policy-making process, the estimates appear to be very useful for ex-post monetary policy evaluation.
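
As a stripped-down illustration of the unobserved components idea (not the thesis model, which is multivariate and also involves the output gap), a univariate Kalman filter for a random-walk level behind a noisy observed real rate can be written as follows; the variances and data are invented.

```python
# Minimal local-level model: observed = level + noise, level follows a random walk,
# i.e. the "random walk neutral rate" specification in its simplest univariate form.
import numpy as np

def local_level_filter(y, sigma_obs=1.0, sigma_state=0.1):
    """Return filtered estimates of the unobserved level behind the series y."""
    n = len(y)
    level = np.zeros(n)          # filtered state estimate at each date
    P = 1e4                      # diffuse initial state variance
    x = y[0]                     # initialize at the first observation
    for t in range(n):
        P = P + sigma_state**2           # prediction: random walk, variance grows
        K = P / (P + sigma_obs**2)       # Kalman gain
        x = x + K * (y[t] - x)           # update with observation y[t]
        P = (1 - K) * P
        level[t] = x
    return level

# Hypothetical noisy real-rate series.
rng = np.random.default_rng(0)
true_level = np.cumsum(rng.normal(0, 0.1, 80)) + 2.0
observed = true_level + rng.normal(0, 1.0, 80)
print(local_level_filter(observed)[-5:])
```
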
63

Inference Of Piecewise Linear Systems With An Improved Method Employing Jump Detection

Selcuk, Ahmet Melih 01 September 2007
Inference of regulatory relations in dynamical systems is a promising and active research area. Recently, most of the investigations in this field have been stimulated by research in functional genomics. In this thesis, the inferential modeling problem for switching hybrid systems is studied. Hybrid systems are dynamical systems in which discrete and continuous variables regulate each other; in other words, the jumps and flows are interrelated. In this study, piecewise linear approximations are used for modeling purposes, and it is shown that piecewise linear models are capable of approximately displaying the evolutionary characteristics of switching hybrid systems. For such systems, detecting the switching instants and inferring the locally linear parameters from empirical data provide a solid understanding of the system dynamics, so the inference methodology is built around these two issues. The distinguishing feature of the inference algorithm is the idea of transforming the switching detection problem into a jump detection problem through derivative estimation from discrete data. The jump detection problem has been studied extensively in the signal processing literature, so the related techniques have been analyzed carefully and suitable ones adopted in this thesis. The primary advantage of the proposed method is its robustness in switching detection and derivative estimation. The theoretical background of this robustness claim and the importance of robustness for real-world applications are explained in detail.
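
A toy version of the derivative-based switching detection described above (not the thesis algorithm, which relies on more robust estimators) can be sketched as follows; the signal and threshold are made up.

```python
# Estimate the derivative of a sampled signal by finite differences and flag
# switching instants where the estimated slope jumps by more than a threshold.
import numpy as np

def detect_switches(x, t, threshold):
    """Return indices where the estimated derivative changes abruptly."""
    dxdt = np.gradient(x, t)                 # derivative estimate from discrete data
    jumps = np.abs(np.diff(dxdt))            # change in slope between consecutive samples
    return np.where(jumps > threshold)[0] + 1

# Piecewise linear test signal: slope 1, then slope -2, then slope 0.5.
t = np.linspace(0, 3, 301)
x = np.piecewise(t, [t < 1, (t >= 1) & (t < 2), t >= 2],
                 [lambda t: t, lambda t: 1 - 2 * (t - 1), lambda t: -1 + 0.5 * (t - 2)])
print(detect_switches(x, t, threshold=0.5))  # expect indices near t = 1 and t = 2
```
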
64

A Test Oriented Service And Object Model For Software Product Lines

Parlakol, Nazif Bulent 01 May 2010
In this thesis, a new modeling technique is proposed for minimizing regression testing effort in software product lines. The "Product Flow Model" is used for the common representation of products in application engineering, and the "Domain Service and Object Model" represents the variant-based relations between products and core assets. This new approach avoids the unnecessary workload of regression testing by using the principles of sub-service decomposition and variant-based product/sub-service traceability matrices. The proposed model is applied to a sample product line in the banking domain, called the Loyalty and Campaign Management System, where loyalty campaigns for credit cards are the products derived from core assets. The reduced regression test scope after the realization of new requirements is demonstrated through a case study. Finally, the efficiency improvement in test time and effort achieved by adopting the proposed model is discussed.
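
A hedged sketch of how a product/sub-service traceability matrix can narrow the regression scope is shown below; the product and sub-service names are invented and the thesis's actual models are considerably richer.

```python
# Rows: products (campaigns); values: the sub-services each product depends on.
traceability = {
    "MilesCampaign":    {"Enrollment", "PointCalculation", "Reporting"},
    "CashbackCampaign": {"Enrollment", "Redemption"},
    "WelcomeCampaign":  {"Enrollment"},
}

def regression_scope(changed_subservices):
    """Return the products whose test suites must be re-run for this change."""
    return sorted(p for p, deps in traceability.items()
                  if deps & set(changed_subservices))

print(regression_scope(["Redemption"]))    # only CashbackCampaign is affected
print(regression_scope(["Enrollment"]))    # every product depends on it
```
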
65

A Fuzzy Software Prototype For Spatial Phenomena: Case Study Precipitation Distribution

Yanar, Tahsin Alp 01 October 2010
As the complexity of a spatial phenomenon increases, traditional modeling becomes impractical. Alternatively, data-driven modeling, which is based on the analysis of data characterizing the phenomenon, can be used. In this thesis, the generation of understandable and reliable spatial models from observational data is addressed, and an interpretability-oriented data-driven fuzzy modeling approach is proposed. The methodology is based on the construction of fuzzy models from data, tuning, and fuzzy model simplification. Mamdani-type fuzzy models with triangular membership functions are considered. The fuzzy models are constructed using fuzzy clustering algorithms, and the simulated annealing metaheuristic is adapted for the tuning step. To obtain compact and interpretable fuzzy models, a simplification methodology is proposed that reduces the number of fuzzy sets for each variable and simplifies the rule base. Prototype software is developed, and the mean annual precipitation data of Turkey are examined as a case study to assess the results of the approach in terms of both precision and interpretability. For the first step of the approach, in which fuzzy models are constructed from data, the "Fuzzy Clustering and Data Analysis Toolbox" developed for MATLAB is used; for the remaining steps, the tuning of the fuzzy models with the adapted simulated annealing algorithm and their simplification into compact and interpretable models, the developed prototype software is used. If accuracy is the primary objective, the proposed approach can produce more accurate solutions for the training data than the geographically weighted regression method: the minimum training error of the proposed approach is 74.82 mm, whereas the error obtained by geographically weighted regression is 106.78 mm, and the minimum error on the test data is 202.93 mm. An understandable fuzzy model for annual precipitation is generated with only 12 membership functions and 8 fuzzy rules. Furthermore, more interpretable fuzzy models are obtained when the Gath-Geva fuzzy clustering algorithm is used during fuzzy model construction.
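
For illustration only, a minimal Mamdani-type evaluation with triangular membership functions, min/max inference and centroid defuzzification might look like the sketch below; the rules, membership parameters and input variables are hypothetical and are not taken from the thesis.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def mamdani_precip(elevation_m, distance_to_sea_km):
    """Two toy rules: min for AND, max for aggregation, centroid defuzzification."""
    y = np.linspace(0, 2000, 401)                       # candidate precipitation (mm)
    # Rule 1: IF elevation is low AND the coast is near THEN precipitation is high.
    w1 = min(tri(elevation_m, 0, 200, 1200), tri(distance_to_sea_km, 0, 10, 150))
    # Rule 2: IF elevation is high AND the coast is far THEN precipitation is low.
    w2 = min(tri(elevation_m, 800, 2000, 3000), tri(distance_to_sea_km, 100, 400, 800))
    out = np.maximum(np.minimum(w1, tri(y, 800, 1500, 2000)),
                     np.minimum(w2, tri(y, 0, 300, 800)))
    return float(np.sum(y * out) / np.sum(out))         # centroid of the aggregated set

print(round(mamdani_precip(elevation_m=200, distance_to_sea_km=20), 1))
```
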
66

Continuous Time Mean Variance Optimal Portfolios

Sezgin Alp, Ozge 01 September 2011
The most popular and fundamental portfolio optimization problem is Markowitz's one-period mean-variance portfolio selection problem. However, it is criticized for its static, one-period nature, and the estimation of the expected stock return is a particularly hard problem. For these reasons, many studies solve the mean-variance portfolio optimization problem in continuous time. To address the estimation problem for the expected stock return, Black and Litterman proposed the Bayesian asset allocation method in discrete time in 1992. Later, Lindberg introduced a new way of parameterizing the price dynamics in the standard Black-Scholes model and solved the continuous-time mean-variance portfolio optimization problem. In this thesis, we first take up Lindberg's approach, generalize the results to a jump-diffusion market setting and correct the proof of the main result. Further, we demonstrate the implications of the Lindberg parameterization for the stock price drift vector in different market settings, analyze the dependence of the optimal portfolio on jump and diffusion risk, and indicate how to use the method. Secondly, we present the Lagrangian function approach of Korn and Trautmann and derive some new results for this approach, in particular explicit representations of the optimal portfolio process. In addition, we present the L2-projection approach of Schweizer for the continuous-time mean-variance portfolio optimization problem and derive the optimal portfolio and optimal wealth processes for this approach; while deriving these results, the market parameterization of Lindberg is chosen as the underlying model. Lastly, we compare these three optimization frameworks in detail and highlight their attractive and less attractive features with numerical examples.
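
For reference, the classical one-period Markowitz problem that these continuous-time formulations extend can be stated as follows (standard textbook form, not taken from the thesis), with expected-return vector mu, covariance matrix Sigma and portfolio weights w:

```latex
\min_{w \in \mathbb{R}^n} \; w^\top \Sigma\, w
\quad \text{subject to} \quad
w^\top \mu = \mu_{\text{target}}, \qquad w^\top \mathbf{1} = 1 .
```
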
67

Field Oriented Control Of Permanent Magnet Synchronous Motors Using Three-level Neutral-point-clamped Inverter

Mese, Huseyin 01 June 2012
In this thesis, field-oriented control of permanent magnet synchronous motors using a three-level neutral-point-clamped inverter is studied. Permanent magnet synchronous motors are used in high-performance drive applications; in this study, the motor is fed by a three-level neutral-point-clamped inverter. Different space vector modulation algorithms reported in the literature for the three-level neutral-point-clamped inverter are analyzed and compared via computer simulations. The voltage balance on the dc-link capacitors is also analyzed, and a software control method is implemented in conjunction with the space vector modulation scheme utilized. Nonlinear effects such as dead time, semiconductor voltage drops and delays in the gate drive circuitry are also present in the neutral-point-clamped inverter; the effects of these nonlinearities are studied and a compensation method is proposed. The theoretical results are supported with computer simulations and verified with experimental results.
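
As a rough, hedged sketch of one common compensation idea (the thesis does not necessarily use exactly this scheme), the average voltage lost to the dead time can be estimated as (t_dead / t_switch) * V_dc and added back to the reference with the sign of the phase current; the numbers below are illustrative.

```python
def dead_time_compensation(v_ref, i_phase, t_dead, t_switch, v_dc):
    """Return the compensated phase-voltage reference.

    During each switching period the dead time removes roughly
    (t_dead / t_switch) * v_dc volts from the output, with a polarity set by
    the direction of the phase current, so the same amount is added back.
    """
    delta_v = (t_dead / t_switch) * v_dc
    sign = 1.0 if i_phase > 0 else (-1.0 if i_phase < 0 else 0.0)
    return v_ref + sign * delta_v

# Example: 3 us dead time, 100 us switching period (10 kHz), 540 V dc link.
print(dead_time_compensation(v_ref=120.0, i_phase=4.2,
                             t_dead=3e-6, t_switch=100e-6, v_dc=540.0))
```
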
68

The Use Of Wavelet Type Basis Functions In The Mom Analysis Of Microstrip Structures

Cakir, Emre 01 December 2004
The Method of Moments (MoM) has been used extensively to solve electromagnetic problems. Its popularity is largely attributed to its adaptability to structures of various shapes and its success in predicting the equivalent induced currents accurately. However, because its matrix is dense, especially for large structures, the MoM suffers from long matrix solution times and large storage requirements. In this thesis it is shown that the use of wavelet basis functions results in a MoM matrix that is sparser than the one obtained with traditional basis functions. A new wavelet system, different from the ones found in the literature, is proposed. The Stabilized Bi-Conjugate Gradient Method, an iterative matrix solution method, is utilized to solve the resulting sparse matrix equation. Both a one-dimensional problem, a microstrip line example, and a two-dimensional problem, a rectangular patch antenna example, are studied and the results are compared.
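
As an illustration of the solver stage only (a sketch under the assumption that a thresholded, sparse impedance matrix has already been assembled; the random matrix below merely stands in for it), SciPy's stabilized bi-conjugate gradient routine can be applied as follows.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import bicgstab

rng = np.random.default_rng(1)
n = 500
# Stand-in for a thresholded wavelet-basis MoM matrix: random sparse + strong diagonal.
Z = sp.random(n, n, density=0.02, random_state=rng, format="csr") + sp.eye(n) * 10.0
V = rng.standard_normal(n)              # excitation vector

I, info = bicgstab(Z, V)                # info == 0 means the iteration converged
print(info, np.linalg.norm(Z @ I - V))  # residual of the sparse system Z I = V
```
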
69

Game Theoretic Approach To Newsboy Problem: Nash, Stackelberg, Cooperative Games

Ozsoy, Aysu Sultan 01 September 2005
In this thesis, competitive and cooperative newsboy problems for two substitutable products are analyzed using game-theoretic concepts. The demands for the products are assumed to be dependent and normally distributed. Competition is handled through Nash and Stackelberg games, which are compared in terms of order quantities and expected profits. The cooperative newsboy problem is analyzed for products with equal costs and revenues. The effect of demand correlation on the order quantities and expected profits in all of the games is investigated through numerical experiments, and the optimal solutions of the Nash, Stackelberg and cooperative games are examined analytically when the demand correlation is 1.
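
As background only (the two-product games above build on it, but this is not their solution), the classical single-product newsvendor quantity under normally distributed demand follows the critical-fractile rule sketched below; the prices, cost and demand parameters are invented.

```python
from scipy.stats import norm

def newsvendor_quantity(mu, sigma, price, cost, salvage=0.0):
    """Critical-fractile order quantity: F(Q*) = (price - cost) / (price - salvage)."""
    critical_fractile = (price - cost) / (price - salvage)
    return mu + sigma * norm.ppf(critical_fractile)

# Demand ~ N(100, 20^2); sell at 10, buy at 3 -> fractile 0.7, so order above the mean.
print(round(newsvendor_quantity(mu=100, sigma=20, price=10, cost=3), 1))
```
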
70

Efficient Index Structures For Video Databases

Acar, Esra 01 February 2008
Content-based retrieval of multimedia data is still an active research area, and the efficient retrieval of video data has proven to be a difficult task for content-based video retrieval systems. In this thesis study, a Content-Based Video Retrieval (CBVR) system is presented that adapts two different index structures, namely Slim-Tree and BitMatrix, for efficiently retrieving videos based on low-level features such as color, texture, shape and motion. The system represents the low-level features of video data with MPEG-7 descriptors extracted from video shots using the MPEG-7 reference software and stored in a native XML database. The low-level descriptors used in the study are Color Layout (CL), Dominant Color (DC), Edge Histogram (EH), Region Shape (RS) and Motion Activity (MA). An Ordered Weighted Averaging (OWA) operator in Slim-Tree and BitMatrix aggregates these features to find the final similarity between any two objects. The system supports three different types of queries: exact-match queries, k-NN queries and range queries. The experiments in this study cover index construction, index update, query response time and retrieval efficiency, using the ANMRR performance metric and precision/recall scores. The experimental results show that using BitMatrix together with the Ordered Weighted Averaging method is superior for content-based video retrieval systems.
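
A hedged sketch of the aggregation step described above (not the thesis code; the weights and descriptor distances are hypothetical) shows how an OWA operator combines per-descriptor distances into a single score.

```python
def owa(values, weights):
    """OWA operator: sort the inputs in descending order, then take the weighted sum."""
    assert len(values) == len(weights) and abs(sum(weights) - 1.0) < 1e-9
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

# Distances of a candidate shot to the query in four descriptor spaces (hypothetical).
descriptor_distances = [0.30, 0.10, 0.55, 0.20]   # e.g. CL, EH, RS, MA
weights = [0.4, 0.3, 0.2, 0.1]                    # heavier weight on the worst-matching descriptor
print(owa(descriptor_distances, weights))          # aggregated distance in [0, 1]
```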
