
A two-level Probabilistic Risk Assessment of cascading failures leading to blackout in transmission power systems

Henneaux, Pierre, 19 September 2013
In our society, private and industrial activities increasingly rest on the implicit assumption that electricity is available at any time and at an affordable price. Even though operational data and feedback from the electrical sector are very positive, a residual risk of blackout or undesired load shedding in critical zones remains. Such a situation is likely to entail major direct and indirect economic consequences, as observed in recent blackouts. Assessing this residual risk and identifying the scenarios likely to lead to these feared situations is crucial to controlling and optimally reducing the risk of blackout or major system disturbance. The objective of this PhD thesis is to develop a methodology able to reveal scenarios leading to a blackout or a major system disturbance and to estimate their frequencies and consequences with satisfactory accuracy.

A blackout is a collapse of the electrical grid over a large area, leading to a power cutoff, and is caused by a cascading failure. Such a cascade is composed of two phases: a slow cascade, starting with the occurrence of an initiating event and displaying characteristic times between successive events of minutes to hours, and a fast cascade, with characteristic times between successive events of milliseconds to tens of seconds. In cascading failures there is a strong coupling between events: the loss of an element increases the stress on other elements and, hence, the probability of another failure. Probabilistic methods proposed previously do not treat these dependencies between failures correctly, mainly because the two very different phases are analyzed with the same model. There is thus a need for a conceptually satisfying probabilistic approach, able to take all kinds of dependencies into account by using different models for the slow and the fast cascades.
This is the aim of this PhD thesis.

This work first focuses on level I, the analysis of the slow cascade progression up to the transition to the fast cascade. We propose to adapt dynamic reliability, an integrated Probabilistic Risk Analysis (PRA) approach developed initially for the nuclear sector, to the case of transmission power systems. This methodology accounts for the double interaction between power system dynamics and the state transitions of the grid elements. The thesis then develops level II to analyze the fast cascade, up to the transition towards an operational state with load shedding or a blackout. The proposed method is applied to two test systems. Results show that thermal effects can play an important role in cascading failures during the first phase. They also show that the level-II analysis after level I is necessary to estimate the loss of supplied power that a scenario can lead to: two types of level-I scenarios with similar frequencies can induce very different risks (in terms of loss of supplied power) and blackout frequencies. A level III, i.e. an analysis of the restoration process, is however needed to estimate the risk in terms of loss of supplied energy. The thesis also presents several perspectives for improving the approach so that applications can scale up to real grids. / Doctorate in Engineering Sciences
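The coupling described above, where each failure raises the stress on the surviving elements and hence their failure probability, can be illustrated with a toy Monte Carlo sketch. This is not the thesis's two-level PRA method; the line count, base probability, and stress factor below are invented for illustration only.

```python
import random

def simulate_cascade(n_lines=10, base_p=0.01, stress_factor=0.5,
                     blackout_fraction=0.5, rng=None):
    """One Monte Carlo run of a (highly simplified) cascading failure.

    Each failed line shifts stress onto the survivors, raising their
    failure probability -- the coupling between events described above.
    Returns True if more than `blackout_fraction` of the lines fail.
    """
    rng = rng or random.Random()
    failed = 0
    for _ in range(n_lines):
        # Failure probability grows with the number of prior failures.
        p = min(1.0, base_p * (1.0 + stress_factor * failed))
        if rng.random() < p:
            failed += 1
    return failed > blackout_fraction * n_lines

def blackout_frequency(runs=20000, seed=42, **kw):
    """Estimate the blackout frequency over many seeded runs."""
    rng = random.Random(seed)
    hits = sum(simulate_cascade(rng=rng, **kw) for _ in range(runs))
    return hits / runs
```

A real analysis would of course model the grid physics of each phase separately, which is precisely the point of the two-level approach.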

Reliability in performance-based regulation

Solver, Torbjörn, January 2005
In reregulated and restructured electricity markets, the production and retail of electricity are conducted on competitive markets; transmission and distribution, on the other hand, can be considered natural monopolies. The financial regulation of Distribution System Operators (DSOs) has in many countries, partly as a consequence of the restructuring of ownership, gone through a major switch in regulatory policy: from regimes where the DSOs were allowed to charge their customers according to their actual cost plus some profit, i.e. cost-based regulation, to models in which the DSOs' performance is valued in order to set the allowable revenue, i.e. Performance-Based Regulation (PBR). In regulatory regimes that value performance, the direct link between cost and income is weakened or sometimes removed. This gives the regulated DSOs strong cost-cutting incentives, and there is consequently a risk of system reliability deterioration due to maintenance and investments being postponed to save costs. To balance this risk, the PBR framework is normally complemented with some kind of quality regulation (QR). How the PBR and QR frameworks are constructed determines the incentives that the DSO will act on and will therefore influence the development of system reliability.

This thesis links the areas of distribution system reliability and performance-based regulation. First, the key incentive features within PBR that include the quality of supply are identified using qualitative measures involving analyses of applied regulatory regimes and general regulatory policies. This results in a qualitative comparison of applied PBR models. The qualitative results are then quantified and analysed further using time-sequential Monte Carlo simulations (MCS). The MCS enables detailed analysis of regulatory features, parameter settings and financial risk, and allows the applied PBR frameworks to be compared quantitatively. Finally, some focus has been put on the Swedish regulation and the tool developed for DSO regulation, the Network Performance Assessment Model (NPAM): what obstacles there might be and what consequences it might bring when in effect.
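The time-sequential Monte Carlo simulation mentioned above can be sketched in miniature. The model below is a hypothetical single-feeder illustration, not the thesis's MCS or the NPAM: failure and repair times are drawn from assumed exponential distributions, and the quality-regulation penalty is a stylised stand-in.

```python
import random

def annual_outage_hours(failure_rate=2.0, mean_repair_h=4.0, seed=1):
    """Time-sequential Monte Carlo over one year (8760 h) for one feeder:
    alternate exponentially distributed up-times (rate in failures/year)
    and repair times, accumulating the total outage duration."""
    rng = random.Random(seed)
    t, outage = 0.0, 0.0
    while t < 8760.0:
        t += rng.expovariate(failure_rate / 8760.0)  # time to next failure
        if t >= 8760.0:
            break
        repair = rng.expovariate(1.0 / mean_repair_h)
        outage += min(repair, 8760.0 - t)            # clip at year end
        t += repair
    return outage

def quality_penalty(outage_h, target_h=8.0, rate_per_h=1000.0):
    """A stylised QR scheme: penalise the DSO for outage time above a
    target level (all figures are invented for the example)."""
    return max(0.0, outage_h - target_h) * rate_per_h
```

Running many seeded years of this kind yields a distribution of penalties, which is what makes the financial risk assessment of a regulatory design possible.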

Reliability assessment of electrical power systems using genetic algorithms

Samaan, Nader Amin Aziz, 15 November 2004
The first part of this dissertation presents an innovative method for the assessment of generation system reliability. In this method, a genetic algorithm (GA) is used as a search tool to truncate the probability state space and to track the most probable failure states. The GA stores system states in which there is a generation deficiency with respect to the system maximum load in a state array. The given load pattern is then convolved with the state array to obtain adequacy indices.

In the second part of the dissertation, a GA-based method for state sampling of composite generation-transmission power systems is introduced. A binary-encoded GA is used as a state-sampling tool for the composite power system network states, and a linearized optimization load flow model is used to evaluate the sampled states. The approach has been extended to evaluate adequacy indices of composite power systems while considering chronological load at buses; hourly load is represented by cluster load vectors using the k-means clustering technique. Two approaches have been developed: GA parallel sampling, and GA sampling for the maximum cluster load vector with series state re-evaluation. The GA-based method is used for the assessment of annual frequency and duration indices of the composite system. The conditional-probability-based method is used to calculate the contribution of sampled failure states to the system failure frequency using the transition rates of the different components. The GA-based method is also used to evaluate reliability worth indices of composite power systems. The approach has been generalized to recognize multi-state components, such as generation units with derated states, and it considers common-mode failures of transmission lines.

Finally, a new method for composite system state evaluation using a real-number-encoded GA is developed. The objective of the GA is to minimize load curtailment for each sampled state. Minimization is based on the dc load flow model; system constraints are represented by fuzzy membership functions, and the GA fitness function is a combination of these membership values. The proposed method has the advantage of allowing sophisticated load curtailment strategies, which lead to more realistic load point indices.
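As a rough illustration of the binary-encoded GA state-sampling idea, the sketch below searches a hypothetical six-unit generation system for probable capacity-deficiency states. The unit data, GA parameters, and fitness definition are all invented for the example and are far simpler than the dissertation's models (no transmission network, no load flow).

```python
import random

# Hypothetical 6-unit system: (capacity_MW, unavailability) per unit.
UNITS = [(100, 0.05), (100, 0.05), (80, 0.08),
         (80, 0.08), (60, 0.10), (60, 0.10)]
PEAK_LOAD = 300.0

def state_probability(state):
    """Probability of a binary state (1 = unit up, 0 = unit down)."""
    p = 1.0
    for bit, (_, q) in zip(state, UNITS):
        p *= (1.0 - q) if bit else q
    return p

def fitness(state):
    """Reward the most probable *failure* states: those whose available
    capacity cannot cover the peak load (others score zero)."""
    cap = sum(c for bit, (c, _) in zip(state, UNITS) if bit)
    return state_probability(state) if cap < PEAK_LOAD else 0.0

def ga_search(pop_size=20, generations=40, seed=3):
    """Collect failure states found by the GA -- a truncated state space."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in UNITS] for _ in range(pop_size)]
    found = {}  # state tuple -> probability
    for _ in range(generations):
        for s in pop:
            if fitness(s) > 0.0:
                found[tuple(s)] = state_probability(s)
        nxt = []
        while len(nxt) < pop_size:
            p1 = max(rng.sample(pop, 2), key=fitness)  # tournament selection
            p2 = max(rng.sample(pop, 2), key=fitness)
            cut = rng.randrange(1, len(UNITS))         # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.2:                     # bit-flip mutation
                i = rng.randrange(len(UNITS))
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return found
```

Summing the probabilities in `found` gives a lower bound on the loss-of-load probability, which is the sense in which the GA "truncates" the full state space.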

Optimization of Section Points Locations in Electric Power Distribution Systems : Development of a Method for Improving the Reliability

Johansson, Joakim, January 2015
The power distribution system is the final link in transferring electrical energy to individual customers. It forms a complex technical grid and accounts for the majority of all outages, so improving its reliability is an efficient way to reduce the effects of outages. A common way of improving reliability is to design loop structures containing two connected feeders separated by a section point. The location of the section point decides how the system structure is connected and its level of reliability; by finding the optimal location, improved reliability may be accomplished.

This Master's thesis has developed a method for finding optimized section point locations in a primary distribution system in order to improve its reliability. A case study was conducted in a part of Mälarenergi Elnät's distribution system, with the objective of developing an algorithm in MATLAB able to generate the optimal section points in the area. An analytical technique, together with a Failure Modes and Effects Analysis (FMEA) as a preparatory step, was used to simulate the impact of outages of various components based on historical data and literature reviews. The impact was quantified by calculating the System Average Interruption Duration Index (SAIDI) and the Expected Cost (ECOST), which represent the reliability from a customer and a socio-economic perspective, respectively. An optimization routine based on a greedy algorithm then made an improvement of the reliability possible.

The case study showed a possible improvement of 28% in SAIDI and 41% in ECOST when the locations of the section points are optimized. It also indicated that loop structures serving mostly industry, trade and service sectors may improve ECOST considerably by relocating a section point. The analysis concluded that, given the considerable improvement shown in the case study, a distribution system could benefit greatly from optimizing the locations of its section points. The algorithm created provides a helpful and cost-effective tool for such a process. Applying it to a full-size system was considered possible, but would first require additional improvements to the reliability inputs and the resolution of some fundamental issues, such as rated currents in lines and geographical distances to substations.
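The greedy placement of an open (section) point in a loop can be sketched with a deliberately simplified interruption model. The loop data, the fixed repair time, and the customer-hours cost used below are illustrative stand-ins for the thesis's SAIDI/ECOST calculations, not a reproduction of them.

```python
def interruption_cost(loop, open_at, repair_h=2.0):
    """Customer-hours of interruption per year for one loop, given the
    open (section) point.  `loop` is a list of (failure_rate, customers)
    per line section; sections before `open_at` are fed from end A, the
    rest from end B.  A fault on a section is assumed to interrupt every
    customer between it and the open point on its own side."""
    total = 0.0
    for i, (lam, _) in enumerate(loop):
        if i < open_at:   # fed from A: fault cuts sections i..open_at-1
            affected = sum(c for _, c in loop[i:open_at])
        else:             # fed from B: fault cuts sections open_at..i
            affected = sum(c for _, c in loop[open_at:i + 1])
        total += lam * repair_h * affected
    return total

def greedy_section_points(loops):
    """Greedily pick, loop by loop, the open point that minimises that
    loop's interruption cost (the loops are treated as independent)."""
    return [min(range(len(loop) + 1),
                key=lambda k: interruption_cost(loop, k))
            for loop in loops]
```

For a symmetric four-section loop the routine opens the loop in the middle, as one would expect; asymmetric failure rates or customer counts shift the optimum accordingly.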

Novel Computational Methods for the Reliability Evaluation of Composite Power Systems using Computational Intelligence and High Performance Computing Techniques

Green, Robert C., II, 24 September 2012
No description available.

Real-time Scheduling for Data Stream Management Systems

Lehner, Wolfgang, Schmidt, Sven, Legler, Thomas, Schaller, Daniel, 2 June 2022
Quality-aware management of data streams is gaining importance as the amount of data produced by streams grows continuously. The resources required for data stream processing depend on different factors and are limited by the environment of the data stream management system (DSMS). Thus, with a potentially unbounded amount of stream data and limited processing resources, some of the data stream processing tasks (originating from different users) may not be satisfactorily answered; users should therefore be able to negotiate a certain quality for the execution of their stream processing tasks. After the negotiation process, it is the responsibility of the DSMS to meet the quality constraints by using adequate resource reservation and scheduling techniques. In this paper, we consider different aspects of real-time scheduling for operations within a DSMS. We propose a scheduling concept that enables us to meet certain time-dependent quality-of-service requirements for user-given processing tasks. Furthermore, we describe the implementation of our scheduling concept within a real-time-capable data stream management system and present experimental results.
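One classical way to meet time-dependent quality-of-service bounds of the kind discussed here is earliest-deadline-first (EDF) scheduling. The sketch below is a generic EDF queue, not the paper's scheduler; the task representation and the negotiated latency-bound interface are assumptions made for the example.

```python
import heapq

class EdfScheduler:
    """A minimal earliest-deadline-first (EDF) queue for stream-operator
    tasks, each carrying a negotiated latency bound."""

    def __init__(self):
        self._heap = []
        self._seq = 0  # tie-breaker so tasks never compare directly

    def submit(self, release_time, latency_bound, task):
        """Queue a task; its absolute deadline is release + bound."""
        deadline = release_time + latency_bound
        heapq.heappush(self._heap, (deadline, self._seq, task))
        self._seq += 1

    def next_task(self):
        """Pop the task with the earliest absolute deadline, or None."""
        if not self._heap:
            return None
        _, _, task = heapq.heappop(self._heap)
        return task
```

A production DSMS would combine such an ordering policy with admission control and resource reservation so that accepted tasks remain schedulable, which is the responsibility the paper assigns to the negotiation step.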
