31

Next Generation Design of a Frequency Data Recorder Using Field Programmable Gate Arrays

Billian, Bruce — 25 September 2006
The Frequency Disturbance Recorder (FDR) is a specialized data acquisition device designed to monitor frequency fluctuations in the overall power system. The device is designed so that it can be attached to the power system through a standard wall power outlet. These devices then transmit their calculated frequency data over the public internet to a centralized data management and storage server. By distributing a number of these identical systems throughout the three major North American power systems, Virginia Tech has created a Frequency Monitoring Network (FNET), composed of the distributed FDRs together with an Information Management Server (IMS). Since frequency information can be used in many areas of power system analysis, operation and control, there are a great number of end uses for the information provided by the FNET system: the data gives researchers and other users the information needed to make frequency analyses and comparisons for the overall power system. By the end of 2004 the FNET system had become a reality, and a number of FDRs were placed strategically throughout the United States. The purpose of this thesis is to present the elements of a new generation of FDR hardware design. These elements make the design more flexible and reduce reliance on vendor-specific components. Additionally, these enhancements offload most of the computational processing from the embedded system to a commodity PC, avoiding an embedded-only solution that is costly in both development time and money. These goals are accomplished by using a Field Programmable Gate Array (FPGA), a commodity off-the-shelf personal computer, and a new overall system design. / Master of Science
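The core measurement an FDR performs — estimating grid frequency from a sampled voltage waveform — can be illustrated with a simple zero-crossing sketch. This is an illustrative stand-in, not the algorithm described in the thesis (FDRs typically use more sophisticated phasor-based estimators); the function name and sampling parameters below are hypothetical.

```python
import numpy as np

def estimate_frequency(samples, fs):
    """Estimate the dominant frequency of a sampled voltage waveform
    from the average spacing of its rising zero crossings."""
    samples = np.asarray(samples, dtype=float)
    # Rising crossings: sample is negative, next sample is non-negative.
    neg = np.signbit(samples)
    rising = np.flatnonzero(neg[:-1] & ~neg[1:])
    if len(rising) < 2:
        raise ValueError("need at least two rising zero crossings")
    # Linear interpolation refines each crossing to sub-sample accuracy.
    frac = samples[rising] / (samples[rising] - samples[rising + 1])
    times = (rising + frac) / fs
    # Mean period between successive rising crossings -> frequency.
    return 1.0 / np.mean(np.diff(times))

# Example: a 60.02 Hz sine sampled at 1440 Hz for one second.
fs = 1440
t = np.arange(fs) / fs
f_est = estimate_frequency(np.sin(2 * np.pi * 60.02 * t), fs)
```

In an FNET-style deployment, an estimate like `f_est` would be timestamped and sent over the internet to the central server for comparison across recorders.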
32

Využití a výběr monitorovacího systému ve speciální tělesné přípravě AČR / Use and selection of monitoring system in special physical training of Army of the Czech Republic

Raděj, Karel — January 2012
Abstract
Title: Use and selection of monitoring system in special physical training of Army of the Czech Republic
Goals: The goal of this thesis is to give a complete, integrated and comprehensive overview of the functional options and effective utilization of the Monitoring System in selected areas of the special physical training of the Czech Army.
Methods: Study, analysis, synthesis and comparison of published findings by professionals in the fields of geography, geodesy and special physical training were used in this thesis. Secondary research consisted of interviews with experts in the areas and disciplines mentioned.
Results: The main subject of this work is the utilization of the Monitoring System in the special physical training process. Using the MS in selected areas of special physical training provides effective support for the planning, safety and overall evaluation of its individual parts.
Keywords: GNSS (Global Navigation Satellite System), GPS (Global Positioning System), GIS (Geographic Information System), special physical training, load, monitoring system
33

Prise en compte des risques de cyber-attaques dans le domaine de la sécurité des systèmes cyber-physiques : proposition de mécanismes de détection à base de modèles comportementaux / Addressing cyber-attack risks for the security of cyber-physical systems : proposition of detection mechanisms based on behavioural models

Sicard, Franck — 11 October 2018
Les systèmes de contrôle-commande industriels (Industrial Control System, ICS) sont des infrastructures constituées d'un ensemble de calculateurs industriels reliés en réseau et permettant de contrôler un système physique. Ils assurent le pilotage de réseaux électriques (Smart Grid), de systèmes de production, de transports, de santé ou encore de systèmes d'armes. Pensés avant tout pour assurer productivité et respect de la mission dans un environnement non malveillant, les ICS sont, depuis le 21ème siècle, de plus en plus vulnérables aux attaques (Stuxnet, Industroyer, Triton, …), notamment avec l'arrivée de l'industrie 4.0. De nombreuses études ont contribué à sécuriser les ICS avec des approches issues du domaine de la sécurité (cryptographie, IDS, etc.) mais qui ne tiennent pas compte du comportement du système physique et donc des conséquences de l'acte de malveillance en lui-même. Ainsi, une sécurisation se limitant exclusivement à l'analyse des informations qui transitent sur un réseau industriel n'est pas suffisante. Notre approche amène un changement de paradigme dans les mécanismes de détection en y intégrant la modélisation du comportement du système cyber-physique.
Cette thèse propose des mécanismes de détection d'attaques positionnés au plus proche de la physique. Ils analysent les données échangées entre le système de contrôle-commande et le système physique, et filtrent les échanges au travers de modèles déterministes qui représentent le comportement du système physique soumis à des lois de commande. À cet effet, une méthodologie de conception a été proposée dans laquelle l'ensemble des ordres est identifié afin de détecter les attaques brutales. Pour faire face aux autres attaques, en particulier les plus sournoises, comme les attaques par séquences, nous proposons une stratégie de détection complémentaire permettant d'estimer l'occurrence d'une attaque avant que ses conséquences ne soient destructives. Nous avons développé pour cela un concept de distance à un état caractérisé comme critique, auquel nous avons adjoint un second mécanisme, dit de trajectoire dans le temps, permettant de caractériser une intention de nuire. L'approche proposée hybride ainsi deux techniques, l'une orientée sécurité (sonde IDS) et l'autre sûreté (approche filtre), pour proposer une stratégie de détection basée sur quatre mécanismes liés :
• à la détection de contexte : selon l'état courant de l'ICS, un ordre émis par l'API peut être bloqué s'il conduit vers un état critique (attaque brutale) ;
• aux contraintes combinatoires (attaque par séquences) : vérifiées par les concepts de distance et de trajectoire (évolution de la distance) ;
• aux contraintes temporelles (attaque temporelle) : vérifiées par des fenêtres temporelles sur l'apparition d'évènements et des indicateurs surveillant la durée moyenne d'exécution ;
• aux sur-sollicitations : basées sur un indicateur surveillant les commandes envoyées afin de prévenir un vieillissement prématuré (attaque sur les équipements).
L'approche proposée a été appliquée sur différents exemples de simulation et sur une plateforme industrielle réelle où la stratégie de détection a montré son efficacité face à différents profils d'attaquant.
/ Industrial Control Systems (ICSs) are infrastructures composed of networked industrial devices used to control a physical system. They control electrical power grids (Smart Grid), production systems (e.g. chemical and manufacturing industries), transport (e.g. trains, aircraft and autonomous vehicles), health and weapon systems. Designed above all to ensure productivity and mission compliance in a non-malicious environment, ICSs have, since the 21st century, become increasingly vulnerable to attacks (e.g. Stuxnet, Industroyer, Triton), especially with the emergence of Industry 4.0. Several studies have contributed to securing ICSs with approaches from the security field (e.g. cryptography, IDS), but these do not take into account the behavior of the physical system and therefore the consequences of the malicious act itself. Thus, a security approach limited exclusively to analyzing the information exchanged on the industrial network is not sufficient. Our approach creates a paradigm shift in detection mechanisms by integrating a behavioral model of the cyber-physical system.
This thesis proposes attack detection mechanisms located as close as possible to the physical system. They analyze the data exchanged between the control system and the physical system, and filter the exchanges through deterministic models that represent the behavior of the physical system under its control laws. For this purpose, a design methodology is proposed in which the complete set of orders is identified so that brutal attacks can be detected instantly. To deal with other attacks, especially sneakier ones such as sequential attacks, we propose a complementary detection strategy that estimates the occurrence of an attack before its consequences become destructive. To this end, we developed the concept of distance to a state identified as critical, to which we added a second, temporal mechanism called trajectory that characterizes an intention to harm. The proposed approach thus hybridizes a security-oriented technique (an IDS probe) with a safety-oriented one (a filter approach) into a detection strategy based on four mechanisms:
• Context detection: based on the current state of the system, an order sent by the PLC can be blocked by the control filter if it leads to a critical state (brutal attack).
• Combinatorial constraints (sequential attack): verified by the concepts of distance (a risk indicator for the current state) and trajectory (an indicator of the intention to harm, obtained by studying the evolution of the distance over a sequence).
• Temporal constraints (temporal attack): verified by time windows on the occurrence of events and an indicator monitoring the average execution time.
• Over-solicitation monitoring: based on an indicator monitoring the orders sent to the actuators, to prevent premature ageing of the production equipment (attack on the equipment).
The proposed approach has been applied to various simulation examples and to a real industrial platform, where the detection strategy has shown its effectiveness against different attacker profiles.
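The distance and trajectory mechanisms summarized above can be sketched on a toy state machine. This is a minimal illustration under assumed definitions, not the thesis's actual implementation: here, distance is taken as the minimum number of commands that could drive the process into a critical state (BFS on the reversed transition graph), and the trajectory mechanism raises an alert when that distance shrinks monotonically over a sliding window. All names (`DetectionFilter`, `check`, the `s0`…`s4` automaton) are hypothetical.

```python
from collections import deque

def distances_to_critical(transitions, critical):
    """BFS on the reversed transition graph: for each state, the minimum
    number of commands that can drive the system into a critical state."""
    reverse = {}
    for (src, _cmd), dst in transitions.items():
        reverse.setdefault(dst, []).append(src)
    dist = {s: 0 for s in critical}
    queue = deque(critical)
    while queue:
        s = queue.popleft()
        for prev in reverse.get(s, []):
            if prev not in dist:
                dist[prev] = dist[s] + 1
                queue.append(prev)
    return dist

class DetectionFilter:
    def __init__(self, transitions, critical, window=3):
        self.transitions = transitions
        self.dist = distances_to_critical(transitions, critical)
        self.history = deque(maxlen=window)

    def check(self, state, command):
        """Return 'block', 'alert' or 'pass' for a command issued in `state`."""
        nxt = self.transitions.get((state, command))
        if nxt is None or self.dist.get(nxt, float("inf")) == 0:
            return "block"  # unknown move, or a direct jump to a critical state
        self.history.append(self.dist.get(nxt, float("inf")))
        # Trajectory mechanism: a strictly shrinking distance over the recent
        # window suggests a sequence steering the process toward danger.
        h = list(self.history)
        if len(h) == self.history.maxlen and all(a > b for a, b in zip(h, h[1:])):
            return "alert"
        return "pass"

# Toy automaton: a press stepping through positions s0..s4, where s4 is critical.
transitions = {(f"s{i}", "fwd"): f"s{i+1}" for i in range(4)}
transitions.update({(f"s{i+1}", "back"): f"s{i}" for i in range(4)})
f = DetectionFilter(transitions, critical={"s4"}, window=3)
verdicts = [f.check(s, "fwd") for s in ("s0", "s1", "s2", "s3")]
```

On this toy sequence the filter passes the first two steps, alerts on the third (distance has fallen 3 → 2 → 1), and blocks the fourth, which would enter the critical state directly — mirroring the brutal-attack and sequential-attack mechanisms described in the abstract.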
34

EFFICIENT GRID COMPUTING BASED ALGORITHMS FOR POWER SYSTEM DATA ANALYSIS

Mohsin Ali — Unknown Date
The role of electric power systems has grown steadily in both scope and importance, making electricity increasingly recognized as a key to social and economic progress in many developing countries. In a sense, reliable power systems constitute the foundation of all prospering societies. The constant expansion of electric power systems, along with increased energy demand, makes power systems more and more complex. Such complexity results in much uncertainty, which demands comprehensive reliability and security assessment to ensure a reliable energy supply. Power industries in many countries are facing these challenges and are trying to increase their computational capability to handle the ever-increasing data and analytical needs of operations and planning. Moreover, deregulated electricity markets have been in operation in a number of countries since the 1990s. During the deregulation process, vertically integrated power utilities have been reformed into competitive markets, with the initial goals of improving market efficiency, minimizing production costs and reducing electricity prices. Alongside the benefits achieved by deregulation, several new challenges have been observed in the market. Due to fundamental changes to the electric power industry, traditional management and analysis methods cannot deal with these new challenges. Deterministic reliability assessment criteria still exist, but they do not capture the probabilistic nature of power systems, and their worst-case analysis results in excess operating costs. Probabilistic methods, on the other hand, are now widely accepted. The analytical method uses mathematical formulas for reliability evaluation and generates results quickly, but it relies on many restrictive assumptions and is not suitable for large and complex systems. Simulation-based techniques capture much of the uncertainty and simulate the random behavior of the system. 
However, they require considerable computing power, memory and other computing resources. Power engineers have to run thousands of time-domain simulations to determine stability for a set of credible disturbances before dispatching. For example, security analysis is concerned with the steady-state and dynamic response of the power system to various disturbances, and real-time security assessment is highly desirable, especially in the market environment. Therefore, novel analysis methods are required for power system reliability and security in the deregulated environment — methods that can provide comprehensive results, together with high-performance computing (HPC) power to carry out such analysis within a limited time. Further, with deregulation in the power industry, operational control has been distributed among many organizations. The power grid is a complex network involving a range of energy resources, including nuclear, fossil and renewable resources, with many operational levels and layers, including control centers, power plants, and transmission and distribution systems. The energy resources are managed by different organizations in the electricity market, and all these participants (producers, consumers and operators) can affect the operational state of the power grid at any time. Moreover, adequacy analysis is an important task in power system planning and can be regarded as a collaborative task, which demands collaboration among electricity market participants for a reliable energy supply. Grid computing is gaining attention from power engineering experts as an ideal solution to the computational difficulties facing the power industry. Grid computing infrastructure involves the integrated and collaborative use of computers, networks, databases and scientific instruments owned and managed by multiple organizations. 
Grid computing technology offers feasible support for the design and development of grid-computing-based infrastructure for power system reliability and security analysis. It can help build infrastructure that provides a high-performance computing and collaborative environment, and offers an optimal trade-off between cost and efficiency. While power system analysis is a vast topic, only a limited amount of research has so far investigated applications of grid computing in power systems. This thesis investigates probabilistic reliability and security analysis of complex power systems in order to develop new techniques that provide comprehensive results with high efficiency. A review of existing techniques was conducted to determine the computational needs in the area of power systems. The main objective of this research is to propose and develop a general framework of a computing grid and special grid services for probabilistic power system reliability and security assessment in the electricity market. As a result of this research, grid-computing-based techniques are proposed for probabilistic load flow analysis, probabilistic small-signal analysis, probabilistic transient stability analysis and probabilistic contingency analysis. Moreover, a grid-computing-based system is designed and developed for the monitoring and control of distributed generation systems. As part of this research, a detailed review is presented of the possible applications of this technology in other aspects of power systems. It is proposed that these grid-based techniques will provide comprehensive results with great efficiency, and ultimately enhance the existing computing capabilities of power companies in a cost-effective manner. 
As part of this research, a small-scale computing grid is developed, consisting of grid services for probabilistic reliability and security assessment techniques. A significant outcome of this research is improved performance, accuracy, and security of data sharing and collaboration. More importantly, grid-based computing will improve the capability of power system analysis in a deregulated environment, where complex and large amounts of data would otherwise be impossible to analyze without huge investments in computing facilities.
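Monte Carlo reliability assessment is naturally divisible into independent work units, which is what makes it a good fit for a computing grid. The sketch below — a hypothetical illustration, not the framework developed in the thesis — estimates loss-of-load probability (LOLP) for a toy generation system by splitting the trials into independently seeded chunks that could each be dispatched to a separate grid node.

```python
import random

def simulate_chunk(seed, trials, units, demand):
    """Monte Carlo worker: sample random unit outages and count trials in
    which the surviving capacity falls short of demand. Each chunk carries
    its own seed so chunks can run independently on separate grid nodes."""
    rng = random.Random(seed)
    shortfalls = 0
    for _ in range(trials):
        capacity = sum(cap for cap, forced_outage_rate in units
                       if rng.random() > forced_outage_rate)
        if capacity < demand:
            shortfalls += 1
    return shortfalls

# Toy system: (capacity in MW, forced outage rate) per generating unit.
units = [(200, 0.05)] * 4 + [(100, 0.08)] * 3
demand = 650
chunks = [(seed, 20_000) for seed in range(8)]  # 8 independent work units
# On a computing grid each (seed, trials) pair would go to a different node;
# here they are evaluated in-process and the partial counts merged.
total_trials = sum(n for _, n in chunks)
lolp = sum(simulate_chunk(s, n, units, demand) for s, n in chunks) / total_trials
```

Because the chunks share no state, the merge step is a simple sum of partial shortfall counts — the same divide-and-merge pattern extends to probabilistic load flow or contingency screening, where each node evaluates a disjoint batch of sampled scenarios.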
35

EFFICIENT GRID COMPUTING BASED ALGORITHMS FOR POWER SYSTEM DATA ANALYSIS

Mohsin Ali Unknown Date (has links)
The role of electric power systems has grown steadily in both scope and importance over time making electricity increasingly recognized as a key to social and economic progress in many developing countries. In a sense, reliable power systems constitute the foundation of all prospering societies. The constant expansion in electric power systems, along with increased energy demand, requires that power systems become more and more complex. Such complexity results in much uncertainty which demands comprehensive reliability and security assessment to ensure reliable energy supply. Power industries in many countries are facing these challenges and are trying to increase the computational capability to handle the ever-increasing data and analytical needs of operations and planning. Moreover, the deregulated electricity markets have been in operation in a number of countries since the 1990s. During the deregulation process, vertically integrated power utilities have been reformed into competitive markets, with initial goals to improve market efficiency, minimize production costs and reduce the electricity price. Given the benefits that have been achieved by deregulation, several new challenges are also observed in the market. Due to fundamental changes to the electric power industry, traditional management and analysis methods cannot deal with these new challenges. Deterministic reliability assessment criteria still exists but it doesn’t satisfy the probabilistic nature of power systems. In the deterministic approach the worst case analysis results in excess operating costs. On the other hand, probabilistic methods are now widely accepted. The analytical method uses a mathematical formula for reliability evaluation and generates results more quickly but it needs accurate and a lot of assumptions and is not suitable for large and complex systems. Simulation based techniques take care of much uncertainty and simulates the random behavior of the system. 
However, it requires much computing power, memory and other computing resources. Power engineers have to run thousands of times domain simulations to determine the stability for a set of credible disturbances before dispatching. For example, security analysis is associated with the steady state and dynamic response of the power system to various disturbances. It is highly desirable to have real time security assessment, especially in the market environment. Therefore, novel analysis methods are required for power systems reliability and security in the deregulated environment, which can provide comprehensive results, and high performance computing (HPC) power in order to carry out such analysis within a limited time. Further, with the deregulation in power industry, operation control has been distributed among many organizations. The power grid is a complex network involving a range of energy resources including nuclear, fossil and renewable energy resources with many operational levels and layers including control centers, power plants and transmission and distribution systems. The energy resources are managed by different organizations in the electricity market and all these participants (including producers, consumers and operators) can affect the operational state of the power grid at any time. Moreover, adequacy analysis is an important task in power system planning and can be regarded as collaborative tasks, which demands the collaboration among the electricity market participants for reliable energy supply. Grid computing is gaining attention from power engineering experts as an ideal solution to the computational difficulties being faced by the power industry. Grid computing infrastructure involves the integrated and collaborative use of computers, networks, databases and scientific instruments owned and managed by multiple organizations. 
Grid computing technology offers potentially feasible support to the design and development of grid computing based infrastructure for power system reliability and security analysis. It can help in building infrastructure, which can provide a high performance computing and collaborative environment, and offer an optimal solution between cast and efficiency. While power system analysis is a vast topic, only a limited amount of research has been initiated in several places to investigate the applications of grid computing in power systems. This thesis will investigate probabilistic based reliability and security analysis of complex power systems in order to develop new techniques for providing comprehensive result with enormous efficiency. A review of existing techniques was conducted to determine the computational needs in the area of power systems. The main objective of this research is to propose and develop a general framework of computing grid and special grid services for probabilistic power system reliability and security assessment in the electricity market. As a result of this research, grid computing based techniques are proposed for power systems probabilistic load flow analysis, probabilistic small signal analysis, probabilistic transient stability analysis, and probabilistic contingencies analysis. Moreover, a grid computing based system is designed and developed for the monitoring and control of distributed generation systems. As a part of this research, a detailed review is presented about the possible applications of this technology in other aspects of power systems. It is proposed that these grid based techniques will provide comprehensive results that will lead to great efficiency, and ultimately enhance the existing computing capabilities of power companies in a cost-effective manner. 
At a part of this research, a small scale computing grid is developed which will consist of grid services for probabilistic reliability and security assessment techniques. A significant outcome of this research will be the improved performance, accuracy, and security of data sharing and collaboration. More importantly grid based computing will improve the capability of power system analysis in a deregulated environment where complex and large amounts of data would otherwise be impossible to analyze without huge investments in computing facilities.
36

EFFICIENT GRID COMPUTING BASED ALGORITHMS FOR POWER SYSTEM DATA ANALYSIS

Mohsin Ali Unknown Date (has links)
The role of electric power systems has grown steadily in both scope and importance over time making electricity increasingly recognized as a key to social and economic progress in many developing countries. In a sense, reliable power systems constitute the foundation of all prospering societies. The constant expansion in electric power systems, along with increased energy demand, requires that power systems become more and more complex. Such complexity results in much uncertainty which demands comprehensive reliability and security assessment to ensure reliable energy supply. Power industries in many countries are facing these challenges and are trying to increase the computational capability to handle the ever-increasing data and analytical needs of operations and planning. Moreover, the deregulated electricity markets have been in operation in a number of countries since the 1990s. During the deregulation process, vertically integrated power utilities have been reformed into competitive markets, with initial goals to improve market efficiency, minimize production costs and reduce the electricity price. Given the benefits that have been achieved by deregulation, several new challenges are also observed in the market. Due to fundamental changes to the electric power industry, traditional management and analysis methods cannot deal with these new challenges. Deterministic reliability assessment criteria still exists but it doesn’t satisfy the probabilistic nature of power systems. In the deterministic approach the worst case analysis results in excess operating costs. On the other hand, probabilistic methods are now widely accepted. The analytical method uses a mathematical formula for reliability evaluation and generates results more quickly but it needs accurate and a lot of assumptions and is not suitable for large and complex systems. Simulation based techniques take care of much uncertainty and simulates the random behavior of the system. 
However, it requires much computing power, memory and other computing resources. Power engineers have to run thousands of times domain simulations to determine the stability for a set of credible disturbances before dispatching. For example, security analysis is associated with the steady state and dynamic response of the power system to various disturbances. It is highly desirable to have real time security assessment, especially in the market environment. Therefore, novel analysis methods are required for power systems reliability and security in the deregulated environment, which can provide comprehensive results, and high performance computing (HPC) power in order to carry out such analysis within a limited time. Further, with the deregulation in power industry, operation control has been distributed among many organizations. The power grid is a complex network involving a range of energy resources including nuclear, fossil and renewable energy resources with many operational levels and layers including control centers, power plants and transmission and distribution systems. The energy resources are managed by different organizations in the electricity market and all these participants (including producers, consumers and operators) can affect the operational state of the power grid at any time. Moreover, adequacy analysis is an important task in power system planning and can be regarded as collaborative tasks, which demands the collaboration among the electricity market participants for reliable energy supply. Grid computing is gaining attention from power engineering experts as an ideal solution to the computational difficulties being faced by the power industry. Grid computing infrastructure involves the integrated and collaborative use of computers, networks, databases and scientific instruments owned and managed by multiple organizations. 
Grid computing technology offers potentially feasible support to the design and development of grid computing based infrastructure for power system reliability and security analysis. It can help in building infrastructure, which can provide a high performance computing and collaborative environment, and offer an optimal solution between cast and efficiency. While power system analysis is a vast topic, only a limited amount of research has been initiated in several places to investigate the applications of grid computing in power systems. This thesis will investigate probabilistic based reliability and security analysis of complex power systems in order to develop new techniques for providing comprehensive result with enormous efficiency. A review of existing techniques was conducted to determine the computational needs in the area of power systems. The main objective of this research is to propose and develop a general framework of computing grid and special grid services for probabilistic power system reliability and security assessment in the electricity market. As a result of this research, grid computing based techniques are proposed for power systems probabilistic load flow analysis, probabilistic small signal analysis, probabilistic transient stability analysis, and probabilistic contingencies analysis. Moreover, a grid computing based system is designed and developed for the monitoring and control of distributed generation systems. As a part of this research, a detailed review is presented about the possible applications of this technology in other aspects of power systems. It is proposed that these grid based techniques will provide comprehensive results that will lead to great efficiency, and ultimately enhance the existing computing capabilities of power companies in a cost-effective manner. 
At a part of this research, a small scale computing grid is developed which will consist of grid services for probabilistic reliability and security assessment techniques. A significant outcome of this research will be the improved performance, accuracy, and security of data sharing and collaboration. More importantly grid based computing will improve the capability of power system analysis in a deregulated environment where complex and large amounts of data would otherwise be impossible to analyze without huge investments in computing facilities.
37

EFFICIENT GRID COMPUTING BASED ALGORITHMS FOR POWER SYSTEM DATA ANALYSIS

Mohsin Ali Unknown Date (has links)
The role of electric power systems has grown steadily in both scope and importance over time making electricity increasingly recognized as a key to social and economic progress in many developing countries. In a sense, reliable power systems constitute the foundation of all prospering societies. The constant expansion in electric power systems, along with increased energy demand, requires that power systems become more and more complex. Such complexity results in much uncertainty which demands comprehensive reliability and security assessment to ensure reliable energy supply. Power industries in many countries are facing these challenges and are trying to increase the computational capability to handle the ever-increasing data and analytical needs of operations and planning. Moreover, the deregulated electricity markets have been in operation in a number of countries since the 1990s. During the deregulation process, vertically integrated power utilities have been reformed into competitive markets, with initial goals to improve market efficiency, minimize production costs and reduce the electricity price. Given the benefits that have been achieved by deregulation, several new challenges are also observed in the market. Due to fundamental changes to the electric power industry, traditional management and analysis methods cannot deal with these new challenges. Deterministic reliability assessment criteria still exists but it doesn’t satisfy the probabilistic nature of power systems. In the deterministic approach the worst case analysis results in excess operating costs. On the other hand, probabilistic methods are now widely accepted. The analytical method uses a mathematical formula for reliability evaluation and generates results more quickly but it needs accurate and a lot of assumptions and is not suitable for large and complex systems. Simulation based techniques take care of much uncertainty and simulates the random behavior of the system. 
However, it requires much computing power, memory and other computing resources. Power engineers have to run thousands of times domain simulations to determine the stability for a set of credible disturbances before dispatching. For example, security analysis is associated with the steady state and dynamic response of the power system to various disturbances. It is highly desirable to have real time security assessment, especially in the market environment. Therefore, novel analysis methods are required for power systems reliability and security in the deregulated environment, which can provide comprehensive results, and high performance computing (HPC) power in order to carry out such analysis within a limited time. Further, with the deregulation in power industry, operation control has been distributed among many organizations. The power grid is a complex network involving a range of energy resources including nuclear, fossil and renewable energy resources with many operational levels and layers including control centers, power plants and transmission and distribution systems. The energy resources are managed by different organizations in the electricity market and all these participants (including producers, consumers and operators) can affect the operational state of the power grid at any time. Moreover, adequacy analysis is an important task in power system planning and can be regarded as collaborative tasks, which demands the collaboration among the electricity market participants for reliable energy supply. Grid computing is gaining attention from power engineering experts as an ideal solution to the computational difficulties being faced by the power industry. Grid computing infrastructure involves the integrated and collaborative use of computers, networks, databases and scientific instruments owned and managed by multiple organizations. 
Grid computing technology offers potentially feasible support for the design and development of a grid-based infrastructure for power system reliability and security analysis. It can help build an infrastructure that provides a high-performance computing and collaborative environment and offers an optimal trade-off between cost and efficiency. While power system analysis is a vast topic, only a limited amount of research, initiated in a few places, has investigated the applications of grid computing in power systems. This thesis investigates probabilistic reliability and security analysis of complex power systems in order to develop new techniques that provide comprehensive results with high efficiency. A review of existing techniques was conducted to determine the computational needs in the area of power systems. The main objective of this research is to propose and develop a general framework of computing grids and dedicated grid services for probabilistic power system reliability and security assessment in the electricity market. As a result of this research, grid computing based techniques are proposed for probabilistic load flow analysis, probabilistic small-signal analysis, probabilistic transient stability analysis, and probabilistic contingency analysis. Moreover, a grid computing based system is designed and developed for the monitoring and control of distributed generation systems. As part of this research, a detailed review is presented of possible applications of this technology in other areas of power systems. It is proposed that these grid-based techniques will provide comprehensive results with high efficiency, ultimately enhancing the existing computing capabilities of power companies in a cost-effective manner. 
As part of this research, a small-scale computing grid is developed, consisting of grid services for probabilistic reliability and security assessment techniques. A significant outcome of this research is improved performance, accuracy, and security of data sharing and collaboration. More importantly, grid-based computing improves the capability of power system analysis in a deregulated environment, where complex and large amounts of data would otherwise be impossible to analyze without huge investments in computing facilities.
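The decomposition that makes grid computing attractive for probabilistic reliability assessment can be sketched in a few lines: Monte Carlo trials are split into independent batches, each of which could be dispatched to a separate grid node, and only the counts travel back for aggregation. The system data below (ten identical 50 MW units with a 5% forced outage rate serving a 400 MW load) and the batch sizes are invented for illustration, not taken from the thesis.

```python
import random

def lolp_batch(n_trials, seed, n_units=10, unit_mw=50.0,
               outage_rate=0.05, load_mw=400.0):
    """One Monte Carlo batch: count trials where available capacity < load.

    In a computing grid, each batch would run on a separate node; only
    the shortfall count needs to travel back for aggregation.
    """
    rng = random.Random(seed)  # independent seed per batch keeps streams uncorrelated
    shortfalls = 0
    for _ in range(n_trials):
        # Sample each unit's in-service state and sum available capacity.
        available = sum(unit_mw for _ in range(n_units)
                        if rng.random() >= outage_rate)
        if available < load_mw:
            shortfalls += 1
    return shortfalls

# Serial stand-in for dispatching four batches to four grid workers.
batches = [lolp_batch(50_000, seed) for seed in range(4)]
lolp = sum(batches) / (4 * 50_000)  # loss-of-load probability estimate
```

Because the batches share no state, the same aggregation works whether they ran in one process or on many machines; for this toy system the estimate converges to the analytic binomial value of roughly 0.0115.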
38

Vizualizace a on-line kontrola důležitých parametrů blokových a odbočkových transformátorů jaderné elektrárny Dukovany / Visualization and online control of important parameters of block and tap-changing transformers at the Dukovany nuclear power plant

Holeš, David January 2020 (has links)
The thesis focuses on the design of visualization and limit settings for important parameters of the power and own-consumption transformers at the Dukovany nuclear power plant. The first part describes the present technical state of the oil-filled power transformers at this plant, including a description of the currently installed transformer monitoring system and electrical monitoring system. The second part deals with the design of a visualization of the parameters and a diagram design of active-access displays of the monitored parameters of these transformers. The thesis also describes the web interface with the new visualization. The last part of the thesis contains a design for setting the limits and criteria of the important monitored parameters of these transformers.
39

POLYNOMIAL CURVE FITTING INDICES FOR DYNAMIC EVENT DETECTION IN WIDE-AREA MEASUREMENT SYSTEMS

Longbottom, Daniel W. 14 August 2013 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / In a wide-area power system, detecting dynamic events is critical to maintaining system stability. Large events, such as the loss of a generator or a fault on a transmission line, can compromise the stability of the system by causing the generator rotor angles to diverge and lose synchronism with the rest of the system. If these events can be detected as they happen, controls can be applied to the system to prevent it from losing synchronous stability. To detect these events, pattern recognition tools can be applied to system measurements. In this thesis, decision trees (DTs), a pattern recognition tool, were used for event detection. A single DT produced rules distinguishing between the event and no-event cases by learning on a training set of simulations of a power system model. The rules were then applied to test cases to determine the accuracy of the event detection. To use a DT to detect events, the variables used to produce the rules must be chosen. These variables can be direct system measurements, such as the phase angles of bus voltages, or indices created from a combination of system measurements. One index used in this thesis was the integral square bus angle (ISBA) index, which provides a measure of the overall activity of the bus angles in the system. Other indices used were the variance and rate of change of the ISBA. Fitting a polynomial curve to a sliding window of these indices and then taking the difference between the polynomial and the actual index was found to produce a new index that, for most simulations, is non-zero during an event and zero at all other times. After the event detection index was chosen to be the error between the curve and the ISBA indices, a set of power system cases was created to serve as the training data set for the DT. All of these cases contained one event, either a small or a large power injection at a load bus in the system model. 
The DT was then trained to detect the large power injection but not the small one, so that the rules produced would detect large events that could potentially cause the system to lose synchronous stability while ignoring small events that have no effect on the overall system. This DT was then combined with a second DT that predicted instability, such that the second DT decided whether or not to apply controls only for a short time after the end of each event, when controls would be most effective in stabilizing the system.
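The sliding-window residual idea described above can be sketched as follows. The signal, window length, and polynomial degree are invented stand-ins for the ISBA index and the settings used in the thesis; the point is only that a low-degree polynomial tracks a smooth index almost exactly, so the fitting residual spikes when a sudden event enters the window.

```python
import numpy as np

def residual_index(signal, window=21, degree=2):
    """For each sample, fit a polynomial to the trailing window and
    return the absolute fitting residual at the newest sample.

    A slowly varying index is tracked almost exactly by the fit, so the
    residual stays near zero; a sudden event breaks the fit and the
    residual spikes, giving a detection index in the spirit of the
    curve-fitting error described in the abstract.
    """
    t = np.arange(window)
    out = np.zeros(len(signal))
    for k in range(window - 1, len(signal)):
        w = signal[k - window + 1 : k + 1]      # trailing window ending at k
        coeffs = np.polyfit(t, w, degree)       # least-squares polynomial fit
        out[k] = abs(w[-1] - np.polyval(coeffs, t[-1]))
    return out

# Synthetic stand-in for the ISBA index: a slow ramp with a step
# "event" at sample 120.
sig = 0.01 * np.arange(240)
sig[120:] += 1.0
r = residual_index(sig)
```

For this synthetic signal the residual is essentially zero before the event and once the window has passed it, and large while the step is inside the window, matching the non-zero-during-event behavior the abstract describes.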
40

Intelligent Techniques for Monitoring of Integrated Power Systems

Agrawal, Rimjhim January 2013 (has links) (PDF)
Continued increases in system load, leading to a reduction in operating margins, together with the move towards a deregulated grid with renewable energy sources, have increased the vulnerability of the grid to blackouts. Advanced intelligent techniques are therefore required to design new monitoring schemes that enable smart grid operation in a secure and robust manner. As the grid is highly interconnected, monitoring of transmission and distribution systems relies increasingly on digital communication. Conventional security assessment techniques are slow, hampering real-time decision making; hence, there is a need to develop fast and accurate security monitoring techniques. Intelligent techniques capable of processing large amounts of captured data are finding increasing scope as essential enablers of the smart grid. The research work presented in this thesis has evolved from the need for enhanced monitoring in transmission and distribution grids. The potential of intelligent techniques for enhanced system monitoring has been demonstrated for disturbed scenarios in an integrated power system. In transmission grids, one of the challenging problems is network partitioning, also known as network area decomposition. In this thesis, an approach based on relative electrical distance (RED) has been devised to construct zonal dynamic equivalents such that the dynamic characteristics of the original system are retained in the equivalent system within the desired accuracy. Identification of coherent generators is another key aspect of power system dynamics. In this thesis, a support vector clustering-based coherency identification technique is proposed for large interconnected multi-machine power systems. The clustering technique is based on a coherency measure formulated from generator rotor measurements, which can be obtained with the help of Phasor Measurement Units (PMUs). 
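The coherency idea can be illustrated with a much simpler stand-in for the support-vector clustering proposed in the thesis: group generators whose rotor-angle swing curves are strongly correlated. The swing curves, correlation threshold, and greedy grouping below are all invented for illustration.

```python
import numpy as np

def coherent_groups(angles, threshold=0.9):
    """Greedily group generators whose rotor-angle deviations have a
    correlation coefficient above `threshold`.

    `angles` is an (n_gens, n_samples) array of rotor-angle deviations,
    e.g. assembled from PMU measurements. This correlation rule is a
    simple stand-in for a support-vector-clustering coherency measure.
    """
    corr = np.corrcoef(angles)
    n = angles.shape[0]
    groups, assigned = [], set()
    for i in range(n):
        if i in assigned:
            continue
        # Attach every not-yet-assigned generator that swings with i.
        group = {i} | {j for j in range(i + 1, n)
                       if j not in assigned and corr[i, j] > threshold}
        assigned |= group
        groups.append(sorted(group))
    return groups

# Synthetic swings: generators 0-1 share one oscillation mode,
# generators 2-3 share another.
t = np.linspace(0.0, 10.0, 500)
swings = np.array([
    np.sin(2 * np.pi * 0.5 * t),
    1.1 * np.sin(2 * np.pi * 0.5 * t + 0.05),
    np.sin(2 * np.pi * 0.9 * t),
    0.9 * np.sin(2 * np.pi * 0.9 * t + 0.1),
])
groups = coherent_groups(swings)  # [[0, 1], [2, 3]]
```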
In distribution grids, accurate and fast fault identification is a key challenge. Hence, an automated fault diagnosis technique based on multi-class support vector machines (SVMs) has been developed in this thesis. The proposed fault location scheme is capable of accurately identifying the fault type, the faulted line section, and the fault impedance in distributed generation (DG) systems. The approach is based on the three-phase voltage and current measurements available at all the sources, i.e., the substation and the connection points of the DGs. An approach for voltage instability monitoring in three-phase distribution systems has also been proposed in this thesis. The conventional single-phase L-index measure has been extended to a three-phase system to incorporate information pertaining to unbalance in the distribution system. All the approaches proposed in this thesis have been validated on standard IEEE test systems and on practical Indian systems.
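The conventional L-index mentioned above has a standard closed form, L_j = |1 − Σ_i F_ji V_i / V_j| with F = −Y_LL⁻¹ Y_LG, where the bus admittance matrix is partitioned into load (L) and generator (G) blocks; the three-phase extension in the thesis evaluates an index of this kind per phase. A minimal single-phase sketch on an invented two-bus system (the line reactance and bus voltages are made up for illustration):

```python
import numpy as np

def l_index(Y_LL, Y_LG, V_L, V_G):
    """Classic L-index for each load bus: L_j = |1 - sum_i F_ji V_i / V_j|,
    where F = -inv(Y_LL) @ Y_LG comes from partitioning the bus
    admittance matrix into load (L) and generator (G) blocks.
    Values approaching 1 indicate proximity to voltage collapse.
    """
    F = -np.linalg.solve(Y_LL, Y_LG)
    return np.abs(1.0 - (F @ V_G) / V_L)

# Invented two-bus example: one generator, one load, joined by a
# purely reactive line (x = 0.1 pu).
y = 1.0 / 0.1j                               # line admittance
Y_LL = np.array([[y]])                       # load-bus self-admittance
Y_LG = np.array([[-y]])                      # load-generator coupling
V_G = np.array([1.00 + 0.0j])                # generator voltage, pu
V_L = np.array([0.95 * np.exp(-1j * np.deg2rad(5.0))])  # load voltage, pu
L = l_index(Y_LL, Y_LG, V_L, V_G)[0]         # ~0.10, comfortably stable
```

For a three-phase version, the same computation would be repeated per phase (or on a per-phase admittance model), which is how unbalance information enters the extended index.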
