61 |
A load flow and economic dispatch computer code for the Tucson Gas and Electric power system. Baldwin, Richard Francis. January 1979.
No description available.
|
62 |
Analysis and continuous simulation of secure-economic operation of power systems. Fahmideh-Vojdani, A. (Alireza). January 1982.
The present thesis is concerned, for the most part, with the application of continuous optimization to the secure-economic dispatching of thermal power plants.

The general concept of the continuous simulation of the optimum operation of power systems is introduced: the characterization of the solution trajectory of the underlying dispatching model as the loads vary along a forecasted trajectory, or as the system parameters change continuously. An efficient continuation algorithm is developed which characterizes the solution trajectory of a secure-economic dispatch model, given a piecewise linear trajectory of the bus loads parameterized in terms of the system load. The algorithm considers piecewise quadratic generation cost functions, a DC load flow model, and the limits on generation and on power flows in the normal and post-transmission-line-outage states. The solution trajectory is provided in analytic form over the entire loadability range of the system. Applications of the algorithm to systems with up to 118 buses show that it is fast, reliable, and well suited to many applications in power system planning and operation.

The continuation algorithm, as the thesis describes, in essence applies the Incremental Loading procedure to secure-economic dispatching. In this light, it can be viewed as a natural extension of highly successful classical dispatching techniques such as Lambda Dispatching. A reexamination of classical economic dispatching is presented early in the thesis. Highlights of this phase of the study include: generation scheduling with general (i.e., possibly non-convex) generation cost functions; an analytic study of Valve Point Loading based on the general characterization of the valve-loop-based system incremental cost curve; an in-depth study of the system incremental cost in the context of the modified coordination equations; and a stochastic formulation and analysis of the economic dispatching of regulating plants.
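As an aside to this entry: the Lambda Dispatching that the continuation algorithm extends equalises incremental costs across units subject to the demand balance and generation limits. A minimal sketch of the classical lambda iteration for quadratic costs, with invented coefficients rather than data from the thesis, might look like this:

```python
# Minimal sketch of classical lambda iteration for economic dispatch.
# Each unit has cost C(P) = a + b*P + c*P^2, so its incremental cost
# is b + 2*c*P; all coefficients and limits here are invented.

def lambda_dispatch(units, demand, tol=1e-6):
    """Bisect on the system incremental cost (lambda) until the
    clamped unit outputs sum to the demand."""
    def output(lam, b, c, pmin, pmax):
        p = (lam - b) / (2.0 * c)        # point where marginal cost = lambda
        return min(max(p, pmin), pmax)   # respect generation limits

    lo, hi = 0.0, 1000.0
    while hi - lo > tol:
        lam = 0.5 * (lo + hi)
        total = sum(output(lam, *u) for u in units)
        if total < demand:
            lo = lam                     # need a higher incremental cost
        else:
            hi = lam
    return lam, [output(lam, *u) for u in units]

units = [(8.0, 0.004, 50, 300), (9.5, 0.006, 40, 200), (10.0, 0.009, 30, 150)]
lam, p = lambda_dispatch(units, demand=450.0)
print(f"lambda = {lam:.3f} $/MWh, dispatch = {[round(x, 1) for x in p]} MW")
```

Bisection works because the clamped total output is non-decreasing in lambda; the thesis's continuation algorithm goes further and tracks the solution analytically as the load varies.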
|
63 |
Maximizing the Availability of Distributed Software Services. Clutterbuck, Peter. January 2005.
In a commercial Internet environment, the quality of service experienced by a user is critical to competitive advantage and business survivability. The availability and response time of a distributed software service are central components of the overall quality of service provided to users. Traditionally, availability measures the probability that the service will be live and is expressed in terms of failure occurrence and repair or recovery time. Response time is a measure of the time taken from when the service request is made to when service provision occurs for the user. Deteriorating response time is also a valuable indicator of denial of service attacks, which continue to pose a significant threat to service availability. The concept of the service cluster is increasingly being deployed to improve service availability and response time: cluster processor replication increases service availability, while cluster dispatching of service requests across the replicated cluster processors increases service scalability and therefore improves response time.

This thesis commences with a review of the research and current technology in the area of distributed software service availability. The review aims to identify deficiencies within that area and propose critical features that mitigate them. The three critical features proposed relate to user wait time, cluster dispatching, and the trust-based filtering of service requests. The user wait time proposal is that the availability of a distributed service should reflect both the liveness probability and the probabilistic user access time of the service. The cluster dispatching proposal is that dispatching processing overhead is a function of the number of Internet Protocol (IP) datagrams/Transmission Control Protocol (TCP) segments received by the dispatcher for each service request; consequently, the number of IP datagrams/TCP segments should be minimised, ideally so that each incoming service request arrives in a single IP datagram/TCP segment. The trust-based filtering proposal is that the level of trust of each service request should be identified by the service, as this is critical in mitigating distributed denial of service attacks and therefore in maximising the availability of the service.

A conceptual availability model which supports the three critical features within an Internet clustered service environment is then described. The conceptual model proposes an expanded availability definition and then describes the realization of this definition via additional capabilities positioned within the Transport layer of the Internet communication environment. The additional capabilities of this model also facilitate the minimization of the cluster dispatcher processing load and the identification by the cluster dispatcher of request trust level. The model is then implemented within the Linux kernel. The implementation involves the addition of several options to the existing TCP specification and of several functions to the existing Socket API. The implementation is subsequently evaluated in a dispatcher-based clustered service environment.
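A rough illustration of the expanded availability idea (liveness probability combined with a probabilistic access-time requirement): the sketch below multiplies the classical liveness figure by the probability that response time meets a target, under an assumed exponential response-time model. Neither the model nor the numbers come from the thesis.

```python
# Sketch: expanded availability = P(service live) * P(response <= target).
# The exponential response-time model and all figures are assumptions
# made for illustration; they are not taken from the thesis.
import math

def expanded_availability(mtbf_h, mttr_h, mean_resp_s, target_resp_s):
    liveness = mtbf_h / (mtbf_h + mttr_h)                  # classical availability
    timeliness = 1.0 - math.exp(-target_resp_s / mean_resp_s)
    return liveness * timeliness

# MTBF 2000 h, MTTR 1 h, mean response 0.4 s, user tolerates 2 s:
print(f"{expanded_availability(2000.0, 1.0, 0.4, 2.0):.4f}")   # ~0.9928
```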
|
64 |
Reducing the impact of decision complexity in ambulance command and control. Hayes, Jared. January 2008.
The overriding goal of this work was to present information to ambulance command and control (AC2) operators in a manner that complemented the dispatchers' decision-making processes whilst minimising the effects of a number of identified complexities. It was theorised that presenting information in this manner would improve the decision-making performance of the dispatchers. The initial stages of this work involved identifying the strategies that AC2 operators use when making decisions regarding the allocation of ambulances to emergency incidents, and the complexities associated with these decisions. These strategies were identified through the analysis of interviews with AC2 operators using an interview approach called the Critical Decision Method. The subsequent analysis of the interview transcripts using an Emergent Themes Analysis provided a significant number of insights regarding the decision-making processes of the operators and the information required to support these decisions. Of particular significance was the importance of situation awareness in the decision-making process: for example, when dispatchers have a sound understanding of incidents and of additional factors such as the ambulances under their control, the dispatch decision becomes less complicated.
To extend the understanding of the dispatchers' work in the communication centres, a number of factors that could contribute to the complexity of the dispatch task were identified from a further analysis of the interview transcripts. However, it was not possible to establish from this the contribution of these factors to the perceived complexity encountered by the operators. To address this, a questionnaire was circulated requiring dispatchers to rate the contribution of a number of factors to the complexity of the dispatch task and the frequency with which these factors occurred. The results showed that the most prevalent factors related to a number of the cognitive processes that the dispatchers performed to manage the dispatch task, such as determining the resource most likely to arrive at the scene of an emergency incident the quickest. There were also differences between the two centres in regard to which areas of the dispatch process the dispatchers considered most complex.
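As an illustration of that quickest-resource judgement, a toy version is sketched below; the straight-line ETA model and all data are invented, and real dispatch weighs many more factors, as the interviews show.

```python
# Toy version of the quickest-resource judgement. The straight-line
# ETA model and the data are invented for illustration only.

def eta_minutes(resource, incident, speed_kmh=60.0):
    dx = resource["x_km"] - incident["x_km"]
    dy = resource["y_km"] - incident["y_km"]
    return (dx * dx + dy * dy) ** 0.5 / speed_kmh * 60.0

def quickest(resources, incident):
    """Among available ambulances, pick the one with the lowest ETA."""
    free = [r for r in resources if r["status"] == "available"]
    return min(free, key=lambda r: eta_minutes(r, incident))

ambulances = [
    {"id": "A1", "x_km": 2.0, "y_km": 3.0, "status": "available"},
    {"id": "A2", "x_km": 0.5, "y_km": 0.5, "status": "busy"},
    {"id": "A3", "x_km": 1.0, "y_km": 1.0, "status": "available"},
]
incident = {"x_km": 0.0, "y_km": 0.0}
print(quickest(ambulances, incident)["id"])   # -> A3
```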
The final stage of this research was the design of a prototype interface that complemented the decision-making strategies used by the dispatchers and addressed the identified complexities. At this stage the scope of the research was narrowed to focus primarily on the resource assessment and allocation phases of the dispatch process and several of the complexities associated with these. The prototype interface made use of a novel display technology that allows the presentation of information across two overlapping LCD displays, referred to as a Multi Layered Display (MLD). To test the effectiveness of this display, a laboratory experiment was conducted comparing the performance of participants using the MLD with that of participants using a Single Layered Display (SLD) presenting the same information. The results indicated that in almost all cases the participants using the MLD performed better; however, these differences did not prove to be statistically significant.
|
65 |
Some cost implications of electric power factor correction and load management. Visser, Hercules. 13 August 2012.
M.Phil.

ESKOM is presently rated as the fifth-largest utility in the world and generates and distributes electric power to its consumers at the lowest price per kilowatt-hour (kWh). As a utility, ESKOM is the largest supplier of electrical energy in South Africa, currently generating and distributing on demand to approximately 3000 consumers, which represents 92% of the South African market. ESKOM was selected as the utility supplying electrical energy for the purpose of this study. ESKOM's objective is to provide the means and systems by which the consumer can be supplied with electricity in the most cost-effective manner. In order to integrate consumers into these objectives, ESKOM decided in 1994 to change the supply tariff from active power (kW) to apparent power (kVA) for a number of reasons: to establish a structure whereby the utility and the consumer can control the utilisation of the electrical power supplied to the consumer; to manage demand through power factor correction and the implementation of load management systems; and to identify some cost implications of electrical power factor correction and load management. Consumers with kW maximum demand tariff options had little or no financial incentive to improve a low power factor (PF) by reducing their reactive current demand. Switching to kVA maximum demand involves steps to ensure that the reactive component is kept to a minimum and the power factor at a maximum. ESKOM has structured various tariff rates and charges with unique features to accommodate consumers in their demand-side management and load cost requirements, which, when applied, result in an efficient and cost-effective load profile. These tariffs are designed to guide consumers automatically into an efficient way of using electrical power, and to recover both the capital investment and the operating cost within two to three years after installation of power factor correction equipment. ESKOM's concept of time-of-use (TOU) periods for peak, standard and off-peak times during the week, Saturday and Sunday is discussed as load management. Interruptible loads can be scheduled or shed to suit lower tariff rates and to avoid maximum demand charges. Load management will change the operating pattern of the consumer's electricity demand, giving the consumer immediate technical and financial benefits. In the last chapter of this dissertation, a hypothetical case study addresses, and concludes on, some of the technical and cost implications of electrical power factor correction and load management as a successful and profitable means to optimise the electrical power supply to the consumer. By implementing the above, ESKOM ensures that the consumer utilises the electrical power supply at its optimum level and at the lowest cost per kilowatt-hour generated.
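The arithmetic behind the kVA tariff incentive is standard: billed demand is kW divided by power factor, and the capacitor rating needed to lift the power factor follows from the power triangle. A worked sketch with illustrative figures (not ESKOM tariff data):

```python
# Standard power-triangle arithmetic behind the kVA tariff incentive.
# Figures are illustrative, not ESKOM tariff data.
import math

def kva_demand(kw, pf):
    return kw / pf

def capacitor_kvar(kw, pf_old, pf_new):
    """Shunt capacitor rating needed to raise pf_old to pf_new."""
    return kw * (math.tan(math.acos(pf_old)) - math.tan(math.acos(pf_new)))

load_kw = 800.0
print(f"billed at pf 0.80: {kva_demand(load_kw, 0.80):.0f} kVA")             # 1000 kVA
print(f"billed at pf 0.95: {kva_demand(load_kw, 0.95):.0f} kVA")             # ~842 kVA
print(f"correction needed: {capacitor_kvar(load_kw, 0.80, 0.95):.0f} kvar")  # ~337 kvar
```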
|
66 |
Enhanced voltage regulation in lightly-loaded, meshed distribution networks using a phase shifting transformer. Sithole, Frederick Silence. 3 June 2013.
M.Ing. (Electrical and Electronic Engineering)

On long, heavily loaded transmission lines, line losses drive voltages toward the lower limit; on the same long lines at light load, line charging is high and voltages rise toward the upper limit. Maintaining voltages within acceptable limits on relatively long lines is therefore a challenge. This dissertation highlights the problems experienced when a load varying from very low to very high is supplied by very long parallel lines of different impedance characteristics. When the load is extremely high, the resulting low voltages are addressed with shunt capacitors and/or by adding more lines. When the load is extremely low, the resulting high voltages are addressed with shunt reactors and/or by switching some of the lines off. Both kinds of solution can be problematic: new lines require servitudes, which can take very long to acquire, while adding many shunt capacitors and reactors to this type of network is undesirable because these devices carry maintenance implications and require continuous switching to maintain acceptable voltages, complicating the operation of the network. This research proposes the use of a phase shifting transformer located on one of two parallel corridors supplying power to a load located remotely from the rest of the system. The transformer is able to rearrange the active power flows so as to vary the loading of the corridors, and improvements in voltage regulation can be realised during both low and high load conditions.
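A DC load-flow sketch shows the mechanism: with two parallel corridors, flow divides inversely with reactance, and a phase shift angle inserted in one corridor adds a controllable circulating component. The reactances and shift angle below are illustrative values, not the network studied in the dissertation.

```python
# DC load-flow sketch of a phase shifting transformer (PST) in one of
# two parallel corridors. Per-unit, V = 1; reactances and the shift
# angle are illustrative, not the dissertation's network.
import math

def corridor_flows(p_total, x1, x2, alpha_rad):
    """Corridor 1 carries the PST: P1 = (dtheta + alpha)/x1,
    P2 = dtheta/x2, with P1 + P2 = p_total."""
    dtheta = (p_total - alpha_rad / x1) / (1.0 / x1 + 1.0 / x2)
    return (dtheta + alpha_rad) / x1, dtheta / x2

# Without the PST the low-impedance corridor carries most of the load:
print(corridor_flows(1.0, 0.20, 0.50, 0.0))               # (~0.714, ~0.286)
# A small negative shift moves active power off corridor 1:
print(corridor_flows(1.0, 0.20, 0.50, math.radians(-3)))  # (~0.640, ~0.360)
```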
|
67 |
Modelling of different long-term electrical forecasts and its practical applications for transmission network flow studies. Payne, Daniel Frederik. 26 February 2009.
D.Phil.

The prediction of the transmission network loads required for transmission network power flow studies has become very important and much more complex than it was ten to twenty years ago; a single forecast is therefore no longer the answer to the problem. Modelling different long-term electrical forecasts makes it possible to compare a number of forecasts. The modelling provides the further option that each expected load can be entered as a range; the developed balancing algorithm then checks for consensus (feasibility) and, if feasibility exists, reconciles the different forecasts (determines a feasible solution). Factors such as international and national market trends, economic cycles, different weather patterns, climate cycles and demographic changes are studied. The factors that have a significant impact on the transmission electrical loads are integrated into ten different forecasts. This gives more insight into the electricity industry, makes the forecast results more informative, and therefore reduces the uncertainty in the expected future loads.
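The consensus idea can be illustrated in a few lines: each forecast contributes a range for a future load, the ranges agree when their intersection is non-empty, and a reconciled value can then be taken from the overlap. This is only an illustration of the concept, not the balancing algorithm developed in the thesis:

```python
# Illustration of the consensus (feasibility) check on forecast ranges;
# not the balancing algorithm developed in the thesis.

def reconcile(ranges_mw):
    """Return a feasible load if all ranges overlap, else None."""
    lo = max(lo_mw for lo_mw, _ in ranges_mw)
    hi = min(hi_mw for _, hi_mw in ranges_mw)
    if lo > hi:
        return None                 # no consensus among the forecasts
    return 0.5 * (lo + hi)          # one reconciled value from the overlap

forecasts = [(950, 1100), (1000, 1200), (980, 1150)]   # MW ranges
print(reconcile(forecasts))                            # 1050.0
```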
|
68 |
Evidenční počítač stavědla K-2002 / Telematic Shell for K-2002 Signalbox. Kandrik, Ján. January 2011.
Optimization and planning have become an immense part of railway traffic control. This work comprises the creation of a telematic shell for a signalbox, dedicated to collecting, visualizing and evaluating data from the whole interlocking system in support of traffic flow management.
|
69 |
Tvorba logistické koncepce ve vybrané firmě / Creating Logistics Concepts in the Selected Company. Koc, Martin. January 2015.
The diploma thesis focuses on creating a logistics concept for the medical transport service of Hospital Znojmo. It contains a theoretical part describing logistics, analyses the current situation, and presents changes to optimise the medical transport service.
|
70 |
Large-Scale DER Aggregations of Electric Water Heaters and Battery Inverter Systems. Marnell, Kevin. 10 July 2019.
Individually, distributed energy resources such as residential electric water heaters and residential battery inverter systems offer only a small amount of flexibility to the grid. When aggregated, however, these assets can have major effects on the electric grid. Aggregating these resources allows them to take on generator-like functions, with the ability to increment and decrement power.
The Western Energy Imbalance Market offers 15-minute and 5-minute markets for energy transactions between balancing areas, in which generation assets make increment and decrement bids. Traditionally, the only entrants to this market have been large-scale generators and large-scale assets legally designated as generators. Aggregated distributed resources could offer the same increments and decrements by managing residential assets such as electric water heaters and batteries.
DERAS, a Distributed Energy Resource Aggregation System developed by the Portland State Power Lab group, is an aggregator of residential resources that could offer increment and decrement bids to an energy market such as an Energy Imbalance Market. This research models and simulates aggregations of distributed energy resources, analysing the effects of 10,000 electric water heaters and 10,000 battery inverter systems. A simulation program was built to simulate regular use of these assets and then add the effects of a decrement bid into the Western Energy Imbalance Market. The effects of the bids on the energy levels inside the water heaters and batteries are examined, and the power imported from the grid is analysed as the aggregator attempts to cover a generation decrement bid.
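To indicate the flavour of such a simulation (a toy sketch, not DERAS itself), the fragment below advances a fleet of thermostatically controlled water heaters in 5-minute steps and holds them off for half an hour to cover a decrement bid, after which the deferred heating shows up as a rebound in fleet load. The one-node tank model and every parameter are assumptions.

```python
# Toy fleet simulation: thermostatically controlled water heaters held
# off for 30 minutes to cover a decrement bid, then released. The
# one-node tank model and every parameter are assumptions.
import random

RATED_KW, SETPOINT_C, DEADBAND_C = 4.5, 50.0, 2.0

def step(temps, on, shed, loss_c=0.05, gain_c=0.6):
    """Advance each tank one 5-minute step; return fleet power in kW."""
    power = 0.0
    for i, t in enumerate(temps):
        t -= loss_c + (3.0 if random.random() < 0.1 else 0.0)  # standby + draws
        if shed:
            on[i] = False                         # decrement: hold heaters off
        else:                                     # normal thermostat control
            if t < SETPOINT_C - DEADBAND_C:
                on[i] = True
            elif t > SETPOINT_C + DEADBAND_C:
                on[i] = False
        if on[i]:
            t += gain_c
            power += RATED_KW
        temps[i] = t
    return power

n = 10_000
temps = [random.uniform(SETPOINT_C - DEADBAND_C, SETPOINT_C + DEADBAND_C)
         for _ in range(n)]
on = [False] * n
for k in range(24):                               # two hours of 5-minute steps
    shed = 6 <= k < 12                            # 30-minute decrement window
    mw = step(temps, on, shed) / 1000.0
    print(f"step {k:2d} shed={shed!s:5} fleet load = {mw:6.1f} MW")
```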
|