About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Estimation of frequency control performance using probability distribution of load change

Wickramasinghe, Thusitha 09 July 2010 (has links)
In North American utilities, control area performance of interconnected power systems is assessed against the reliability standards imposed by the North American Electric Reliability Corporation (NERC). The NERC standards on control area performance define two indices, Control Performance Standards 1 and 2 (CPS1 and CPS2), to evaluate control area performance in normal interconnected power system operation. Of the two indices, CPS1 evaluates the performance of a control area with respect to control of interconnection frequency and tie-line power flows. This thesis proposes a novel method to approximately estimate CPS1 for a two-area power system using the probability distribution of load change. The proposed method is validated against time domain simulation using a simple two-area test system; the validation shows that the proposed method can forecast CPS1 to within 5% accuracy. The forecasted CPS1 value can then be used by a control area to design its future control strategies so as to comply with the NERC criteria at minimum cost. These control actions include, but are not limited to, tuning governors, reducing non-conforming loads, and ensuring adequate operating and spinning reserves.
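As background to the abstract above, the sketch below shows one common way the NERC CPS1 index is computed from one-minute averages of area control error (ACE) and frequency deviation. The function name, the synthetic data, and the bias and epsilon-1 values are illustrative assumptions, not material from the thesis.

```python
import numpy as np

def cps1_percent(ace_mw, delta_f_hz, bias_mw_per_0_1hz, eps1_hz):
    """Illustrative CPS1 from one-minute averages (names and units assumed).

    ace_mw            : one-minute average Area Control Error samples [MW]
    delta_f_hz        : matching one-minute frequency deviation samples [Hz]
    bias_mw_per_0_1hz : frequency bias B, negative by convention [MW / 0.1 Hz]
    eps1_hz           : interconnection frequency target bound epsilon-1 [Hz]
    """
    ace = np.asarray(ace_mw, dtype=float)
    df = np.asarray(delta_f_hz, dtype=float)
    # Compliance factor: mean of (ACE / (-10 B)) * delta_f, normalized by eps1^2
    cf = np.mean((ace / (-10.0 * bias_mw_per_0_1hz)) * df) / eps1_hz ** 2
    return (2.0 - cf) * 100.0   # CPS1 in percent; >= 100 % is compliant

# Example with synthetic, assumed data: one day of 1-minute samples.
rng = np.random.default_rng(0)
ace = rng.normal(0.0, 20.0, 1440)       # MW
df = rng.normal(0.0, 0.01, 1440)        # Hz
print(f"CPS1 = {cps1_percent(ace, df, bias_mw_per_0_1hz=-50.0, eps1_hz=0.018):.1f} %")
```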
2

Instrumentação biomecânica aplicada à análise do desempenho do chute em jogadores de futebol de campo [Biomechanical instrumentation applied to the analysis of kick performance in soccer players]

Silva, Marcelo Guimarães [UNESP] 06 February 2012 (has links) (PDF)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) / The kick in soccer is a technical gesture that has been studied extensively because of its importance within a match, yet important gaps remain, particularly in its quantitative analysis. This study aimed to develop an effective system for analyzing the technical factors involved in the motor performance of soccer players. To that end, biomechanical parameters were evaluated during the instep kick in youth-category athletes, in order to improve motor performance and correct faults not detected by qualitative analysis. A dedicated measurement system was developed, with the analysis of the kick at goal as the central point of the research. Two-dimensional (2D) kinematics was used to analyze the angle formed by the knee of the support leg; the ground reaction force, specifically the vertical force (Fv), was measured; the center of pressure (COP) was obtained from the balance and displacement of the support foot in the decisive phases of the kick; and an accelerometer captured the acceleration and deceleration produced in the phases of the kicking movement. The sample consisted of four athletes of Resende Futebol Clube aged between 16 and 19 years, two from the junior category and two from the youth category. The test consisted of instep kicks aimed at a target positioned nine meters from the volunteer; the kick was required to be functional while still being executed with power. The results were consistent between volunteers for Fv and acceleration, and the ball speeds agreed with values reported in the literature. The COP, however, showed an individual pattern between volunteers while following the same trend within each volunteer.
3

Instrumentação biomecânica aplicada à análise do desempenho do chute em jogadores de futebol de campo [Biomechanical instrumentation applied to the analysis of kick performance in soccer players]

Silva, Marcelo Guimarães. January 2012 (has links)
Advisor: Tamotsu Hirata / Co-advisor: Henrique Martins Rocha / Committee member: Mauro Pedro Peres / Committee member: Luiz Heleno Moreira Duque / Master's
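As a companion to the study described in the two records above, the sketch below shows how the two force-plate quantities it relies on are typically obtained: the vertical ground reaction force (Fv) is read from the vertical force channel, and the center of pressure (COP) follows from the plate moments. The sign convention, channel layout, sampling, and normalization are assumptions for illustration, not details taken from the dissertation.

```python
import numpy as np

def cop_and_fv(fz, mx, my, body_weight_n, fz_threshold=10.0):
    """Center of pressure and normalized vertical GRF from force-plate channels.

    Assumes the plate's coordinate origin lies in its top surface, so
    COPx = -My / Fz and COPy = Mx / Fz (a common convention, assumed here).
    Samples with Fz below the threshold (foot not loaded) are masked out.
    """
    fz = np.asarray(fz, dtype=float)
    loaded = fz > fz_threshold
    cop_x = np.where(loaded, -np.asarray(my, dtype=float) / fz, np.nan)
    cop_y = np.where(loaded, np.asarray(mx, dtype=float) / fz, np.nan)
    fv_bw = fz / body_weight_n          # vertical GRF expressed in body weights
    return cop_x, cop_y, fv_bw

# Illustrative support-phase signals sampled at 1 kHz (synthetic data).
t = np.linspace(0.0, 0.4, 400)
fz = 700.0 + 900.0 * np.sin(np.pi * t / 0.4)     # N, single loading bump
mx = 15.0 * np.sin(2 * np.pi * t)                # N*m
my = -10.0 * np.cos(2 * np.pi * t)               # N*m
cop_x, cop_y, fv = cop_and_fv(fz, mx, my, body_weight_n=700.0)
print(f"peak Fv = {np.nanmax(fv):.2f} BW, COP x-range = "
      f"{np.nanmax(cop_x) - np.nanmin(cop_x):.3f} m")
```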
4

Control performance assessment of run-to-run control system used in high-mix semiconductor manufacturing

Jiang, Xiaojing 04 October 2012 (has links)
Control performance assessment (CPA) is an important tool for realizing high performance control systems in manufacturing plants. CPA of both continuous and batch processes has attracted much attention from researchers, but few results have previously been reported for semiconductor processes. This work provides methods for performance assessment and diagnosis of the run-to-run control systems used in high-mix semiconductor manufacturing. First, the source of the output error of processes with a run-to-run EWMA controller is analyzed, and a CPA method (CPA I) is proposed based on closed-loop parameter estimation. In CPA I, ARMAX regression is applied directly to the process output error, and the performance index is defined based on the variance of the regression results. The influence of plant model mismatch in the process gain and the disturbance model parameter on control performance, with and without set point changes, is studied, and the CPA I method is applied to diagnose plant model mismatch in the case with set point changes. Second, an advanced CPA method (CPA II) is developed to assess control performance degradation in the case without set point changes. An estimated disturbance is generated by a filter, and the ARMAX regression method is applied to the estimated disturbance to assess the control performance. The influence of plant model mismatch, improper controller tuning, metrology delay, and high-mix process parameters is studied, and the results show that the CPA II method can quickly identify, diagnose and correct control performance degradation. The CPA II method is applied to industrial data from a high-mix photolithography process at Texas Instruments, and the influence of metrology delay and plant model mismatch is discussed. A control performance optimization (CPO) method based on analysis of the estimated disturbance is proposed, and an optimal EWMA controller tuning factor is suggested. Finally, the CPA II method is applied to a non-threaded run-to-run controller developed based on state estimation and Kalman filtering. Overall process control performance and state estimation behavior are assessed, and the influence of plant model mismatch and improper selection of different controller variables is studied.
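For readers unfamiliar with the setting described above, the sketch below simulates a basic single-product EWMA run-to-run loop and compares the output-error spread for a matched model and for an underestimated process gain. It is a toy illustration only, not the thesis's ARMAX-based CPA I/II procedure; the process model, noise levels, and the variance comparison are assumptions made for the example.

```python
import numpy as np

def simulate_ewma_r2r(n_runs, gain_true, gain_model, lam, target=0.0,
                      drift_std=0.1, noise_std=1.0, seed=0):
    """Toy single-product EWMA run-to-run loop (illustrative only).

    Process:  y_k = gain_true * u_k + w_k + v_k, with w_k a random-walk drift
    Observer: b_k = lam*(y_k - gain_model*u_k) + (1 - lam)*b_{k-1}
    Recipe:   u_{k+1} = (target - b_k) / gain_model
    """
    rng = np.random.default_rng(seed)
    b = u = w = 0.0
    err = np.empty(n_runs)
    for k in range(n_runs):
        w += rng.normal(0.0, drift_std)           # slowly drifting disturbance
        y = gain_true * u + w + rng.normal(0.0, noise_std)
        err[k] = y - target                       # output error used for CPA
        b = lam * (y - gain_model * u) + (1.0 - lam) * b
        u = (target - b) / gain_model
    return err

# Gain mismatch rescales the effective EWMA weight to lam * gain_true / gain_model,
# so an underestimated gain makes the controller more aggressive and noisier.
for label, g_model in [("matched gain", 1.0), ("gain underestimated 2x", 0.5)]:
    e = simulate_ewma_r2r(5000, gain_true=1.0, gain_model=g_model, lam=0.3)
    print(f"{label}: output-error std = {e.std():.2f}")
```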
5

Ensuring Serializable Executions with Snapshot Isolation DBMS

Alomari, Mohammad January 2009 (has links)
Doctor of Philosophy (PhD) / Snapshot Isolation (SI) is a multiversion concurrency control mechanism that has been implemented by open source and commercial database systems such as PostgreSQL and Oracle. The main feature of SI is that a read operation does not block a write operation and vice versa, which allows a higher degree of concurrency than traditional two-phase locking. SI prevents many anomalies that appear in other isolation levels, but it can still result in non-serializable executions, in which database integrity constraints can be violated. Several techniques have been proposed to ensure serializable execution with engines running SI; these techniques are based on modifying the applications by introducing conflicting SQL statements. However, with each of these techniques the DBA has to make a difficult choice among possible transactions to modify. This thesis helps DBAs choose between these different techniques and choices by understanding how the choices affect system performance. It also proposes a novel technique called External Lock Manager (ELM), which introduces conflicts in a separate lock-manager object so that every execution will be serializable. We build a prototype system for ELM and run experiments to demonstrate the robustness of the new technique compared to the previous techniques. Experiments show that modifying the application code for some transactions has a high impact on performance for some choices, which makes it very hard for DBAs to choose wisely. ELM, however, has peak performance similar to SI, no matter which transactions are chosen for modification. Thus ELM is a robust technique for ensuring serializable execution.
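The sketch below illustrates the idea behind an external lock manager in the simplest possible terms: two transactions that could otherwise exhibit an SI anomaly are forced to conflict by acquiring a lock held outside the database engine before they run. It is a minimal sketch of the concept as described in the abstract, not the thesis's ELM prototype; the key scheme and the run_transaction_under_si helper are hypothetical.

```python
import threading
from contextlib import contextmanager

class ExternalLockManager:
    """Minimal external lock manager: one lock per application-chosen key,
    held outside the SI database engine."""
    def __init__(self):
        self._registry_guard = threading.Lock()
        self._locks = {}

    @contextmanager
    def lock(self, key):
        with self._registry_guard:
            lk = self._locks.setdefault(key, threading.Lock())
        lk.acquire()          # serializes the transactions that share this key
        try:
            yield
        finally:
            lk.release()

def run_transaction_under_si(conn, account_pair, amount):
    """Placeholder for the application's SQL issued under snapshot isolation."""
    pass

elm = ExternalLockManager()

def withdraw(conn, account_pair, amount):
    # Both transactions of a potentially dangerous pair take the same ELM key
    # before issuing their SQL, so they can no longer run concurrently and the
    # write-skew anomaly between them is ruled out.
    with elm.lock(("balance-check", account_pair)):
        run_transaction_under_si(conn, account_pair, amount)
```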
6

Quality of control and real-time scheduling : allowing for time-variations in computer control systems

Sanfridson, Martin January 2004 (has links)
The majority of computers around us are embedded in products and dedicated to performing certain tasks. A specific task is the control of a dynamic system. The computers are often interconnected by communication networks, forming a distributed system. Vehicles and manufacturing equipment are two types of mechatronic machines which often host dedicated computer control systems. A research problem is how the real-time behaviour of the computer system affects the application, especially the control of the dynamic system. If the internal or external conditions vary over time, it becomes difficult to assign a fixed resource reservation that will work well in all situations. In general, the more of a resource an application gets, the better its gauged or perceived quality will be. A strategy is to alter the resource reservation when the conditions change. This can be constructed as a negotiation between competing applications, a method for which the term quality of control, QoC, has been coined. Scalability is the ability to change the structure and configuration of a system. It promotes evolving systems and can help manage a complex product family. An architecture for a QoC middleware on top of a scalable computer system has been proposed. As a quality measure of a control application, the well-known weighted quadratic loss function used in optimal control has been revised to encompass a subset of the so-called timing properties. The timing properties are the periods and the delays in the control loop, including time-varying period and delay. They are the interface between control and computer engineering, from a control engineering viewpoint. The quality measure can be used both offline and online, given a model of the sampled-data system and an appropriate description of the timing properties. In order to use a computer system efficiently and to guarantee its responsiveness, real-time scheduling is a must. In fixed-priority scheduling each task arrives periodically and has a fixed priority. A task with a high priority can preempt a low-priority task and gain access to the resource. The best-case response time characterizes the delays in the system, which is useful from a control viewpoint. A new algorithm to calculate the best-case response time has been derived. It is based on a scheduling scenario which yields a recurrence equation. The model is dual to the well-known worst-case response time analysis. Besides the dynamic fixed-priority scheduling algorithm, optimal control using static scheduling has been studied, assuming limited communication. In the static schedule, which is constructed pre-runtime, each task is assigned a time window within a schedule repeated indefinitely. The optimal scheduling sequence is sought by optimizing the overall control performance. An interesting aspect is that the non-specified control period falls out as a result of the optimal schedule. The time-varying delay is accounted for in the control design. Keywords: Real-time scheduling, sampled-data control, performance measure, quality of control, limited communication, time-varying delay, jitter.
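The abstract mentions a recurrence for the best-case response time that is dual to the classic worst-case response-time analysis for fixed-priority scheduling. The sketch below implements only that well-known worst-case recurrence as a point of reference; it is not the thesis's best-case algorithm, and the task set at the end is made up for illustration.

```python
from math import ceil

def worst_case_response_time(task_index, C, T, max_iter=1000):
    """Classic worst-case response-time recurrence for fixed-priority,
    preemptive scheduling, with tasks listed highest priority first:

        R_i <- C_i + sum_{j < i} ceil(R_i / T_j) * C_j

    C, T are worst-case execution times and periods. Returns the fixed
    point, or None if the response time grows past the task's period.
    """
    i = task_index
    r = C[i]
    for _ in range(max_iter):
        r_next = C[i] + sum(ceil(r / T[j]) * C[j] for j in range(i))
        if r_next == r:
            return r
        if r_next > T[i]:          # unschedulable under implicit deadlines
            return None
        r = r_next
    return None

# Three periodic tasks, highest priority first (illustrative numbers).
C = [1, 2, 3]
T = [4, 8, 16]
for i in range(3):
    print(f"task {i}: R = {worst_case_response_time(i, C, T)}")
```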
7

A Study of Target Frequency Bond for Frequency Control Performance Score Calculations in an Isolated System

Lee, Hung-hsi 06 September 2010 (has links)
Power system frequency is one of the key performance indices of system operation. Abnormal frequency deviations have negative impacts on power equipment and service quality, so it is important to operate and regulate the system frequency within an acceptable range. The North American Electric Reliability Corporation (NERC) has used the Control Performance Standard (CPS), which evaluates control performance from the system frequency and inter-area power flows, for frequency control performance assessment since 1997. This thesis presents a CPS design for an isolated system and reports the frequency control performance of the Taiwan Power Company based on the proposed CPS1 formulation.
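For context, the sketch below shows how the standard CPS1 compliance factor simplifies when there is no tie-line interchange, which is the isolated-system situation the abstract addresses; the thesis's actual target-frequency formulation may differ, so this is only an assumed starting point.

```latex
% Standard CPS1 compliance factor (one-minute averages), and its reduction
% when there is no tie-line interchange, i.e. ACE = -10 B \Delta f:
\[
  CF \,=\, \frac{1}{\varepsilon_1^{2}}\;
     \overline{\left(\frac{ACE_{\mathrm{1\,min}}}{-10B}\right)\Delta f_{\mathrm{1\,min}}}
  \,=\, \frac{\overline{\Delta f_{\mathrm{1\,min}}^{\,2}}}{\varepsilon_1^{2}}
  \quad\text{(isolated system)},
  \qquad
  CPS1 \,=\, (2 - CF)\times 100\%.
\]
```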
