  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
461

Tvorba písma OpenType volně dostupnými softwarovými prostředky / Making OpenType fonts with free software

Bednár, Peter, January 2011
This thesis gives a detailed account of typography and of computer fonts in the OpenType format. It opens with the historical development of typefaces, with emphasis on the evolution of roman and blackletter type and their characteristics. Having presented these foundations of typography, the thesis turns to digital fonts, focusing on the possibilities of the OpenType format: its characteristics and advantages over other formats are listed, and it is judged a suitable format for font creation in teaching as well. Letterspacing and kerning are discussed among the basic graphical adjustments made when creating a font. The theoretical part of the thesis surveys the programs available for creating fonts in the OpenType format. Besides freely available tools, commercial ones were included in the overview, because the free applications lack some of the more advanced instruments and functions. The evaluation found the commercial program FontLab Fontographer most convenient for teaching, alongside the free Type lite and, on the open-source platform, FontForge. The practical part of the thesis focuses on two of the chosen programs and on creating the main characteristics of a font. The goal was to determine whether identical results can be reached with both programs. Fontographer offered a wide palette of tools for processing vector graphics, comparable to those of Adobe Illustrator. Type lite provided rather fewer instruments, which is still sufficient for elementary work and for becoming familiar with the creation of a digital typeface; the basic shortcoming of the freeware is the absence of kerning, spacing, and hinting functions. Comparing the programs' capabilities shows that the freeware programs for Windows are, in their functionality, sufficient only for entry-level users. The best option among the freely available programs is FontForge for Linux, which supports the typographic functions mentioned above.
Fontographer was recommended for teaching the basic characteristics of the OpenType font format. A further goal of the thesis was to produce a recommended working procedure for students creating the basic characteristics of an OpenType font; it is enclosed at the end of the thesis.
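The letterspacing and kerning operations compared across the font editors above can be illustrated with a small sketch. All metrics below are invented for illustration (in 1/1000 em units); they are not taken from any of the reviewed programs.

```python
# Illustration of how letterspacing and kerning combine when setting type.
# Advance widths and kerning pairs are invented metrics (1/1000 em).
ADVANCE = {"A": 600, "V": 600, "T": 580, "o": 480}
KERNING = {("A", "V"): -80, ("V", "A"): -80, ("T", "o"): -70}

def line_width(text, tracking=0):
    """Width of `text`: per-glyph advances, plus uniform tracking
    (letterspacing), plus pair-specific kerning adjustments."""
    width = 0
    for i, ch in enumerate(text):
        width += ADVANCE[ch] + tracking
        if i + 1 < len(text):
            width += KERNING.get((ch, text[i + 1]), 0)
    return width
```

This is why the absence of kerning and spacing functions in the freeware tools matters: without the pair table, "AV" would simply be the sum of the two advances and appear too loose.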
462

Initialiser et calibrer un modèle de microsimulation dynamique stochastique : application au modèle SimVillages / Initialize and Calibrate a Dynamic Stochastic Microsimulation Model : application to the SimVillages Model

Lenormand, Maxime, 12 December 2012
The purpose of this thesis is to develop statistical tools to initialize and calibrate dynamic stochastic microsimulation models, starting from their application to the SimVillages model (developed within the European PRIMA project). The model couples demographic and economic dynamics applied to the population of a set of rural municipalities. Each individual, represented explicitly in a household living in a municipality and possibly working in another, has his own life trajectory. The model thus includes rules for choices about study, career, marriage, childbirth, divorce, migration, and death. We developed, implemented, and tested the following models and methods: • a model to generate a synthetic population from aggregate data, in which each individual lives in a household in a municipality and has an employment status; this synthetic population is the initial state of the model. • a model to simulate an origin-destination commuting table from aggregate data, in order to assign a place of work to each individual working outside his municipality of residence. • a model to estimate the number of jobs in local services in a given municipality as a function of its number of inhabitants and of its neighborhood in terms of services. • a method to calibrate the unknown parameters of the SimVillages model so as to satisfy a set of error criteria defined over heterogeneous data sources. This method is based on a new sequential Approximate Bayesian Computation algorithm using importance sampling. When applied to a toy example and to the SimVillages model, our algorithm is 2 to 8 times faster than the three main sequential ABC algorithms currently available.
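The Approximate Bayesian Computation idea behind the calibration method can be sketched in a few lines. This is plain rejection ABC on a toy Bernoulli model, not the sequential importance-sampling algorithm of the thesis; the prior, tolerance, and summary statistic are all illustrative choices.

```python
import random

def simulate(theta, n=200, rng=random):
    """Toy stochastic model: share of n Bernoulli(theta) successes."""
    return sum(rng.random() < theta for _ in range(n)) / n

def abc_rejection(observed, n_draws=2000, tol=0.03, seed=0):
    """Plain rejection ABC: draw parameters from the prior, simulate,
    and keep draws whose summary statistic falls within `tol` of the
    observed one; the kept draws approximate the posterior."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = rng.random()                  # uniform prior on [0, 1]
        if abs(simulate(theta, rng=rng) - observed) < tol:
            accepted.append(theta)
    return accepted

posterior = abc_rejection(observed=0.30)
estimate = sum(posterior) / len(posterior)
```

Sequential ABC variants such as the one proposed in the thesis improve on this by reusing accepted draws to propose new ones, which is where the reported 2 to 8 times speed-up comes from.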
463

台灣土地稅 累進稅VS 比例稅 比較研究 / Land Taxation in Taiwan: A Comparative Study of Progressive versus Proportional Taxation

林修正, Unknown Date
Most discussions of Taiwan's land taxation develop from Sun Yat-sen's land-tax doctrine: they advocate taxing away the "unearned" gains from changes in land value through a high-rate land value increment tax in pursuit of social equity and, from the same concern for equity, favor a progressive land value tax, using higher rates to promote efficient land use. This thesis argues that this view is mistaken. Starting from production theory, we reason that land's productivity in production activities varies with the proportions in which the factors of production are combined, and from this we derive the commonly assumed relationships among land prices, quantities, and land income. Changes in land prices are driven mainly by changes in land income (rent): the income change is the cause and the price change the effect. The proper way to address land-price problems is therefore to tax land income, that is, the land value tax, not to tax price changes via the land value increment tax. The current policy of taxing land value lightly while taxing increments heavily not only fails to address Sun Yat-sen's concern with curbing "unearned" gains, it makes the problem worse. On equity, the thesis stresses that not only the use of land but all productive resources should be treated alike, which is the spirit running through Sun Yat-sen's land-tax thought; Sun's overemphasis on land's special character has led to land being treated unequally in economic activity, and as resources move between economic sectors this inequality makes land prices, and hence the allocation of resources, even less reasonable. We also contrast Sun's notion of equity with the equity rationale of today's progressive taxes, concluding that the current progressive land taxes have departed from Sun's original intent, and that the progressive design in many respects fails to promote equity and may even worsen it. On efficiency, we hold that the price mechanism allocates resources best, so the relative land incomes and relative prices embodied in the rent gradient matter. Because a 100% land value increment tax produces a lock-in effect, the tax was converted to a progressive schedule; this disturbs the original price system, distorts resource allocation, and makes already opaque price information even less transparent. Worse, even the land value tax, which bears on land income, is levied progressively, further disordering land resources. We conclude that progressive land taxation not only distorts Sun Yat-sen's original intent but contributes little to the distribution of wealth. Further, we introduce expectations and equilibrium concepts to explain land-price movements: because transaction prices embody expectations, the rate of return they imply for land buyers is often below market conditions, so buying land is not necessarily profitable. From an equilibrium standpoint, the common belief that purchasing land yields high returns is likewise incorrect; if it did, those selling land would be fools. Finally, we use the announced land values and announced current values of Chiayi City to corroborate our reasoning.
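The contrast between the proportional and progressive schedules debated above can be made concrete with a small numeric sketch. The bracket bounds and rates are invented for illustration; they are not Taiwan's statutory land tax rates.

```python
def proportional_tax(base, rate=0.10):
    """Flat-rate schedule: the same rate applies to the whole base."""
    return base * rate

def progressive_tax(base,
                    brackets=((100, 0.10), (300, 0.20), (float("inf"), 0.40))):
    """Marginal-rate schedule: each slice of the base is taxed at its own
    bracket rate. Brackets are (upper_bound, rate) pairs, invented here."""
    tax, lower = 0.0, 0.0
    for upper, rate in brackets:
        if base > lower:
            tax += (min(base, upper) - lower) * rate
        lower = upper
    return tax
```

Under the flat schedule a base of 400 pays 40; under the progressive schedule the same base pays 90, since the top slice is taxed at 40%. It is exactly this rate wedge on large bases that the thesis argues distorts relative prices and creates the lock-in effect.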
464

以比例危險模型估計房貸借款人提前清償及違約風險 / Estimating Mortgage Borrowers' Prepayment and Default Risk with a Proportional Hazard Model

鍾岳昌 (Chung, Yueh-chang), Unknown Date
A mortgage borrower has two potential risk behaviors with respect to the loan obligation: prepayment and default. Both matter greatly for financial institutions' asset management and for the real-estate securitization that has developed in finance in recent years, because prepayment and default create uncertainty in interest income and cash flows and thereby affect the value of mortgage claims, bringing risk to originating lenders, securitization guarantors, and security investors. Whether a borrower prepays or defaults depends not only on the borrower's own characteristics and the loan terms, but also on variables that change as time passes: many influencing factors do not stay at their values at loan origination but adjust dynamically over the life of the loan, further affecting borrower behavior. Such variables are time-dependent (time-varying) variables. This study therefore uses the proportional hazard model, which handles time-dependent variables conveniently, to analyze borrowers' prepayment and default behavior, examining how borrower characteristics, housing type, loan terms, and macroeconomic variables relate to borrower risk behavior. The empirical results show that, among borrower characteristics, education has the clearest effect on both prepayment and default risk: the more educated are more likely to prepay, while the less educated are more likely to default. By housing type, detached houses are more prone to both prepayment and default than non-detached ones. Among the loan terms, loan amount and loan-to-value ratio are both positively related to default; that is, the heavier the interest burden, the higher the borrower's default risk. On the macroeconomic side, borrowers are most sensitive to interest-rate changes, reflecting that the interest rate represents the borrower's cost of funds and is the financial motive and incentive driving prepayment and default.
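The proportional-hazard form used in this study can be sketched as follows. The coefficients and covariate values below are invented for illustration, not the thesis's estimates; the point is that a time-dependent covariate such as the interest rate shifts the hazard multiplicatively.

```python
import math

def cox_hazard(baseline_hazard, betas, covariates):
    """Proportional-hazard form h(t | x) = h0(t) * exp(beta . x(t)).
    The covariate vector may change over time (time-dependent variables),
    e.g. the market interest rate during the life of the loan."""
    score = sum(b * x for b, x in zip(betas, covariates))
    return baseline_hazard * math.exp(score)

# Invented coefficients for illustration: education level, loan-to-value
# ratio, and interest rate (none are the thesis's estimated values).
betas = [0.5, 1.2, 2.0]
h_low_rate = cox_hazard(0.01, betas, [1.0, 0.7, 0.02])
h_high_rate = cox_hazard(0.01, betas, [1.0, 0.7, 0.05])
rate_effect = h_high_rate / h_low_rate  # depends only on the changed covariate
```

Note that the hazard ratio between the two scenarios cancels the baseline hazard entirely, which is why the model isolates the effect of a covariate change without specifying h0(t).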
466

Handcuffs or Stethoscopes: A Cross-National Examination of the Influence that Political Institutions and Bureaucracy have on Public Policies Concerning Illegal Drugs

Nilson, Chad, 16 May 2008
This dissertation attempts to explain why cross-national variation exists in government approaches to dealing with illegal drugs. As other scholars have shown, several domestic and international political factors account for some of this variance. However, less is known about the effect that bureaucratic dominance and political institutions may have on drug policy. This research argues that bureaucrats define problems in ways that make their services the best possible solution to policymakers. Mediating the ability of bureaucrats to influence drug policy outcomes are political institutions. Certain institutional structures foster a competitive policymaking environment, while others foster a more cooperative one. In the former, law enforcement approaches to the drug problem are often retained as the status quo, because competition between policy actors prevents consideration of alternatives. In the latter, prevention, treatment, and harm reduction approaches are developed, because cooperation between policymakers allows other actors, namely public health bureaucrats, to influence drug policy decision making. To test this argument, I constructed an original dataset that includes over 4,000 observations of drug policy in 101 democracies. Institutional data on intergovernmental relations, regime type, political bargaining, electoral design, and cameralism were regressed on six different drug policy indices: law enforcement, deterrence-based prevention, abstinence-based treatment, education-based prevention, substitution-based treatment, and harm reduction. While controlling for government resource capacity, severity of the drug problem, international pressure, and political ideology, I found that institutions explain a portion of the variance in drug policy outcomes.
Providing in-depth information about these phenomena is a large body of field data I collected while interviewing 155 politicians, bureaucrats, interest group leaders, and service providers. Respondents from all four of the case countries examined in this research, including the United States, Canada, Austria, and the Netherlands, report that bureaucrats play a major role in the formation of drug policy. Which bureaucrats have the most influence on policymakers is largely a function of domestic political conditions, international political factors, and political institutions.
467

Tutoriál Blenderu / Blender Tutorial

Tomšík, Filip, Unknown Date
This thesis focuses on the increasingly popular, freely available program Blender, whose range of applications in various fields keeps growing. The thesis aims at a general description of the basic modelling techniques, working with mesh objects as well as curves, used for creating models and computer graphics. Part of the thesis is dedicated to a description of the animation capabilities of the programme, an outline of the different methods, and a more detailed description of the corresponding modules. The central point of the thesis is to introduce the reader, in the form of a tutorial, to the basic and advanced modelling techniques and their use in Blender.
468

Duração da hospitalização e faturamento das despesas hospitalares em portadores de cardiopatia congênita e de cardiopatia isquêmica submetidos à intervenção cirúrgica cardiovascular assistidos no protocolo da via rápida / Duration of hospitalization and hospital expenditures for congenital heart disease and ischemic heart disease patients submitted to cardiac surgical operations in fast-track recovery

Fernandes, Alfredo Manoel da Silva, 30 April 2003
Objective: to evaluate the care of patients undergoing cardiovascular surgery under the fast-track recovery protocol, compared with the conventional protocol, by comparing the movement of patients through the different hospital units under both protocols. The study was carried out in a 400-bed public university hospital specializing in cardiology, a tertiary referral center for the Brazilian Unified Health System. Patients: 175 patients were studied, 107 (61%) men and 68 (39%) women, aged from 2 months to 81 years, of whom 107 were operated on under the fast-track protocol and 68 under the conventional protocol. Inclusion criteria were first surgical intervention, congenital or ischemic heart disease without complexity, normal ventricular function, and at least two preoperative outpatient consultations; patients undergoing emergency surgery were excluded. Measures: demographic and clinical variables were evaluated and, to assess patient movement through the hospital units (outpatient clinic, preoperative admission unit, surgical center, postoperative recovery unit, and postoperative ward), the discharge rate per unit of time in each unit. Statistical analysis used exploratory analysis, the Kaplan-Meier method, and the Cox proportional hazards model; analysis of variance was used to compare billed expenses. Results: for congenital heart disease patients, at the 95% confidence level, the discharge rate per unit of time under the fast-track protocol, relative to the conventional protocol, was: a) 11.3 times the conventional rate for the time spent in the surgical center; b) 6.3 times for the duration of the surgical intervention; c) 6.8 times for the duration of anesthesia; d) 1.5 times for the duration of perfusion; e) 2.8 times for the stay in the postoperative recovery unit; f) 6.7 times for the duration of hospitalization (time between admission and discharge); g) 2.8 times for the stay in the preoperative admission unit (time between admission date and surgery date); h) 2.1 times for the stay in the postoperative ward (time between leaving the postoperative recovery unit and hospital discharge).
For ischemic heart disease patients, the discharge rates under the fast-track and conventional protocols showed no statistically significant difference. For congenital heart disease patients, the billed expenses for examinations and procedures in the pre- and postoperative phases, and for examinations in the intraoperative phase, were lower under the fast-track protocol. Conclusions: congenital heart disease patients spent less time in the hospital's installed medical resources, and incurred lower pre- and postoperative expenses, when cared for under the fast-track recovery protocol; the data suggest that care can be made more efficient with respect to the variables studied under this protocol.
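The Kaplan-Meier method used above to compare length of stay across units can be sketched in a few lines. The durations below are invented toy data (days until discharge, with one censored observation), not the study's records.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times:  observed durations (e.g. days from admission to discharge)
    events: 1 if the event (discharge) occurred at that time, 0 if censored.
    Returns a list of (time, S(time)) at each event time."""
    data = sorted(zip(times, events))
    n_at_risk, survival, curve, i = len(data), 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        tied = [e for tt, e in data if tt == t]
        deaths = sum(tied)
        if deaths:
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= len(tied)
        i += len(tied)
    return curve

# Five invented stays (days); the third observation is censored.
curve = kaplan_meier([3, 5, 5, 8, 10], [1, 1, 0, 1, 1])
```

A steeper drop in S(t) for the fast-track group corresponds to the higher discharge rates per unit of time reported in the study.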
469

Contribution de la Théorie des Valeurs Extrêmes à la gestion et à la santé des systèmes / Contribution of extreme value theory to systems management and health

Diamoutene, Abdoulaye, 26 November 2018
The operation of a system may in general be affected by an unforeseen incident. When such an incident has major consequences for the system's integrity and for the quality of its products, it falls within the scope of so-called extreme events. Researchers therefore take an increasing interest in modelling extreme events, for studies such as system reliability and the prediction of the various risks that can hinder the proper functioning of a system. This thesis takes place in that perspective. We use Extreme Value Theory (EVT) and extreme order statistics as decision-support tools for modelling and risk management in machining and in aviation. Specifically, we model the surface roughness of machined parts, and the reliability of the associated cutting tool, with extreme order statistics. We also built a model using the Peaks-Over-Threshold (POT) approach to predict potential victims in American General Aviation (AGA) following extreme accidents. Furthermore, systems subject to environmental factors or covariates are most often modelled with proportional hazard models based on the hazard function. In proportional hazard models the baseline hazard is typically of Weibull type, which is a monotonic function; yet analysis of the operation of some systems, such as the cutting tool in industry, has shown that a system can perform poorly in one phase and improve in the next. Modifications have therefore been made to the Weibull distribution to obtain non-monotonic baseline hazards, in particular hazards that first increase and then decrease.
Despite these modifications, accounting for extreme operating conditions and avoiding the overestimation of risk remain problematic. Starting from the standard Gumbel distribution, we therefore proposed an increasing-then-decreasing baseline hazard that takes extreme operating conditions into account, and established the corresponding mathematical proofs; an example of application in industry is also given. The thesis is divided into four chapters, together with a general introduction and a general conclusion. The first chapter recalls some basic notions of extreme value theory. The second chapter covers the basic concepts of survival analysis, particularly those relating to reliability analysis, and proposes an increasing-decreasing hazard function for the proportional hazards model. The third chapter deals with the use of extreme order statistics in machining, notably in the detection of defective parts in batches, the reliability of the cutting tool, and the modelling of the best roughness surfaces. The last chapter focuses on the prediction of potential victims in American General Aviation from historical data using the Peaks-Over-Threshold approach.
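The Peaks-Over-Threshold step described above can be sketched as follows: keep the excesses over a high threshold and fit a Generalized Pareto Distribution to them. The data and threshold are invented, and the fit shown is a simple method-of-moments estimate rather than the thesis's procedure.

```python
def pot_exceedances(data, threshold):
    """Peaks-Over-Threshold step: keep the amounts by which the
    observations exceed the chosen high threshold."""
    return [x - threshold for x in data if x > threshold]

def gpd_moment_fit(excesses):
    """Method-of-moments fit of the Generalized Pareto Distribution to
    the excesses: returns (shape xi, scale sigma). Uses the identities
    mean = sigma/(1-xi) and var = sigma^2 / ((1-xi)^2 (1-2*xi))."""
    n = len(excesses)
    mean = sum(excesses) / n
    var = sum((x - mean) ** 2 for x in excesses) / (n - 1)
    ratio = mean * mean / var
    xi = 0.5 * (1 - ratio)
    sigma = 0.5 * mean * (ratio + 1)
    return xi, sigma

# Invented sample (e.g. yearly casualty counts); threshold chosen by eye.
excesses = pot_exceedances([1, 2, 3, 9, 12, 15, 20], threshold=8)
xi, sigma = gpd_moment_fit(excesses)
```

The sign of the fitted shape parameter xi then indicates whether the tail is bounded (xi < 0), exponential (xi = 0), or heavy (xi > 0), which drives the risk predictions.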
470

Aplicação do algoritmo genético adaptativo com hipermutação no ajuste dos parâmetros dos controladores suplementares e dispositivo FACTS IPFC / Application of an adaptive genetic algorithm with hypermutation to the tuning of supplementary controller parameters and the IPFC FACTS device

Cordero Bautista, Luis Gustavo, January 2019
Advisor: Percival Bueno de Araujo. Disturbances and load variations produce electromechanical oscillations, which must be damped as quickly as possible to ensure the reliability and stability of the network. This work presents an analysis of the Interline Power Flow Controller (IPFC) FACTS device and the proportional-integral (PI) controller in managing power flows, and of the influence of Power System Stabilizers (PSS) and the IPFC Power Oscillation Damping (POD) controller on the stability of the electric power system.
The work focuses on small-signal stability studies, using an Adaptive Genetic Algorithm with Hypermutation (AGAH) to tune the parameters of the supplementary damping controllers, the PSS and the POD, in a coordinated way so as to ensure adequate damping. The AGAH aims to find the optimal controller parameters that improve the weak damping of local and inter-area low-frequency oscillations. The electric power system, including the Interline Power Flow Controller device, is represented by the current sensitivity model (CSM). The two-area, 14-bus symmetrical power system and the New England power system are taken as test systems to assess the proposed algorithm. Coding and simulations were carried out in MATLAB. Finally, the AGAH and the standard genetic algorithm are compared in convergence time and performance, showing that the AGAH is an interesting optimization technique that outperforms the GA. (Master's thesis)
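The adaptive-hypermutation idea used above can be sketched on a toy problem. The population sizes, rates, and the sphere objective below are all illustrative stand-ins; the real cost in the thesis is a damping criterion evaluated on the power-system model.

```python
import random

def agah_minimize(fitness, dim, pop_size=30, gens=120, seed=1):
    """Minimal sketch of an adaptive genetic algorithm with hypermutation:
    when the best fitness stagnates, the mutation probability and step size
    are raised to restore population diversity, then reset on improvement."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    mut_prob, step = 0.2, 0.1
    best, stall = None, 0
    for _ in range(gens):
        pop.sort(key=fitness)
        if best is None or fitness(pop[0]) < best - 1e-12:
            best, stall = fitness(pop[0]), 0
            mut_prob, step = 0.2, 0.1          # normal regime
        else:
            stall += 1
            if stall >= 5:
                mut_prob, step = 0.8, 1.0      # hypermutation on stagnation
        elite = pop[: pop_size // 4]           # elitist selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]           # crossover
            child = [x + rng.gauss(0, step) if rng.random() < mut_prob else x
                     for x in child]                               # mutation
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

# Toy objective standing in for the controller-tuning cost (sphere function).
sol = agah_minimize(lambda v: sum(x * x for x in v), dim=3)
```

Elitism keeps the best candidate monotonically improving, while the hypermutation burst is what distinguishes the AGAH from a plain GA when the search stalls.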
