251

Evoluční návrh obvodů na úrovni tranzistorů / Evolutionary Circuit Design at the Transistor Level

Žaloudek, Luděk. Unknown Date
This project deals with the evolutionary design of electronic circuits, with an emphasis on digital circuits. It covers the theoretical foundations of evolutionary circuit design on computer systems, including Genetic Programming and Evolution Strategies, the possible design levels of electronic circuits, an overview of CMOS technology, and the most important evolutionary circuit design methods, such as development and Cartesian Genetic Programming (CGP). A new CGP-based method for designing digital circuits at the transistor level is then introduced, together with a design system implementing it. Finally, experiments performed with this system are described and evaluated.
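The Cartesian Genetic Programming encoding mentioned above can be illustrated with a minimal sketch: a genome is a fixed list of gate nodes, each wired to earlier nodes, evaluated feed-forward. The gate set, genome layout, and example circuit below are illustrative assumptions, not the thesis's actual transistor-level encoding.

```python
# Minimal CGP sketch: a genome is a list of (function, input_a, input_b)
# triples; node i may only read primary inputs or nodes with index < i.
GATES = {
    0: lambda a, b: a & b,        # AND
    1: lambda a, b: a | b,        # OR
    2: lambda a, b: 1 - (a & b),  # NAND
    3: lambda a, b: a ^ b,        # XOR
}

def evaluate(genome, output_idx, inputs):
    """Decode a CGP genome feed-forward and return the chosen output node."""
    values = list(inputs)                      # primary inputs come first
    for func, a, b in genome:
        values.append(GATES[func](values[a], values[b]))
    return values[output_idx]

# A three-node genome computing XOR from AND/OR/NAND primitives:
# node 2 = NAND(x, y), node 3 = OR(x, y), node 4 = AND(node2, node3)
genome = [(2, 0, 1), (1, 0, 1), (0, 2, 3)]
truth = [evaluate(genome, 4, (x, y)) for x in (0, 1) for y in (0, 1)]
# truth is the XOR truth table: [0, 1, 1, 0]
```

Evolution then mutates the integer triples; since connectivity is constrained to earlier nodes, every genome decodes to a valid feed-forward circuit.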
252

Evolving user-specific emotion recognition model via incremental genetic programming / 漸進型遺伝的プログラミングによるユーザ特定型の感情認識モデルの進化に関する研究

Rahadian Yusuf (ユスフ ラハディアン). 22 March 2017
This thesis proposes a methodology for evolving emotion recognition models tailored to a specific user by means of incremental genetic programming. Using genetic programming, which represents a solution as a tree of features, user-adaptive emotion recognition models were evolved from a pervasive sensor that captures facial expression data, including temporal information. To cope with the non-determinism, poor generalization, and overfitting of genetic programming, an incremental mechanism that unfolds the evolution in stages was built into the method. / This research proposes a model to tackle challenges common in emotion recognition based on facial expression. First, we use a pervasive sensor and environment, enabling natural expressions of the user, as opposed to the unnatural expressions found in large datasets. Second, the model analyzes relevant temporal information, unlike many other studies. Third, we employ a user-specific approach with adaptation to the user. We also show that the model evolved by genetic programming can be analyzed to understand how it really works, rather than remaining a black box. / Doctor of Philosophy in Engineering, Doshisha University
253

A Comparative Analysis Between Context-Based Reasoning (CxBR) and Contextual Graphs (CxGs)

Lorins, Peterson Marthen. 01 January 2005
Context-based Reasoning (CxBR) and Contextual Graphs (CxGs) both model human behavior in autonomous and decision-support situations in which optimal human decision-making is of utmost importance. Both formalisms use the notion of contexts to implement intelligent agents equipped with a context-sensitive knowledge base. However, CxBR uses a set of discrete contexts, implying that a CxBR model operates within one context at any given time interval, whereas CxGs use a continuous context-based representation of a given problem-solving scenario for decision-support processes. Both formalisms use contexts dynamically, switching between contexts as the situation requires. This thesis identifies a synergy between the two formalisms by examining their similarities and differences. It became clear during the research that each paradigm was designed with a specific family of problems in mind: CxBR best implements models of autonomous agents in an environment, while CxGs are best suited to decision-support settings that require the development of decision-making procedures. Cross-applications were implemented for each, and the results are discussed.
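The discrete-context idea behind CxBR can be sketched in a few lines: the agent is in exactly one active context at a time, and sentinel rules trigger transitions between contexts. The context names, rules, and `step` helper below are invented for illustration; they are not from the thesis.

```python
# Illustrative CxBR loop: one discrete active context; transition rules
# (sentinels) are checked before the active context's behavior runs.
class Context:
    def __init__(self, name, action, transitions):
        self.name = name
        self.action = action              # behavior while this context is active
        self.transitions = transitions    # list of (predicate, next_context_name)

def step(contexts, active, situation):
    """Fire the first matching sentinel rule, then act in the (new) context."""
    for predicate, nxt in contexts[active].transitions:
        if predicate(situation):
            active = nxt
            break
    return active, contexts[active].action(situation)

contexts = {
    "highway": Context("highway", lambda s: "cruise",
                       [(lambda s: s["exit_near"], "exit_ramp")]),
    "exit_ramp": Context("exit_ramp", lambda s: "decelerate", []),
}
active = "highway"
active, act = step(contexts, active, {"exit_near": False})  # stays in "highway"
active, act = step(contexts, active, {"exit_near": True})   # switches to "exit_ramp"
```

A CxG, by contrast, would encode the decision procedure itself as a graph traversed continuously, rather than as a set of discrete states like the dictionary above.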
254

Integrated Data Fusion and Mining (IDFM) Technique for Monitoring Water Quality in Large and Small Lakes

Vannah, Benjamin. 01 January 2013
Monitoring water quality on a near-real-time basis to address water resources management and public health concerns in coupled natural systems and the built environment is by no means an easy task. Furthermore, this emerging societal challenge will continue to grow due to ever-increasing anthropogenic impacts on surface waters. For example, urban growth and agricultural operations have led to an influx of nutrients into surface waters, stimulating harmful algal bloom formation, and stormwater runoff from urban areas contributes to the accumulation of total organic carbon (TOC) in surface waters. TOC in surface waters is a known precursor of disinfection byproducts in drinking water treatment, and microcystin is a potent hepatotoxin produced by the bacterium Microcystis, which can form expansive algal blooms in eutrophied lakes. Given the ecological impacts and human health hazards posed by TOC and microcystin, it is imperative that municipal decision makers and water treatment plant operators be equipped with a rapid and economical means to track and measure these substances. Remote sensing is an emergent solution for monitoring and measuring changes to the Earth's environment, allowing large regions anywhere on the globe to be observed on a frequent basis. This study demonstrates a prototype near-real-time early warning system using Integrated Data Fusion and Mining (IDFM) techniques with the aid of both multispectral (Landsat and MODIS) and hyperspectral (MERIS) satellite sensors to determine spatiotemporal distributions of TOC and microcystin. Landsat imagery has high spatial resolution but suffers from a long overpass interval of 16 days; conversely, freely available coarse-resolution sensors with daily revisit times, such as MODIS, cannot provide detailed water quality information because of their low spatial resolution. This issue can be resolved by data or sensor fusion techniques, an instrumental part of IDFM, in which the high spatial resolution of Landsat and the high temporal resolution of MODIS imagery are fused and analyzed by a suite of regression models to produce synthetic images with both high spatial and high temporal resolution. The same techniques are applied to the hyperspectral sensor MERIS with the aid of the MODIS ocean color bands to generate fused images with enhanced spatial, temporal, and spectral properties. The performance of the data mining models derived from fused hyperspectral and fused multispectral data is quantified using four statistical indices. A second task compared traditional two-band models against more powerful data mining models for TOC and microcystin prediction. The use of IDFM is illustrated for monitoring microcystin concentrations in Lake Erie (a large lake) and for TOC monitoring in Harsha Lake (a small lake). Analysis confirmed that data mining methods exceeded two-band models at accurately estimating TOC and microcystin concentrations in lakes, and that the more detailed spectral reflectance data offered by hyperspectral sensors produced a noticeable increase in accuracy in the retrieval of water quality parameters.
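The fusion step described above (relating a fine-resolution sensor to a coarse one on a coincident date, then synthesizing fine imagery on dates the fine sensor misses) can be sketched with a single linear model. The band values, noiseless relationship, and pixel layout are illustrative assumptions; the actual IDFM work applies a suite of regression and data mining models to georegistered imagery.

```python
import numpy as np

# Toy sensor-fusion sketch: fit fine ~ a * coarse + b on a day when both
# sensors observe the lake, then apply the model to coarse-only days.
rng = np.random.default_rng(0)
coarse_day0 = rng.uniform(0.05, 0.3, size=100)   # MODIS-like reflectances
fine_day0 = 1.8 * coarse_day0 + 0.02             # Landsat-like reflectances

# Least-squares fit of the cross-sensor regression on the coincident date.
A = np.column_stack([coarse_day0, np.ones_like(coarse_day0)])
(a, b), *_ = np.linalg.lstsq(A, fine_day0, rcond=None)

# Synthesize a fine-resolution image for a date with only coarse coverage.
coarse_day7 = rng.uniform(0.05, 0.3, size=100)
fine_day7_synthetic = a * coarse_day7 + b
```

The synthetic image inherits MODIS's daily timing with Landsat-calibrated values; the water-quality model (two-band or data mining) is then run on the fused product.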
255

Falconet: Force-feedback Approach For Learning From Coaching And Observation Using Natural And Experiential Training

Stein, Gary. 01 January 2009
Building an intelligent agent model from scratch is a difficult task, so it would be preferable to have an automated process perform it. Many manual and automatic techniques exist, but each has issues with obtaining, organizing, or making use of the data. Additionally, it can be difficult to get perfect data, and once the data is obtained, it is often impractical to get a human subject to explain why some action was performed. Because of these problems, machine learning from observation emerged to produce agent models based on observational data. Learning from observation uses unobtrusive and purely observable information to construct an agent that behaves similarly to the observed human. Typically, an observational system builds an agent based only on prerecorded observations. This type of system works well with respect to agent creation, but lacks the ability to be trained and updated on-line. To overcome these deficiencies, the proposed system adds an augmented force-feedback stage of training that senses the agent's intentions haptically. Furthermore, because not all possible situations can be observed or directly trained, a third stage of learning from practice is added so the agent can gain additional knowledge for a particular mission. These stages mimic the natural way a human might learn a task: first watching the task being performed, then being coached to improve, and finally practicing to self-improve. The hypothesis is that a system initially trained using human recorded data (observational), then tuned and adjusted using force-feedback (instructional), and then allowed to perform the task in different situations (experiential) will be better than any individual step or combination of steps.
256

Contextualizing Observational Data For Modeling Human Performance

Trinh, Viet. 01 January 2009
This research focuses on the ability to contextualize observed human behaviors in an effort to automate the process of tactical human performance modeling through learning from observation. Contextualizing human behavior aims to minimize the role and involvement of the knowledge engineers required to build intelligent Context-based Reasoning (CxBR) agents. More specifically, the goal is to automatically discover the context in which a human actor is situated when performing a mission, in order to facilitate the learning of such CxBR models. This research follows from the contextualization problem left open in Fernlund's work on using the Genetic Context Learner (GenCL) to model CxBR agents from observed human performance [Fernlund, 2004]. To accomplish context discovery, this research proposes two contextualization algorithms: Contextualized Fuzzy ART (CFA) and Context Partitioning and Clustering (COPAC). The former is a more naive approach utilizing the well-known Fuzzy ART strategy, while the latter is a robust algorithm developed on the principles of CxBR. Using Fernlund's original five drivers, the CFA and COPAC algorithms were tested and evaluated on their ability to contextualize each driver's individualized set of behaviors into well-formed and meaningful context bases, as well as to generate high-fidelity agents through integration with Fernlund's GenCL algorithm. The resulting set of agents was able to capture and generalize each driver's individualized behaviors.
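The Fuzzy ART step underlying the CFA idea can be sketched minimally: each observation either resonates with an existing category (passing a vigilance test) or creates a new one, which is how observations get grouped into candidate contexts. The parameter values and two-dimensional inputs below are illustrative assumptions, not the thesis's actual feature vectors or settings.

```python
import numpy as np

# Minimal Fuzzy ART category search: choice function ranks categories,
# the vigilance test decides resonance, fast learning updates the winner.
def fuzzy_art_choose(I, weights, rho=0.75, alpha=0.001, beta=1.0):
    """Return (category index, weights); appends a new category on mismatch.
    Inputs are assumed to lie in [0, 1] (complement coding omitted here)."""
    order = sorted(range(len(weights)),
                   key=lambda j: -np.minimum(I, weights[j]).sum()
                                 / (alpha + weights[j].sum()))
    for j in order:
        match = np.minimum(I, weights[j]).sum() / I.sum()
        if match >= rho:                      # vigilance test passed: resonate
            weights[j] = beta * np.minimum(I, weights[j]) + (1 - beta) * weights[j]
            return j, weights
    weights.append(I.copy())                  # no resonance: new category
    return len(weights) - 1, weights

# Two dissimilar observations land in separate categories (contexts).
w = []
i0, _ = fuzzy_art_choose(np.array([0.9, 0.1]), w)
i1, _ = fuzzy_art_choose(np.array([0.1, 0.9]), w)
```

In a contextualization setting, each resulting category would be treated as a candidate context over the observed behavior stream.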
257

[en] PETROLEUM SCHEDULING MULTIOBJECTIVE OPTIMIZATION FOR REFINERY BY GENETIC PROGRAMMING USING DOMAIN SPECIFIC LANGUAGE / [pt] OTIMIZAÇÃO MULTIOBJETIVO DA PROGRAMAÇÃO DE PETRÓLEO EM REFINARIA POR PROGRAMAÇÃO GENÉTICA EM LINGUAGEM ESPECÍFICA DE DOMÍNIO

Pereira, Cristiane Salgado. 26 November 2018
[en] Refinery scheduling can be understood as a sequence of decisions that optimizes the allocation of available resources, the sequencing of activities, and their execution on proper timing, while respecting restrictions of different natures. The final result must achieve multiple objectives, with factors such as meeting production demand and minimizing operational variation in the equipment coexisting in the same function. This work proposes the use of genetic programming to automate the construction of programs that represent a complete oil scheduling solution within a defined time horizon. For the evolution of these programs, a domain-specific language was developed for oil scheduling problems and applied to represent the most relevant activities of the case study. The first step was to evaluate a few real scheduling scenarios in order to select which activities needed to be represented and how. In the proposed model, the quantum chromosome stores the superposition of states of all possible solutions; through the evolutionary process and the observation of the quantum genes, the classic chromosome is created as a linear sequence of instructions to be executed, and the executed instructions represent the schedule. This process is guided by a multi-objective fitness function that ranks evaluations of the operating time of the distillation units, the deadline for unloading ships, and the utilization of the pipeline that moves oil between the terminal and the refinery, along with factors such as the number of tank switchovers and the use of injection tanks at the distillation units. The work also includes a study of the parameter settings for the developed model, based on one of the selected scheduling scenarios. With the parameters defined, the proposed model was evaluated over multiple runs on five oil scheduling scenarios, and the results were compared with a study using genetic algorithms whose chromosome represents activities by order. Genetic programming produced between 25 percent and 90 percent accepted solutions depending on the scenario's complexity, exceeding the genetic algorithm results in every scenario with less computational effort.
258

Gene expression programming for logic circuit design

Masimula, Steven Mandla. 02 1900
Finding an optimal solution for the logic circuit design problem is challenging and time-consuming, especially for complex logic circuits. As the number of logic gates increases, the task of designing optimal logic circuits extends beyond human capability. A number of evolutionary algorithms have been invented to tackle a range of optimisation problems, including logic circuit design. This dissertation explores two of these evolutionary algorithms, Gene Expression Programming (GEP) and Multi Expression Programming (MEP), with the aim of integrating their strengths into a new Genetic Programming (GP) algorithm. GEP was invented by Candida Ferreira in 1999 and published in 2001 [8]. The GEP algorithm inherits the advantages of the Genetic Algorithm (GA) and GP, and it uses a simple encoding method to solve complex problems [6, 32]. While GEP emerged as powerful due to its simplicity of implementation and flexibility in genetic operations, it is not without weaknesses; some of these are discussed in [1, 6, 21]. Like GEP, MEP is a GP variant that uses linear chromosomes of fixed length [23]. A unique feature of MEP is its ability to store multiple solutions of a problem in a single chromosome. MEP can also implement code reuse, achieved through a representation that allows multiple references to a single sub-structure. This dissertation proposes a new GP algorithm, Improved Gene Expression Programming (IGEP), which improves the performance of traditional GEP by combining the code-reuse capability of MEP with the simple gene encoding of GEP. The results obtained using IGEP and traditional GEP show that the two algorithms are comparable in success rate when applied to simple problems such as basic logic functions. However, for complex problems such as a one-bit Full Adder (FA) and an AND-OR Arithmetic Logic Unit (ALU), IGEP performs better than traditional GEP due to its code reuse. / Mathematical Sciences / M. Sc. (Applied Mathematics)
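GEP's "simple encoding method" mentioned above can be sketched: a fixed-length linear gene (a K-expression) is read breadth-first into an expression tree, and surplus tail symbols are simply ignored, so any mutation still yields a valid tree. The symbol set and example gene are illustrative assumptions, not the dissertation's actual function set.

```python
# GEP decoding sketch: each symbol consumes as many following symbols as
# its arity, breadth-first; leftover tail symbols are unused padding.
ARITY = {"A": 2, "O": 2, "N": 1, "a": 0, "b": 0}   # AND, OR, NOT, terminals

def decode(gene):
    """Turn a K-expression string into a (symbol, children) tree."""
    nodes = [[sym, []] for sym in gene]
    queue, nxt = [nodes[0]], 1
    for node in queue:                       # list grows while iterating
        for _ in range(ARITY[node[0]]):
            node[1].append(nodes[nxt]); queue.append(nodes[nxt]); nxt += 1
    return nodes[0]

def evaluate(node, env):
    sym, kids = node
    if sym == "A": return evaluate(kids[0], env) & evaluate(kids[1], env)
    if sym == "O": return evaluate(kids[0], env) | evaluate(kids[1], env)
    if sym == "N": return 1 - evaluate(kids[0], env)
    return env[sym]

tree = decode("OAaNbba")
# Breadth-first reading yields OR(AND(NOT(b), b), a); the final 'a' is
# unused tail padding, and the whole expression simplifies to just a.
```

MEP differs in that each gene position is itself a candidate expression referencing earlier positions, which is what enables the sub-structure reuse that IGEP borrows.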
259

AZIP, audio compression system: Research on audio compression, comparison of psychoacoustic principles and genetic algorithms

Chen, Howard. 01 January 2005
The purpose of this project is to investigate the differences between psychoacoustic principles and genetic algorithms (GAs) for audio compression. These are discussed separately. The review also compares the compression ratio and the quality of the decompressed files produced by the two methods.
260

Gramatická evoluce v optimalizaci software / Grammatical Evolution in Software Optimization

Pečínka, Zdeněk. January 2017
This master's thesis offers a brief introduction to evolutionary computation. It describes and compares genetic programming and grammar-based genetic programming, along with their potential use in automatic software repair, and studies possible applications of grammar-based genetic programming to this problem. Grammar-based genetic programming is then used in the design and implementation of a new method for automatic software repair. An experimental evaluation of the implemented repair method was performed on a set of test programs.
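Grammar-based genetic programming as described above typically rests on grammatical evolution's genotype-to-program mapping: integer codons select grammar productions via `codon % len(rules)`, so every genome decodes to a syntactically valid program, which matters when the "programs" are candidate software patches. The grammar and codon values below are illustrative assumptions.

```python
# Grammatical-evolution mapping sketch: expand the start symbol depth-first,
# consuming one codon per nonterminal to pick among its productions.
GRAMMAR = {
    "<expr>": [["<expr>", "<op>", "<expr>"], ["<var>"]],
    "<op>":   [["+"], ["*"]],
    "<var>":  [["x"], ["1"]],
}

def ge_map(codons, symbol="<expr>", max_wraps=2):
    """Map a list of integer codons to a program string under GRAMMAR."""
    stream = iter(codons * max_wraps)          # wrap the genome if exhausted
    def expand(sym):
        if sym not in GRAMMAR:
            return sym                         # terminal: emit as-is
        rules = GRAMMAR[sym]
        choice = rules[next(stream) % len(rules)]
        return "".join(expand(s) for s in choice)
    return expand(symbol)

program = ge_map([0, 1, 0, 1, 1, 0])
# Codons 0,1,0,1,1,0 derive <expr> -> <expr><op><expr> -> ... -> "x*x"
```

Because variation operators act on the integer genome while the grammar guarantees syntactic validity, repairs stay within the target language, which is the key appeal for automatic software repair.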
