About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Quantum information and entropy

Ibinson, Ben January 2008 (has links)
No description available.
2

Stroboscopic point concentration in hyper-chaotic system

Jan, Heng-tai 01 July 2010 (has links)
Detecting phase locking in a forced oscillator with dual attractors and an ill-defined phase structure was difficult until a quantitative approach based on the stroboscopic method was constructed. We study the route to weak phase locking in a chaotic system, the Chua oscillator, which has a complex attractor structure, by analyzing its stroboscopic points. The onset of weak phase locking detected with this statistical approach matches well the critical coupling strength calculated from the conditional Lyapunov exponent. The detailed structure of the phase-locking intensity is described by an Arnold tongue diagram. Moreover, we apply this approach to three hyper-chaotic systems with multi-scroll attractors: the hyper-chaotic Rössler system, the hyper-chaotic Lorenz system, and a modified MCK oscillator. Weak phase locking between a hyper-chaotic system and a periodic or chaotic driving force is observable under the condition of stroboscopic point concentration.
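The stroboscopic criterion can be illustrated on a much simpler system. The sketch below uses an Adler-type forced phase oscillator as a hypothetical stand-in for the Chua circuit (the model, parameter values, and the mean-resultant-length statistic are illustrative assumptions, not the thesis's setup): sampling the phase once per drive period yields points that concentrate on the circle when the oscillator locks and spread out when it drifts.

```python
import cmath
import math

def strobe_phases(omega0, omega_d, eps, t_end=2000.0, dt=0.01):
    # Adler-type forced phase oscillator (a stand-in for the Chua circuit):
    #   dphi/dt = omega0 + eps * sin(omega_d * t - phi)
    # Sample phi once per drive period: the stroboscopic section.
    period = 2.0 * math.pi / omega_d
    phi, t, next_strobe = 0.0, 0.0, period
    samples = []
    while t < t_end:
        phi += dt * (omega0 + eps * math.sin(omega_d * t - phi))
        t += dt
        if t >= next_strobe:
            samples.append(phi % (2.0 * math.pi))
            next_strobe += period
    return samples

def concentration(phases, burn=50):
    # Mean resultant length of the stroboscopic points: near 1 when the
    # points concentrate (phase locking), near 0 when they spread out.
    pts = phases[burn:]
    return abs(sum(cmath.exp(1j * p) for p in pts)) / len(pts)

# |detuning| < eps: locked, points concentrate; |detuning| > eps: drift.
locked = concentration(strobe_phases(1.0, 1.05, eps=0.2))
drifting = concentration(strobe_phases(1.0, 1.05, eps=0.01))
```

With these parameters the locked run's concentration approaches 1 while the drifting run's stays low, mirroring the stroboscopic-point-concentration condition.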
3

The information analysis and the research on entropy for measurement data / Matavimo duomenų informacinė analizė ir entropijos tyrimas

Rybokas, Mindaugas 28 September 2006 (has links)
The information entropy parameter has been adapted to express the result of data assessment; in the presented expression of the result it is supplemented by an index of the data sample that was evaluated out of the whole set of data characterizing the object. A modelling system and software have been developed that can be used to solve systems of equations with a large number of unknowns and are used in practice for processing measurement data of circular raster scales.
4

Interaction Based Measure of Manufacturing Systems Complexity and Supply Chain Systems Vulnerability Using Information Entropy

Alamoudi, Rami Hussain 20 April 2008 (has links)
The first primary objective of this dissertation is to develop a framework that can quantitatively measure the complexity of manufacturing systems in various configurations, including conjoined and disjoined systems. An analytical model of manufacturing system complexity that employs information entropy theory is proposed and verified. The model uses the probability distribution of information about resource allocations, described in terms of interactions among resources for part processing and of part processing requirements. In the proposed framework, both direct and indirect interactions among resources are modeled using a matrix, called the interaction matrix, which accounts for part processing and waiting times. The proposed complexity model identifies a manufacturing system with evenly distributed interactions among resources as more complex, because under a disruption more information is required to identify its source. In addition, implicit relationships between system complexity and performance, in terms of resource utilization, waiting time, cycle time and throughput, are studied by developing a computer program that simulates a general job shop environment. The second primary objective is to develop a mathematical model for measuring the vulnerability of supply chain systems. Global supply chains are exposed to many kinds of disruptions, which has pushed the issue of supply chain resilience higher than ever before on business as well as supporting agendas. An extension of the proposed manufacturing complexity measure is used to gauge the vulnerability of supply chain systems using information entropy theory and an influence matrix. We define the vulnerability of supply chain systems in terms of the information required to describe the system's topology and the interrelationships among its components.
The proposed vulnerability framework focuses on disruptive events such as natural disasters, terrorist attacks, or industrial disputes, rather than on deviations such as variations in demand, procurement and transportation.
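As a rough illustration of the entropy idea (not the dissertation's actual interaction matrix, which also accounts for processing and waiting times), one can normalize a matrix of interaction intensities into a probability distribution and take its Shannon entropy; evenly spread interactions then score as more complex than a single dominant interaction.

```python
import math

def interaction_entropy(matrix):
    # Normalize interaction intensities into a probability distribution and
    # take its Shannon entropy in bits: evenly distributed interactions give
    # higher entropy, read here as higher complexity.
    flat = [v for row in matrix for v in row]
    total = sum(flat)
    probs = [v / total for v in flat if v > 0]
    return -sum(p * math.log2(p) for p in probs)

even = [[1, 1], [1, 1]]    # interactions spread over all resource pairs
skewed = [[8, 0], [0, 0]]  # one dominant interaction
# interaction_entropy(even) -> 2.0 bits; interaction_entropy(skewed) -> 0.0
```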
6

Information and Self-Organization in Complex Networks

Culbreth, Garland 12 1900 (has links)
Networks that self-organize in response to information are among the most central subjects in complex systems theory. A new time series analysis tool for studying self-organizing systems is developed and demonstrated. The method is applied to interacting complex swarms to explore the connection between information transport and group size, providing evidence that Dunbar's numbers have a foundation in network dynamics. A complex network model of information spread is then developed. This network infodemic model uses reinforcement learning to simulate connection and opinion adaptation resulting from interactions between units. The model is applied to study polarized populations and echo chamber formation, exploring strategies for network resilience and for network weakening. The model is straightforward to extend to multilayer networks and to networks generated from real world data. By unifying explanation and prediction, the network infodemic model offers a timely step toward understanding global collective behavior.
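A hypothetical, heavily simplified version of such a reinforcement-style model can be sketched as follows (the update rules, parameters, and initialization are illustrative assumptions, not the dissertation's model): agreeing pairs strengthen their connection and converge in opinion, disagreeing pairs weaken it, and echo chambers emerge as within-camp link weights outgrow cross-camp ones.

```python
import itertools
import random

def echo_chamber_sim(n=10, steps=2000, lr=0.05, seed=1):
    # Toy reinforcement-style infodemic model: agreeing pairs reinforce
    # their link and converge in opinion; disagreeing pairs weaken it.
    rng = random.Random(seed)
    opinions = [(1 if k < n // 2 else -1) * rng.uniform(0.1, 1.0)
                for k in range(n)]
    weights = {p: 1.0 for p in itertools.combinations(range(n), 2)}
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)
        key = (min(i, j), max(i, j))
        if opinions[i] * opinions[j] > 0:   # agreement: reinforce the link
            weights[key] += lr
            mid = (opinions[i] + opinions[j]) / 2.0
            opinions[i] += lr * (mid - opinions[i])
            opinions[j] += lr * (mid - opinions[j])
        else:                               # disagreement: weaken the link
            weights[key] = max(0.0, weights[key] - lr)
    return opinions, weights
```

After the run, links inside each opinion camp end up stronger on average than links across camps: a minimal echo chamber.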
7

Intelligent Condition Based Maintenance: A Soft Computing Approach to System Diagnosis and Prognosis

Kothamasu, Ranganath 03 April 2006 (has links)
No description available.
8

Automatic Construction Algorithms for Supervised Neural Networks and Applications

Tsai, Hsien-Leing 28 July 2004 (has links)
Research on neural networks has been conducted for six decades. In this period, many neural models and learning rules have been proposed, and they have been applied widely and successfully, solving many problems that traditional algorithms could not solve efficiently. However, when applying multilayer neural networks, users are confronted with the problem of determining the number of hidden layers and the number of hidden neurons in each hidden layer. Determining a proper neural network architecture is difficult for users, yet it is very important, because the architecture critically influences performance: problems can be solved efficiently only with a proper architecture. To overcome this difficulty, several approaches have recently been proposed to generate the architecture of neural networks automatically, but they still have drawbacks. The goal of our research is to discover better approaches for automatically determining proper neural network architectures, and we propose a series of such approaches in this thesis. First, we propose an approach based on decision trees. It successfully determines neural network architectures and greatly decreases learning time; however, it can deal only with two-class problems and generates larger architectures. Next, we propose an information entropy based approach that overcomes these drawbacks and easily generates multi-class neural networks for standard domain problems. Finally, we extend this method to sequential-domain and structured-domain problems, so our approaches can be applied to many applications. Currently, we are working on quantum neural networks. We are also interested in ART neural networks, which are likewise incremental neural models, and we apply them to digital signal processing.
We present a character recognition application, a spoken word recognition application, and an image compression application, all of which perform well.
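The information entropy criterion behind such construction algorithms can be sketched with a standard information-gain computation (a generic illustration, not the thesis's exact algorithm): a partition of the training samples is scored by how much it reduces class entropy, and a construction procedure can keep adding units while the gain remains high.

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy (bits) of the class distribution of a sample set.
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, groups):
    # Entropy reduction achieved by partitioning the samples into groups;
    # larger gain means the partition separates the classes better.
    n = len(labels)
    after = sum(len(g) / n * entropy(g) for g in groups)
    return entropy(labels) - after
```

A perfect split of ['a', 'a', 'b', 'b'] into two pure groups yields a gain of 1 bit; a split that leaves each group mixed yields 0.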
9

Entropy: algoritmo de substituição de linhas de cache inspirado na entropia da informação. / Entropy: cache line replacement algorithm inspired in information entropy.

Kobayashi, Jorge Mamoru 07 June 2010 (has links)
This work presents a study of the cache line replacement problem in microprocessors. Inspired by the concept of information entropy stated by Claude E. Shannon in 1948, it proposes a novel cache line replacement heuristic. The major goal is to better capture and exploit the referential locality of programs and to reduce the miss rate of cache accesses during program execution. The proposed algorithm, Entropy, uses the information entropy heuristic to estimate the chances of a cache line or block being referenced again after it has been loaded into the cache. A novel entropy decay function was introduced into the algorithm, optimizing its operation. Among the results obtained, Entropy reduced the miss rate by up to 50.41% relative to the LRU algorithm.
The work also proposes a hardware implementation whose complexity and computational cost are comparable to those of LRU, the most widely employed algorithm. For a 2-Mbyte, 8-way set-associative second-level cache, the additional area required is on the order of 0.61% in additional bits. The proposed algorithm was simulated in the SimpleScalar simulator and compared with LRU using the SPEC CPU2000 benchmarks.
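A toy sketch of a decayed-score replacement policy in the spirit described above (the class name, decay rule, and trace are illustrative assumptions, not the thesis's Entropy algorithm): each resident line keeps a score that decays on every access and is reinforced on hits, and on a miss the line with the lowest score, i.e. the lowest estimated chance of re-reference, is evicted.

```python
class DecayedScoreCache:
    # Hypothetical decayed-score policy: every resident line's score decays
    # on each access, hits are reinforced, and the lowest-scoring line is
    # evicted when a miss needs space.
    def __init__(self, capacity, decay=0.9):
        self.capacity, self.decay = capacity, decay
        self.score = {}   # resident line -> current score
        self.misses = 0

    def access(self, line):
        for k in self.score:               # age every resident line
            self.score[k] *= self.decay
        if line in self.score:
            self.score[line] += 1.0        # hit: reinforce
            return
        self.misses += 1
        if len(self.score) >= self.capacity:
            victim = min(self.score, key=self.score.get)
            del self.score[victim]
        self.score[line] = 1.0             # load the new line
```

On the trace a, b, a, c, a, b with capacity 2, the frequently reused line a survives every eviction while b and c displace one another.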
