1

Evaluation of Hierarchical Temporal Memory in algorithmic trading

Åslin, Fredrik January 2010 (has links)
This thesis looks into how one could use Hierarchical Temporal Memory (HTM) networks to generate models that could be used as trading algorithms. The thesis begins with a brief introduction to algorithmic trading and commonly used concepts when developing trading algorithms. It then explains what an HTM is and how it works. To explore whether an HTM could be used to generate such models, the thesis conducts a series of experiments. The goal of the experiments is to iteratively optimize the settings for an HTM and to generate a model that, when used as a trading algorithm, has more profitable trades than losing trades. The experimental setup is to train an HTM to predict whether it is a good time to buy shares in a security and hold them for a fixed time before selling them again. A fair number of the models generated during the experiments were profitable on data they had never seen before; the author therefore concludes that it is possible to train an HTM so that it can be used as a profitable trading algorithm.
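As a minimal sketch (not the author's actual code) of the kind of fixed-holding-period evaluation described above, the loop below counts profitable versus losing trades; the predict_buy callable stands in for the trained HTM's buy signal, and the price series, holding period, and placeholder signal are made up for illustration.

```python
def backtest_fixed_hold(prices, predict_buy, hold_period=5):
    """Count profitable vs. losing trades for a buy-signal model.

    prices      : list of closing prices, oldest first
    predict_buy : callable taking the price history seen so far and
                  returning True when the model says 'buy now'
    hold_period : number of steps each position is held before selling
    """
    profitable, losing = 0, 0
    t = 0
    while t + hold_period < len(prices):
        if predict_buy(prices[: t + 1]):      # model sees only past data
            entry, exit_ = prices[t], prices[t + hold_period]
            if exit_ > entry:
                profitable += 1
            else:
                losing += 1
            t += hold_period                  # position is held until sold
        else:
            t += 1
    return profitable, losing


# Example with a trivial placeholder signal (buy after two rising closes):
prices = [10.0, 10.2, 10.1, 10.4, 10.6, 10.5, 10.9, 11.0, 10.8, 11.2]
signal = lambda hist: len(hist) >= 3 and hist[-1] > hist[-2] > hist[-3]
print(backtest_fixed_hold(prices, signal, hold_period=2))
```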
2

Toward machines with brain inspired intelligence: A study on Hierarchical Temporal Memory Technology

Heravi Khajavi, Roxanne January 2008 (has links)
This Master's thesis was carried out at the Department of Electrical Engineering, Division of Electronic Devices, at Linköping University. It presents a study of HTM technology and a technical evaluation of advanced HTM picture recognition. HTM, which stands for Hierarchical Temporal Memory, is a technology developed by Numenta Inc. based on Jeff Hawkins's theory of brain function. The report also includes some essential facts about the brain, intended as guidance for engineers to reach a better understanding of the connection between the brain and HTM technology. Although the HTM technique is still young, the ambition of its developer is to design truly intelligent machines.
3

Cognitive and neural processes underlying memory for time and context

Persson, Bjorn Martin January 2017 (has links)
The aim of this thesis is to examine the underlying cognitive and neural processes at play during retrieval of temporal and contextual source information. This was assessed across three experimental chapters. In the first experimental chapter, Chapter 2, the neural loci of context associations were assessed. Rats trained on an odour-context association task were given either lesions to the lateral entorhinal cortex (LEC) or sham lesions. After surgery, performance on the odour-context task was assessed. It was hypothesised that memory for previously learned odour-context associations would be impaired following LEC lesions but not sham lesions. The results supported this hypothesis, demonstrating impaired memory for the previously learned odour-context associations in the LEC lesion group compared to the sham lesion group. In Chapter 3, the underlying retrieval processes used to retrieve time and context in human memory were assessed across three experiments. It was hypothesised that time would be remembered accurately using both recollection and familiarity, while correct context memory should rely on recollection alone. Two out of the three experiments supported this hypothesis, demonstrating that temporal information can be retrieved using familiarity in certain instances. The final experimental chapter, Chapter 4, used fMRI to extend Chapter 3 and examine whether neural activity would be greater in regions associated with recollection during memory for context, while activity in familiarity-related regions would be higher during memory for time. Results revealed no support for these predictions: no regions linked to recollection showed greater context-related activity, and no regions previously linked to familiarity exhibited increased activation as temporal information was retrieved. The results are discussed in relation to established recollection and familiarity frameworks and previous work examining the neural substrates supporting memory for time and context.
4

A computational approach to achieve situational awareness from limited observations of a complex system

Sherwin, Jason 06 April 2010 (has links)
At the start of the 21st century, the topic of complexity remains a formidable challenge in engineering, science and other aspects of our world. It seems that when disaster strikes it is because some complex and unforeseen interaction causes the unfortunate outcome. Why did the financial system of the world melt down in 2008-2009? Why are global temperatures on the rise? These questions and other ones like them are difficult to answer because they pertain to contexts that require lengthy descriptions. In other words, these contexts are complex. But we as human beings are able to observe and recognize this thing we call 'complexity'. Furthermore, we recognize that there are certain elements of a context that form a system of complex interactions - i.e., a complex system. Many researchers have even noted similarities between seemingly disparate complex systems. Do sub-atomic systems bear resemblance to weather patterns? Or do human-based economic systems bear resemblance to macroscopic flows? Where do we draw the line in their resemblance? These are the kinds of questions that are asked in complex systems research. And the ability to recognize complexity is not only limited to analytic research. Rather, there are many known examples of humans who not only observe and recognize but also operate complex systems. How do they do it? Is there something superhuman about these people or is there something common to human anatomy that makes it possible to fly a plane? Or to drive a bus? Or to operate a nuclear power plant? Or to play Chopin's etudes on the piano? In each of these examples, a human being operates a complex system of machinery, whether it is a plane, a bus, a nuclear power plant or a piano. What is the common thread running through these abilities? The study of situational awareness (SA) examines how people do these types of remarkable feats. It is not a bottom-up science though because it relies on finding general principles running through a host of varied human activities. Nevertheless, since it is not constrained by computational details, the study of situational awareness provides a unique opportunity to approach complex tasks of operation from an analytical perspective. In other words, with SA, we get to see how humans observe, recognize and react to complex systems on which they exert some control. Reconciling this perspective on complexity with complex systems research, it might be possible to further our understanding of complex phenomena if we can probe the anatomical mechanisms by which we, as humans, do it naturally. At this unique intersection of two disciplines, a hybrid approach is needed. So in this work, we propose just such an approach. In particular, this research proposes a computational approach to the situational awareness (SA) of complex systems. Here we propose to implement certain aspects of situational awareness via a biologically-inspired machine-learning technique called Hierarchical Temporal Memory (HTM). In doing so, we will use either simulated or actual data to create and to test computational implementations of situational awareness. This will be tested in two example contexts, one being more complex than the other. The ultimate goal of this research is to demonstrate a possible approach to analyzing and understanding complex systems. By using HTM and carefully developing techniques to analyze the SA formed from data, it is believed that this goal can be attained.
5

Temporal information processing and memory guided behaviors with recurrent neural networks

Dasgupta, Sakyasingha 28 January 2015 (has links)
No description available.
6

Se souvenir et revenir : approche théorique et méthodologique des stratégies de déplacement récursif et de leurs conséquences populationnelles / Remembering and coming back : a theoretical and methodological approach to recursive movement strategies and their population-level consequences

Riotte-Lambert, Louise 18 October 2016 (has links)
Recursive movement patterns, by which an individual returns to already visited sites, are very common. Memory use, hypothesized to be advantageous when the environment is predictable, could underlie the emergence of these patterns. However, our understanding of the memory-movement interface has been limited by two knowledge gaps: we still lack appropriate methodologies and theoretical knowledge of the advantages of memory use and of the patterns that emerge from it. During this PhD project, I aimed at filling in some of these gaps. I present here three new frameworks for the analysis of recursive movement patterns. The first one delimits the areas most frequently revisited by an individual, the second one detects periodic revisit patterns, and the third one formally defines and quantifies routine movement behaviour in terms of movement sequence repetitiveness, and presents an algorithm that detects the sub-sequences that are repeated. Using an individual-based model, we show that memory use, when the environment is predictable, is very energetically advantageous compared to foraging strategies that do not use memory, including in a situation of competition, and that it leads to the emergence of stable home ranges and spatial segregation between individuals. Memory use invalidates several hypotheses very commonly made in population studies, by leading to a stronger environmental depletion, to a higher equilibrium population size, and to a nonlinear relationship between the total population size and the individually-experienced intensity of competition. Therefore, my PhD thesis contributes to a better understanding of the consequences of memory use for the fitness of individuals, for movement patterns, and for population dynamics. It offers innovative methodologies that quantify and characterize recursive movement patterns that can emerge from its use. These methods should open new opportunities for the comparison of the movements of individuals from different populations and species, and thus the testing of hypotheses about the pressures that select for memory use.
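As a rough illustration of the idea behind the third framework, detecting repeated sub-sequences in a movement sequence, the sketch below counts sub-sequences of visited sites that occur more than once; it is not the thesis's actual algorithm, and the site labels and window length are invented for the example.

```python
from collections import Counter

def repeated_subsequences(visits, length=3):
    """Return sub-sequences of visited sites that occur more than once.

    visits : sequence of site identifiers in visit order
    length : length of the sub-sequences to compare
    """
    windows = [tuple(visits[i:i + length])
               for i in range(len(visits) - length + 1)]
    counts = Counter(windows)
    return {seq: n for seq, n in counts.items() if n > 1}

# A toy trajectory in which the routine A -> B -> C is repeated:
trajectory = ["A", "B", "C", "D", "A", "B", "C", "E", "A", "B", "C"]
print(repeated_subsequences(trajectory, length=3))
# {('A', 'B', 'C'): 3}
```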
7

Hierarchical Temporal Memory Software Agent : In the light of general artificial intelligence criteria

Heyder, Jakob January 2018 (has links)
Artificial general intelligence is not well defined, but attempts such as the recent list of "Ingredients for building machines that think and learn like humans" are a starting point for building a system considered as such [1]. Numenta is attempting to lead the new era of machine intelligence with their research to re-engineer principles of the neocortex. It is to be explored how these ingredients are in line with the design principles of their algorithms. Inspired by DeepMind's commentary about an autonomy ingredient, this project created a combination of Numenta's Hierarchical Temporal Memory theory and Temporal Difference learning to solve simple tasks defined in a browser environment. An open-source software package, based on Numenta's intelligent computing platform NuPIC and OpenAI's framework Universe, was developed to allow further research on HTM-based agents on customized browser tasks. The analysis and evaluation of the results show that the agent is capable of learning simple tasks and that there is potential for generalization inherent to sparse representations. However, they also reveal the infancy of the algorithms, which are not yet capable of learning dynamic, complex problems, and that much future research is needed to explore whether they can create scalable solutions towards a more general intelligent system.
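To illustrate how a temporal-difference rule can sit on top of a sparse state representation of the kind an HTM layer produces, the minimal sketch below shows a tabular TD(0) value update; it is only an illustration of the general technique, not the thesis's agent, which uses NuPIC and OpenAI Universe (their APIs are not shown here), and the state encoding, reward, and learning parameters are placeholders.

```python
from collections import defaultdict

def td0_update(values, state, reward, next_state, alpha=0.1, gamma=0.9):
    """One tabular TD(0) update: V(s) <- V(s) + alpha*(r + gamma*V(s') - V(s)).

    values : dict mapping hashable states to value estimates
    state  : current state key (e.g. a frozenset of active HTM columns)
    """
    td_error = reward + gamma * values[next_state] - values[state]
    values[state] += alpha * td_error
    return values

# Toy example: frozensets stand in for sparse activations of an HTM layer.
V = defaultdict(float)
s, s_next = frozenset({3, 17, 42}), frozenset({5, 17, 51})
td0_update(V, s, reward=1.0, next_state=s_next)
print(V[s])   # 0.1 after one update with zero-initialised values
```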
8

Hierarchical Temporal Memory Cortical Learning Algorithm for Pattern Recognition on Multi-core Architectures

Price, Ryan William 01 January 2011 (has links)
Strongly inspired by an understanding of mammalian cortical structure and function, the Hierarchical Temporal Memory Cortical Learning Algorithm (HTM CLA) is a promising new approach to problems of recognition and inference in space and time. Only a subset of the theoretical framework of this algorithm has been studied, but it is already clear that there is a need for more information about the performance of HTM CLA with real data and the associated computational costs. For the work presented here, a complete implementation of Numenta's current algorithm was done in C++. In validating the implementation, first and higher order sequence learning was briefly examined, as was algorithm behavior with noisy data doing simple pattern recognition. A pattern recognition task was created using sequences of handwritten digits and performance analysis of the sequential implementation was performed. The analysis indicates that the resulting rapid increase in computing load may impact algorithm scalability, which may, in turn, be an obstacle to widespread adoption of the algorithm. Two critical hotspots in the sequential code were identified and a parallelized version was developed using OpenMP multi-threading. Scalability analysis of the parallel implementation was performed on a state of the art multi-core computing platform. Modest speedup was readily achieved with straightforward parallelization. Parallelization on multi-core systems is an attractive choice for moderate sized applications, but significantly larger ones are likely to remain infeasible without more specialized hardware acceleration accompanied by optimizations to the algorithm.
