41

[en] FAULT TOLERANCE IN DISTRIBUTED SYSTEMS / [pt] RECUPERAÇÃO DE ERROS EM SISTEMAS DE PROCESSOS DISTRIBUÍDOS

ALEXANDRE DE REZENDE ABIBE 27 December 2006 (has links)
[pt] Esta dissertação aborda o problema da recuperação de erros em sistemas distribuídos. Inicialmente, é feita uma breve análise sobre a origem deste problema e as soluções encontradas. Alguns métodos de resolução são então apresentados. Para a simulação do sistema distribuído foi desenvolvido um núcleo multi-tarefa numa máquina compatível com o PC-IBM-XT, utilizando o MS-DOS (versão 3.0 ou acima) como servidor. Finalmente, são apresentadas duas propostas. A primeira visa fornecer a um processo recursos que possibilitem a recuperação por retorno. A segunda utiliza redundância em um conjunto de processos em diferentes estações para garantir que o sistema como um todo continue operativo, mesmo com uma estação em falha. / [en] This dissertation deals with the problem of fault tolerance in distributed systems. Initially, a brief analysis of the origins of this problem and its solutions is made. Some of the resolution methods are then presented. In order to simulate a distributed system, a multitasking operating system kernel was developed on an IBM PC-XT compatible machine, using MS-DOS (version 3.0 or above) as a server. Finally, two proposals are presented. The first is intended to supply a process with resources that allow recovery from algorithmic faults, making use of the backward error recovery method. The second uses redundancy in a set of processes over different stations to guarantee that the system as a whole remains operative, even with a faulty station.
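The backward error recovery method mentioned in the abstract can be illustrated with a minimal sketch (not the dissertation's actual implementation; the class and field names here are illustrative): a process periodically saves a checkpoint of its state and, when an acceptance test detects a fault, rolls back to the most recent consistent checkpoint.

```python
import copy

class RecoverableProcess:
    """Toy process supporting backward error recovery via checkpoints."""

    def __init__(self, state):
        self.state = state
        self._checkpoints = []

    def checkpoint(self):
        # Save a deep copy so later in-place mutations cannot corrupt it.
        self._checkpoints.append(copy.deepcopy(self.state))

    def rollback(self):
        # Restore the most recently saved consistent state.
        if not self._checkpoints:
            raise RuntimeError("no checkpoint to roll back to")
        self.state = self._checkpoints.pop()

p = RecoverableProcess({"counter": 0})
p.checkpoint()
p.state["counter"] = 99   # a faulty update, caught by some acceptance test
p.rollback()
print(p.state["counter"])  # 0
```

In a distributed setting, checkpoints must additionally be coordinated across processes to avoid cascading rollbacks, which is part of what makes the problem non-trivial.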
42

Why Machine Learning Works

Montanez, George D. 01 December 2017 (has links)
To better understand why machine learning works, we cast learning problems as searches and characterize what makes searches successful. We prove that any search algorithm can only perform well on a narrow subset of problems, and show the effects of dependence on raising the probability of success for searches. We examine two popular ways of understanding what makes machine learning work, empirical risk minimization and compression, and show how they fit within our search framework. Leveraging the “dependence-first” view of learning, we apply this knowledge to areas of unsupervised time-series segmentation and automated hyperparameter optimization, developing new algorithms with strong empirical performance on real-world problem classes.
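The claim that any search algorithm can only perform well on a narrow subset of problems can be illustrated with a toy calculation (an illustrative sketch in the spirit of the abstract, not the dissertation's formal result): averaged over all possible targets of a fixed size, a single-query search succeeds with probability exactly |target| / |space|, regardless of which element it queries.

```python
import itertools

# Search space of size n; targets are all subsets of size k. Averaged over
# every possible target, a fixed single-query search succeeds with
# probability k/n -- no query strategy does better on average, a toy
# illustration of why good performance must be confined to few problems.
n, k = 6, 2
targets = list(itertools.combinations(range(n), k))

def algorithm_guess():
    return 3  # any fixed single-query strategy; the choice is arbitrary

hits = sum(algorithm_guess() in t for t in targets)
print(hits / len(targets))  # 5/15 = 1/3 = k/n
```

Favorable performance therefore has to come from dependence between the problem and the algorithm, not from the algorithm alone.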
43

Evaluation of Hierarchical Temporal Memory in algorithmic trading

Åslin, Fredrik January 2010 (has links)
This thesis looks into how one could use Hierarchical Temporal Memory (HTM) networks to generate models that could be used as trading algorithms. The thesis begins with a brief introduction to algorithmic trading and to concepts commonly used when developing trading algorithms. It then proceeds to explain what an HTM is and how it works. To explore whether an HTM could be used to generate models usable as trading algorithms, the thesis conducts a series of experiments. The goal of the experiments is to iteratively optimize the settings for an HTM and try to generate a model that, when used as a trading algorithm, would have more profitable trades than losing trades. The setup of the experiments is to train an HTM to predict whether it is a good time to buy shares in a security and hold them for a fixed time before selling them again. A fair share of the models generated during the experiments were profitable on data they had never seen before; the author therefore concludes that it is possible to train an HTM so that it can be used as a profitable trading algorithm.
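The experimental setup described above — buy on a predicted signal, hold for a fixed number of steps, then sell, and count profitable versus losing trades — can be sketched as a simple backtest loop (illustrative only: the boolean signals stand in for HTM predictions, and the prices are made up):

```python
def backtest_fixed_hold(prices, signals, hold=5):
    """Count winning vs losing trades for a strategy that buys on a
    boolean signal and sells after exactly `hold` time steps."""
    wins = losses = 0
    for t, buy in enumerate(signals):
        if buy and t + hold < len(prices):
            pnl = prices[t + hold] - prices[t]
            if pnl > 0:
                wins += 1
            elif pnl < 0:
                losses += 1
    return wins, losses

prices = [10, 11, 12, 11, 13, 14, 13, 15, 16, 15, 17]
signals = [True, False, True, False, False,
           False, False, False, False, False, False]
print(backtest_fixed_hold(prices, signals, hold=5))  # (2, 0)
```

A model whose signals yield more wins than losses on held-out data is "profitable" in the sense used by the thesis.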
44

The applications of fractal geometry and self-similarity to art music

Steynberg, Ilse January 2014 (has links)
The aim of this research study is to investigate different practical ways in which fractal geometry and self-similarity can be applied to art music, with reference to music composition and analysis. This specific topic was chosen because there are many misconceptions in the field of fractal and self-similar music. Analyses of previous research as well as the music analysis of several compositions from different composers in different genres were the main methods for conducting the research. Although the dissertation restates much of the existing research on the topic, it is (to the researcher's knowledge) one of the first academic works that summarises the many different facets of fractal geometry and music. Fractal and self-similar shapes are evident in nature and art dating back to the 16th century, despite the fact that the mathematics behind fractals was only defined in 1975 by the French mathematician, Benoit B. Mandelbrot. Mathematics has been a source of inspiration to composers and musicologists for many centuries, and fractal geometry has also infiltrated the works of composers in the past 30 years. The search for fractal and self-similar structures in music composed prior to 1975 may lead to a different perspective on the way in which music is analysed. Basic concepts and prerequisites of fractals were deliberately simplified in this research in order to collect useful information that musicians can use in composition and analysis. These include subjects such as self-similarity, fractal dimensionality and scaling. Fractal shapes with their defining properties were also illustrated because their structures have been likened to those in some music compositions. This research may enable musicians to incorporate mathematical properties of fractal geometry and self-similarity into original compositions. It may also provide new ways to view the use of motifs and themes in the structural analysis of music. / Dissertation (MMus)--University of Pretoria, 2014.
/ lk2014 / Music / MMus / Unrestricted
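The notion of fractal dimensionality that the dissertation simplifies for musicians can be illustrated with the similarity dimension of exactly self-similar shapes, D = log N / log s, for a shape made of N copies of itself each scaled down by a factor s (a standard textbook formula, shown here as an illustrative sketch):

```python
import math

def similarity_dimension(copies, scale):
    """Similarity dimension D = log N / log s of an exactly self-similar
    shape built from N copies, each scaled down by a factor s."""
    return math.log(copies) / math.log(scale)

# Koch curve: 4 copies at 1/3 scale; Cantor set: 2 copies at 1/3 scale.
print(round(similarity_dimension(4, 3), 3))  # 1.262
print(round(similarity_dimension(2, 3), 3))  # 0.631
```

Non-integer dimensions like these are what distinguish fractal structures, in music as in geometry, from ordinary lines and surfaces.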
45

Modelling approaches for optimal liquidation under a limit-order book structure

Blair, James January 2016 (has links)
This thesis introduces a selection of models for optimal execution of financial assets at the tactical level. As opposed to optimal scheduling, which defines a trading schedule for the trader, this thesis investigates how the trader should interact with the order book. If a trader is aggressive he will execute his order using market orders, which will negatively feedback on his execution price through market impact. Alternatively, the models we focus on consider a passive trader who places limit orders into the limit-order book and waits for these orders to be filled by market orders from other traders. We assume these models do not exhibit market impact. However, given we await market orders from other participants to fill our limit orders a new risk is borne: execution risk. We begin with an extension of Guéant et al. (2012b) who through the use of an exponential utility, standard Brownian motion, and an absolute decay parameter were able to cleverly build symmetry into their model which significantly reduced the complexity. Our model consists of geometric Brownian motion (and mean-reverting processes) for the asset price, a proportional control parameter (the additional amount we ask for the asset), and a proportional decay parameter, implying that the symmetry found in Guéant et al. (2012b) no longer exists. This novel combination results in asset-dependent trading strategies, which to our knowledge is a unique concept in this framework of literature. Detailed asymptotic analyses, coupled with advanced numerical techniques (informing the asymptotics) are exploited to extract the relevant dynamics, before looking at further extensions using similar methods. We examine our above mentioned framework, as well as that of Guéant et al. (2012), for a trader who has a basket of correlated assets to liquidate. This leads to a higher-dimensional model which increases the complexity of both numerically solving the problem and asymptotically examining it. 
The solutions we present are of interest, and comparable with Markowitz portfolio theory. We return to our framework of a single underlying and consider four extensions: a stochastic volatility model which results in an added dimension to the problem, a constrained optimisation problem in which the control has an explicit lower bound, changing the exponential intensity to a power intensity which results in a reformulation as a singular stochastic control problem, and allowing the trader to trade using both market orders and limit orders, resulting in a free-boundary problem. We complete the study with an empirical analysis using limit-order book data which contains multiple levels of the book. This involves a novel calibration of the intensity functions which represent the limit-order book, before backtesting and analysing the performance of the strategies.
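The execution-risk trade-off described above can be sketched for a myopic risk-neutral trader under the exponential fill intensity common to this strand of literature, λ(δ) = A e^{-kδ}: quoting further from the reference price earns a larger premium δ per fill but fills less often, and the expected profit rate δ·λ(δ) peaks at δ* = 1/k. This is an illustrative simplification with made-up parameter values, not the thesis's model with its proportional controls and utility framework:

```python
import math

def fill_intensity(delta, A=140.0, k=1.5):
    """Exponential fill intensity lambda(delta) = A * exp(-k * delta):
    the further a limit order sits from the reference price, the less
    often it is hit. A and k here are illustrative."""
    return A * math.exp(-k * delta)

def expected_profit(delta, A=140.0, k=1.5):
    # Myopic risk-neutral criterion: premium per fill times fill rate.
    return delta * fill_intensity(delta, A, k)

# Grid search recovers the analytical optimum delta* = 1/k.
best = max((d / 1000 for d in range(1, 3000)), key=expected_profit)
print(round(best, 3))  # about 1/k = 0.667
```

Risk aversion, inventory, and decay assumptions all shift this optimum, which is where the richer models in the thesis depart from the sketch.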
46

De l'esthétique numérique : changements nés de la révolution numérique, vus à travers les études des médias, l'art sonore et la philosophie / On digital aesthetics : changes arising from the digital revolution, a media, sound art and philosophy perspective

Gampe, Johanna 18 December 2014 (has links)
Cette thèse explore les nombreux changements nés de la Révolution Numérique à travers l'esthétique. Elle aborde les questions essentielles en trois étapes: le discours, la création et la réception. La première partie rassemble les courants et les mots-clefs qui reflètent le discours principal dans son actualité et son historique. Une étude comparative analyse les approches esthétiques typiques. Que l'on étudie les nouveaux médias, l'art visuel, l'art numérique, le « computer art », l'art de l'information ou l'art du code: on cherche tous à définir les nouvelles qualités spécifiques du médium numérique. La deuxième partie se concentre sur les arts sonores et les drames audio en particulier, mais aborde également la musique assistée par ordinateur. L'attention portée au genre du Hörspiel permet de démontrer comment espace et temps évoluent dans le détail. La mise en place d'une hypothèse de travail portant sur les Sonarios - installations sonores spatiales à interactivité scénique - permet de comparer directement 1'« ancien» genre du Hörspiel au « nouveau» genre des Sonarios, en se fondant sur trois études de cas et un état des lieux de l'art. En outre, elle propose un modèle d'interactivité quantitatif et qualitatif. La troisième partie présente les courants prédominants en philosophie. Elle construit un modèle herméneutique à l'aide de l'ontologie, en commençant par la virtualisation et l'actualisation. Abordant philosophiquement des catégories esthétiques, elle affine ce modèle en y incorporant les découvertes de la deuxième partie. Un modèle graphique de signes algorithmiques, développé en sémiotique et en phénoménologie, est alors intégré au domaine de l'esthétique. / This dissertation explores the numerous changes arising from the Digital Revolution through aesthetics. It takes on the essential issues in three steps, tackling discourse, creation and reception. 
The first part assembles trends and keywords that reflect the main discourse, covering both the present and the past. A comparative study analyzes typical aesthetic approaches: whether considering new media art, virtual art, digital art, computer art, information art or code art, it becomes apparent that almost all approaches seek the specific new qualities of the digital medium. The second part concentrates on sound arts and on audio dramas in particular, but also tackles computer music. The focus on the genre Hörspiel allows demonstration of how space and time change in detail. The setting up of a working hypothesis of Sonarios, defining interactive scenic spatial sound installations, allows the direct comparison of the 'old' genre Hörspiel and the 'new' genre Sonario on the basis of three case studies and a consideration of the state of the art. Moreover, it suggests a qualitative and quantitative model of interactivity. The third part introduces the prominent research sectors of philosophy. Step by step, it constructs a hermeneutic model with the aid of ontology, starting with the processes of virtualization and actualization. Selected philosophical approaches to important aesthetic categories are presented in order to refine and discuss the model, together with an analysis of findings from the second part. A graphical model of algorithmic signs, developed in the context of semiotics and phenomenology, allows the analysis to be embedded in the context of aesthetics.
47

Towards cache optimization in finite automata implementations

Ketcha Ngassam, Ernest 21 July 2007 (has links)
To the best of our knowledge, the only available implementations of FA-based string recognizers are the so-called conventional table-driven algorithm and, of course, its hardcoded counterpart, suggested by Thompson, Penello, and DeRemer in 1967, 1986, and 2004 respectively. However, our early experiments have shown that the performance of both implementations is hampered by the random-access nature of the automaton's transition table in the table-driven case, and by the random-access nature of the directly executable instructions that make up each hardcoded state. Moreover, the problems of memory load and instruction load are also performance bottlenecks of these algorithms since, as the automaton size grows, more space in memory is required to hold the data/instructions relevant to the states. This thesis exploits the notion of cache optimization (which requires good data or instruction organization) in investigating various enhancements of both table-driven and hardcoded implementations. Functions have been used to formally define the denotational semantics of string recognizers. These functions rely on various so-called strategy variables that are integrated into the formal definition of each recognizer. By appropriately selecting these variables, the conventional algorithms may be described without loss of generality. By specializing these strategy variables, the new and enhanced recognizers can be denotationally described, and the resulting algorithms can then be implemented. We first introduce the so-called Dynamic State Allocation (DSA) strategy, regarded as a sort of Just-In-Time (JIT) implementation of FA-based string recognizers whereby a predefined portion of memory is reserved for acceptance testing. Then follows the State pre-Ordering (SpO) strategy, which assumes some prior knowledge of the order in which states would be visited. In this case, acceptance testing takes place once each state has been allocated to its new position in memory.
The last strategy, referred to as the Allocated Virtual Caching (AVC) strategy, is based on the premise that a portion of the memory originally occupied by the automaton's states is virtually used as a sort of cache memory in which acceptance testing takes place, thereby enabling the exploitation of the various performance-enhancement notions on which hardware cache memory relies. It is shown that the algorithms can be classified in a taxonomy tree, which is further mapped into a class diagram that represents the design of a toolkit for FA-based string recognition. Also given in the thesis are empirical results indicating that the suggested algorithms can, in general, outperform their conventional counterparts when recognizing large and appropriately chosen input strings. / Thesis (PhD (Computer Science))--University of Pretoria, 2007. / Computer Science / PhD / unrestricted
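The conventional table-driven recognizer that serves as the thesis's baseline can be sketched as follows (an illustrative minimal version: one transition-table lookup per input symbol, which is exactly the random-access pattern whose cache behaviour the proposed strategies aim to improve):

```python
def table_driven_accept(transitions, accepting, string, start=0):
    """Conventional table-driven FA string recognizer: each input symbol
    costs one lookup in the (state, symbol) -> state transition table."""
    state = start
    for symbol in string:
        state = transitions.get((state, symbol))
        if state is None:          # no transition defined: reject
            return False
    return state in accepting

# Example DFA over {a, b} accepting strings that end in "ab".
transitions = {
    (0, "a"): 1, (0, "b"): 0,
    (1, "a"): 1, (1, "b"): 2,
    (2, "a"): 1, (2, "b"): 0,
}
print(table_driven_accept(transitions, {2}, "aabab"))  # True
print(table_driven_accept(transitions, {2}, "abba"))   # False
```

For large automata the table exceeds the cache, and successive lookups land on unrelated lines; reordering or dynamically allocating state data, as the DSA, SpO, and AVC strategies do, is meant to restore locality.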
48

Models and algorithms for metabolic networks : elementary modes and precursor sets

Acuna, Vicente 04 June 2010 (has links)
In this PhD thesis, we present algorithms and complexity results for two general problems that arise in the analysis of a metabolic network: the search for elementary modes of a network and the search for minimal precursor sets. Elementary modes are a common tool in the study of the cellular characteristics of a metabolic network. An elementary mode can be seen as a minimal set of reactions that can work in steady state independently of the rest of the network. It has therefore served as a mathematical model for the possible metabolic pathways of a cell. Their computation is not trivial and poses computational challenges. We show that some problems, like checking consistency of a network, finding one elementary mode or checking that a set of reactions constitutes a cut, are easy, giving polynomial algorithms based on LP formulations. We also prove the hardness of central problems like finding a minimum-size elementary mode, finding an elementary mode containing two given reactions, counting the number of elementary modes or finding a minimum reaction cut. On the enumeration problem, we show that enumerating all elementary modes containing one given reaction cannot be done in polynomial total time unless P=NP. This result provides some idea of the complexity of enumerating all the elementary modes. The search for precursor sets is motivated by discovering which external metabolites are sufficient to allow the production of a given set of target metabolites. In contrast with previous proposals, we present a new approach which is the first to formally consider the use of cycles in the way the target is produced. We present a polynomial algorithm to decide whether a set is a precursor set of a given target. We also show that, given a target set, finding a minimal precursor set is easy but finding a precursor set of minimum size is NP-hard. We further show that finding a solution with minimum-size internal supply is NP-hard.
We give a simple characterisation of precursor sets by the existence of hyperpaths between the solutions and the target. If we consider the enumeration of all the minimal precursor sets of a given target, we find that this problem cannot be solved in polynomial total time unless P=NP. Despite this result, we present two algorithms that perform well on medium-size networks.
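The decision problem — is a given set of external metabolites a precursor set for a target? — can be sketched with a naive forward-closure test (illustrative only; it does not capture the thesis's formal treatment of internal cycles, and the toy network is made up):

```python
def is_precursor_set(reactions, sources, targets):
    """Naive forward closure: starting from the source metabolites,
    repeatedly fire every reaction whose substrates are all available,
    and check whether every target metabolite becomes producible."""
    available = set(sources)
    changed = True
    while changed:
        changed = False
        for substrates, products in reactions:
            if set(substrates) <= available and not set(products) <= available:
                available |= set(products)
                changed = True
    return set(targets) <= available

# Toy network: A + B -> C, then C -> D.
reactions = [(("A", "B"), ("C",)), (("C",), ("D",))]
print(is_precursor_set(reactions, {"A", "B"}, {"D"}))  # True
print(is_precursor_set(reactions, {"A"}, {"D"}))       # False
```

Each metabolite is added at most once, so the closure runs in polynomial time; the hardness results concern minimum-size precursor sets and their enumeration, not this membership test.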
49

Two-Dimensional Computer-Generated Ornamentation Using a User-Driven Global Planning Strategy

Anderson, Dustin Robert 01 June 2008 (has links)
Hand drawn ornamentation, such as floral or geometric patterns, is a tedious and time consuming task that requires much skill and training in ornamental design principles and aesthetics. Ornamental drawings both historically and presently play critical roles in all things from art to architecture; however, little work has been done in exploring their algorithmic and interactive generation. The field of computer graphics offers many algorithmic possibilities for assisting an artist in creating two-dimensional ornamental art. When computers handle the repetition and overall structure of ornament, considerable savings in time and money can result. Today, the few existing computer algorithms used to generate 2D ornament have over-generalized and over-simplified the process of ornamentation, resulting in the substitution of limited amounts of generic and static "clip art" for once personalized artistic innovations. Two possible approaches to computational ornamentation exist: interactive tools give artists instant feedback on their work while non-interactive programs can carry out complex and sometimes lengthy computations to produce mathematically precise ornamental compositions. Due to the importance of keeping an artist in the loop for the production of ornamentation, we present an application designed and implemented utilizing a user-driven global planning strategy, to help guide the generation of two-dimensional ornament. The system allows for the creation of beautiful organic ornamental 2D art which follows a user-defined curve. We present the application, the algorithmic approaches used, and the potential uses of this application.
50

A Framework for Automated Discovery and Analysis of Suspicious Trade Records

Datta, Debanjan 27 May 2022 (has links)
Illegal logging and timber trade presents a persistent threat to global biodiversity and national security due to its ties with illicit financial flows, and causes revenue loss. The scale of global commerce in timber and associated products, combined with the complexity and geographical spread of the supply-chain entities, presents a non-trivial challenge in detecting such transactions. International shipment records, specifically those containing bills of lading, are a key source of data which can be used to detect, investigate and act upon such transactions. The comprehensive problem can be described as building a framework that can perform automated discovery and facilitate actionability on detected transactions. A data-driven, machine learning based approach is necessitated by the volume, velocity and complexity of international shipping data. Such an automated framework can immensely benefit our targeted end-users---specifically the enforcement agencies. This overall problem comprises multiple connected sub-problems with associated research questions. We incorporate crucial domain knowledge---in terms of data as well as modeling---by employing the expertise of collaborating domain specialists from ecological conservationist agencies. The collaborators provide formal and informal inputs spanning the stages from requirement specification to design. Following the paradigm of similar problems such as fraud detection explored in prior literature, we formulate the core problem of discovering suspicious transactions as an anomaly detection task. The first sub-problem is to build a system that can be used to find suspicious transactions in shipment data pertaining to the imports and exports of multiple countries with different country-specific schemas. We present a novel anomaly detection approach---for multivariate categorical data, following constraints of data characteristics, combined with a data pipeline that incorporates domain knowledge.
The focus of the second problem is U.S.-specific imports, where data characteristics differ from the prior sub-problem---with heterogeneous attributes present. This problem is important since the U.S. is a top consumer and there is scope for actionable enforcement. For this we present a contrastive learning based anomaly detection model for heterogeneous tabular data, with performance and scalability characteristics applicable to real-world trade data. While the first two problems address the task of detecting suspicious trades through anomaly detection, a practical challenge with anomaly detection based systems is that of relevancy, or scenario-specific precision. The third sub-problem addresses this through a human-in-the-loop approach augmented by visual analytics, to re-rank anomalies in terms of relevance---providing explanations for the cause of anomalies and soliciting feedback. The last sub-problem pertains to explainability and actionability towards suspicious records, through algorithmic recourse. Algorithmic recourse aims to provide meaningful alternatives to flagged anomalous records, such that those counterfactual examples are not judged anomalous by the underlying anomaly detection system. This can help enforcement agencies advise verified trading entities in modifying their trading patterns to avoid false detection, thus streamlining the process. We present a novel formulation and metrics for this unexplored problem of algorithmic recourse in anomaly detection, and a deep learning based approach towards explaining anomalies and generating counterfactuals. Thus the overall research contributions presented in this dissertation address the requirements of the framework, and have general applicability in similar scenarios beyond the scope of this framework. / Doctor of Philosophy / Illegal timber trade presents multiple global challenges to ecological biodiversity, vulnerable ecosystems, national security and revenue collection.
Enforcement agencies---the target end-users of this framework---face a myriad of challenges in discovering and acting upon shipments with illegal timber that violate national and transnational laws, owing to the volume and complexity of shipment data coupled with logistical hurdles. This necessitates an automated framework based upon shipment data that can address this task---through solving problems of discovery, analysis and actionability. The overall problem is decomposed into self-contained sub-problems that address the associated specific research questions. These comprise anomaly detection in multiple types of high-dimensional tabular data, improving the precision of anomaly detection through expert feedback, and algorithmic recourse for anomaly detection. We present data mining and machine learning solutions to each of the sub-problems that overcome the limitations and inapplicability of prior approaches. Further, we address two broader research questions. The first is the incorporation of domain knowledge into the framework, which we accomplish through collaboration with domain experts from environmental conservation organizations. The second is the issue of explainability in anomaly detection for tabular data in multiple contexts. Such real-world data presents challenges of complexity and scalability, especially given the tabular format of the data, which presents its own set of challenges in terms of machine learning. The solutions presented to the machine learning problems associated with each component of the framework provide an end-to-end solution to its requirements. More importantly, the models and approaches presented in this dissertation have applicability beyond this application scenario, to settings with similar data and application-specific challenges.
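The first sub-problem, anomaly detection over multivariate categorical shipment records, can be illustrated with a simple frequency-based baseline (an independence model with add-one smoothing; this is a generic sketch with made-up data, not the dissertation's model):

```python
from collections import Counter
import math

def fit_marginals(records):
    """Per-attribute value frequencies estimated from 'normal' records."""
    counts = [Counter(col) for col in zip(*records)]
    return counts, len(records)

def anomaly_score(record, counts, n):
    # Negative log-likelihood under attribute independence with add-one
    # smoothing: rarer attribute values give a higher (more anomalous) score.
    return -sum(math.log((c[v] + 1) / (n + len(c) + 1))
                for v, c in zip(record, counts))

records = [("US", "oak", "sea"), ("US", "oak", "sea"),
           ("US", "pine", "air"), ("BR", "oak", "sea")]
counts, n = fit_marginals(records)
common = anomaly_score(("US", "oak", "sea"), counts, n)
rare = anomaly_score(("BR", "teak", "air"), counts, n)
print(rare > common)  # True
```

Real trade data breaks the independence assumption (attribute combinations, not single values, are what make a shipment suspicious), which is the limitation the dissertation's models are designed to overcome.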
