1

Análise de complexidade aplicada à antecipação de crises no mercado de bens minerais. / Complexity analysis applied to the anticipation of crises in the mineral commodities market.

Dompieri, Mauricio 20 October 2014 (has links)
This study investigated opportunities for applying complexity analysis as a method for analyzing the mineral economics of a commodity, using nickel as a case study. To that end, the particularities of commodity markets were studied, in greater depth for nickel, together with the factors that influence them and some models developed for simulating, understanding, and predicting the behavior of the market system. The conditions under which the mineral commodities market can be considered a complex system were verified. For nickel, the current state of extraction technology, including the latest developments, was also analyzed. The study then describes the method used in complexity analysis, which defines the complexity of a system as a measurable quantity based on its topology, represented by the structure of the correlations between its variables, and on the total entropy of the system. The total entropy of the system is the integration of the Shannon entropies of the variables that participate in its structure and is a measure of the system's uncertainty, i.e., its departure from deterministic operation. In this method, correlations between variables are not computed statistically but by calculating the mutual entropy between each pair of variables. The advantage of this approach is that it reveals correlations between pairs of variables that exhibit nonlinear relationships or even bifurcations, clustering, and other pathologies that are difficult to treat statistically. For this reason the term correlation, which suggests statistical treatment, is avoided in favor of coupling to denote the dependence between two variables. Two types of complexity analysis were then performed: static and dynamic.
Static analysis reveals, by means of a cognitive map, the system's structure and the strength of the couplings between its components, as well as the complexity indices, consisting of the critical, operational, and minimum complexities, the entropy, and the robustness. Robustness is the most notable index, as it measures the resilience of the system through the difference between the critical and operational complexities and is an indicator of its sustainability. Dynamic analysis reveals, for time-varying systems, the evolution of the complexity indicators over time. The interest in this type of analysis is that the method's developer experimentally identified that the collapse of a system is almost always preceded by a sharp increase in its complexity. This feature is exploited in the analysis of the nickel market in an attempt to anticipate crises. In the experimental part, static analysis revealed the coupling structure of a basket of metals and of the nickel market specifically. The evolution of the complexity indicators over time was then investigated; market crises could be identified by increases in complexity and entropy, and in the particular case of the 2008-2009 crisis a significant increase in complexity and entropy was observed even before the onset of the crisis, providing an early warning of the event.
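The coupling measure the abstract describes is mutual information computed from Shannon entropies of binned variables. As an illustrative sketch only (not the thesis's actual tooling), the following Python computes the Shannon entropy of a series and the mutual entropy between two series; the equal-width discretization and default bin count are assumptions for illustration.

```python
import math
from collections import Counter

def _bin(xs, bins):
    """Equal-width binning; returns integer bin labels for each value."""
    lo, hi = min(xs), max(xs)
    width = (hi - lo) / bins or 1.0   # constant series -> single bin
    return [min(int((x - lo) / width), bins - 1) for x in xs]

def _entropy(labels):
    """Shannon entropy (in bits) of a list of discrete labels."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def shannon_entropy(xs, bins=8):
    """Shannon entropy of a numeric series after equal-width binning."""
    return _entropy(_bin(xs, bins))

def mutual_entropy(xs, ys, bins=8):
    """I(X;Y) = H(X) + H(Y) - H(X,Y): the 'coupling' between two series.
    Nonzero even for nonlinear dependence that a linear correlation misses."""
    bx, by = _bin(xs, bins), _bin(ys, bins)
    return _entropy(bx) + _entropy(by) - _entropy(list(zip(bx, by)))
```

A series is maximally coupled with itself (mutual entropy equals its own entropy) and has zero coupling with a constant, which matches the intuition behind using this quantity instead of a statistical correlation.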
3

Distributed calculations using mobile agents / Calculs Distribués par des Agents Mobiles

Abbas, Shehla 15 December 2008 (has links)
This thesis deals with the use of mobile agents in distributed algorithms by performing random walks in the network. Initially, k mobile agents with unique identities are placed in the network. We describe a distributed algorithm for computing spanning trees in dynamic networks using mobile agents. The agents mark the nodes they arrive on, using two different techniques: cloning, in which an agent creates its own clone to perform an assigned task, and marking on the whiteboard (a memory location on the nodes).
These techniques are used in applications such as spanning tree construction, gathering, and information collection. The mobile agents have limited knowledge; they are not intelligent and have no computational capabilities. When two or more agents meet at a node of the underlying graph, they merge into a single agent. The parameter of interest is the expected time for all k agents to merge into a single agent. We present a Markov chain modeling the agents' behavior and show how it can be used to upper-bound this expected time. We study the same problem when the mobile agents start their walks directly under the stationary regime. The handshake problem is also studied and analyzed using mobile agents.
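The merging process can be illustrated with a small simulation of coalescing random walks. This is a sketch under assumptions (synchronous steps, an adjacency-dict graph, the hypothetical name `merge_time`), not the thesis's exact protocol or analysis.

```python
import random

def merge_time(adj, starts, seed=0, max_steps=100_000):
    """Simulate agents doing independent random walks on the graph `adj`
    (dict: node -> list of neighbours). Agents occupying the same node
    merge into one. Returns the number of synchronous steps until a
    single agent remains."""
    rng = random.Random(seed)
    agents = list(starts)
    for step in range(max_steps):
        agents = list(set(agents))          # co-located agents merge
        if len(agents) == 1:
            return step
        agents = [rng.choice(adj[v]) for v in agents]
    raise RuntimeError("agents did not merge within max_steps")
```

On a complete graph the walks are non-periodic, so all agents merge with probability 1; repeating the simulation over many seeds would estimate the expected merge time that the Markov-chain analysis bounds from above.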
4

Rūšiavimo algoritmų vizualizavimas ir sudėtingumo analizė / Visualization and Complexity Analysis of Sorting Algorithms

Saročka, Gediminas 02 July 2012 (has links)
Complexity analyses of sorting algorithms are readily available, so the main goal of this work was to create a visualization of sorting algorithms. The work implements visualizations of three simple sorting algorithms (insertion, bubble, and selection sort) and two faster algorithms (Shell sort and merge sort). The program can also measure each algorithm's sorting time for further complexity analysis.
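The timing comparison the abstract mentions can be sketched in a few lines of Python; the thesis's actual implementation and visualization code are not shown, so the function names here are illustrative.

```python
import random
import time

def bubble_sort(a):
    """O(n^2) bubble sort on a copy of the input."""
    a = list(a)
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def merge_sort(a):
    """O(n log n) merge sort, returning a new sorted list."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def timed(sort, data):
    """Run `sort` on `data` and return (result, elapsed_seconds)."""
    t0 = time.perf_counter()
    result = sort(data)
    return result, time.perf_counter() - t0
```

Running `timed` on growing inputs makes the quadratic-versus-linearithmic gap between the two families of algorithms directly visible.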
5

Analyzing the Computational Complexity of Abstract Dialectical Frameworks via Approximation Fixpoint Theory

Straß, Hannes, Wallner, Johannes Peter 22 January 2014 (has links) (PDF)
Abstract dialectical frameworks (ADFs) have recently been proposed as a versatile generalization of Dung's abstract argumentation frameworks (AFs). In this paper, we present a comprehensive analysis of the computational complexity of ADFs. Our results show that while ADFs are one level up in the polynomial hierarchy compared to AFs, there is a useful subclass of ADFs which is as complex as AFs while arguably offering more modeling capacities. As a technical vehicle, we employ the approximation fixpoint theory of Denecker, Marek and Truszczyński, thus showing that it is also a useful tool for complexity analysis of operator-based semantics.
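ADFs generalize Dung's AFs, whose operator-based semantics the paper builds on. As a simpler illustration of a fixpoint-style semantics (the AF base case, not the ADF machinery or approximation fixpoint theory itself), the grounded extension of a plain AF can be computed as the least fixpoint of the characteristic function:

```python
def grounded_extension(args, attacks):
    """Grounded extension of an abstract argumentation framework:
    the least fixpoint of F(S) = {a : S defends a}.
    `args` is a set of arguments; `attacks` a set of (attacker, target) pairs."""
    attackers = {a: {b for (b, c) in attacks if c == a} for a in args}

    def defended_by(S):
        # a is defended by S if every attacker of a is itself attacked by S
        return {a for a in args
                if all(any((s, b) in attacks for s in S) for b in attackers[a])}

    S = set()
    while True:
        T = defended_by(S)
        if T == S:
            return S
        S = T
```

For the chain a → b → c, the unattacked argument a is accepted immediately and then defends c, giving {a, c}; iterating the operator until it stabilizes is the same least-fixpoint construction that approximation fixpoint theory generalizes.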
6

Low complexity differential geometric computations with applications to human activity analysis

January 2012 (has links)
In this thesis, we consider the problem of fast and efficient indexing techniques for time sequences that evolve on manifold-valued spaces. Manifolds are a convenient way to work with complex features that often do not live in Euclidean spaces. However, computing standard notions such as geodesic distance and mean can become very involved due to the nonlinearity of the space; as a result, a complex task such as manifold sequence matching would require a very large number of computations, making it hard to use in practice. We believe one can devise smart approximation algorithms for several classes of such problems that take the geometry of the manifold into account while maintaining the favorable properties of the exact approach. This problem has several applications in human activity discovery and recognition, where many features and representations are naturally studied in a non-Euclidean setting. We propose a novel solution to the problem of indexing manifold-valued sequences through an intrinsic approach that maps sequences to a symbolic representation. This is shown to enable fast and accurate algorithms for activity recognition, motif discovery, and anomaly detection. Toward this end, we present generalizations of the key concepts of piecewise aggregation and symbolic approximation to the case of non-Euclidean manifolds. Experiments show that expensive geodesic computations can be replaced with much faster symbolic computations with little loss of accuracy in activity recognition and discovery applications. The proposed methods are ideally suited for real-time systems and resource-constrained scenarios. / Dissertation/Thesis / M.S. Electrical Engineering 2012
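In the Euclidean setting, piecewise aggregation and symbolic approximation reduce to PAA and SAX-style symbolization, which the thesis generalizes to manifolds. A minimal sketch of that Euclidean base case follows; the breakpoints and alphabet are illustrative assumptions, not the thesis's actual parameters.

```python
def paa(series, segments):
    """Piecewise aggregate approximation: the mean of each of `segments`
    roughly equal-length chunks of the series."""
    n = len(series)
    means = []
    for i in range(segments):
        lo, hi = i * n // segments, (i + 1) * n // segments
        means.append(sum(series[lo:hi]) / (hi - lo))
    return means

def symbolize(values, breakpoints, alphabet="abcd"):
    """Map each value to a symbol by counting how many (sorted)
    breakpoints it exceeds, yielding a short symbolic string."""
    return "".join(alphabet[sum(v > b for b in breakpoints)] for v in values)
```

Composing the two (`symbolize(paa(series, k), breakpoints)`) turns a long numeric sequence into a short string, and string comparisons then replace expensive distance computations; the thesis's contribution is doing the analogue of this intrinsically on a manifold.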
7

MARRT Pipeline: Pipeline for Markerless Augmented Reality Systems Based on Real-Time Structure from Motion

Paulo Gomes Neto, Severino 31 January 2009 (has links)
Today, with increased computational power and advances in usability, real-time systems, and photorealism, the requirements of any computer system are more complex and sophisticated. Augmented reality systems are no exception in their attempt to solve users' real-life problems with a reduced level of risk, time spent, or learning effort. Such systems can be classified as marker-based or markerless. The essential role of markerless augmented reality is to avoid the unnecessary and undesirable use of markers in applications. To meet the demand for robust, non-intrusive augmented reality technologies, this dissertation proposes an execution pipeline for the development of markerless augmented reality applications, especially those based on real-time structure from motion.
8

The Lifted Heston Stochastic Volatility Model

Broodryk, Ryan 04 January 2021 (has links)
Can we capture the explosive nature of the volatility skew observed in the market without resorting to non-Markovian models? We show that, in terms of skew, the Heston model cannot match the market at both long and short maturities simultaneously. We introduce the Lifted Heston model of Abi Jaber (2019) and explain how to price options under it using both the cosine method and standard Monte Carlo techniques. This allows us to back out implied volatilities and compute the skew for both models, confirming that the Lifted Heston model nests the standard Heston model. We then produce and analyze the skew for Lifted Heston models with a varying number N of mean-reverting terms, and present an empirical study of the time complexity of increasing N. We observe a weak increase in the convergence speed of the cosine method for increased N, and comment on the number of factors to implement for practical use.
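The abstract mentions pricing by standard Monte Carlo. As a hedged illustration, here is a full-truncation Euler scheme for the standard (not Lifted) Heston model; the parameter values used below are assumptions for illustration, and this is a sketch rather than the thesis's implementation.

```python
import math
import random

def heston_paths(s0, v0, kappa, theta, xi, rho, r, T, steps, n_paths, seed=0):
    """Full-truncation Euler simulation of the standard Heston model:
      dS = r*S dt + sqrt(v)*S dW1,   dv = kappa*(theta - v) dt + xi*sqrt(v) dW2,
    with corr(dW1, dW2) = rho. Returns a list of terminal spot prices."""
    rng = random.Random(seed)
    dt = T / steps
    terminal = []
    for _ in range(n_paths):
        s, v = s0, v0
        for _ in range(steps):
            z1 = rng.gauss(0.0, 1.0)
            z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
            vp = max(v, 0.0)                     # full truncation of variance
            s *= math.exp((r - 0.5 * vp) * dt + math.sqrt(vp * dt) * z1)
            v += kappa * (theta - vp) * dt + xi * math.sqrt(vp * dt) * z2
        terminal.append(s)
    return terminal
```

Averaging discounted payoffs over the terminal prices gives Monte Carlo option prices, from which implied volatilities and skew can be backed out as described in the abstract.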
10

Uma proposta para medição de complexidade e estimação de custos de segurança em procedimentos de tecnologia da informação / An approach to measuring the complexity and estimating the cost associated with Information Technology security procedures

Moura, Giovane Cesar Moreira January 2008 (has links)
IT security has become a major concern for organizations in recent years. However, it does not come without large investments, both in acquiring tools that satisfy particular security requirements and in the complex procedures needed to deploy and maintain a protected infrastructure.
The scientific community has recently proposed models and techniques to estimate the complexity of configuration procedures, aware that they represent a significant operational cost, often dominating the total cost of ownership. However, despite the central role played by security in this context, it has not been the subject of any investigation to date. To address this issue, we apply a configuration complexity model proposed in the literature to estimate the impact of security on the complexity of IT procedures. Our proposal has been materialized through a prototype complexity analyzer called Security Complexity Analyzer (SCA). As a proof of concept and of the technical feasibility of our proposal, we used the SCA to evaluate real-life security scenarios. In addition, we conducted a study to investigate the relation between the metrics proposed in the model and the time an administrator spends executing security procedures, using a quantitative model built with multiple regression analysis, in order to predict the costs associated with security.
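The cost-prediction step regresses execution time on complexity metrics. The thesis uses multiple regression; the single-predictor ordinary-least-squares sketch below is a simplification for illustration, with hypothetical function names.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y ≈ a + b*x; returns (a, b).
    E.g. xs = complexity scores, ys = measured execution times."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def predict(coeffs, x):
    """Predicted execution time for a procedure with complexity score x."""
    a, b = coeffs
    return a + b * x
```

Fitting on (complexity, time) pairs collected from executed procedures lets an administrator estimate, before running a new security procedure, roughly how long it will take.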
