81 |
Fraturas metodológicas nas arquiteturas digitais / Methodology break in the digital architectures Natividade, Verônica Gomes 20 May 2010 (has links)
O objetivo da presente dissertação é investigar as mudanças ocorridas na metodologia de projeto arquitetônico em função da incorporação do computador como ferramenta auxiliar na concepção de projetos. Parte do princípio de que existe relação íntima entre a ferramenta empregada, a metodologia adotada e a forma arquitetônica, tomando como recorte temporal a segunda metade do século XX. Mais especificamente, investiga as chamadas arquiteturas digitais, isto é, aquelas arquiteturas cuja elaboração e manipulação da forma tiveram como plataforma as ferramentas digitais ou softwares específicos para o desenho e modelagem. Neste caso, o computador não é empregado como ferramenta de representação, mas associado ao processo criativo, causando interferências diretas na forma dos edifícios. Norteada pela pergunta como as novas arquiteturas têm sido concebidas?, o foco primordial da dissertação são as técnicas básicas exclusivamente digitais. Para isso, o estudo analisa três momentos evolutivos da metodologia de projeto assistido pelo computador. O período de formulação, onde é estudada a evolução das ferramentas digitais de projeto e sua gradativa incorporação ao domínio arquitetônico, bem como as novas técnicas e conceitos surgidos nesse momento; o ponto de inflexão, com a construção da primeira das arquiteturas digitais, o Museu Guggenheim de Bilbao, a partir do qual houve a consolidação do movimento digital na disciplina; e, finalmente, o terceiro momento, onde é identificada a fratura metodológica, isto é, quando as tecnologias paramétricas e algorítmicas, identificadas como duas técnicas básicas essencialmente digitais, emergiram como fontes catalisadoras do processo de evolução das arquiteturas digitais para as arquiteturas geradas digitalmente. Com este trabalho, pretende-se fornecer contribuições iniciais para a atualização e evolução da prática de projeto na cena arquitetônica brasileira. / This research aims to investigate the changes in architectural design methodology brought about by the introduction of the computer as a tool to assist the design process. It assumes that the tool employed, the adopted methodology and the architectural form are closely related, taking the second half of the twentieth century as its time frame. More specifically, it investigates the so-called 'digital architectures', that is, architectures whose form was conceived and manipulated through digital design and modelling tools. In this case, the computer is not used as a tool for visualization, but as a generative tool that manipulates and transforms architectural form. Guided by the question 'how have new architectures been conceived?', this research focuses on exclusively digital techniques. The research examines three moments in the evolution of digital design methodology: the formulation period, covering the evolution of digital design tools and their gradual incorporation into architectural practice, as well as the new techniques and concepts arising at that time; the turning point, with the construction of the first digital architecture, the Guggenheim Museum in Bilbao, which consolidated the digital movement in the discipline; and, finally, the third moment, identified as the 'methodology break', when parametric and algorithmic technologies, recognized as essentially digital techniques, emerged as catalysts in the evolution 'from digital architectures to digitally generated architectures'. This research aims to provide initial contributions to the updating and evolution of design practice in the Brazilian architectural scene.
|
82 |
A journey across football modelling with application to algorithmic trading Kharrat, Tarak January 2016 (has links)
In this thesis we study the problem of forecasting the final score of a football match before the game kicks off (pre-match) and show how the derived models can be used to make a profit in an algorithmic trading (betting) strategy. The thesis consists of two main parts. The first part discusses the database and a new class of counting processes. The second part describes the football forecasting models. The data part discusses the details of the design, specification and data collection of a comprehensive database containing extensive information on match results and events, players' skills and attributes, and betting market prices. The database was created using state-of-the-art web-scraping, text-processing and data-mining techniques. At the time of writing, we have collected data on all games played in the five major European leagues since the 2009-2010 season and on more than 7000 players. The statistical modelling part discusses forecasting models based on a new generation of counting processes with flexible inter-arrival time distributions. Several different methods for fast computation of the associated probabilities are derived and compared. The proposed algorithms are implemented in a contributed R package, Countr, available from the Comprehensive R Archive Network. One of these flexible count distributions, the Weibull count distribution, was used to derive our first forecasting model. Its predictive ability is compared to models previously suggested in the literature and tested in an algorithmic trading (betting) strategy. The model developed has been shown to perform rather well compared to its competitors. Our second forecasting model uses the same statistical distribution but models the attack and defence strengths of each team at the player level rather than at the team level, as is systematically done in the literature. For this model we make heavy use of the data on players' attributes discussed in the data part of the thesis. Not only does this model turn out to have higher predictive power, but it also allows us to answer important questions about the 'nature of the game', such as the contribution of full-backs to attacking efforts or where a new team would finish in the Premier League.
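To make the structure of such a pre-match forecasting model concrete, the sketch below uses the simpler independent-Poisson special case rather than the Weibull count distribution actually developed in the thesis (whose probabilities require dedicated series computations such as those in the Countr package); the function name, the log-linear parameterisation and all parameter values are illustrative assumptions, not taken from the thesis.

```python
# Minimal pre-match score-forecast sketch. The thesis models goals with a
# Weibull count distribution; this stand-in uses independent Poisson counts
# only to illustrate the pipeline: attack/defence strengths -> expected
# goals -> scoreline matrix -> home/draw/away probabilities for betting.
import numpy as np
from scipy.stats import poisson

def match_probabilities(attack_home, defence_away, attack_away, defence_home,
                        home_advantage=0.25, max_goals=10):
    # Log-linear goal rates (a common convention; values are illustrative).
    mu_home = np.exp(attack_home - defence_away + home_advantage)
    mu_away = np.exp(attack_away - defence_home)
    home_pmf = poisson.pmf(np.arange(max_goals + 1), mu_home)
    away_pmf = poisson.pmf(np.arange(max_goals + 1), mu_away)
    score_matrix = np.outer(home_pmf, away_pmf)   # P(home = i, away = j)
    p_home = np.tril(score_matrix, -1).sum()      # home scores more goals
    p_draw = np.trace(score_matrix)
    p_away = np.triu(score_matrix, 1).sum()       # away scores more goals
    return p_home, p_draw, p_away

print(match_probabilities(0.3, 0.1, 0.1, 0.2))
```

A betting rule would then compare these model probabilities with the bookmaker's implied probabilities and back only the outcomes where the model sees value.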
|
83 |
Distributed and privacy preserving algorithms for mobility information processing Katsikouli, Panagiota January 2018 (has links)
Smartphones, wearables and mobile devices in general are the sensors of our modern world. Their sensing capabilities offer the means to analyze and interpret our behaviour and surroundings. When it comes to human behaviour, perhaps the most informative feature is our location and mobility habits. Insights from human mobility are useful in a number of everyday practical applications, such as the improvement of transportation and road network infrastructure, ride-sharing services, activity recognition, mobile data pre-fetching, analysis of the social behaviour of humans, etc. In this dissertation, we develop algorithms for processing mobility data. The analysis of mobility data is a non-trivial task, as it involves managing large quantities of location information, usually spread out spatially and temporally across many tracking sensors. An additional challenge in processing mobility information is to publish the data and the results of its analysis without jeopardizing the privacy of the involved individuals or the quality of the data. We look into a series of problems on processing mobility data from individuals and from a population. Our mission is to design algorithms with provable properties that allow for the fast and reliable extraction of insights. We present solutions that are efficient in terms of storage and computation requirements, with a focus on distributed computation, online processing and privacy preservation.
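As one concrete, and deliberately generic, illustration of the privacy-preservation concern raised above, the sketch below releases per-region visit counts under differential privacy using the standard Laplace mechanism. This is a textbook technique shown only as an example; it is not one of the specific algorithms developed in the dissertation, and the epsilon value and counts are made up.

```python
# Generic sketch: publish aggregate visit counts per region with
# epsilon-differential privacy via the Laplace mechanism. Assumes each
# individual contributes at most one visit overall (L1 sensitivity = 1).
import numpy as np

def private_visit_counts(true_counts, epsilon=0.5, sensitivity=1.0):
    rng = np.random.default_rng(0)                     # fixed seed for the demo
    noise = rng.laplace(0.0, sensitivity / epsilon, size=len(true_counts))
    noisy = np.asarray(true_counts, dtype=float) + noise
    return np.clip(noisy, 0.0, None)                   # counts cannot be negative

print(private_visit_counts([120, 45, 7, 300]))
```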
|
84 |
Desenvolvimento algorítmico e arquitetural para a estimação de movimento na compressão de vídeo de alta definição / Algorithmic and architectural development for motion estimation on high definition video compression Porto, Marcelo Schiavon January 2012 (has links)
A compressão de vídeo é um tema extremamente relevante no cenário atual, principalmente devido ao crescimento significativo da utilização de vídeos digitais. Sem a compressão, é praticamente impossível enviar ou armazenar vídeos digitais devido à sua grande quantidade de informações, inviabilizando aplicações como televisão digital de alta definição, vídeo conferência, vídeo chamada para celulares etc. O problema vem se tornando maior com o crescimento de aplicações de vídeos de alta definição, onde a quantidade de informação é consideravelmente maior. Diversos padrões de compressão de vídeo foram desenvolvidos nos últimos anos, todos eles podem gerar grandes taxas de compressão. Os padrões de compressão de vídeo atuais obtêm a maior parte dos seus ganhos de compressão explorando a redundância temporal, através da estimação de movimento. No entanto, os algoritmos de estimação de movimento utilizados atualmente não consideram as variações nas características dos vídeos de alta definição. Neste trabalho uma avaliação da estimação de movimento em vídeos de alta definição é apresentada, demonstrando que algoritmos rápidos conhecidos, e largamente utilizados pela comunidade científica, não apresentam os mesmos resultados de qualidade com o aumento da resolução dos vídeos. Isto demonstra a importância do desenvolvimento de novos algoritmos focados em vídeos de altíssima definição, superiores à HD 1080p. Esta tese apresenta o desenvolvimento de novos algoritmos rápidos de estimação de movimento, focados na codificação de vídeos de alta definição. Os algoritmos desenvolvidos nesta tese apresentam características que os tornam menos suscetíveis à escolha de mínimos locais, resultando em ganhos significativos de qualidade em relação aos algoritmos rápidos convencionais, quando aplicados a vídeos de alta definição. Além disso, este trabalho também visa o desenvolvimento de arquiteturas de hardware dedicadas para estes novos algoritmos, igualmente dedicadas a vídeos de alta definição. O desenvolvimento arquitetural é extremamente relevante, principalmente para aplicações de tempo real a 30 quadros por segundo, e também para a utilização em dispositivos móveis, onde requisitos de desempenho e potência são críticos. Todos os algoritmos desenvolvidos foram avaliados para um conjunto de 10 sequências de teste HD 1080p, e seus resultados de qualidade e custo computacional foram avaliados e comparados com algoritmos conhecidos da literatura. As arquiteturas de hardware dedicadas, desenvolvidas para os novos algoritmos, foram descritas em VHDL e sintetizadas para FPGAs e ASIC, em standard cells nas tecnologias 0,18μm e 90nm. Os algoritmos desenvolvidos apresentam ganhos de qualidade para vídeos de alta definição em relação a algoritmos rápidos convencionais, e as arquiteturas desenvolvidas possuem altas taxas de processamento com baixo consumo de recursos de hardware e de potência. / Video compression is an extremely relevant topic in today's scenario, mainly due to the significant growth in digital video applications. Without compression, it is almost impossible to send or store digital videos, due to the large amount of data they require, making applications such as high definition digital television, video conferencing and mobile video calls unviable. The problem grows with the strong increase in high definition video applications, where the amount of information is considerably larger. Many video coding standards have been developed in the last few years, all of which can achieve excellent compression rates. A significant part of the compression gains in current video coding standards is obtained by exploiting temporal redundancy through the motion estimation process. However, current motion estimation algorithms do not consider the inherent characteristics of high and ultra-high definition videos. In this work an evaluation of motion estimation in high definition videos is presented. This evaluation shows that well-known fast algorithms, widely used by the scientific community, do not keep the same quality results when applied to high resolution videos. This demonstrates the relevance of new fast algorithms focused on high definition videos. This thesis presents the development of new fast motion estimation algorithms focused on high definition video encoding. The algorithms developed in this thesis have characteristics that make them less susceptible to getting trapped in local minima, resulting in significant quality gains over conventional fast algorithms when applied to high definition videos. Moreover, this work also addresses the development of dedicated hardware architectures for these new algorithms, likewise targeted at high definition videos. The architectural development is extremely relevant, mainly for real-time applications at 30 frames per second, and also for mobile applications, where performance and power are critical issues. All developed algorithms were assessed using 10 HD 1080p test video sequences, and the results for quality and computational cost were evaluated and compared against known algorithms from the literature. The dedicated hardware architectures, developed for the new algorithms, were described in VHDL and synthesized for FPGA and ASIC, using 0.18μm and 90nm CMOS standard-cell technologies. The developed algorithms present quality gains over conventional fast algorithms for high definition videos, and the developed architectures achieve high processing rates with low hardware resource usage and power consumption.
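For readers unfamiliar with motion estimation, the sketch below shows the exhaustive full-search baseline that fast algorithms (including the ones proposed in the thesis) try to approximate: for each block of the current frame it scans a search window in the reference frame for the candidate with the lowest sum of absolute differences (SAD). The block size, search range and toy frames are illustrative choices, not the thesis's experimental setup.

```python
# Baseline full-search block matching with a SAD criterion. Fast motion
# estimation algorithms replace the exhaustive (2*search+1)^2 scan with
# sparser search patterns, which is where they risk local minima.
import numpy as np

def full_search(current, reference, block=16, search=8):
    h, w = current.shape
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            cur_blk = current[by:by + block, bx:bx + block].astype(np.int32)
            best_sad, best_mv = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    ry, rx = by + dy, bx + dx
                    if ry < 0 or rx < 0 or ry + block > h or rx + block > w:
                        continue                      # candidate leaves the frame
                    ref_blk = reference[ry:ry + block, rx:rx + block].astype(np.int32)
                    sad = int(np.abs(cur_blk - ref_blk).sum())
                    if best_sad is None or sad < best_sad:
                        best_sad, best_mv = sad, (dy, dx)
            vectors[(by, bx)] = best_mv
    return vectors

# Toy check: a frame shifted by (2, 3) should give motion vector (-2, -3).
ref = np.random.default_rng(1).integers(0, 256, (64, 64), dtype=np.uint8)
cur = np.roll(ref, shift=(2, 3), axis=(0, 1))
print(full_search(cur, ref)[(16, 16)])
```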
|
85 |
Réseaux idéaux et fonction multilinéaire GGH13 / On ideal lattices and the GGH13 multilinear map Pellet--Mary, Alice 16 October 2019 (has links)
La cryptographie à base de réseaux euclidiens est un domaine prometteur pour la construction de primitives cryptographiques post-quantiques. Un problème fondamental, lié aux réseaux, est le problème du plus court vecteur (ou SVP, pour Shortest Vector Problem). Ce problème est supposé être difficile à résoudre même avec un ordinateur quantique. Afin d’améliorer l’efficacité des protocoles cryptographiques, on peut utiliser des réseaux structurés, comme par exemple des réseaux idéaux ou des réseaux modules (qui sont une généralisation des réseaux idéaux). La sécurité de la plupart des schémas utilisant des réseaux structurés dépend de la difficulté du problème SVP dans des réseaux modules, mais un petit nombre de schémas peuvent également être impactés par SVP dans des réseaux idéaux. La principale construction pouvant être impactée par SVP dans des réseaux idéaux est la fonction multilinéaire GGH13. Cette fonction multilinéaire est principalement utilisée aujourd’hui pour construire des obfuscateurs de programmes, c’est-à-dire des fonctions qui prennent en entrée le code d’un programme et renvoient le code d’un programme équivalent (calculant la même fonction), mais qui doit cacher la façon dont le programme fonctionne. Dans cette thèse, nous nous intéressons dans un premier temps au problème SVP dans les réseaux idéaux et modules. Nous présentons un premier algorithme qui, après un pré-calcul exponentiel, permet de trouver des vecteurs courts dans des réseaux idéaux plus rapidement que le meilleur algorithme connu pour des réseaux arbitraires. Nous présentons ensuite un algorithme pour les réseaux modules de rang 2, également plus efficace que le meilleur algorithme connu pour des réseaux arbitraires, à condition d’avoir accès à un oracle résolvant le problème du plus proche vecteur dans un réseau fixé. Le pré-calcul exponentiel et l’oracle pour le problème du plus proche vecteur rendent ces deux algorithmes inutilisables en pratique. Dans un second temps, nous nous intéressons à la fonction GGH13 ainsi qu’aux obfuscateurs qui l’utilisent. Nous étudions d’abord l’impact des attaques statistiques sur la fonction GGH13 et ses variantes. Nous nous intéressons ensuite à la sécurité des obfuscateurs utilisant la fonction GGH13 et proposons une attaque quantique contre plusieurs de ces obfuscateurs. Cette attaque quantique utilise entre autres un algorithme calculant un vecteur court dans un réseau idéal dépendant d’un paramètre secret de la fonction GGH13. / Lattice-based cryptography is a promising area for constructing cryptographic primitives that are plausibly secure even in the presence of quantum computers. A fundamental problem related to lattices is the shortest vector problem (or SVP), which asks to find a shortest non-zero vector in a lattice. This problem is believed to be intractable, even quantumly. Structured lattices, for example ideal lattices or module lattices (the latter being a generalization of the former), are often used to improve the efficiency of lattice-based primitives. The security of most of the schemes based on structured lattices is related to SVP in module lattices, and a very small number of schemes can also be impacted by SVP in ideal lattices. In this thesis, we first focus on the problem of finding short vectors in ideal and module lattices. We propose an algorithm which, after some exponential pre-computation, performs better on ideal lattices than the best known algorithm for arbitrary lattices. We also present an algorithm to find short vectors in rank 2 modules, provided that we have access to some oracle solving the closest vector problem in a fixed lattice. The exponential pre-processing time and the oracle call make these two algorithms unusable in practice. The main scheme whose security might be impacted by SVP in ideal lattices is the GGH13 multilinear map. This protocol is mainly used today to construct program obfuscators, which should render the code of a program unintelligible while preserving its functionality. In the second part of this thesis, we focus on the GGH13 map and its application to obfuscation. We first study the impact of statistical attacks on the GGH13 map and on its variants. We then study the security of obfuscators based on the GGH13 map and propose a quantum attack against multiple such obfuscators. This quantum attack uses as a subroutine an algorithm to find a short vector in an ideal lattice related to a secret element of the GGH13 map.
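For reference, the shortest vector problem discussed throughout this abstract can be stated as follows; this is the standard textbook formulation (with approximation factor γ), not notation taken from the thesis itself.

```latex
% Standard definitions: a full-rank lattice, its first minimum, and (approximate) SVP.
\[
  \mathcal{L}(B) = \Big\{ \textstyle\sum_{i=1}^{n} z_i b_i : z_i \in \mathbb{Z} \Big\},
  \qquad B = (b_1, \dots, b_n) \text{ a basis of } \mathbb{R}^n,
\]
\[
  \lambda_1(\mathcal{L}) = \min_{v \in \mathcal{L} \setminus \{0\}} \lVert v \rVert,
  \qquad
  \mathrm{SVP}_\gamma:\ \text{given } B,\ \text{find } v \in \mathcal{L}(B) \setminus \{0\}
  \text{ such that } \lVert v \rVert \le \gamma \cdot \lambda_1(\mathcal{L}(B)).
\]
% An ideal lattice is obtained by embedding an ideal of the ring of integers of a
% number field into Euclidean space; a module lattice generalises this to modules
% of rank >= 1 over that ring.
```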
|
86 |
L'apport d'un logiciel de simulation d'algorithmes dans le processus enseignement-apprentissage de l'algorithmique chez des apprenants débutants : cas de l'ENSET de Libreville / The contribution of an algorithm simulation software in the algorithmic teaching-learning process for novice learners : case of the Libreville ENSET Ovono, Médard-Sylvain 04 October 2018 (has links)
Cette thèse a pour objet l'étude de l'apport de l'introduction des technologies de l'information et de la communication pour l'enseignement (TICE) dans le processus enseignement-apprentissage de l'algorithmique chez les apprenants débutants, en particulier une application de simulation d'algorithmes (AlgoBox). L'utilisation de cet artefact computationnel a pour objectif de proposer une contribution afin de juguler le problème du fort taux d'échec constaté par les responsables administratifs et pédagogiques depuis plusieurs années dans l'enseignement de l'algorithmique et de la programmation à l'Ecole Normale Supérieure de l'Enseignement Technique (ENSET) de Libreville. Ainsi, la mise en place d'un dispositif expérimental conformément aux hypothèses émises et faisant suite à une première expérimentation d'identification et d'analyse de certains indicateurs, a permis de faire l'étude de l'apport du logiciel de simulation d'algorithme. Au terme de cette recherche, il est établi une légère augmentation des performances des apprenants vraisemblablement due non seulement à l'apport de l'artefact, mais aussi à la réactivation des connaissances liées aux savoirs implicites portées par la tâche prescrite. Il convient donc de préconiser d'accompagner l'usage de cette application par la mise en œuvre d'une nouvelle approche afin de permettre la construction d'une pédagogie adaptée (Ginestié, 2008). / This thesis studies the contribution of introducing information and communication technologies for education (TICE) into the teaching-learning process of algorithmics for novice learners, in particular an algorithm simulation application (AlgoBox). The use of this computational artifact aims to offer a contribution to curbing the high failure rate observed for several years by administrative and pedagogical officials in the teaching of algorithmics and programming at the Ecole Normale Supérieure de l'Enseignement Technique (ENSET) of Libreville. Thus, the implementation of an experimental setup in line with the stated hypotheses, following a first experiment to identify and analyse certain indicators, made it possible to study the contribution of the algorithm simulation software. At the end of this research, a slight increase in learners' performance is observed, most probably due not only to the contribution of the artifact, but also to the reactivation of implicit knowledge carried by the prescribed task. It is therefore advisable to accompany the use of this application with the implementation of a new approach, so as to allow the construction of an adapted pedagogy (Ginestié, 2008).
|
87 |
3-manifolds algorithmically bound 4-manifolds Churchill, Samuel 27 August 2019 (has links)
This thesis presents an algorithm for producing 4–manifold triangulations with boundary an arbitrary orientable, closed, triangulated 3–manifold. The research is an extension of Costantino and Thurston’s work on determining upper bounds on the number of 4–dimensional simplices necessary to construct such a triangulation. Our first step in this bordism construction is the geometric partitioning of an initial 3–manifold M using smooth singularity theory. This partition provides handle attachment sites on the 4–manifold M×[0,1] and the ensuing handle attachments eliminate one of the boundary components of M×[0,1], yielding a 4–manifold with boundary exactly M. We first present the construction in the smooth case before extending the smooth singularity theory to triangulated 3–manifolds. / Graduate
|
88 |
Is Algorithmic Trading the villain? - Evidence from stock markets in Taiwan Li, Kun-ta 18 October 2011 (has links)
As science advances, computer technologies have developed rapidly over the past decades. The previous practice of traders shouting orders on the exchange floor has been replaced by the Internet and computers. The trading modes of institutional investors are gradually transforming, with particularly radical changes in the US stock market over the past five years. The transaction volume from high frequency trading and algorithmic trading is growing dramatically every year, accounting for at least 70% of the U.S. market. Many researchers find that these computer-based trading methods increase liquidity, reduce volatility and facilitate price discovery.
Using intraday data from the Taiwan stock market in 2008, this study analyzes the effect of this trend on the Taiwanese stock market. The empirical results show that the greater the market capitalization, liquidity and stock volatility, the higher the proportion of algorithmic trading, but this relationship holds only for foreign institutional investors. On the other hand, an increase in the proportion of algorithmic trading improves liquidity while also raising volatility. The conclusion remains unchanged after controlling for the effect of the financial crisis. This means that algorithmic traders' behaviour is not always positive. The result could be related to Taiwan's particular transaction mechanism or to the lower competition among algorithmic traders in Taiwan. As for trading strategies, the results show that foreign institutional investors focus on momentum strategies, whereas certain dealers trade for index arbitrage or hedging purposes.
In summary, algorithmic traders' transactions have both a positive (liquidity) and a negative (volatility) impact on the market. For individual investors, algorithmic trading's momentum strategies could be appealing, but they may not profit from following these trades, because such strategies may simply aim to push prices higher in order to sell, or the reverse. As for regulators, algorithmic traders' behaviour should be partly regulated; regulatory authorities might also consider adding a circuit-breaker mechanism similar to South Korea's, especially for program trading.
Keywords: algorithmic trading, high frequency trading, intraday, strategy, liquidity, volatility, market quality
|
89 |
A Reconsideration Of The Concept Of Architectural Space In The Virtual Realm Kinayoglu, Gokhan 01 September 2007 (has links) (PDF)
The discovery of new geometries in the 19th century and the departure from an absolute to a relative understanding of space-time, together with the invention of higher dimensions, have caused a shift towards the idealization of space. This new type of ideal space was called hyperspace. The counter-intuitive quality of hyperspace has opened up new formal possibilities and representation techniques in art and architecture. In a similar manner, with the introduction of computers, the virtual and immaterial quality of cyberspace has offered new design techniques and forms to architecture. Algorithmic design tools and the use of surface as the primary architectural element in cyberspace have caused a shift in the conception of space, together with the way it is perceived.
Taking its departure point from physical space, this thesis investigates the upper and lower dimensions of space in order to understand and analyze the current conception of architectural space in the virtual realm. Three types of spatial qualities are investigated in detail: the ideal characteristic of hyperspace, the visual medium of cyberspace and the algorithmic formation of hypospace.
|
90 |
Högfrekvenshandel : En kvalitativ studie / High-frequency trading: A qualitative study Palmborg, Adam, Malm, Max January 2015 (has links)
Syfte: Högfrekvenshandel har på senare år varit ett omdiskuterat och kontroversiellt ämne. Fenomenet har genomgått omfattande granskning och åsikterna kring dess påverkan på marknaden och dess aktörer går isär. Då tidigare forskning främst genomförts på den amerikanska marknaden är syftet med den här studien att bistå med en djupare insikt kring denna typ av handel och dess avtryck på den svenska finansmarknaden. Metod: För att behandla syftet har en kvalitativ studie av högfrekvenshandel med en deduktiv ansats genomförts. Teori: Studien utgår från Rational Choice Theory, Effektiva marknadshypotesen och tidigare forskning inom ämnet. Med hjälp av det teoretiska ramverket har studien analyserat det empiriska underlaget. Relevanta aspekter har identifierats som kan förklara varför studiens respondenter har ett specifikt förhållningssätt gentemot högfrekvenshandel. Empiri: Studien består av en dokumentstudie och fyra semistrukturerade intervjuer med intressenter på den svenska finansmarknaden. Intervjuerna ämnar identifiera de olika intressenternas förhållningssätt gentemot högfrekvenshandel och dess bakomliggande orsaker. Slutsats: Studien har kommit fram till att förhållningssättet gentemot högfrekvenshandel står i relation till vilken typ av verksamhet som intressenten bedriver. Vidare kan det konstateras att tidigare forskning till stor del går att applicera på den svenska marknaden. / Purpose: In recent years, High Frequency Trading has been a widely debated and controversial topic. The phenomenon has been subject to extensive examination and opinions regarding its effect on the financial markets are divided. Previous research has mainly been conducted on the American financial market; thus the purpose of this thesis is to contribute deeper insight into this kind of trading and its impact on the Swedish financial market. Method: To address the purpose of this thesis, a qualitative study with a deductive approach has been conducted. Theory: The thesis draws on Rational Choice Theory, the Efficient Market Hypothesis and previous research within the field. Using this theoretical framework, the thesis analyzes the empirical data. Relevant aspects have been identified which can explain why the thesis' respondents take a specific approach towards High Frequency Trading. Empirics: The thesis consists of a document study and four semi-structured interviews with stakeholders on the Swedish financial market. Through these interviews, the thesis aims to identify the stakeholders' different approaches towards High Frequency Trading and what might cause these particular points of view. Conclusion: The thesis concludes that the approach towards High Frequency Trading is related to the type of operation conducted by the respondent. Furthermore, it can be concluded that previous research is, in general, applicable to the Swedish financial market.
|