11

A Dynamic and Thermodynamic Approach to Complexity.

Yang, Jin 08 1900 (has links)
The problem of establishing the correct approach to complexity is an open and much-debated issue to which this dissertation contributes. The dissertation considers two main possibilities: one, advocated by Tsallis and co-workers, sets the foundation of complexity on a generalized, non-extensive form of thermodynamics; the other, proposed by the UNT Center for Nonlinear Science, regards complexity as a new condition that, for physical systems, would be equivalent to a state of matter intermediate between dynamics and thermodynamics. In the first part of this dissertation, the concept of Kolmogorov-Sinai entropy is introduced. The Pesin theorem is generalized in the formalism of Tsallis non-extensive thermodynamics. This generalized form of the Pesin theorem is used in the study of two major classes of problems, whose prototypes are given by the Manneville map and the logistic map, respectively. The results of these studies convince us that the approach to complexity must be made along lines different from those of non-extensive thermodynamics. We conclude that the Lévy walk can be used as a prototype model of complexity, as a condition of balance between order and randomness that yields new phenomena such as aging and multifractality, and that these properties must be studied within a dynamic rather than a thermodynamic perspective. The second part focuses on the heartbeat problem using a dynamic model, the so-called memory beyond memory, based on the Lévy walker model. It is shown that the memory beyond memory effect is more pronounced in healthy heartbeat sequences. The concepts of fractals, multifractals, the wavelet transform and the wavelet transform modulus maxima (WTMM) method are introduced. Artificial time sequences are generated by the memory beyond memory model to mimic heartbeat sequences. Using the WTMM method, the multifractal singularity spectra of the sequences are calculated. It is clear that sequences with a strong memory beyond memory effect have a broader singularity spectrum. / 2003-08
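The dynamic quantities at stake here are easy to illustrate: by the Pesin theorem, the Kolmogorov-Sinai entropy of the logistic map in its fully chaotic regime equals its (positive) Lyapunov exponent, which can be estimated numerically as the orbit average of log|f'(x)|. The sketch below is illustrative only, not code from the dissertation:

```python
import math

def logistic_lyapunov(r, x0=0.1, n_transient=1000, n_iter=100_000):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x)
    as the orbit average of log|f'(x)|, with f'(x) = r*(1 - 2x)."""
    x = x0
    for _ in range(n_transient):      # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        deriv = abs(r * (1 - 2 * x))
        if deriv > 0:                 # guard against log(0) at x = 1/2
            total += math.log(deriv)
    return total / n_iter
```

At r = 4 the exact exponent is ln 2 ≈ 0.693, which the estimate reproduces to a few decimal places.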
12

Analise e comparação qualitativa de sistemas de detecção de plagio em tarefas de programação / Qualitative analysis and comparison of plagiarism detection systems on programming coursework

Kleiman, Alan Bustos 22 June 2007 (has links)
Orientador: Tomasz Kowaltowski / Dissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Computação / Abstract: Encountering plagiarism in student coursework has become increasingly common, and significant effort has been undertaken to counter this problem. We focus on plagiarism in student submissions in programming courses, in particular introductory computer science courses, describing some of the existing systems and algorithms dedicated to this problem. We implement several of these algorithms in order to undertake a direct and qualitative comparison, with a special focus on the pre-processing of student programs. In particular, we develop a mechanism for normalizing programs through careful parsing and reordering of their abstract syntax trees, so as to minimize the noise created by plagiarists attempting to copy and modify someone else's program. We achieve positive results with this pre-processing method, particularly in combination with the Running Karp-Rabin Greedy String Tiling algorithm. These positive results reinforce our conclusion that pre-processing may be even more important than the algorithm itself, and point to new directions for further research / Mestrado / Sistemas de Computação / Mestre em Ciência da Computação
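The pre-processing idea above, normalising programs so that cosmetic disguises such as identifier renaming add no noise, can be sketched in a few lines. This is a hypothetical illustration using Python's own `ast` module, not the thesis's implementation; the `normalize` function and its token scheme are ours:

```python
import ast

def normalize(source):
    """Return the AST token stream of a Python program with identifiers
    canonicalised in first-seen order and constants collapsed, so that
    renaming variables (or changing literal values) leaves the stream
    unchanged while structural edits still alter it."""
    names = {}
    tokens = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Name):
            # first-seen order gives every identifier a stable placeholder
            tokens.append(names.setdefault(node.id, f"v{len(names)}"))
        elif isinstance(node, ast.Constant):
            tokens.append("const")
        else:
            tokens.append(type(node).__name__)
    return tokens
```

Two programs that differ only by renaming then normalize to identical token streams, while changing an operator (a structural edit) does not.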
13

Universal Induction and Optimisation: No Free Lunch

Everitt, Tom January 2013 (has links)
No description available.
14

Segmenting Observed Time Series Using Comovement and Complexity Measures / Segmentering av Observerade Tidsserier med hjälp av Comovement- och Komplexitetsmått

Norgren, Lee January 2019 (has links)
Society depends on unbiased, efficient and replicable measurement tools to tell us more truthfully what is happening when our senses would otherwise fool us. A new approach is developed to consistently detect the start and end of historic recessions as defined by the US Federal Reserve. To do this, three measures, correlation (Spearman and Pearson), Baur comovement and Kolmogorov complexity, are used to quantify market behaviour and detect recessions. To compare the effectiveness of each measure, the normalized correct Area Under Curve (AUC) fraction is introduced. It is found that, for all three measures, performance depends mostly on the type of data, and that financial market data does not perform as well as fundamental economic data for detecting recessions. Furthermore, comovement is found to be the most efficient individual measure, and also the most efficient overall when compared against several measures merged together.
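Kolmogorov complexity itself is uncomputable, so measures like the one above are estimated in practice. A common proxy, shown here as an illustrative sketch rather than the thesis's exact estimator, is the compression ratio of a string-encoded (e.g. binarised) series:

```python
import random
import zlib

def complexity_ratio(series):
    """Crude Kolmogorov-complexity proxy: compressed size over raw size
    of a string-encoded series. Values near 1 indicate near-random data;
    values near 0 indicate highly regular data."""
    data = series.encode()
    return len(zlib.compress(data, 9)) / len(data)

# Example inputs: a periodic series versus a pseudo-random one.
random.seed(0)
periodic = "01" * 500
noisy = "".join(random.choice("01") for _ in range(1000))
```

On these inputs the periodic series compresses to a small fraction of its length while the noisy one resists compression, matching the intended ordering of the measure.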
15

Nelinearna dinamička analiza fizičkih procesa u životnoj sredini / Nonlinear dynamical analysis of the physical processes in the environment

Mimić Gordan 29 September 2016 (has links)
<p>A coupled system of prognostic equations for the ground surface temperature and the deeper-layer soil temperature was examined. Lyapunov exponents, bifurcation diagrams, the attractor and the domain of solutions were analyzed. Novel information measures based on Kolmogorov complexity were introduced for quantifying the degree of randomness in time series. The new measures were tested on various time series obtained by measuring physical factors of the environment or produced as climate model outputs.</p>
16

Normalized social distance

Šlerka, Josef January 2019 (has links)
This dissertation thesis deals with the application of the concept of information distance to social network data analysis. We consider this data as recorded acts of social action; as such, they express certain attitudes, values and intentions. We introduce a formula for calculating the Normalized Social Distance and, based on a series of case studies, we demonstrate the usefulness and validity of this approach. The application of formal mathematical and computer science techniques to massive data records of human action in social network environments is enabled by the change brought by new media and the associated technological advancement. This change is accompanied by a gradual transition of research methods in the humanities, referred to as the onset of digital humanities. This approach is characterized by the application of quantitative methods in the field of humanities and the discovery of new data areas useful for analyses. In the case of social media data, the differentiation between quantitative and qualitative methods is no longer valid. This thesis is itself a good example, combining information-theoretic methods with traditional social network analysis and Goffman's frame analysis of human action. Keywords Information distance, Normalized Social Distance, Kolmogorov...
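The information distance underlying this family of measures is uncomputable, so applied work typically uses the normalized compression distance (NCD), in which a real compressor stands in for Kolmogorov complexity. The sketch below illustrates NCD with zlib; it is a standard proxy, not the thesis's Normalized Social Distance formula itself:

```python
import zlib

def compressed_len(data):
    """Compressed length: a computable stand-in for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def ncd(x, y):
    """Normalized compression distance between two byte strings:
    near 0 for near-identical objects, near 1 for unrelated ones."""
    cx, cy = compressed_len(x), compressed_len(y)
    return (compressed_len(x + y) - min(cx, cy)) / max(cx, cy)

# Example inputs: an object compared with itself and with unrelated text.
s1 = b"we prove the usefulness and validity of this approach " * 20
s2 = b"completely unrelated text about gardening and weather " * 20
```

An object is much closer to itself than to unrelated material, which is the ordering any information-distance measure must respect.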
17

Complexité de Kolmogorov et corrélations quantiques; étude du carré magique / Kolmogorov complexity and quantum correlations: a study of the magic square

Berthelette, Sophie 08 1900 (has links)
Quantum information, this intriguing marriage between computer science and physics, is a promising field of research, both for future technologies and for a better understanding of our universe. Entanglement is at the very heart of this new way of understanding information. This thesis focuses on the quantum correlations observed in nature, studied in great detail by, among others, John Bell. More specifically, two non-signaling games in which these correlations arise are studied: the CHSH game, probably the best-known example of such games, and the magic square pseudotelepathy game. To do so, two points of view are adopted: probabilistic and algorithmic. The first is motivated by prediction (what could have happened); the second focuses on the intrinsic information contained in an object (what happened). The concepts of randomness and information are therefore addressed first from Shannon's point of view (the probabilistic approach) and then from Kolmogorov's point of view (the algorithmic approach). Kolmogorov complexity is used to quantify information in a factual way. Furthermore, the particular case in which multiple repetitions of a game are played in parallel in a classical world is considered. The parallel repetition theorem, an important result on the subject proven by Ran Raz, is presented and then used in the algorithmic study of the CHSH and magic square games.
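For the CHSH game mentioned above, the classical value can be checked by brute force: with shared randomness, no classical strategy beats the best deterministic one, so enumerating all deterministic strategies suffices. This small sketch confirms the standard bound of 3/4 (entangled strategies reach cos²(π/8) ≈ 0.85):

```python
from itertools import product

def chsh_classical_value():
    """Maximum classical winning probability of the CHSH game.
    Alice and Bob receive uniform bits x, y and answer bits a, b;
    they win iff a XOR b == x AND y. A deterministic strategy is a
    pair of answer tables (a0, a1) for Alice and (b0, b1) for Bob."""
    best = 0.0
    for a0, a1, b0, b1 in product((0, 1), repeat=4):
        wins = sum(((a0, a1)[x] ^ (b0, b1)[y]) == (x & y)
                   for x, y in product((0, 1), repeat=2))
        best = max(best, wins / 4)
    return best
```

The exhaustive search over all 16 strategies returns exactly 0.75, the classical bound that quantum correlations famously violate.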
18

Complexity-aware Decision-making with Applications to Large-scale and Human-in-the-loop Systems

Stefansson, Elis January 2023 (has links)
This thesis considers control systems governed by autonomous decision-makers and humans. We formalise and compute low-complexity control policies with applications to large-scale systems, and propose human interaction models that let controllers compute interaction-aware decisions. In the first part of the thesis, we consider complexity-aware decision-making, formalising the complexity of control policies and constructing algorithms that compute low-complexity control policies. First, we consider large-scale control systems given by hierarchical finite state machines (HFSMs) and present a planning algorithm for such systems that exploits the hierarchy to compute optimal policies efficiently; the algorithm also handles changes in the system with ease. We prove these properties and conduct simulations on HFSMs with up to 2 million states, including a robot application, where our algorithm outperforms both Dijkstra's algorithm and Contraction Hierarchies. Second, we present a planning objective for control systems modelled as finite state machines that yields an explicit trade-off between a policy's performance and its complexity. We consider Kolmogorov complexity since it captures the ultimate compression of an object on a universal Turing machine. We prove that this trade-off is hard to optimise, in the sense that dynamic programming is infeasible. Nonetheless, we present two heuristic algorithms for obtaining low-complexity policies and evaluate them on a simple navigation task for a mobile robot, where we obtain low-complexity policies that concur with intuition. In the second part of the thesis, we consider human-in-the-loop systems and predict human decision-making in such systems. First, we look at how the interaction between a robot and a human in a control system can be predicted using game theory, focusing on an autonomous truck platoon interacting with a human-driven car.
The interaction is modelled as a hierarchical dynamic game, where the hierarchical decomposition is temporal, with a high-fidelity tactical horizon predicting immediate interactions and a low-fidelity strategic horizon estimating long-term behaviour. The game enables tractable computations, validated through simulations that yield situation-aware behaviour with natural and safe interactions. Second, we seek models to explain human decision-making, focusing on driver overtaking scenarios. The overtaking problem is formalised as a decision problem under perceptual uncertainty. We propose and numerically analyse risk-agnostic and risk-aware decision models that judge whether an overtaking is desirable. We show how a driver's decision time and confidence level can be characterised through two model parameters, which collectively represent human risk-taking behaviour. We detail an experimental testbed for evaluating the decision-making process in the overtaking scenario and present some preliminary experimental results from two human drivers. / <p>QC 20230523</p>
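Dijkstra's algorithm is the flat baseline against which the hierarchical planner above is compared. For reference, here is a generic sketch over an adjacency-dict graph, not the thesis's HFSM representation:

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths with a binary heap.
    graph maps each node to a list of (neighbour, edge_weight) pairs
    with non-negative weights; returns a dict of shortest distances."""
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry, skip
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist
```

A hierarchical planner can beat this baseline on structured state spaces precisely because it avoids touching every state, whereas Dijkstra's complexity grows with the full flattened graph.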
19

Fast Low Memory T-Transform: string complexity in linear time and space with applications to Android app store security.

Rebenich, Niko 27 April 2012 (has links)
This thesis presents flott, the Fast Low Memory T-Transform, currently the fastest and most memory-efficient linear time and space algorithm available to compute the string complexity measure T-complexity. The flott algorithm uses 64.3% less memory and, in our experiments, runs asymptotically 20% faster than its predecessor. A full C implementation is provided and published under the Apache License 2.0. From the flott algorithm, two deterministic information measures are derived and applied to Android app store security: the normalized T-complexity distance and the instantaneous T-complexity rate, which are used to detect, locate, and visualize unusual information changes in Android applications. The information measures introduced present a novel, scalable approach to assist with the detection of malware in app stores. / Graduate
20

Quantum Algorithmic Engineering with Photonic Integrated Circuits

Kallol, Roy January 2013 (has links) (PDF)
Integrated quantum photonics has shown monolithic waveguide chips to be a promising platform for realizing the next generation of quantum optical circuits. This work proposes the implementation of the quantum PageRank algorithm on a photonic waveguide lattice. Our contributions are as follows. The continuous-time quantum stochastic walk (QSW), an alternate paradigm of quantum computing, is a hybrid quantum walk that incorporates both unitary and non-unitary effects. We propose the use of the QSW, which necessitates the hopping of the quantum crawler on a directed graph, for the quantum PageRank problem. We propose the implementation of quantum PageRank on a photonic waveguide lattice, where we allow the density matrix to evolve according to the Lindblad-Kossakowski master equation, the diagonal of which gives the quantum PageRank. We have also shown that the metric of positional Kolmogorov complexity is an efficient tool for determining whether or not the quantum channel has been compromised. We positionally encode multi-photon decoy pulses within the stream of single-photon pulses, choosing the positional encoding to have low Kolmogorov complexity. The PNS attack on the multi-photon decoy pulses causes a dip in the ratio of the transmittance of the decoy pulses to that of the signal pulses in the conventional analysis.
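For reference, the classical PageRank that the quantum-stochastic-walk scheme generalises is a damped power iteration over the hyperlink graph. This is an illustrative sketch of the classical algorithm, not the photonic implementation:

```python
def pagerank(adj, d=0.85, n_iter=100):
    """Damped power iteration for PageRank. adj[i] lists the pages that
    page i links to; d is the damping factor. Returns the rank vector,
    which sums to 1."""
    n = len(adj)
    rank = [1.0 / n] * n
    for _ in range(n_iter):
        new = [(1.0 - d) / n] * n         # teleportation term
        for i, outs in enumerate(adj):
            if outs:
                share = d * rank[i] / len(outs)
                for j in outs:
                    new[j] += share
            else:
                for j in range(n):        # dangling page: spread uniformly
                    new[j] += d * rank[i] / n
        rank = new
    return rank
```

In the quantum version described above, this stationary distribution is replaced by the diagonal of a density matrix evolving under a Lindblad-Kossakowski master equation.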
