261

Existence et calcul distribué d'équilibres dans des jeux de congestion généralisés / Existence and distributed computation of equilibria in generalized congestion games

Rodier, Lise 12 July 2016 (has links)
This thesis focuses on potential games and on a generalization of a load-balancing game on a graph that we call the placement game. In this game, a player's cost is affected by its neighbours. An example illustrates this: the placement of passengers on a train, where the presence of direct neighbours affects each passenger's well-being. The results of this thesis fall into two parts. First, we study these games with regard to the existence and structural properties of equilibria. We ask the fundamental question of whether Nash equilibria exist in the placement game. When they do, we aim to determine whether they are easily computable; when no equilibrium exists, we prove the NP-completeness of the corresponding decision problem. Second, we focus on the distributed computation of Nash equilibria in placement games. In particular, we consider a game based on the Max-Cut problem, which has been studied extensively in graph theory. This allowed us to extend our work to a mobile-network application: interference management in wireless networks. For these different games, we designed distributed algorithms that compute equilibria and studied their convergence. In parallel, we extended the Max-Cut work to the problem of selecting a quality-of-service offer from among several network providers, and we compare the performance of distributed equilibrium-computation algorithms against regret-minimization methods.
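As a concrete illustration of the kind of equilibrium computation discussed in this abstract, here is a minimal Python sketch of best-response dynamics for the Max-Cut game: each player (vertex) repeatedly switches to the side that maximizes the weight of its cut edges. Because this is a potential game (the potential being the total cut weight), strictly improving moves must terminate at a pure Nash equilibrium, i.e. a local maximum cut. The adjacency-dict representation and the sequential round-robin schedule are assumptions of the sketch, not the thesis's own distributed algorithms.

```python
import random

# Best-response dynamics for the Max-Cut game: a sketch, assuming an
# undirected graph given as {vertex: {neighbour: weight}}. Each player
# picks the side in {0, 1} that maximises the total weight of its cut
# edges; strict improvements raise the potential (total cut weight),
# so the loop terminates at a pure Nash equilibrium (a local max cut).

def best_response_dynamics(graph, seed=0):
    rng = random.Random(seed)
    side = {v: rng.randint(0, 1) for v in graph}   # arbitrary initial placement
    changed = True
    while changed:
        changed = False
        for v in graph:
            # cut[s] = weight of v's edges that are cut if v plays side s
            cut = {0: 0.0, 1: 0.0}
            for u, w in graph[v].items():
                cut[1 - side[u]] += w
            best = max(cut, key=cut.get)
            if cut[best] > cut[side[v]]:           # strictly improving move only
                side[v] = best
                changed = True
    return side

# Toy instance: a weighted triangle; every equilibrium separates b and c.
g = {"a": {"b": 1.0, "c": 1.0},
     "b": {"a": 1.0, "c": 2.0},
     "c": {"a": 1.0, "b": 2.0}}
print(best_response_dynamics(g))
```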
262

Embedding an object calculus in the unifying theories of programming

Smith, Michael Anthony January 2010 (has links)
Hoare and He's Unifying Theories of Programming (UTP) provides a rich model of programs as relational predicates. This theory is intended to provide a single framework in which any programming paradigm, language, or feature can be modelled, compared and contrasted. The UTP already has models for several programming formalisms, such as imperative programming, higher-order programming (e.g. programming with procedures), several styles of concurrent programming (or reactive systems), class-based object-orientation, and transaction processing. We believe that the UTP ought to be able to represent all significant computer programming language formalisms, in order for it to be considered a unifying theory. One gap in the UTP work is that of object-based object-orientation, such as that presented in Abadi and Cardelli's untyped object calculi (sigma-calculi). These sigma-calculi provide a prominent formalism of object-based object-oriented (OO) programs, which models programs as objects. We address this gap within this dissertation by presenting an embedding of an Abadi-Cardelli-style object calculus in the UTP. More formally, the thesis that this dissertation argues is that it is possible to provide an object-based object orientation to the UTP, with value- and reference-based objects, and a fully abstract model of references. We have made three contributions to our area of study: first, to extend the UTP with a notion of object-based object orientation, in contrast with the existing class-based models; second, to provide an alternative model of pointers (references) for the UTP that supports both value-based compound values (e.g. objects) and references (pointers), in contrast to existing UTP models with pointers that have reference-based compound values; and third, to model an Abadi-Cardelli notion of an object in the UTP, and thus demonstrate that it can unify this style of object formalism.
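To make the object-based (rather than class-based) flavour of the sigma-calculus concrete, the following Python sketch models an object as a record of methods, each parameterised by self: invocation applies a method to the whole object (self-application), and update replaces a method functionally. This only illustrates the Abadi-Cardelli object model, not the UTP embedding itself; all names are ours.

```python
# A sketch of untyped sigma-calculus objects: records of methods that
# each take the host object as `self`. Invocation o.l applies the body
# with self bound to o; update o.l := m returns a fresh object.

class Obj:
    def __init__(self, **methods):
        self.methods = dict(methods)

    def invoke(self, label):
        return self.methods[label](self)           # o.l  ==>  b{x := o}

    def update(self, label, method):
        new = dict(self.methods)
        new[label] = method
        return Obj(**new)                          # functional method update

# A memory cell encoded purely with methods, as in the untyped calculus:
# `set` returns a new cell whose `contents` method yields the new value.
cell = Obj(
    contents=lambda self: 0,
    get=lambda self: self.invoke("contents"),
    set=lambda self: lambda n: self.update("contents", lambda _self: n),
)

c2 = cell.invoke("set")(7)
print(cell.invoke("get"), c2.invoke("get"))        # 0 7
```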
263

Probabilistic inference for phrase-based machine translation : a sampling approach

Arun, Abhishek January 2011 (has links)
Recent advances in statistical machine translation (SMT) have used dynamic programming (DP) based beam search methods for approximate inference within probabilistic translation models. Despite their success, these methods compromise the probabilistic interpretation of the underlying model, thus limiting the application of probabilistically defined decision rules during training and decoding. As an alternative, in this thesis, we propose a novel Monte Carlo sampling approach for theoretically sound approximate probabilistic inference within these models. The distribution we are interested in is the conditional distribution of a log-linear translation model; however, often, there is no tractable way of computing the normalisation term of the model. Instead, a Gibbs sampling approach for phrase-based machine translation models is developed which obviates the need to compute this term yet produces samples from the required distribution. We establish that the sampler effectively explores the distribution defined by a phrase-based model by showing that it converges in a reasonable amount of time to the desired distribution, irrespective of initialisation. Empirical evidence is provided to confirm that the sampler can provide accurate estimates of expectations of functions of interest. The mix of high-probability and low-probability derivations obtained through sampling is shown to provide a more accurate estimate of expectations than merely using the n most highly probable derivations. Subsequently, we show that the sampler provides a tractable solution for finding the maximum-probability translation in the model. We also present a unified approach to approximating two additional intractable problems: minimum risk training and minimum Bayes risk decoding. Key to our approach is the use of the sampler, which allows us to explore the entire probability distribution and maintain a strict probabilistic formulation through the translation pipeline. For these tasks, sampling allies the simplicity of n-best list approaches with the extended view of the distribution that lattice-based approaches benefit from, while avoiding the biases associated with beam search. Our approach is theoretically well-motivated and can give better and more stable results than current state-of-the-art methods.
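The central trick — that a Gibbs sampler needs only ratios of unnormalised probabilities, so the intractable normalisation term cancels — fits in a few lines. The sketch below samples from a toy unnormalised log-linear model over binary vectors standing in for derivations; the score function, features, and weights are illustrative assumptions, not the thesis's phrase-based translation model.

```python
import math
import random

# Gibbs sampling from p(y) proportional to exp(score(y)) over binary
# vectors: each single-site update needs only score differences, so the
# partition function is never computed. The toy score has per-position
# weights plus a bonus for adjacent active positions.

def score(y, weights, pair_bonus):
    s = sum(w for yi, w in zip(y, weights) if yi)
    s += sum(pair_bonus for a, b in zip(y, y[1:]) if a and b)
    return s

def gibbs_marginals(n, weights, pair_bonus, iters=5000, seed=0):
    rng = random.Random(seed)
    y = [rng.randint(0, 1) for _ in range(n)]      # arbitrary initialisation
    counts = [0] * n
    for t in range(iters):
        i = t % n                                  # systematic sweep over sites
        y0 = y[:i] + [0] + y[i + 1:]
        y1 = y[:i] + [1] + y[i + 1:]
        # p(y_i = 1 | rest) = 1 / (1 + exp(score(y0) - score(y1)))
        p1 = 1.0 / (1.0 + math.exp(score(y0, weights, pair_bonus)
                                   - score(y1, weights, pair_bonus)))
        y[i] = 1 if rng.random() < p1 else 0
        for j, yj in enumerate(y):                 # accumulate Monte Carlo counts
            counts[j] += yj
    return [c / iters for c in counts]             # estimated marginals P(y_j = 1)

print(gibbs_marginals(4, weights=[0.5, -0.3, 0.8, 0.1], pair_bonus=0.4))
```

Averaging indicator functions over the chain, as above, is exactly the expectation estimation the abstract refers to; a burn-in period would normally be discarded before accumulating counts.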
264

A transdisciplinary study of embodiment in HCI, AI and New Media

Al-Shihi, Hamda Darwish Ali January 2012 (has links)
The aim of this thesis is to report on a transdisciplinary approach to the complexity of thinking about human embodiment in relation to machine embodiment. A practical dimension of this thesis is to elicit some principles for the design and evaluation of virtual embodiment. The transdisciplinary approach suggests, firstly, that a single discipline or reality is, on its own, not sufficient to explain the complexity and dynamism of the embodied interaction between the human and machine. Secondly, the thesis argues for thinking of transdisciplinary research as a process of individuation, becoming or transduction, that is, as a process of mediation between heterogeneous approaches rather than perceiving research as a stabilized cognitive schema designed to accumulate new outcomes onto the already-there reality. Arguing for going beyond the individualized approaches to embodiment, this thesis analyzes three cases where the problems that appear in one case are resolved through the analysis of the following one. Consisting of three phases, this research moves from objective scientific 'reality' to more phenomenological, subjective and complex realities. The first study employs a critical review of embodied conversational agents in human-computer interaction (HCI) in a learning context using a comparative meta-analysis. Meta-analysis was applied because most of the studies for evaluating embodiment are experimental. A learning context was selected because the number of studies is suitable for meta-analysis and the findings could be generalized to other contexts. The analysis reveals that there is no 'persona effect', that is, the expected positive effect of virtual embodiment on the participant's affective, perceptive and cognitive measures. On the contrary, it shows the reduction of virtual embodiment to image and a lack of consideration for the participant's embodiment and interaction, in addition to theoretical and methodological shortcomings. The second phase solves these problems by focusing on Mark Hansen's phenomenological account of embodiment in new media. The investigation shows that Hansen improves on the HCI account by focusing on the participant's dynamic interaction with new media. Nevertheless, his views of embodied perception and affection are underpinned by a subjective patriarchal account leading to object/subject and body/work polarizations. The final phase resolves this polarization by analyzing the controversial work of Alan Turing on intelligent machinery. The research provides a different reading of the Turing Machine based on Simondon's concept of individuation, repositioning its materiality from the abstract non-existent to the actual-virtual realm and investigating the reasons for its abstraction. It relates the emergence of multiple human-machine encounters in Turing's work to the complex counter-becoming of what it describes as 'the Turing Machine compound'.
265

Child-centred technologies as learning tools within the primary classroom : exploring the role of tablets and the potential of digital pens in schools

Mann, Anne-Marie January 2017 (has links)
This thesis provides insights into how technology can be and is used as a child-centred learning tool within primary school classrooms. The studies conducted look closely at how tablet technology is integrated into the modern classroom, and consider how existing digital writing technologies could support handwriting-based learning exercises in the future. This is achieved by conducting three in-the-wild studies, using different approaches, with a total of seventy-four children in school classrooms. In the first study, focus is placed on how tablets integrate into and with existing classroom practices, documenting when and how children use tablets in class. Relevant and complementary to this, the use of traditional writing tools is questioned and two further studies explore the potential and suitability of digital pens to support children's handwriting-based learning. One looks in detail at how children's handwriting is affected by different existing digital pen technologies. The other study, conducted through a creative, participatory design session, asks children to provide their opinions regarding desirable features for digital writing technology. The findings from this research classify and exemplify the role of tablets in the classroom, and explore potential design directions for digital writing tools which could be used by children in the future. This work may be useful and of interest to others who conduct research with children within the fields of Human Computer Interaction, Child Computer Interaction or education.
266

Calcul haute performance pour la détection de rayon Gamma / High Performance Computing for Gamma-Ray Detection

Aubert, Pierre 04 October 2018 (has links)
The new generation of physics experiments will produce an unprecedented amount of data. This increase in data throughput causes technical upheavals at every level, such as data storage, analysis, dissemination, and preservation. The CTA project will be the largest ground-based gamma-ray astronomy observatory from 2021 onwards. It will produce several hundred petabytes of data up to 2030, which will have to be analyzed, stored, compressed, and reanalyzed every year. This work shows how to optimize such physics analyses with high-performance computing techniques: an efficient data-format generator, low-level optimization of CPU pipeline usage together with vectorization of existing algorithms, a fast integer-compression algorithm, and finally a new data analysis based on an optimized image-comparison method.
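To give a flavour of what a fast integer-compression scheme can look like in this setting, here is a minimal NumPy sketch of delta encoding followed by fixed-width bit packing, a generic and vectorisable scheme chosen for illustration; it is not the algorithm from the thesis, and it assumes a non-decreasing stream of 32-bit values (so that the deltas are small and non-negative).

```python
import numpy as np

# Delta-encode a non-decreasing stream of uint32 values, then pack the
# residuals at the minimum fixed bit width for the block. A sketch of a
# generic scheme, not the thesis's algorithm.

def compress(values):
    values = np.asarray(values, dtype=np.uint32)
    deltas = np.diff(values, prepend=values[:1])         # small residuals
    bits = max(1, int(deltas.max()).bit_length())        # block bit width
    as_bits = np.unpackbits(deltas.astype('>u4').view(np.uint8))
    low = as_bits.reshape(-1, 32)[:, 32 - bits:]         # keep the low `bits` bits
    return values[0], bits, np.packbits(low.ravel()), len(values)

def decompress(first, bits, blob, n):
    raw = np.unpackbits(blob)[: n * bits].reshape(n, bits)
    full = np.zeros((n, 32), dtype=np.uint8)
    full[:, 32 - bits:] = raw                            # restore 32-bit layout
    deltas = np.packbits(full.ravel()).view('>u4').astype(np.uint32)
    return first + np.cumsum(deltas, dtype=np.uint32)    # undo the delta coding

data = np.array([100, 102, 103, 107, 110, 111], dtype=np.uint32)
first, bits, blob, n = compress(data)
assert np.array_equal(decompress(first, bits, blob, n), data)
print(bits, "bits per value instead of 32")
```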
