  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
111

An expansion of the 3D rendering discussion

Widell, Alex January 2018 (has links)
My work consists of a written part together with images of practical investigations and references. The thesis is best read in chronological order, with the images.
112

Tasks and visual techniques for the exploration of temporal graph data

Kerracher, Natalie January 2017 (has links)
This thesis considers the tasks involved in exploratory analysis of temporal graph data, and the visual techniques which are able to support these tasks. There has been an enormous increase in the amount and availability of graph (network) data, and in particular, graph data that is changing over time. Understanding the mechanisms involved in temporal change in a graph is of interest to a wide range of disciplines. While the application domain may differ, many of the underlying questions regarding the properties of the graph and mechanism of change are the same. The research area of temporal graph visualisation seeks to address the challenges involved in visually representing change in a graph over time. While most graph visualisation tools focus on static networks, recent research has been directed toward the development of temporal visualisation systems. By representing data using computer-generated graphical forms, Information Visualisation techniques harness human perceptual capabilities to recognise patterns, spot anomalies and outliers, and find relationships within the data. Interacting with these graphical representations allows individuals to explore large datasets and gain further insight into the relationships between different aspects of the data. Visual approaches are particularly relevant for Exploratory Data Analysis (EDA), where the person performing the analysis may be unfamiliar with the data set, and their goal is to make new discoveries and gain insight through its exploration. However, designing visual systems for EDA can be difficult, as the tasks which a person may wish to carry out during their analysis are not always known at the outset. Identifying and understanding the tasks involved in such a process has given rise to a number of task taxonomies which seek to elucidate the tasks and structure them in a useful way. While task taxonomies for static graph analysis exist, no suitable temporal graph taxonomy has yet been developed.
The first part of this thesis focuses on the development of such a taxonomy. Through the extension and instantiation of an existing formal task framework for general EDA, a task taxonomy and a task design space are developed specifically for exploration of temporal graph data. The resultant task framework is evaluated with respect to extant classifications and is shown to address a number of deficiencies in task coverage in existing works. Its usefulness in both the design and evaluation processes is also demonstrated. Much research currently surrounds the development of systems and techniques for visual exploration of temporal graphs, but little is known about how the different types of techniques relate to one another and which tasks they are able to support. The second part of this thesis focuses on the possibilities in this area: a design space of the possible visual encodings for temporal graph data is developed, and extant techniques are classified into this space, revealing potential combinations of encodings which have not yet been employed. These may prove interesting opportunities for further research and the development of novel techniques. The third part of this work addresses the need to understand the types of analysis the different visual techniques support, and indeed whether new techniques are required. The techniques which are able to support the different task dimensions are considered. This task-technique mapping reveals that visual exploration of temporal graph data requires techniques not only from temporal graph visualisation, but also from static graph visualisation and comparison, and temporal visualisation. A number of tasks which are unsupported or less well supported, and which could prove interesting opportunities for future research, are identified.
The taxonomies, design spaces, and mappings in this work bring order to the range of potential tasks of interest when exploring temporal graph data and the assortment of techniques developed to visualise this type of data, and are designed to be of use in both the design and evaluation of temporal graph visualisation systems.
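As a small illustration of the kind of temporal change such exploration tasks concern (a generic example, not a technique from this thesis), a temporal graph can be modelled as a sequence of edge-set snapshots, with change between consecutive snapshots measured by a simple edge-churn count:

```python
# Illustrative sketch: a temporal graph as a sequence of edge-set snapshots.
# The snapshot data and the churn measure are hypothetical examples, not
# taken from the thesis.

def edge_churn(g_old, g_new):
    """Number of edges added plus edges removed between two snapshots."""
    added = g_new - g_old
    removed = g_old - g_new
    return len(added) + len(removed)

# Three snapshots of an undirected graph (edges as frozensets of endpoints).
snapshots = [
    {frozenset({"a", "b"}), frozenset({"b", "c"})},
    {frozenset({"a", "b"}), frozenset({"b", "c"}), frozenset({"c", "d"})},
    {frozenset({"b", "c"}), frozenset({"c", "d"})},
]

churn = [edge_churn(g0, g1) for g0, g1 in zip(snapshots, snapshots[1:])]
print(churn)  # [1, 1]
```

Even this toy measure hints at why task taxonomies matter: "how much changed?" is a different exploration task from "what changed?" or "when did it change?", and each calls for different visual support.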
113

Apport des SIG et de la réalité virtuelle à la modélisation et la simulation du trafic urbain / GIS contribution and virtual reality to the modeling and to the simulation of urban traffic

Richard, Julien 21 March 2018 (has links)
Cartography and its related sciences are constantly evolving, adapting to new technologies and to an ever wider range of application domains. In this thesis we illustrate the application of geographic information systems (GIS) to 4D urban traffic simulation using new technologies such as virtual reality headsets. Road flow is expressed in equations through discrete simulations (car by car) and continuous simulations (treated as the flow of a fluid).
In the first part, we review the history of cartography, in particular the representation of the city over time. Urban traffic management is a crucial concern for urban planners, and its representation has evolved both through the use of ever more precise tools and in response to current issues. Growing urbanisation demands ever greater foresight about urban flows. The problem is not limited to road networks: urbanisation also affects other networks, such as sanitation systems that are undersized for the growth in households and impervious surfaces, drinking-water networks that must be continually renewed to meet inhabitants' needs, and more generally all underground networks. This part also studies the modelling of the road network with graphs and hypergraphs, which allows us to optimise the code: the chosen model, the HBDS representation developed by M. Bouillé, is close to object-oriented programming and lends itself to structuring a network well.
The following part describes the development criteria, from the choice of source data to the choice of programming languages. The choice of source data is essential for a simulation that comes as close as possible to reality. One goal of this work is to run simulations on various cities around the world, not only in France, so we analyse the available data to find the best information to feed the simulations. We then present the methods and tools implemented for this study: the organisation of the code and the geomatics tools that enable traffic simulation in the city. We designed numerous algorithms before coding, to optimise design time and strengthen the resulting model. This part also considers the contribution of a decision-support tool in this context, through the implementation of computer simulation tools and of an Expert System with an Artificial Intelligence module.
Finally, the visual results and the perspectives of this work are discussed. We describe the human-machine interface, which had to be as intuitive as possible: the interfaces of proprietary GIS are often very complex, since they must remain accessible while offering a wide range of tools for users' different domains of interest, whereas in our setting we can limit user interaction and focus on uses targeted at simulation. We also examine the use of visual-immersion principles such as stereoscopy, a principle still under-exploited in current GIS.
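The "car by car" (discrete) flow model mentioned in this abstract can be illustrated with the classic Nagel-Schreckenberg cellular automaton, used here purely as a generic example of discrete traffic simulation; the thesis's own GIS-based model is more elaborate:

```python
# Nagel-Schreckenberg traffic model on a circular road: accelerate, brake
# to the gap ahead, randomly slow down, then move. Parameters (road length,
# vmax, p_slow) are illustrative choices, not values from the thesis.
import random

LENGTH = 100  # number of cells on the circular road

def nasch_step(road, v, vmax=5, p_slow=0.3, rng=random.Random(42)):
    """One parallel update; road: sorted car positions, v: speeds per car."""
    n = len(road)
    new_road, new_v = [], []
    for i in range(n):
        gap = (road[(i + 1) % n] - road[i] - 1) % LENGTH  # free cells ahead
        vi = min(v[i] + 1, vmax)        # accelerate
        vi = min(vi, gap)               # brake to avoid collision
        if vi > 0 and rng.random() < p_slow:
            vi -= 1                     # random slowdown
        new_v.append(vi)
        new_road.append((road[i] + vi) % LENGTH)
    order = sorted(range(n), key=lambda i: new_road[i])  # re-sort after wrap
    return [new_road[i] for i in order], [new_v[i] for i in order]

road, v = [0, 10, 20, 30, 40], [0, 0, 0, 0, 0]
for _ in range(50):
    road, v = nasch_step(road, v)
print(road, v)
```

Because each car moves at most as far as the gap to its predecessor, positions stay collision-free, which is the essential invariant of any car-by-car scheme.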
114

Un modèle pour la composition d'applications de visualisation et d'interaction continue avec des simulations scientifiques / A model for composing applications of visualization and continuous interaction with scientific simulations

Turki, Ahmed 08 March 2012 (has links)
Computer simulation is an essential tool in the experimental sciences. The growing computing power of machines, combined with parallelism and advances in the mathematical modelling of physical phenomena, makes it possible to run ever more complex experiments virtually. In addition, the rise of GPU programming has greatly improved the quality and speed of rendering, which has helped make the graphical visualisation of simulation results commonplace. Scientific visualisation can be passive: the user follows the simulation's progress, or examines its results once the computation is finished. It can also be interactive, in which case the researcher can act on the simulation while it is running. Building such complex applications, however, is beyond the reach of most scientists who are not programmers. Component-based programming has for years been put forward as a solution to this problem: applications are built by interconnecting programs that perform elementary tasks. This thesis presents a component model and a method for composing interactive scientific visualisation applications. It focuses in particular on reconciling two major constraints in the coordination of these applications: performance and coherence.
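The component idea described in this abstract, small programs wired together into an application, can be sketched minimally as follows. The class, component names and wiring API are hypothetical illustrations, not the component model developed in the thesis:

```python
# Minimal sketch of component-based composition: each component wraps an
# elementary task and pushes its output to downstream components. The names
# and API here are invented for illustration only.

class Component:
    def __init__(self, name, func):
        self.name, self.func, self.subscribers = name, func, []

    def connect(self, other):
        """Wire this component's output to another component's input."""
        self.subscribers.append(other)

    def push(self, value):
        result = self.func(value)
        for sub in self.subscribers:
            sub.push(result)

collected = []
simulate = Component("simulate", lambda t: [t * 0.5, t * 2.0])    # produce data
filter_c = Component("filter", lambda xs: [x for x in xs if x > 1])
render = Component("render", collected.extend)                     # consume data

simulate.connect(filter_c)
filter_c.connect(render)

for t in range(4):        # drive the pipeline with four time steps
    simulate.push(t)
print(collected)          # values that survived the filter, in arrival order
```

A real interactive-visualisation pipeline adds exactly the concerns the thesis studies: components run concurrently, so the coordination layer must balance throughput (performance) against keeping the displayed state consistent with the simulation (coherence).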
115

Partitionnement de grands graphes : mesures, algorithmes et visualisation / Graph Partitioning : measures, algorithms and visualization

Queyroi, François 10 October 2013 (has links)
Network analysis is an important step in understanding complex systems studied in many disciplines, such as biology, geography and sociology. This thesis focuses on the decomposition of such networks when they are modelled as graphs. Graph decomposition methods are useful for data compression, community detection and graph visualisation. One possible decomposition is a hierarchical partition of the set of vertices. We address the evaluation of the quality of such structures (their ability to capture the topology of the graph) by means of quality measures, then discuss the use of these measures as objective functions to maximise in partitioning algorithms. Finally, we consider the design of effective visual metaphors for representing different graph decompositions.
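A concrete example of the kind of quality measure discussed here is Newman-Girvan modularity, a widely used score of how well a (flat) partition captures a graph's topology. It is shown as a generic illustration; the thesis itself treats hierarchical generalisations of such measures:

```python
# Newman-Girvan modularity of a flat vertex partition: fraction of edges
# inside parts, minus the fraction expected if edges were placed at random
# with the same vertex degrees. Example graph chosen for illustration.

def modularity(edges, partition):
    """edges: undirected (u, v) pairs; partition: dict vertex -> part id."""
    m = len(edges)
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    intra = sum(1 for u, v in edges if partition[u] == partition[v]) / m
    expected = 0.0
    for p in set(partition.values()):
        dp = sum(d for vtx, d in degree.items() if partition[vtx] == p)
        expected += (dp / (2 * m)) ** 2
    return intra - expected

# Two triangles joined by a single bridge edge: a natural two-community graph.
edges = [("a", "b"), ("b", "c"), ("a", "c"),
         ("d", "e"), ("e", "f"), ("d", "f"),
         ("c", "d")]
partition = {"a": 0, "b": 0, "c": 0, "d": 1, "e": 1, "f": 1}
print(round(modularity(edges, partition), 3))  # 0.357
```

Used as an objective function, a partitioning algorithm searches for the assignment of vertices to parts that maximises this score.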
116

Online Anomaly Detection

Ståhl, Björn January 2006 (has links)
The role of software-intensive systems has shifted: the traditional one, fulfilling isolated computational tasks, is gradually giving way to larger collaborative societies in which interaction is the primary resource. This can be observed in anything from logistics to rescue operations and resource management, numerous services with key roles in the modern infrastructure. In the light of this new collaborative order, it is imperative that the tools (compilers, debuggers, profilers) and methods (requirements, design, implementation, testing) that supported traditional software engineering values also adjust and extend towards those required by the online instrumentation of software-intensive systems; that is, to help avoid situations where limitations in technology and methodology would prevent us from ascertaining the well-being and security of systems that assist our very lives. Coupled with most perspectives on software development and maintenance is one well-established member of, and complement to, the development process: debugging, the art of discovering, localising and correcting undesirable behaviours in software-intensive systems, the need for which tends to far outlive development itself. Debugging is currently performed on the premise that the developer operates from a god-like perspective, one that implies access to and knowledge of the source code, along with minute control over execution properties. However, both the quality and the accessibility of such information steadily decline over time, as requirements, implementations, hardware components and their associated developers all fall behind their continuously evolving surroundings. This thesis argues that the current practice of software debugging is insufficient and, as a precursory action, introduces a technical platform suitable for experimenting with future methods of online debugging, maintenance and analysis.
An initial implementation of this platform is then used to experiment with a simple method targeting online observation of software behaviour.
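As a toy illustration of what "online observation of software behaviour" can mean in practice (a generic textbook technique, not the platform or method built in the thesis), a running mean and variance maintained incrementally can flag anomalous measurements as they stream in:

```python
# Toy online anomaly detector using Welford's incremental mean/variance.
# Flags a sample if it lies more than k standard deviations from the
# running mean. Purely illustrative; not the method of the thesis.
import math

class OnlineDetector:
    def __init__(self, k=3.0):
        self.k, self.n, self.mean, self.m2 = k, 0, 0.0, 0.0

    def observe(self, x):
        anomalous = False
        if self.n >= 2:
            std = math.sqrt(self.m2 / (self.n - 1))
            anomalous = std > 0 and abs(x - self.mean) > self.k * std
        # Welford update: numerically stable, O(1) memory per metric
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

det = OnlineDetector(k=3.0)
stream = [10.0, 10.2, 9.9, 10.1, 10.0, 9.8, 10.1, 50.0]  # e.g. response times
flags = [det.observe(x) for x in stream]
print(flags)  # only the final spike (50.0) is flagged
```

The appeal of such online methods is precisely the point the thesis makes: they require no source code or god-like control over the target, only a stream of observations from the running system.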
117

Vliv vizuálních představ na formování poslechových dovedností žáků v HV na 1. stupni ZŠ / Influence of visual images on forming listening skills of elementary school music students

Ejemová, Barbora January 2016 (has links)
This diploma thesis deals with the use of visualisation while listening to music in music lessons at elementary school. The thesis contains suggestions both for visualisation-based listening work and for listening work without visualisation. The theoretical part provides the background of the topic: it covers the development of listening activities, describes their contemporary conception and deals with their psychological aspects. The practical part comprises research surveys with didactic materials. Included with the work is a CD with listening tests, presentations and video streams adapted for presentation during classes.
118

Data visualisation in digital forensics

Fei, B.K.L. (Bennie Kar Leung) 07 March 2007 (has links)
As digital crimes have risen, so has the need for digital forensics. Numerous state-of-the-art tools have been developed to assist digital investigators conduct proper investigations into digital crimes. However, digital investigations are becoming increasingly complex and time consuming due to the amount of data involved, and digital investigators can find themselves unable to conduct them in an appropriately efficient and effective manner. This situation has prompted the need for new tools capable of handling such large, complex investigations. Data mining is one such potential tool. It is still relatively unexplored from a digital forensics perspective, but the purpose of data mining is to discover new knowledge from data where the dimensionality, complexity or volume of data is prohibitively large for manual analysis. This study assesses the self-organising map (SOM), a neural network model and data mining technique that could potentially offer tremendous benefits to digital forensics. The focus of this study is to demonstrate how the SOM can help digital investigators to make better decisions and conduct the forensic analysis process more efficiently and effectively during a digital investigation. The SOM’s visualisation capabilities can not only be used to reveal interesting patterns, but can also serve as a platform for further, interactive analysis. / Dissertation (MSc (Computer Science))--University of Pretoria, 2007. / Computer Science / unrestricted
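The self-organising map assessed in this dissertation can be sketched minimally as follows: a tiny 1-D SOM trained on 2-D points, written from the standard SOM algorithm for illustration only (the study applies SOMs to forensic data with dedicated tooling, not this code):

```python
# Minimal 1-D self-organising map: repeatedly pick a sample, find the
# best-matching unit (BMU), and pull the BMU and its neighbours toward the
# sample with a decaying learning rate and neighbourhood radius.
import math
import random

def train_som(data, n_units=4, epochs=200, rng=random.Random(0)):
    """Train a tiny 1-D SOM on 2-D points; returns the unit weight vectors."""
    weights = [[rng.random(), rng.random()] for _ in range(n_units)]
    for epoch in range(epochs):
        lr = 0.5 * (1 - epoch / epochs)                 # decaying learning rate
        radius = max(1.0, n_units / 2 * (1 - epoch / epochs))
        x = rng.choice(data)
        bmu = min(range(n_units),                       # best-matching unit
                  key=lambda i: (weights[i][0] - x[0]) ** 2
                              + (weights[i][1] - x[1]) ** 2)
        for i in range(n_units):                        # neighbourhood update
            h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
            weights[i][0] += lr * h * (x[0] - weights[i][0])
            weights[i][1] += lr * h * (x[1] - weights[i][1])
    return weights

# Two well-separated clusters of points, standing in for forensic records.
data = [(0.1, 0.1), (0.15, 0.05), (0.05, 0.12),
        (0.9, 0.9), (0.85, 0.95), (0.95, 0.88)]
w = train_som(data)
print(w)
```

The visualisation value comes from the trained map: neighbouring units hold similar weight vectors, so plotting the units (or the distances between them) reveals cluster structure in data too large to inspect record by record.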
119

User-centred security event visualisation / Visualisation d'événements de sécurité centrée autour de l'utilisateur

Humphries, Christopher 08 December 2015 (has links)
Managing the vast quantities of data generated in the context of information system security becomes more difficult every day. Visualisation tools can help face this challenge: they represent large quantities of data and security events in a synthetic, and often aesthetic, way to make them easier to understand and manipulate. In this document, we first present a classification of security visualisation tools according to their objectives, which fall into three categories: monitoring (following events in real time to identify attacks as early as possible while they unfold), exploration (browsing and manipulating a large quantity of data a posteriori to discover the important events) and reporting (representing already known information a posteriori in a clear and synthetic fashion to ease its communication and transmission). We then present ELVis, a tool capable of coherently representing security events from a variety of sources. ELVis automatically proposes appropriate representations according to the type of the data (time, IP address, port, data volume, etc.), and can be extended to accept new data sources. Lastly, we present CORGI, an extension of ELVis that allows several data sources to be manipulated simultaneously in order to correlate them. With CORGI, security events from one data source can be filtered according to criteria derived from the analysis of events from another data source, making it easier to follow events across the information system under analysis.
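The idea of automatically proposing a representation from a field's type can be sketched as a simple dispatch table. The field names and chart choices below are hypothetical illustrations in the spirit of the approach; ELVis's actual type system and proposal logic are richer:

```python
# Hypothetical sketch of type-driven representation choice: each field of a
# security event source has a type, and each type maps to a default visual
# representation. All names here are invented for illustration.

FIELD_TYPES = {
    "timestamp": "time",
    "src_ip": "ip_address",
    "dst_port": "port",
    "bytes": "quantity",
}

DEFAULT_CHART = {
    "time": "timeline",
    "ip_address": "treemap",     # IP space is hierarchical: /8, /16, /24
    "port": "bar_chart",
    "quantity": "histogram",
}

def propose(fields):
    """Suggest a representation for each field of an event source."""
    return {f: DEFAULT_CHART[FIELD_TYPES[f]] for f in fields}

print(propose(["timestamp", "src_ip", "bytes"]))
# {'timestamp': 'timeline', 'src_ip': 'treemap', 'bytes': 'histogram'}
```

Extending such a tool to a new data source then reduces to declaring the types of its fields, after which sensible default views follow automatically.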
120

Enhanced visualisation techniques to support access to personal information across multiple devices

Beets, Simone Yvonne January 2014 (has links)
The growing number of devices owned by a single user makes it increasingly difficult to access, organise and visualise personal information (PI), i.e. documents and media, across these devices. The primary method that is currently used to organise and visualise PI is the hierarchical folder structure, which is a familiar and widely used means to manage PI. However, this hierarchy does not effectively support personal information management (PIM) across multiple devices. Current solutions, such as the Personal Information Dashboard and Stuff I’ve Seen, do not support PIM across multiple devices. Alternative PIM tools, such as Dropbox and TeamViewer, attempt to provide a means of accessing PI across multiple devices, but these solutions also suffer from several limitations. The aim of this research was to investigate to what extent enhanced information visualisation (IV) techniques could be used to support accessing PI across multiple devices. An interview study was conducted to identify how PI is currently managed across multiple devices. This interview study further motivated the need for a tool to support visualising PI across multiple devices and identified requirements for such an IV tool. Several suitable IV techniques were selected and enhanced to support PIM across multiple devices. These techniques comprised an Overview using a nested circles layout, a Tag Cloud and a Partition Layout, which used a novel set-based technique. A prototype, called MyPSI, was designed and implemented incorporating these enhanced IV techniques. The requirements and design of the MyPSI prototype were validated using a conceptual walkthrough. The design of the MyPSI prototype was initially implemented for a desktop or laptop device with mouse-based interaction. A sample personal space of information (PSI) was used to evaluate the prototype in a controlled user study. The user study was used to identify any usability problems with the MyPSI prototype.
The results were highly positive and the participants agreed that such a tool could be useful in future. No major problems were identified with the prototype. The MyPSI prototype was then implemented on a mobile device, specifically an Android tablet device, using a similar design, but supporting touch-based interaction. Users were allowed to upload their own PSI using Dropbox, which was visualised by the MyPSI prototype. A field study was conducted following the Multi-dimensional In-depth Long-term Case Studies approach specifically designed for IV evaluation. The field study was conducted over a two-week period, evaluating both the desktop and mobile versions of the MyPSI prototype. Both versions received positive results, but the desktop version was slightly preferred over the mobile version, mainly due to familiarity and problems experienced with the mobile implementation. Design recommendations were derived to inform future designs of IV tools to support accessing PI across multiple devices. This research has shown that IV techniques can be enhanced to effectively support accessing PI across multiple devices. Future work will involve customising the MyPSI prototype for mobile phones and supporting additional platforms.
