About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Automated Visualisation of Product Deployment / Automatisk Visualisering av Produkt Distribution

Chowdary, Milton January 2022 (has links)
The development of large products, whether software or hardware, faces many challenges. Two of these are keeping everyone involved up to date on the latest developments and maintaining a clear overview of the product's components. A proposed solution is a graph presenting all the necessary information about the product. The problem with a graph of a constantly changing product is that it requires considerable maintenance to stay current. This thesis presents the implementation of a software tool for Ericsson that automatically gathers the required information about a given product and creates a graph to present it. The software traverses a file structure containing information about a product and stores it. This information is then used to create two different graphs: a tree graph and a box graph. The graphs were evaluated, both by the author and by the team at Ericsson, against visualisation principles. The results show that the automatically gathered information is effective and can communicate the information needed. The tree graph received slightly more favourable reviews than the currently available, manually created graph. However, layout limitations in the visualisation tool made the graphs larger than necessary and therefore harder to understand; other visualisation tools could be considered to achieve a better result. The software created tree graphs that are usable at Ericsson and could prove helpful for development.
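The abstract describes a traverse-and-store step followed by rendering the result as a tree graph. A minimal sketch of that idea, using Python's standard library (this is an illustration of the general approach, not the thesis software itself):

```python
import os

def build_tree(root):
    """Recursively collect a directory tree as nested dicts.

    Directories map to dicts of their children; files map to None.
    Illustrative only: the thesis tool reads product-specific metadata,
    not arbitrary directories.
    """
    tree = {}
    for entry in sorted(os.listdir(root)):
        path = os.path.join(root, entry)
        if os.path.isdir(path):
            tree[entry] = build_tree(path)
        else:
            tree[entry] = None
    return tree

def render_tree(tree, indent=0):
    """Flatten the nested dict into indented lines (a textual tree graph)."""
    lines = []
    for name, children in tree.items():
        lines.append("  " * indent + name)
        if isinstance(children, dict):
            lines.extend(render_tree(children, indent + 1))
    return lines
```

A real implementation would feed the stored structure to a graph-drawing library instead of printing indented text, but the two-phase design (gather, then render) is the same.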
52

Visualisations pour la veille en épidémiologie animale / Visualizations for animal epidemiology surveillance

Fadloun, Samiha 15 November 2018 (has links)
Many documents concerning the emergence, spread or follow-up of human and animal diseases are published daily on the Web. In order to prevent the spread of disease, epidemiologists must constantly search for these documents and analyse them to detect outbreaks as early as possible. In this thesis, we are interested in the two activities involved in this monitoring work, in order to produce visual tools that ease and speed up access to relevant information. We focus on animal diseases, which have been studied less but can have serious consequences for human activities (diseases transmitted from animals to humans, epidemics in livestock, ...). The first activity is to collect documents from the Web. For this, we propose EpidVis, a visual tool that allows epidemiologists to group and organise the keywords used in their searches, visually build complex queries, launch them on different search engines and view the returned results. The second activity is to explore a large number of documents concerning diseases. These documents contain not only information such as disease names, associated symptoms and infected species, but also spatio-temporal information. We propose EpidNews, a visual analytics tool to explore this data and extract information from it. Both tools were developed in close collaboration with experts in epidemiology, who carried out case studies showing that the proposed functionalities were well adapted and made it easy to extract knowledge.
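EpidVis lets epidemiologists group keywords and combine them into complex search-engine queries. A hedged sketch of how grouped keywords might be assembled into a boolean query string (the grouping scheme and operators here are assumptions for illustration, not EpidVis's actual logic):

```python
def build_query(diseases, symptoms=None, species=None):
    """Combine keyword groups into one boolean search string.

    Each group is OR-ed internally and AND-ed with the other groups,
    mirroring the grouped-keyword idea described in the abstract.
    Multi-word terms are quoted so search engines treat them as phrases.
    """
    def group(terms):
        quoted = ['"%s"' % t if " " in t else t for t in terms]
        return "(" + " OR ".join(quoted) + ")"

    parts = [group(diseases)]
    for extra in (symptoms, species):
        if extra:
            parts.append(group(extra))
    return " AND ".join(parts)
```

For example, `build_query(["avian influenza", "H5N1"], symptoms=["mortality"])` yields a single string that can be submitted verbatim to several search engines, which is the "launch on different engines" step the abstract mentions.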
53

Planification visuelle et interactive d'interventions dans des environnements d'accélérateur de particules émettant des rayonnements ionisants / Interactive visual intervention planning in particle accelerator environments with ionizing radiation

Fabry, Thomas 30 May 2014 (has links)
Radiation is omnipresent. It has many applications in varied domains: in medicine, where it allows diagnosing and curing patients; in communication, where modern systems make use of electromagnetic radiation; and in science, where researchers use it to discover the composition and structure of materials, to name a few. Physically, radiation is a process in which particles or waves travel through a material. Radiation can be very energetic, in which case it can break the atoms of ordinary matter (ionization); such radiation is called ionizing. Ionizing radiation can be far more harmful to living beings than non-ionizing radiation, and it is the subject of this dissertation. Radioactivity, the emission of ionizing radiation, is a natural phenomenon: radiation emerges from the soil, it is present in the air, and the whole planet constantly undergoes streams of energetic cosmic radiation. Since the beginning of the twentieth century, we have also been able to create radioactive matter artificially. This has opened up many technological opportunities, but has also given humanity a tremendous responsibility, as the nuclear accidents in Chernobyl and Fukushima and various accidents in the medical world have made clear. This has led to the elaboration of a radiological protection system. In practice, radiological protection is mostly implemented using the methodology known by the acronym ALARA: As Low As Reasonably Achievable. It consists of justifying, optimizing and limiting the radiation dose received, and is applied in conjunction with the legal limits. The word "reasonably" means that the optimization of radiation exposure has to be seen in context: it is constrained by the requirement that the benefit of an operation must outweigh the negative effects caused by the radiation. Several industrial and scientific facilities emit ionizing radiation, and most of them need maintenance operations. In the spirit of ALARA, these interventions must be optimized to reduce the maintenance workers' exposure to ionizing radiation. This optimization cannot be automated, since the feasibility of the intervention tasks requires human assessment; intervention planning can, however, be facilitated by technical-scientific means such as software tools.
In this context, this thesis provides technical-scientific considerations, methodologies and software tools for the implementation of radiation protection. In particular, it addresses the need for an interactive visual intervention planning tool in the context of high-energy particle accelerator facilities.
54

Where Social Networks, Graph Rewriting and Visualisation Meet : Application to Network Generation and Information Diffusion / Quand les réseaux sociaux, la réécriture de graphes et la visualisation se rencontrent : application à la génération de réseaux et à la diffusion d'information.

Vallet, Jason 07 December 2017 (has links)
In this thesis, we present a collection of network generation and information diffusion models expressed using a specific formalism called strategic located graph rewriting, as well as a novel layout algorithm for showing the results of information diffusion in large social networks.
Graphs are extremely versatile mathematical objects that can be used to represent a wide variety of abstract systems. They can be transformed in multiple ways (e.g., creating new elements, merging or altering existing ones), but such modifications must be controlled to avoid unwanted operations. To ensure this, we use the formalism of strategic graph rewriting: a rewriting system operates on a graph, which can be transformed according to a set of rules, with a strategy steering the transformation process. First, we adapt two network generation algorithms, which produce networks with small-world characteristics. Then, we translate different models of information diffusion in social networks into the rewriting formalism. Expressing different algorithms in a common formalism makes them easier to compare and their parameters easier to adjust. Finally, we present a novel compact layout method for large social networks, used to display overviews of the results of our information diffusion methods.
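The rules-plus-strategy idea can be made concrete with a toy engine. In the sketch below a graph is a set of undirected edges, a rule returns edges it would add, and a "first applicable rule" strategy drives rewriting to a normal form. This only hints at the formalism: the strategic graph rewriting of the thesis also supports deletion, node creation and richer strategies.

```python
def rewrite(graph, rules, max_steps=100):
    """Apply rewrite rules under a simple 'first applicable' strategy.

    `graph` is a set of undirected edges (frozensets of two nodes);
    each rule maps the graph to a set of edges it wants present.
    Rewriting stops when no rule adds anything (normal form).
    """
    for _ in range(max_steps):
        for rule in rules:
            new_edges = rule(graph) - graph
            if new_edges:
                graph = graph | new_edges
                break
        else:
            return graph  # no rule applicable: normal form reached
    return graph

def triadic_closure(graph):
    """Example rule: close open triangles (a-b and b-c imply a-c)."""
    nodes = {n for e in graph for n in e}
    closure = set()
    for b in nodes:
        neighbours = [n for n in nodes if frozenset((b, n)) in graph]
        for a in neighbours:
            for c in neighbours:
                if a != c:
                    closure.add(frozenset((a, c)))
    return closure
```

Triadic closure is chosen only as a recognisable example of a local rule; the thesis's generation models (small-world networks) and diffusion models would each be a different rule set run under a different strategy.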
55

Widening stakeholder involvement : exploiting interactive 3D visualisation and protocol buffers in geo-computing

McCreadie, Christopher Andrew January 2014 (has links)
Land use change has an impact on regional sustainability, which can be assessed using social, economic and environmental indicators. Stakeholder engagement tools provide a platform that can demonstrate the possible future impacts of land use change, better informing stakeholder groups of the effects of policy changes or plausible climatic variations. To date, some engagement tools are difficult to use or understand and lack user interaction, while other tools couple the model environment tightly to the user interface, resulting in poor performance. The research and development described herein relate to the development and testing of a visualisation engine for rendering the output of an Agent Based Model (ABM) as a 3D virtual environment via a loosely coupled, data-driven communications protocol called Protocol Buffers. The tool, named the Rural Sustainability Visualisation Tool (R.S.V.T), primarily aims to enhance non-expert knowledge and understanding of the effects of land use change, driven by farmer decision making, on the sustainability of a region. Communication protocols are evaluated, and Protocol Buffers, a binary-based communications protocol, is selected, based on speed of object serialisation and data transfer, to pass messages from the ABM to the 3D virtual environment. Early comparative testing of R.S.V.T and its 2D counterpart RepastS shows that R.S.V.T and its loosely coupled approach offer better performance when rendering land use scenes. The flexibility of Protocol Buffers and MongoDB is also shown to have positive performance implications for the storage and running of loosely coupled model simulations. A 3D graphics Application Programming Interface (API), commonly used in the development of computer games technology, is selected to develop the virtual environment.
Multiple visualisation methods, designed to enhance stakeholder engagement and understanding, are developed and tested to determine their suitability in terms of both user preference and information retrieval. The application of a prototype is demonstrated using a case study based in the Lunan catchment in Scotland, which has water quality and biodiversity issues due to intensive agriculture. The region is modelled using three scenario storylines that broadly describe plausible futures: Business as Might Be Usual (BAMBU), Growth Applied Strategy (GRAS) and the Sustainable European Development Goal (SEDG). The performance of the tool is assessed, and it is found that R.S.V.T can run faster than its 2D equivalent when loosely coupled with a 3D virtual environment. The 3D virtual environment and its associated visualisation methods are assessed with non-expert stakeholder groups, showing that 3D ABM output is generally preferred to 2D ABM output. Insights are also gained into the most appropriate visualisation techniques for agricultural landscapes. Finally, the benefit of a loosely coupled approach to the visualisation of model data is demonstrated through the performance of Protocol Buffers during testing, which shows it is capable of transferring large amounts of model data to a bespoke visual front end.
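Protocol Buffers was chosen here for its fast binary serialisation; it generates serialiser classes from a `.proto` schema. As a rough stand-in illustrating why a fixed binary wire format is smaller and cheaper to parse than a text format (the message fields below are hypothetical, not the thesis's actual schema), a land-use cell update can be packed with Python's `struct`:

```python
import json
import struct

# Hypothetical land-use update message: cell id, land-use code, model tick.
FMT = "<IHI"  # little-endian: uint32 cell, uint16 code, uint32 tick

def encode_update(cell, code, tick):
    """Binary encoding, similar in spirit to a Protocol Buffers message."""
    return struct.pack(FMT, cell, code, tick)

def decode_update(payload):
    """Recover the (cell, code, tick) triple from the wire bytes."""
    return struct.unpack(FMT, payload)

def encode_json(cell, code, tick):
    """The same update as JSON, for a size comparison."""
    return json.dumps({"cell": cell, "code": code, "tick": tick}).encode()
```

The binary payload is a fixed 10 bytes, against roughly four times that for the JSON form. Protocol Buffers goes further than this sketch, adding varint encoding and field tags so that messages stay decodable as the schema evolves.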
56

Measuring comprehension of abstract data visualisations

Shovman, Mark January 2011 (has links)
Common visualisation techniques such as bar-charts and scatter-plots are not sufficient for visual analysis of large sets of complex multidimensional data. Technological advancements have led to a proliferation of novel visualisation tools and techniques that attempt to meet this need. A crucial requirement for efficient visualisation tool design is the development of objective criteria for visualisation quality, informed by research in human perception and cognition. This thesis presents a multidisciplinary approach to address this requirement, underpinning the design and implementation of visualisation software with the theory and methodology of cognitive science. An opening survey of visualisation practices in the research environment identifies three primary uses of visualisations: the detection of outliers, the detection of clusters and the detection of trends. This finding, in turn, leads to a formulation of a cognitive account of the visualisation comprehension processes, founded upon established theories of visual perception and reading comprehension. Finally, a psychophysical methodology for objectively assessing visualisation efficiency is developed and used to test the efficiency of a specific visualisation technique, namely an interactive three-dimensional scatterplot, in a series of four experiments. The outcomes of the empirical study are three-fold. On a concrete applicable level, three-dimensional scatterplots are found to be efficient in trend detection but not in outlier detection. On a methodological level, ‘pop-out’ methodology is shown to be suitable for assessing visualisation efficiency. On a theoretical level, the cognitive account of visualisation comprehension processes is enhanced by empirical findings, e.g. the significance of the learning curve parameters. All these provide a contribution to a ‘science of visualisation’ as a coherent scientific paradigm, both benefiting fundamental science and meeting an applied need.
57

Visualisation et traitements interactifs de grilles régulières 3D haute-résolution virtualisées sur GPU. Application aux données biomédicales pour la microscopie virtuelle en environnement HPC. / Interactive visualisation and processing of high-resolution regular 3D grids virtualised on GPU. Application to biomedical data for virtual microscopy in HPC environment.

Courilleau, Nicolas 29 August 2019 (has links)
Data visualisation is an essential aspect of scientific research in many fields. It helps in understanding observed or simulated phenomena and in extracting information from them, for purposes such as experimental validation or simply project review. The focus of this thesis is the visualisation of volume data in medical and biomedical imaging. The acquisition devices used generate scalar or vector fields represented as regular 3D grids, and their increasing accuracy implies an increasing size of the volume data, which requires adapting the visualisation algorithms to manage such volumes. Moreover, visualisation mostly relies on GPUs because they are well suited to these problems, yet they possess a very limited amount of memory compared to the generated volume data. The question then arises of how to dissociate the computation units, which perform the visualisation, from those used for storage. Algorithms based on the so-called "out-of-core" principle are the solution for managing large volume data sets. In this thesis, we propose a complete GPU-based pipeline allowing real-time visualisation and processing of volume data significantly larger than the CPU and GPU memory capacities. The interest of the pipeline comes from its GPU-based out-of-core addressing structure, which virtualises the memory and is particularly well suited to volume data management. The virtual addressing structure is entirely managed and maintained on the GPU. We validate our approach using several real-time visualisation and processing applications. First, we propose an interactive virtual microscope allowing 3D auto-stereoscopic visualisation of stacks of high-resolution images. Then, we verify the adaptability of our structure to all data types with a multimodal virtual microscope. Finally, we demonstrate the multi-role capabilities of our structure through a concurrent real-time visualisation and processing application.
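The out-of-core principle can be sketched in a few lines: split the huge grid into bricks, keep only recently used bricks resident, and fetch the rest on demand. In the thesis the equivalent page table lives on the GPU; the CPU-side toy below (brick size, capacity and loader are assumptions) only shows the virtualisation idea, using an LRU eviction policy.

```python
from collections import OrderedDict

class BrickCache:
    """Toy out-of-core cache for a virtualised 3D grid.

    The grid is split into cubic bricks; at most `capacity` bricks
    stay resident (standing in for limited GPU memory), and misses
    call `loader` to fetch a brick from slower storage.
    """
    def __init__(self, loader, capacity=4, brick=32):
        self.loader = loader
        self.capacity = capacity
        self.brick = brick
        self.cache = OrderedDict()  # brick coords -> brick data
        self.misses = 0

    def sample(self, x, y, z):
        key = (x // self.brick, y // self.brick, z // self.brick)
        if key in self.cache:
            self.cache.move_to_end(key)        # mark as recently used
        else:
            self.misses += 1
            if len(self.cache) >= self.capacity:
                self.cache.popitem(last=False)  # evict LRU brick
            self.cache[key] = self.loader(key)
        data = self.cache[key]
        return data[(x % self.brick, y % self.brick, z % self.brick)]
```

A renderer sampling spatially coherent rays mostly hits resident bricks, which is what makes real-time visualisation of data far larger than GPU memory feasible.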
58

Feature selection through visualisation for the classification of online reviews

Koka, Keerthika 17 April 2017 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / The purpose of this work is to show that visualization can be at least as powerful as the best automatic feature selection algorithms. This is demonstrated by applying our visualization technique to the classification of online reviews into fake and genuine. The technique uses radial charts and color overlap to explore feature selection for classification visually: every review is treated as a translucent red or blue radial membrane, with its dimensions determining the shape of the membrane. This work also shows how dimension ordering and combination matter in the feature selection process. In brief, the idea is to give each text review a structure based on certain attributes, to compare how similar or different the structures of the same or different categories are, and to highlight the key features that contribute most to the classification. Colors and saturation aid the feature selection process. Our visualization technique helps the user gain insight into high-dimensional data by providing means to eliminate the worst features immediately, pick good features without statistical aids, and understand the behavior of the dimensions in different combinations.
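The "radial membrane" is essentially a star plot: each feature gets an axis, and the feature value sets the vertex distance along that axis. A small sketch of that geometry (evenly spaced axes and pre-normalised values are assumptions, since the abstract does not fix the layout):

```python
import math

def radial_polygon(values, radius=1.0):
    """Map an n-dimensional feature vector to star-plot vertices.

    Dimension i is drawn on the axis at angle 2*pi*i/n; the vertex
    sits at a distance proportional to the (normalised) feature value.
    Overlaying many such polygons as translucent red/blue shapes is
    the visual-comparison idea described in the abstract.
    """
    n = len(values)
    vertices = []
    for i, v in enumerate(values):
        angle = 2 * math.pi * i / n
        vertices.append((radius * v * math.cos(angle),
                         radius * v * math.sin(angle)))
    return vertices
```

Because the axis order changes the polygon's shape, reordering dimensions changes how separable the red and blue overlays look, which is why the abstract stresses that dimension ordering and combination matter.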
59

Flow Imaging of the Fluid Mechanics of Multilayer Slide Coating: Flow visualisation of layer formation in a 3-layer slide coating die, measurement of their thicknesses, and interfacial and free surface flow instabilities

Alpin, Richard P. January 2016 (has links)
Coating several films simultaneously on top of each other onto a moving substrate is a challenging exercise, because, depending on operating conditions (the thickness and velocity of the individual layers and the physical properties of the coating fluids), flow instabilities may arise at the interfaces between the layers and on the top layer. These instabilities ruin the final multi-layered coating and must be avoided. This research addresses this coating flow situation and seeks to develop guidelines for avoiding these instabilities. Following a critical literature survey, this thesis presents a novel experimental method that visualises multi-layered coating flow down an inclined multi-slot die. The visualisation is obtained using a unique configuration comprising a high-speed camera, a telecentric objective lens and illumination. The results show that for a single layer, as the die angle and Reynolds number increase, the flow becomes more unstable, and that for a dual-layer flow, as Re increases, the peak-to-peak amplitude and frequency decrease at the free surface and interface; the latter was unexpected and does not conform to the existing literature. The triple-layer results show that viscosity stratifications that either increase monotonically or increase from the first to the second layer are the most stable flows, along with flow heights in the first and second layers of <22% and >18% of the total thickness respectively, which concurs with the current literature. The visualisation additionally captured other instabilities, including single-layer back-wetting and vortices, and multilayer slot invasion, with the findings concurring with the current literature. / EPSRC/Tata Steel Industrial CASE Studentship; EP/J501840/1
60

Danish Companies Dashboard: An Interactive, Geospatial Visualisation of Industries and Profit in Denmark

Hyrup, Tobias, Matthews, Pernille, Nguyen, David Nhan Thien, Kusnick, Jakob, Jänicke, Stefan 07 July 2022 (has links)
Profound knowledge of the business landscape is crucial for any company wanting to affect its position in the market. Although the corresponding data is publicly available, visual interfaces that inform on the distribution of companies operating in different sectors are rare. To close this gap for the Danish market, we developed the Danish Companies Dashboard (DCD), which uses the Danish Business Authority's database of company data to visually explore how the different companies, grouped by industry, are geographically distributed across Denmark at the regional and municipal level. Moreover, the study and the accompanying visualisations provide insights into how the profit of each industry and company differs across regions and municipalities, thereby supporting the strategic decision-making tasks of industry stakeholders.
