51 |
Somatic Variant Discovery from WES Data Using Control-FREEC. Jumah, K., Kamieniecka, K., Maier, W., Poterlowicz, K. January 2024 (has links)
No
|
52 |
QualDash: Adaptable Generation of Visualisation Dashboards for Healthcare Quality Improvement. Elshehaly, Mai, Randell, Rebecca, Brehmer, M., McVey, Lynn, Alvarado, Natasha, Gale, C.P., Ruddle, R.A. 07 September 2020 (has links)
Yes / Adapting dashboard design to different contexts of use is an open question in visualisation research. Dashboard designers often seek to strike a balance between dashboard adaptability and ease-of-use, and in hospitals challenges arise from the vast diversity of key metrics, data models and users involved at different organizational levels. In this design study, we present QualDash, a dashboard generation engine that allows for the dynamic configuration and deployment of visualisation dashboards for healthcare quality improvement (QI). We present a rigorous task analysis based on interviews with healthcare professionals, a co-design workshop and a series of one-on-one meetings with front line analysts. From these activities we define a metric card metaphor as a unit of visual analysis in healthcare QI, using this concept as a building block for generating highly adaptable dashboards, and leading to the design of a Metric Specification Structure (MSS). Each MSS is a JSON structure which enables dashboard authors to concisely configure unit-specific variants of a metric card, while offloading common patterns that are shared across cards to be preset by the engine. We reflect on deploying and iterating the design of QualDash in cardiology wards and pediatric intensive care units of five NHS hospitals. Finally, we report evaluation results that demonstrate the adaptability, ease-of-use and usefulness of QualDash in a real-world scenario.
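As a rough illustration (not taken from the thesis), the Metric Specification Structure idea can be sketched as a JSON document whose unit-specific entries override engine-wide presets; all field names below are invented for the example.

```python
import json

# Hypothetical field names -- the abstract only says an MSS is a JSON
# structure configuring unit-specific variants of a metric card.
ENGINE_PRESETS = {
    "chartType": "line",        # common pattern preset by the engine
    "timeGranularity": "month",
}

mss = {
    "metric": "door-to-balloon time",
    "units": {
        "cardiology": {"target": 90, "chartType": "bar"},
        "picu": {"target": 60},
    },
}

def generate_cards(mss, presets):
    """Merge engine presets with unit-specific overrides into card configs."""
    cards = {}
    for unit, overrides in mss["units"].items():
        card = dict(presets)      # shared defaults first
        card.update(overrides)    # unit-specific variant wins
        card["metric"] = mss["metric"]
        cards[unit] = card
    return cards

cards = generate_cards(mss, ENGINE_PRESETS)
print(json.dumps(cards["picu"], sort_keys=True))
```

The point of the design is that authors write only the overrides, while shared patterns stay in one place inside the engine.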
|
53 |
Visualisations pour la veille en épidémiologie animale / Visualizations for animal epidemiology surveillance. Fadloun, Samiha. 15 November 2018 (has links)
Many documents concerning the emergence, spread, or follow-up of human and animal diseases are published daily on the Web. To prevent the spread of disease, epidemiologists must frequently search for these documents and analyze them to detect outbreaks as early as possible. In this thesis, we focus on the two activities involved in this monitoring work and propose visual tools that facilitate access to relevant information. We concentrate on animal diseases, which have been studied less but can have serious consequences for human activities (diseases transmitted from animals to humans, epidemics in livestock, etc.). The first activity is collecting documents from the Web. For this, we propose EpidVis, a visual tool that allows epidemiologists to group and organize the keywords used in their searches, visually build complex queries, run them on different search engines, and view the returned results. The second activity is exploring a large number of disease-related documents. These documents contain not only information such as disease names, associated symptoms, and infected species, but also spatio-temporal information. We propose EpidNews, a visual analytics tool for exploring these data and extracting information from them. Both tools were developed in close collaboration with epidemiology experts, who carried out case studies showing that the proposed functionality was well adapted and made it easy to extract knowledge.
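To make the query-building step concrete, here is a minimal sketch (ours, not EpidVis itself, which is interactive) of composing a complex boolean query from grouped keywords; the keyword groups are invented examples.

```python
# Hypothetical keyword groups an epidemiologist might maintain.
keyword_groups = {
    "disease": ["avian influenza", "H5N1"],
    "host": ["poultry", "wild birds"],
    "event": ["outbreak", "die-off"],
}

def build_query(groups):
    """OR the keywords within a group, AND the groups together."""
    clauses = []
    for terms in groups.values():
        quoted = ['"%s"' % t if " " in t else t for t in terms]
        clauses.append("(" + " OR ".join(quoted) + ")")
    return " AND ".join(clauses)

query = build_query(keyword_groups)
print(query)
```

The same structured groups can then be reused to target several search engines, each with its own query syntax.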
|
54 |
Planification visuelle et interactive d'interventions dans des environnements d'accélérateur de particules émettant des rayonnements ionisants / Interactive visual intervention planning in particle accelerator environments with ionizing radiation. Fabry, Thomas. 30 May 2014 (has links)
Radiation is omnipresent. It has many applications: in medicine, where it allows diagnosing and curing patients; in communication, where modern systems make use of electromagnetic radiation; and in science, where it is used to discover the structure of materials, to name a few. Physically, radiation is a process in which particles or waves travel through a material. Radiation can be very energetic, in which case it can break the atoms of ordinary matter (ionization); such radiation is called ionizing. Ionizing radiation can be far more harmful to living beings than non-ionizing radiation, and it is ionizing radiation that this dissertation is concerned with. Naturally occurring ionizing radiation, in the form of radioactivity, is ubiquitous: radiation emerges from the soil, it is present in the air, and the whole planet is constantly bombarded by streams of energetic cosmic radiation. Since the beginning of the twentieth century, we have also been able to create radioactive matter artificially. This has opened up many technological opportunities, but has also placed a tremendous responsibility on humanity, as the nuclear accidents in Chernobyl and Fukushima and various accidents in the medical world have made clear. It has led to the elaboration of a radiological protection system, which in practice is mostly implemented using the ALARA methodology: As Low As Reasonably Achievable. This methodology consists of justifying, optimizing and limiting the radiation dose received, and is applied in conjunction with the legal limits. The word "reasonably" means that the optimization of radiation exposure has to be seen in context: deliberately exposing a worker to radiation during an operation is justified only when the positive effects of the operation outweigh the negative effects caused by the radiation. Several industrial and scientific procedures give rise to facilities with ionizing radiation, and most such facilities also need maintenance operations. In the spirit of ALARA, these interventions must be optimized to reduce the exposure of maintenance workers to ionizing radiation. This optimization cannot be automated, since the feasibility of the intervention tasks requires human assessment; intervention planning can, however, be facilitated by technical and scientific means such as software tools. In this context, the thesis provides technical and scientific considerations, methodologies, and software tools for the implementation of radiation protection. In particular, it addresses the need for an interactive visual intervention planning tool in the context of high-energy particle accelerator facilities.
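The dose-optimization aspect of intervention planning can be illustrated with a toy calculation (ours, not the thesis tool): given a dose-rate map, the cumulative dose of candidate routes can be compared so a planner can prefer the one with the lowest exposure. All numbers are invented.

```python
# Toy dose-rate map (uSv/h) on a 2D grid; values are invented.
DOSE_RATE = [
    [1, 1, 8, 1],
    [1, 9, 8, 1],
    [1, 1, 1, 1],
]

def path_dose(path, speed_cells_per_h=60):
    """Accumulate dose for a walk visiting grid cells at constant speed."""
    dwell_h = 1.0 / speed_cells_per_h          # time spent in each cell
    return sum(DOSE_RATE[r][c] * dwell_h for r, c in path)

# Two candidate routes between the same endpoints:
direct = [(1, 0), (1, 1), (1, 2), (1, 3)]      # passes through hot cells
detour = [(1, 0), (2, 0), (2, 1), (2, 2), (2, 3), (1, 3)]

best = min((direct, detour), key=path_dose)
print(path_dose(direct), path_dose(detour))    # the longer detour costs less dose
```

Note that this only ranks routes; as the abstract stresses, judging whether an intervention is feasible at all still requires human assessment.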
|
55 |
Where Social Networks, Graph Rewriting and Visualisation Meet: Application to Network Generation and Information Diffusion / Quand les réseaux sociaux, la réécriture de graphes et la visualisation se rencontrent : application à la génération de réseaux et à la diffusion d'information. Vallet, Jason. 07 December 2017 (has links)
In this thesis, we present a collection of network generation and information diffusion models expressed using a specific formalism called strategic located graph rewriting, as well as a novel network layout algorithm for displaying the results of information diffusion in large social networks. Graphs are extremely versatile mathematical objects that can be used to represent a wide variety of high-level systems. They can be transformed in multiple ways (e.g., creating new elements, merging or altering existing ones), but such modifications must be controlled to avoid unwanted operations. To ensure this, we use strategic graph rewriting: a graph rewriting system operates on a single graph, which is transformed according to a set of transformation rules, with a strategy steering the transformation process. First, we adapt two social network generation algorithms to create networks presenting small-world characteristics. Then, we translate different diffusion models into the formalism to simulate information diffusion phenomena. Expressing the different models in a common formalism makes them much easier to compare and their parameters easier to adjust. Finally, we present a novel compact layout method for displaying overviews of the results of our information diffusion method.
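The rule-plus-strategy idea can be sketched in a few lines (a toy of ours, far simpler than the strategic located graph rewriting formalism of the thesis): a rule matches a subgraph pattern and rewrites node labels, and a strategy decides how often to apply it.

```python
# Toy rewriting system: nodes carry a state, a rule matches an edge
# pattern and rewrites one label, and a strategy steers its application.
edges = {(0, 1), (1, 2), (2, 3)}
state = {0: "infected", 1: "susceptible", 2: "susceptible", 3: "susceptible"}

def rule_infect(state, edges):
    """Match: edge (u, v) with u infected, v susceptible. Rewrite: v infected."""
    for u, v in sorted(edges):
        for a, b in ((u, v), (v, u)):      # treat edges as undirected
            if state[a] == "infected" and state[b] == "susceptible":
                state[b] = "infected"
                return True                # one rewrite step applied
    return False

def strategy_repeat(rule, state, edges):
    """Strategy: apply the rule until no match remains (a fixpoint)."""
    steps = 0
    while rule(state, edges):
        steps += 1
    return steps

steps = strategy_repeat(rule_infect, state, edges)
print(steps, state)
```

Swapping in a different rule (say, probabilistic adoption) while keeping the same strategy is exactly what makes diffusion models easy to compare once they share a formalism.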
|
56 |
Widening stakeholder involvement: exploiting interactive 3D visualisation and protocol buffers in geo-computing. McCreadie, Christopher Andrew. January 2014 (has links)
Land use change has an impact on regional sustainability, which can be assessed using social, economic and environmental indicators. Stakeholder engagement tools provide a platform that can demonstrate the possible future impacts of land use change, to better inform stakeholder groups of the effect of policy changes or plausible climatic variations. To date, some engagement tools are difficult to use or understand and lack user interaction, whilst other tools demonstrate model environments with a tightly coupled user interface, resulting in poor performance. The research and development described herein relate to the development and testing of a visualisation engine for rendering the output of an Agent Based Model (ABM) as a 3D Virtual Environment via a loosely-coupled, data-driven communications protocol called Protocol Buffers. The tool, named the Rural Sustainability Visualisation Tool (R.S.V.T), is primarily aimed at enhancing non-expert knowledge and understanding of the effects of land use change, driven by farmer decision making, on the sustainability of a region. Communication protocols are evaluated, and Protocol Buffers, a binary-based communications protocol, is selected, based on speed of object serialization and data transfer, to pass messages from the ABM to the 3D Virtual Environment. Early comparative testing of R.S.V.T and its 2D counterpart RepastS shows that R.S.V.T and its loosely-coupled approach offer an increase in performance when rendering land use scenes. The flexibility of Protocol Buffers and MongoDB is also shown to have positive performance implications for the storage and running of loosely-coupled model simulations. A 3D graphics Application Programming Interface (API), commonly used in the development of computer games technology, is selected to develop the Virtual Environment. Multiple visualisation methods, designed to enhance stakeholder engagement and understanding, are developed and tested to determine their suitability in terms of both user preference and information retrieval. The application of a prototype is demonstrated using a case study based in the Lunan catchment in Scotland, which has water quality and biodiversity issues due to intense agriculture. The region is modelled using three scenario storylines that broadly describe plausible futures: Business as Might Be Usual (BAMBU), the Growth Applied Strategy (GRAS) and the Sustainable European Development Goal (SEDG). The performance of the tool is assessed, and it is found that R.S.V.T can run faster than its 2D equivalent when loosely coupled with a 3D Virtual Environment. The 3D Virtual Environment and its associated visualisation methods are assessed using non-expert stakeholder groups, and it is shown that 3D ABM output is generally preferred to 2D ABM output. Insights are also gained into the most appropriate visualisation techniques for agricultural landscapes. Finally, the benefit of taking a loosely-coupled approach to the visualisation of model data is demonstrated through the performance of Protocol Buffers during testing, showing it is capable of transferring large amounts of model data to a bespoke visual front-end.
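Why a binary protocol helps when streaming ABM state can be shown with a small sketch (ours; Protocol Buffers would generate the packing code from a schema, and the record layout below is invented): a fixed binary layout is far more compact than the equivalent JSON, so each model tick moves less data to the visual front-end.

```python
import json
import struct

# Hypothetical agent states from one ABM tick: (agent id, land-use code, x, y).
agents = [(i, i % 4, float(i), float(i) * 2.0) for i in range(1000)]

def to_json(agents):
    return json.dumps([{"id": a, "use": u, "x": x, "y": y}
                       for a, u, x, y in agents]).encode("utf-8")

def to_binary(agents):
    """Pack each record as two unsigned ints and two floats (16 bytes)."""
    return b"".join(struct.pack("<IIff", a, u, x, y) for a, u, x, y in agents)

j, b = to_json(agents), to_binary(agents)
print(len(j), len(b))                      # the binary stream is far smaller
back = struct.unpack_from("<IIff", b, 16)  # decode the second record
```

The loose coupling comes from the fact that the model only emits these messages; any front-end that understands the schema can consume them.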
|
57 |
Measuring comprehension of abstract data visualisations. Shovman, Mark. January 2011 (has links)
Common visualisation techniques such as bar-charts and scatter-plots are not sufficient for visual analysis of large sets of complex multidimensional data. Technological advancements have led to a proliferation of novel visualisation tools and techniques that attempt to meet this need. A crucial requirement for efficient visualisation tool design is the development of objective criteria for visualisation quality, informed by research in human perception and cognition. This thesis presents a multidisciplinary approach to address this requirement, underpinning the design and implementation of visualisation software with the theory and methodology of cognitive science. An opening survey of visualisation practices in the research environment identifies three primary uses of visualisations: the detection of outliers, the detection of clusters and the detection of trends. This finding, in turn, leads to a formulation of a cognitive account of the visualisation comprehension processes, founded upon established theories of visual perception and reading comprehension. Finally, a psychophysical methodology for objectively assessing visualisation efficiency is developed and used to test the efficiency of a specific visualisation technique, namely an interactive three-dimensional scatterplot, in a series of four experiments. The outcomes of the empirical study are three-fold. On a concrete applicable level, three-dimensional scatterplots are found to be efficient in trend detection but not in outlier detection. On a methodological level, ‘pop-out’ methodology is shown to be suitable for assessing visualisation efficiency. On a theoretical level, the cognitive account of visualisation comprehension processes is enhanced by empirical findings, e.g. the significance of the learning curve parameters. All these provide a contribution to a ‘science of visualisation’ as a coherent scientific paradigm, both benefiting fundamental science and meeting an applied need.
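The 'pop-out' methodology mentioned above rests on a standard psychophysical measure, which can be sketched as follows (our illustration, not the thesis protocol; the timing data and the 10 ms/item cutoff are illustrative): if response time barely grows with the number of distractors, the target pops out (parallel search), while a steep slope indicates serial, item-by-item scanning.

```python
def slope(set_sizes, response_ms):
    """Ordinary least-squares slope, in ms per additional item."""
    n = len(set_sizes)
    mx = sum(set_sizes) / n
    my = sum(response_ms) / n
    num = sum((x - mx) * (y - my) for x, y in zip(set_sizes, response_ms))
    den = sum((x - mx) ** 2 for x in set_sizes)
    return num / den

set_sizes = [8, 16, 32, 64]
popout_rt = [410, 413, 417, 421]      # flat: target found in parallel
serial_rt = [450, 620, 950, 1610]     # grows: item-by-item scanning

POPOUT_THRESHOLD = 10                 # ms/item, an illustrative cutoff
print(slope(set_sizes, popout_rt) < POPOUT_THRESHOLD)
print(slope(set_sizes, serial_rt) < POPOUT_THRESHOLD)
```

Applied to a visualisation, a flat slope for a task such as outlier detection would be evidence that the display supports it pre-attentively.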
|
58 |
Visualisation et traitements interactifs de grilles régulières 3D haute-résolution virtualisées sur GPU. Application aux données biomédicales pour la microscopie virtuelle en environnement HPC. / Interactive visualisation and processing of high-resolution regular 3D grids virtualised on GPU. Application to biomedical data for virtual microscopy in HPC environment. Courilleau, Nicolas. 29 August 2019 (has links)
Data visualisation is an essential aspect of scientific research in many fields. It helps in understanding observed or simulated phenomena and in extracting information from them, for purposes such as experimental validation or simply project review. This thesis focuses on the visualisation of volume data in medical and biomedical imaging. The acquisition devices used generate scalar or vector fields represented in the form of regular 3D grids. The increasing accuracy of these devices implies an increasing size of the volume data, which requires adapting the visualisation algorithms to manage such volumes. Moreover, visualisation mostly relies on GPUs because they are well suited to such problems, yet they possess a very limited amount of memory compared to the generated volume data. The question then arises of how to dissociate the computation units, which enable visualisation, from the storage units. Algorithms based on the so-called "out-of-core" principle are the solution for managing large volume data sets. In this thesis, we propose a complete GPU-based pipeline allowing real-time visualisation and processing of volume data significantly larger than the CPU and GPU memory capacities. The interest of our pipeline comes from its out-of-core approach to data management, which virtualises the memory and is particularly well suited to volume data; this approach relies on a virtual addressing structure entirely managed and maintained on the GPU. We validate our model with several real-time visualisation and processing applications. First, we propose an interactive virtual microscope allowing 3D auto-stereoscopic visualisation of stacks of high-resolution images. Then, we verify the adaptability of our structure to all data types with a multimodal virtual microscope. Finally, we demonstrate the multi-role capabilities of our structure through a concurrent real-time visualisation and processing application.
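The out-of-core principle can be illustrated with a CPU-side toy (ours; the thesis keeps the addressing structure on the GPU): the volume is split into bricks, only a few bricks fit in "GPU" memory, and a page table maps brick ids to cache slots with least-recently-used eviction. Sizes and data are invented.

```python
from collections import OrderedDict

BRICK = 4                     # brick edge length in voxels
CACHE_SLOTS = 2               # pretend device memory holds only two bricks

def load_brick(brick_id):
    """Stand-in for an out-of-core read: fabricate a constant brick."""
    return [sum(brick_id)] * (BRICK ** 3)

class BrickCache:
    def __init__(self):
        self.table = OrderedDict()   # brick id -> voxel data (LRU order)
        self.misses = 0

    def voxel(self, x, y, z):
        bid = (x // BRICK, y // BRICK, z // BRICK)
        if bid not in self.table:
            self.misses += 1
            if len(self.table) == CACHE_SLOTS:
                self.table.popitem(last=False)   # evict least recently used
            self.table[bid] = load_brick(bid)
        else:
            self.table.move_to_end(bid)          # mark as recently used
        data = self.table[bid]
        i = (x % BRICK) + (y % BRICK) * BRICK + (z % BRICK) * BRICK * BRICK
        return data[i]

cache = BrickCache()
for _ in range(10):
    cache.voxel(0, 0, 0)       # only the first access misses
print(cache.misses)
```

A renderer built this way touches arbitrary voxels through one uniform address space while only the working set ever resides in device memory.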
|
59 |
Feature selection through visualisation for the classification of online reviews. Koka, Keerthika. 17 April 2017 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / The purpose of this work is to prove that visualization is at least as powerful as the best automatic feature selection algorithms. This is achieved by applying our visualization technique to the classification of online reviews into fake and genuine. Our technique uses radial charts and color overlaps to explore the best feature selection through visualization for classification. Every review is treated as a radial translucent red or blue membrane, with its dimensions determining the shape of the membrane. This work also shows how dimension ordering and combination are relevant in the feature selection process. In brief, the whole idea is to give a structure to each text review based on certain attributes, to compare how different or how similar the structures of the different or same categories are, and to highlight the key features that contribute most to the classification. Colors and saturations aid the feature selection process. Our visualization technique helps the user gain insight into high-dimensional data by providing means to eliminate the worst features right away, pick some of the best features without statistical aids, and understand the behavior of the dimensions in different combinations.
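The membrane construction can be sketched geometrically (our illustration; the feature names and values are invented, not from the thesis): each normalised feature value becomes the radius along one axis of a radial chart, so every review yields a polygon whose shape encodes its feature profile.

```python
import math

# One vertex per feature axis; values are assumed normalised to [0, 1].
features = ["exclamation_rate", "first_person", "avg_word_len", "rating_extremity"]

def membrane(values):
    """Place feature i on the axis at angle 2*pi*i/n, at radius = value."""
    n = len(values)
    return [(v * math.cos(2 * math.pi * i / n),
             v * math.sin(2 * math.pi * i / n))
            for i, v in enumerate(values)]

fake_review = [0.9, 0.8, 0.3, 0.95]     # hypothetical fake-review profile
genuine_review = [0.2, 0.4, 0.5, 0.1]   # hypothetical genuine-review profile

fake_shape = membrane(fake_review)
# Overlaying many translucent red (fake) and blue (blue = genuine) membranes
# reveals which axes -- i.e. which features -- separate the two classes.
print(fake_shape[0])
```

Reordering the `features` list changes which axes sit next to each other, which is why dimension ordering matters to what the overlaps reveal.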
|
60 |
Danish Companies Dashboard: An Interactive, Geospatial Visualisation of Industries and Profit in Denmark. Hyrup, Tobias, Matthews, Pernille, Nguyen, David Nhan Thien, Kusnick, Jakob, Jänicke, Stefan. 07 July 2022 (has links)
Profound knowledge of the business landscape is crucial for any company wanting to affect its position in the market. Whereas the corresponding data is publicly available, visual interfaces that inform on the distribution of companies operating in different sectors are rare. To close the gap for the Danish market, we developed the Danish Company Dashboard (DCD), which uses the Danish Business Authority's database on company data to visually explore how the different companies, grouped by industries, are geographically scattered across Denmark at the regional and municipality level. Moreover, the study and the accompanying visualisations provide insights into how the profit of each industry and company differs throughout the regions and municipalities, thereby supporting the strategic decision-making tasks of industry stakeholders.
|