About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Visualization of Particle In Cell Simulations

Ljung, Patric January 2000 (has links)
A numerical simulation case involving space plasma and the evolution of instabilities that generate very fast electrons, i.e., at approximately half the speed of light, is used as a test bed for scientific visualisation techniques. A visualisation system was developed to provide interactive real-time animation and visualisation of the simulation results. The work focuses on two themes and their integration. The first theme is the storage and management of the large data sets produced. The second theme deals with how the Visualisation System and Visual Objects are tailored to efficiently visualise the data at hand.

The integration of the themes has resulted in an interactive real-time animation and visualisation system which constitutes a very powerful tool for the analysis and understanding of plasma physics processes. The visualisations contained in this work have spawned many possible new research projects and provided insight into previously not fully understood plasma physics phenomena.
52

Efficient Methods for Direct Volume Rendering of Large Data Sets

Ljung, Patric January 2006 (has links)
Direct Volume Rendering (DVR) is a technique for creating images directly from a representation of a function defined over a three-dimensional domain. The technique has many application fields, such as scientific visualization and medical imaging. A striking property of the data sets produced within these fields is their ever-increasing size and complexity. Despite advancements in computing resources, these data sets seem to grow at even faster rates, causing severe bottlenecks in terms of data transfer bandwidth, memory capacity and processing requirements in the rendering pipeline. This thesis focuses on efficient methods for DVR of large data sets. At the core of the work lies a level-of-detail scheme that reduces the amount of data to process and handle, while optimizing the level-of-detail selection so that high visual quality is maintained. A set of techniques for domain-knowledge encoding is introduced that significantly improves the assessment and prediction of visual significance for blocks in a volume. A complete pipeline for DVR is presented that uses the data reduction achieved by the level-of-detail selection to minimize the data requirements in all stages, reducing disk I/O as well as host and graphics memory use. The data reduction is also exploited to improve rendering performance in graphics hardware, employing adaptive sampling both within the volume and within the rendered image. The developed techniques have been applied in particular to medical visualization of large data sets on commodity desktop computers using consumer graphics processors. The specific application of virtual autopsies has received much interest, and several of the developed data classification schemes and rendering techniques have been motivated by this application. The results are, however, general and applicable in many fields, and significant performance and quality improvements over previous techniques are shown.
/ On the defence date the status of article IX was Accepted.
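The level-of-detail selection this abstract describes, reducing data per block while preserving visual quality, can be illustrated with a small hypothetical greedy sketch: blocks with higher visual significance are refined first, as long as a memory budget allows. All block names, significance values and per-level costs below are invented for illustration; the thesis's actual optimization is more sophisticated.

```python
# Hypothetical sketch of greedy level-of-detail (LOD) selection under a
# memory budget. Block names, costs and significance values are invented.
import heapq

def select_lod(blocks, budget):
    """blocks: dict name -> (significance, [cost_lod0, cost_lod1, ...]).
    Returns dict name -> chosen LOD index (0 = coarsest)."""
    # Start every block at its coarsest (cheapest) level.
    lod = {name: 0 for name in blocks}
    used = sum(costs[0] for _, costs in blocks.values())
    # Refine the most significant blocks first, while the budget allows.
    heap = [(-sig, name) for name, (sig, _) in blocks.items()]
    heapq.heapify(heap)
    while heap:
        neg_sig, name = heapq.heappop(heap)
        _, costs = blocks[name]
        level = lod[name]
        if level + 1 >= len(costs):
            continue  # already at the finest level
        extra = costs[level + 1] - costs[level]
        if used + extra <= budget:
            lod[name] = level + 1
            used += extra
            # Re-queue with diminished priority so it can be refined further.
            heapq.heappush(heap, (neg_sig / 2, name))
    return lod

blocks = {
    "bone":   (0.9, [1, 4, 16]),   # high visual significance
    "air":    (0.1, [1, 4, 16]),   # nearly empty block
    "tissue": (0.5, [1, 4, 16]),
}
print(select_lod(blocks, budget=25))
```

With a budget of 25 the significant "bone" block reaches the finest level while the near-empty "air" block stays coarse, mirroring the idea of spending memory where visual significance is highest.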
53

Concept and Workflow for 3D Visualization of Multifaceted Meteorological Data

Helbig, Carolin 10 August 2015 (has links) (PDF)
The analysis of heterogeneous, complex data sets has become important in many scientific domains. With the help of scientific visualization, researchers can be supported in exploring their research results. One domain, where researchers have to deal with spatio-temporal data from different sources including simulation, observation and time-independent data, is meteorology. In this thesis, a concept and workflow for the 3D visualization of meteorological data was developed in cooperation with domain experts. Three case studies have been conducted based on the developed concept. In addition, the concept has been enhanced based on the experiences gained from the case studies. In contrast to existing all-in-one software applications, the proposed workflow employs a combination of existing software applications and their extensions to make a variety of already implemented visualization algorithms available. The workflow provides methods for data integration and for abstraction of the data as well as for generating representations of the variables of interest. Solutions for visualizing sets of variables, comparing results of multiple simulation runs and results of simulations based on different models are presented. The concept includes the presentation of the visualization scenes in virtual reality environments for a more comprehensible display of multifaceted data. To enable the user to navigate within the scenes, some interaction functionality was provided to control time, camera, and display of objects. The proposed methods have been selected with respect to the requirements defined in cooperation with the domain experts and have been verified with user tests. The developed visualization methods are used to analyze and present recent research results as well as for educational purposes. As the proposed approach uses generally applicable concepts, it can also be applied for the analysis of scientific data from other disciplines. 
54

Multi-Volume Rendering in OpenSpace Using A-Buffers for Space Weather Visualizations

Strandstedt, Jonas January 2017 (has links)
The work described in this thesis is part of the initial development of the open-source visualization software OpenSpace, a collaborative project between Linköping University (LiU), the National Aeronautics and Space Administration (NASA) and the American Museum of Natural History (AMNH). The report covers the background and implementation of a rendering system that enables OpenSpace to interactively visualize multiple overlapping space weather events. The system works much like a deferred renderer: all objects are rendered once, and the final image is resolved in a second rendering step. To render a mix of opaque and translucent objects and volumes simultaneously, order-independent transparency solutions are implemented. Performance is compared against traditional methods and possible improvements are discussed. The implemented rendering system currently powers the OpenSpace visualizations, giving scientists an interactive tool for studying multiple space weather events and supporting education and public outreach.
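The A-buffer approach to order-independent transparency mentioned above can be sketched in a few lines: during rendering every fragment is appended to a per-pixel list, and a resolve pass sorts each list by depth and composites front to back. This is a minimal illustrative sketch, not OpenSpace's implementation; the fragment values are invented.

```python
# Illustrative sketch of an A-buffer resolve step: fragments arrive in
# arbitrary order, are sorted by depth, and are composited front to back.

def resolve_pixel(fragments):
    """fragments: list of (depth, (r, g, b), alpha), in arbitrary order.
    Returns the composited (r, g, b) and the remaining transmittance."""
    color = [0.0, 0.0, 0.0]
    transmittance = 1.0  # fraction of light still passing through
    for depth, rgb, alpha in sorted(fragments):  # nearest fragment first
        for i in range(3):
            color[i] += transmittance * alpha * rgb[i]
        transmittance *= (1.0 - alpha)
        if transmittance < 1e-4:  # early exit once effectively opaque
            break
    return tuple(color), transmittance

# Two translucent fragments and one opaque one, submitted out of order.
frags = [
    (0.8, (0.0, 0.0, 1.0), 1.0),   # opaque blue, farthest
    (0.2, (1.0, 0.0, 0.0), 0.5),   # translucent red, nearest
    (0.5, (0.0, 1.0, 0.0), 0.5),   # translucent green, middle
]
rgb, t = resolve_pixel(frags)
print(rgb, t)
```

Because the sort happens per pixel at resolve time, the scene geometry never has to be submitted in depth order, which is what makes overlapping translucent volumes tractable.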
56

[en] VISUALIZING VECTOR FIELDS OVER SURFACES / [pt] VISUALIZANDO CAMPOS VETORIAIS EM SUPERFÍCIES

THIAGO MARQUES TOLEDO 18 January 2017 (has links)
[en] Vector fields are common results of physics simulators. Simulations over black-oil reservoirs, for instance, can generate oil, water and gas flow data. For a better understanding of such data, however, it is useful to apply a visualization technique that allows the identification of local characteristics and global tendencies of the field. This work proposes a GPU-based technique for the visualization of 3D vector fields that uses the 2D line integral convolution (LIC) algorithm to visualize the component tangential to the surface projected in screen space. Data related to magnitude and the normal component are presented through a two-dimensional color scale. A simple scheme based on randomly generated texture coordinates is proposed to fix the resulting LIC image to the model, avoiding flickering during model manipulation and eliminating the need for a solid 3D noise texture. For animation, filters are adapted so that the animation speed varies according to the field magnitude. To enhance the final image, the LIC algorithm is applied in two passes and the result is put through a high-pass filter. The framework developed as part of this work has been applied in the context of visualizing flow in black-oil reservoir models and height gradients in terrains. In the specific case of reservoirs, a variation of the main technique is proposed to allow simultaneous visualization of oil, gas and water flows.
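The core 2D LIC idea referenced in this abstract can be sketched minimally: each output pixel averages a white-noise texture along the local streamline of the vector field, smearing the noise into streaks that follow the flow. The toy circular field, grid size and kernel length below are invented for illustration; the thesis operates on the screen-space tangential component of a 3D field.

```python
# Minimal line integral convolution (LIC) sketch: average a noise texture
# along short streamline traces of a toy 2D vector field. All parameters
# are invented for illustration.
import math
import random

N, L = 32, 6                      # grid size and half kernel length (steps)
random.seed(1)
noise = [[random.random() for _ in range(N)] for _ in range(N)]

def field(x, y):
    """Toy circular vector field around the grid centre (unit length)."""
    cx = cy = (N - 1) / 2.0
    vx, vy = -(y - cy), (x - cx)
    m = math.hypot(vx, vy) or 1.0
    return vx / m, vy / m

def lic_pixel(ix, iy):
    """Average the noise along the streamline through pixel (ix, iy)."""
    total, count = noise[iy][ix], 1
    for sign in (1.0, -1.0):      # trace forward and backward
        x, y = float(ix), float(iy)
        for _ in range(L):
            vx, vy = field(x, y)
            x += sign * vx
            y += sign * vy
            if not (0 <= x < N and 0 <= y < N):
                break             # streamline left the grid
            total += noise[int(y)][int(x)]
            count += 1
    return total / count

image = [[lic_pixel(ix, iy) for ix in range(N)] for iy in range(N)]
```

Neighbouring pixels on the same streamline share most of their averaged samples, which is why the output shows coherent streaks along the field even though the input is pure noise.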
57

[en] A SIMPLE COMPRESSION FOR IRREGULAR MESHES WITH HANDLES / [pt] UMA COMPRESSÃO SIMPLES PARA MALHAS IRREGULARES COM ALÇAS

RUBEN GOMEZ DIAZ 06 October 2004 (has links)
[en] Many applications need to transmit 3D models over the Internet, among them data sharing between collaborative environments situated in different locations. Such data sharing aims at analysis and visualization, but bandwidth constraints and storage costs limit the complexity of the models that can be transmitted or stored. Polygonal meshes are used in different areas of Computer Graphics and Scientific Visualization; for instance, finite-element and boundary representations are used in CAD models, games, terrain modelling and computational geometry. Due to the great complexity of those meshes, they must be represented by a data structure that suits them. The main motivation of this work is to verify the feasibility of a new data structure to represent and compress irregular meshes (triangles and quads). The CHalfEdge data structure is introduced, based on the ideas of the HalfEdge representation, which is used to represent models by boundary representation and combines low storage cost with high expressive power. This work also proposes a new algorithm to compress and decompress irregular meshes with handles; it is an extension of the EdgeBreaker compression for triangular meshes.
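The classic HalfEdge representation that CHalfEdge builds on can be sketched briefly: each directed edge of a face stores its next edge and its oppositely oriented twin, which makes adjacency queries cheap. This is a generic illustrative sketch, not the CHalfEdge structure itself; the two-triangle mesh is invented.

```python
# Minimal halfedge construction sketch: each directed edge records its
# origin vertex, the next edge in its face, and its oppositely oriented twin.

def build_halfedges(triangles):
    """triangles: list of (v0, v1, v2) vertex-index tuples.
    Returns halfedges as dicts with 'origin', 'next' and 'twin' indices."""
    halfedges, lookup = [], {}
    for a, b, c in triangles:
        base = len(halfedges)
        for i, (u, v) in enumerate(((a, b), (b, c), (c, a))):
            halfedges.append({"origin": u,
                              "next": base + (i + 1) % 3,
                              "twin": None})
            lookup[(u, v)] = base + i
    # Pair each halfedge with its opposite; boundary edges keep twin=None.
    for (u, v), h in lookup.items():
        twin = lookup.get((v, u))
        if twin is not None:
            halfedges[h]["twin"] = twin
    return halfedges

# Two triangles sharing the edge between vertices 1 and 2.
mesh = build_halfedges([(0, 1, 2), (2, 1, 3)])
shared = mesh[1]           # the 1 -> 2 halfedge in the first triangle
print(shared["twin"])      # index of the opposite 2 -> 1 halfedge
```

Connectivity compressors in the EdgeBreaker family walk exactly this kind of structure, emitting one symbol per traversed triangle instead of storing explicit vertex indices.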
58

Visualização exploratória de dados volumétricos multivalorados variantes no tempo / Exploratory visualization of time-varying multivalued volumetric data

Thiago Silva Reis Santos 08 October 2012 (has links)
Computer simulations of physical phenomena allow reducing costs and studying behavior that would be unfeasible to observe in real-life situations, either due to environmental limitations (e.g., a nuclear explosion) or due to factors that are beyond human control (e.g., collisions between stars). Millions of primitives (voxels, vertices or particles) may be required to accurately capture system behavior, generating very large data sets that are typically time-varying and multidimensional, as multiple simulation variables describe each primitive. Analyzing the hundreds of gigabytes or even terabytes resulting from these simulations therefore remains a challenge. Current solutions that handle this type of data usually rely on Scientific or Information Visualization techniques, but typically reveal data behavior only at a particular time instant; providing visualizations capable of assisting analysts in inspecting and understanding behavior along the temporal domain remains a major challenge. This work is an attempt in this direction, introducing several strategies to handle these problems. They have in common the use of multidimensional projection techniques to support exploratory analysis of simulation data, both at specific time instants and along the simulation as a whole. The goal is to favor the perception of groups of elements showing similar behavior and to track their temporal evolution. One of the strategies introduced summarizes, in a single visual representation, the temporal behavior of the multidimensional data space, thus allowing analysts to identify and analyze the entities with similar behavior along the simulation.
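The multidimensional-projection idea underlying these strategies can be sketched with a simple force-based scheme: 2D positions are iteratively adjusted so that pairwise 2D distances approach the distances in the original multidimensional space, so that similar entities end up close together on screen. The sample data, step size and iteration count below are invented; the thesis's actual projection techniques may differ.

```python
# Minimal force-based multidimensional projection sketch: 2D positions are
# nudged until their pairwise distances approximate the original distances.
# Data set and parameters are invented for illustration.
import math
import random

def high_d_dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def project(data, iterations=60, step=0.125, seed=0):
    rng = random.Random(seed)
    pos = [[rng.random(), rng.random()] for _ in data]
    for _ in range(iterations):
        for i in range(len(data)):
            for j in range(len(data)):
                if i == j:
                    continue
                dx = pos[j][0] - pos[i][0]
                dy = pos[j][1] - pos[i][1]
                d2 = math.hypot(dx, dy) or 1e-9
                # Move j along the i->j direction by a fraction of the
                # difference between the original and the current distance.
                delta = (high_d_dist(data[i], data[j]) - d2) * step / d2
                pos[j][0] += delta * dx
                pos[j][1] += delta * dy
    return pos

# Two well-separated clusters in 4-D should remain separated in 2-D.
data = [(0, 0, 0, 0), (0.1, 0, 0.1, 0), (5, 5, 5, 5), (5.1, 5, 5, 5.1)]
flat = project(data)
```

Repeating the projection per time step, or on per-entity feature vectors summarizing whole trajectories, gives exactly the kind of "groups of similar elements over time" view the abstract describes.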
59

LEVIA’18: Leipzig Symposium on Visualization in Applications 2018

Jänicke, Stefan, Hotz, Ingrid, Liu, Shixia 25 January 2019 (has links)
No description available.
60

A la recherche de la haute performance pour les codes de calcul et la visualisation scientifique / Searching for the highest performance for simulation codes and scientific visualization

Colin de Verdière, Guillaume 16 October 2019 (has links)
This thesis aims to demonstrate that algorithms and coding, in a high-performance computing (HPC) context, cannot be envisioned without taking into account the hardware at the core of supercomputers, since those machines evolve dramatically over time. After establishing a few definitions relating to scientific codes and parallelism, we show that an analysis of the different generations of supercomputers used at CEA over the past 30 years brings out a number of points of attention and best practices for code developers. Based on several experiments, we show how to aim at code performance suited to supercomputers, and how to pursue portable and even extreme performance in the world of massive parallelism, with or without GPUs. We explain that graphical post-processing software and hardware follow the same parallelism principles as large scientific codes, which requires mastering a global view of the simulation chain. Last, we describe the tendencies and constraints that will be imposed on the design of the coming generations of exaflop-class supercomputers; these evolutions will, yet again, impact the development of the next generations of scientific codes.
