1

An Algorithm for Clipping Polygons of Large Geographical Data

Alghamdi, Areej 27 September 2017 (has links)
We present an algorithm for overlaying polygonal data with regular grids and calculating the percentage overlap for each cell in the regular grid. Our algorithm supports self-intersecting polygons, meaning that some spatial regions may be covered by two or more polygons; it identifies these cases and eliminates redundant polygons, preventing erroneous results. We also present an optimized version of our algorithm that uses spatial sorting through interval trees, and provide a performance comparison between the optimized and unoptimized versions. Finally, we apply our algorithm to geographic data, specifically bark beetle infestation data.
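The per-cell overlap described above can be illustrated by clipping a polygon against one grid cell and comparing the clipped area with the cell area. This is a minimal sketch using Sutherland-Hodgman clipping, not the thesis implementation; all function names and the example polygon are invented:

```python
# Hypothetical sketch: clip a polygon against one axis-aligned grid cell
# and report the cell's percentage overlap (not the thesis code).

def clip_halfplane(poly, inside, intersect):
    """Clip a polygon (list of (x, y) tuples) against one half-plane."""
    out = []
    for i in range(len(poly)):
        cur, prev = poly[i], poly[i - 1]
        if inside(cur):
            if not inside(prev):
                out.append(intersect(prev, cur))
            out.append(cur)
        elif inside(prev):
            out.append(intersect(prev, cur))
    return out

def clip_to_cell(poly, xmin, ymin, xmax, ymax):
    """Clip a polygon to an axis-aligned cell, one cell edge at a time."""
    def ix(p, q, axis, v):  # segment intersection with an axis-aligned line
        t = (v - p[axis]) / (q[axis] - p[axis])
        return (p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1]))
    for axis, v, k in ((0, xmin, 1), (0, xmax, -1), (1, ymin, 1), (1, ymax, -1)):
        poly = clip_halfplane(
            poly,
            inside=lambda pt, a=axis, vv=v, kk=k: kk * (pt[a] - vv) >= 0,
            intersect=lambda p, q, a=axis, vv=v: ix(p, q, a, vv),
        )
        if not poly:
            return []
    return poly

def area(poly):
    """Polygon area via the shoelace formula."""
    return 0.5 * abs(sum(p[0] * q[1] - q[0] * p[1]
                         for p, q in zip(poly, poly[1:] + poly[:1])))

def percent_overlap(poly, xmin, ymin, xmax, ymax):
    """Percentage of the cell covered by the (clipped) polygon."""
    cell_area = (xmax - xmin) * (ymax - ymin)
    return 100.0 * area(clip_to_cell(poly, xmin, ymin, xmax, ymax)) / cell_area

# A triangle covering the lower-left half of the unit cell -> 50.0
tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
print(percent_overlap(tri, 0.0, 0.0, 1.0, 1.0))
```

An optimized variant along the lines of the abstract would first query an interval tree on polygon bounding boxes to skip cells with no candidate polygons.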
2

Rzsweep: A New Volume-Rendering Technique for Uniform Rectilinear Datasets

Chaudhary, Gautam 10 May 2003 (has links)
A great challenge in the volume-rendering field is to achieve high-quality images in an acceptable amount of time; there is always a trade-off between speed and quality. Applications where only high-quality images are acceptable often use the ray-casting algorithm, but this method is computationally expensive and typically achieves low frame rates. The work presented here is RZSweep, a new volume-rendering algorithm for uniform rectilinear datasets that gives high-quality images in a reasonable amount of time. In this algorithm, a plane sweeps the vertices of the implicit grid of a regular dataset in depth order, projecting all the implicit faces incident on each vertex. The algorithm exploits the inherent properties of rectilinear datasets. RZSweep is an object-order, back-to-front, direct volume rendering, face-projection algorithm for rectilinear datasets using the cell approach. It is a single-processor serial algorithm. The simplicity of the algorithm allows the use of the graphics pipeline for hardware-assisted projection and also, with minimal modification, a version of the algorithm that is graphics-hardware independent. Lighting, color, and various opacity transfer functions are implemented to lend realism to the resulting images. Finally, an image comparison is done between RZSweep and a 3D texture-based method for volume rendering, using standard image metrics such as Euclidean and geometric differences.
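The back-to-front compositing that an object-order renderer like RZSweep relies on can be sketched with the standard "over" operator. This is an illustrative fragment, not code from the thesis:

```python
# Illustrative sketch of back-to-front "over" compositing: each projected
# face contributes a color and opacity, blended from farthest to nearest.

def composite_back_to_front(samples):
    """samples: list of ((r, g, b), alpha) ordered farthest to nearest.
    Returns the accumulated color after repeated 'over' blending."""
    acc = (0.0, 0.0, 0.0)
    for (r, g, b), a in samples:
        acc = (r * a + acc[0] * (1.0 - a),
               g * a + acc[1] * (1.0 - a),
               b * a + acc[2] * (1.0 - a))
    return acc

# A far red splat behind a nearer half-transparent blue splat:
print(composite_back_to_front([((1.0, 0.0, 0.0), 0.5),
                               ((0.0, 0.0, 1.0), 0.5)]))  # -> (0.25, 0.0, 0.5)
```

Because blending is order-dependent, the depth-ordered vertex sweep in the abstract is what guarantees this operator produces a correct image.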
3

Visualization of Computer-Modeled Forests for Forest Management

Mohammadi-Aragh, Mahnas Jean 11 December 2004 (has links)
Forest management is a costly and time-consuming activity. Remote sensing has the potential to improve the process by making it cheaper and more efficient, but only if appropriate characteristics can be determined from computer models. This thesis describes the implementation of a forest visualization system and a corresponding user study that tests the accuracy of parameter estimation and forest characterization. The study uses data obtained from field surveys to generate a computer-modeled forest. Five different stands were tested. Based on the quantitative results obtained, there is generally no statistically significant difference in parameter estimation when comparing field-recorded movies and computer-generated movies.
4

AN ADAPTIVE SAMPLING APPROACH TO INCOMPRESSIBLE PARTICLE-BASED FLUID

Hong, Woo-Suck 16 January 2010 (has links)
I propose a particle-based technique for simulating incompressible fluid that includes adaptive refinement of particle sampling. Each particle represents a mass of fluid in its local region. Particles are split into several particles for finer sampling in regions of complex flow. In regions of smooth flow, neighboring particles can be merged. Depth below the surface and Reynolds number are exploited as our criteria for determining whether splitting or merging should take place. For the fluid dynamics calculations, I use the hybrid FLIP method, which is computationally simple and efficient. Since the fluid is incompressible, each particle has a volume proportional to its mass. A kernel function, whose effective range is based on this volume, is used for transferring and updating the particle's physical properties such as mass and velocity. In addition, the particle sampling technique is extended to a fully adaptive approach, supporting adaptive splitting and merging of fluid particles and adaptive spatial sampling for the reconstruction of the velocity and pressure fields. Particle splitting allows a detailed sampling of fluid momentum in regions of complex flow. Particle merging, in regions of smooth flow, reduces memory and computational overhead. An octree structure is used to compute inter-particle interactions and to compute the pressure field. The octree supporting field-based calculations is adapted to provide a fine spatial reconstruction where particles are small and a coarse reconstruction where particles are large. This scheme places computational resources where they are most needed, to handle both flow and surface complexity. Thus, incompressibility can be enforced even in very small, but highly turbulent areas. Simultaneously, the level of detail is very high in these areas, allowing the direct support of tiny splashes and small-scale surface tension effects. This produces a finely detailed and realistic representation of surface motion.
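The split/merge decision driven by surface depth and Reynolds number might look like the following sketch. The thresholds and data layout are invented for illustration and are not taken from the thesis:

```python
# Illustrative sketch (thresholds are assumptions, not from the thesis):
# decide whether a fluid particle should split, merge, or stay unchanged,
# using depth below the surface and a local Reynolds number as criteria.

def resample_action(depth, reynolds, deep=2.0, turbulent=500.0):
    """Return 'split' in shallow/turbulent regions (finer sampling),
    'merge' in deep/smooth regions (coarser sampling), else 'keep'."""
    if depth < deep and reynolds > turbulent:
        return "split"
    if depth >= deep and reynolds <= turbulent:
        return "merge"
    return "keep"

def split(particle):
    """Split one particle into two halves of equal mass; each child's
    volume stays proportional to its mass, as incompressibility requires."""
    half = dict(particle, mass=particle["mass"] / 2,
                volume=particle["volume"] / 2)
    return [half, dict(half)]

p = {"mass": 1.0, "volume": 0.001}
print(resample_action(depth=0.5, reynolds=900.0))  # near-surface turbulence -> 'split'
print([c["mass"] for c in split(p)])               # -> [0.5, 0.5]
```

In the full scheme described above, merging would be the symmetric operation (summing masses and volumes of neighboring particles), and the octree resolution would follow the resulting particle sizes.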
5

Cinematic Scientific Visualizations

Litaker, Kendall R 16 December 2013 (has links)
The Hubble Space Telescope has provided the world with incredible imagery of the surrounding universe. The aesthetic quality of this imagery is limited by production resources; by creating a method to harness the highly refined detail and quality of CG elements in live-action films, we can inspire and educate at a much greater level. In this thesis, I create a rendering approach that allows camera movement around and through elements such as nebulae and galaxies, creating a more cinematic experience. The solution also allows for reasonable scientific accuracy, visual appeal, efficiency, and extensibility to other astronomical visualizations. 3D meshes are constructed and textured using telescopic images as reference. Splats are volumetrically generated using a voxelized bounding box around the mesh. Valid splats within a user-specified maximum distance receive initial color and alpha values from the texture map. Probability density functions are used to create a density falloff along the edges of the object, and modifications to the RGBA values are made to achieve the desired cloud-like appearance. The data sets are rendered using a C program developed at the Space Telescope Science Institute by Dr. Frank Summers. The methodology is applied to two test cases: a star-forming nebula, Sharpless 2-106, and a galaxy, Messier 51 (the Whirlpool Galaxy). The results of this thesis demonstrate the visual, scientific, and technical success of this solution. The code developed during this project generates the desired imagery with reasonable efficiency. A short animation moving from outside the galaxy to a close-up of the nebula exhibits the flexibility in scale and camera movement. A careful balance between scientific accuracy and visual appeal was maintained through consultation with astronomers at the Space Telescope Science Institute.
The efficiency, flexibility, and visual and scientific quality of these results make the process extendable to most other cases of nebula and galaxy visualization.
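The density falloff step mentioned above, where splat opacity fades toward the object boundary, can be sketched with a Gaussian falloff on distance from the reference mesh. All parameters here are assumptions for illustration, not values from the thesis:

```python
# Hypothetical sketch of a density falloff for splats: opacity decays
# with distance from the reference mesh surface (parameters invented).

import math

def splat_alpha(dist_to_mesh, base_alpha=0.8, sigma=0.25, max_dist=1.0):
    """Attenuate a splat's alpha with a Gaussian falloff in distance
    from the mesh surface; cull splats beyond max_dist entirely."""
    if dist_to_mesh > max_dist:
        return 0.0
    return base_alpha * math.exp(-0.5 * (dist_to_mesh / sigma) ** 2)

# Splats on the surface keep full base opacity; distant splats fade out.
for d in (0.0, 0.25, 0.5, 1.5):
    print(f"d={d:.2f}  alpha={splat_alpha(d):.3f}")
```

In a full pipeline, this alpha would multiply the value sampled from the texture map before the splats are composited.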
6

Using Virtual Environments to Visualize Atmospheric Data: Can It Improve a Meteorologist's Potential to Analyze the Information?

Ziegeler, Sean Bernard 11 May 2002 (has links)
Conventional analysis of atmospheric data includes three-dimensional desktop-computer displays. One disadvantage is that it can reduce the ability to zoom in and see small-scale features while concurrently viewing other faraway features. This research intends to determine whether using virtual environments to examine atmospheric data can improve a meteorologist's ability to analyze the given information. In addition to possibly enhancing small-scale analysis, virtual environment technology offers an array of other possible improvements. Presented is the theory behind designing an experiment to establish the extent to which virtual environments assist meteorologists in analysis, followed by the details of an implementation of such an experiment. Based on the quantitative results obtained, the conclusion is that immersion can significantly increase the accuracy of a meteorologist's analysis of an atmospheric data set.
7

Realizing a feature-based framework for scientific data mining

Mehta, Sameep 13 September 2006 (has links)
No description available.
8

Implementation of a Data Structure for Scientific Visualization

Souza, Carlos André Sanches de 01 April 2003 (has links)
Volumetric data structures are extremely useful in many applications, and particularly in the area of Scientific Visualization. These structures are useful in two stages of the visualization process. The first is data representation, that is, the organization of information associated with the values, measured or simulated, to be visualized. The other phase of the visualization data flow that needs a data structure is the exploration phase, in which the constructed model is used both for interactive exploration and for running simulations on it, for example in virtual surgery.
A volumetric data structure named Singular Half-Face (SHF) is being developed at the ICMC; its main characteristic is the explicit modeling of the singularities present in the model, besides other essential topological elements. This dissertation tests the viability of this structure in a data-flow visualization context by incorporating it into the Visualization ToolKit (VTK) class library, whose data structures are extremely poor at defining the topology of the objects they represent. By adding SHF to this library and carrying out conventional visualization and data exploration techniques on it, we study its capacity to support all phases of the visualization process.
9

Computational Scientific Visualization Applied to Aerodynamic Models Simulated with the Panel Method

Albuquerque, Luciana Abdo Lins de 17 November 2003 (has links)
The increase in computational power and the consequent development of numerical simulation techniques, together with technological advances in measurement peripherals, have led many research areas to need graphics tools and computational aid to support the interpretation of the information generated. The application of graphics techniques to increase the capability of interpreting scientific data has been called visualization in scientific computing (ViSC). Aerodynamic models built in the laboratory for free-flight and wind-tunnel testing were simulated in panel-method numerical software and submitted to routines developed in C++, which serve as a front end to VTK (Visualization Tool Kit), a low-cost library widely used in universities around the world that provides graphics elements for visualizing any type of data. These C++ routines are responsible for the types of visualization generated and, above all, for enabling use of the tool.
Visualizations of pressure distribution and isolines on the model surfaces are of great importance in identifying aerodynamic problems, making corrections and modifications possible even before the physical model is built.
10

Wavelet Compression for Visualization and Analysis on High Performance Computers

Li, Shaomeng 31 October 2018 (has links)
As HPC systems move towards exascale, the discrepancy between computational power and I/O transfer rate only grows larger. Lossy in situ compression is a promising solution to address this gap, since it alleviates I/O constraints while still enabling traditional post hoc analysis. This dissertation explores the viability of such a solution with respect to a specific kind of compressor: wavelets. We examine three aspects of concern regarding the viability of wavelets: 1) information loss after compression, 2) their capability to fit within in situ constraints, and 3) the compressor's capability to adapt to HPC architectural changes. Findings from this dissertation inform in situ use of wavelet compressors on HPC systems, demonstrate their viability, and argue that this viability will only increase as exascale computing becomes a reality.
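The lossy wavelet compression the dissertation studies amounts to transforming the data, discarding small coefficients, and inverting the transform. A minimal single-level Haar sketch (NumPy only) illustrates the idea; production compressors use deeper, multi-level transforms such as CDF 9/7:

```python
# Minimal sketch of lossy wavelet compression: one level of the
# orthonormal Haar transform, coefficient thresholding, reconstruction.

import numpy as np

def haar_forward(x):
    """One Haar level: smooth (average) and detail (difference) bands."""
    s = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # smooth coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    return s, d

def haar_inverse(s, d):
    """Exact inverse of haar_forward."""
    x = np.empty(2 * s.size)
    x[0::2] = (s + d) / np.sqrt(2.0)
    x[1::2] = (s - d) / np.sqrt(2.0)
    return x

def compress(x, keep=0.25):
    """Zero all but the largest `keep` fraction of detail coefficients;
    this is where the lossiness (and the storage saving) comes from."""
    s, d = haar_forward(x)
    k = max(1, int(keep * d.size))
    cutoff = np.sort(np.abs(d))[-k]
    return s, np.where(np.abs(d) >= cutoff, d, 0.0)

# Smooth data survives aggressive thresholding with small error.
x = np.sin(np.linspace(0, 2 * np.pi, 64))
xr = haar_inverse(*compress(x, keep=0.25))
print("max reconstruction error:", np.abs(x - xr).max())
```

The in situ constraint in the abstract corresponds to `compress` running on simulation nodes as data is produced, with only the retained coefficients written to disk for post hoc analysis.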
