1

A web-based approach to image-based lighting using high dynamic range images and QuickTime object virtual reality

Cuellar, Tamara Melissa 10 October 2008 (has links)
This thesis presents a web-based approach to lighting three-dimensional geometry in a virtual scene. The use of High Dynamic Range (HDR) images for the lighting model makes it possible to convey a greater sense of photorealism than can be provided with a conventional computer-generated three-point lighting setup. The use of QuickTime™ Object Virtual Reality to display the three-dimensional geometry offers a sophisticated user experience and a convenient method for viewing virtual objects over the web. With this work, I generate original High Dynamic Range images for the purpose of image-based lighting and use the QuickTime™ Object Virtual Reality framework to creatively alter the paradigm of object VR for use in object lighting. The result is two scenarios: one that allows for the virtual manipulation of an object within a lit scene, and another with the virtual manipulation of light around a static object. Future work might include the animation of High Dynamic Range image-based lighting, with emphasis on such features as depth of field and glare generation.
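The core of image-based lighting as described in this abstract, using an HDR environment map as the light source for a scene, can be sketched briefly. The snippet below is a hedged illustration, not the thesis's actual code: it assumes an equirectangular HDR map stored as a NumPy array and approximates it with a few directional lights picked from its brightest, solid-angle-weighted pixels.

```python
import numpy as np

def sample_env_lights(hdr, n_lights=4):
    """Approximate an equirectangular HDR environment map with a few
    directional lights by picking its brightest pixels.  A naive sketch:
    proper importance sampling would also spread samples over bright
    regions instead of taking single pixels."""
    h, w, _ = hdr.shape
    # Per-pixel luminance (Rec. 709 weights).
    lum = hdr @ np.array([0.2126, 0.7152, 0.0722])
    # Weight by sin(theta): rows near the poles cover less solid angle.
    theta = (np.arange(h) + 0.5) / h * np.pi
    idx = np.argsort(lum * np.sin(theta)[:, None], axis=None)[-n_lights:]
    ys, xs = np.unravel_index(idx, (h, w))
    lights = []
    for y, x in zip(ys, xs):
        th = (y + 0.5) / h * np.pi        # polar angle (row 0 = straight up)
        ph = (x + 0.5) / w * 2 * np.pi    # azimuth
        direction = np.array([np.sin(th) * np.cos(ph),
                              np.cos(th),
                              np.sin(th) * np.sin(ph)])
        lights.append((direction, hdr[y, x]))  # unit direction, RGB radiance
    return lights
```

A renderer would then shade the object with these few lights instead of integrating over the full map, which is what can make such lighting cheap enough for interactive, web-based viewing.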
2

Image Based Visualization Methods for Meteorological Data

Olsson, Björn January 2004 (has links)
Visualization is the process of constructing methods that are able to synthesize interesting and informative images from data sets, simplifying the process of interpreting the data. In this thesis a new approach to constructing meteorological visualization methods using neural network technology is described. The methods are trained with examples instead of explicitly designing the appearance of the visualization. This approach is exemplified using two applications. In the first, the problem of computing an image of the sky for dynamic weather, that is, taking the current weather state into account, is addressed. It is a complicated problem to tie the appearance of the sky to a weather state. The method is trained with weather data sets and images of the sky so that it can synthesize a sky image for arbitrary weather conditions. The method has been trained with various kinds of weather and image data. The results show that this is a possible method for constructing weather visualizations, but more work remains in characterizing the weather state, and further refinement is required before the full potential of the method can be explored. This approach would make it possible to synthesize sky images of dynamic weather using a fast and efficient empirical method. In the second application the problem of computing synthetic satellite images from numerical forecast data sets is addressed. In this case a model is trained with preclassified satellite images and forecast data sets so that it can synthesize a satellite image representing arbitrary conditions. The resulting method makes it possible to visualize data sets from numerical weather simulations using synthetic satellite images, but could also be the basis for algorithms based on a preliminary cloud classification. / Report code: LiU-Tek-Lic-2004:66.
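As a rough illustration of the example-driven approach this abstract describes, the toy sketch below trains a small neural network to map a weather state to a mean sky colour. The "weather" features, the synthetic colour rule, and the network size are all invented stand-ins for the thesis's model and data, which are not described here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training pairs: weather state -> mean sky colour.
# Features: [cloud cover 0..1, sun elevation 0..1]; targets: RGB in 0..1.
X = rng.uniform(size=(200, 2))
Y = np.stack([0.3 + 0.4 * X[:, 0],            # toy rule standing in for
              0.3 + 0.4 * X[:, 0],            # real (weather, image) pairs
              0.9 - 0.5 * X[:, 0] + 0.1 * X[:, 1]], axis=1)

# One-hidden-layer tanh network trained by plain gradient descent.
W1 = rng.normal(scale=0.3, size=(2, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.3, size=(16, 3)); b2 = np.zeros(3)
lr = 0.1
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)
    P = H @ W2 + b2
    G = 2 * (P - Y) / len(X)            # gradient of mean squared error
    W2 -= lr * (H.T @ G); b2 -= lr * G.sum(0)
    GH = (G @ W2.T) * (1 - H ** 2)      # backpropagate through tanh
    W1 -= lr * (X.T @ GH); b1 -= lr * GH.sum(0)

def sky_colour(cloud, sun):
    """Synthesize a mean sky colour for an arbitrary weather state."""
    h = np.tanh(np.array([cloud, sun]) @ W1 + b1)
    return h @ W2 + b2
```

Once trained, the network answers queries for weather states it never saw, which is the "fast and efficient empirical method" idea: synthesis becomes a forward pass rather than a physical simulation.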
3

REAL-TIME EMBEDDED ALGORITHMS FOR LOCAL TONE MAPPING OF HIGH DYNAMIC RANGE IMAGES

Hassan, Firas January 2007 (has links)
No description available.
4

Image-based Material Editing

Khan, Erum 01 January 2006 (has links)
Photo editing software allows digital images to be blurred, warped or re-colored at the touch of a button. However, it is not currently possible to change the material appearance of an object except by painstakingly painting over the appropriate pixels. Here we present a set of methods for automatically replacing one material with another, completely different material, starting with only a single high dynamic range image and an alpha matte specifying the object. Our approach exploits the fact that human vision is surprisingly tolerant of certain (sometimes enormous) physical inaccuracies. Thus, it may be possible to produce a visually compelling illusion of material transformations without fully reconstructing the lighting or geometry. We employ a range of algorithms depending on the target material. First, an approximate depth map is derived from the image intensities using bilateral filters. The resulting surface normals are then used to map data onto the surface of the object to specify its material appearance. To create transparent or translucent materials, the mapped data are derived from the object's background. To create textured materials, the mapped data are a texture map. The surface normals can also be used to apply arbitrary bidirectional reflectance distribution functions to the surface, allowing us to simulate a wide range of materials. To facilitate the process of material editing, we generate the HDR image with a novel algorithm that is robust against noise in individual exposures. This ensures that any noise, which could otherwise have adversely affected the shape recovery of the objects, is removed. We also present an algorithm to automatically generate alpha mattes. This algorithm requires two input images, one where the object is in focus and one where the background is in focus, and then automatically produces an approximate matte indicating which pixels belong to the object. The result is then improved by a second algorithm to generate an accurate alpha matte, which can be given as input to our material editing techniques.
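The two-image, focus-based matting step could be sketched as follows. This is a rough illustration on synthetic data, with an invented sharpness measure (locally averaged gradient magnitude); the thesis's actual algorithm, and its refinement into an accurate matte, are more involved.

```python
import numpy as np

def box_blur1d(a, r):
    # Box filter of radius r along a 1-D array, via cumulative sums.
    p = np.pad(a, r, mode='edge')
    c = np.concatenate(([0.0], np.cumsum(p)))
    return (c[2 * r + 1:] - c[:len(a)]) / (2 * r + 1)

def box_blur(img, r):
    # Separable 2-D box filter: blur columns, then rows.
    img = np.apply_along_axis(box_blur1d, 0, img, r)
    return np.apply_along_axis(box_blur1d, 1, img, r)

def focus_matte(obj_focused, bg_focused, r=2):
    """Rough alpha matte from two registered grayscale shots: one focused
    on the object, one on the background.  A pixel is assigned to the
    object where the object-focused shot is locally sharper."""
    def sharpness(img):
        gy, gx = np.gradient(img)
        return box_blur(np.hypot(gx, gy), r)  # locally averaged gradient magnitude
    return (sharpness(obj_focused) > sharpness(bg_focused)).astype(float)
```

The resulting binary matte is only approximate, especially near the object boundary, which is why a second refinement pass is needed before the matte can drive material editing.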
