11

Advanced Real-time Post-Processing using GPGPU techniques

Lönroth, Per, Unger, Mattias January 2008 (has links)
Post-processing techniques are used to change a rendered image as a last step before presentation and include, but are not limited to, operations such as changes of saturation or contrast, as well as more advanced effects like depth of field and tone mapping. Depth-of-field effects are created by changing the focus in an image: the parts close to the focus point are perfectly sharp, while the rest of the image shows a variable amount of blur. The effect is widely used in photography and film as a depth cue and has in recent years also been introduced into computer games. Today's graphics hardware offers new possibilities in terms of computational capacity. Shaders and GPGPU languages can be used to perform massively parallel operations on graphics hardware and are well suited to game development. This thesis presents the theoretical background of some of the most recent and valuable depth-of-field algorithms and describes the implementation of various solutions in the shader domain as well as with GPGPU techniques. The main objective is to analyze various depth-of-field approaches, looking at their visual quality and at how the methods scale performance-wise under different techniques.
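To make the underlying model concrete, here is a minimal NumPy sketch of the thin-lens circle-of-confusion computation that depth-of-field post-processing builds on. This is an illustration only, not code from the thesis, and the lens parameters are assumed values.

```python
import numpy as np

def coc_diameter(depth, focal_len=0.05, f_stop=1.8, focus_dist=2.0):
    """Thin-lens circle-of-confusion diameter per pixel (meters).

    Pixels map to larger circles of confusion the farther their depth
    lies from the focus distance; a DOF post-process blurs each pixel
    in proportion. All lens parameters are illustrative assumptions.
    """
    aperture = focal_len / f_stop  # aperture diameter
    return (aperture * focal_len * np.abs(depth - focus_dist)
            / (depth * (focus_dist - focal_len)))

# Depths near the 2 m focus distance stay essentially sharp.
print(coc_diameter(np.array([0.5, 1.0, 2.0, 4.0, 10.0])))
```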
12

Imaging, characterization and processing with axicon derivatives.

Saikaley, Andrew Grey 06 August 2013 (has links)
Axicons have been proposed for imaging applications since they offer the advantage of extended depth of field (DOF). This enhanced DOF comes at the cost of degraded image quality, and image processing has been proposed to improve it. Initial efforts were focused on the use of an axicon in a borescope, extending the depth of focus and eliminating the need for a focusing mechanism. Though promising, it is clear that image processing would lead to improved image quality. It would also eliminate the need, in certain applications, for a fiber-optic imaging bundle, as many modern video borescopes use an imaging sensor coupled directly to the front-end optics. In the present work, three types of refractive axicons are examined: a linear axicon, a logarithmic axicon and a Fresnel axicon. The linear axicon offers the advantage of simplicity and a significant body of scientific literature, including the application of image restoration techniques. The Fresnel axicon has the advantages of compactness and potentially low production cost; as no physical examples of the Fresnel axicon were available for experimentation until recently, very little literature exists. The logarithmic axicon has the advantage of a nearly constant longitudinal intensity distribution and an aspheric design, producing pre-processed images superior to those of the aforementioned elements. Point spread functions (PSFs) for each of these axicons have been measured, and these PSFs form the basis for the design of digital image restoration filters. The performance of these three optical elements and a number of restoration techniques are demonstrated and compared.
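To illustrate how a measured PSF drives filter design, here is a generic Wiener-deconvolution sketch in Python. Wiener filtering is one standard restoration technique for extended-DOF imagery; whether the thesis uses exactly this filter is not established here, and the noise-to-signal constant `k` is an assumption.

```python
import numpy as np

def wiener_restore(blurred, psf, k=1e-2):
    """Deconvolve an image degraded by a known PSF with a Wiener filter.

    blurred: observed image; psf: measured point spread function
    (e.g. of an axicon); k: assumed noise-to-signal power ratio.
    Generic sketch -- the thesis's actual filter designs may differ.
    """
    # Embed the normalized PSF in a full-size array, centered at the origin.
    pad = np.zeros_like(blurred, dtype=float)
    pad[:psf.shape[0], :psf.shape[1]] = psf / psf.sum()
    pad = np.roll(pad, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))

    H = np.fft.fft2(pad)                    # optical transfer function
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + k)   # Wiener deconvolution filter
    return np.real(np.fft.ifft2(W * G))
```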
13

Narrativa tridimensional: uma investigação sobre a linguagem 3D estereoscópica / Three-dimensional narrative: an analysis of stereoscopic 3D language

Grace Maria Martins da Silva Luzzi 25 April 2014 (has links)
Through an investigation of the history of techniques of illusion and immersion in popular art and entertainment, as well as theories that determine the existence of a distinct visual language, this work aims to analyse and identify the existence of a narrative language unique to stereoscopic 3D film-making. The research takes as its object of stereoscopic analysis Coraline (2009, Henry Selick), a stop-motion animation conceived for and released in stereoscopic 3D cinemas.
14

A Depth of Field Algorithm for Realtime 3D Graphics in OpenGL / Algoritm i OpenGL för att rendera realtids 3D grafik med fokus

Henriksson, Ola January 2002 (has links)
The company where this thesis was formulated constructs VR applications for the medical environment. The hardware used is ordinary desktops with consumer-level graphics cards and haptic devices. In medicine, some operations require microscopes or cameras. In order to simulate these in a virtual reality environment for educational purposes, the effect of depth of field, or focus, has to be considered. A working algorithm that generates this optical phenomenon in real-time, stereo-rendered computer graphics is presented in this thesis. The algorithm is implemented in OpenGL and C++, to later be combined with a VR application simulating eye surgery built with OpenGL Optimizer. Several different approaches are described in this report. The call for real-time stereo rendering (~60 fps) means taking advantage of the graphics hardware to a great extent. In OpenGL this means using the extensions of a specific graphics chip for better performance; in this case the algorithm is implemented for a GeForce3 card. To increase the speed of the algorithm, much of the workload is moved from the CPU to the GPU (Graphics Processing Unit). By redefining parts of the ordinary OpenGL pipeline via vertex programs, a distance-from-focus map can be stored in the alpha channel of the final image with little time loss. This map can effectively be used to blend a previously blurred version of the scene with a normal render. Different techniques to quickly blur a rendered image are discussed; to keep the speed up, solutions that require moving data from the graphics card are not an option.
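The final compositing step described above reduces, in the abstract, to a weighted blend; the sketch below shows it in NumPy, with a cheap separable box blur standing in for the quick-blur pass. This is an illustration of the idea only (the thesis implements it in GeForce3-era OpenGL, not Python), and the blur radius is an assumed parameter.

```python
import numpy as np

def box_blur(channel, radius=4):
    """Separable box blur: two cheap 1-D passes instead of one 2-D pass,
    the kind of fast blur a real-time DOF pass relies on."""
    k = 2 * radius + 1
    kernel = np.ones(k) / k
    rows = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, channel)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, rows)

def composite(rgba, radius=4):
    """Blend a blurred copy of the scene with the sharp render, weighted
    per pixel by a distance-from-focus map stored in the alpha channel."""
    rgb, alpha = rgba[..., :3], rgba[..., 3:4]   # alpha = distance from focus, in [0, 1]
    blurred = np.stack([box_blur(rgb[..., c], radius) for c in range(3)], axis=-1)
    return (1.0 - alpha) * rgb + alpha * blurred
```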
16

Creating a Depth Map of Eye Iris in Visible Spectrum

Kubíček, Martin January 2019 (has links)
This master's thesis aims to design and put into practice a methodology for imaging the eye iris in the visible spectrum. It emphasizes image quality, faithful colour reproduction with respect to the real subject and, above all, a continuous depth of field that reveals previously unexamined aspects and details of the iris. Finally, it also aims to minimize the physical stress to which the iris is exposed. The methodology contains precise procedures for imaging the iris and thereby ensures the consistency of the images, making it possible to build iris databases that track development over time or other aspects, such as the psychological state of the imaged person. The thesis first introduces the anatomy of the human eye, and of the iris in particular, followed by known methods of iris imaging. A section on proper illumination of the iris follows: illumination is necessary for the required level of image quality, but at the same time exposes the eye to considerable physical stress, so a compromise between these aspects must be found. The core of the work is the description of the methodology itself, including a detailed description of the imaging procedure. The thesis then covers the necessary post-production steps, such as merging images with different depths of field into a single continuously sharp image, and applying filters to remove image defects. The final part is divided into an evaluation of the results and a conclusion discussing possible extensions or modifications of the methodology so that it can be used outside laboratory conditions.
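The merging step mentioned in the abstract is classic focus stacking: pick, per pixel, the exposure in which that pixel is sharpest. A minimal Python sketch of that selection rule follows; it illustrates the general technique, not the thesis's own pipeline, and the sharpness-window size is an assumption.

```python
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def focus_stack(slices):
    """Merge grayscale images focused at different depths into one
    continuously sharp image by choosing, per pixel, the slice with
    the strongest local Laplacian response (a common sharpness measure).

    slices: array-like of shape (n_slices, H, W).
    """
    stack = np.asarray(slices, dtype=float)
    # Local sharpness: locally averaged squared Laplacian per slice.
    sharpness = np.stack([uniform_filter(laplace(s) ** 2, size=9) for s in stack])
    best = np.argmax(sharpness, axis=0)          # sharpest slice index per pixel
    return np.take_along_axis(stack, best[None], axis=0)[0]
```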
18

Automatizace procesu 3D zobrazování / Automation of the 3D stacking process

Kamenec, Jan January 2012 (has links)
The task of this thesis was to automate the process of 3D stacking. The work includes the design of a control board that serves as the control unit, providing mechanical displacement combined with digital image acquisition, together with the electronics for controlling a stepper motor and the PCB design. The result is a device that automatically acquires series of images with different depths of field.
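The acquisition loop such a device performs can be sketched abstractly. The `StepperMotor` and `Camera` classes below are hypothetical stand-ins for the thesis's custom control board and camera path, not a real API; step counts and settle time are assumed values.

```python
import time

class StepperMotor:
    """Hypothetical stand-in for the stepper-motor control electronics."""
    def move_steps(self, n):
        ...  # pulse the driver n steps along the optical axis

class Camera:
    """Hypothetical stand-in for the digital image acquisition path."""
    def capture(self):
        ...  # trigger an exposure and return the image

def acquire_stack(motor, camera, n_slices=30, steps_per_slice=10, settle_s=0.2):
    """Step the focal plane through the subject, grabbing one image per
    position -- the automated stack acquisition described above."""
    images = []
    for _ in range(n_slices):
        motor.move_steps(steps_per_slice)
        time.sleep(settle_s)      # let vibrations damp out before exposing
        images.append(camera.capture())
    return images
```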
19

Development of High-Speed Camera Techniques for Droplet Measurement in Annular Flows

Cohn, Ayden Seth 03 June 2024 (has links)
This research addresses the critical need for precise two-phase flow data in the development of computer simulation models, with a specific focus on droplet behavior in the annular flow regime. The study aims to contribute to the evaluation of safety and efficiency in nuclear reactors that handle fluids transitioning between liquid and gas states for thermal energy transport. Central to the investigation is the collection and analysis of droplet size and velocity distribution data, particularly to help develop models for water-cooled nuclear power plants. The experimental setup employs advanced tools, including a high-speed camera, lens, teleconverter, and a selected light source, to capture high-resolution images of droplets. Calibration procedures, incorporating depth-of-field testing, are implemented to ensure accurate droplet size measurements. A critical component of the research is the introduction of a droplet identification program, developed in Matlab, which facilitates efficient processing of experimental data. Preliminary results from the Virginia Tech test facility demonstrate the system's capability to eliminate out-of-focus droplets and obtain precise droplet data in a reasonable amount of time. Experimental results from the Rensselaer Polytechnic Institute test facility provide droplet size and velocity distributions for a variety of annular flow conditions. This facility has a concurrent two-phase flow system that pumps air and water at different rates through a 9.525 mm inner diameter tube. The conditions tested include gas superficial velocities ranging from 22 to 40 m/s and liquid superficial velocities ranging from 0.09 to 0.44 m/s. The measured flow has a temperature of 21°C and a pressure of 1 atm. / Master of Science / This research explores the behavior of small droplets as fluids transition between liquid and gas states, particularly in the context of the cooling water in nuclear power plants. The overarching goal is to collect data on these droplets to improve the computer simulations that help design nuclear reactors and assess their safety. This matters because it is often infeasible, due to safety, cost, or time restrictions, to physically test some nuclear reactor equipment. The study employs state-of-the-art technology, including high-speed cameras and specialized imaging tools, to capture and analyze droplet size distribution data. This investigation is pivotal in ensuring that the fuel in nuclear reactors remains adequately cooled during part of the boiling process. The research methodology includes the development of a droplet identification program in Matlab, ensuring efficient processing of experimental data. Preliminary findings from experimental tests at Virginia Tech showcase the program's capability to filter out irrelevant data and provide accurate droplet information. Experimental results from the Rensselaer Polytechnic Institute annular flow test facility provide droplet size and velocity data for a range of conditions that cooling water may face. Beyond its contributions to nuclear engineering, this research holds promise for influencing advancements in various applications involving liquid droplets, opening avenues for innovation in the broader scientific and engineering communities.
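The thesis's identification program is written in Matlab; the rejection idea it describes (keep only droplets whose edges are steep enough to count as in focus) can be sketched in Python as below. The thresholds and the gradient-based sharpness test are assumptions, not the program's actual criteria.

```python
import numpy as np
from scipy import ndimage

def focused_droplets(img, intensity_thresh=0.5, sharpness_thresh=0.1):
    """Label dark droplet candidates in a backlit frame and keep those
    whose boundary gradient is steep enough to count as in focus.

    img: grayscale frame scaled to [0, 1]. Thresholds are illustrative.
    """
    labels, n = ndimage.label(img < intensity_thresh)    # droplets = dark blobs
    grad = np.hypot(*np.gradient(img.astype(float)))     # edge-steepness map
    kept = []
    for i in range(1, n + 1):
        mask = labels == i
        edge = mask & ~ndimage.binary_erosion(mask)      # blob boundary pixels
        if edge.any() and grad[edge].mean() > sharpness_thresh:
            kept.append(i)                               # sharp edge -> in focus
    return labels, kept
```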
20

Extending the depth of focus using digital image filtering

Hu, Guang-hua 14 November 2012 (has links)
Two types of image processing methods capable of forming a composite image from a set of image slices which have in-focus as well as out-of-focus segments are discussed. The first type is based on space domain operations and has been discussed in the literature. The second type, to be introduced, is based on the intuitive concept that the spectral energy distribution of a focused object is biased towards lower frequencies after blurring. This approach requires digital image filtering in the spatial frequency domain. A comparison among methods of both types is made using a quantitative fidelity criterion. / Master of Science
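The frequency-domain observation that defocus drains energy from the high frequencies suggests a block-wise selection rule like the sketch below. This is a reconstruction of the general approach for illustration; the cutoff radius and block size are assumptions, and the thesis's actual spatial-frequency filters may differ.

```python
import numpy as np

def high_freq_ratio(block):
    """Fraction of a block's spectral energy outside the lowest
    frequencies; defocus blur pushes this ratio down."""
    F = np.abs(np.fft.fftshift(np.fft.fft2(block))) ** 2
    h, w = F.shape
    r = max(1, min(h, w) // 8)                           # assumed low-frequency cutoff
    low = F[h//2 - r:h//2 + r, w//2 - r:w//2 + r].sum()
    total = F.sum()
    return (total - low) / total if total > 0 else 0.0

def composite_by_blocks(slices, bs=32):
    """Form an extended-depth-of-focus composite by taking each block
    from the slice with the most high-frequency energy."""
    stack = np.asarray(slices, dtype=float)
    out = np.empty_like(stack[0])
    H, W = out.shape
    for y in range(0, H, bs):
        for x in range(0, W, bs):
            blocks = stack[:, y:y+bs, x:x+bs]
            best = max(range(len(stack)), key=lambda i: high_freq_ratio(blocks[i]))
            out[y:y+bs, x:x+bs] = blocks[best]
    return out
```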
