11

Image compression using the one-dimensional discrete pulse transform

Uys, Ernst Wilhelm 03 1900 (has links)
Thesis (MSc)--University of Stellenbosch, 2011. / The nonlinear LULU smoothers excel at removing impulsive noise from sequences and possess a variety of theoretical properties that make it possible to perform a so-called Discrete Pulse Transform, a novel multiresolution analysis technique that decomposes a sequence into highly structured resolution levels, analogous to a Discrete Wavelet Transform. We explore the use of a one-dimensional Discrete Pulse Transform as the central element of a digital image compressor. We depend crucially on the ability of space-filling scanning orders to map two-dimensional image data to one dimension while sacrificing as little image structure as possible. Both lossless and lossy image compression are considered, leading to five new image compression schemes that give promising results when compared with state-of-the-art image compressors.
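The LULU operators and the pulse decomposition are simple enough to sketch. The following is a minimal illustrative Python version, assuming edge padding at the boundaries and the U_n after L_n composition order; it is not the author's implementation, which may use other conventions.

```python
import numpy as np

def _fwd_min(x, n):
    # m[j] = min(x[j], ..., x[j+n]); extend the right edge so lengths match
    xp = np.pad(x, (0, n), mode='edge')
    return np.array([xp[j:j + n + 1].min() for j in range(len(x))])

def _bwd_max(m, n):
    # (L_n x)[i] = max(m[i-n], ..., m[i]); extend the left edge
    mp = np.pad(m, (n, 0), mode='edge')
    return np.array([mp[i:i + n + 1].max() for i in range(len(m))])

def lulu_L(x, n):
    """L_n: removes upward impulses (peaks) of width <= n."""
    return _bwd_max(_fwd_min(np.asarray(x, dtype=float), n), n)

def lulu_U(x, n):
    """U_n: removes downward impulses (pits); the dual of L_n."""
    return -lulu_L(-np.asarray(x, dtype=float), n)

def dpt(x, N):
    """First N resolution levels of a Discrete Pulse Transform,
    so that x == sum(levels) + residual."""
    levels, r = [], np.asarray(x, dtype=float)
    for n in range(1, N + 1):
        p = lulu_U(lulu_L(r, n), n)   # P_n: smooth away pulses of width <= n
        levels.append(r - p)          # D_n: the pulses of width exactly n
        r = p
    return levels, r

# e.g. dpt([0, 0, 5, 5, 0, 1, 1, 1, 0], N=3) separates the width-2 and
# width-3 pulses into different resolution levels.
```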
12

3D multiresolution statistical approaches for accelerated medical image and volume segmentation

Al Zu'bi, Shadi Mahmoud January 2011 (has links)
Medical volume segmentation has attracted many researchers, and many techniques have been implemented for medical imaging, including segmentation and other image processing tasks. This research focuses on the implementation of a segmentation system that uses several techniques, together or on their own, to segment medical volumes; the system takes a stack of 2D slices or a full 3D volume acquired from medical scanners as input. Two main approaches have been implemented for segmenting medical volumes: multiresolution analysis and statistical modeling. Multiresolution analysis is employed mainly for feature extraction; higher dimensions of discontinuity (line or curve singularities) are extracted from medical images using modified multiresolution transforms such as the ridgelet and curvelet transforms. The second approach is statistical modeling: Hidden Markov models are enhanced to segment medical slices automatically, accurately, reliably, and without loss of information. The drawback of Markov models is their long computational time. This is addressed by the feature and dimensionality reduction techniques also implemented in this thesis, including Principal Component Analysis and Gaussian pyramids, which accelerate the slowest block in the proposed system. The reduction techniques are employed efficiently alongside 3D volume segmentation techniques such as the 3D wavelet transform and 3D Hidden Markov models. The system has been tested and validated using several procedures, starting with a comparison against predefined results, followed by specialists' validations, and ending with a survey completed by end users covering the techniques and the results. The conclusion is that Markovian segmentation results outperform all other techniques in most patient cases. The curvelet transform also produces promising segmentation results; end users rate it above Markovian models because of the long runtime required by Hidden Markov models.
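As an illustration of the feature-reduction step the abstract mentions, here is a generic PCA projection in Python; the function name and the SVD formulation are mine, not the thesis's.

```python
import numpy as np

def pca_reduce(features, k):
    """Project row-vector features onto their top-k principal components.
    A generic sketch of the feature-reduction step, not the thesis's code."""
    X = features - features.mean(axis=0)      # center each feature column
    # right singular vectors of the centered data are the principal axes
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:k].T                       # k-dimensional features

# e.g. reduce per-voxel transform features before HMM decoding
# (hypothetical variable): reduced = pca_reduce(curvelet_features, k=10)
```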
13

Measuring Data Abstraction Quality in Multiresolution Visualizations

Cui, Qingguang 11 April 2007 (has links)
Data abstraction techniques are widely used in multiresolution visualization systems to reduce visual clutter and facilitate analysis from overview to detail. However, analysts are usually unaware of how well the abstracted data represent the original dataset, which can affect the reliability of results gleaned from the abstractions. In this thesis, we define three types of data abstraction quality measures that compute the degree to which an abstraction conveys the original dataset: the Histogram Difference Measure, the Nearest Neighbor Measure, and the Statistical Measure. They have been integrated into XmdvTool, a public-domain multiresolution visualization system for multivariate data analysis that supports both sampling and clustering to simplify data. Several interactive operations are provided, including adjusting the data abstraction level, changing selected regions, and setting the acceptable data abstraction quality level. Using these operations, analysts can select an optimal data abstraction level. We conducted an evaluation of how well the data abstraction measures conform to the abstraction quality perceived by users, and adjusted the measures based on its results. We also experimented with different distance methods and different computing mechanisms in order to find the optimal variation among the many variations of each type of measure. Finally, we developed two case studies demonstrating how analysts can use the measures to compare abstraction methods, assess how well relative data density and outliers are maintained, and select an abstraction method that meets the requirements of their analytic tasks.
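The thesis defines its own Histogram Difference Measure; as a hedged illustration, one plausible variant compares per-dimension normalized histograms of the original and abstracted data. Everything below (names, binning, the total-variation form) is an assumption, not the thesis's formula.

```python
import numpy as np

def histogram_difference(original, abstracted, bins=10):
    """Illustrative abstraction-quality score: 1 minus the mean per-dimension
    total-variation distance between normalized histograms (1.0 = identical).
    A plausible stand-in for the thesis's Histogram Difference Measure."""
    scores = []
    for d in range(original.shape[1]):
        # shared bin range so the two histograms are comparable
        lo = min(original[:, d].min(), abstracted[:, d].min())
        hi = max(original[:, d].max(), abstracted[:, d].max())
        h1, _ = np.histogram(original[:, d], bins=bins, range=(lo, hi))
        h2, _ = np.histogram(abstracted[:, d], bins=bins, range=(lo, hi))
        p, q = h1 / h1.sum(), h2 / h2.sum()   # normalize to distributions
        scores.append(1.0 - 0.5 * np.abs(p - q).sum())
    return float(np.mean(scores))
```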
14

A Wavelet Based Multiscale Run-by-Run Controller for Multiple Input Multiple Output (MIMO) Processes

Kothamasu, Santosh 11 May 2004 (has links)
Run-by-Run (RbR) control is an online supervisory control strategy designed for the batch manufacturing industry. The objective of RbR control is to minimize process drift, shift, and variability between machine runs, thereby reducing costs. The most widely used RbR controllers employ the Exponentially Weighted Moving Average (EWMA) filter. However, the linear nature of the EWMA filter makes these controllers inefficient for processes with features at multiple frequencies (also known as multiscale processes). Recent developments in wavelet theory have enhanced the ability to analyze events in multiscale processes, and new RbR control strategies incorporating wavelet analysis have started to emerge. These controllers, developed at the University of South Florida, appear robust in dealing with multiscale processes. The objective of this research is to integrate the wavelet-based multiscale analysis approach with the existing double EWMA RbR control strategy for controlling a multiple input multiple output (MIMO) process. The new controller (the WRbR controller) is applied to a Chemical Mechanical Planarization process with four inputs and two outputs. A continuous drift and a mean shift are introduced into the process, which is then controlled using both the existing double EWMA controller and the new wavelet-based RbR controller. The results indicate that the wavelet-based controller performs better in terms of the average squared deviation and the standard deviation of the process outputs. Moreover, the observed decrease in the magnitude of the average absolute input deviation indicates smoother process operation.
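The double EWMA recursion the controller builds on is standard and can be sketched for the single-input single-output case; the weights and gain below are illustrative, and the thesis's MIMO, wavelet-filtered version is more involved.

```python
class DoubleEWMAController:
    """Single-input single-output double-EWMA RbR controller in its standard
    form; a sketch, not the thesis's wavelet-based MIMO extension."""

    def __init__(self, target, gain, w1=0.3, w2=0.1):
        self.T, self.b = target, gain      # target output, estimated process gain
        self.w1, self.w2 = w1, w2          # offset and drift filter weights
        self.a = 0.0                       # EWMA estimate of process offset
        self.d = 0.0                       # EWMA estimate of process drift

    def next_recipe(self, y, u):
        """Update offset/drift estimates from run output y at input u,
        then return the recipe (input) for the next run."""
        a_prev = self.a
        self.a = self.w1 * (y - self.b * u) + (1 - self.w1) * self.a
        self.d = self.w2 * (y - self.b * u - a_prev) + (1 - self.w2) * self.d
        return (self.T - self.a - self.d) / self.b
```

A drifting process fed through `next_recipe` run by run is steered back toward the target; the second (drift) term is what distinguishes the double EWMA from the single-filter controller.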
15

LODStrips: Continuous Level of Detail using Triangle Strips

Ramos Romero, Jose Francisco 29 April 2008 (has links)
In recent years, multiresolution models have progressed substantially. Initially, discrete models were employed in graphics applications, mainly because of the low complexity involved in implementing them, which is why they are still used today in applications without high graphics requirements. Nevertheless, the increasing realism of graphics applications makes it necessary to use multiresolution models that are more exact in their approximations, that do not incur high storage costs, and that are faster to visualize. This has given way to continuous models, in which two consecutive levels of detail differ by only a few polygons and in which duplication of information is largely avoided, thus improving on the spatial cost of most discrete models. Advances have also been made in new graphics primitives that minimize data transfer between the CPU and the GPU while exploiting the connectivity information in a polygonal mesh. For this purpose, graphics primitives with implicit connectivity, such as triangle strips and triangle fans, have been developed (see the sketch below), and many continuous models based on this type of primitive have recently appeared. In the last few years, graphics hardware performance has evolved remarkably, giving rise to new techniques that accelerate continuous models even further. In this work, we improve the interactive rendering of polygonal meshes. To tackle the problem, we first studied fundamental techniques for rendering polygonal meshes efficiently and then made use of geometry simplification and level-of-detail techniques. We thus defined a multiresolution model that represents a polygonal mesh at any given resolution. This approach manages continuous level of detail by smoothly adapting mesh resolution to the application's requirements. Moreover, the model was modified to take maximum advantage of recent GPU features. We also created a modified version of the model for use with deforming meshes. Finally, we developed an independent library to integrate our model into real-time applications.
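The space saving of implicit connectivity is easy to see in code: a strip of N triangles is encoded with N + 2 indices instead of 3N. A minimal sketch, not taken from the thesis:

```python
def strip_to_triangles(strip):
    """Expand a triangle strip (implicit connectivity) into explicit
    triangles: N triangles need only N + 2 indices instead of 3N.
    Winding alternates, so odd triangles are flipped to keep orientation."""
    tris = []
    for i in range(len(strip) - 2):
        a, b, c = strip[i], strip[i + 1], strip[i + 2]
        tris.append((a, b, c) if i % 2 == 0 else (b, a, c))
    return tris

# strip_to_triangles([0, 1, 2, 3, 4]) -> [(0, 1, 2), (2, 1, 3), (2, 3, 4)]
```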
16

A Study of Velocity-Dependent JND of Haptic Model Detail

Tang, John Ko-Han January 2010 (has links)
The study of haptics, or the sense of touch in virtual reality environments, constantly seeks improvements in high-fidelity modeling. Highly detailed models are desirable, but they often lead to slow processing times, which can mean a loss of fidelity in the force feedback sensations. Model compression techniques are therefore critical to balancing model detail against processing time. One proposed compression technique is to create multiple models of the same object, each with a different level of detail (LOD). The technique hypothesizes that the human arm loses sensitivity to forces as its movement speed increases; the compression technique thus selects which model to use based on the user's movement speed. This dissertation studies how the movement speed of the user affects the user's ability to sense changes in the detail of haptic models. Experiments are conducted using different haptic surfaces whose levels of detail are changed while the subject interacts with them, mimicking the effects of a multiresolution compression implementation. The tests focus on the subjects' ability to differentiate changes in the surfaces at each speed. The first experiment uses curved surfaces with multiple resolutions and observes the user's sensitivity when the surface details are small. The results show that subjects are more sensitive to changes in small details at lower speeds than at higher speeds. The second experiment measures sensitivity to larger features using trapezoidal surfaces with different angles. A trapezoidal surface can be seen as a low-resolution haptic model with only two vertices, and changing the angle of the trapezoid corresponds to changing the radius of curvature. With the same speed settings as in the first experiment, sensitivity to changes in curvature was predicted to decrease with increasing speed; however, the results of this experiment proved otherwise. The conclusions suggest that multiresolution designs are not a straightforward reduction of LOD, even though movement speed does affect haptic sensitivity. The model's geometry should be taken into account when designing the parameters for haptic model compression. The results from the experiments provide insights for future haptic multiresolution compression designs.
17

A Hierarchical On-Line Path Planning Scheme using Wavelets

Bakolas, Efstathios 02 April 2007 (has links)
The main objective of this thesis is to present a new path planning scheme for solving the shortest (collision-free) path problem for an agent (vehicle) operating in a partially known environment. We present two novel algorithms to solve the planning problem. For both approaches we assume that the agent has detailed knowledge of the environment and the obstacles only in the vicinity of its current position; faraway obstacles and the final destination are only partially known and may even change dynamically at each instant of time. The path planning scheme is based on information gathered online by the available on-board sensor devices. The solution minimizes the total length of the path with respect to a metric that combines actual path length with a risk-induced metric. In order to obtain an approximation of the whole configuration space at different levels of fidelity, we use a wavelet approximation scheme. In the first proposed algorithm, the path planning problem is solved using a multiresolution cell decomposition of the environment obtained from the wavelet transform. In the second algorithm, we extend the results of the first by using the multiresolution representation of the environment in conjunction with a conformal mapping to polar coordinates. By performing the cell decomposition in polar coordinates, we can naturally incorporate sector-like cells that are adapted to the data representation collected by the on-board sensor devices.
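A minimal sketch of the multiresolution environment approximation: with the 2D Haar scaling function, each coarser level reduces to averaging 2x2 blocks of the finer one. This is illustrative only; the thesis's cell decomposition and its polar-coordinate variant add considerably more machinery.

```python
import numpy as np

def haar_pyramid(grid, levels):
    """Coarse-to-fine approximations of an occupancy/risk grid via 2D Haar
    approximation coefficients: each level averages disjoint 2x2 blocks.
    Assumes grid sides are divisible by 2**levels."""
    pyramid = [np.asarray(grid, dtype=float)]
    for _ in range(levels):
        g = pyramid[-1]
        coarse = 0.25 * (g[0::2, 0::2] + g[0::2, 1::2]
                         + g[1::2, 0::2] + g[1::2, 1::2])
        pyramid.append(coarse)
    return pyramid  # pyramid[0]: finest (near the agent); pyramid[-1]: coarsest
```

A planner can then search over fine cells near the agent's position and coarse cells far away, which is the high-resolution-where-it-matters idea the abstract describes.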
18

Recovery of the local gravity field by spherical regularization wavelets approximation and its numerical implementation

Shuler, Harrey Jeong 29 April 2014 (has links)
As an alternative to spherical harmonics for modeling the gravity field of the Earth, we built a multiresolution gravity model by employing spherical regularization wavelets to solve the inverse problem, i.e. the downward continuation of the gravity signal to the Earth's surface. Scale-discrete Tikhonov spherical regularization scaling functions and wavelet packets were used to decompose and reconstruct the signal. We recovered the local gravity anomaly using only localized gravity measurements at the observing satellite's altitude of 300 km. When the gravity anomaly upward-continued to the satellite altitude with a resolution of 0.5° was used as the simulated measurement input, our model could recover the local surface gravity anomaly at a spatial resolution of 1° with an RMS error between 1 and 10 mGal, depending on the topography of the gravity field. Our study of the effect of varying the data volume and altering the maximum degree of the Legendre polynomials on the accuracy of the recovered gravity solution suggests that short-wavelength signals and regions with high-magnitude gravity gradients respond more strongly to such changes. When tested with simulated SGG measurements, i.e. the second-order radial derivative of the gravity anomaly, at an altitude of 300 km with a 0.7° spatial resolution as input data, our model could obtain the gravity anomaly with an RMS error of 1 to 7 mGal at a surface resolution of 0.7° (< 80 km). The study of the impact of measurement noise on the recovered gravity anomaly implies that solutions from SGG measurements are less susceptible to measurement errors than those recovered from the upward-continued gravity anomaly, indicating that an SGG-type mission such as GOCE would be an ideal choice for implementing our model. Our simulation results demonstrate the model's potential for determining the local gravity field at a finer scale than can be achieved through spherical harmonics, i.e. less than 100 km, with excellent performance in edge detection.
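The ill-posedness that motivates the regularization can be made concrete: continuing a spherical-harmonic component of degree n down from altitude h to the surface multiplies it by ((R + h) / R)^(n + 1). A quick check with nominal constants (not values from the thesis):

```python
def downward_continuation_factor(n, altitude_km=300.0, R_km=6371.0):
    """Per-degree amplification when continuing a harmonic of degree n from
    satellite altitude down to the Earth's surface. Nominal mean Earth radius
    and the 300 km altitude from the abstract; illustrative only."""
    return ((R_km + altitude_km) / R_km) ** (n + 1)

# e.g. degree 180 (roughly 1 degree resolution): factor on the order of 4e3,
# so measurement noise at altitude is amplified thousands of times unless
# the inversion is regularized.
```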
20

Implementation and Evaluation of Spatiotemporal Prediction Algorithms and Prediction of Spatially Distributed Greenhouse Gas Inventories

Rodway, James EA Unknown Date
No description available.
