11

An Edge-Preserving Super-Precision for Simultaneous Enhancement of Spacial and Grayscale Resolutions

SAKANIWA, Kohichi, YAMADA, Isao, OHTSUKA, Toshinori, HASEGAWA, Hiroshi 01 February 2008 (has links)
No description available.
12

SAR Remote Sensing of Canadian Coastal Waters using Total Variation Optimization Segmentation Approaches

Kwon, Tae-Jung 28 April 2011 (has links)
Synthetic aperture radar (SAR) sensors onboard Earth-observing satellites have become an integral tool for many marine-environment monitoring applications, including regional sea-ice monitoring and the detection of illegal or accidental oil discharges from ships. In practice, however, the usefulness of SAR images is greatly hindered by the presence of speckle noise, which must be removed or reduced before the images can support real-world applications that protect the marine environment. This thesis therefore presents a novel two-phase total variation optimization segmentation approach to tackle this challenging task. In the total variation optimization phase, the Rudin-Osher-Fatemi total variation model is modified and applied iteratively to estimate a piecewise smooth state by minimizing the total variation constraints. In the finite mixture model classification phase, an expectation-maximization method estimates the final class likelihoods using a Gaussian mixture model, and a maximum likelihood classification step then produces the final segmented result. The method was first evaluated on a synthetic image to test its effectiveness, and then applied to two distinct real SAR images, X-band COSMO-SkyMed imagery containing verified oil spills and C-band RADARSAT-2 imagery containing two different sea-ice types, to confirm its robustness. Its performance was also compared with other well-established methods. Together with its short processing time, visual inspection and quantitative analysis of the segmentation results, including kappa coefficients and F1 scores, confirm the superiority of the proposed method over the existing alternatives.
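
Purely as an illustration of the two-phase idea described in this abstract, the minimal Python sketch below first reduces speckle-like noise by gradient descent on a smoothed ROF-type total-variation energy and then classifies the smoothed intensities with a Gaussian mixture model fitted by expectation-maximization. The function names, parameters, and the simple descent scheme are assumptions made for illustration only and do not reproduce the thesis's modified ROF iteration.

```python
# Minimal sketch: (1) TV smoothing, (2) EM-fitted Gaussian mixture + ML labelling.
import numpy as np
from sklearn.mixture import GaussianMixture

def tv_denoise(img, lam=0.1, step=0.2, iters=200, eps=1e-6):
    """Gradient descent on E(u) = sum sqrt(|grad u|^2 + eps) + lam/2 * ||u - img||^2."""
    u = img.astype(float).copy()
    for _ in range(iters):
        ux = np.roll(u, -1, axis=1) - u               # forward differences
        uy = np.roll(u, -1, axis=0) - u
        mag = np.sqrt(ux ** 2 + uy ** 2 + eps)        # smoothed gradient magnitude
        px, py = ux / mag, uy / mag
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u -= step * (lam * (u - img) - div)           # descend the smoothed ROF energy
    return u

def segment(img, n_classes=2, **tv_kwargs):
    """Phase 1: TV smoothing; phase 2: Gaussian-mixture classification of intensities."""
    smooth = tv_denoise(img, **tv_kwargs)
    gmm = GaussianMixture(n_components=n_classes).fit(smooth.reshape(-1, 1))
    return gmm.predict(smooth.reshape(-1, 1)).reshape(img.shape)
```

For a two-class problem such as oil slick versus open water, segment(img, n_classes=2) would return a label map of the same shape as the input image.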
13

A study of the sensitivity of topological dynamical systems and the Fourier spectrum of chaotic interval maps

Roque Sol, Marco A. 02 June 2009 (has links)
We study some topological properties of dynamical systems, in particular the relationship between spatio-temporal chaotic and Li-Yorke sensitive dynamical systems, establishing that for minimal dynamical systems these properties are equivalent. In the same direction, we show that a Li-Yorke sensitive dynamical system is also Li-Yorke chaotic. We also survey the possibility of lifting some topological properties from a given dynamical system (Y, S) to another (X, T). After studying some basic facts about topological dynamical systems, we move to the particular case of interval maps. Through the knowledge of interval maps f : I → I, valuable information about the chaotic behavior of general nonlinear dynamical systems can be obtained. It is also well known that the spectrum of a time series encloses important information about the signal itself. In this work we look for possible connections between chaotic dynamical systems and the behavior of their Fourier coefficients. We find that a natural bridge between these two concepts is given by the total variation of a function and its connection with the topological entropy associated with the n-th iterate f^n(x) of the map. Working naturally in the Sobolev spaces W^{p,q}(I), we show how the Fourier coefficients are related to the chaoticity of interval maps.
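
As an illustration of the "bridge" mentioned in this abstract, the following display records two standard facts (not the thesis's precise theorems): the Fourier coefficients of a periodised function of bounded variation decay at least like its total variation over the frequency, and for a piecewise monotone map of a bounded interval into itself the growth rate of the variation of the iterates is controlled by the topological entropy.

```latex
% Decay of Fourier coefficients for a (periodised) function of bounded variation on [0,1]:
\[
  \hat{g}(k) = \int_0^1 g(x)\, e^{-2\pi i k x}\, dx ,
  \qquad
  |\hat{g}(k)| \;\le\; \frac{\operatorname{Var}(g)}{2\pi |k|}, \qquad k \neq 0 .
\]
% For a piecewise monotone map f of a bounded interval I into itself, each lap of the
% n-th iterate contributes variation at most |I|, so the growth of Var(f^n) is bounded
% by the lap-counting rate, i.e. the topological entropy:
\[
  \limsup_{n \to \infty} \frac{1}{n} \log \operatorname{Var}\!\left(f^{\,n}\right)
  \;\le\; h_{\mathrm{top}}(f).
\]
% Combining the two displays ties the decay of the Fourier coefficients of f^n
% to the chaoticity (entropy) of the map.
```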
14

Spatially Regularized Reconstruction of Fibre Orientation Distributions in the Presence of Isotropic Diffusion

Zhou, Quan 14 April 2014 (has links)
The connectivity and structural integrity of the white matter of the brain is known to be implicated in a wide range of brain-related diseases and injuries. However, it is only since the advent of diffusion magnetic resonance imaging (dMRI) that researchers have been able to probe the microstructure of white matter in vivo. Presently, among the available dMRI methods, high angular resolution diffusion imaging (HARDI) is known to excel in its ability to provide reliable information about the local orientations of neural fasciculi (fibre tracts). It preserves the high angular resolution of diffusion spectrum imaging (DSI) while requiring fewer measurements, and, as opposed to the more traditional diffusion tensor imaging (DTI), it can distinguish the orientations of multiple fibres passing through a given spatial voxel. Unfortunately, the ability of HARDI to discriminate neural fibres that cross each other at acute angles remains limited, which has motivated the development of numerous post-processing tools aimed at improving the angular resolution of HARDI. Among such methods, spherical deconvolution (SD) has attracted the most attention. Due to its ill-posed nature, however, standard SD relies on a number of a priori assumptions needed to render its results unique and stable. In the present thesis, we introduce a novel approach to non-blind SD of HARDI signals which accounts not only for the anisotropic diffusion component of the HARDI signal but also, explicitly, for its isotropic diffusion component. As a result, in addition to reconstructing fibre orientation distribution functions (fODFs), our algorithm yields a useful estimate of the related isotropic diffusion map (IDM), which quantifies the relative contribution of the isotropic diffusion component as well as its spatial pattern. Moreover, one of the principal contributions is to demonstrate the effectiveness of exploiting different prior models for regularization of the spatial-domain behaviour of the reconstructed fODFs and IDMs. Specifically, the fibre continuity model is used to force the local maxima of the fODFs to vary consistently throughout the brain, whereas the bounded variation model helps us achieve a piecewise smooth reconstruction of the IDMs. The proposed algorithm is formulated as a convex minimization problem, which admits a unique and stable minimizer, and, using the alternating direction method of multipliers (ADMM), the optimal solution is found via a sequence of simpler optimization problems that are both computationally efficient and amenable to parallel computation. In a series of in silico and in vivo experiments, we demonstrate how the proposed solution can successfully overcome the effect of partial voluming while preserving the spatial coherency of cerebral diffusion at moderate to severe noise levels. The performance of the proposed method is compared with that of several available alternatives, with the comparative results clearly supporting the viability and usefulness of our approach and illustrating the power of the applied spatial regularization terms.
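
Schematically, and with hypothetical symbols chosen only for illustration (this is not the thesis's exact formulation), a spatially regularized reconstruction of this kind can be written as a single convex program:

```latex
\[
  \min_{\{f_v \ge 0\},\; m \ge 0}\;
  \sum_{v} \big\| A\, f_v + m_v\, a_{\mathrm{iso}} - s_v \big\|_2^2
  \;+\; \lambda_1\, R_{\mathrm{FC}}\!\left(\{f_v\}\right)
  \;+\; \lambda_2\, \mathrm{TV}(m).
\]
% f_v      : fODF coefficients at voxel v (anisotropic component)
% m_v      : isotropic fraction at voxel v (the IDM)
% A, a_iso : anisotropic and isotropic response terms of the deconvolution model
% s_v      : measured HARDI signal at voxel v
% R_FC     : fibre-continuity penalty enforcing spatial coherence of fODF maxima
% TV       : bounded-variation penalty yielding piecewise smooth IDMs
```

Splitting the data term, the fibre-continuity term, and the TV term across separate subproblems is what lets an ADMM-type scheme reduce such a program to a sequence of simpler, parallelizable updates, as the abstract describes.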
16

Reconstrução tomográfica de imagens SPECT a partir de poucos dados utilizando variação total / Tomographic reconstruction of SPECT images from few data using total variation

João Guilherme Vicente de Araujo 13 April 2017 (has links)
In order to perform attenuation correction in single photon emission computed tomography (SPECT), we need to measure and reconstruct the map of attenuation coefficients using a transmission tomography scan, performed either before or simultaneously with the emission scan. This approach increases the cost of producing the image and, in some cases, considerably lengthens the examination, making patient immobility an important factor for the success of the reconstruction. An alternative that dispenses with the transmission scan is to reconstruct both the activity image and the attenuation map from emission data alone. Within this approach we propose a method based on Censor's algorithm, whose objective is to solve a mixed concave-convex feasibility problem in order to reconstruct both images simultaneously. The proposed method is formulated as a minimization problem in which the objective function is the total variation of the images, subject to Censor's mixed feasibility constraints. Tests were performed on simulated images, and the results obtained in the absence of noise were satisfactory even for a small amount of data. In the presence of Poisson-distributed noise the method was unstable, and the choice of tolerances in that case remains an open problem.
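
In schematic form (illustrative notation only, not the thesis's exact statement), the reconstruction described above reads as a total-variation minimization over both unknown images subject to a mixed feasibility set:

```latex
\[
  \min_{x \ge 0,\ \mu \ge 0}\;\; \mathrm{TV}(x) + \mathrm{TV}(\mu)
  \qquad \text{subject to} \qquad (x, \mu) \in \mathcal{C}_{\mathrm{mixed}},
\]
% x        : activity (emission) image
% \mu      : attenuation coefficient map
% C_mixed  : Censor-type mixed concave-convex feasibility set derived from the
%            emission data, with tolerances on the attenuated projection constraints
```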
17

Robust Multiframe Super-Resolution with Adaptive Norm Choice Using Difference Curvature Based BTV Regularization

Liu, Xiaohong January 2016 (has links)
Multi-frame image super-resolution focuses on reconstructing a high-resolution image from a set of highly similar low-resolution images. Since super-resolution is an ill-posed problem, regularization techniques are widely used to constrain the minimization. By combining image prior knowledge with a fidelity model, Bayesian methods can effectively solve this ill-posed problem, which makes them more popular than other approaches. Our proposed model is based on maximum a posteriori (MAP) estimation. In this thesis, we propose a novel initialization method based on the median operator to initialize the estimated high-resolution image. For the fidelity term of the proposed algorithm, half-quadratic estimation is used to choose the error norm adaptively instead of using a fixed L1 or L2 norm. Furthermore, for the regularization term, we propose a novel regularizer based on Difference Curvature (DC) and Bilateral Total Variation (BTV) to suppress mixed noise and preserve image edges simultaneously. In our experiments, both synthetic and real data are tested to demonstrate the superiority of the proposed method, in terms of clearer texture and less noise, over other state-of-the-art methods.
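
For orientation, a commonly used MAP-style objective for multi-frame super-resolution, together with one standard form of the bilateral total variation prior, is sketched below. The thesis replaces the fixed error norm ρ with a half-quadratic, adaptively chosen norm and weights the BTV prior using difference-curvature information; those details are not reproduced here.

```latex
\[
  \hat{X} \;=\; \arg\min_{X}\;
  \sum_{k=1}^{K} \rho\!\left( D_k H_k F_k X - Y_k \right)
  \;+\; \lambda\, R_{\mathrm{BTV}}(X),
  \qquad
  R_{\mathrm{BTV}}(X) \;=\;
  \sum_{l=-P}^{P} \sum_{m=-P}^{P} \alpha^{|l|+|m|}
  \big\| X - S_x^{l} S_y^{m} X \big\|_1 .
\]
% Y_k            : k-th observed low-resolution frame
% F_k, H_k, D_k  : warping, blur, and decimation operators for frame k
% S_x^l, S_y^m   : horizontal and vertical shifts by l and m pixels
% \alpha in (0,1): spatial decay weight;  \rho : error norm (fixed L1/L2 classically)
```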
18

Consecutive Covering Arrays and a New Randomness Test

Godbole, A. P., Koutras, M. V., Milienos, F. S. 01 May 2010 (has links)
A k × n array with entries from an "alphabet" A = { 0, 1, ..., q - 1 } of size q is said to form a t-covering array (resp. orthogonal array) if each t × n submatrix of the array contains, among its columns, at least one (resp. exactly one) occurrence of each t-letter word over A (we must thus have n = q^t for an orthogonal array to exist and n ≥ q^t for a t-covering array). In this paper, we continue the agenda laid down in Godbole et al. (2009), in which the notion of consecutive covering arrays was defined and motivated; a detailed study of these arrays for the special case q = 2 was also carried out by the same authors. In the present article we first use a Markov chain embedding method to exhibit, for general values of q, the probability distribution function of the random variable W = W_{k,n,t}, defined as the number of sets of t consecutive rows for which the submatrix in question is missing at least one word. We then use the Chen-Stein method (Arratia et al., 1989, 1990) to provide upper bounds on the total variation error incurred while approximating L(W) by a Poisson distribution Po(λ) with the same mean as W. Last but not least, the Poisson approximation is used as the basis of a new statistical test to detect run-based discrepancies in an array of q-ary data.
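
As an illustration of how such a Poisson approximation can back a run-based randomness test, here is a hedged Python sketch under the assumption of i.i.d. uniform q-ary entries (not the authors' code): it counts W, computes the matching null mean by inclusion-exclusion, and reports an upper-tail Poisson p-value. The mean calculation ignores the dependence between overlapping windows, which is precisely what the Chen-Stein bound controls.

```python
# Run-based test sketch: flag windows of t consecutive rows missing a t-letter word.
import numpy as np
from math import comb
from scipy.stats import poisson

def count_deficient_windows(arr, t, q):
    """W = number of sets of t consecutive rows missing at least one t-letter word."""
    k, n = arr.shape
    W = 0
    for i in range(k - t + 1):
        window = arr[i:i + t, :]
        words = {tuple(window[:, j]) for j in range(n)}
        if len(words) < q ** t:          # some word of length t never occurs
            W += 1
    return W

def null_mean(k, n, t, q):
    """lambda = (k - t + 1) * P(one t x n window misses at least one word),
    by inclusion-exclusion for i.i.d. uniform q-ary entries (small q, t only)."""
    m = q ** t
    p_miss = sum((-1) ** (j + 1) * comb(m, j) * (1 - j / m) ** n
                 for j in range(1, m + 1))
    return (k - t + 1) * p_miss

def poisson_pvalue(arr, t, q):
    a = np.asarray(arr)
    W = count_deficient_windows(a, t, q)
    lam = null_mean(a.shape[0], a.shape[1], t, q)
    return poisson.sf(W - 1, lam)        # upper tail: P(Po(lam) >= W)
```

A very small poisson_pvalue(arr, t, q) indicates that deficient windows occur more often than a random q-ary array would allow, i.e. a run-based discrepancy in the data.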
19

A Systematic Evaluation of Compressed Sensing Algorithms Applied to Magnetic Resonance Imaging

Fassett, Scott William 22 May 2012 (has links)
No description available.
20

Space-Frequency Regularization for Qualitative Inverse Scattering

Alqadah, Hatim F. January 2011 (has links)
No description available.
