11

Sea spike modeling

Kuo, Chin-Chuan 12 1900 (has links)
Approved for public release; distribution is unlimited / In this thesis a clutter voltage model for scattering from the sea surface is developed. A model for the scattering from a whitecap and a wave-breaking occurrence model are combined to simulate the backscattered signal from one radar resolution cell. The simulations performed obtained the probability density function of sea clutter under different assumptions about wind velocity and wave-breaking conditions. The model incorporates measured quantities such as the mean clutter voltage and the correlation time as parameters, and the resulting probability density function depends on these parameters. The obtained probability density functions do not conform to any familiar simple density function. / http://archive.org/details/seaspikemodeling00kuoc / Lieutenant, Taiwan Navy
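For a concrete feel for this kind of compound clutter model, the sketch below simulates a clutter voltage in one resolution cell as correlated speckle gated by a Bernoulli wave-breaking occurrence process and estimates the envelope's probability density with a histogram. It is only an illustration in the spirit of the abstract: the thesis's whitecap scattering and wave-breaking models are not reproduced, and every parameter value (PRF, correlation time, breaking probability, whitecap gain) is an assumption.

```python
# Hypothetical sketch: compound sea-clutter voltage in one radar resolution cell.
# A Bernoulli wave-breaking process gates a stronger "whitecap" return on top of
# correlated Gaussian speckle; the envelope pdf is then estimated by histogram.
# Parameter values are illustrative, not taken from the thesis.
import numpy as np

rng = np.random.default_rng(0)

n_samples = 100_000
prf = 1000.0                 # pulse repetition frequency [Hz] (assumed)
corr_time = 0.01             # clutter correlation time [s] (assumed)
mean_voltage = 1.0           # mean clutter voltage (assumed)
p_break = 0.05               # probability a pulse sees a breaking wave (assumed)
whitecap_gain = 5.0          # extra amplitude during a breaking event (assumed)

# Correlated complex Gaussian speckle via a first-order AR recursion whose
# one-lag correlation matches exp(-1/(corr_time * prf)).
rho = np.exp(-1.0 / (corr_time * prf))
drive = (rng.standard_normal(n_samples) + 1j * rng.standard_normal(n_samples)) / np.sqrt(2)
speckle = np.empty(n_samples, dtype=complex)
speckle[0] = drive[0]
for k in range(1, n_samples):
    speckle[k] = rho * speckle[k - 1] + np.sqrt(1 - rho**2) * drive[k]

# Wave-breaking occurrence model: Bernoulli gating of a stronger whitecap return.
breaking = rng.random(n_samples) < p_break
amplitude = mean_voltage * np.where(breaking, whitecap_gain, 1.0)
clutter_voltage = amplitude * speckle

# Empirical probability density of the clutter envelope.
envelope = np.abs(clutter_voltage)
pdf, edges = np.histogram(envelope, bins=100, density=True)
print("mean envelope:", envelope.mean(), "  p(breaking):", breaking.mean())
print("first pdf bins:", np.round(pdf[:4], 3))
```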
12

Low Cost 3D Flow Estimation in Medical Ultrasound

January 2018 (has links)
abstract: Medical ultrasound imaging is widely used today because it is non-invasive and cost-effective. Flow estimation helps in the accurate diagnosis of vascular diseases and adds an important dimension to medical ultrasound imaging. Traditionally, flow estimation is done using Doppler-based methods, which only estimate velocity in the beam direction. Thus, when blood vessels are close to orthogonal to the beam direction, there are large errors in the estimation results. In this dissertation, a low-cost blood flow estimation method that does not have the angle dependency of Doppler-based methods is presented. First, a velocity estimator based on speckle tracking and synthetic lateral phase is proposed for clutter-free blood flow. Speckle tracking is based on kernel matching and does not have any angle dependency. While velocity estimation in the axial dimension is accurate, lateral velocity estimation is challenging due to reduced resolution and lack of phase information. This work presents a two-tiered method which estimates pixel-level movement using sum-of-absolute-differences and then estimates the sub-pixel level using synthetic phase information in the lateral dimension. Such a method achieves highly accurate velocity estimation with reduced complexity compared to a cross-correlation-based method. The average bias of the proposed estimation method is less than 2% for plug flow and less than 7% for parabolic flow. Second, blood is always accompanied by clutter, which originates from the vessel wall and surrounding tissues. As the magnitude of the blood signal is usually 40-60 dB lower than that of the clutter signal, clutter filtering is necessary before blood flow estimation. Clutter filters utilize the high-magnitude and low-frequency features of the clutter signal to effectively remove it from the compound (blood + clutter) signal. Instead of low-complexity FIR filters or high-complexity SVD-based filters, a power/subspace iteration-based method is proposed here for clutter filtering. Excellent clutter filtering performance is achieved for both slow- and fast-moving clutter with lower complexity compared to SVD-based filters. For instance, use of the proposed method results in a bias of less than 8% and a standard deviation of less than 12% for fast-moving clutter when the beam-to-flow angle is 90°. Third, a flow rate estimation method based on kernel power weighting is proposed. As the velocity estimator is a kernel-based method, the estimation accuracy degrades near the vessel boundary. In order to account for kernels that are not fully inside the vessel, fractional weights are given to these kernels based on their signal power. The proposed method achieves excellent flow rate estimation results with less than 8% bias for both slow- and fast-moving clutter. The performance of the velocity estimator is also evaluated for challenging models. A 2D version of the two-tiered method is able to accurately estimate velocity vectors in a spinning disk as well as in a carotid bifurcation model, both of which are part of the 2018 synthetic aperture vector flow imaging (SA-VFI) challenge. In fact, the proposed method ranked 3rd in the challenge on the carotid bifurcation test dataset. The flow estimation method is also evaluated for blood flow in vessels with stenosis. Simulation results show that the proposed method is able to estimate the flow rate with less than 9% bias. / Dissertation/Thesis / Doctoral Dissertation Electrical Engineering 2018
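As an illustration of the power/subspace iteration idea mentioned above, the sketch below removes the dominant spatial singular component of a synthetic slow-time ensemble (Casorati matrix) by power iteration and deflation. It is a generic sketch under assumed data sizes and clutter/blood levels, not the dissertation's algorithm.

```python
# Hedged sketch of power-iteration clutter suppression on a Casorati matrix
# (space x slow-time). The dominant singular components -- assumed to be tissue
# clutter -- are estimated by power iteration on X X^H and projected out.
# Synthetic data; not the dissertation's exact method.
import numpy as np

rng = np.random.default_rng(1)

def power_iteration_clutter_filter(X, n_clutter=1, n_iter=50):
    """Remove the n_clutter dominant spatial singular vectors from X."""
    Xf = X.copy()
    basis = []
    for _ in range(n_clutter):
        v = rng.standard_normal(Xf.shape[0]) + 1j * rng.standard_normal(Xf.shape[0])
        for _ in range(n_iter):
            v = Xf @ (Xf.conj().T @ v)        # one power-iteration step on Xf Xf^H
            v /= np.linalg.norm(v)
        basis.append(v)
        Xf = Xf - np.outer(v, v.conj() @ Xf)  # deflate: project the component out
    return Xf, np.stack(basis, axis=1)

# Synthetic ensemble: strong, slowly varying clutter plus weak, fast blood signal.
n_space, n_time = 64, 40
t = np.arange(n_time)
clutter = 100.0 * np.outer(rng.standard_normal(n_space), np.exp(1j * 0.05 * t))
blood = rng.standard_normal((n_space, n_time)) + 1j * rng.standard_normal((n_space, n_time))
X = clutter + blood

filtered, clutter_basis = power_iteration_clutter_filter(X, n_clutter=1)
print("mean power before/after filtering:",
      np.mean(np.abs(X)**2), np.mean(np.abs(filtered)**2))
```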
13

Studies on the salient properties of digital imagery that impact on human target acquisition and the implications for image measures.

Ewing, Gary John January 1999 (has links)
Electronically displayed images are becoming increasingly important as an interface between man and information systems. Lengthy periods of intense observation are no longer unusual. There is a growing awareness that specific demands should be made on displayed images in order to achieve an optimum match with the perceptual properties of the human visual system. These demands may vary greatly, depending on the task for which the displayed image is to be used and the ambient conditions. Optimal image specifications are clearly not the same for a home TV, a radar signal monitor or an infrared targeting image display. There is, therefore, a growing need for means of objective measurement of image quality, where "image quality" is used in a very broad sense and is defined in the thesis, but includes any impact of image properties on human performance in relation to specified visual tasks. The aim of this thesis is to consolidate and comment on the image measure literature, and to find through experiment the salient properties of electronically displayed real-world complex imagery that impact on human performance. These experiments were carried out for well-specified visual tasks (of real relevance), and the appropriate application of image measures to this imagery, to predict human performance, was considered. An introduction to certain aspects of image quality measures is given, and clutter metrics are integrated into this concept. A very brief and basic introduction to the human visual system (HVS) is given, with some basic models. The literature on image measures is analysed, with a resulting classification of image measures according to which features they attempt to quantify. A series of experiments was performed to evaluate the effects of image properties on human performance, using appropriate measures of performance. The concept of image similarity was explored by objectively measuring the subjective perception of imagery of the same scene obtained through different sensors and subjected to different luminance transformations. Controlled degradations were introduced by using image compression. Both still and video compression were used to investigate both spatial and temporal aspects of HVS processing. The effects of various compression schemes on human target acquisition performance were quantified. A study was carried out to determine the "local" extent to which the clutter around a target affects its detectability. It was found in this case that the accepted wisdom of setting the local domain (support of the metric) to twice the expected target size was incorrect. The local extent of clutter was found to be much greater, which has implications for the application of clutter metrics. An image quality metric called the gradient energy measure (GEM), for quantifying the effect of filtering on Nuclear Medicine-derived images, was developed and evaluated. This proved to be a reliable measure of image smoothing and noise level, which in preliminary studies agreed with human perception. The final study discussed in this thesis determined the performance of human image analysts, in terms of their receiver-operating characteristic, when using Synthetic Aperture Radar (SAR)-derived images in the surveillance context. In particular, the effects of target contrast and background clutter on human analyst target detection performance were quantified.
In the final chapter, suggestions to extend the work of this thesis are made, and in this context a system to predict human visual performance, based on input imagery, is proposed. This system intelligently selects image metrics based on the particular visual task, human expectations, and human visual system performance parameters. / Thesis (Ph.D.)--Medical School; School of Computer Science, 1999.
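Since the thesis's exact definition of the gradient energy measure is not given in this abstract, the sketch below uses one plausible reading, the mean squared gradient magnitude, simply to show how such a metric responds to smoothing; the synthetic Poisson "count" image and the 5x5 smoothing filter are assumptions.

```python
# Hedged sketch of a GEM-style metric: mean squared gradient magnitude, which
# falls as an image is smoothed and rises with added noise. Synthetic data only;
# not the thesis's exact definition.
import numpy as np
from scipy.ndimage import uniform_filter

def gradient_energy_measure(img):
    """Mean squared gradient magnitude of a 2-D image (assumed GEM-style metric)."""
    gy, gx = np.gradient(img.astype(float))
    return np.mean(gx**2 + gy**2)

rng = np.random.default_rng(2)
img = rng.poisson(50.0, size=(128, 128)).astype(float)   # noisy "count" image
smoothed = uniform_filter(img, size=5)                    # simple stand-in for the filtering

print("GEM noisy   :", gradient_energy_measure(img))
print("GEM smoothed:", gradient_energy_measure(smoothed))
```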
14

Improved Endocardial Border Definition with Short-Lag Spatial Coherence (SLSC) Imaging

Lediju Bell, Muyinatu A. January 2012 (has links)
Clutter is a problematic noise artifact in a variety of ultrasound applications. Clinical tasks complicated by the presence of clutter include detecting cancerous lesions in abdominal organs (e.g. livers, bladders) and visualizing endocardial borders to assess cardiovascular health. In this dissertation, an analytical expression for contrast loss due to clutter is derived, clutter is quantified in abdominal images, and sources of abdominal clutter are identified. Novel clutter reduction methods are also presented and tested in abdominal and cardiac images.
One of the novel clutter reduction methods is Short-Lag Spatial Coherence (SLSC) imaging. Instead of applying a conventional delay-and-sum beamformer to measure the amplitude of received echoes and form B-mode images, the spatial coherence of received echoes is measured to form SLSC images. The world's first SLSC images of simulated, phantom, and in vivo data are presented herein. They demonstrate reduced clutter and improved contrast, contrast-to-noise, and signal-to-noise ratios compared to conventional B-mode images. In addition, the resolution characteristics of SLSC images are quantified and compared to resolution in B-mode images.
A clinical study with 14 volunteers was conducted to demonstrate that SLSC imaging offers a 19-33% improvement in the visualization of endocardial borders when the quality of B-mode images formed from the same echo data was poor. There were no statistically significant improvements in endocardial border visualization with SLSC imaging when the quality of matched B-mode images was medium to good. / Dissertation
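The sketch below computes a single SLSC pixel value in the commonly published form: the normalized spatial correlation of delayed channel signals at element lags 1 through M, averaged per lag over a short axial kernel and summed over the short lags. The aperture size, kernel length, and synthetic channel data are assumptions for illustration.

```python
# Simplified SLSC sketch on synthetic, already-delayed channel data.
import numpy as np

def slsc_value(channel_data, max_lag):
    """channel_data: (n_elements, kernel_samples) delayed RF for one pixel's kernel."""
    n_el, _ = channel_data.shape
    slsc = 0.0
    for m in range(1, max_lag + 1):
        r_m = 0.0
        for i in range(n_el - m):
            a, b = channel_data[i], channel_data[i + m]
            denom = np.sqrt(np.sum(a**2) * np.sum(b**2))
            r_m += np.sum(a * b) / denom if denom > 0 else 0.0
        slsc += r_m / (n_el - m)   # average correlation at lag m, summed over short lags
    return slsc

rng = np.random.default_rng(3)
n_elements, kernel_samples = 64, 16
coherent = np.tile(np.sin(np.linspace(0, 4 * np.pi, kernel_samples)), (n_elements, 1))
noise = rng.standard_normal((n_elements, kernel_samples))

print("SLSC, mostly coherent target:", slsc_value(coherent + 0.3 * noise, max_lag=15))
print("SLSC, incoherent clutter    :", slsc_value(noise, max_lag=15))
```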
15

Model predictive control with haptic feedback for robot manipulation in cluttered scenarios

Killpack, Marc Daniel 13 January 2014 (has links)
Current robot manipulation and control paradigms have largely been developed for static or highly structured environments, such as those common in factories. For most techniques in robot trajectory generation, such as heuristic-based geometric planning, this has led to placing a high cost on contact with the world. This approach and methodology can be prohibitive for robots operating in many unmodeled and dynamic environments. This dissertation presents work on using haptic feedback (torque and tactile sensing) to formulate a controller for robot manipulation in clutter. We define “clutter” as any environment in which we expect the robot to make both incidental and purposeful contact while maneuvering and manipulating. The controllers developed in this dissertation take the form of single- or multi-time-step Model Predictive Control (a form of optimal control which incorporates feedback) that attempts to regulate contact forces at multiple locations on a robot arm while reaching to a goal. The results and conclusions in this dissertation are based on extensive testing in simulation (tens of thousands of trials) and testing in realistic scenarios with real robots incorporating tactile sensing. The approach is novel in the sense that it allows contact and explicitly incorporates a contact and predictive model of the robot arm when calculating the control effort at every time step. The expected broader impact of this research is progress towards a new foundation of reactive feedback controllers with a higher likelihood of success in many constrained and dynamic scenarios, such as reaching into containers without line of sight, maneuvering in cluttered search and rescue situations, or working with unpredictable human co-workers.
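As a loose, heavily simplified illustration of a single-time-step controller of this flavor, the sketch below predicts end-effector motion and one contact force from a linear quasi-static model and trades progress toward the goal against keeping the predicted force under a threshold. The Jacobians, stiffness, weights, and numbers are all invented for illustration; this is not the dissertation's formulation.

```python
# Hedged single-time-step sketch: quasi-static linear prediction of task error and
# one contact force, with a cost that penalizes predicted forces above a threshold.
import numpy as np
from scipy.optimize import minimize

n_joints = 3
J_ee = np.array([[0.3, 0.2, 0.1],          # end-effector Jacobian (assumed, 2-D task space)
                 [0.1, 0.3, 0.2]])
J_c = np.array([[0.2, 0.4, 0.1]])          # Jacobian of one contact location (assumed)
k_contact = 50.0                           # contact stiffness estimate [N/m] (assumed)
f_now = np.array([4.0])                    # currently sensed contact force [N]
f_max = 5.0                                # allowed contact force [N]
ee_err = np.array([0.10, -0.05])           # end-effector error to the goal [m]

def cost(dq):
    ee_after = ee_err - J_ee @ dq                  # predicted remaining task error
    f_after = f_now + k_contact * (J_c @ dq)       # predicted contact force
    over = np.maximum(0.0, f_after - f_max)        # only penalize forces above threshold
    return (ee_after @ ee_after) + 100.0 * (over @ over) + 0.1 * (dq @ dq)

res = minimize(cost, x0=np.zeros(n_joints), method="Nelder-Mead")
print("joint step:", res.x, "  predicted force:", f_now + k_contact * (J_c @ res.x))
```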
16

The Cycling Property for the Clutter of Odd st-Walks

Abdi, Ahmad January 2014 (has links)
A binary clutter is cycling if its packing and covering linear programs have integral optimal solutions for all Eulerian edge capacities. We prove that the clutter of odd st-walks of a signed graph is cycling if and only if it does not contain as a minor the clutter of odd circuits of K5 nor the clutter of lines of the Fano matroid. Corollaries of this result include, among others, the characterization of weakly bipartite signed graphs, packing two-commodity paths, packing T-joins with small |T|, a new result on covering odd circuits of a signed graph, as well as a new result on covering odd circuits and odd T-joins of a signed graft.
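For reference, the packing and covering linear programs referred to above can be written, for a clutter C over ground set E with edge capacities w, in the standard form below; the cycling property asks that both have integral optimal solutions for every Eulerian capacity vector w.

```latex
% Standard packing/covering LP pair for a clutter \mathcal{C} over ground set E
% with capacities w; "cycling" asks for integral optima of both for every
% Eulerian w.
\begin{align*}
\text{(covering)}\qquad \min\; & w^\top x
  && \text{s.t. } \sum_{e \in S} x_e \ge 1 \ \ \forall S \in \mathcal{C}, \quad x \ge 0,\\
\text{(packing)}\qquad \max\; & \sum_{S \in \mathcal{C}} y_S
  && \text{s.t. } \sum_{S \ni e} y_S \le w_e \ \ \forall e \in E, \quad y \ge 0.
\end{align*}
```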
17

Procesamiento de señales de radar en presencia de clutter dinámico [Radar signal processing in the presence of dynamic clutter]

Pascual, Juan Pablo January 2014 (has links)
Radar is a remote sensing system that uses statistical signal processing techniques to extract information from the received signal. Conventional radars are active systems that operate by transmitting energy in the form of electromagnetic waves and receiving the signals reflected by the environment and the illuminated object. One of the difficulties to take into account in radar systems is that the signal of interest is usually obscured by reflections produced by the environment, a phenomenon known as clutter. Generally, and depending on the application, clutter is considered a source of interference and disturbances whose effects must be eliminated or reduced. Therefore, given its random nature, it is important to develop statistical signal processing methods capable of detecting targets and estimating their properties in situations of intense, dynamic clutter. To obtain efficient algorithms, it is essential to use realistic models of the signals received by the radar. These models must emphasize the differences between the object of interest and the clutter. In this way, signal processing methods are used to separate the target from the clutter and reduce the degrading effect of the latter. This thesis addresses the detection problem in the presence of dynamic clutter for radar applications. In particular, models are developed that account for variations in the scenario and use the clutter history to improve its characterization at the current instant and its predictions at future times. The first alternative treats the clutter as a time series exhibiting generalized autoregressive conditional heteroscedasticity, using the so-called GARCH processes. Processes of this type are impulsive, but have the disadvantage that no explicit expression exists for their probability density function. For this reason, alternatives for estimating their parameters and assessing the quality of the estimation are analyzed. Likewise, the usual hypothesis tests are adapted to derive a detection scheme based on the GARCH model. In order to incorporate information from multiple pulses at the decision instants, the previous model is extended by combining a two-dimensional GARCH process (GARCH-2D) with an autoregressive (AR) process, and the corresponding detector for this clutter model is derived. The GARCH-2D part of the model preserves the impulsive nature of GARCH processes, and the AR part in the innovations makes it possible to model the pulse-to-pulse correlation present in the data. In both cases, expressions for the false alarm probabilities are derived and, given their mathematical complexity, the detection probability is evaluated by means of numerical simulations. In addition, the sensitivity of the detectors' performance to errors in the estimation of their parameters is analyzed. Although they are not constant false alarm rate detectors, they show robust behavior in practical situations. Finally, the performance of the proposed detectors is compared with detection algorithms from the literature using real sea clutter measurements. The results show that they outperform the other detectors, that is, they achieve a higher detection probability at a lower false alarm rate, regardless of the signal-to-clutter ratio.
Lastly, the problem of sequential estimation of the parameters of GARCH processes is studied. Although the sensitivity analyses conclude that frequent parameter updates are not needed in the proposed detectors, parameter estimation is the most computationally expensive stage of the proposed detection schemes. Following the Bayesian estimation approach, a linear minimum mean square error estimator is derived for the conditional variance of GARCH processes, which is the parameter on which the test statistic of the developed detectors depends. The derivation of the algorithm is analogous to that of the Kalman filter, but in this case the system matrices are random.
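For reference, the standard scalar GARCH(1,1) recursion underlying these models is shown below; the thesis's two-dimensional extension with AR innovations (GARCH-2D) is not reproduced here.

```latex
% Standard scalar GARCH(1,1) recursion, shown for reference only.
\begin{align*}
x_t &= \sigma_t\,\varepsilon_t, \qquad \varepsilon_t \overset{\text{iid}}{\sim} \mathcal{N}(0,1),\\
\sigma_t^2 &= \alpha_0 + \alpha_1\,x_{t-1}^2 + \beta_1\,\sigma_{t-1}^2,
\qquad \alpha_0 > 0,\ \alpha_1,\beta_1 \ge 0 .
\end{align*}
```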
19

Facing clutter: on message competition in marketing communications

Rosengren, Sara, January 2008 (has links)
Diss. Stockholm : Handelshögskolan, 2008.
20

Biology-Based Matched Signal Processing and Physics-Based Modeling For Improved Detection

January 2014 (has links)
abstract: Peptide microarrays have been used in molecular biology to profile immune responses and develop diagnostic tools. When the microarrays are printed with random peptide sequences, they can be used to identify antigen-antibody binding patterns, or immunosignatures. In this thesis, an advanced signal processing method is proposed to estimate epitope antigen subsequences as well as identify mimotope antigen subsequences that mimic the structure of epitopes from random-sequence peptide microarrays. The method first maps peptide sequences to linear expansions of highly localized one-dimensional (1-D) time-varying signals and uses a time-frequency processing technique to detect recurring patterns in subsequences. This technique is matched to the aforementioned mapping scheme, and it allows for an inherent analysis of how substitutions in the subsequences can affect antibody binding strength. The performance of the proposed method is demonstrated by estimating epitopes and identifying potential mimotopes for eight monoclonal antibody samples. The proposed mapping is generalized to express information on a protein's sequence location, structure, and function as a highly localized three-dimensional (3-D) Gaussian waveform. In particular, as analysis of protein homology has shown that incorporating different kinds of information into an alignment process can yield more robust alignment results, a pairwise protein structure alignment method is proposed based on a joint similarity measure of multiple mapped protein attributes. The 3-D mapping allocates protein properties to distinct regions of the time-frequency plane in order to simplify the alignment process by including all relevant information in a single, highly customizable waveform. Simulations demonstrate the improved performance of the joint alignment approach in inferring relationships between proteins, and they provide information on mutations that cause changes to both the sequence and structure of a protein. In addition to the biology-based signal processing methods, a statistical method is considered that uses a physics-based model to improve processing performance. In particular, an externally developed physics-based model for sea clutter is examined when detecting a low radar cross-section target in heavy sea clutter. This novel model includes a process that generates random dynamic sea clutter based on the governing physics of water gravity and capillary waves, and a finite-difference time-domain electromagnetics simulation, based on Maxwell's equations, that propagates the radar signal. A subspace clutter suppression detector is applied to remove dominant clutter eigenmodes, and its improved performance over matched filtering is demonstrated using simulations. / Dissertation/Thesis / Doctoral Dissertation Electrical Engineering 2014
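The sketch below illustrates the subspace clutter suppression step in isolation: the dominant eigenmodes of a clutter covariance estimated from target-free snapshots are projected out before matched filtering. The physics-based FDTD sea-clutter model is not reproduced; a synthetic low-rank clutter and all dimensions are assumptions.

```python
# Hedged sketch of subspace clutter suppression followed by matched filtering.
# Synthetic low-rank clutter stands in for the physics-based model; sizes assumed.
import numpy as np

rng = np.random.default_rng(4)

n = 64                                                     # samples per range cell (assumed)
steering = np.exp(1j * 2 * np.pi * 0.23 * np.arange(n))    # target signature (assumed)

# Synthetic clutter: a few strong, slowly oscillating eigenmodes plus white noise.
modes = np.exp(1j * 2 * np.pi * np.outer([0.01, 0.02, 0.03], np.arange(n)))   # (3, n)

def draw_snapshot(with_target):
    c = (rng.standard_normal(3) + 1j * rng.standard_normal(3)) * 30.0 @ modes
    noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
    return c + noise + (0.5 * steering if with_target else 0)

# Estimate the clutter subspace from target-free secondary data.
secondary = np.stack([draw_snapshot(False) for _ in range(200)])
R = secondary.conj().T @ secondary / secondary.shape[0]
eigval, eigvec = np.linalg.eigh(R)
clutter_basis = eigvec[:, -3:]                              # dominant eigenmodes
P = np.eye(n) - clutter_basis @ clutter_basis.conj().T      # projector onto the complement

x = draw_snapshot(with_target=True)
plain_mf = np.abs(steering.conj() @ x)
suppressed_mf = np.abs((P @ steering).conj() @ (P @ x))
print("matched filter:", plain_mf, "  with clutter suppression:", suppressed_mf)
```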
