71 |
Multi-view Ultrasound Restoration Applied to 3-D Imaging of the Breast and the Heart. Soler, Pau 21 March 2006 (has links) (PDF)
Multi-view Ultrasound Restoration Applied to 3-D Imaging of the Breast and the Heart. In this thesis we present new techniques for improving the quality of three-dimensional ultrasound images. Ultrasound is one of the most widely used medical imaging technologies because it is non-invasive, real-time, and relatively inexpensive. It has drawbacks, however, such as speckle, low spatial resolution, and dependence on the acquisition angle. Spatial compounding, which combines images acquired from different viewpoints, has been shown to reduce speckle. We propose to extend this technique to also improve spatial resolution and reduce the dependence on acquisition angle. We propose two new approaches: (i) multi-view deconvolution, which solves the inverse problem in which each acquisition is assumed to be a degraded version of the original tissue and the most likely image is sought, and (ii) multi-view fusion, in which a volume is built from the features present in each acquisition. We apply these methods to 3-D imaging of the breast and the heart. Acquiring 3-D breast volumes remains challenging, in part because resolution is limited in the elevation plane; the proposed methods address these problems. Real-time 3-D probes for cardiac imaging have recently been introduced to the market. Combining acquisitions through different acoustic windows with our methods increases the field of view, the contrast of the heart walls, and the signal-to-noise ratio.
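As a minimal illustration of the spatial-compounding baseline that this thesis extends: averaging N co-registered views of the same tissue reduces uncorrelated speckle variance roughly N-fold. The numpy sketch below is fully synthetic (Rayleigh speckle on an idealized uniform tissue; all numbers illustrative) and shows only this simple averaging step, not the multi-view deconvolution or feature fusion proposed in the thesis.

```python
import numpy as np

def spatial_compound(views):
    """Combine co-registered views by simple averaging; for uncorrelated,
    fully developed speckle this reduces speckle variance ~N-fold."""
    return np.stack(views, axis=0).mean(axis=0)

rng = np.random.default_rng(0)
tissue = np.ones((64, 64))               # idealized uniform-echogenicity tissue
# each "view" carries independent multiplicative Rayleigh speckle
views = [tissue * rng.rayleigh(scale=1.0, size=tissue.shape) for _ in range(9)]
compounded = spatial_compound(views)     # speckle std drops ~3x for 9 views
```

With nine independent views the speckle standard deviation drops by about a factor of three; in practice views acquired from nearby angles are partially correlated, so the gain is smaller.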
|
72 |
Imaging at the Nano-scale: State of the Art and Advanced Techniques. Aumond, Bernardo D., El Rifai, Osamah M., Youcef-Toumi, Kamal 01 1900 (has links)
Surface characteristics such as topography and critical dimensions serve as important indicators of product quality and manufacturing process performance, especially at the micrometer and nanometer scales. This paper first reviews different technologies used for obtaining high-precision 3-D images of surfaces, along with some selected applications. Atomic force microscopy (AFM) is one such method. AFM images are commonly distorted by convolution effects, which become more prominent when the sample surface contains high-aspect-ratio features. In addition, data artifacts can result from poor dynamic response of the instrument. In order to achieve reliable data at high throughput, the dynamic interactions between the instrument's components need to be well understood and controlled, and novel image deconvolution schemes need to be developed. Our work aims at mitigating these distortions to recover metrologically sound data. A summary of our findings will be presented. / Singapore-MIT Alliance (SMA)
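One widely used first-order correction for tip-convolution artifacts is grey-scale morphological erosion of the recorded image by the (estimated) tip shape, as in Villarrubia's reconstruction method; this is a generic technique, not necessarily the deconvolution scheme developed in this work. A 1-D sketch with a hypothetical parabolic tip:

```python
import numpy as np

def dilate(surface, tip):
    """Tip-sample dilation: what the AFM records when a (symmetric)
    tip of shape `tip` scans `surface` (1-D grey-scale morphology)."""
    half = len(tip) // 2
    padded = np.pad(surface, half, mode='edge')
    return np.array([np.max(padded[i:i + len(tip)] + tip)
                     for i in range(len(surface))])

def erode(image, tip):
    """Erosion of the image by the same tip: the standard first-order
    reconstruction; it recovers the surface wherever the tip could reach."""
    half = len(tip) // 2
    padded = np.pad(image, half, mode='edge')
    return np.array([np.min(padded[i:i + len(tip)] - tip)
                     for i in range(len(image))])

z = np.arange(-3, 4, dtype=float)
tip = -0.2 * z**2                 # hypothetical parabolic tip, apex at 0
surface = np.zeros(30)
surface[15] = 5.0                 # high-aspect-ratio spike
image = dilate(surface, tip)      # broadened, as in the artifact described above
recon = erode(image, tip)         # narrows the artifact back
```

The reconstruction equals the true surface wherever the tip apex could reach it, and over-estimates narrow recesses, which is exactly why high-aspect-ratio features remain the hard case.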
|
73 |
Signal processing and amplifier design for inexpensive genetic analysis instruments. Choi, Sheng Heng 11 1900 (has links)
The Applied Miniaturisation Laboratory (AML) has recently built a laser-induced fluorescence capillary electrophoresis (LIF-CE) genetic analysis instrument, called the Tricorder Tool Kit (TTK). By using a photodiode instead of photomultiplier tubes for optical detection, the AML has lowered the cost and size compared to commercial LIF-CE products. However, maintaining an adequate signal-to-noise ratio (SNR) and limit of detection (LOD) is a challenge.
By implementing a multistage amplifier, we increased the bandwidth and voltage swing while maintaining the transimpedance gain of the previous design. We also developed signal processing algorithms for post-experiment processing of CE data. Using wavelet transforms, iterative polynomial baseline fitting, and Jansson's deconvolution, we improved the SNR, reduced baseline variations, and separated overlapping peaks in CE signals. By improving the electronics and signal processing, we lowered the LOD of the TTK, a step towards the realisation of inexpensive point-of-care molecular diagnostic instruments. / Computer, Microelectronic Devices, Circuits and Systems
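Jansson's deconvolution, mentioned above, is a relaxed, bound-constrained variant of iterative (Van Cittert-style) deconvolution: the estimate is updated by the residual between the data and the re-blurred estimate, with a relaxation factor that vanishes at the physical bounds. A minimal sketch on synthetic overlapping peaks, assuming a known Gaussian point-spread function and illustrative bounds (not the AML's actual implementation):

```python
import numpy as np

def jansson_deconvolve(measured, psf, lo=0.0, hi=1.0, r0=1.0, iters=300):
    """Jansson's relaxed, bound-constrained iterative deconvolution:
    o <- o + r(o) * (measured - psf (*) o), where the relaxation r(o)
    shrinks to zero as o approaches the physical bounds [lo, hi]."""
    o = measured.copy()
    mid, half = 0.5 * (lo + hi), 0.5 * (hi - lo)
    for _ in range(iters):
        reblurred = np.convolve(o, psf, mode='same')
        relax = r0 * (1.0 - np.abs(o - mid) / half)
        o = np.clip(o + relax * (measured - reblurred), lo, hi)
    return o

x = np.arange(120, dtype=float)
truth = np.exp(-(x - 50)**2 / 8) + np.exp(-(x - 62)**2 / 8)  # two close peaks
k = np.arange(-15, 16, dtype=float)
psf = np.exp(-k**2 / 32.0)
psf /= psf.sum()                                   # instrument broadening
measured = np.convolve(truth, psf, mode='same')    # peaks merge together
restored = jansson_deconvolve(measured, psf, hi=1.2)
```

The bound constraint is what distinguishes Jansson's scheme from plain Van Cittert iteration: non-negativity suppresses the ringing that otherwise appears between closely spaced peaks.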
|
74 |
From Population to Single Cells: Deconvolution of Cell-cycle Dynamics. Guo, Xin January 2012 (has links)
The cell cycle is one of the fundamental processes in all living organisms, and all cells arise from the division of existing cells. To better understand the regulation of the cell cycle, synchrony experiments are widely used to monitor cellular dynamics during this process. In such experiments, a large population of cells is generally arrested or selected at one stage of the cycle, and then released to progress through subsequent division stages. Measurements are then taken in this population at a variety of time points after release to provide insight into the dynamics of the cell cycle. However, due to cell-to-cell variability and asymmetric cell division, cells in a synchronized population lose synchrony over time. As a result, time-series measurements from synchronized cell populations do not accurately reflect the underlying dynamics of cell-cycle processes.
In this thesis, we introduce a deconvolution algorithm that learns a more accurate view of cell-cycle dynamics, free from the convolution effects associated with imperfect cell synchronization. Through wavelet-basis regularization, our method sharpens signal without sharpening noise, and can remarkably increase both the dynamic range and the temporal resolution of time-series data. Though it can be applied to any such data, we demonstrate its utility by applying it to a recent cell-cycle transcription time course in the eukaryote Saccharomyces cerevisiae. We show that our method detects cell-cycle-regulated transcription more sensitively, and reveals subtle timing differences that are masked in the original population measurements. Our algorithm also explicitly learns distinct transcription programs for mother and daughter cells, enabling us to identify 82 genes transcribed almost entirely in early G1 in a daughter-specific manner.
In addition to the cell-cycle deconvolution algorithm, we introduce DOMAIN, a protein-protein interaction (PPI) network alignment method, which employs a novel direct-edge-alignment paradigm to detect conserved functional modules (e.g., protein complexes, molecular pathways) from pairwise PPI networks. Applying our approach to detect protein complexes conserved in yeast-fly and yeast-worm PPI networks, we show that it outperforms two widely used approaches on most alignment performance metrics. We also show that it identifies conserved cell-cycle-related functional modules across yeast-fly PPI networks. / Dissertation
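The convolution effect described above can be pictured as a linear mixing model: each population measurement is a weighted average of the single-cell program over a phase distribution that broadens as synchrony decays. The toy sketch below uses ridge regularization as a crude stand-in for the thesis's wavelet-basis regularization, and every kernel and rate is invented for illustration:

```python
import numpy as np

n = 64
phases = np.arange(n)
truth = 1.0 + 0.9 * np.sin(2 * np.pi * phases / n)   # single-cell program

def mixing_matrix(times, sigma0=1.0, rate=0.1):
    """Row t: distribution of cell-cycle phases at measurement time t;
    the spread grows with t as the population loses synchrony."""
    K = np.empty((len(times), n))
    for i, t in enumerate(times):
        sigma = sigma0 + rate * t
        d = np.abs((phases - t + n / 2) % n - n / 2)  # circular phase distance
        w = np.exp(-d**2 / (2 * sigma**2))
        K[i] = w / w.sum()
    return K

K = mixing_matrix(np.arange(n))
measured = K @ truth          # population signal: amplitude damped at late times

# Ridge-regularized inverse (a crude stand-in for wavelet regularization)
lam = 1e-4
restored = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ measured)
```

The damping of the measured oscillation at late time points is exactly the loss of dynamic range the deconvolution recovers; the regularizer is what keeps the inversion from amplifying noise.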
|
75 |
Enhancing the Nitrite Reductase Activity of Modified Hemoglobin: Bis-tetramers and their PEGylated Derivatives. Lui, Francine Evelyn 10 January 2012 (has links)
The need for an alternative to red cells in transfusions has led to the creation of hemoglobin-based oxygen carriers (HBOCs). However, evaluations of all products tested in clinical trials have noted cardiovascular complications, raising questions about their safety that led to the abandonment of those products. The adverse side effects are thought to arise from scavenging of the vasodilator nitric oxide (NO) by the deoxyheme sites of the hemoglobin derivatives. Another observation is that HBOCs with lower oxygen affinity than red cells release oxygen prematurely in arterioles, triggering an unwanted homeostatic response. Since the need for such a product remains critical, it is important to understand the reactivity patterns that contribute to the observed complications.
Various alterations of the protein have been attempted in order to reduce HBOC-induced vasoconstriction. Recent reports suggest that a safe and effective product should be pure, homogeneous, and have a high molecular weight along with appropriate oxygenation properties. While these properties are clearly important, the vasodilatory capacity of hemoglobin through its nitrite reductase activity may also act as an in situ source of NO. It follows that HBOCs with an enhanced ability to produce NO from endogenous nitrite may counteract the vasoactivity associated with NO scavenging by hemoglobin.
Here we characterize the effects of different protein modifications on the nitrite reductase activity of hemoglobin. We produced a variety of HBOCs that include cross-linked tetramers, polyethylene glycol (PEG) conjugates, and bis-tetramers of hemoglobin. We report that the rate of NO production strongly depends on the conformational state of the protein, with R-state-stabilized proteins (PEG-Hbs) exhibiting the fastest rates. In particular, we found that PEGylated bis-tetramers of hemoglobin (BT-PEG) exhibit increased nitrite reductase activity while retaining cooperativity and stability. Animal studies of BT-PEG demonstrated that this material is benign: it did not cause significant increases in systemic blood pressure in mice, the major side effect associated with existing HBOCs. BT-PEG combines enhanced nitrite reductase activity with sample purity and homogeneity, suitable molecular size and shape, and appropriate oxygenation properties, the characteristics of a clinically useful HBOC.
|
77 |
Development and testing of an organic scintillator detector for fast neutron spectrometry. Mickum, George Spencer 10 April 2013 (has links)
The use of organic scintillators is an established method for measuring neutron spectra above several hundred keV. Fast neutrons are detected largely through proton recoils produced in the scintillator by neutron elastic scattering on hydrogen, which yields a smeared rectangular pulse-height distribution for monoenergetic neutrons: the recoil-proton energies range from zero up to the incident neutron energy. The pulse-height distribution is further complicated by structure from energy deposited by alpha particles produced in interactions with carbon, as well as by carbon recoils themselves. To reconstruct the incident neutron spectrum, the pulse-height spectrum has to be deconvoluted (unfolded) using the computed or measured response of the scintillator to monoenergetic neutrons. In addition, gamma rays, which always accompany neutrons, produce Compton electron recoils in the scintillator. Fortunately, in certain organic scintillators the electron-recoil events can be separated from the heavier-particle recoil events, distinguishing gamma-ray-induced from neutron-induced events. This pulse-shape discrimination exploits the dependence of the scintillation-light decay on particle type, as seen in the risetime of the pulse from the photomultiplier tube.
In this work, an organic scintillator detection system was assembled with neutron-gamma separation capabilities, allowing the neutron-induced and gamma-induced recoil spectra to be stored separately. An unfolding code was implemented to deconvolute these spectra into neutron and gamma energy spectra. To verify the performance of the system, measurements of two reference neutron fields will be performed: unmoderated Cf-252 and heavy-water-moderated Cf-252. After the detection system has been verified, measurements will be made with an AmBe neutron source.
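The unfolding step can be sketched as a linear inverse problem: the measured pulse-height spectrum is the response matrix (one column per monoenergetic neutron energy, here the idealized smeared-rectangle response described above) applied to the neutron spectrum. A toy non-negative least-squares version on noiseless data; real work uses computed or measured response matrices and dedicated unfolding codes:

```python
import numpy as np
from scipy.optimize import nnls

nchan = 40                     # pulse-height channels = energy bins (toy setup)
R = np.zeros((nchan, nchan))   # response matrix, one column per neutron energy
for j in range(nchan):
    col = np.zeros(nchan)
    col[:j + 1] = 1.0 / (j + 1)   # flat proton-recoil distribution, 0..E_j
    # light smearing to mimic detector resolution
    R[:, j] = np.convolve(col, [0.25, 0.5, 0.25], mode='same')

true_spec = np.zeros(nchan)
true_spec[25] = 1.0            # monoenergetic line
true_spec[10] = 0.5            # second, weaker line
pulse_height = R @ true_spec   # what the detector records

unfolded, resid = nnls(R, pulse_height)   # non-negative least squares
```

With noiseless data and a full-rank response the lines are recovered exactly; with counting statistics the non-negativity constraint is what keeps the unfolded spectrum physical.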
|
78 |
A novel approach to restoration of Poissonian images. Shaked, Elad 09 February 2010 (links)
The problem of reconstructing digital images from their degraded measurements is of central importance in many fields of engineering and imaging science. The degradation is typically caused by the resolution limitations of the imaging device and/or by the destructive influence of measurement noise. Specifically, when the noise obeys a Poisson probability law, standard approaches to image reconstruction are based on fixed-point algorithms following the methodology proposed by Richardson and Lucy in the early 1970s. In practice, however, the convergence properties of such methods tend to deteriorate at relatively high noise levels (as typically occurs in so-called low-count settings). This work introduces a novel method for denoising and/or deblurring digital images that have been corrupted by Poisson noise. The proposed method is derived within the framework of MAP estimation, under the assumption that the image of interest admits a sparse representation in the domain of a properly designed linear transform. Consequently, a shrinkage-based iterative procedure is proposed, which guarantees the maximization of an associated maximum-a-posteriori criterion. A series of both computer-simulated and real-life experiments shows that the proposed method outperforms a number of existing alternatives in terms of stability, precision, and computational efficiency.
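The Richardson-Lucy fixed-point iteration referred to above has a compact multiplicative form: the current estimate is multiplied by the back-projected ratio of the data to the current blurred estimate. A minimal 1-D sketch on noiseless synthetic data, for reference only; the thesis's shrinkage-based MAP method is a different algorithm:

```python
import numpy as np

def richardson_lucy(measured, psf, iters=100, eps=1e-12):
    """Classic Richardson-Lucy fixed-point iteration for Poisson data:
    x <- x * backproject(measured / blur(x)); preserves non-negativity."""
    x = np.full_like(measured, measured.mean())
    for _ in range(iters):
        blurred = np.convolve(x, psf, mode='same')
        ratio = measured / np.maximum(blurred, eps)   # guard against /0
        x = x * np.convolve(ratio, psf[::-1], mode='same')
    return x

k = np.arange(-12, 13, dtype=float)
psf = np.exp(-k**2 / 18.0)
psf /= psf.sum()                      # normalized Gaussian blur, sigma = 3

truth = np.zeros(100)
truth[40], truth[60] = 1.0, 2.0       # two point sources
measured = np.convolve(truth, psf, mode='same')   # noiseless blur
restored = richardson_lucy(measured, psf)
```

On noiseless data the iteration sharpens the blurred spikes steadily; on real low-count Poisson data it eventually amplifies noise, which is the deterioration the thesis addresses.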
|
79 |
Processing Techniques of Aeromagnetic Data. Case Studies from the Precambrian of Mozambique. Magaia, Luis January 2009 (has links)
During 2002-2006, geological field work was carried out in Mozambique. The purpose was to check preliminary geological interpretations, to resolve problems that arose during the compilation of preliminary geological maps, and to collect samples for laboratory studies. In parallel, airborne geophysical data were collected in many parts of the country to support the geological interpretation and the compilation of geophysical maps. In the present work, the aeromagnetic data collected in 2004 and 2005 in two small areas in northwestern Niassa province and one in the eastern part of Tete province are analysed using Geosoft™. Processing of the aeromagnetic data began with the removal of diurnal variations and correction for the IGRF model of the Earth's field. The effect of height variations on the recorded magnetic field, as well as levelling and interpolation techniques, were also studied. La Porte interpolation proved to be a good tool for interpolating aeromagnetic data using the measured horizontal gradient. Depth-estimation techniques were also used to obtain a semi-quantitative interpretation of geological bodies. It was shown that many features in the study areas lie at shallow depth (less than 500 m) and that few geological features lie at depths greater than 1000 m. This interpretation can be used to draw conclusions about the geology or be incorporated into further investigations of these areas.
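Two of the processing steps described above are easy to illustrate: subtracting the base-station diurnal variation and the IGRF reference value from the recorded total field to obtain the residual anomaly, and a crude depth estimate from anomaly width. All numbers below are invented, and the half-width rule shown is exact only for an idealized pole-like 1/(x² + h²) profile, a much simpler stand-in for the depth-estimation techniques used in the thesis:

```python
import numpy as np

# Step 1: residual anomaly = recorded total field - diurnal - IGRF value.
recorded = np.array([33412.0, 33415.5, 33420.1, 33418.7])  # nT, flight line
diurnal  = np.array([2.1, 2.4, 2.6, 2.5])                  # nT, base station
igrf     = np.array([33400.0, 33400.2, 33400.4, 33400.6])  # nT, model values
residual = recorded - diurnal - igrf                       # anomaly of interest

# Step 2: crude depth from anomaly width. For a pole-like profile
# T(x) = C / (x^2 + h^2), the half-width at half-maximum equals the depth h.
x = np.linspace(-2000.0, 2000.0, 4001)   # m along profile, 1 m spacing
h = 300.0                                # true source depth, m
T = 1e6 / (x**2 + h**2)                  # synthetic anomaly, arbitrary scale
hwhm = np.ptp(x[T >= T.max() / 2]) / 2   # estimated depth, m
```

Real depth estimation (Euler deconvolution, analytic-signal methods, and the like) accounts for source geometry and flight height; the half-width rule only gives the order of magnitude that separates shallow (< 500 m) from deep (> 1000 m) sources.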
|