  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

A Comparison of Standard Denoising Methods for Peptide Identification

Carpenter, Skylar 01 May 2019 (has links)
Peptide identification using tandem mass spectrometry depends on matching the observed spectrum with the theoretical spectrum. The raw data from tandem mass spectrometry, however, are often not optimal because they may contain noise or measurement errors. Denoising these data can improve alignment between observed and theoretical spectra and reduce the number of peaks. Lewis et al. (2018) denoise spectra using a combined constant and moving threshold. We compare the effects of applying the standard preprocessing methods of baseline removal, wavelet smoothing, and binning to spectra alongside Lewis et al.'s threshold method. We consider individual methods and combinations, using measures of distance from Lewis et al.'s scoring function for comparison. No single method provided better results than Lewis et al.'s, but combining techniques with theirs reduced both the distance measurements and the size of the data set for many peptides.
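A combined constant-plus-moving threshold of the kind described can be sketched as follows. This is an illustrative reconstruction, not Lewis et al.'s published implementation: the base-peak fraction `const_frac`, the window size, and the use of a local mean are all assumptions.

```python
import numpy as np

def threshold_denoise(mz, intensity, const_frac=0.05, window=50.0):
    """Keep only peaks that clear both a constant threshold and a
    moving (local) threshold computed over a sliding m/z window.

    Illustrative sketch only -- parameter names and defaults are
    assumptions, not Lewis et al.'s published values.
    """
    mz = np.asarray(mz, dtype=float)
    intensity = np.asarray(intensity, dtype=float)

    # Constant threshold: a fixed fraction of the base-peak intensity.
    const_thr = const_frac * intensity.max()

    keep = np.zeros(len(mz), dtype=bool)
    for i in range(len(mz)):
        # Moving threshold: mean intensity within +/- window of this peak.
        local = intensity[np.abs(mz - mz[i]) <= window]
        keep[i] = intensity[i] >= max(const_thr, local.mean())
    return mz[keep], intensity[keep]
```

Applied to a toy spectrum, peaks that are small relative to their local neighbourhood are removed even when they clear the global floor.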
12

Depth Estimation Using Adaptive Bins via Global Attention at High Resolution

Bhat, Shariq 21 April 2021 (has links)
We address the problem of estimating a high-quality dense depth map from a single RGB input image. We start from a baseline encoder-decoder convolutional neural network architecture and ask how global processing of information can improve overall depth estimation. To this end, we propose a transformer-based architecture block that divides the depth range into bins whose center values are estimated adaptively per image. The final depth values are estimated as linear combinations of the bin centers. We call our new building block AdaBins. Our results show a decisive improvement over the state of the art on several popular depth datasets across all metrics. We also validate the effectiveness of the proposed block with an ablation study.
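The bin-center combination step can be illustrated in a few lines. The shapes, the depth range, and the normalization details here are assumptions, not the paper's exact formulation:

```python
import numpy as np

def depth_from_bins(bin_widths, probs, d_min=0.1, d_max=10.0):
    """Combine adaptively predicted bin widths with per-pixel
    probabilities to get a depth value, following the AdaBins idea:
    depth = sum_k p_k * c_k, where c_k are bin centers.

    The (d_min, d_max) range and normalization are illustrative.
    """
    # Normalize widths so the bins partition [d_min, d_max].
    w = np.asarray(bin_widths, dtype=float)
    w = w / w.sum() * (d_max - d_min)

    # Bin edges and centers over the depth range.
    edges = d_min + np.concatenate(([0.0], np.cumsum(w)))
    centers = 0.5 * (edges[:-1] + edges[1:])

    # Per-pixel depth as a probability-weighted sum of centers.
    p = np.asarray(probs, dtype=float)
    p = p / p.sum(axis=-1, keepdims=True)
    return p @ centers
```

The key design point is that the bin widths, and hence the centers, change per image, while the final depth remains a differentiable soft combination rather than a hard bin assignment.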
13

Distortion of power law blinking with binning and thresholding

Amecke, Nicole, Heber, André, Cichos, Frank 22 May 2018 (has links)
Fluorescence intermittency is a random switching between emitting (on) and non-emitting (off) periods found for many single chromophores, such as semiconductor quantum dots and organic molecules. The statistics of the durations of on- and off-periods are commonly determined by thresholding the emission time trace of a single chromophore, and appear to be power-law distributed. Here we use simulations to test whether the experimentally determined power-law distributions can actually reflect the underlying statistics. We find that, given the limited experimental time resolution, true power-law statistics with exponents αon/off ≳ 1.6, especially if αon ≠ αoff, would not be observed as such in the experimental data after binning and thresholding. Instead, a power-law appearance could simply be obtained from the continuous distribution of intermediate intensity levels. This challenges much of the obtained data and the models describing so-called power-law blinking.
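One way to reproduce the kind of simulation described — drawing power-law on/off durations, integrating the emission into finite time bins, and thresholding the binned trace — is sketched below. The sampling scheme and all parameters are illustrative, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def power_law_durations(alpha, t_min, n):
    """Draw n durations from p(t) ~ t^-alpha for t >= t_min
    (inverse-transform sampling; requires alpha > 1)."""
    u = rng.random(n)
    return t_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))

def binned_trace(on_durs, off_durs, bin_width):
    """Build an on/off switching trace and integrate the emitting
    time into bins, as a detector with limited time resolution would."""
    t, events = 0.0, []
    for on, off in zip(on_durs, off_durs):
        events.append((t, t + on))          # emitting interval
        t += on + off
    n_bins = int(np.ceil(t / bin_width))
    counts = np.zeros(n_bins)
    for start, stop in events:
        # Add the overlap of the emitting interval with each bin.
        b0, b1 = int(start // bin_width), int(stop // bin_width)
        for b in range(b0, min(b1, n_bins - 1) + 1):
            lo, hi = b * bin_width, (b + 1) * bin_width
            counts[b] += max(0.0, min(stop, hi) - max(start, lo))
    return counts

def apparent_on_durations(counts, bin_width, threshold=0.5):
    """Threshold the binned trace and measure apparent on-durations
    as runs of consecutive above-threshold bins."""
    on = counts > threshold * bin_width
    durations, run = [], 0
    for b in on:
        if b:
            run += 1
        elif run:
            durations.append(run * bin_width)
            run = 0
    if run:
        durations.append(run * bin_width)
    return durations
```

Comparing the histogram of `apparent_on_durations` against the input power law is exactly the kind of check that reveals the distortion the abstract describes: durations shorter than the bin width are lost, and intermediate bin fillings blur the on/off distinction.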
14

On Resilient System Testing and Performance Binning

Han, Qiang 02 June 2015 (has links)
No description available.
15

The Hot Interstellar Medium in Normal Elliptical Galaxies

Diehl, Steven 26 September 2006 (has links)
No description available.
16

2D and 3D Reflection Seismic Studies over Scandinavian Deformation Zones

Lundberg, Emil January 2014 (has links)
The study of deformation zones is of great geological interest since these zones can separate rocks with different characteristics, and the geometry of these structures with depth is important for interpreting the geological history of an area. Papers I to III present 2D reflection seismic data over deformation zones, targeting structures in the upper 3-4 km of the crust. These seismic profiles were acquired with a crooked-line recording geometry, whereas 2D seismic processing assumes a straight recording geometry. Most seismic processing tools were developed for sub-horizontally layered structures; in the crystalline rocks of Scandinavia, however, more complex structures with contrasting dip directions and folding are common. Crooked-line recording geometries have the benefit of sampling a 3D volume, and this broader sampling can be used to gain knowledge about the true geometry of subsurface structures. Correlation with geological maps and other geophysical data, along with seismic data modeling, can be used to distinguish reflections from faults or fracture zones from other reflectivity, e.g. mafic bodies. Fault and fracture zones may have a large impedance contrast with surrounding rocks, while ductile shear zones usually do not; ductile shear zones can instead be interpreted from differing reflectivity patterns between domains and correlations with geology or magnetic maps.

Paper IV presents 3D reflection seismic data from a quick-clay landslide site in southern Sweden. The area is located in a deformation zone, and structures in unconsolidated sediments may have been influenced by faults in the bedrock. The main target layer is located at only 20 m depth, but good surface conditions during acquisition and careful processing enabled a clear seismic image of this shallow layer to be obtained.

The research presented in this thesis provides increased knowledge about subsurface structures in four geologically important areas. The unconventional processing methods used are recommended to future researchers working with data from crooked-line recording geometries in crystalline environments. The imaging of shallow structures at the quick-clay landslide site shows that the 3D reflection seismic method can be used as a complement to other geophysical measurements in shallow landslide site investigations.
17

Denoising Tandem Mass Spectrometry Data

Offei, Felix 01 May 2017 (has links)
Protein identification using tandem mass spectrometry (MS/MS) has proven to be an effective way to identify proteins in a biological sample. An observed spectrum is constructed from the data produced by the tandem mass spectrometer, and a protein can be identified if the observed spectrum aligns with the theoretical spectrum. However, data generated by the tandem mass spectrometer are affected by errors, making protein identification challenging in the field of proteomics. These errors include miscalibration of the instrument, instrument distortion, and noise. In this thesis, we present a preprocessing method that focuses on the removal of noisy data with the aim of aiding better identification of proteins. We employ binning to reduce the number of noise peaks in the data without sacrificing the alignment of the observed spectrum with the theoretical spectrum. In some cases, the alignment of the two spectra improved.
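Plain m/z binning of the sort described can be sketched as follows; the bin width and m/z range are illustrative choices, not the thesis's settings.

```python
import numpy as np

def bin_spectrum(mz, intensity, bin_width=1.0, mz_min=0.0, mz_max=2000.0):
    """Collapse an observed MS/MS spectrum onto a fixed m/z grid by
    summing intensities within each bin. Nearby noise peaks merge
    into one grid point, shrinking the peak list while preserving
    where the intensity sits along the m/z axis."""
    edges = np.arange(mz_min, mz_max + bin_width, bin_width)
    binned, _ = np.histogram(mz, bins=edges, weights=intensity)
    return edges[:-1], binned
```

Because the theoretical spectrum can be placed on the same grid, alignment between the two reduces to comparing values at matching bin indices.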
18

A Hybrid Pixel Detector ASIC with Energy Binning for Real-Time, Spectroscopic Dose Measurements

Wong, Winnie January 2012 (has links)
Hybrid pixel detectors have been demonstrated to provide excellent-quality detection of ionising photon radiation, particularly in X-ray imaging. Recently, there has been interest in developing a hybrid pixel detector specifically for photon dosimetry. This thesis covers the design, implementation, and preliminary characterisation of the Dosepix readout chip. Dosepix has 256 square pixels of 220 μm side length, constituting 12.4 mm2 of photo-sensitive area per detector. The combination of multiple pixels provides many parallel processors with limited input flux, resulting in a radiation dose monitor which can continuously record data and provide a real-time report on personal dose equivalent. Energy measurements are obtained by measuring the time over threshold of each photon, and a state machine in the pixel sorts the detected photon event into the appropriate energy bin. Each pixel contains 16 digital thresholds with 16 registers to store the associated energy bins. Preliminary measurements of Dosepix chips bump-bonded to silicon sensors show very promising results. The pixel has a front-end noise of 120 e-. In low-power mode, each chip consumes 15 mW, permitting its use in a portable, battery-powered system. Direct time-over-threshold output from the hybrid pixel detector assembly reveals distinctive photo-peaks correctly identifying the nature of incident photons, and verification measurements indicate that the pixel binning state machines accurately categorise charge spectra. Personal dose equivalent reconstruction using this data has a flat response over a large range of photon energies and personal dose equivalent rates.
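The per-pixel sorting of a time-over-threshold value into one of the energy bins can be illustrated as follows. The edge values are arbitrary stand-ins, since the real Dosepix bins are programmable per pixel; this is a software sketch of the state machine's behaviour, not its hardware logic.

```python
def bin_photon(tot, bin_edges):
    """Sort one time-over-threshold measurement into an energy bin,
    mimicking a per-pixel comparator chain with digital thresholds.
    Returns the index of the first bin whose upper edge exceeds tot,
    or the last bin for overflow."""
    for i, edge in enumerate(bin_edges):
        if tot < edge:
            return i
    return len(bin_edges)  # overflow bin

# 16 thresholds -> 17 bins including overflow (edge values arbitrary)
edges = [10 * (i + 1) for i in range(16)]   # 10, 20, ..., 160
```

Each pixel only increments the register for the selected bin, so the full energy spectrum is accumulated on-chip without streaming individual events off the detector.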
19

Signal reconstruction from incomplete and misplaced measurements

Sastry, Challa, Hennenfent, Gilles, Herrmann, Felix J. January 2007 (has links)
Constrained by practical and economical considerations, one often uses seismic data with missing traces. The use of such data results in image artifacts and poor spatial resolution. Sometimes, due to practical limitations, measurements may be available on a perturbed grid instead of on the designated grid. When, for algorithmic reasons, such measurements are treated as if they were on the designated grid, the recovery procedures may introduce additional artifacts. This paper interpolates incomplete data onto a regular grid via the Fourier domain, using a recently developed greedy algorithm. The basic objective is to study experimentally how large a perturbation in the measurement coordinates can be while still treating the measurements on the perturbed grid as being on the designated grid and achieving faithful recovery. Our experiments show that for compressible signals, a uniformly distributed perturbation can be offset by a slightly larger number of measurements.
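A minimal stand-in for Fourier-domain interpolation of incompletely sampled data is an iterative hard-thresholding loop: keep the largest Fourier coefficients, then re-impose the known samples. This is not the greedy algorithm the paper uses, only a sketch of the same recover-sparse-spectrum-then-reimpose-data idea, with illustrative parameters.

```python
import numpy as np

def fourier_interpolate(samples, mask, n_iters=50, k=5):
    """Recover a signal that is sparse in the Fourier domain from
    incomplete measurements via simple iterative hard thresholding.

    samples : observed values on the full grid (zeros where missing)
    mask    : boolean array, True where a measurement exists
    """
    x = samples.copy().astype(float)
    for _ in range(n_iters):
        # Keep only the k largest Fourier coefficients (sparsity prior).
        X = np.fft.fft(x)
        X[np.argsort(np.abs(X))[:-k]] = 0.0
        x = np.real(np.fft.ifft(X))
        # Re-impose the known measurements (data consistency).
        x[mask] = samples[mask]
    return x
```

For a signal with a genuinely sparse spectrum and enough retained samples, the loop fills the gaps almost exactly; as the abstract notes, perturbed or missing coordinates can be compensated by taking somewhat more measurements.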
20

Selection of oligonucleotide probes for high-throughput study of the taxonomic and functional diversity of complex environments

Parisot, Nicolas 17 October 2014 (has links)
Microorganisms play a crucial role in all biological processes, owing to their huge metabolic potential. Until recently, cultivation was a necessary step in appraising the taxonomic and functional diversity of microorganisms within environments. These techniques, however, survey only a small fraction of microbial populations and consequently tend to be replaced by high-throughput molecular tools. While the evolution of sequencing technologies has opened the door to unprecedented opportunities in microbial ecology, massive sequencing of complex environments with thousands of species still remains impractical. To overcome this limitation, strategies were developed to reduce sample complexity, such as gene capture and DNA microarrays. These high-throughput strategies rely on the selection of sensitive, specific, and explorative probes. To design such probes, several programs have been developed: PhylGrid 2.0, KASpOD and ProKSpOD. These multipurpose tools were implemented to design probes from the exponentially growing sequence datasets in microbial ecology. Highly parallel computing architectures and innovative k-mer-based strategies allowed major limitations in this field to be overcome.
The resulting high-quality probe sets were used to develop innovative strategies in microbial ecology, including two phylogenetic microarrays, a gene capture approach, and a taxonomic binning algorithm for metagenomic data. These approaches can be applied to various ends, including a better understanding of microbial ecosystems, bioremediation monitoring, and the identification of pathogens (eukaryotes, prokaryotes and viruses).
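The k-mer machinery underlying both probe selection and taxonomic binning can be illustrated with a toy example. These functions are illustrative sketches, not the algorithms implemented in PhylGrid 2.0, KASpOD or ProKSpOD.

```python
from collections import Counter

def kmer_profile(seq, k=4):
    """Count all overlapping k-mers of a DNA sequence. k-mer profiles
    like this underpin both probe selection and taxonomic binning of
    metagenomic reads; k=4 is an illustrative choice."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def group_specific_kmers(targets, background, k=4):
    """k-mers present in every target sequence but in no background
    sequence -- candidate signature k-mers for a probe."""
    shared = set.intersection(*(set(kmer_profile(s, k)) for s in targets))
    bg = set().union(*(set(kmer_profile(s, k)) for s in background))
    return shared - bg
```

A probe built from signature k-mers hybridises across the target group while avoiding the background, which is the sensitivity/specificity trade-off the thesis's tools optimise at scale.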
