121. Magnetic field of the magnetic chemically peculiar star V1148 Ori. Pettersson, Kristoffer (January 2023)
This project aims to obtain and interpret measurements of the mean longitudinal magnetic field of the chemically peculiar star V1148 Ori. To this end, 12 spectropolarimetric observations obtained with the ESPaDOnS spectropolarimeter at the CFHT were used. The magnetic field signatures were extracted from the spectra by least-squares deconvolution, a method that yields line-averaged profiles with a high signal-to-noise ratio; these mean line profiles are needed to compute the mean longitudinal field. The longitudinal field measurements were plotted as a function of rotational phase, and a sinusoidal function describing a dipolar field was fitted to them. Because the literature gives inconsistent values for the stellar radius, the dipolar field parameters were computed for two different radii, yielding two values for each parameter. For the surface polar field strength we found BR1 = 17.38 ± 0.30 kG and BR2 = 12.81 ± 0.22 kG. The calculation based on one of the radii gave results more consistent with previous findings, but the discrepancy between the two parameter sets cannot be accounted for by the small formal uncertainties, so no definite conclusion can be drawn about the dipolar field parameters. The fit agrees well with the longitudinal field measurements, with no clear indication of significant deviation from the model assumption, which suggests that the mean longitudinal field is consistent with a large-scale, dipolar-like field structure.
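
As a rough sketch of the fitting step described above (not the author's code; the data, noise level, and parameter names are invented), the longitudinal field curve of a dipole can be modelled as a sinusoid in rotational phase and fitted by weighted least squares:

```python
import numpy as np
from scipy.optimize import curve_fit

def dipole_bl(phase, b0, b1, phi0):
    """Mean longitudinal field of a dipole: a sinusoid in rotational phase."""
    return b0 + b1 * np.cos(2.0 * np.pi * (phase - phi0))

# Invented stand-in data: 12 phased <Bz> measurements (kG) with uncertainties.
rng = np.random.default_rng(0)
phase = np.linspace(0.0, 1.0, 12, endpoint=False)
bz = dipole_bl(phase, 1.2, 3.5, 0.3) + rng.normal(0.0, 0.2, 12)
sigma = np.full(12, 0.2)

popt, pcov = curve_fit(dipole_bl, phase, bz, p0=[0.0, 1.0, 0.0],
                       sigma=sigma, absolute_sigma=True)
perr = np.sqrt(np.diag(pcov))
print("B0, B1, phi0:", popt, "+/-", perr)
```

Converting the fitted mean and amplitude into a surface polar field strength additionally requires the inclination, the magnetic obliquity, and a limb-darkening coefficient; the inclination follows from v sin i, the rotation period, and the stellar radius, which is how the choice of radius propagates into the polar field value.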

122. Identifying cell type-specific proliferation signatures in spatial transcriptomics data and inferring interactions driving tumour growth. Wærn, Felix (January 2023)
Cancer is a dangerous disease caused by mutations in the host's genome that make cells proliferate uncontrollably and disrupt bodily functions. The immune system tries to prevent this, but tumours have ways of disrupting its ability to combat the cancer. These immunosuppression events can occur, for example, when the immune system interacts with the tumour to recognise or destroy it: by changing the proteins displayed on their cell surface, tumours can avoid detection, and by excreting proteins they can neutralise dangerous immune cells. This happens within the tumour microenvironment (TME), the immediate surroundings of a tumour, where a plethora of different cells both aid and suppress the tumour. Some of these cells are not cancer cells but can still aid the tumour because of how the tumour has influenced them, for example through angiogenesis, in which new blood vessels are formed that feed the tumour.

The interactions in the TME can be used as targets for immunotherapy, a field of treatments that improves the immune system's own ability to defend against cancer, for example by guiding immune cells towards the tumour. Understanding the complex system of interactions within the TME is therefore essential for creating new immunotherapies and thus treating cancers more efficiently. Concurrently, new methods for mapping what happens in a tissue have been developed in recent years, namely spatial transcriptomics (ST). ST retrieves the transcriptomic information of cells through sequencing while retaining spatial information. However, the ST methods that capture the whole transcriptome, and can thus reveal cell-to-cell interactions, do not yet reach single-cell resolution: each spot captures multiple cells, producing a mixed expression signal. This mix can be disentangled, and the proportions of each cell type recovered, through deconvolution. Deconvolution maps the single-cell expression profiles of different cell types onto the ST data and determines what proportions of each cell type's expression produce the expression of the mix, revealing the cellular composition of the microenvironment. But since the interactions in the TME depend on the cells' current expression, deconvolution must be performed by phenotype and not just by cell type.

In this project we created a tool that automatically finds phenotypes in single-cell data and uses those phenotypes to deconvolute ST data. Phenotypes are found using dimensionality reduction methods that differentiate cells according to their contribution to the variability in the data. The deconvoluted data were then used as the foundation for describing the growth of a cancer as a system of phenotype proportions in the tumour microenvironment, from which a mathematical model was created that predicts the growth and could provide insight into how the phenotypes interact. The tool worked as intended, and the model explains the growth of a tumour in the TME in terms of not only cancer cell phenotypes but other cell phenotypes as well. However, no new interaction could be discovered from the final model, and no phenotype found provided new insight into the structure of the TME. Our analysis was nevertheless able to identify structures we expect to see in a tumour, even though they might not be so obvious, so an improved version of our tools might find even more details and perhaps new, more subtle interactions.
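
As a schematic of the deconvolution step described above (a toy sketch, not the tool built in this project; the gene count, phenotype count, and data are all invented), each ST spot's expression vector can be decomposed as a non-negative mixture of reference expression profiles:

```python
import numpy as np
from scipy.optimize import nnls

# Invented toy data: reference expression profiles for 4 phenotypes over
# 100 genes (columns of A), and one ST spot's mixed expression vector b.
rng = np.random.default_rng(0)
n_genes, n_phenotypes = 100, 4
A = rng.gamma(2.0, 1.0, size=(n_genes, n_phenotypes))  # signature matrix
true_props = np.array([0.5, 0.3, 0.2, 0.0])
b = A @ true_props + rng.normal(0.0, 0.05, n_genes)    # spot expression

# Non-negative least squares: find proportions x >= 0 minimising ||Ax - b||.
x, residual = nnls(A, b)
proportions = x / x.sum()  # normalise to fractions of the spot
print("estimated proportions:", proportions.round(3))
```

Phenotype-level deconvolution of the kind described in the abstract amounts to replacing the columns of the signature matrix with phenotype-specific profiles rather than broad cell-type averages.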

123. Artificial intelligence based deconvolving on megavoltage photon beam profiles for radiotherapy applications. Weidner, Jan; Horn, Julian; Kabat, Christopher Nickolas; Stathakis, Sotirios; Geissler, Philipp; Wolf, Ulrich; Poppinga, Daniela (04 May 2023)
Objective. The aim of this work is an AI-based approach to reducing the volume effect of ionization chambers used to measure high-energy photon beams in radiotherapy. In profile measurements in particular, the air-filled volume leads to an inaccurate measurement of the penumbra. Approach. The AI-based approach presented in this study was trained with synthetic data intended to cover a wide range of realistic linear accelerator data. The synthetic data were created by randomly generating profiles and convolving them with the lateral response function of a Semiflex 3D ionization chamber. The neural network was implemented using the open-source tensorflow.keras machine learning framework and a U-Net architecture. The approach was validated on three accelerator types (Varian TrueBeam, Elekta VersaHD, Siemens Artiste) at FF and FFF energies between 6 MV and 18 MV at three measurement depths. For each validation, a Semiflex 3D measurement was compared against a microDiamond measurement, and the AI-processed Semiflex 3D measurement was compared against the microDiamond measurement. Main results. The AI approach was validated with a dataset containing 306 profiles measured with the Semiflex 3D ionization chamber and the microDiamond. In 90% of the cases, the AI-processed Semiflex 3D dataset agrees with the microDiamond dataset within a 0.5 mm/2% gamma criterion. 77% of the AI-processed Semiflex 3D measurements show a penumbra difference to the microDiamond of less than 0.5 mm, and 99% of less than 1 mm. Significance. This AI approach is the first in the field of dosimetry to use synthetic training data. The approach is therefore able to cover a wide range of accelerators and the whole specified field-size range of the ionization chamber. Applying the AI approach offers a quality improvement and time savings for measurements in the water phantom, in particular for large field sizes.
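
A toy sketch of the synthetic-training-data idea described above (the profile shape, kernel, and all parameter values are invented; the study itself used the measured lateral response function of the Semiflex 3D and a U-Net):

```python
import numpy as np

def random_profile(n=256, rng=None):
    """Invented stand-in for a randomly generated beam profile: a flat top
    with random position, half-width, and penumbra steepness."""
    if rng is None:
        rng = np.random.default_rng()
    x = np.arange(n)
    center = rng.uniform(0.3 * n, 0.7 * n)
    half_width = rng.uniform(0.1 * n, 0.3 * n)
    edge = rng.uniform(1.0, 4.0)
    return 1.0 / (1.0 + np.exp((np.abs(x - center) - half_width) / edge))

def detector_kernel(sigma=3.0, n=31):
    """Gaussian stand-in for the chamber's lateral response function."""
    x = np.arange(n) - n // 2
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

rng = np.random.default_rng(1)
clean = random_profile(rng=rng)                                # training target
measured = np.convolve(clean, detector_kernel(), mode="same")  # network input
# Pairs like (measured, clean) would then train a U-Net-style model to undo
# the convolution, i.e. to deconvolve measured profiles.
```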

124. Scatterometer Image Reconstruction Tuning and Aperture Function Estimation for Advanced Microwave Scanning Radiometer on the Earth Observing System. Gunn, Brian Adam (28 May 2010)
AMSR-E is a space-borne radiometer which measures Earth microwave emissions, or brightness temperatures (Tb), over a wide swath. AMSR-E data and images are useful in mapping valuable Earth-surface and atmospheric phenomena. A modified version of the Scatterometer Image Reconstruction (SIR) algorithm creates Tb images from the collected data. SIR is an iterative algorithm with tuning parameters to optimize the reconstruction for the instrument and channel, and it requires an approximate aperture function for each channel to be effective. This thesis presents a simulator-based optimization of the SIR iteration and aperture function threshold parameters for each AMSR-E channel, together with a comparison of actual Tb images generated using the optimal and sub-optimal values. Tuned parameters produce images with sharper transitions between regions of low and high Tb for lower-frequency channels; for higher-frequency channels, the severity of artifacts due to temporal Tb variation of the input measurements decreases, and coverage gaps are eliminated after tuning. A two-parameter Gaussian-like bell model is currently assumed in image reconstruction to approximate the AMSR-E aperture function. This thesis also presents a method of estimating the effective AMSR-E aperture function using Tb measurements and geographical information. The estimate is used as an input for image reconstruction, and the resulting Tb images are compared with those produced with the previous Gaussian approximation. Results support the estimates found here for channels 1h, 1v, and 2h. Images processed using the old or new aperture functions for all channels differed by a fraction of a Kelvin over spatially smooth regions.
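
For context, a hedged sketch of what a two-parameter Gaussian-like bell aperture model can look like; the functional form and footprint numbers here are illustrative assumptions, not AMSR-E's actual parameterization:

```python
import numpy as np

def gaussian_bell(dx, dy, bw_x, bw_y):
    """Elliptical Gaussian-like bell: relative response at offset (dx, dy)
    from boresight, parameterised by two 3 dB footprint widths."""
    c = np.log(2.0)
    return np.exp(-c * ((2.0 * dx / bw_x) ** 2 + (2.0 * dy / bw_y) ** 2))

# Response 10 km along-scan and 5 km along-track from boresight, for an
# invented 75 km x 43 km footprint (roughly a low-frequency channel).
print(gaussian_bell(10.0, 5.0, 75.0, 43.0))
# At dx = bw_x / 2 on-axis the response is exactly 0.5, i.e. -3 dB.
print(gaussian_bell(37.5, 0.0, 75.0, 43.0))
```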

125. Extraction of Loudspeaker- and room impulse responses under overlapping conditions. Gustafsson, Felix (January 2022)
A loudspeaker is often considered to be a linear time-invariant (LTI) system, which can be completely characterized by its impulse response. What sets loudspeakers apart from other LTI systems is the acoustical aspect, including echoes, which makes it much harder to take accurate, noise-free measurements than for other LTI systems such as a simple RC circuit. There are two main challenges in loudspeaker measurement: high-frequency reflections off surrounding surfaces, and low-frequency modal resonances in the room stemming from the initial echoes. A straightforward way of dealing with this is simply to truncate the measured impulse response before the arrival of the first high-frequency reflection. This is not without problems, however, as it results in high uncertainty in the low-frequency content of the measurement; the longer the time until the first reflection, the better the measurement. The ideal measurement would be taken in a noise-free environment with infinite distance to the nearest reflective surface. This is of course not possible in practice, but such an environment can be approximated using an anechoic chamber. This thesis investigates the possibility of creating pseudo-anechoic measurements in a general room, using optimization with information extracted from measurement data in combination with linear time-varying (LTV) filtering. Algorithms were developed for extracting information such as the time delay between reflections and for compensating for distortion in the reflections. This information is then used to minimize a cost function in order to estimate the loudspeaker's impulse response from multiple measurements, and the resulting estimate is filtered with the LTV filter to obtain the pseudo-anechoic impulse response. Two different loudspeakers are investigated in two ordinary rooms as well as in an anechoic chamber, and the performance of the developed methods is evaluated. The overall results seem promising, but owing to some inconsistencies in the measurements taken in the anechoic chamber, which change the direct wave of the loudspeakers, the developed methods are unable to reproduce a true anechoic impulse response. It is concluded that, to achieve true pseudo-anechoic results, measurements in ordinary rooms must better resemble those taken inside the anechoic chamber; this, combined with tuning the hyperparameters of the LTV filter, looks promising for achieving pseudo-anechoic impulse responses with high correlation to the true anechoic measurements.
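
A minimal sketch of the truncation trade-off discussed above (all numbers invented; not the thesis code): windowing the impulse response before the first reflection limits the usable frequency resolution to roughly the reciprocal of the window length.

```python
import numpy as np

fs = 48000                        # sample rate in Hz (invented)
rng = np.random.default_rng(0)
h = rng.standard_normal(fs)       # stand-in for a measured impulse response
t_reflection = 0.006              # first reflection arrives after 6 ms (invented)

n_keep = int(t_reflection * fs)
h_trunc = h[:n_keep]              # truncate before the first reflection

# A window of length T limits the frequency resolution to roughly 1/T,
# here about 167 Hz, so the low-frequency response is poorly resolved.
H = np.fft.rfft(h_trunc)
freqs = np.fft.rfftfreq(n_keep, d=1.0 / fs)
print("usable resolution ~ %.0f Hz" % (1.0 / t_reflection))
```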

126. New Procedures for Data Mining and Measurement Error Models with Medical Imaging Applications. Wang, Xiaofeng (15 July 2005)
No description available.

127. SPATIAL RESOLUTION CHARACTERIZATION OF IMAGES TAKEN FROM A CAPILLARY-BASED HIGH PRESSURE CHAMBER FOR BIOLOGICAL IMAGING STUDIES. Raber, Erica Candace (08 August 2006)
No description available.

128. Subspace Techniques for Parallel Magnetic Resonance Imaging. Gol Gungor, Derya (30 December 2014)
No description available.

129. A MONTE CARLO SIMULATION AND DECONVOLUTION STUDY OF DETECTOR RESPONSE FUNCTION FOR SMALL FIELD MEASUREMENTS. FENG, YUNTAO (January 2006)
No description available.

130. DEVELOPMENT AND APPLICATION OF TIME-RESOLVED FLUORESCENCE SPECTROSCOPY ANALYSIS WITH SPECIMENS OF THE UPPER GI TRACT. LePalud, Michelle L. (04 1900)
Current gold-standard practice for the diagnosis of tissue disease involves invasive tissue biopsies subjected to a time-consuming histopathological examination. An optical biopsy can offer a non-invasive diagnostic alternative by exploiting the properties of naturally occurring light-tissue interactions. A time-resolved fluorescence spectroscopy instrument (355 nm excitation) was previously developed by our lab to capture the fluorescence response of gastrointestinal tissue (370-550 nm in 5 nm increments, 25 ns at 1000 ps/pt). Measurements were conducted ex vivo during routine upper gastrointestinal tract biopsies on duodenum, antrum, stomach body, and esophageal tissue. The work presented here focuses on protocol development for tissue handling, measurement collection, clinical data management, fluorescence decay modeling using Laguerre-based deconvolution, instrument performance evaluation, and k-means-based classification.

Descriptive parameters derived from spectral (total signal intensity) and temporal (lifetime and Laguerre polynomial coefficients) analysis were used to evaluate the data. Data were found to be compromised only when the total signal intensity at the peak wavelength of 455 nm fell below 19.5 V·ns. The data did not exhibit any signs of photobleaching or pulse-width broadening that would otherwise have distorted the lifetime from the true fluorescence response. Data for diseased tissue were limited, so the clinical diagnosis was used to classify normal duodenum tissue against normal esophageal tissue. Over 400 pairs of parameters demonstrated that k-means can identify duodenum tissue with 87.5% sensitivity and 87.5% specificity or better, and with some dimensional axis transformations these results could be improved. The lifetimes were not discriminating factors here, but the relative intensity and decay shape were. The protocols can be applied to diseased or other tissue types with little adaptation. A single set of parameters may hold the key to helping surgeons choose optimum locations for traditional biopsies, or perhaps one day replace them altogether. / Master of Applied Science (MASc)
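
A generic sketch of Laguerre-based deconvolution as published in the literature (not the group's code; the excitation pulse, sampling step, Laguerre parameter, and basis order are invented): the decay is expanded on discrete Laguerre functions, each basis function is convolved with the excitation pulse, and the expansion coefficients are solved by least squares.

```python
import numpy as np

def laguerre_basis(n_samples, order, alpha):
    """Discrete orthonormal Laguerre functions via the Marmarelis recursion."""
    b = np.zeros((order, n_samples))
    b[0, 0] = np.sqrt(1.0 - alpha)
    for n in range(1, n_samples):
        b[0, n] = np.sqrt(alpha) * b[0, n - 1]
    for j in range(1, order):
        b[j, 0] = np.sqrt(alpha) * b[j - 1, 0]
        for n in range(1, n_samples):
            b[j, n] = (np.sqrt(alpha) * b[j, n - 1]
                       + np.sqrt(alpha) * b[j - 1, n]
                       - b[j - 1, n - 1])
    return b

# Invented toy problem: recover a decay h from y = x * h + noise,
# where x is the instrument's excitation pulse.
rng = np.random.default_rng(2)
n = 250
t = np.arange(n) * 0.1                     # 0.1 ns per point (invented)
x = np.exp(-0.5 * ((t - 2.0) / 0.3) ** 2)  # excitation pulse (invented)
h_true = np.exp(-t / 4.0)                  # single-exponential decay, tau = 4 ns
y = np.convolve(x, h_true)[:n] + rng.normal(0.0, 0.01, n)

# Expand h on a few Laguerre functions; convolve each basis function with x
# and solve least squares for the coefficients c, so that y ~ (x * B) c.
B = laguerre_basis(n, 6, alpha=0.9)
V = np.stack([np.convolve(x, B[j])[:n] for j in range(6)], axis=1)
c, *_ = np.linalg.lstsq(V, y, rcond=None)
h_est = B.T @ c                            # deconvolved fluorescence decay
```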