41

Comparison of Two Methods of External Scatter Dose Contributions to the Contralateral Breast

Cutlip, James 22 July 2008 (has links)
No description available.
42

The Effects of Immersion on 3D Information Visualization

Raja, Dheva 02 August 2006 (has links)
The effects of immersion with respect to information visualization have rarely been explored. In this thesis, we describe a methodology, two information visualization applications developed using the CAVE, and three user studies designed to explore, examine, and attempt to quantify the effects of immersion. We focus on three major components of immersion: field of regard (FOR), head-based rendering (HBR), and stereoscopic viewing. We hypothesize that a high degree of FOR will result in increased task performance and user satisfaction when visualizing data represented by scatter and surface plots. We also hypothesize that HBR and stereoscopic viewing will result in increased task performance, but that the effects of these components will be greater for scatter plots than for surface plots. We conducted three user studies with the information visualization applications developed for this research. In the first study, an exploratory pilot study, we observed a trend in favor of high FOR and HBR. In the second exploratory pilot study, we observed a slight trend in favor of high FOR. In the third study, thirty-two subjects performed tasks using both the scatter plots and the surface plots under eight test conditions. We observed trends in favor of high levels of FOR, HBR, and stereoscopic viewing in scatter plots, a slight trend in favor of HBR for surface plots, and a significant interaction effect between FOR and HBR in scatter plots for a particular task. / Master of Science
43

Distributed Vibration Sensing using Rayleigh Backscatter in Optical Fibers

Sang, Alexander Kipkosgei 22 December 2011 (has links)
Sensing has been essential for the investigation, understanding, exploitation, and utilization of physical phenomena. Traditional single-point sensing methods are being challenged by the multi-point or distributed sensing capabilities afforded by optical fiber sensors. A powerful technique available for distributed sensing is Optical Frequency Domain Reflectometry (OFDR). This work focuses on using OFDR to obtain distributed vibration measurements from the Rayleigh scatter along a single-mode optical fiber. The effort begins by discussing various distributed measurement techniques currently in use before introducing the OFDR technique. Next, a thorough discussion is presented of how highly spatially resolved Rayleigh measurements are acquired and how such measurements can be used to make static strain measurements. A new algorithm is developed to resolve strain in regions of high spatial gradient. This enhances the measurement performance of systems that use the Rayleigh scatter to determine static strain or temperature, by improving measurement fidelity at high-gradient locations. Next, discussions are presented of how dynamic strain (vibration) couples into an optical fiber at a single point and in a distributed setting. Lessons learned are then used to develop a new and unique distributed vibration measurement algorithm. The resulting benefits are reviewed before concluding remarks are given. A simulation model was developed and used to supplement this investigation at every step of the discussion. The model was used to gain insight into how various physical phenomena interact with the optical fiber. The simulation was also used to develop and optimize the high-gradient and vibration algorithms developed herein. Simple experiments were then used to validate the theory and the simulation models. / Ph. D.
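A common way to realize the distributed strain step described above is to split the distance-domain Rayleigh record into segments, FFT each segment into a local spectrum, and cross-correlate it against the same segment of a baseline scan; the offset of the correlation peak is proportional to the local strain or temperature change. The sketch below assumes complex OFDR data and hypothetical values for the frequency step and strain coefficient; it illustrates the general principle only, not the thesis's algorithm (in particular, not its high-gradient or vibration extensions).

```python
import numpy as np

# Hypothetical acquisition constants -- assumptions, not from the thesis.
DF_GHZ = 0.02             # optical-frequency step per spectral bin (GHz)
GHZ_PER_USTRAIN = -0.15   # assumed spectral-shift-to-strain coefficient

def local_spectra(x, seg_len):
    """FFT windowed distance-domain segments into local Rayleigh spectra."""
    n = len(x) // seg_len
    segs = x[: n * seg_len].reshape(n, seg_len) * np.hanning(seg_len)
    return np.abs(np.fft.fft(segs, axis=1))

def strain_profile(ref, meas, seg_len):
    """Per-segment strain (microstrain) from the spectral shift between a
    baseline scan `ref` and a measurement scan `meas`."""
    strains = []
    for r, m in zip(local_spectra(ref, seg_len), local_spectra(meas, seg_len)):
        r, m = r - r.mean(), m - m.mean()
        # Offset of the cross-correlation peak, in spectral bins.
        lag = np.argmax(np.correlate(m, r, mode="full")) - (len(r) - 1)
        strains.append(lag * DF_GHZ / GHZ_PER_USTRAIN)
    return np.array(strains)
```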
44

A Scheme for Ultra-Fast Computed Tomography Based on Stationary Multi-Beam X-ray Sources

Gong, Hao 16 February 2017 (has links)
The current cardiac computed tomography (CT) technology is mainly limited by motion blurring and radiation dose. The conceptual multi-source interior CT scheme provides a potential solution to reduce motion artifacts and radiation exposure. This dissertation conducted multi-faceted investigations of a novel multi-source interior CT architecture (G. Cao et al., IEEE Access, 2014;2:1263-71) that employs distributed stationary multi-beam carbon-nanotube (CNT) X-ray sources and simultaneously operates multiple source-detector chains to improve temporal resolution. Collimation-based interior CT is integrated into each imaging chain to suppress radiation dose. The central thesis statement is: compared to a conventional CT design, this distributed-source-array based multi-source interior CT architecture shall provide ultra-fast CT scans of a region of interest (ROI) inside the body with comparable image quality at lower radiation dose. Comprehensive studies were conducted to separately investigate three critical aspects of multi-source interior CT: the interior CT mode, X-ray scattering, and scatter correction methods. First, a single CNT X-ray source based interior micro-CT was constructed to serve as a down-scaled experimental verification platform for the interior CT mode. The interior CT mode demonstrated contrast-to-noise ratio (CNR) and image structural similarity comparable to the standard global CT mode, while yielding a significant radiation dose reduction (up to 83.9%). Second, the data acquisition of multi-source interior CT was demonstrated at clinical geometry via numerical simulation and physical experiments. The simultaneously operated source-detector chains induced significant X-ray forward / cross scattering and thus caused severe CNR reduction (up to 68.5%) and CT number errors (up to 1122 HU). To address the scatter artifacts, a stationary beam-stopper-array (BSA) based and a source-trigger-sequence (STS) based scatter correction method were proposed to enable online scatter measurement / correction with further radiation dose reduction (up to 50%). Moreover, a deterministic physics model was also developed to iteratively remove the scatter artifacts in multi-source interior CT, without the need for modifications to imaging hardware or protocols. The three proposed scatter correction methods improved CNR (by up to 94.0%) and suppressed CT number errors (to below 48 HU). With the dedicated scatter correction methods, multi-source interior CT could provide ROI-oriented imaging with acceptable image quality at significantly reduced radiation dose. / Ph. D.
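The beam-stopper-array principle admits a compact illustration: detector pixels in the shadow of a stopper record scatter only, so a smooth scatter field can be interpolated from those samples across the whole detector and subtracted from the projection. Below is a minimal sketch under those assumptions (a 2D projection and a boolean stopper mask); it is not the BSA or STS implementation from the dissertation.

```python
import numpy as np
from scipy.interpolate import griddata

def bsa_scatter_correct(projection, stopper_mask):
    """Estimate scatter from stopper-shadow pixels and subtract it.

    projection:   2D array of detector readings
    stopper_mask: boolean 2D array, True where a beam stopper blocks primary
    """
    rows, cols = np.nonzero(stopper_mask)
    samples = projection[rows, cols]            # scatter-only readings
    grid_r, grid_c = np.mgrid[0:projection.shape[0], 0:projection.shape[1]]
    scatter = griddata((rows, cols), samples, (grid_r, grid_c),
                       method="cubic", fill_value=float(samples.mean()))
    primary = np.clip(projection - scatter, 0.0, None)  # keep counts physical
    return primary, scatter
```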
45

A Curvature-Corrected Rough Surface Scattering Theory Through The Single-Scatter Subtraction Method

Diomedi II, Kevin Paul 21 March 2019 (has links)
A new technique is presented for studying radio propagation and rough surface scattering problems, based on a reformulation of the Magnetic Field Integral Equation (MFIE) called the Single-Scatter Subtraction (S³) method. This technique amounts to a physical preconditioning: it separates the single- and multiple-scatter currents and removes the single-scattering contribution from the integral term present in the MFIE. This requires the calculation of a new quantity, the integral of the MFIE kernel, called the kernel integral or Gbar. In this work, 1-dimensional deterministically rough surfaces are simulated by surfaces consisting of single and multiple cosines. In order to truncate the problem domain, a beam illumination is used as the source term, and it is shown that this also gives the kernel integral finite support. Using the Single-Scatter Subtraction method on these surfaces, closed-form expressions are found for the kernel integral, and thus for the single-scatter current, over a well-defined region of validity in the surface parameters; this current may then be efficiently radiated into the far field numerically. Both the closed-form expressions and the computed radiated fields are studied for their physical significance. This provides clear physical intuition for the technique, which augments existing approaches as a bent-plane approximation, as shown analytically and validated by numerical results. Further analysis resolves a controversy on the nature of Bragg scatter, which is found to be a multiple-scatter phenomenon. Error terms present in the kernel integral also raise new questions about the effect of truncation on any MFIE-based solution. Additionally, due to the error terms, a dramatic enhancement of backscatter relative to the Kirchhoff method is predicted by this new approach as the angle of incidence increases. / Doctor of Philosophy / A new technique is presented to study the interaction of electromagnetic waves with rough surfaces. Building on the Magnetic Field Integral Equation (MFIE), which solves for the electromagnetic fields scattered from a surface by considering only the induced electric and magnetic currents on the surface, the Single-Scatter Subtraction (S³) method separates the surface currents into those that interact with the surface only once (single-scatter) and those that interact multiple times (multiple-scatter). Since this is the introduction of the technique, only the former is investigated. In this study, a new quantity, an integral of one of the components of the standard MFIE, is studied, and closed-form approximations are presented along with bounds of validity. This provides closed-form solutions for the single-scattering currents, from which the radiated fields may be efficiently found numerically. Since they are closed form, the expressions provide insight into the nature of the physical scattering process. Numerical results from these expressions are compared to the standard approximate technique as well as to the "exact" solution found by numerically solving the MFIE. Where the standard approximate technique approximates the surface by a tangent plane at each point, the single-scatter currents approximate it with a bent plane at each point. This shifts the scattered fields from certain directions to others, and highlights where single- and multiple-scattering have an effect.
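The final numerical step mentioned above, radiating a known current into the far field, is a generic radiation integral that can be sketched independently of the S³ formulation. For a 1-D surface z = f(x) carrying a scalar current J(x), the far-field amplitude in direction θ is approximated by summing phase-weighted contributions along the surface. The surface shape, beam-limited current, and wavelength below are illustrative assumptions, not the thesis's closed-form single-scatter current.

```python
import numpy as np

def far_field(x, f, J, k, thetas):
    """Numerically radiate a surface current J(x) on z = f(x) into the far
    field: E(theta) ~ sum J(x) exp(jk(x sin(theta) + f(x) cos(theta))) ds."""
    ds = np.sqrt(1.0 + np.gradient(f, x) ** 2) * (x[1] - x[0])  # arc length
    return np.array([np.sum(J * ds * np.exp(1j * k * (x * np.sin(t)
                                                      + f * np.cos(t))))
                     for t in thetas])

# Illustrative single-cosine surface under a Gaussian beam at 30 degrees.
lam = 1.0
k = 2.0 * np.pi / lam
x = np.linspace(-50.0, 50.0, 4001) * lam
f = 0.1 * lam * np.cos(2.0 * np.pi * x / (5.0 * lam))
J = np.exp(-(x / (15.0 * lam)) ** 2) * np.exp(-1j * k * x * np.sin(np.radians(30)))
thetas = np.radians(np.linspace(-90.0, 90.0, 361))
pattern_db = 20.0 * np.log10(np.abs(far_field(x, f, J, k, thetas)) + 1e-12)
```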
46

Characterization and Optimization of Silicon-strip Detectors for Mammography and Computed Tomography

Chen, Han January 2016 (has links)
The goal in medical x-ray imaging is to obtain the image quality required for a given detection task, while ensuring that the patient dose is kept as low as reasonably achievable. The two most common strategies for dose reduction are: optimizing incident x-ray beams and utilizing the energy information of transmitted beams with new detector techniques (spectral imaging). In this thesis, dose optimization schemes were investigated in two x-ray imaging systems: digital mammography and computed tomography (CT). In digital mammography, the usefulness of anti-scatter grids was investigated as a function of breast thickness with varying geometries and experimental conditions. The general conclusion is that keeping the grid is optimal for breasts thicker than 5 cm, whereas the dose can be reduced without a grid for thinner breasts. A photon-counting silicon-strip detector developed for spectral mammography was characterized using synchrotron radiation. Energy resolution, ΔE/Ein, was measured to vary between 0.11-0.23 in the energy range 15-40 keV, which is better than the energy resolution of 0.12-0.35 measured in the state-of-the-art photon-counting mammography system. Pulse pileup has shown little effect on energy resolution. In CT, the performance of a segmented silicon-strip detector developed for spectral CT was evaluated and a theoretical comparison was made with the state-of-the-art CT detector for some clinically relevant imaging tasks. The results indicate that the proposed photon-counting silicon CT detector is superior to the state-of-the-art CT detector, especially for high-contrast and high-resolution imaging tasks. The beam quality was optimized for the proposed photon-counting spectral CT detector in two head imaging cases: non-enhanced imaging and K-edge imaging. For non-enhanced imaging, a 120-kVp spectrum filtered by 2 half-value layers (HVL) of copper (Z = 29) provides the best performance. When iodine is used in K-edge imaging, the optimal filter is 2 HVL of iodine (Z = 53) and the optimal tube voltages are 60-75 kVp. In the case of gadolinium imaging, the radiation dose can be minimized at 120 kVp filtered by 2 HVL of thulium (Z = 69).
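For context on how a figure like ΔE/Ein = 0.11-0.23 is typically extracted: a Gaussian is fitted to the photopeak of the recorded pulse-height spectrum at each (here monochromatic synchrotron) energy, and the resolution is reported as the peak FWHM divided by the peak position. A minimal sketch assuming that FWHM convention, which the thesis may not use verbatim:

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss(e, a, mu, sigma):
    return a * np.exp(-0.5 * ((e - mu) / sigma) ** 2)

def energy_resolution(energies, counts):
    """Fit a Gaussian photopeak; return dE/E with dE = FWHM = 2.355*sigma."""
    p0 = (counts.max(), energies[np.argmax(counts)], 1.0)  # rough initial guess
    (_, mu, sigma), _ = curve_fit(gauss, energies, counts, p0=p0)
    return 2.355 * abs(sigma) / mu
```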
47

Optimisation et validation des méthodes de calcul de dose à distance des faisceaux d’irradiation pour leur application dans les études épidémiologiques et cliniques en radiothérapie / Optimization and validation of out-of-field dose calculation methods in external beam radiation therapy for use in epidemiological and clinical studies

Vũ Bezin, Jérémi 17 December 2015 (has links)
La proportion de survivants à un cancer dans la population des pays développés augmente rapidement. Dans plus de la moitié des cas, la radiothérapie a été une composante de leur traitement. Les rayons ionisants alors administrés peuvent induire de graves conséquences à long terme, en particulier les cancers radio-induits et les maladies cardiovasculaires. Ces évènements sont dus non seulement aux fortes doses administrées au volume cible, mais également aux doses plus faibles, de quelques milligray à quelques gray, non souhaitées, mais inévitablement administrées dans le reste du corps du patient par la dose hors champ. L’évolution des techniques de planification du traitement et de l’informatique en médecine permettent aujourd’hui d’obtenir, systématiquement, l’évaluation précise des doses les plus fortes administrées au patient. Les doses faibles à intermédiaires administrées en dehors du faisceau de traitement, ne sont pour leur part, ni habituellement prises en compte, ni correctement évaluées par les systèmes actuels de planification du traitement. L’objectif de ce travail était de proposer des méthodes pour estimer le rayonnement hors champ des faisceaux de photons des accélérateurs de radiothérapie externe. L’utilisation d’une bibliothèque graphique nous a permis de réaliser une représentation géométrique 3D partielle des appareils de traitement et des sources photoniques responsables de la dose reçue par le patient. Nous avons déterminé l’intensité de ces sources en utilisant des mesures réalisées dans des champs simples. Le modèle ainsi calibré permettait de simuler la variation de l’intensité des sources en fonction de la taille du champ. Cette approche a permis de décrire avec succès la variation de la dose mesurée par TLD en fonction de la distance et de la taille du champ en dehors de champs carrés. Les écarts entre les doses calculées et celles mesurées étaient inférieurs à 10 %. Une application dans des conditions cliniques a été menée, l’écart était alors en moyenne de 25 %. / The number of cancer survivors in developed countries is increasing rapidly. Fifty percent of patients treated for cancer will receive radiation therapy as part of their treatment. Ionizing radiation may induce severe long-term effects, including secondary cancers and cardiovascular diseases. Long-term effects are due not only to the high doses delivered in target volumes, but also to lower doses, ranging from several milligrays to several grays, undesired but inevitably delivered in the rest of the patient’s body outside the treatment beams. Improvements in treatment planning techniques and the use of computers in medicine have made it possible to systematically estimate, prior to treatment, the highest doses delivered to the patient’s body. However, the lower doses delivered outside the treatment beams are neither taken into account nor evaluated by present treatment planning systems. The aim of our work was to establish methods to estimate radiation doses outside the photon beams of accelerators used in external radiation therapy. A graphics library was used to render a partial 3D representation of the accelerator and the associated photon sources. The intensity of these sources was determined using measurements performed in simple-geometry fields. The calibrated model was hence used to estimate the source intensity variation with respect to field size. Using this method, we were able to estimate the variations of the TLD-measured doses with respect to distance and field size, with a 10% average discrepancy between calculations and measurements for points outside the field. When the model was tested in a clinical setup, the average discrepancy increased to 25%.
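The shape of such an out-of-field dose model can be illustrated with a simple empirical fit: dose falls off with distance from the field edge roughly as a sum of a fast component (collimator scatter) and a slow component (head leakage), with coefficients calibrated on measurements in simple fields. The data points below are hypothetical, and the two-exponential form is a common stand-in, not the thesis's graphics-library-based multi-source model.

```python
import numpy as np
from scipy.optimize import curve_fit

def out_of_field_dose(d, a1, mu1, a2, mu2):
    """Two-component empirical falloff of dose with distance d (cm)."""
    return a1 * np.exp(-mu1 * d) + a2 * np.exp(-mu2 * d)

# Hypothetical TLD calibration points: distance from field edge (cm),
# dose per delivered monitor units (illustrative values only).
d_cm = np.array([5.0, 10.0, 15.0, 20.0, 30.0, 40.0, 60.0])
dose = np.array([12.0, 4.1, 1.9, 1.1, 0.45, 0.24, 0.10])

params, _ = curve_fit(out_of_field_dose, d_cm, dose,
                      p0=(10.0, 0.2, 1.0, 0.02), maxfev=10000)
dose_at_25cm = out_of_field_dose(25.0, *params)  # model prediction
```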
48

Towards an Efficient Spectral Element Solver for Poisson’s Equation on Heterogeneous Platforms / Mot en effektiv spektrala element-lösare för Poissons ekvation på heterogena plattformar

Nylund, Jonas January 2022 (has links)
Neko is a project at KTH to refactor the widely used fluid dynamics solver Nek5000 to support modern hardware. Many aspects of the solver need adapting for use on GPUs, and one such part is the main communication kernel, the Gather-Scatter (GS) routine. To avoid race conditions in the kernel, atomic operations are used, which can be inefficient. To avoid atomics, elements were grouped in such a way that when multiple writes to the same address are necessary, they always come in blocks. This way, each block can be assigned to a single thread and handled sequentially, avoiding the need for atomic operations altogether. Within the scope of the thesis, a Poisson solver was also ported from CPU to Nvidia GPUs. To optimise the Poisson solver, a batched matrix multiplication kernel was developed to perform small matrix multiplications efficiently in bulk and so better utilise the GPU. Optimisations using shared memory and kernel unification were made. The performance of the different implementations was tested on two systems, using a GTX 1660 and dual Nvidia A100s respectively. The results show only small differences in performance between the two versions of the GS kernels when considering computational cost alone, and in a multi-rank setup the communication time completely overwhelms any potential difference. The shared-memory matrix multiplication kernel yielded around a 20% performance boost for the Poisson solver. Both versions vastly outperformed cuBLAS. The unified kernel also had a large positive impact on performance, yielding up to a 50% increase in throughput. / Neko är ett KTH-projekt med syfte att vidareutveckla det populära beräkningsströmningsdynamik-programmet Nek5000 för moderna datorsystem. Speciell vikt har lagts vid att stödja heterogena plattformar med dedikerade accelleratorer för flyttalsberäkningar. Den idag vanligast förekommande sådana är grafikkort (GPUer). En viktig del av Neko är Gather-Scatter (GS)-funktionen, som är den huvudsakliga kommunikations-funktionen mellan processer i programmet. I GS-funktionen kan race conditions uppstå då flera trådar skriver till samma minnesaddress samtidigt. Detta kan undvikas med atomic operations, men användande av dessa kan ha negativ inverkan på prestanda. I detta masterarbete utvecklades en alternativ implementation där element i GS-algoritmen grupperades på sådant sätt att alla operationer på samma element kommer i block. På så sätt kan de enkelt behandlas i sekvens och därmed undvika behovet av atomic operations. Inom ramen för masterarbetet implementerades en numerisk lösare av Poisson’s ekvation för GPUer. Optimering av koden genom att göra matrismultiplikationer i bulk genomfördes, och vidare genom utnyttjande av shared memory. Prestandan utvärderades på två olika datorsystem med en GTX1660 respektive två A100 GPUer. Enbart små skillnader sågs mellan de olika GS-implementationerna, med en svag fördel om ca 5% högre prestanda för den grupperade varianten i högupplösta domäner. Poisson-lösaren visade på höga prestandasiffror jämfört med cuBLAS-biblioteket.
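The grouping trick described above is easy to show even off the GPU: sort contributions by destination index so that all writes to one address become a contiguous block, then reduce each block sequentially (on a GPU, one thread per block). A minimal NumPy sketch of that transformation; the result matches an atomic scatter-add (np.add.at), but no concurrent writes to the same address remain. This is an illustration of the idea, not Neko's Fortran/CUDA GS kernel.

```python
import numpy as np

def scatter_add_grouped(values, dest, size):
    """Atomics-free scatter-add: group duplicate destinations into blocks."""
    order = np.argsort(dest, kind="stable")       # sort by destination address
    v, d = values[order], dest[order]
    starts = np.flatnonzero(np.r_[True, d[1:] != d[:-1]])  # block boundaries
    out = np.zeros(size, dtype=values.dtype)
    out[d[starts]] = np.add.reduceat(v, starts)   # one sequential reduce per block
    return out

# Reference version that would need atomics on a GPU:
# out = np.zeros(size); np.add.at(out, dest, values)
```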
49

Novel methods for scatter correction and dual energy imaging in cone-beam CT

Dong, Xue 22 May 2014 (has links)
Excessive imaging doses from repeated scans and poor image quality, mainly due to scatter contamination, are the two bottlenecks of cone-beam CT (CBCT) imaging. This study investigates a method that combines measurement-based scatter correction with a compressed sensing (CS) based iterative reconstruction algorithm to generate scatter-free images from low-dose data. The scatter distribution is estimated by interpolating/extrapolating measured scatter samples inside blocked areas. CS-based iterative reconstruction is then carried out on the under-sampled data to obtain scatter-free, low-dose CBCT images. In the tabletop phantom studies, with only 25% of the dose of a conventional CBCT scan, our method reduces the overall CT number error from over 220 HU to less than 25 HU and increases the image contrast by a factor of 2.1 in the selected ROIs. Dual-energy CT (DECT) is another important application of CBCT. DECT shows promise in differentiating materials that are indistinguishable in single-energy CT and facilitates accurate diagnosis. A general problem of DECT is that the decomposition is sensitive to noise in the two sets of projection data, resulting in severely degraded quality of the decomposed images. The first DECT study focuses on the linear decomposition method. In this study, a combined method of iterative reconstruction and decomposition is proposed. The noise in the two initial CT images from separate scans becomes well correlated, which avoids noise accumulation during the decomposition process. To fully explore the benefits of DECT for beam-hardening correction and to reduce the computation cost, the second study focuses on an iterative decomposition method with a non-linear decomposition model for noise suppression in DECT. Phantom results show that our methods achieve superior performance in DECT imaging with respect to noise reduction and spatial resolution.
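For reference, the linear-decomposition baseline whose noise sensitivity motivates the combined method above is just a per-pixel 2x2 solve: given low- and high-kVp attenuation images and basis-material attenuation coefficients, invert for the material fractions. A sketch with illustrative coefficients (not the study's calibration); direct inversion of this kind is exactly what amplifies anti-correlated noise in the decomposed images.

```python
import numpy as np

# Illustrative basis-material attenuation (1/cm): columns [bone, soft tissue],
# rows [low kVp, high kVp]. Assumed values, not the study's calibration.
MU = np.array([[0.50, 0.25],
               [0.30, 0.20]])

def linear_dect_decompose(ct_low, ct_high):
    """Pixel-wise solve of MU @ [f_bone, f_soft] = [mu_low, mu_high]."""
    mus = np.stack([ct_low.ravel(), ct_high.ravel()])   # shape (2, n_pixels)
    fractions = np.linalg.inv(MU) @ mus                 # noise amplifies here
    f_bone, f_soft = (f.reshape(ct_low.shape) for f in fractions)
    return f_bone, f_soft
```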
50

ANISOTROPIC POLARIZED LIGHT SCATTER AND MOLECULAR FACTOR COMPUTING IN PHARMACEUTICAL CLEANING VALIDATION AND BIOMEDICAL SPECTROSCOPY

Urbas, Aaron Andrew 01 January 2007 (has links)
Spectroscopy and other optical methods can often be employed with limited or no sample preparation, making them well suited for in situ and in vivo analysis. This dissertation focuses on the use of near-infrared spectroscopy (NIRS) and polarized light scatter for two such applications: the assessment of cardiovascular disease, and the validation of cleaning processes for pharmaceutical equipment. There is a need for more effective in vivo techniques for assessing intravascular disorders, such as aortic aneurysms and vulnerable atherosclerotic plaques. These and other cardiovascular disorders are often associated with structural remodeling of vascular walls. NIRS has previously been demonstrated to be an effective technique for the analysis of intact biological samples. In this research, traditional NIRS is used in the analysis of aortic tissue samples from a murine knockout model that develops abdominal aortic aneurysms (AAAs) following infusion of angiotensin II. Effective application of NIRS in vivo, however, requires a departure from traditional instrumental principles. Toward this end, the groundwork was developed for a fiber optic-based catheter system employing a novel optical encoding technique, termed molecular factor computing (MFC), for differentiating cholesterol, collagen, and elastin through intervening red blood cell solutions. In MFC, the transmission spectra of chemical compounds are used to collect measurements directly correlated with the desired sample information. Pharmaceutical cleaning validation is another field that can greatly benefit from novel analytical methods. Conventionally, cleaning validation is accomplished through surface residue sampling followed by analysis using a traditional analytical method. Drawbacks to this approach include cost, analysis time, and uncertainties associated with the sampling and extraction methods. This research explores the development of in situ cleaning validation methods to eliminate these issues. The use of light scatter and polarization was investigated for the detection and quantification of surface residues. Although effective, the ability to discriminate between residues was not established with these techniques. With that aim in mind, the differentiation of surface residues using NIRS and MFC was also investigated.
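The MFC measurement model is compact enough to state in code: each optical filter transmits the sample's light weighted by the filter compound's transmission spectrum, and the detector integrates the product, so every reading is an analog dot product between the sample spectrum and a "molecular factor". A sketch of that forward model with hypothetical array shapes (the regression from scores to analyte concentration would follow separately); this illustrates the encoding principle, not the dissertation's instrument.

```python
import numpy as np

def mfc_scores(sample_spectra, filter_transmissions, wavelengths):
    """Simulate molecular-factor-computing detector readings.

    sample_spectra:       (n_samples, n_wavelengths) intensity spectra
    filter_transmissions: (n_filters, n_wavelengths) transmission in [0, 1]
    Returns (n_samples, n_filters) integrated detector readings.
    """
    product = sample_spectra[:, None, :] * filter_transmissions[None, :, :]
    return np.trapz(product, wavelengths, axis=2)   # analog dot products
```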
