  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
61

Radiotherapy X-ray dose distribution beneath retracted patient compensators

Piyaratna, Nelson, University of Western Sydney, Nepean, Faculty of Science and Technology January 1995 (has links)
Computer-designed missing-tissue and dose compensators were produced and dosimetrically tested under a 6 MV linear accelerator X-ray beam. Missing-tissue compensators were developed to correct for changes in the patient's external contour only, while target dose compensators were developed to achieve a uniform dose throughout the target volume. With the compensators present in the beam, data acquisition was repeated in a water phantom and an anthropomorphic phantom, and clinically acceptable dose uniformity was achieved within both. For external contour compensation, flat isodose curves were obtained, giving an even dose in the region of interest; the dose difference was within ±3%. For phantoms containing inhomogeneities, dose uniformity within the target volume was achieved to ±7%. Radiation dose was predicted for each phantom using a GE Target Series 2 treatment planning computer, and the predictions were validated with diode and TLD measurements. The data measured in the water tank agreed with the computer data within ±2% for both external contour changes and inhomogeneities. TLD results in the anthropomorphic phantom agreed with the planning computer within 6%, of which up to 4% is explained by supralinearity and scatter effects / Master of Science (Hons) (Physics)
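As a rough illustration of the missing-tissue compensation principle in the abstract above (adding attenuator where tissue is absent so the exit dose stays even), here is a toy exponential-attenuation calculation. The attenuation coefficients are illustrative placeholders, not values from the thesis.

```python
import math

def compensator_thickness(tissue_deficit_cm, mu_tissue=0.065, mu_comp=0.55):
    """Return the compensator thickness (cm) that restores the attenuation
    lost over a missing-tissue deficit, assuming simple narrow-beam
    exponential attenuation: exp(-mu_comp * t) == exp(-mu_tissue * d).
    Both mu values are made-up placeholders, not measured coefficients."""
    return mu_tissue * tissue_deficit_cm / mu_comp

# Under these assumed coefficients, a 3 cm tissue deficit maps to a
# compensator roughly a tenth as thick as the deficit.
t = compensator_thickness(3.0)
```

In practice the compensator is milled point by point over the field, with one such thickness computed per beamlet from the measured contour.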
62

Tuning of Metaheuristics for Systems Biology Applications

Höghäll, Anton January 2010 (has links)
In the field of systems biology, finding optimal model parameters is a common task. The optimization problems encountered are often multi-modal, i.e., they have several local optima. This thesis studies a class of algorithms for multi-modal problems called metaheuristics. A downside of metaheuristic algorithms is that they depend on algorithm settings to yield good performance. The thesis studies an approach to tuning these settings using user-constructed test functions that are faster to evaluate than an actual biological model.
A statistical procedure is constructed to distinguish performance differences between configurations. Three optimization algorithms are examined more closely: scatter search, particle swarm optimization, and simulated annealing. The statistical procedure, however, can be applied to any algorithm with configurable options. The results are inconclusive in the sense that performance advantages between configurations on the test functions do not necessarily transfer to real biological models. Of the algorithms studied, however, a scatter search implementation was the clear top performer overall. The set of test functions used must be re-examined before any further work builds on this thesis.
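The tuning approach the abstract describes, statistically distinguishing performance differences between algorithm configurations from repeated runs, can be sketched with a generic two-sample permutation test on objective values. This is an illustrative sketch, not the thesis's actual statistical procedure; the function name and sample values are made up.

```python
import random

def permutation_test(a, b, n_perm=2000, seed=0):
    """Two-sided permutation test on the difference of mean best objective
    values from two algorithm configurations. Returns an approximate p-value:
    the fraction of label shufflings whose mean difference is at least as
    large as the observed one."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            count += 1
    return count / n_perm

# Config A reaches clearly lower objective values than config B on test runs,
# so the p-value is small; identical samples would give a p-value of 1.0.
p = permutation_test([1.1, 0.9, 1.0, 1.2], [2.0, 2.2, 1.9, 2.1])
```

A permutation test is a reasonable stand-in here because run-time objective values from stochastic optimizers are rarely normally distributed, so a distribution-free comparison is safer than a t-test.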
63

The Development of a Parameterized Scatter Removal Algorithm for Nuclear Materials Identification System Imaging

Grogan, Brandon Robert 01 May 2010 (has links)
This dissertation presents a novel method for removing scattering effects from Nuclear Materials Identification System (NMIS) imaging. The NMIS uses fast neutron radiography to generate images of the internal structure of objects non-intrusively. If the correct attenuation through the object is measured, the positions and macroscopic cross-sections of features inside the object can be determined. The cross sections can then be used to identify the materials and a 3D map of the interior of the object can be reconstructed. Unfortunately, the measured attenuation values are always too low because scattered neutrons contribute to the unattenuated neutron signal. Previous efforts to remove the scatter from NMIS imaging have focused on minimizing the fraction of scattered neutrons which are misidentified as directly transmitted by electronically collimating and time tagging the source neutrons. The parameterized scatter removal algorithm (PSRA) approaches the problem from an entirely new direction by using Monte Carlo simulations to estimate the point scatter functions (PScFs) produced by neutrons scattering in the object. PScFs have been used to remove scattering successfully in other applications, but only with simple 2D detector models. This work represents the first time PScFs have ever been applied to an imaging detector geometry as complicated as the NMIS. By fitting the PScFs using a Gaussian function, they can be parameterized and the proper scatter for a given problem can be removed without the need for rerunning the simulations each time. In order to model the PScFs, an entirely new method for simulating NMIS measurements was developed for this work. The development of the new models and the codes required to simulate them are presented in detail. The PSRA was used on several simulated and experimental measurements and chi-squared goodness of fit tests were used to compare the corrected values to the ideal values that would be expected with no scattering. 
Using the PSRA resulted in an improvement of the chi-squared test by a factor of 60 or more when applied to simple homogeneous objects.
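The key idea in the abstract above, reducing simulated point scatter functions to a few Gaussian parameters so the Monte Carlo simulations need not be rerun for each measurement, can be illustrated with a simple moment-based Gaussian fit of a 1-D profile. This is a hedged sketch under stated assumptions; the thesis's PScFs, fitting procedure, and detector geometry are far more involved.

```python
import math

def fit_gaussian_moments(xs, ys):
    """Estimate amplitude, mean, and sigma of a Gaussian from a sampled 1-D
    profile using intensity-weighted moments. Once parameterized this way, a
    scatter profile can be regenerated for any problem without re-simulation."""
    total = sum(ys)
    mean = sum(x * y for x, y in zip(xs, ys)) / total
    var = sum(y * (x - mean) ** 2 for x, y in zip(xs, ys)) / total
    return max(ys), mean, math.sqrt(var)

# Recover the parameters of a known Gaussian (mean 2.0, sigma 1.5) sampled
# on a grid spanning roughly +/- 4 sigma around its center.
xs = [i * 0.1 for i in range(-40, 81)]
ys = [math.exp(-((x - 2.0) ** 2) / (2 * 1.5 ** 2)) for x in xs]
amp, mean, sigma = fit_gaussian_moments(xs, ys)
```

Moment matching avoids iterative least-squares entirely, at the cost of mild bias when the profile is truncated; for a grid covering several sigma the error is negligible.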
64

A Practical Optimum Design Of Steel Structures With Scatter Search Method And Sap2000

Korkut, Ahmet Esat 01 February 2013 (has links) (PDF)
In the literature a large number of metaheuristic search techniques have been proposed to date, and some of these have been used in structural optimization. Scatter search is one such technique; it has proved effective for combinatorial and nonlinear optimization problems such as scheduling, routing, and financial product design. Scatter search is an evolutionary method that uses strategies based on composite decision rules and on search diversification and intensification to generate new trial points. Broadly speaking, this thesis is concerned with the use and application of the scatter search technique in structural optimization. A newly developed variant called modified scatter search is implemented in a software package called SOP2012. SOP2012 is integrated with the well-known structural analysis software SAP2000 through its application programming interface for size optimum design of steel structures. Numerical studies are carried out on a test suite of five real-size design examples taken from the literature. In these examples, various steel truss and frame structures are designed for minimum weight according to the design limitations imposed by AISC-ASD (the Allowable Stress Design Code of the American Institute of Steel Construction). The results reveal that the modified scatter search technique is a very effective optimization technique for truss structures, although its performance for frame structures can only be assessed as ordinary.
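As a rough illustration of the scatter search template described above (a small reference set of good solutions, recombined to generate new trial points), here is a minimal continuous-variable sketch. It is not the thesis's modified scatter search, which operates on discrete steel section lists with AISC-ASD design checks; all names and parameters here are illustrative.

```python
import random

def scatter_search(f, bounds, pop=20, ref_size=5, iters=30, seed=1):
    """Minimal continuous scatter-search sketch: diversify with random
    points, keep a reference set of the best, and combine reference pairs
    by linear blending (with slight extrapolation beyond the segment)."""
    rng = random.Random(seed)
    lo, hi = bounds
    points = [[rng.uniform(lo, hi) for _ in range(2)] for _ in range(pop)]
    for _ in range(iters):
        refset = sorted(points, key=f)[:ref_size]  # keep the best solutions
        trials = []
        for i in range(ref_size):
            for j in range(i + 1, ref_size):
                w = rng.uniform(-0.3, 1.3)  # blend weight, may extrapolate
                trials.append([min(hi, max(lo, a + w * (b - a)))
                               for a, b in zip(refset[i], refset[j])])
        points = refset + trials  # refset is retained, so the best never worsens
    return min(points, key=f)

# Minimize the 2-D sphere function; the optimum is at the origin.
best = scatter_search(lambda x: sum(v * v for v in x), (-5.0, 5.0))
```

The full method also includes an improvement (local search) step and a diversity criterion for reference-set membership, both omitted here for brevity.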
65

Impact of Light Scatter on the Assessment of Retinal Arteriolar Hemodynamics

Azizi, Behrooz January 2010 (has links)
Introduction and Purpose: Vascular pathologies play an important role in the etiology and progression of a number of ocular diseases. Many instruments, including the Canon Laser Blood Flowmeter (CLBF), have been developed to monitor retinal hemodynamics in an attempt to better understand the pathophysiology of these diseases (Chapter 2). The purpose of this thesis is to determine the impact of light scatter on retinal arteriolar hemodynamic measurements assessed by the CLBF, since intraocular light scatter is an inevitable consequence of ageing and particularly of cataract. Methodology: Chapter 4, artificial light scatter model: One eye from each of 10 healthy young subjects aged 18 to 30 (mean 23.6 ± 3.4) was randomly selected. To simulate light scatter, cells comprising a plastic collar and two plano lenses were filled with polystyrene microsphere solutions (Polysciences Inc., USA) at concentrations of 0.002%, 0.004%, 0.006% and 0.008%, as well as with distilled water only. After a preliminary screening to confirm eligibility, seven arteriolar hemodynamic measurements were taken with the cells placed in random order between the CLBF objective lens and the subject's cornea. Chapter 5, cataract study: Ten patients aged 61 to 84 (mean 73 years, SD ± 8) scheduled for extracapsular cataract extraction by phacoemulsification with intraocular lens implantation were prospectively recruited. Two visits were required: one prior to surgery and one at least six weeks after surgery to allow full post-operative recovery. Cataract severity was documented at the first visit using the Lens Opacity Classification System III (LOCS III). Each subject underwent visual function assessment at both visits using logMAR Early Treatment Diabetic Retinopathy Study (ETDRS) visual acuity charts and the Brightness Acuity Tester (BAT).
Retinal arteriolar hemodynamics were measured at both visits using the high-intensity setting of the CLBF. Results: Chapter 4: The light scatter model produced an artifactual increase in retinal arteriolar diameter (p<0.0001) and thereby in retinal blood flow (p<0.0001); the 0.006% and 0.008% microsphere concentrations yielded significantly higher diameter and flow values than baseline, while centerline blood velocity was unaffected. Retinal arteriolar diameter values were significantly lower with the high-intensity laser than with the low-intensity laser (p=0.0007). Chapter 5: Group mean retinal arteriolar diameter and blood flow were reduced following cataract extraction (Wilcoxon signed-rank test, p=0.022 and p=0.028 respectively), whereas centerline blood velocity was unchanged (p=0.074). Conclusions: Using the artificial light scatter model (Chapter 4), we demonstrated that densitometric assessment of vessel diameter is increasingly affected as the magnitude of light scatter increases; this effect can be partially offset by increasing laser intensity. Measuring retinal arteriolar hemodynamics before and after cataract removal (Chapter 5) showed similar results. Care is needed when interpreting studies of retinal vessel diameter that use similar densitometry techniques, since cataract is an inevitable consequence of ageing.
67

Etablering av kulturberoende produkter i andra kulturer : Spridning av svensk matkultur / Establishment of Culturally Bounded Products in Other Cultures : Spreading Swedish Food Culture

Söderblom, Cecilia, Grönberg, Sofia January 2012 (has links)
The purpose of this study is to create a model that increases understanding of how culturally bounded products are established in other cultures, by analysing the relationships between diffusion mechanisms, adaptation mechanisms and experiences. Since the concept of culture is broad, we have limited the study to food culture. The topic is timely in Sweden, as the government intends to market Sweden as a "food country" ("matland"); for Sweden to be regarded as such, an understanding of how culturally bounded products are established in other cultures is required. To obtain an overall perspective on how such products are established, empirical material was collected through real-world examples and through interviews with people in the culture and food industries. We found that diffusion mechanisms, adaptation mechanisms and experiences are related: there must be a message to be spread, and it must then be adapted to the target culture in order to create a positive experience. In this way a positive perception of Swedish food culture can form, which facilitates cultural export. The relationships between these factors (diffusion mechanisms, adaptation mechanisms and experience) are explained in our results model.
69

Improving attenuation corrections obtained using singles-mode transmission data in small-animal PET

Vandervoort, Eric 05 1900 (has links)
The images in positron emission tomography (PET) represent three-dimensional dynamic distributions of biologically interesting molecules labelled with positron-emitting radionuclides (radiotracers). Spatial localisation of the radiotracers is achieved by detecting in coincidence two collinear photons emitted when the positron annihilates with an ordinary electron. To obtain quantitatively accurate images in PET, it is necessary to correct for the effects of photon attenuation within the subject being imaged. These corrections can be obtained using singles-mode photon transmission scanning. Although suitable for small-animal PET, such scans suffer from heavy contamination by scattered photons, and no accurate correction currently exists for scatter in these data. The primary purpose of this work was to implement and validate an analytical scatter correction for PET transmission scanning. To isolate the effects of scatter, we developed a simulation tool that was validated against experimental transmission data. We then presented an analytical scatter correction for singles-mode transmission data in PET, and compared the corrected data with the previously validated simulation data for uniform and non-uniform phantoms and for two different transmission source radionuclides. Our scatter calculation correctly predicted the contribution of scattered photons to the simulated data for all phantoms and both transmission sources. We then applied the scatter correction as part of an iterative reconstruction algorithm for simulated and experimental PET transmission data of uniform and non-uniform phantoms, and tested the reconstruction and scatter correction procedure on transmission data from several animal studies (mice, rats and primates).
For all studies considered, the average reconstructed linear attenuation coefficients for water or soft-tissue regions of interest agreed with expected values to within 4%. Using a 2.2 GHz processor, the scatter correction required between 6 and 27 minutes of CPU time (without any code optimisation), depending on the phantom size and source used. This extra computation time seems reasonable considering that, without scatter corrections, errors in the reconstructed attenuation coefficients ranged from 18% to 45%, depending on the phantom size and transmission source used.
70

Enhancing the image quality of digital breast tomosynthesis

Feng, Si 27 August 2014 (has links)
Digital breast tomosynthesis (DBT) is a novel imaging technique that overcomes the tissue superposition limitation of conventional mammography by acquiring a limited number of X-ray projections over a narrow angular range and combining these projections to reconstruct a pseudo-3D image. The emergence of DBT as a potential replacement for, or adjunct to, mammographic screening demands solutions to two of its major limitations, namely X-ray scatter and mono-energetic reconstruction methods. A multi-faceted software-based approach to enhancing DBT image quality has the potential to increase the sensitivity and specificity of breast cancer detection and diagnosis. A scatter correction (SC) algorithm and a spectral reconstruction (SR) algorithm are both ready for implementation and clinical evaluation in a DBT system and show the potential to improve image quality. A principal component analysis (PCA) based model of breast shape and a PCA model of X-ray scatter adapt the SC algorithm for clinical use. In addition, a comprehensive dosimetric characterization of an FDA-approved DBT system has been performed, and the feasibility of a new dual-spectrum, single-acquisition DBT imaging technique has been evaluated.
