11. Acoustics of long enclosures
Kang, Jian. January 1996 (has links)
No description available.
12. Evaluation of attenuation and scatter correction requirements in small animal PET and SPECT imaging
Konik, Arda Bekir. 01 July 2010 (has links)
Positron emission tomography (PET) and single photon emission computed tomography (SPECT) are two nuclear emission-imaging modalities that rely on the detection of high-energy photons emitted from radiotracers administered to the subject. The majority of these photons are attenuated (absorbed or scattered) in the body, resulting in count losses or deviations from true detection, which in turn degrade the accuracy of the images. In clinical emission tomography, sophisticated correction methods employing additional x-ray CT or radionuclide transmission scans are therefore often required. Having proven their potential in both clinical and research settings, PET and SPECT are now being adapted for small animal imaging. However, despite the growing interest in small animal emission tomography, little scientific information exists about the accuracy of these correction methods for smaller objects or about what level of correction is required.
The purpose of this work is to determine the role of attenuation and scatter corrections as a function of object size through simulations. The simulations were performed using Interactive Data Language (IDL) and a Monte Carlo based package, the Geant4 Application for Tomographic Emission (GATE). In the IDL simulations, PET and SPECT data acquisition was modeled in the presence of attenuation. A mathematical emission and attenuation phantom approximating a thorax slice, together with slices from real PET/CT data, was scaled to five different sizes (i.e., human, dog, rabbit, rat and mouse). The simulated emission data collected from these objects were reconstructed, and the reconstructed images, with and without attenuation correction, were compared to the ideal (i.e., non-attenuated) reconstruction. Next, using GATE, scatter fraction values (the ratio of scattered counts to total counts) of PET and SPECT scanners were measured for various sizes of NEMA (cylindrical phantoms representing small animals and humans), MOBY (realistic mouse/rat model) and XCAT (realistic human model) digital phantoms. In addition, PET projection files for different sizes of the MOBY phantom were reconstructed under six different combinations of attenuation and scatter correction. Selected regions were analyzed across these reconstruction conditions and object sizes. Finally, real mouse data from the physical counterpart of the small animal PET scanner modeled in our simulations were analyzed under similar reconstruction conditions.
Both the IDL and GATE simulations showed that, for small animal PET and SPECT, even the smallest objects (~2 cm diameter) exhibited ~15% error when neither attenuation nor scatter was corrected. However, a simple attenuation correction using a uniform attenuation map and an object boundary obtained from the emission data significantly reduces this error (to ~1% for the smallest size and ~6% for the largest size in non-lung regions). In addition, we did not observe any significant difference between using a uniform and the actual attenuation map (e.g., only ~0.5% for the largest size in the PET studies). Scatter correction was not significant for smaller objects, but became increasingly important for larger ones.
These results suggest that, for all mouse sizes and most rat sizes, uniform attenuation correction can be performed using emission data alone. For sizes up to ~4 cm, scatter correction is not required even in lung regions. For larger sizes, if accurate quantification is needed, an additional transmission scan may be required to estimate an accurate attenuation map for both attenuation and scatter corrections.
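To make the uniform attenuation correction discussed above concrete, the sketch below applies the textbook PET relationship that counts along a line of response through a uniform object are reduced by exp(-μL), where L is the chord length through the object, so the correction multiplies each sinogram bin by exp(+μL). This is only an illustration of the principle, not the IDL/GATE code used in the thesis; the water attenuation coefficient and object size are assumed round values.

```python
import numpy as np

MU_WATER_511KEV = 0.096   # approximate linear attenuation coefficient of water at 511 keV, cm^-1
radius_cm = 1.0           # ~2 cm diameter, roughly mouse-sized (assumed)

def chord_length(s, radius):
    """Chord length of a parallel line of response at radial displacement s
    through a circle of the given radius (zero outside the object)."""
    s = np.asarray(s, dtype=float)
    return np.where(np.abs(s) < radius,
                    2.0 * np.sqrt(np.maximum(radius**2 - s**2, 0.0)),
                    0.0)

def attenuation_correction_factors(s_bins, radius, mu):
    """PET attenuation correction factor per sinogram bin: exp(+mu*L), the
    inverse of the survival probability exp(-mu*L) along each line of response."""
    return np.exp(mu * chord_length(s_bins, radius))

s_bins = np.linspace(-2.0, 2.0, 9)     # radial sinogram bins (cm)
for s, f in zip(s_bins, attenuation_correction_factors(s_bins, radius_cm, MU_WATER_511KEV)):
    print(f"s = {s:+.2f} cm   correction factor = {f:.3f}")
# Through the centre of a 2 cm water-equivalent object the factor is
# exp(0.096 * 2) ~ 1.21, i.e. roughly 15-20% of coincidences are lost if
# attenuation is ignored -- the same order as the error quoted above.
```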
13. Laboratory experiments and numerical modeling of wave attenuation through artificial vegetation
Augustin, Lauren Nicole. 15 May 2009 (has links)
It is commonly known that coastal vegetation dissipates wave energy and aids in shoreline protection by damping incoming waves and promoting sediment deposition in vegetated regions. However, this critical wave-damping role of vegetation is not fully understood at present. A series of laboratory experiments was conducted in the Haynes Coastal Laboratory and the 2-D flume at Texas A&M University to examine different vegetation scenarios and analyze the wave-damping effects of incident wave height, stem density, wave period, plant type, and water depth relative to stem length.

In wetland regions, vegetation is one of the main factors influencing hydraulic roughness. Traditional open-channel flow equations, including the Manning and Darcy-Weisbach friction factor approaches, have been successfully applied to determine bottom friction coefficients for flows in the presence of vegetation. Numerous relationships have been derived relating the friction factor to the boundary layers of different flow regimes in an attempt to obtain a wave friction factor for estimating energy dissipation due to bed roughness. The boundary layer problem is fairly complex, and studies relating the wave friction factor to vegetation roughness elements are sparse. In this thesis the friction factor is applied to estimate the energy dissipation under waves due to artificial vegetation. The friction factor is tuned to the laboratory experiments using the numerical model COULWAVE so that the pipe-flow formulation can be reasonably applied to wave problems. A numerical friction factor is found for each case through an iterative process, and empirical relationships are derived relating the friction factor for submerged and emergent plant conditions to the Ursell number. These relationships can be used to reasonably estimate a wave friction factor for practical engineering purposes.

This thesis quantitatively analyzes wave damping due to the effects of wave period, incident wave height, horizontal stem density, water depth relative to stem length, and plant type for a 6 m plant bed length. A friction factor is then determined numerically for each of the laboratory experiments, and a set of equations is derived for predicting a roughness coefficient for vegetation densities ranging between 97 stems/m² and 162 stems/m².
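Since the fitted relationships above are expressed in terms of the Ursell number, the following sketch shows how that parameter is typically computed for a given wave condition, solving the linear dispersion relation for the wavelength first. The wave height, period, and depth are hypothetical laboratory-scale values, and the thesis's own fitted friction-factor equations are not reproduced here.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def wavelength(period_s, depth_m, tol=1e-8, max_iter=100):
    """Solve the linear dispersion relation L = (g T^2 / 2 pi) tanh(2 pi h / L)
    by fixed-point iteration, starting from the deep-water wavelength."""
    L0 = G * period_s**2 / (2.0 * np.pi)   # deep-water wavelength as first guess
    L = L0
    for _ in range(max_iter):
        L_new = L0 * np.tanh(2.0 * np.pi * depth_m / L)
        if abs(L_new - L) < tol:
            break
        L = L_new
    return L

def ursell_number(wave_height_m, period_s, depth_m):
    """Ursell number Ur = H L^2 / h^3, the nonlinearity parameter used to
    correlate the fitted wave friction factors."""
    L = wavelength(period_s, depth_m)
    return wave_height_m * L**2 / depth_m**3

# Example: laboratory-scale conditions (placeholder values, not from the thesis)
H, T, h = 0.10, 1.5, 0.40   # wave height (m), period (s), water depth (m)
print(f"L  = {wavelength(T, h):.2f} m")
print(f"Ur = {ursell_number(H, T, h):.1f}")
```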
14. Potential impacts of vertical cable seismic: modeling, resolution and multiple attenuation
Wilson, Ryan Justin. 30 September 2004 (links)
Vertical cable seismic methods are becoming more relevant as demand grows for high-quality, high-resolution seismic data in both land and marine environments. Our goal in this thesis is to demonstrate the impacts of vertical cable surveying in these settings.
Vertical cable methods have been applied to the marine environment with encouraging results. Data quality is similar to that of traditional towed-streamer data, without the long, cumbersome towed streamers that are difficult to maneuver in congested areas. Current marine vertical cable processing schemes tend to use primaries and receiver ghosts of primaries for imaging. We therefore demonstrate the ability of the multiple attenuation algorithms developed by Ikelle (2001) to preserve either primaries or the receiver ghosts of primaries.
Turning to land acquisition, we find that vertical cable surveying can overcome many of the traditional problems of land seismic acquisition. In fact, our investigations lead us to believe that problems such as ground roll, guided waves and statics can be avoided almost entirely using vertical cable acquisition methods. Furthermore, land vertical cable surveying is naturally suited to multi-component acquisition and time-lapse surveying.
To fully analyze the applicability of vertical cable surveys in marine and land environments, we also investigate the problem of cable spacing and sampling along each cable. We compare the resolution of vertical cable data and horizontal (surface) data by calculating the maximum angular coverage of each acquisition geometry and measuring the occurrence of each angle within this coverage, where more occurrences imply better resolution. From these investigations we find that, by using vertical cables no more than 500 m in length at 500 m intervals, we can acquire higher-resolution seismic data than with horizontal surface methods for an image point, a horizontal reflector, or a dipping reflector.
The key tool used in these investigations is fully elastic finite-difference modeling. We chose this technique for its ability to accurately model the full wavefield through complex models while preserving the amplitudes and phases of reflected, diffracted and converted wavefields.
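The modeling engine referred to above is a fully elastic finite-difference code. As a much-reduced illustration of the underlying idea only, the sketch below steps a 1-D acoustic wave equation forward in time with a second-order explicit scheme; the velocity model, source wavelet, and grid parameters are placeholders, and elastic effects, absorbing boundaries, and higher dimensions are all omitted.

```python
import numpy as np

nx, dx = 801, 5.0                 # grid points and spacing (m)
c = np.full(nx, 2000.0)           # hypothetical two-layer velocity model (m/s)
c[nx // 2:] = 3000.0
dt = 0.4 * dx / c.max()           # time step satisfying the CFL stability limit
nt = 2200
src_i, rec_i = 100, 700           # source and receiver grid indices

def ricker(t, f0=15.0, t0=0.08):
    """Ricker wavelet source time function."""
    a = (np.pi * f0 * (t - t0)) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

u_prev = np.zeros(nx)             # wavefield at time step n-1
u_curr = np.zeros(nx)             # wavefield at time step n
trace = np.zeros(nt)              # receiver recording
r2 = (c * dt / dx) ** 2

for it in range(nt):
    lap = np.zeros(nx)            # second spatial difference on interior points;
    lap[1:-1] = u_curr[2:] - 2.0 * u_curr[1:-1] + u_curr[:-2]   # edges act as rigid boundaries
    u_next = 2.0 * u_curr - u_prev + r2 * lap
    u_next[src_i] += (dt ** 2) * ricker(it * dt)    # inject the source term
    u_prev, u_curr = u_curr, u_next
    trace[it] = u_curr[rec_i]

peak_it = int(np.argmax(np.abs(trace)))
print(f"peak arrival at the receiver: {peak_it * dt:.2f} s")
```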
15. CO2 rock physics: a laboratory study
Yam, Helen. Unknown Date
No description available.
16. The use of emission-transmission computed tomography for improved quantification in SPECT
Villafuerte, Mercedes Rodriguez. January 1994 (links)
The attenuation of photons within the body has been recognised as the major factor limiting the ability of single photon emission computed tomography (SPECT) to serve as a quantitative technique. This thesis investigates several aspects of an emission-transmission SPECT system using the Monte Carlo method and experimental techniques. The system was based on a rotating gamma camera fitted with a parallel-hole collimator. The simulation of a transmission study was performed using a simple non-uniform mathematical phantom with two different external sources, a collimated line source and a flood source. The results showed that the attenuation maps were highly dependent on the geometry and photon energy of the source; the collimated line source produced improved image quality with lower statistical noise than the flood source. The results also showed that, when high atomic number elements are present in the tissue composition, the attenuation coefficients at different energies are related through a second-order polynomial transformation, whereas a linear transformation holds if the object under study consists of soft-tissue-equivalent materials. The attenuation maps generated in the transmission study were used to perform non-uniform attenuation compensation of an emission phantom. The results showed that non-uniform attenuation compensation improved image quality and reduced noise when compared with data without attenuation compensation. The presence of scattered photons in the emission data reduced the quality of the images and precluded accurate quantification. Absolute quantification was performed using the percent air sensitivity criterion; the largest difference between the theoretical and the Monte Carlo simulated images was approximately 8%. An emission-transmission myocardial perfusion study was simulated using an anthropomorphic phantom. Two photon energies of clinical interest were used, 75 keV and 140 keV, corresponding to the main photon emission energies of 201Tl and 99mTc. The results showed that 99mTc provided better image quality than 201Tl. Non-uniform attenuation compensation produced very good agreement between the theoretical prediction and the simulation when scatter-free data were considered. The results presented in this thesis indicate that accurate attenuation compensation is not possible in general situations unless scatter correction is also applied.
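The energy-mapping result above (linear for soft tissue, second-order polynomial when high atomic number elements are present) is what allows a transmission-energy attenuation map to be converted to the emission energy. The sketch below shows only the simple soft-tissue case, scaling a map by the ratio of water attenuation coefficients; the coefficient values are approximate round numbers quoted for illustration and are not taken from the thesis.

```python
import numpy as np

# Approximate linear attenuation coefficients of water (cm^-1); values of this
# order appear in standard tabulations and are used here only for illustration.
MU_WATER = {75: 0.188, 100: 0.171, 140: 0.154}

def rescale_attenuation_map(mu_map_tr, e_tr_keV, e_em_keV):
    """Rescale a transmission-energy attenuation map to the emission energy
    with the simple linear (soft-tissue) transformation discussed above.
    Objects containing high atomic number materials would need a higher-order fit."""
    scale = MU_WATER[e_em_keV] / MU_WATER[e_tr_keV]
    return scale * np.asarray(mu_map_tr, dtype=float)

# Toy 2-D attenuation map "measured" at 100 keV: a water-like disc in air
ny = nx = 64
y, x = np.mgrid[0:ny, 0:nx]
mu_100 = np.where((x - 32) ** 2 + (y - 32) ** 2 < 20 ** 2, MU_WATER[100], 0.0)

mu_140 = rescale_attenuation_map(mu_100, 100, 140)   # for 99mTc imaging
mu_75 = rescale_attenuation_map(mu_100, 100, 75)     # for 201Tl imaging
print("central mu at 140 keV:", mu_140[32, 32])
print("central mu at  75 keV:", mu_75[32, 32])
```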
17. Transcriptome and microRNome of Theileria annulata Host Cells
Rchiad, Zineb. 06 1900 (links)
Tropical theileriosis is a parasitic disease of calves with a profound economic impact, caused by Theileria annulata, an apicomplexan parasite of the genus Theileria. Transmitted by Hyalomma ticks, T. annulata infects and transforms bovine lymphocytes and macrophages into a cancer-like phenotype characterized by all six hallmarks of cancer. In the current study we investigate the transcriptional landscape of T. annulata-infected lymphocytes to define genes and miRNAs regulated by host cell transformation, using next generation sequencing. We also define genes and miRNAs differentially expressed as a result of the attenuation of a T. annulata-infected macrophage cell line used as a vaccine. By comparing the transcriptional landscape of one attenuated and two transformed cell lines, we identify four genes that we propose as key factors in the transformation and virulence of T. annulata host cells. We also identify miR-126-5p as a key regulator of the proliferation, adhesion, survival and invasiveness of infected cells. In addition to the host cell transcriptome, we studied the T. annulata transcriptome and identified the role of ROS and TGF-β2 in controlling parasite gene expression. Moreover, we used the deep parasite ssRNA-seq data to refine the available T. annulata annotation. Taken together, this study provides the full list of host cell genes and miRNAs transcriptionally perturbed after infection with T. annulata and after attenuation, and describes genes and miRNAs not previously identified as players in this type of host cell transformation. It also provides the first database of the transcriptome of T. annulata and its host cells generated by next generation sequencing.
18. The Attenuation of Solar Radiation in Urban Atmospheres
Tanabe, Richard H. 04 1900 (links)
Unsworth and Monteith's (1972) aerosol attenuation coefficient τA was calculated from hourly cloudless-sky data at four North American and four European stations for varying time periods. Monthly and seasonal turbidity trends were examined. Annual cycles were observed, with summer maxima and winter minima. The North American stations were less turbid and had more pronounced trends than the European stations. Both air mass origin and local weather affect the turbidity. Local sources of pollution have a significant effect on turbidity, most notably in large urban centres.
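Conceptually, the aerosol attenuation coefficient is an optical depth inferred from how far the measured direct beam falls below what an aerosol-free atmosphere would deliver at the same air mass. The sketch below shows a calculation of that general form only; it does not reproduce the exact 1972 formulation, and the clean-atmosphere reference irradiance, solar elevation, and measured value are placeholders.

```python
import math

def aerosol_attenuation_coefficient(s_measured, s_clean, air_mass):
    """Broadband aerosol attenuation coefficient in the spirit of the
    Unsworth-Monteith turbidity parameter: tau_a = ln(S_clean / S_meas) / m,
    where S_clean is the direct-beam irradiance expected for the same air
    mass in an aerosol-free but otherwise identical atmosphere."""
    return math.log(s_clean / s_measured) / air_mass

def simple_air_mass(solar_elevation_deg):
    """Plane-parallel air mass approximation, adequate for high sun angles."""
    return 1.0 / math.sin(math.radians(solar_elevation_deg))

# Hypothetical hourly cloudless observation (all values are placeholders)
m = simple_air_mass(40.0)     # solar elevation of 40 degrees
s_clean = 820.0               # aerosol-free direct beam, W m^-2
s_meas = 640.0                # measured direct beam, W m^-2
tau_a = aerosol_attenuation_coefficient(s_meas, s_clean, m)
print(f"air mass = {m:.2f}, tau_a = {tau_a:.3f}")
```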
19. Development of a diver-deployed instrument for the measurement of sediment density gradients by X-ray attenuation measurements
Guild, Matthew David. 2009 August 1900 (links)
Acoustical interactions with ocean sediments affect a wide range of sonar applications in littoral environments. An important factor in understanding the acoustical behavior of the ocean bottom is how the sediment density changes with depth. Although there are existing techniques for obtaining information about sediment gradients, these methods are unable to provide direct measurements of the sediment density gradient without significantly disrupting the test site and without substantial diver support for installation and implementation.
The proposed X-Ray Attenuation Measurement (XRAM) device aims to improve upon these existing techniques, with the goal of being a portable, diver-operated device that can perform direct in situ measurements of sediment density gradients without significant disruption of the ocean bottom. To accomplish this, the XRAM uses the attenuation of x-rays passing through the sediment to measure the density as a function of depth, and is arranged in a compact, portable design that can be deployed and operated by a single diver. The layout and basic design of the XRAM device are discussed, and a physical model of its operation is developed. Results of experimental testing on homogeneous liquid samples and liquid/solid mixtures, carried out to evaluate the effectiveness of the XRAM device in measuring density gradients, are presented. Based on the analysis of these results, recommendations for improving performance in future development are given.
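The density measurement rests on the Beer-Lambert law: for a fixed path length and a known effective mass attenuation coefficient, the transmitted count rate determines the bulk density along that path. The sketch below inverts that relationship for a column of hypothetical count values; the incident counts, path length, and mass attenuation coefficient are assumed placeholders, not XRAM specifications.

```python
import numpy as np

def density_from_transmission(i_transmitted, i_incident, mass_atten_cm2_per_g, path_cm):
    """Invert the Beer-Lambert law I = I0 * exp(-(mu/rho) * rho * x) for the
    bulk density rho along a fixed path length x (the source-detector gap)."""
    return np.log(i_incident / np.asarray(i_transmitted, dtype=float)) / (
        mass_atten_cm2_per_g * path_cm
    )

# Placeholder instrument values (not the actual XRAM specifications)
I0 = 1.0e5            # incident counts
path = 5.0            # source-detector separation through the sediment, cm
mu_over_rho = 0.20    # assumed effective mass attenuation coefficient, cm^2/g

# Hypothetical transmitted counts at increasing depth: denser sediment -> fewer counts
counts = np.array([27000, 23000, 19000, 16500, 15000])
depths = np.array([0.0, 2.0, 4.0, 6.0, 8.0])   # cm below the sediment interface
rho = density_from_transmission(counts, I0, mu_over_rho, path)
for d, r in zip(depths, rho):
    print(f"depth {d:3.1f} cm: density ~ {r:.2f} g/cm^3")
```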
20. Naturlig nedbrytning av klorerade lösningsmedel i grundvatten / Natural attenuation of chlorinated solvents in groundwater
Nugin, Kaisa. January 2004 (links)
Chlorinated solvents are common contaminants in soil and water. Under anaerobic conditions, microbes are capable of transforming chlorinated solvents into ethylene, which would result in remediation of the contaminated area. In order to use natural attenuation as a remediation method, evidence of ongoing degradation is required; the degradation must occur at a sufficient rate, and continuous monitoring of the site is needed until the target levels are achieved. A field study was performed on the basis of data from a dry-cleaning facility contaminated mainly by perchloroethylene. The purpose of the study was to characterize the existing situation regarding distribution and transformation of the contaminants, in order to evaluate the possibility of using natural attenuation as a remediation method. Degradation of perchloroethylene proceeds through successive removal of chlorine, with the formation of trichloroethylene, dichloroethylene, vinyl chloride and ethylene. There is evidence of degradation as far as vinyl chloride at the site, but whether transformation continues to ethylene is not established. The computer model BIOCHLOR was used to simulate distribution and degradation of the contaminants. The site has a complex hydrogeology, and the existing data are not sufficient to distinguish the effect of degradation from other factors, such as the spreading of contaminants between different soil layers. Since degradation could not be quantified, natural attenuation cannot be recommended as a reliable remediation method at this site without further investigation.
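The reductive dechlorination pathway described above (PCE to TCE to DCE to vinyl chloride to ethene) is a sequential chain of first-order reactions of the kind BIOCHLOR represents. The sketch below integrates that reaction chain alone, without the advection, dispersion, and sorption terms BIOCHLOR also includes; the rate constants and initial concentration are placeholders rather than values calibrated to the site.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Sequential first-order dechlorination PCE -> TCE -> DCE -> VC -> ethene.
# Rate constants (per year) are placeholders, not site-calibrated values.
k = {"PCE": 1.0, "TCE": 0.7, "DCE": 0.4, "VC": 0.2}
species = ["PCE", "TCE", "DCE", "VC", "ethene"]

def decay_chain(t, c):
    """dC/dt for each species: loss by its own first-order decay plus
    production from the degradation of its parent compound."""
    c_pce, c_tce, c_dce, c_vc, c_eth = c
    return [
        -k["PCE"] * c_pce,
        k["PCE"] * c_pce - k["TCE"] * c_tce,
        k["TCE"] * c_tce - k["DCE"] * c_dce,
        k["DCE"] * c_dce - k["VC"] * c_vc,
        k["VC"] * c_vc,
    ]

c0 = [100.0, 0.0, 0.0, 0.0, 0.0]        # initial PCE concentration (arbitrary molar units)
t_eval = np.linspace(0.0, 10.0, 11)      # years
sol = solve_ivp(decay_chain, (0.0, 10.0), c0, t_eval=t_eval, rtol=1e-8)

for name, row in zip(species, sol.y):
    print(f"{name:7s} " + " ".join(f"{v:6.1f}" for v in row))
```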