  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
961

Assessing the dose received by the victims of a radiological dispersal device with Geiger-Mueller detectors

Manger, Ryan Paul 10 July 2008 (has links)
This research investigates the use of Geiger-Mueller (G-M) counters to triage individuals who have been exposed to a Radiological Dispersal Device (RDD). Following an RDD release, inhalation of the airborne radionuclide is one of the main ways a person can receive a considerable dose. Bioassay via analysis of excreta is a commonly used method of determining the dose received, but it becomes cumbersome when a large number of people must be screened. An in vivo method must therefore be considered so that a non-intrusive and more efficient triage method can be implemented. Whole body counters are commonly used in counting facilities as an in vivo bioassay method, yet they are limited in number and not easily portable. A more portable and more common detection device should therefore be considered. G-M survey meters are common, highly portable devices, making them ideal candidates to fill this need; their ease of use further contributes to their viability as portable, in vivo screening devices. To analyze this detector, a Monte Carlo model of it was created and used in simulations with the Medical Internal Radiation Dose (MIRD) phantoms. Four detector locations were strategically chosen: the posterior upper right torso, the anterior upper right torso, the lateral upper thigh, and the anterior of the neck. Six phantoms were considered: Reference Male, Reference Female, Adipose Male, Adipose Female, Post-Menopausal Adipose Female, and a Child. Six radionuclides were investigated: Am-241, Co-60, Cs-137, I-131, Ir-192, and Sr-90. The nuclides were distributed throughout the phantoms according to the Dose and Risk Calculation Software, a code that determines how a radionuclide is distributed over time following inhalation, ingestion, or injection. A set of time-dependent guidelines was developed, giving the count rate per unit inhaled dose for each detector location and phantom type.
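A minimal sketch of how a "count rate per unit inhaled dose" guideline of this kind could be applied in practice, assuming the detector efficiency comes from a Monte Carlo tally and the retained fraction and dose coefficient from a biokinetic/dose code; all numerical values below are illustrative placeholders, not results from this thesis.

```python
# Hypothetical illustration (not the thesis's actual data): converting a
# simulated G-M response into "count rate per unit inhaled dose" for triage.
def counts_per_unit_dose(intake_bq, efficiency_cps_per_bq,
                         retained_fraction, dose_coeff_sv_per_bq):
    """Return (net count rate in cps, cps per mSv of committed dose)."""
    retained_activity = intake_bq * retained_fraction          # Bq in body at count time
    count_rate = retained_activity * efficiency_cps_per_bq     # cps at the chosen detector site
    committed_dose_msv = intake_bq * dose_coeff_sv_per_bq * 1e3
    return count_rate, count_rate / committed_dose_msv

# Placeholder values standing in for a Monte Carlo efficiency tally and a
# biokinetic retention fraction at a given time after intake.
cps, cps_per_msv = counts_per_unit_dose(
    intake_bq=1.0e5,
    efficiency_cps_per_bq=2.0e-4,
    retained_fraction=0.8,
    dose_coeff_sv_per_bq=5.0e-9,
)
print(f"{cps:.1f} cps, {cps_per_msv:.2f} cps per mSv committed dose")
```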
962

A multi-objective stochastic approach to combinatorial technology space exploration

Patel, Chirag B. 18 May 2009 (has links)
Several techniques were studied to select and prioritize technologies for a complex system. Based on the findings, a method called Pareto Optimization and Selection of Technologies (POST) was formulated to efficiently explore the combinatorial technology space. A knapsack problem was selected as a benchmark to test-run the various algorithms and techniques of POST. A Monte Carlo simulation using surrogate models was used for uncertainty quantification. Concepts from graph theory were used to model and analyze compatibility constraints among technologies. A probabilistic Pareto optimization, based on the concepts of the Strength Pareto Evolutionary Algorithm II (SPEA2), was formulated for Pareto optimization in an uncertain objective space. As a result, multiple Pareto hyper-surfaces were obtained in a multi-dimensional objective space, each hyper-surface representing a specific probability level. These Pareto layers enabled the probabilistic comparison of various non-dominated technology combinations. POST was implemented on a technology exploration problem for a 300-passenger commercial aircraft. The problem had 29 identified technologies with uncertainties in their impacts on the system; the distributions for these uncertainties were defined using beta distributions. Surrogate system models in the form of Response Surface Equations (RSEs) were used to map the technology impacts onto the system responses. The computational complexity of the technology graph was evaluated, and an evolutionary algorithm was chosen for the probabilistic Pareto optimization. The dimensionality of the objective space was reduced using a dominance-structure-preserving approach, and the probabilistic Pareto optimization was implemented with the reduced set of objectives. Most of the technologies were found to be active on the Pareto layers. These layers were exported to a dynamic visualization environment enabled by the statistical analysis and visualization software JMP. The technology combinations on these Pareto layers were explored using various visualization tools and one combination was selected. The main outcome of this research is a method, based on a consistent analytical foundation, for creating a dynamic tradeoff environment in which decision makers can interactively explore and select technology combinations.
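A hedged sketch of the Pareto-layer idea described above: objectives for each candidate technology combination are sampled under beta-distributed impacts, summarised at a chosen probability level, and filtered for non-dominance. The additive surrogate, the beta parameters, and the objective names are placeholders, not the thesis's RSEs or the actual POST implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_objectives(combo, n=2000):
    """combo: 0/1 vector of selected technologies -> (n, 2) sampled objectives."""
    # Uncertain per-technology impacts drawn from placeholder beta distributions.
    impacts = rng.beta(2.0, 5.0, size=(n, combo.size)) * combo
    fuel_burn = 100.0 - 5.0 * impacts.sum(axis=1)                      # placeholder additive surrogate
    cost = 50.0 + 3.0 * (rng.beta(2.0, 2.0, size=(n, combo.size)) * combo).sum(axis=1)
    return np.column_stack([fuel_burn, cost])

def pareto_layer(combos, prob=0.8):
    """Indices of non-dominated combos when each objective is taken at the `prob` quantile."""
    reps = np.array([np.quantile(sample_objectives(c), prob, axis=0) for c in combos])
    return [i for i, p in enumerate(reps)
            if not any(np.all(q <= p) and np.any(q < p)
                       for j, q in enumerate(reps) if j != i)]

combos = rng.integers(0, 2, size=(20, 29))   # 20 random combinations of 29 technologies
print(pareto_layer(combos, prob=0.8))        # one "Pareto layer" at the 80% level
```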
963

A multi-resolution approach for modeling flow and solute transport in heterogeneous porous media

Gotovac, Hrvoje January 2009 (has links)
Subsurface processes are usually characterized by rare field experiments, sparse measurements, multi-resolution interpretations, stochastic description, related uncertainties and computational complexity. Over the last few decades, different computational techniques and strategies have become indispensable tools for flow and solute transport prediction in heterogeneous porous media. This thesis develops a multi-resolution approach based on Fup basis functions with compact support, enabling the use of an efficient and adaptive procedure closely related to the currently understood physical interpretation. All flow and transport variables, as well as the intrinsic heterogeneity, are described in a multi-resolution representation, in the form of a linear combination of Fup basis functions. Each variable is represented on a particular adaptive grid with a prescribed accuracy. The methodology is applied to solving problems with sharp fronts, and to solving flow and advective transport in highly heterogeneous porous media under mean uniform flow conditions. The adaptive Fup collocation method, through the well-known method of lines, efficiently tracks solutions with sharp fronts, resolving locations and frequencies at all spatial and/or temporal scales. The methodology yields continuous velocity fields and fluxes, enabling accurate and reliable transport analysis. Analysis of the advective transport proves the robustness of first-order theory for low and mild heterogeneity. Moreover, owing to the accuracy of the improved Monte Carlo methodology, this thesis presents the effects of high heterogeneity on ensemble flow and travel time statistics. The difference between Eulerian and Lagrangian velocity statistics and the importance of higher travel time moments are indicative of high heterogeneity. The third travel time moment mostly describes the peak and late arrivals, while higher moments are required for early arrivals, which are linked with the largest uncertainty. A particular finding is the linearity of all travel time moments, which implies that in the limit advective transport in a multi-Gaussian field becomes Fickian. By comparison, the transverse displacement pdf converges to a Gaussian distribution around 20 integral scales after injection, even for high heterogeneity. The capabilities of the presented multi-resolution approach, and the quality of the obtained results, open new areas for further research.
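Purely as an illustration of the method-of-lines idea mentioned above (discretize in space first, then integrate the resulting ODE system in time), the sketch below solves a 1D advection-diffusion front with plain finite differences; it is not the adaptive Fup collocation scheme of the thesis, and the grid size and parameters are arbitrary.

```python
# Generic method-of-lines illustration: upwind advection + central diffusion
# in space, adaptive ODE integration in time. NOT the Fup collocation method.
import numpy as np
from scipy.integrate import solve_ivp

nx, L = 400, 1.0
dx = L / nx
x = np.linspace(0.0, L, nx)
v, D = 1.0, 1e-4                      # advection velocity, dispersion coefficient (placeholders)

def rhs(t, c):
    dc = np.zeros_like(c)
    # interior nodes; c[0] is a fixed inlet concentration, c[-1] a fixed outlet value
    dc[1:-1] = (-v * (c[1:-1] - c[:-2]) / dx
                + D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2)
    return dc

c0 = np.zeros(nx)
c0[0] = 1.0                            # sharp front entering at the inlet
sol = solve_ivp(rhs, (0.0, 0.5), c0, method="RK45", t_eval=[0.1, 0.25, 0.5])

front = x[np.argmax(sol.y[:, -1] < 0.5)]   # approximate front position at t = 0.5
print(f"front near x = {front:.2f} (expected ~ v*t = 0.5)")
```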
964

Calculation of scatter in cone beam CT: steps towards a virtual tomograph

Malusek, Alexandr, January 2008 (has links)
Doctoral dissertation (summary), Linköping: Linköpings universitet, 2008. / With 5 appended papers.
965

Study of spin-lattice relaxation rates in solids: lattice-frame method compared with quantum density-matrix method, and Glauber dynamics

Solomon, Lazarus, January 2006 (has links)
Thesis (M.S.)--Mississippi State University. Department of Physics and Astronomy. / Title from title screen. Includes bibliographical references.
966

Monte Carlo simulations using MCNPX of proton and anti-proton beam profiles for radiation therapy

Handley, Stephen Michael. January 2010 (has links) (PDF)
Thesis--University of Oklahoma. / Bibliography: leaves 90-92.
967

Atomic-scale calculations of interfacial structures and their properties in electronic materials

Liang, Tao, January 2005 (has links)
Thesis (Ph. D.)--Ohio State University, 2005. / Title from first page of PDF file. Document formatted into pages; contains xvi, 136 p.; also includes graphics (some col.). Includes bibliographical references (p. 125-136). Available online via OhioLINK's ETD Center
968

Simulação do Forward Proton Detector do experimento DØ, utilizando o Geant4 / Simulation of the forward proton detector from DØ experiment using Geant4

Sandro Fonseca de Souza 08 November 2005 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / We present here the results of the simulation of the Dipole Spectrometer of the Forward Proton Detector (FPD), using the Geant4 simulation package. The FPD consists of momentum spectrometers placed in the beam pipe on both sides of the DØ experiment. It was built to tag protons and/or antiprotons produced in diffractive events resulting from proton-antiproton collisions at the center of DØ. To estimate the response of our simulation, we used events at √s = 2 TeV generated with PHOJET. Our motivation for simulating the dipole spectrometer of the FPD is to learn and understand how Geant4 operates, in order to use it in future applications at the LHC.
969

A Theoretical Perspective on Hydrogenation and Oligomerization of Acetylene over Pd Based Catalysts / Une étude théorique de l’hydrogénation et l’oligomérisation de l’acétylène sur des catalyseurs de palladium

Vignola, Emanuele 29 September 2017 (has links)
Selective hydrogenation of acetylene in ethylene-rich flows is a fundamental process in the petrochemical industry, since it allows the purification of ethylene for polymer applications. The reaction is catalyzed by Pd, which features acceptable selectivity towards ethylene compared to the total hydrogenation product, ethane. Pure Pd is, however, deactivated by oligomeric byproducts, known as "green oil" in the literature. Therefore, most industrial catalysts are Pd-Ag alloys, where Ag helps to suppress the secondary reactions. This work addresses the formation of initial oligomers on Pd and Ag-Pd catalysts. A mean-field theoretical model was built to efficiently screen the topology of the topmost layer of the alloy catalyst under relevant conditions. This model gave evidence for strongly favored Pd island formation. To confirm this result, the system was then re-investigated by means of Monte Carlo simulations including the effect of segregation. The emergence of large Pd domains was confirmed even at large Ag-to-Pd ratios. Green oil is expected to form on these catalytically active islands. To obtain a detailed view of the oligomerization process, activation energies were computed both for hydrogenation and oligomerization steps by periodic density functional theory on Pd(111). Oligomerization was found to be competitive with hydrogenation, with the hydrogenation of the oligomers being among the fastest processes. The role of Pd domains in green oil formation is still to be clarified under realistic conditions, where the surface is covered by many different species. A step toward this goal was taken by developing a machine-learning tool which automatically interpolates model Hamiltonians on graph lattices based on DFT computations, accounting for lateral interactions and distorted adsorption modes on crowded surfaces.
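A heavily simplified sketch of the kind of lattice Monte Carlo segregation simulation mentioned above: Metropolis exchanges of Pd and Ag atoms on a small 2D lattice with a nearest-neighbour model Hamiltonian whose made-up Pd-Pd attraction stands in for an adsorbate-induced driving force toward island formation. It is not the thesis's Hamiltonian or code, and the parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 32                                # lattice side length
frac_pd = 0.3                         # Pd fraction (placeholder)
lattice = (rng.random((N, N)) < frac_pd).astype(int)   # 1 = Pd, 0 = Ag

J = 0.05                              # eV, made-up effective Pd-Pd attraction
kT = 0.025                            # eV, roughly room temperature

def local_energy(lat, i, j):
    """Energy contribution of site (i, j): -J per Pd-Pd nearest-neighbour pair."""
    nn = lat[(i + 1) % N, j] + lat[(i - 1) % N, j] + lat[i, (j + 1) % N] + lat[i, (j - 1) % N]
    return -J * lat[i, j] * nn

def metropolis_swap(lat):
    """Attempt to swap two randomly chosen sites (composition is conserved)."""
    i1, j1, i2, j2 = rng.integers(0, N, size=4)
    if lat[i1, j1] == lat[i2, j2]:
        return
    e_old = local_energy(lat, i1, j1) + local_energy(lat, i2, j2)
    lat[i1, j1], lat[i2, j2] = lat[i2, j2], lat[i1, j1]
    e_new = local_energy(lat, i1, j1) + local_energy(lat, i2, j2)
    if e_new > e_old and rng.random() >= np.exp(-(e_new - e_old) / kT):
        lat[i1, j1], lat[i2, j2] = lat[i2, j2], lat[i1, j1]   # reject: swap back

for _ in range(100_000):
    metropolis_swap(lattice)

# Crude clustering measure: mean number of Pd nearest neighbours per Pd atom
# (it rises toward 4 on this 2D lattice as islands form).
pd_sites = np.argwhere(lattice == 1)
nn_counts = [-local_energy(lattice, i, j) / J for i, j in pd_sites]
print("mean Pd neighbours per Pd atom:", np.mean(nn_counts))
```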
970

INCERTEZA DE MEDIÇÃO EM ANÁLISES MICOTOXICOLÓGICAS: ESTIMATIVA PELAS ABORDAGENS BOTTOM UP, MONTE CARLO E KRAGTEN / MEASUREMENT UNCERTAINTY IN MYCOTOXINS ANALYSIS: EVALUATION BY BOTTOM UP, MONTE CARLO AND KRAGTEN APPROACHES

Wovst, Liziane Rachel da Silva 06 March 2015 (has links)
Different approaches for estimating the uncertainty associated with measurement results are found in the literature and in published guidelines. In the present work, three approaches were used to estimate the uncertainty of the determination of aflatoxins (AB1, AB2, AG1, AG2) and deoxynivalenol in maize by liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS): the Bottom up approach, adapted from the Guide to the Expression of Uncertainty in Measurement (ISO GUM); the Monte Carlo Method (MCM), which propagates the distributions assigned to the input quantities through a numerical simulation; and the Kragten approach, which calculates standard deviations and confidence intervals with a universally applicable spreadsheet technique. A measurement equation was developed for the mycotoxin analyses, and a cause-and-effect diagram was drafted to assist in identifying the sources of uncertainty associated with the method. A detailed analysis of the contributions of the various uncertainty sources was carried out. The measurement uncertainty was determined by adding the variances of the individual steps of the test procedure, according to each approach employed. The Bottom up, MCM and Kragten approaches produced very similar estimates of the combined uncertainty, with a coefficient of variation (CV) between them smaller than 1.0%. The main contribution to the overall uncertainty is the intermediate precision, which contributes over 90.0% for each mycotoxin. The results of this research indicate that all three approaches are adequate for estimating the uncertainty of mycotoxin assays with the LC-MS/MS technique. Among them, Bottom up is the most appropriate approach, since it requires the analyst to perform a detailed investigation of the dominant components of the measurement uncertainty, allowing for better understanding and improvement of the measurement process. The Monte Carlo and Kragten methods are indicated as tools for confirming the results obtained with the Bottom up approach.
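A hedged sketch of the two numerical approaches named above, applied to a toy measurement model w = c·V/m (analyte content from a measured concentration, extract volume, and sample mass). The model, best estimates, and standard uncertainties are placeholders, not the thesis's measurement equation or data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy measurement model (placeholder, not the thesis's equation):
# mycotoxin content w = c * V / m  (concentration * extract volume / sample mass)
def model(c, V, m):
    return c * V / m

# Placeholder best estimates and standard uncertainties of the input quantities.
x = {"c": 10.0, "V": 25.0, "m": 5.0}           # e.g. ng/mL, mL, g
u = {"c": 0.4,  "V": 0.05, "m": 0.01}

# --- Monte Carlo Method (GUM Supplement 1 style): propagate input distributions ---
n = 200_000
samples = model(rng.normal(x["c"], u["c"], n),
                rng.normal(x["V"], u["V"], n),
                rng.normal(x["m"], u["m"], n))
print(f"MCM:     w = {samples.mean():.3f}, u(w) = {samples.std(ddof=1):.3f}")

# --- Kragten spreadsheet approach: perturb one input at a time by u(x_i) ---
w0 = model(**x)
contributions = {}
for k in x:
    shifted = dict(x, **{k: x[k] + u[k]})
    contributions[k] = model(**shifted) - w0    # approx. sensitivity * u(x_i)
u_kragten = np.sqrt(sum(d**2 for d in contributions.values()))
print(f"Kragten: w = {w0:.3f}, u(w) = {u_kragten:.3f}")
print("relative contributions:",
      {k: round(d**2 / u_kragten**2, 3) for k, d in contributions.items()})
```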
