
On the formulation of the alternative hypothesis for geodetic outlier detection

Lehmann, Rüdiger, 24 July 2014
The concept of outlier detection by statistical hypothesis testing in geodesy is briefly reviewed. The performance of such tests can only be measured or optimized with respect to a proper alternative hypothesis. Firstly, we discuss the important question of whether gross errors should be treated as non-random quantities or as random variables. In the first case, the alternative hypothesis must be based on the common mean-shift model, while in the second case, the variance-inflation model is appropriate. Secondly, we review possible formulations of alternative hypotheses (inherent, deterministic, slippage, mixture) and discuss their implications. As measures of optimality of an outlier detection, we propose the premium and protection, which are briefly reviewed. Finally, we work out a practical example: the fit of a straight line. It demonstrates the impact of the choice of an alternative hypothesis for outlier detection.
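As a toy illustration of the mean-shift alternative in the straight-line example, consider a least-squares fit in which one observation carries a gross error and is screened with standardized residuals (a w-test-style statistic; the noise level, the shifted observation and the critical value below are assumptions for the sketch, not the thesis's setup):

```python
import numpy as np

rng = np.random.default_rng(1)

# Straight-line fit y = a + b*x with one gross error (mean-shift model):
# the outlier is modeled as a deterministic shift added to one observation.
x = np.linspace(0.0, 10.0, 20)
y = 2.0 + 0.5 * x + rng.normal(0.0, 0.1, x.size)
y[7] += 1.0                      # gross error (mean shift) in observation 7

A = np.column_stack([np.ones_like(x), x])   # design matrix
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
res = y - A @ coef                          # residuals

# Redundancy parts from the hat matrix: r_i = 1 - h_ii
H = A @ np.linalg.inv(A.T @ A) @ A.T
r = 1.0 - np.diag(H)

sigma0 = 0.1                                 # assumed known noise level
w = res / (sigma0 * np.sqrt(r))              # standardized (w-test) statistics

suspect = int(np.argmax(np.abs(w)))
print(suspect, np.abs(w).max() > 3.29)       # 3.29 ~ two-sided alpha = 0.001
```

The flagged index coincides with the shifted observation because the mean shift propagates into a large standardized residual, while the genuinely random errors stay near the null distribution.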

Efficient Algorithms for Evaluating Portfolio Credit Risk with Extremal Dependence

Shih, Ming Ju (施明儒), date unknown
Monte Carlo simulation is a useful tool in portfolio credit risk management. When measuring portfolio credit risk, one should choose an appropriate model to characterize the dependence among all assets. The normal copula is the most widely used mechanism to capture this dependence structure; however, some empirical studies suggest that the t-copula provides a better fit to market data than the normal copula does. In this work, we use the extremal dependence model proposed by Bassamboo et al. (2008) to construct the t-copula. We also extend the importance sampling (IS) procedure proposed by Chiang et al. (2007) to evaluate basket credit default swaps (BDS) with extremal dependence, and introduce a two-step IS algorithm which ensures that credit events always take place on every simulation path. Numerical results show that the proposed methods achieve variance reduction. If the model has a lower degree of freedom, or the portfolio size is larger, the two-step IS method is more efficient. Following the same idea, we also propose algorithms to estimate the probability of portfolio losses. Although the desired events may not occur on some simulation paths even when the IS technique is applied, numerical results still show that the proposed method is much more efficient than crude Monte Carlo.
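The variance-reduction idea behind importance sampling can be illustrated on a much simpler rare event than the credit-risk model above: a Gaussian tail probability, where shifting the sampling density onto the rare region plays the role of the measure change (this is a generic IS sketch, not the two-step algorithm of the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)
n, a = 100_000, 4.0                 # rare event {Z > a}, true p ~ 3.17e-5

# Crude Monte Carlo: almost no sample ever hits the event.
z = rng.normal(size=n)
p_crude = np.mean(z > a)

# Importance sampling: draw from N(a, 1) so the event is common,
# then reweight each sample by the likelihood ratio phi(y)/phi(y - a).
y = rng.normal(loc=a, size=n)
lr = np.exp(-a * y + 0.5 * a * a)   # N(0,1) density over N(a,1) density
p_is = np.mean((y > a) * lr)

print(p_crude, p_is)
```

With the shifted density, roughly half the samples land in the event, so the weighted estimator is accurate at a sample size where the crude estimator typically sees no event at all.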

Predictive maintenance strategies for multi-component systems with complex structure

Nguyen, Kim Anh, 16 October 2015
Today, industrial systems are becoming more and more complex. This complexity is due partly to system structures that cannot be reduced to the classic reliability structures (series, parallel, series-parallel, etc.), and partly to the presence of components subject to gradual degradation phenomena that monitoring systems can observe. This leads to the main purpose of this thesis: the development of predictive maintenance strategies for complex multi-component systems.
The proposed policies provide maintenance grouping strategies that take advantage of the economic dependence between components. Importance measures accounting for the structure of the system and the economic dependence are developed and combined with the predictive reliability of the components to construct the grouping decision rules. Moreover, a joint decision rule for maintenance and spare-parts provisioning is also studied. All the conducted studies show the value of taking the predictive reliability of components, economic dependencies and the complex structure of the system into account in maintenance and spare-parts decisions. The advantage of the developed strategies is confirmed by comparison with existing strategies from the literature.
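As an illustration of how importance measures of the kind mentioned above can be computed, here is a small Monte Carlo sketch of the classical Birnbaum importance for a hypothetical four-component structure (the system layout and component reliabilities are invented for the example; the thesis develops its own importance factors adapted to grouping decisions):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 4-component system: components 1 and 2 in series,
# feeding a parallel pair (3, 4). Structure function:
def works(s):                     # s: boolean array (..., 4)
    return s[..., 0] & s[..., 1] & (s[..., 2] | s[..., 3])

p = np.array([0.95, 0.90, 0.80, 0.70])     # assumed predictive reliabilities
n = 200_000
s = rng.random((n, 4)) < p                  # sampled component states

# Birnbaum importance of component i: P(system | i up) - P(system | i down)
birnbaum = []
for i in range(4):
    up, down = s.copy(), s.copy()
    up[:, i], down[:, i] = True, False
    birnbaum.append(works(up).mean() - works(down).mean())
print(np.round(birnbaum, 3))
```

For this structure the series components dominate: the exact values are 0.846 and 0.893 for the two series components versus 0.257 and 0.171 for the parallel pair, which is the kind of ranking a grouping rule can exploit.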

Computational modeling of proton beam tomography

Olga Yevseyeva, 16 February 2009
Fundação Carlos Chagas Filho de Amparo à Pesquisa do Estado do Rio de Janeiro / In the present work a preliminary study via computer simulations was carried out in order to elaborate a prior experimental program for the first proton computed tomography (pCT) setup in Brazil. Proton therapy is a highly precise form of cancer treatment. Treatment planning is nowadays performed based on X-ray computed tomography (CT) data; alternatively, the same procedure could be performed using proton computed tomography (pCT). Some important questions, such as the scale effect and the so-called calibration curve (the source of primary data for pCT treatment planning), were studied in this work. The passage of 19.68 MeV, 23 MeV, 25 MeV, 49.10 MeV and 230 MeV protons through various absorbers (water, aluminum, polyethylene, gold) was simulated with the popular Monte Carlo packages SRIM and GEANT4. The simulation results were compared with a theoretical prediction based on an approximate solution of the Boltzmann transport equation, and with simulation results from another popular Monte Carlo code, MCNPX. A comparative analysis of the simulation results against experimental data published in the scientific literature for thick absorbers, within the energy range used in pCT measurements, was performed. It was noted that, although all codes show similar results, some non-systematic displacements can be observed. Important observations about the precision of the codes were made, and the need for systematic measurements of proton stopping power in thick absorbers was stated.
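The stopping-power and range questions discussed above can be illustrated with a toy continuous-slowing-down calculation. The sketch below uses the Bragg-Kleeman rule R = αE^p with water-like parameter values taken as assumptions, and ignores straggling and nuclear interactions, so it is a self-consistency check on the numerics rather than a substitute for SRIM, GEANT4 or MCNPX:

```python
import numpy as np

# Toy continuous-slowing-down model via the Bragg-Kleeman rule
# R(E) = alpha * E**p (range in cm, E in MeV); alpha and p are assumed
# water-like values, with straggling and nuclear interactions ignored.
alpha, p = 0.0022, 1.77
E0 = 100.0                      # initial proton energy, MeV

# Stopping power implied by the rule: dE/dx = E**(1-p) / (alpha*p)
def stopping(E):
    return E ** (1.0 - p) / (alpha * p)

dx, E, depth = 1e-3, E0, 0.0    # slab thickness in cm
while E > 0.5:
    E -= stopping(E) * dx       # energy lost in this slab
    depth += dx

print(depth, alpha * E0 ** p)   # numerical range vs closed-form range
```

The slab-by-slab integration reproduces the closed-form range to within the step size, which is the kind of consistency one also expects between independent transport codes before comparing them to thick-target data.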

Elements of risk theory in finance and insurance

Mostoufi, Mina, 17 December 2015
This thesis deals with risk theory in finance and insurance. The application of the comonotonicity concept, the strongest form of risk dependence, is described for identifying Pareto optima and individually rational Pareto-optimal allocations, for option pricing and for the quantification of risk. Furthermore, it is shown that left-monotone risk aversion, a meaningful refinement of strong risk aversion, characterizes the Yaari decision makers for whom deductible insurance is optimal.
The concept of comonotonicity is introduced and discussed in Chapter 1. In the case of multiple risks, the idea that a natural way for insurance companies to optimally share risks is risk-by-risk Pareto optimality is adopted. Moreover, the Pareto-optimal and individually rational Pareto-optimal allocations are characterized. Chapter 2 investigates the application of the comonotonicity concept in option pricing and quantification of risk. A novel control-variate Monte Carlo method is introduced and its application is explained for basket options, Asian options and the TVaR. Finally, in Chapter 3, strong risk aversion is refined by introducing left-monotone risk aversion, which characterizes the optimality of deductible insurance within Yaari's model. More importantly, it is shown that the computation of the deductible is tractable.
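A control-variate Monte Carlo sketch in the spirit of Chapter 2, priced here for an arithmetic Asian call. Note that the control used below is the terminal price S_T, whose risk-neutral mean S0·e^{rT} is known — a textbook choice for illustration, not the comonotonic control variate developed in the thesis — and all market parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Arithmetic Asian call under Black-Scholes dynamics (assumed parameters).
S0, K, r, sigma, T, m, n = 100.0, 100.0, 0.05, 0.2, 1.0, 50, 20_000
dt = T / m

z = rng.normal(size=(n, m))
log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
S = S0 * np.exp(log_paths)                   # simulated price paths

payoff = np.exp(-r * T) * np.maximum(S.mean(axis=1) - K, 0.0)
control = S[:, -1]                           # control variate: S_T
c_mean = S0 * np.exp(r * T)                  # its known expectation

# Regression coefficient minimizing the variance of the adjusted estimator
beta = np.cov(payoff, control)[0, 1] / np.var(control)
cv_est = payoff - beta * (control - c_mean)

print(payoff.mean(), cv_est.mean(), payoff.std() / cv_est.std())
```

Because the Asian payoff is strongly correlated with the terminal price, subtracting the centered control leaves the mean unchanged while shrinking the standard deviation by a noticeable factor; a comonotonic control, tracking the payoff even more closely, aims at a larger reduction.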

Three-dimensional tolerancing assistance: domains model

Mansuy, Mathieu, 25 June 2012
As the demands on quality and manufacturing cost of manufactured products become ever more exacting, the optimal qualification and quantification of acceptable defects is essential. Tolerancing is the means of communication between all the actors involved in the product manufacturing cycle. Optimal tolerancing is the right compromise between manufacturing cost and quality of the final product. Tolerancing rests on three major issues: specification (standardization of a complete and unambiguous language), synthesis, and analysis of tolerances. In this thesis we propose new methods for the analysis and synthesis of three-dimensional tolerancing. These methods are based on a geometric model defined by the clearance and deviation domains developed in the laboratory. The first step consists in determining the elementary topologies that compose a three-dimensional mechanism. For each of these topologies a resolution method is defined. In the worst case, the conditions for respecting the functional requirements are translated into existence and inclusion conditions on the domains. These domain equations can then be translated into a system of scalar inequalities. The statistical analysis relies on Monte Carlo simulation. The random variables are the small-displacement components of the deviation torsor, defined inside their tolerance zone (modeled by a deviation domain), and the geometric dimensions fixing the extent of the clearances (the size of the associated clearance domain). From the statistical simulations it is possible to estimate the risk of non-quality and the residual clearances as a function of the defined tolerancing.
The development of a new, more suitable representation of the clearance and deviation domains simplifies the calculations involved in tolerancing problems. The local treatment of each elementary topology enables the global treatment of complex three-dimensional mechanisms while taking clearances into account.

Simulation of elastic scattering profiles of breast tissues and equivalent materials by Monte Carlo code

Sato, Karoline Akemi, 23 February 2018
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) / In radiodiagnostic procedures such as mammography, scattered radiation is commonly regarded as a problem because it degrades image contrast. However, recent studies have shown that it is possible to identify abnormalities in biological tissue from the angular distribution of the scattered photons (the scattering profile), which provides detailed information on the structural composition of the tissue. Basic studies of this type have relied on computational simulation, owing to difficulties and limitations in obtaining, storing and manipulating samples, in the experimental instrumentation, and because of the ethical issues involved. In this work, the elastic scattering profiles of normal and pathological breast tissues, and of materials equivalent to these tissues, were simulated in the region of intermediate angles, known as WAXS (Wide Angle X-ray Scattering), since this region provides information about the structures responsible for scattering at the molecular level. The computational simulation of the scattering profiles used a Monte Carlo code, with the aid of the MC-GPU software. Its advantage over other codes for simulating profiles in the WAXS region is the possibility of including experimentally measured form factors, which incorporate the molecular interference function, instead of only the form factors calculated with the independent atomic model. Two cylindrical virtual phantoms were constructed with internal cylindrical inserts filled with the following materials: adipose tissue, glandular tissue, water, dimethylformamide, ethanol, glycerol and nylon, composing 27 combinations, with specific simulations for each one. These materials were chosen because they present attenuation characteristics similar to those of normal and pathological breast tissues at the energy used. A monoenergetic photon beam (Kα-Cu = 8.54 keV) and a two-dimensional detector were used. The scattering patterns obtained were integrated to yield the scattering profiles. The results with the virtual phantoms were very similar to those reported in the literature for each of the pure materials inserted in the phantoms. Thus, this work demonstrated the possibility of including the experimental form factors of each material in the simulations of the scattering profiles of normal and pathological breast phantoms, obtaining more realistic results and cataloging them so that they can serve as a database for future work. Therefore, although preliminary, the results of this work support the exploration of new breast imaging techniques based on elastic X-ray scattering.
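The integration step that turns the two-dimensional detector pattern into a one-dimensional scattering profile can be sketched as an azimuthal (radial) binning; the synthetic ring image below, with an invented ring radius and width, stands in for a real detector readout:

```python
import numpy as np

# Azimuthal integration of a synthetic 2-D scattering pattern into a 1-D
# profile: pixels are binned by their radius from the beam center and
# averaged. The ring position and width are invented for the example.
size, ring_r = 256, 60.0
yy, xx = np.mgrid[:size, :size]
r = np.hypot(xx - size / 2, yy - size / 2)

image = np.exp(-((r - ring_r) ** 2) / (2 * 4.0 ** 2))   # ring-shaped pattern

nbins = 100
bins = np.linspace(0, r.max(), nbins + 1)
idx = np.clip(np.digitize(r.ravel(), bins) - 1, 0, nbins - 1)
sums = np.bincount(idx, weights=image.ravel(), minlength=nbins)
counts = np.bincount(idx, minlength=nbins)
profile = sums / np.maximum(counts, 1)       # mean intensity per radial bin

peak_bin = int(np.argmax(profile))
peak_r = 0.5 * (bins[peak_bin] + bins[peak_bin + 1])
print(peak_r)
```

The profile's peak recovers the ring radius, which in a real measurement maps (through the geometry and wavelength) to the momentum-transfer position of a molecular-structure peak.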

Evaluation of the interaction of monoenergetic and polyenergetic beams by means of GEANT4 simulations in various phantoms

Yagui, Akemi, 06 July 2017
Proton therapy is present in 16 countries and by 2015 had treated more than 130,000 patients. In Brazil, however, this therapy is not yet available for several reasons, the main one being its high cost. Before performing treatments, it is necessary to run tests to verify the energy delivery of the proton beams. As microdosimetry measurements are very expensive, the main alternative is to carry out simulations in computational codes such as GEANT4 and SRIM. GEANT4 is a toolkit that allows complex geometries to be simulated, while SRIM performs simpler simulations; both work with the Monte Carlo method. In this work these two tools were used to simulate proton beams in phantoms with three different compositions (water; water and bone tissue; bone and brain tissue). To analyze the energy delivery of the proton beams along these phantoms, it became necessary to create a program called "Programa de Processamento de Dados em Próton Terapia Simulada" (Data Processing Program for Simulated Proton Therapy), which made it possible to build matrices and to calculate the Bragg peaks for evaluating the interaction. In addition, the homogeneity of the interaction of a proton beam with a detector was analyzed; it was verified that the GEANT4 simulations are homogeneous, with no tendency of the beam to concentrate in a particular region, and the deposited energies are equal across the regions of the phantom. The depths of the Bragg peaks were also evaluated in cylindrical phantoms with three different densities: 1.03 g/cm³, 1.53 g/cm³ and 2.03 g/cm³, the first being the density provided by GEANT4 for brain tissue. It was found that the Bragg-peak depths are the same at these three different densities.

Estimates of statistical moments for the stochastic bending problem of a beam on a Pasternak foundation

Santos, Marcelo Borges dos, 20 March 2015
This dissertation addresses the stochastic bending problem of an Euler-Bernoulli beam on a Pasternak-type foundation through a computational method based on Monte Carlo simulation. The uncertainty lies in the elastic coefficients of the beam and of the foundation. First, the mathematical formulation of the problem is established from a physical model of the beam displacement that takes into account the influence of the foundation on the response; a study of the most usual foundation models, the Winkler and Pasternak models, is therefore presented. Next, the existence and uniqueness of the solution of the abstract variational problem derived from the strong formulation are proved. To obtain the solution of the problem, a mathematical foundation is laid for the following subjects: representation of uncertainty, the Galerkin method, the Neumann series, and finally the lower and upper bounds. Finally, the performance of the lower and upper bounds, relative to direct Monte Carlo simulation, is evaluated through several cases in which the uncertainty lies in the different coefficients composing the bending equation in the form of a variational problem. The method proved to be efficient, both in terms of the convergence of the response and in terms of computational cost.
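A minimal sketch of the Monte Carlo approach described above, assuming a central-difference discretization of the Pasternak beam equation EI·w'''' − Gp·w'' + k·w = q with lognormal uncertainty on the bending and foundation stiffness (all numerical values are invented for illustration; the dissertation works with the variational formulation and bounds rather than this finite-difference toy):

```python
import numpy as np

rng = np.random.default_rng(5)

# Simply supported Euler-Bernoulli beam on a Pasternak foundation:
# EI*w'''' - Gp*w'' + k*w = q on (0, L), w = 0 at both ends.
L, n, q, Gp = 1.0, 99, 1.0e3, 1.0e3      # length, interior nodes, load, shear layer
h = L / (n + 1)

def deflection(EI, k):
    # Fourth difference; simply supported ends use the ghost value w_{-1} = -w_1,
    # which turns the corner stencil coefficient 6 into 5.
    main4 = np.full(n, 6.0); main4[0] = main4[-1] = 5.0
    D4 = (np.diag(main4) + np.diag(np.full(n - 1, -4.0), 1)
          + np.diag(np.full(n - 1, -4.0), -1)
          + np.diag(np.ones(n - 2), 2) + np.diag(np.ones(n - 2), -2)) / h**4
    D2 = (np.diag(np.full(n, -2.0)) + np.diag(np.ones(n - 1), 1)
          + np.diag(np.ones(n - 1), -1)) / h**2
    A = EI * D4 - Gp * D2 + k * np.eye(n)
    return np.linalg.solve(A, np.full(n, q))

# Monte Carlo over lognormal EI and foundation modulus k (assumed statistics)
samples = [deflection(1.0e4 * rng.lognormal(0.0, 0.1),
                      1.0e5 * rng.lognormal(0.0, 0.1))[n // 2]
           for _ in range(500)]
mid = np.array(samples)
print(mid.mean(), mid.std())    # first two statistical moments at midspan
```

The sample mean and standard deviation at midspan are exactly the statistical moments that the lower and upper bounds of the dissertation bracket at a fraction of the computational cost of direct simulation.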
