741 |
Measurements of the mass of the W boson in the W+W- → qqqq channel with the ALEPH detector — Chalmers, Matthew Donald Kennedy, January 1999 (has links)
No description available.
|
742 |
Formation of protonium and positronium in atomic collisions — Whitehead, Richard John, January 2001 (has links)
A minimum-norm method has been developed for solving the coupled integro-differential equations describing the scattering of positrons by one-electron targets, in which the rearrangement channels for positronium formation are explicitly included. The minimum-norm method, applied to this problem for the first time in this thesis, is an enhancement of a previously reported least-squares method, and has enabled the extension to a significantly larger basis consisting of up to 26 states on the direct centre, including pseudostates, and 3 states on the positronium. The method has been applied here to e+-H and e+-He+ scattering; cross sections have been produced for the latter over a range of energies up to 250 eV. The basis was found to be large enough to produce smooth cross sections, and little evidence of pseudoresonance structure was found. The results are the first converged cross sections to be calculated for e+-He+ scattering using the coupled-channel approximation. Results for e+-H scattering compare well with the work of other authors. A highly efficient parallel code was developed for solving the largest coupling cases. The results prove the minimum-norm approach to be an accurate and reliable method for large-scale coupled-channel calculations involving rearrangement collisions. Also in this thesis, the capture of slow antiprotons by atomic hydrogen and positronium has been simulated by the Classical Trajectory Monte Carlo (CTMC) method. Statistically accurate cross sections for protonium and antihydrogen formation have been obtained and the energy dependence of the process established. Antihydrogen formation from antiproton collisions with positronium in the presence of a laser has also been simulated with the CTMC method, and the effects of laser polarisation, frequency and intensity studied. Enhancements of the antihydrogen formation cross section were observed, and it is suggested that more sophisticated calculations should be undertaken.
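The minimum-norm idea can be illustrated on a toy problem (a sketch only, unrelated to the thesis's own code): for an underdetermined linear system, of all vectors satisfying the equations, pick the one with the smallest norm. NumPy's `lstsq` returns exactly this solution.

```python
import numpy as np

# Toy underdetermined linear system A x = b (more unknowns than
# equations), standing in for a discretised set of coupled equations.
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 5))
b = rng.normal(size=3)

# np.linalg.lstsq returns the minimum-norm solution for an
# underdetermined system: of all x with A x = b, the one minimising ||x||.
x, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.allclose(A @ x, b))      # the system is satisfied
print(np.linalg.norm(x))          # smallest norm among all solutions
```

The same solution is obtained via the Moore-Penrose pseudoinverse, `np.linalg.pinv(A) @ b`, which is one way to see why the solution is unique.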
|
743 |
Simulation d'évènements rares par Monte Carlo dans les réseaux hautement fiables / Rare event simulation using Monte Carlo in highly reliable networks — Saggadi, Samira, 08 July 2013 (has links)
Network reliability evaluation is, in general, an NP-hard problem. In telecommunications, for example, one may want to evaluate the probability that a selected group of nodes can communicate. In this case, a set of disconnected nodes can have critical consequences, whether financial or in terms of safety. A precise estimation of the reliability is therefore needed. In this work, we are interested in the study and computation of the reliability of highly reliable networks. In this case the unreliability is very small, which makes the standard Monte Carlo approach impractical, because it requires a very large number of iterations. To estimate network reliability accurately at minimum cost, we have developed new simulation techniques based on variance reduction by importance sampling.
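The importance-sampling idea can be sketched on a classic toy case (the five-edge bridge network; the network and every number here are illustrative, not from the thesis). Edge failures are sampled from an inflated failure probability so that disconnections actually occur, and each observed disconnection is weighted by the likelihood ratio of the true distribution to the sampling distribution:

```python
import random

# Bridge network: nodes 0 (source) .. 3 (target); each edge fails
# independently with probability q. Crude Monte Carlo almost never
# sees a disconnection when q is tiny.
EDGES = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
q = 1e-3          # true (small) edge failure probability
q_is = 0.3        # inflated failure probability used for sampling

def connected(up):
    """True if node 0 can reach node 3 using the working edges."""
    adj = {n: [] for n in range(4)}
    for (a, b), ok in zip(EDGES, up):
        if ok:
            adj[a].append(b)
            adj[b].append(a)
    seen, stack = {0}, [0]
    while stack:
        for m in adj[stack.pop()]:
            if m not in seen:
                seen.add(m)
                stack.append(m)
    return 3 in seen

def is_estimate(n, rng):
    """Importance-sampling estimate of the unreliability P(disconnected)."""
    total = 0.0
    for _ in range(n):
        states = [rng.random() >= q_is for _ in EDGES]  # True = edge up
        if not connected(states):
            # Likelihood ratio: true density / sampling density.
            w = 1.0
            for up in states:
                w *= (1 - q) / (1 - q_is) if up else q / q_is
            total += w
    return total / n

rng = random.Random(1)
est = is_estimate(200_000, rng)
print(est)
```

For this network the exact unreliability is dominated by the two 2-edge cuts, about 2q² ≈ 2e-6; the weighted estimator recovers it with modest sample sizes, whereas crude sampling at q = 1e-3 would need many millions of replications.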
|
744 |
Simulační model vývoje penzijního připojištění / The Simulation Model of the Development of Pension Insurance — Zárubová, Radka, January 2011 (has links)
First, this thesis introduces the system of pension insurance with state contribution, including its proposed amendment of 2009. Its aim is to forecast and analyse the expected development of pension insurance with state contribution. The main part of the thesis focuses on a simulation model of this insurance product. Within this model, the annual interest credited on contributions is randomly generated, and the amount of money a client of a hypothetical pension fund would receive is calculated. To facilitate this simulation, I programmed and attached (as part of the thesis) an application in the VBA language which runs the simulation for a preset number of replications. The thesis gives four examples of simulation experiments: a simulation of pension insurance and a simulation of pension saving, each both with and without contributions made by the client's employer. A comparison of the expected efficiency of the two systems, from the points of view of the government and of a client, is drawn at the end of the thesis.
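The replication loop described can be sketched as follows (in Python rather than the thesis's VBA; every parameter value here, such as contribution amounts, the interest-rate distribution, and the horizon, is illustrative only):

```python
import random

def simulate_final_balance(years=30, monthly=500, state=150,
                           mean_rate=0.03, sd_rate=0.02, rng=None):
    """One replication: accumulate monthly client contributions plus a
    state contribution, crediting a randomly drawn interest rate each
    year. All parameter values are hypothetical, not the thesis's."""
    rng = rng or random.Random()
    balance = 0.0
    for _ in range(years):
        balance += 12 * (monthly + state)
        rate = max(0.0, rng.gauss(mean_rate, sd_rate))  # annual interest
        balance *= 1 + rate
    return balance

# Monte Carlo over a preset number of replications, as in the thesis.
rng = random.Random(42)
results = [simulate_final_balance(rng=rng) for _ in range(10_000)]
mean = sum(results) / len(results)
print(round(mean))
```

Averaging over replications gives the expected final balance; the spread of `results` shows the interest-rate risk the client bears.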
|
745 |
Exploring appropriate offset values for pencil beam and Monte Carlo dose optimization in lung stereotactic body radiotherapy encompassing the effects of respiration and tumor location — Unknown Date (has links)
Evaluation of dose optimization using the Pencil Beam (PB) and Monte Carlo (MC) algorithms may allow physicists to apply dosimetric offsets that account for inaccuracies of the PB algorithm in lung cancer treatment with Stereotactic Body Radiotherapy (SBRT). Twenty cases of Non-Small Cell Lung Cancer (NSCLC) were selected. Treatment plans were created with Brainlab iPlanDose® 4.1.2. The D97 of the Planning Target Volume (PTV) was normalized to 50 Gy on the Average Intensity Projection (AIP) using the fast PB and compared with MC. The same plan, with the same beam Monitor Units (MUs), was then recalculated over each respiratory phase. The results show that the PB algorithm overestimates the dose by 2.3-2.4% less at the maximum exhalation phase than at the maximum inhalation phase when compared with MC. A significantly smaller dose difference between PB and MC is also shown in plans for peripheral lesions (7.7 ± 0.7%) than for central lesions (12.7 ± 0.8%) (p < 0.01). / Includes bibliography. / Thesis (M.S.)--Florida Atlantic University, 2014. / FAU Electronic Theses and Dissertations Collection
|
746 |
Phantom Study Incorporating A Diode Array Into The Treatment Planning System For Patient-Specific Quality Assurance — Unknown Date (has links)
The purpose of this research is to accurately match the calculation environment, i.e. the treatment planning system (TPS) with the measurement environment (using a 2-D diode array) for lung Stereotactic Body Radiation Therapy (SBRT) patient-specific quality assurance (QA). Furthermore, a new phantom was studied in which the 2-D array and heterogeneities were incorporated into the patient-specific QA process for lung SBRT.
Dual source dual energy computerized tomography (DSCT) and single energy computerized tomography (SECT) were used to model phantoms incorporating a 2-D diode array into the TPS. A water-equivalent and a heterogeneous phantom (simulating the thoracic region of a patient) were studied. Monte Carlo and pencil beam dose distributions were compared to the measured distributions. Composite and individual fields were analyzed for normally incident and planned gantry angle deliveries. The distributions were compared using γ-analysis for criteria 3% 3mm, 2% 2mm, and 1% 1mm.
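A much-simplified 1-D version of the γ-analysis used here can be sketched as follows (global normalisation and a brute-force search over points; real QA software works on 2-D or 3-D dose grids with interpolation, so this is only an illustration of the metric):

```python
import numpy as np

def gamma_pass_rate(ref, meas, positions, dose_tol=0.03, dist_tol=3.0):
    """Simplified 1-D global gamma analysis: for each reference point,
    take the minimum gamma over all measured points and report the
    fraction of points with gamma <= 1. dose_tol is a fraction of the
    maximum reference dose; dist_tol is in the units of positions (mm)."""
    d_norm = dose_tol * ref.max()
    gammas = []
    for p, d in zip(positions, ref):
        dd = (meas - d) / d_norm            # dose-difference term
        dx = (positions - p) / dist_tol     # distance-to-agreement term
        gammas.append(np.sqrt(dd ** 2 + dx ** 2).min())
    return (np.array(gammas) <= 1).mean()

x = np.linspace(0, 100, 101)                # positions in mm
ref = 50 * np.exp(-(((x - 50) / 20) ** 2))  # reference dose profile (Gy)
print(gamma_pass_rate(ref, ref * 1.01, x))  # 1% error: passes at 3%/3mm
print(gamma_pass_rate(ref, ref * 1.10, x))  # 10% error: failures appear
```

Tightening `dose_tol` and `dist_tol` (e.g. to 1%/1mm, as in the study) makes the test progressively more sensitive to calculation-versus-measurement disagreement.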
The Monte Carlo calculations for the DSCT-modeled phantoms (incorporating the array) showed an increase in the passing percentage magnitude for 46.4% of the fields at 3% 3mm, 85.7% at 2% 2mm, and 92.9% at 1% 1mm. The Monte Carlo calculations showed no such agreement for the same γ-analysis criteria when the SECT was used.
Pencil beam calculations resulted in lower passing percentages when the diode array was incorporated in the TPS. The DSCT-modeled phantoms (incorporating the array) exhibited a decrease in the passing percentage magnitude for 85.7% of the fields at 3% 3mm, 82.1% at 2% 2mm, and 71.4% at 1% 1mm. In SECT-modeled phantoms (incorporating the array), a decrease in passing percentage magnitude was found for 92.9% of the fields at 3% 3mm, 89.3% at 2% 2mm, and 82.1% at 1% 1mm.
In conclusion, this study demonstrates that including the diode array in the TPS results in increased passing percentages when using a DSCT system with a Monte Carlo algorithm for patient-specific lung SBRT QA. Furthermore, as recommended by task groups (e.g. TG 65, TG 101, TG 244) of the American Association of Physicists in Medicine (AAPM), pencil beam algorithms should be avoided in the presence of heterogeneous materials, including a diode array. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2016. / FAU Electronic Theses and Dissertations Collection
|
747 |
Comparação de modelos com censura intervalar em análise de sobrevivência / Comparison of interval-censored models in survival analysis — Strapasson, Elizabeth, 20 April 2007 (has links)
Interval censoring arises when survival times are not known exactly, only that they fall within an interval. Grouped survival data are a particular case of interval censoring in which all individuals are evaluated at the same time intervals, leading to a large number of ties. A common procedure for analysing this type of data is to ignore its interval-censored nature: the random variable time is treated as continuous, the event is assumed to have occurred at the beginning, midpoint, or end of the interval, and a standard survival analysis method is then applied. In this study, Monte Carlo simulations under the Weibull model were performed to compare these three procedures with a newly proposed method that combines them, choosing which value to use from the histogram of time versus the frequency of each interval. Analysing the data as genuinely interval-censored was also considered. The results show that treating the data exactly as interval-censored is the correct approach; however, when the failure rate increases, the midpoint may be used. The discrete nature of the failure times must be recognised when there is a large number of ties. Regression methods for grouped data are presented by Lawless (2003) and Collett (2003), whose structure is specified in terms of the probability that an individual fails in an interval, conditional on having survived the previous interval. The models considered in the literature are the Cox proportional hazards model and the logistic model. In this work, the Weibull model is proposed as an alternative to the Cox model for fitting interval-censored survival data in the context of discrete models. Likelihood ratio and score test statistics for discriminating between these two models were constructed through simulation. Two applications to agronomic data illustrate the simulations.
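The grouping and midpoint-imputation mechanism can be sketched as follows (a toy illustration with arbitrary Weibull parameters and interval width, not the thesis's simulation design):

```python
import math, random

rng = random.Random(0)
shape, scale = 1.5, 10.0     # Weibull parameters (illustrative)
width = 2.0                  # inspection-interval width

n = 50_000
# Weibull(shape, scale) draws via inverse-CDF sampling.
times = [scale * (-math.log(1.0 - rng.random())) ** (1.0 / shape)
         for _ in range(n)]

# Grouped data: only the interval [k*width, (k+1)*width) containing each
# failure is observed; midpoint imputation replaces t by the interval centre.
midpoints = [(math.floor(t / width) + 0.5) * width for t in times]

true_mean = scale * math.gamma(1 + 1 / shape)   # E[T] for a Weibull
mc_mean = sum(times) / n
mid_mean = sum(midpoints) / n
print(true_mean, mc_mean, mid_mean)
```

For a summary like the mean, midpoint imputation stays close to the exact data when the intervals are narrow relative to the time scale; the thesis's point is that for likelihood-based inference the interval-censored treatment is the correct one.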
|
748 |
Avaliação de risco em operações de pouso de aeronaves em pistas paralelas utilizando procedimentos e técnicas CSPA. / Risk assessment in aircraft landing operations in parallel runways using CSPA procedures and techniques. — Matsuyama, Rafael Tsuji, 13 June 2011 (has links)
Historically, air traffic control systems have incorporated levels of automation into airspace control activities in order to meet the growing demand for air services and to improve the safety of flight procedures. With the significant growth expected in the coming years, driven by increasing numbers of flights and users, the traditional options of expanding the air traffic network and/or building new airports have become economically burdensome, making it necessary to adopt alternatives such as techniques and procedures for landings on parallel runways, which exploit part of the existing airport infrastructure without requiring enormous financial investment. To assess the feasibility of simultaneous landings on parallel runways, one of the important factors to analyse is the risk of collision between aircraft during these procedures. In this context, this research proposes an extension of Ogata's safety assessment model for landing procedures on parallel runways; the original model measures the level of risk associated only with conventional landing operations on parallel runways. The extended model also allows other, distinct landing scenarios to be simulated, making it possible both to compare the techniques and procedures used in parallel-runway landing operations and to assess the associated level of risk. Like the original, the extended safety model uses the Monte Carlo method, in which a large number of simulated scenarios of possible landings on parallel runways is evaluated. With the results obtained, the impact of the distance between the runways on the safety of parallel landings is analysed.
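A drastically simplified sketch of this kind of Monte Carlo collision-risk estimate follows (this is not Ogata's model; the Gaussian lateral-deviation assumption and every number are illustrative only):

```python
import random

def risk(spacing_m, n=200_000, sigma=50.0, sep_min=60.0, rng=None):
    """Monte Carlo estimate of the probability that two aircraft on
    parallel approaches come laterally closer than sep_min metres.
    Gaussian lateral deviations with std sigma; all values illustrative."""
    rng = rng or random.Random(0)
    hits = 0
    for _ in range(n):
        y1 = rng.gauss(0.0, sigma)              # deviation, runway 1
        y2 = spacing_m + rng.gauss(0.0, sigma)  # deviation, runway 2
        if abs(y2 - y1) < sep_min:
            hits += 1
    return hits / n

# Risk falls sharply as the distance between runways grows.
for d in (300, 500, 760):                       # runway spacings in metres
    print(d, risk(d))
```

Sweeping `spacing_m` reproduces, in miniature, the thesis's analysis of how runway separation drives the safety of simultaneous parallel landings.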
|
749 |
Avaliação de dados nucleares para dosimetria de nêutrons / Evaluation of nuclear data for neutron dosimetry — Tardelli, Tiago Cardoso, 01 November 2013 (has links)
Absorbed doses and effective doses can be calculated using radiation transport computer codes. The quality of these calculations depends on the nuclear data used; however, information about the dose differences caused by different libraries is scarce. The objective of this study is to compare the absorbed and effective dose values obtained with different nuclear data libraries for an external neutron source in the energy range from 10⁻¹¹ to 20 MeV. The nuclear data libraries are JENDL-4.0, JEFF-3.1.1 and ENDF/B-VII.0. Dose calculations were carried out with the MCNPX code using the anthropomorphic ICRP 110 model. The differences in the absorbed dose values between the JEFF-3.1.1 and ENDF/B-VII.0 libraries are small, around 1%, but the results obtained with JENDL-4.0 show differences of up to 85% compared with ENDF/B-VII.0 and JEFF-3.1.1. Differences in the effective dose values are around 1.5% between ENDF/B-VII.0 and JEFF-3.1.1, and 11% between ENDF/B-VII.0 and JENDL-4.0.
|
750 |
Confiabilidade estrutural de pontes laminadas protendidas de madeira / Structural reliability of stress laminated timber bridges — Lindquist, Malton, 11 December 2006 (has links)
The concept of transversely prestressed (stress-laminated) timber decks was first used in Canada in the 1970s and has since been widely adopted in a growing number of countries. In Brazil, this system was first used in the bridge over the Monjolinho river, in the metropolitan region of São Carlos, São Paulo state. The importance of this structural system requires better knowledge of its structural safety. The aim of this work was therefore to study the structural reliability of stress-laminated timber bridges, with special focus on the transverse bending strength of the structure. The bridges were designed by three methods: Ritter, Eurocode and OTB. The first two are well known in the literature; the last is based on the load effects obtained with OTB, an orthotropic plate analysis program. Reliability indices were obtained with the FORM method, and the Monte Carlo method was used to simulate the characteristic strength values given by the formulas of the Brazilian code NBR 7190:1997 and the German code DIN 68364. The results indicate that the structural system is reliable for the failure mode studied.
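The Monte Carlo simulation of characteristic values can be sketched as follows (a toy version using a plain empirical 5% quantile rather than the NBR 7190:1997 or DIN 68364 formulas, with a purely illustrative lognormal strength model):

```python
import random, statistics

# Timber strength modelled as lognormal (illustrative parameters,
# not the thesis's data); characteristic value = 5th percentile.
rng = random.Random(7)
mu, sigma = 3.4, 0.15            # log-space parameters

def sample_characteristic(n):
    """Estimate the 5%-quantile characteristic strength from a sample
    of n simulated specimens, as a simple empirical quantile."""
    strengths = sorted(rng.lognormvariate(mu, sigma) for _ in range(n))
    return strengths[int(0.05 * n)]

# Monte Carlo over many simulated 'test campaigns': small samples
# scatter around the true quantile, which is why the code formulas
# for characteristic values matter for the resulting reliability index.
estimates = [sample_characteristic(30) for _ in range(2_000)]
print(statistics.mean(estimates), statistics.stdev(estimates))
```

The scatter of `estimates` feeds directly into the uncertainty of the design resistance, which is what the FORM analysis then converts into a reliability index.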
|