61. Open porosity fission gas release model applied to nuclear fuels. Claisse, Antoine, January 2015.
Nitride fuels have gained renewed interest in recent years, both as a candidate for GEN IV reactors and as accident tolerant fuels for current light water reactors. They are, however, decades behind oxide fuels when it comes to qualification and the development of tools to assess their performance. In this thesis, such a tool is developed. The fuel performance code TRANSURANUS, which performs very well for oxide fuels, is extended to handle nitride fuels. The relevant thermo-mechanical properties are implemented and fuel-type-dependent modules are updated. Their limitations and discrepancies are discussed. Particular attention is given to athermal fission gas release, and a new model based on the open fabrication porosity is developed and added to the code as a starting point toward a mechanistic model. It works well for oxide fuels, but its efficiency is harder to evaluate for nitride fuels, due to large uncertainties in many key correlations, such as the thermal conductivity and the effective diffusion coefficient of gas atoms. Recommendations are made to solve the most important problems.
62. Direct Sampling of the Pulsed Eddy Current Response From Non-Ferromagnetic Metals. Holmér, Sara, January 2021.
The non-destructive testing method Pulsed Eddy Current (PEC) can be used to measure the thickness and resistivity of conducting, non-ferromagnetic materials. In this project I investigated whether a gauge at ABB, which uses PEC technology to measure the thickness of aluminium strips in rolling mills, can extract more information about the strip by sampling the direct signal at a higher resolution than before. Using the gauge with this new sampling method, I first compared the noise level of the integrated signal obtained with analog and with digital integration. I then used an upper noise-level limit to determine how thick the strips could be without the noise level becoming too high. Finally, I used simulations and measurements of clad strips to determine whether more information, such as the resistivity distribution, can be extracted from the measured signal, and whether the sampling frequency is high enough to detect it. I found that the noise level can be reduced by replacing the analog integrator with a low-pass filter and integrating the signal digitally. I also found that the thickness can range from 1.3 mm to 12 mm, whereas with the previous sampling method it ranges from 0.50 mm to 8.0 mm. Finally, I found that the sampling frequency is high enough and that it is possible to determine whether a strip has a clad layer. These results suggest that it is beneficial to switch to this new sampling method.
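The abstract does not include the thesis code; as a rough illustration of the idea it describes (replacing an analog integrator with a low-pass filter followed by digital integration), the sketch below filters a simulated PEC-style decay and integrates it numerically. The decay time constant, noise level and filter coefficient are all assumed values, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1e-3, 2000)           # assumed 1 ms acquisition window
signal = np.exp(-t / 2e-4)                 # idealised PEC-style decay (illustrative)
noisy = signal + 0.05 * rng.standard_normal(t.size)

# First-order low-pass filter applied digitally (stand-in for the analog integrator)
alpha = 0.05
filtered = np.empty_like(noisy)
filtered[0] = noisy[0]
for i in range(1, noisy.size):
    filtered[i] = alpha * noisy[i] + (1 - alpha) * filtered[i - 1]

# Digital integration of the filtered decay (simple Riemann sum)
integral = np.sum(filtered) * (t[1] - t[0])
```

The low-pass step suppresses the sample-to-sample noise before integration, which is the mechanism the abstract credits for the reduced noise level of the digitally integrated signal.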
63. Development of an intelligent trigger system based on deep neural networks. Rigon, Luca, January 2021.
No description available.
64. Machine Learning Model for Predicting the Repayment Rate of Loan Takers. Oskarsson, Emma, January 2021.
Machine Learning (ML) uses statistics to find patterns in high-dimensional data. The Swedish Board of Student Finance (CSN) wants to improve the way it classifies new loan takers. Applying ML to data from previous loan takers can establish patterns to use on new ones. The aim of this study is to investigate whether CSN can improve the way it classifies loan takers by their ability to pay back their loans. In this study, different ML models are applied to a data set from CSN, their performance is compared, and the factors most strongly related to an individual's repayment rate are investigated. A data set of 2,032,095 individuals was analysed and used in the different models. Using Random Forest (RF) for binary classification produced the best result, with a sensitivity of 0.9695 and a specificity of 0.8058.
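Sensitivity and specificity, the two metrics reported above, are computed from a confusion matrix. The sketch below uses made-up labels and predictions purely to show the definitions; it is not CSN data and not the RF model from the thesis.

```python
import numpy as np

# Hypothetical labels: 1 = repays on time, 0 = does not (illustrative only)
y_true = np.array([1, 1, 1, 0, 0, 1, 0, 1, 0, 1])
y_pred = np.array([1, 1, 0, 0, 0, 1, 1, 1, 0, 1])

# Confusion-matrix counts
tp = np.sum((y_true == 1) & (y_pred == 1))   # true positives
tn = np.sum((y_true == 0) & (y_pred == 0))   # true negatives
fp = np.sum((y_true == 0) & (y_pred == 1))   # false positives
fn = np.sum((y_true == 1) & (y_pred == 0))   # false negatives

sensitivity = tp / (tp + fn)   # true positive rate
specificity = tn / (tn + fp)   # true negative rate
```

Reporting both metrics, as the thesis does, matters for imbalanced data such as repayment records, where plain accuracy can look high even when one class is poorly predicted.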
65. Raytracing in Channel Model Development / Strålspårning i utveckling av kanalmodellering. Rogne, Andreas, January 2022.
The fifth generation of mobile internet is upon us, but there is still work to do before the new technology is fully utilized. The new generation of cellular network promises frequencies ranging from sub-6 GHz to 39 GHz, the latter being in the mmWave spectrum. At these frequencies, we can use geometrical optics to calculate radio wave propagation. The purpose of this work is to explore how raytracing can be used to predict wireless radio wave channels and path loss in indoor and urban environments. The model presented in this work covers line of sight, reflection, refraction, diffraction and scattering. It uses Friis' path loss model for the line-of-sight case. For reflection and refraction, Snell's laws of reflection and refraction were used. For diffraction, the uniform theory of diffraction was used, and the scattering explored in this work was modelled using a physics-based bidirectional reflectance diffusion function. With this, we create a basic raytracing program for simple environments, with potential for expansion in future work. The simple environment is a cube defined in an STL file. The algorithms for the different mechanisms were a hybrid shoot-and-bounce and image method for reflection and a double-counting method for refraction. The transitions between different shadow boundaries were smooth. While scattering was explored, more work is needed to implement it in the code.
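As a small illustration of two ingredients mentioned above, the sketch below implements Friis' free-space path loss for the line-of-sight case and specular reflection of a ray about a surface normal. The 10 m / 28 GHz example values are assumptions chosen to fall in the mmWave range discussed, not values from the thesis.

```python
import math

def friis_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss (Friis), in dB, for the line-of-sight case."""
    c = 299_792_458.0                 # speed of light, m/s
    wavelength = c / freq_hz
    return 20.0 * math.log10(4.0 * math.pi * distance_m / wavelength)

def reflect(d, n):
    """Specular reflection of direction d about unit surface normal n:
    r = d - 2 (d . n) n, as used by image/shoot-and-bounce methods."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

# Illustrative: 10 m line-of-sight link at 28 GHz (mmWave)
loss_28ghz = friis_path_loss_db(10.0, 28e9)
```

A ray hitting a horizontal surface from above, `reflect((1.0, -1.0, 0.0), (0.0, 1.0, 0.0))`, bounces up to `(1.0, 1.0, 0.0)`, which is the geometric step repeated at every wall hit in a shoot-and-bounce tracer.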
66. Contouring variability in radiosurgery - dosimetric and radiobiological implications. Sandström, Helena, January 2015.
The use of Stereotactic Radiation Therapy (SRT), employing one large fraction of radiation, as in stereotactic radiosurgery (SRS), or a few high-dose fractions, has increased continuously due to technical development and progress in dose delivery, complemented by positive clinical experience. The success of stereotactic radiation therapy depends on many clinical, dosimetric and radiobiological factors. For SRS in particular, the delivery of a highly conformal dose distribution to the target in one fraction, while at the same time sparing normal tissue and critical structures, is part of the basic concept of the technique. Provided that the highly accurate radiosurgical equipment available today is used, planning and delivering the prescribed dose distribution is an achievable goal, and the main issue to be solved is therefore the definition of the target. As the target volume in radiosurgery is usually defined without margins, the success of the stereotactic approach depends critically on accurate delineation of the target, which can be identified as a factor of key importance. The delineation of the Organs At Risk (OAR) is also critical. The purpose of this work was to evaluate the current degree of variability in target and OAR contouring and to establish methods for analysing multi-observer data on structure delineation variability. A multi-center target and OAR delineation study was initiated. Two complex and six common cases to be treated with SRS were selected and distributed to centers around the world performing Gamma Knife® radiosurgery for delineation and treatment planning. The resulting treatment plans and the corresponding delineated structures were collected and analysed. The results showed very high variability in contouring for four complex radiosurgery targets.
Similar results, indicating high variability in delineating the OAR and in reporting the doses delivered to them, were also found. For the common radiosurgery targets, however, higher agreement in the delineation was observed, although lower than expected. The quality of radiosurgery treatment planning is usually assessed with respect to target coverage, planning specificity, and the dose to sensitive structures and organs close to the target. However, physical dose conformity to the target does not guarantee the success of the treatment. Plan quality should also be assessed with respect to the clinical outcome, expressed as the probability of controlling the target to be irradiated. In this respect, this study also aimed to create a framework for assessing the impact of inaccurate target delineation on the predicted treatment outcome for radiosurgery targets known for their high potential to invade neighbouring normal tissue, using radiobiological models. In addition, radiobiological models have been used to determine the tumour control probability accounting for oxygenation in stereotactic radiation therapy targets. The results suggest that radiobiological modelling has the potential to add to current knowledge in SRS by theoretically assessing the key factors that might influence treatment outcome.
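The tumour control probability mentioned above is commonly modelled with Poisson statistics on top of linear-quadratic cell survival. The sketch below shows that standard form with illustrative, assumed parameter values; the thesis abstract does not specify which model variant or parameters were used.

```python
import math

def tcp_poisson(n_clonogens: float, dose_gy: float,
                alpha: float, beta: float) -> float:
    """Poisson tumour control probability for a single fraction with
    linear-quadratic cell survival: TCP = exp(-N * exp(-a*D - b*D^2)).
    All parameter values used here are illustrative assumptions."""
    surviving = n_clonogens * math.exp(-alpha * dose_gy - beta * dose_gy ** 2)
    return math.exp(-surviving)

# Assumed values: 1e7 clonogens, alpha = 0.2 /Gy, beta = 0.02 /Gy^2
tcp_low = tcp_poisson(1e7, 20.0, 0.2, 0.02)    # 20 Gy single fraction
tcp_high = tcp_poisson(1e7, 30.0, 0.2, 0.02)   # 30 Gy single fraction
```

The steep rise of TCP with dose in this model is one reason contouring errors matter: a target region left outside the prescription isodose keeps a large surviving clonogen count and drags the whole-target control probability down.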
67. Scanning electron microscopy (SEM) analysis of tribofilms enhanced by fullerene-like nanoparticles. Jenei, István Zoltán, January 2012.
The beneficial effects of WS2 inorganic fullerene-like nanoparticles in the lubrication industry have been demonstrated in recent years. The incorporation of the nanoparticles into lubricants (oils, greases) is, however, not straightforward. When two surfaces slide against each other and a lubricant is used, a thin layer (tribofilm) is formed on the contact area, which affects the friction process. Lubricants usually contain several additives, and these additives can impair the friction-reducing behaviour of the WS2 nanoparticles. This thesis investigates the effects of several additives in the lubrication process by analysing the tribofilms formed on the worn surfaces using energy-dispersive X-ray spectroscopy in a scanning electron microscope.
68. Configurations in Quantum Information. Blanchfield, Kate, January 2012.
Measurements play a central role in quantum information. This thesis looks at two types: contextual measurements and symmetric measurements. Contextuality originates from the Kochen-Specker theorem about hidden variable models and has recently undergone a subtle shift in its manifestation. Symmetric measurements are characterised by the regular polytopes they form in Bloch space (the vector space containing all density matrices) and are the subject of several investigations into their existence in all dimensions. We often describe measurements by the vectors in Hilbert space onto which our operators project. In this sense, both contextual and symmetric measurements are connected to special sets of vectors. These vectors are often special for another reason: they form configurations in a given incidence geometry. In this thesis, we aim to show various connections between configurations and measurements in quantum information. The configurations discussed here would have been well known to 19th and 20th century geometers, and we show they are relevant for advances in quantum theory today. Specifically, the Hesse and Reye configurations provide proofs of measurement contextuality, both in its original form and in its newer guise. The Hesse configuration also ties together different types of symmetric measurements in dimension 3, called SICs and MUBs, while giving insights into the group theoretical properties of higher dimensional symmetric measurements.
69. Nuclear data uncertainty propagation for a lead-cooled fast reactor: Combining TMC with criticality benchmarks for improved accuracy. Alhassan, Erwin, January 2014.
For the successful deployment of advanced nuclear systems and for the optimization of current reactor designs, high-quality and accurate nuclear data are required. Before nuclear data can be used in applications, they are first evaluated, benchmarked against integral experiments and then converted into formats usable by applications. In the past, the evaluation process was usually carried out using differential experimental data, complemented with nuclear model calculations. This trend is changing fast because of the increase in computational power and tremendous improvements in nuclear reaction theories over the last decade. Since these model codes are not perfect, they are usually validated against a large set of experimental data. However, since these experiments are themselves not exact, the quantities calculated by model codes, such as cross sections and angular distributions, contain uncertainties, a major source being the input parameters to the model codes. Since nuclear data are used as input to reactor transport codes, the output of transport codes ultimately contains uncertainties due to these data. Quantifying these uncertainties is therefore important for reactor safety assessment and for deciding where additional effort should be spent to reduce them further. Until recently, these uncertainties were mostly propagated using generalized perturbation theory. With the increase in computational power, however, more exact methods based on Monte Carlo are now possible. At the Nuclear Research and Consultancy Group (NRG), Petten, the Netherlands, a new method called 'Total Monte Carlo' (TMC) has been developed for nuclear data evaluation and uncertainty propagation. An advantage of this approach is that it eliminates the use of covariances and the assumption of linearity used in the perturbation approach.
In this work, we have applied the TMC methodology to assess the impact of nuclear data uncertainties on reactor macroscopic parameters of the European Lead Cooled Training Reactor (ELECTRA), which has been proposed within the GEN-IV initiative in Sweden. As part of the work, the uncertainties of the plutonium isotopes and americium in the fuel, of the lead isotopes in the coolant, and of some important structural materials have been investigated at beginning of life. For the actinides, large uncertainties in the k-eff were observed due to Pu-238, Pu-239 and Pu-240 nuclear data, while for the lead coolant, the uncertainty in the k-eff was large for all lead isotopes except Pb-204, with a significant contribution from Pb-208. The dominant contributions to the uncertainty in the k-eff came from uncertainties in the resonance parameters of Pb-208. Furthermore, before the final product of an evaluation is released, evaluated data are tested against a large set of integral benchmark experiments. Since these benchmarks differ in geometry, type, material composition and neutron spectrum, their selection for specific applications is normally tedious and not straightforward. As a further objective of this thesis, methodologies for benchmark selection based on the TMC method have been developed. This method has also been applied to nuclear data uncertainty reduction using integral benchmarks. From the results obtained, it was observed that by including criticality benchmark information using a binary accept/reject method, a 40% and a 20% reduction in the nuclear data uncertainty in the k-eff was achieved for Pu-239 and Pu-240, respectively, for ELECTRA.
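The TMC idea described above can be caricatured in a few lines: draw many random realisations of a nuclear data parameter, propagate each through a k-eff calculation, and take the spread as the nuclear data uncertainty; the binary accept/reject step then keeps only the realisations consistent with a criticality benchmark. The k-eff function, spreads and tolerance below are all invented stand-ins; the real TMC uses full random evaluated data files and a transport code.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for a transport calculation: k-eff as a linear response
# to one perturbed cross-section scale factor (illustrative only).
def keff(xs_scale: float) -> float:
    return 1.0 + 0.05 * (xs_scale - 1.0)

# TMC step: sample many random nuclear-data realisations and propagate each
samples = rng.normal(loc=1.0, scale=0.1, size=1000)   # assumed 10% parameter spread
keffs = np.array([keff(s) for s in samples])
unc_before = keffs.std()          # nuclear data uncertainty in k-eff

# Binary accept/reject against a criticality benchmark: keep only the
# random realisations whose calculated k-eff agrees with the benchmark
benchmark, tolerance = 1.0, 0.003  # assumed benchmark value and acceptance band
accepted = keffs[np.abs(keffs - benchmark) < tolerance]
unc_after = accepted.std()         # reduced uncertainty after rejection
```

Because rejected realisations are exactly those far from the benchmark, the accepted subset always has a smaller spread, which is the mechanism behind the 40% and 20% uncertainty reductions reported for Pu-239 and Pu-240.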
70. Deriving Physical Laws with Neural Networks. Fusté Costa, Max, January 2023.
Using neural networks to derive physical laws without any pre-existing bias is a promising but relatively new field, with the long-term goal of constructing an artificial intelligence physicist able to derive physical laws from experimental data. In this project, a step is taken toward solving complex problems by tackling the double pendulum, the simplest chaotic system. To do so, a neural network architecture adapted from previous work is used to find the relevant parameters of the system in multiple configurations of the pendulum. Afterwards, the possibility of a neural-network-derived general solution of the problem is discussed in terms of the aspects that increase its complexity.