91

O limite termodinâmico do modelo de Axelrod unidimensional de duas características e dois estados / The thermodynamic limit of the one-dimensional Axelrod model with two features and two states

Biral, Elias José Portes 04 March 2016 (has links)
In the mid-1990s, the political scientist Robert Axelrod proposed an agent-based model of cultural dissemination, in which agents interact locally according to the principles of homophily and social influence, aiming to answer the question: "If people interact with each other and, through the interaction, become more similar, why are there cultural differences in our society?". Each agent is an element of an array (a site in a lattice) and is modeled by a list of F cultural features, each of which assumes one of q possible states. This list is the cultural identity of the agent. The initial cultural identity of each agent is set at random, with equal probability for the q^F different identities. At each time step, we pick an agent at random (the target agent) as well as one of its neighbors. These two agents interact with probability equal to their cultural similarity, defined as the fraction of identical features in their cultural lists. The more similar they are, the greater the likelihood of their interaction - this is the homophily principle. An interaction consists of selecting at random one of the distinct features and making the state of the selected feature of the target agent equal to the corresponding state of its neighbor - this is the social influence principle. This procedure is repeated until the system freezes into an absorbing configuration.
We observe monocultural absorbing configurations, in which all agents have the same cultural identity, and multicultural configurations, in which there are different cultural domains in the lattice. In the one-dimensional model with F = q = 2, however, Monte Carlo simulations show convergence to the monocultural configurations in about 30% of the choices of the initial conditions, while the exact analytical results indicate that the monocultural convergence should always happen. In this thesis, we study the one-dimensional Axelrod model for F = q = 2 using Monte Carlo simulations. We show that the discrepancy between the simulations and the exact results is due to the non-commutation of the thermodynamic limit, in which the chain size goes to infinity, and the time-asymptotic limit, in which the simulation time goes to infinity. Our results offer a better understanding of the one-dimensional Axelrod model and promote the importance of the agreement between theory and simulations in science.
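The update rule described in this abstract is simple enough to sketch in a few lines of Python. The following is a minimal illustration of the general (F, q) dynamics on a one-dimensional lattice, not the author's code:

```python
import random

def axelrod_1d(n_sites, F=2, q=2, max_steps=1_000_000, seed=None):
    """One-dimensional Axelrod model: returns the configuration after the
    dynamics freezes into an absorbing state (or the step budget runs out)."""
    rng = random.Random(seed)
    culture = [[rng.randrange(q) for _ in range(F)] for _ in range(n_sites)]

    def has_active_bond():
        # A bond is active if neighbors share some, but not all, features.
        for i in range(n_sites - 1):
            shared = sum(a == b for a, b in zip(culture[i], culture[i + 1]))
            if 0 < shared < F:
                return True
        return False

    for _ in range(max_steps):
        if not has_active_bond():
            break
        i = rng.randrange(n_sites)            # target agent
        j = i + rng.choice((-1, 1))           # one of its neighbors
        if not 0 <= j < n_sites:
            continue
        shared = sum(a == b for a, b in zip(culture[i], culture[j]))
        # Homophily: interact with probability equal to the similarity.
        if 0 < shared < F and rng.random() < shared / F:
            # Social influence: copy one randomly chosen differing feature.
            k = rng.choice([f for f in range(F) if culture[i][f] != culture[j][f]])
            culture[i][k] = culture[j][k]
    return culture
```

Running many realizations from random initial conditions and recording how often the final configuration is monocultural yields the kind of convergence statistics discussed in the abstract.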
92

Self-assembly of two-dimensional convex and nonconvex colloidal platelets

Pakalidou, Nikoletta January 2017 (has links)
One of the most promising routes to create advanced materials is self-assembly. Self-assembly refers to the self-organisation of building blocks to form ordered structures. As the properties of the self-assembled materials will inherit the properties of the basic building blocks, it is then possible to engineer the properties of the materials by tailoring the properties of the building blocks. In order to create mesoscale materials, the self-assembly of molecular building blocks of different sizes and interactions is important. Mesoscopic materials can be obtained by using larger building blocks such as nano and colloidal particles. Colloidal particles are particularly attractive as building blocks because it is possible to design interparticle interactions by controlling both the chemistry of the particles' surface and the properties of the solvent in which the particles are immersed. The self-assembly of spherical colloidal particles has been widely reported in the literature. However, advances in experimental techniques to produce particles with different shapes and sizes have opened new opportunities to create more complex structures that cannot be formed using spherical particles. Indeed, the particles' shape and effective interactions between them dictate the spatial arrangement and micro-structure of the system, which can be engineered to produce functional materials for a wide range of applications. The driving forces determining the self-assembly of colloidal particles can be modified by the use of external influences such as geometrical confinement and electromagnetic forces. Geometrical confinement, for example, has been used to design quasi two-dimensional materials such as multi-layered structures of spheres, dimers, rods, spherical caps, and monolayers of platelets with various geometries and symmetries. 
In this dissertation, we present three computer simulation studies, using Monte Carlo and Molecular Dynamics simulations, of the self-assembly of monolayers of colloidal platelets with different shapes confined in two dimensions. These particles were selected due to recent experiments on colloidal particles with similar shapes. All the particle models are represented by planar polygons, and three different effects on their self-assembly have been analysed: (a) the curvature of the particles' vertices; (b) the curvature of the particles' edges; and finally (c) the addition of functional groups on the particles' surface. These studies aim to demonstrate that subtle changes in the particles' shape can be used to engineer complex patterns for the fabrication of advanced materials. Monte Carlo simulations are performed to study the self-assembly of colloidal platelets with rounded corners with 4-, 5-, and 6-fold symmetries. Square platelets provide a rich phase behaviour that ranges between disorder-order and order-order phase transitions. Surprisingly, the disk-like shape of the pentagons and hexagons prevents the total crystallisation of these systems, even at high pressures. A hysteresis gap is observed in the analysis of compression and expansion runs for the square platelets, and the thermodynamic method known as the direct coexistence method is used to accurately determine the point of the order-order transition. Further, unexpected results are obtained by performing Molecular Dynamics simulations of platelets with 3-, 4-, 5-, and 6-fold symmetries when all the sides of each polygon are curved. Macroscopic chiral symmetry breaking is observed for platelets with 4- and 6-fold symmetries, and for the first time a rule is proposed to explain when these chiral structures can be formed, driven only by packing effects.
This rule is also verified for platelets with the same curved sides when functional chains are tethered to either the vertices or the sides. Indeed, square platelets with curved sides confined in two dimensions can form chiral structures at medium densities when flexible chains are tethered to either the vertices or the sides. Triangular platelets with curved sides can form chiral structures only when the chains are tethered to the corners, since the chains undergo a one-handed rotation to sterically protect one side. When the chains are symmetrically tethered to the sides, local chiral symmetry breaking is observed, as both the left-hand and right-hand sides of each vertex are sterically protected, giving equal probability of rotation in the clockwise or anticlockwise direction.
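In hard-particle Monte Carlo studies of this kind, the elementary acceptance test for a trial move is an overlap check between convex polygons. A common way to implement it is the separating-axis theorem; the sketch below is an illustration of that test, not the code used in the thesis:

```python
def polygons_overlap(p, q):
    """Separating-axis overlap test for two convex polygons, given as lists
    of (x, y) vertices in order. In hard-platelet Monte Carlo, a trial move
    is rejected when the moved particle overlaps a neighbor."""
    def edge_normals(poly):
        n = len(poly)
        for i in range(n):
            x1, y1 = poly[i]
            x2, y2 = poly[(i + 1) % n]
            yield (y1 - y2, x2 - x1)    # normal to the edge (unnormalized)
    for ax, ay in list(edge_normals(p)) + list(edge_normals(q)):
        proj_p = [ax * x + ay * y for x, y in p]
        proj_q = [ax * x + ay * y for x, y in q]
        if max(proj_p) < min(proj_q) or max(proj_q) < min(proj_p):
            return False                 # a separating axis exists
    return True                          # no separating axis: they overlap
```

For convex shapes it suffices to test the normals of every edge of both polygons; if no such axis separates the projections, the polygons intersect.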
94

Improving the simulation of IaaS Clouds / Amélioration de simulation de cloud IaaS via l’emploi de méthodes stochastiques

Bertot, Luke 17 June 2019 (has links)
The ability to provision resources on the fly and their pay-as-you-go nature have made cloud computing platforms a staple of modern computer infrastructure. Such platforms allow for new scheduling strategies for the execution of computing workloads. Finding a strategy that satisfies a user's cost and time constraints is a difficult problem that requires a prediction tool.
However, the inherent variability of these platforms makes building such a tool a complex endeavor. Our thesis is that, by producing probability distributions of possible outcomes, stochastic simulation can be used to produce predictions that account for this variability. To demonstrate this, we use Monte Carlo methods to produce a stochastic simulation by repeatedly running deterministic simulations. We show that this method, used in conjunction with specific input models, can model the variability of a platform using a single parameter. To validate our method, we compare our results to real executions of scientific workloads. Our experiments show that our method produces predictions capable of representing the observed real executions.
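The Monte Carlo approach described here, wrapping a deterministic simulator in repeated runs with randomized inputs, can be sketched as follows. The stand-in cost model and all parameter names are hypothetical placeholders, not those of the thesis:

```python
import random
import statistics

def deterministic_sim(n_tasks, task_time, slowdown):
    # Stand-in deterministic simulation: makespan of n_tasks sequential
    # tasks on a platform slowed down by a factor `slowdown`.
    return n_tasks * task_time * slowdown

def stochastic_prediction(n_runs, variability, seed=None):
    """Repeat the deterministic simulation with the platform slowdown drawn
    from an input distribution governed by a single variability parameter,
    yielding a distribution of predicted makespans."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_runs):
        # Slowdown of at least 1.0: the platform never runs faster than nominal.
        slowdown = max(1.0, rng.gauss(1.0, variability))
        samples.append(deterministic_sim(100, 2.0, slowdown))
    return samples

predictions = stochastic_prediction(1000, variability=0.1, seed=42)
mean_makespan = statistics.mean(predictions)
```

The resulting sample is a probability distribution of outcomes rather than a point estimate, which is what lets the prediction absorb platform variability through the single `variability` parameter.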
95

Optimisation of environmental gamma spectrometry using Monte Carlo methods

Hernández Suárez, Francisco Javier January 2002 (has links)
Dissertation in Environmental Physics to be publicly examined in Häggsalen (Ångström Laboratory), Uppsala University, on Friday, November 8, 2002 at 10:00 am for the degree of doctor of philosophy in Physics. The examination will be conducted in English.
Gamma spectrometry is one of the tools commonly used for the measurement of various environmental radionuclides. Simultaneous determination of the absolute activity of gamma-emitting radiotracers in a wide range of environmental matrices and fractions necessitates proper and accurate evaluation of the sample-to-detector efficiency. Several radiotracers require, in addition, the use of sub-routines for self-absorption corrections.
Gamma spectrometry is an important and elegant tool for assessing environmental changes. Optimisation of ultra-low-level gamma spectrometry for reliable assessment of such changes requires harmonisation of laboratory needs with sampling and site conditions.
Different aspects of the calculation of sample-to-detector efficiencies using empirical and Monte Carlo approaches are discussed here, including the uncertainties related to the simulation of the performance of different HPGe detectors and the effects of the incomplete collection of charges in Ge crystals. Various simulation codes for the computation of peak efficiencies in planar and well Ge detectors have been developed from scratch. The results of the simulations have been tested against experimental data and compared to other simulation results obtained with the Monte Carlo N-Particle code (MCNP). The construction of calibration sources with improved absorption and collimation characteristics has also been described in this work. These sources have been especially designed for the efficiency calibration of Ge detectors at energies below 100 keV.
Flexible, fully tested prototype approaches for the evaluation of self-absorption corrections, based on Monte Carlo simulations, are described. Special consideration is given to the problems related to the sample's variability in size, density and composition. Several examples of the absolute and simultaneous determination of environmental multitracers which benefited from self-absorption corrections and the optimised efficiency calibration algorithms are also presented and discussed. These examples include, among other things, a comprehensive analysis of the gamma spectrometry of ²³⁴Th in a wide range of matrices and the speciation of several radionuclides in sediments from a hard-water lake.
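As a toy illustration of Monte Carlo efficiency calculations of this kind, the purely geometric part of a sample-to-detector efficiency (the solid-angle fraction subtended by the detector face) can be estimated by sampling isotropic emission directions. This only covers the geometry; real codes, including those described above, also track photon interactions inside the crystal:

```python
import math
import random

def geometric_efficiency(n_photons, det_radius, source_dist, seed=None):
    """Monte Carlo sketch of the geometric part of a sample-to-detector
    efficiency: the fraction of isotropically emitted photons from an
    on-axis point source that hit the face of a disk detector of radius
    det_radius located at distance source_dist along the z-axis."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_photons):
        cos_t = rng.uniform(-1.0, 1.0)   # isotropic: cos(theta) uniform
        if cos_t <= 0.0:
            continue                      # emitted away from the detector
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        # Radial distance where the ray crosses the detector plane;
        # the azimuthal angle is irrelevant by axial symmetry.
        r = source_dist * sin_t / cos_t
        if r <= det_radius:
            hits += 1
    return hits / n_photons
```

For this on-axis geometry the estimate can be checked against the analytic solid-angle result, eff = (1 - d/sqrt(d² + R²))/2.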
96

Constrained crystallization and depletion in the polymer medium for transdermal drug delivery system

Zeng, Jianming 13 July 2004 (has links)
Transdermal drug delivery systems (TDS) are pharmaceutical devices designed to deliver specific drugs to the human body by diffusion through the skin. The effectiveness of a TDS suffers from crystallization in the patch when it is kept in storage for more than two years. It has been reported that there are two types of crystals in the patch, needle and aggregate, and that growth of drug crystals in a TDS generally occurs only in the middle third of the polymer layer. In our study, fluorescence microscopy, EDS (SEM) and Raman microspectroscopy were used to further characterize the crystals. The results show that the needle crystals most probably contain an estradiol and acrylic resin conjugate. The FTIR spectrum of the model sample confirmed the occurrence of a reaction between estradiol and the acrylic resin. Crystal growth in an unstressed matrix of a dissolved crystallizable drug component was simulated using a kinetic Monte Carlo model. Simulations using the Potts model with appropriate boundary conditions reproduce crystals in the middle of the matrix at higher temperatures. The bond fluctuation model is also implemented to study a representative dense TDS polymer matrix; this model can account for the effect of polymer chain size on crystal growth. The drug release profile from a TDS was also studied by simulating the diffusion of drug molecules using Monte Carlo techniques for different initial TDS microstructures. The release rate and profile of a TDS depend on the dissolution process of the crystals. At low storage temperature, the grains are evenly distributed throughout the thickness of the TDS patch, so the release rate and profile are similar to those of a randomly initiated system. Further work on stress-induced crystallization is currently under development. Although the study was carried out specifically for a drug in a polymer medium, the techniques developed in this investigation are in general applicable to any constrained crystallization in a polymer medium.
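A Monte Carlo release simulation of the kind mentioned above can be sketched as a lattice random walk across the patch thickness. Everything here (lattice size, step counts, boundary rules) is illustrative, not taken from the thesis:

```python
import random

def release_profile(n_molecules, thickness, n_steps, seed=None):
    """Monte Carlo sketch of drug release by diffusion: molecules perform a
    random walk on a 1D lattice across the patch thickness and are released
    on reaching the skin-side boundary (site 0); the far wall reflects."""
    rng = random.Random(seed)
    positions = [rng.randrange(1, thickness) for _ in range(n_molecules)]
    released = 0
    profile = []
    for _ in range(n_steps):
        remaining = []
        for x in positions:
            x += rng.choice((-1, 1))
            if x <= 0:
                released += 1                            # crossed into the skin
            else:
                remaining.append(min(x, thickness - 1))  # reflecting far wall
        positions = remaining
        profile.append(released / n_molecules)
    return profile
```

The cumulative released fraction over time is the release profile; different initial placements of the molecules stand in for the different initial microstructures discussed in the abstract.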
97

Nano-scale Phase Separation And Glass Forming Ability Of Iron-boron Based Metallic Glasses

Aykol, Muratahan 01 September 2008 (has links) (PDF)
This study sets out to establish a connection between glass forming ability (GFA) and the topology of Fe-B based metallic glasses, combining detailed investigations of spatial atomic arrangements, conducted via computer simulations, with experiments on high-GFA bulk metallic glasses. To construct a theoretical framework, the nano-scale phase separation encountered in metallic glasses is investigated for amorphous Fe80B20 and Fe83B17 alloys via Monte Carlo equilibration and reverse Monte Carlo simulation. The phenomenon is identified in terms of three topological aspects: (1) pure Fe clusters as large as ~0.9 nm and Fe contours of ~0.72 nm thickness; (2) Fe-rich, highly deformed body-centered cubic regions; and (3) B-centered prismatic units with polytetrahedral order forming distinct regions of high and low coordination. All topological aspects are compiled into a new model, called the Two-Dimensional Projection Model, for predicting contributions to short- and medium-range order (MRO) and the corresponding spacing relations. The outcome geometrically involves proportions approximating the golden ratio. After successfully producing soft magnetic Fe-Co-Nb-B-Si based bulk metallic glass and bulk nanocrystalline alloys by an entirely conventional route, the influences of alloying elements on the structural units and crystallization modes are identified using the developed model and radial distributions. While Co atoms substitute for Fe atoms, Nb and Si atoms deform the trigonal prismatic units to provide local compaction at the outset of MRO. Cu atoms alter the type of MRO, which then resembles that of the crystalline counterparts and of the accompanying nanocrystals that precipitate. The GFA can be described by a new parameter quantifying the MRO compaction, denoted Φ.
98

Reliability in performance-based regulation

Solver, Torbjörn January 2005 (has links)
In reregulated and restructured electricity markets, the production and retail of electricity are conducted on competitive markets; the transmission and distribution, on the other hand, can be considered natural monopolies. The financial regulation of Distribution System Operators (DSOs) has in many countries, partly as a consequence of the restructuring in ownership, gone through a major switch in regulatory policy: from regulatory regimes where the DSOs were allowed to charge their customers according to their actual cost plus some profit, i.e. cost-based regulation, to regulatory models in which the DSOs' performance is valued in order to set the allowable revenue, i.e. Performance-Based Regulation (PBR). In regulatory regimes that value performance, the direct link between cost and income is weakened or sometimes removed. This gives the regulated DSOs strong cost-cutting incentives, and there is consequently a risk of system reliability deterioration due to maintenance and investments postponed in order to save costs. To balance this risk, the PBR framework is normally complemented with some kind of quality regulation (QR). How both the PBR and QR frameworks are constructed determines the incentives that the DSO will act on, and will therefore influence the development of system reliability.
This thesis links the areas of distribution system reliability and performance-based regulation. First, the key incentive features within PBR that involve the quality of supply are identified using qualitative measures, including analyses of applied regulatory regimes and general regulatory policies. This results in a qualitative comparison of applied PBR models. Next, the qualitative results are quantified and analysed further using time-sequential Monte Carlo simulations (MCS). The MCS enable detailed analysis of regulatory features, parameter settings and financial risk assessments. In addition, the applied PBR frameworks can be quantitatively compared. Finally, some focus has been placed on the Swedish regulation and the tool developed for DSO regulation, the Network Performance Assessment Model (NPAM): what obstacles there might be, and what consequences it might bring when in effect.
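A time-sequential Monte Carlo simulation of the kind used here can be sketched for a single load point: sampling exponential times to failure and repair durations over many simulated years yields the distribution of yearly outage time, the basis for SAIDI-like indices and financial risk assessment. All parameters below are illustrative, not drawn from any applied regulation model:

```python
import random

def simulate_saidi(n_years, failure_rate, mean_repair_h, seed=None):
    """Time-sequential Monte Carlo sketch for a single load point:
    exponentially distributed times to failure (rate per year) and
    exponentially distributed repair durations (hours) yield the
    distribution of yearly customer outage time."""
    rng = random.Random(seed)
    yearly_outage_h = []
    for _ in range(n_years):
        t = 0.0       # simulated time within the year, in years
        outage = 0.0  # accumulated interruption duration, in hours
        while True:
            t += rng.expovariate(failure_rate)   # time to next failure
            if t >= 1.0:
                break                             # year is over
            outage += rng.expovariate(1.0 / mean_repair_h)
        yearly_outage_h.append(outage)
    return yearly_outage_h
```

Averaging over the simulated years approximates the expected yearly outage time (here roughly failure_rate times mean_repair_h), while the full distribution supports the tail-risk analysis that point estimates cannot provide.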
99

Protein Crystallization: Soft Matter and Chemical Physics Perspectives

Fusco, Diana January 2014 (has links)
X-ray and neutron crystallography are the predominant methods for obtaining atomic-scale information on biological macromolecules. Despite the success of these techniques, generating well-diffracting crystals critically limits going from protein to structure. In practice, the crystallization process proceeds through knowledge-informed empiricism. Better physico-chemical understanding remains elusive because of the large number of variables involved; hence little guidance is available to systematically identify solution conditions that promote crystallization.
The fields of structural biology and soft matter have independently sought out fundamental principles to rationalize protein crystallization. Yet the conceptual differences and limited overlap between the two disciplines may have prevented a comprehensive understanding of the phenomenon from emerging. Part of this dissertation focuses on computational studies of rubredoxin and human ubiquitin that bridge the two fields.
Using atomistic simulations, the protein crystal contacts are characterized, and patchy particle models are accordingly parameterized. Comparing the phase diagrams of these schematic models with experimental results enables a critical review of the assumptions behind the two approaches, and reveals insights about protein-protein interactions that can be leveraged to crystallize proteins more generally. In addition, exploration of the model parameter space provides a rationale for several experimental observations, such as the success and occasional failure of George and Wilson's proposal for protein crystallization conditions, and the competition between different crystal forms.
These simple physical models elucidate the connection between protein phase behavior and protein-protein interactions, which are, however, remarkably sensitive to the protein's chemical environment. To help determine relationships between physico-chemical protein properties and crystallization propensity, statistical models are trained on samples of 182 proteins supplied by the Northeast Structural Genomics consortium. Gaussian processes, which capture trends beyond the reach of linear statistical models, distinguish between two main physico-chemical mechanisms driving crystallization. One is characterized by low levels of side-chain entropy and has been extensively reported in the literature. The other identifies specific electrostatic interactions not previously described in the crystallization context. Because evidence for the two distinct mechanisms can be gleaned both from crystal contacts and from solution conditions leading to successful crystallization, the model offers future avenues for optimizing crystallization screens based on partial structural information. The availability of crystallization data coupled with structural outcomes, analyzed through state-of-the-art statistical models, may thus guide macromolecular crystallization toward a more rational basis.
To conclude, the behavior of water in protein crystals is specifically examined. Water is not only essential for the correct functioning and folding of proteins, but is also a key player in protein crystal assembly. Although water occupies up to 80% of the volume fraction of a protein crystal, its structure has so far received little attention and is often overly simplified in the structural refinement process. Merging information derived from molecular dynamics simulations with original structural information provides a way to better understand the behavior of water in crystals and to develop a method that enriches standard structural refinement.
