101 |
Reconstrução 3D de imagens em tomografia por emissão de pósitrons com Câmaras de Cintilação / 3D Image Reconstruction in Positron Emission Tomography with Scintillation Cameras. Fabio Henrique Palladino, 08 December 2004 (has links)
Positron Emission Tomography (PET) is establishing itself as one of the preferred methods for diagnosing and following numerous diseases in Oncology, Neurology and Cardiology. Two types of systems are available for this imaging modality: dedicated systems and those based on gamma camera (scintillation camera) technology, which can also be used for single-photon emission tomography (SPECT). In this work, we assessed a number of factors affecting the quantitation of gamma camera based PET imaging, which is characterized by a lower sensitivity compared to that of dedicated systems. We evaluated image quantitation conditions under the 2D and 3D acquisition modes, for images obtained with various 2D and 3D reconstruction methods and their associated corrections. Acquisition data were simulated by the Monte Carlo method using realistic parameters, and several objects of interest were modelled. We reconstructed slices and volumes using FBP, ART, MLEM and OSEM, and included four corrections: detector sensitivity, detector normalization, scatter and attenuation of annihilation photons. We proposed a method to assess detectability and object contrast recovery based on two measurable parameters that capture the aspects most relevant to quantitation; visual analysis was also considered. We found that the 3D mode is more effective than 2D for low contrast recovery when the selected corrections are applied. Detectability of small structures is limited by partial volume effects and by the finite spatial resolution of the detection systems. ART, MLEM and, especially, OSEM with 8 subsets are the most adequate methods for quantitative studies in 3D mode. The parameters that we have defined may also be used as indicators of suitable conditions for quantitation in images.
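As a concrete illustration of the iterative reconstruction methods named in this abstract, the following minimal sketch shows the MLEM update and an ordered-subsets (OSEM) variant on a toy problem. It is not taken from the thesis: the tiny random system matrix, pixel counts and iteration numbers are illustrative assumptions only.

```python
# Illustrative sketch (not from the thesis): the MLEM update used in emission
# tomography, plus an OSEM variant that cycles over subsets of the projections.
# The tiny random system matrix below is purely for demonstration.
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_bins = 16, 64                      # image pixels, projection bins
A = rng.random((n_bins, n_pix))             # toy system (projection) matrix
x_true = rng.uniform(1.0, 5.0, n_pix)       # toy activity distribution
y = rng.poisson(A @ x_true)                 # simulated (noisy) projection data

def mlem(A, y, n_iter=50):
    x = np.ones(A.shape[1])                 # flat initial estimate
    sens = A.sum(axis=0)                    # sensitivity image (column sums)
    for _ in range(n_iter):
        ratio = y / np.clip(A @ x, 1e-12, None)
        x *= (A.T @ ratio) / sens           # multiplicative MLEM update
    return x

def osem(A, y, n_subsets=8, n_iter=6):
    x = np.ones(A.shape[1])
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_iter):
        for s in subsets:                   # one MLEM-like update per subset
            As, ys = A[s], y[s]
            ratio = ys / np.clip(As @ x, 1e-12, None)
            x *= (As.T @ ratio) / As.sum(axis=0)
    return x

print("MLEM reconstruction error:", np.linalg.norm(mlem(A, y) - x_true))
print("OSEM reconstruction error:", np.linalg.norm(osem(A, y) - x_true))
```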
|
102 |
Dinâmica e genealogia de modelos de evolução / Dynamics and genealogy of evolution models. Milton Taidi Sonoda, 21 February 2001 (has links)
In this work we investigate, through numerical simulations, the evolution of the genetic composition of a population, giving emphasis to the dynamic process termed Muller's ratchet, which is responsible for the degradation of the population due to the accumulation of deleterious mutations in finite populations. We also consider the genealogy of the individuals evolving in a population under the effect of Muller's ratchet. In addition, we investigate analytically the deterministic limit of the model, in which the population size is infinite and the ratchet process does not act. The replication landscape, i.e., the function that maps the genetic load of an individual onto its probability of reproduction, used in this work is a generalization of that originally considered by Muller to illustrate the process of the ratchet. In particular, we add to that landscape an epistasis parameter that models the interactions among the sites of the sequences of the individuals. The tuning of this parameter determines three different types of epistasis: (i) synergistic, where mutations become more deleterious with the number of mutations already present; (ii) diminishing, where the deleterious effect of a new mutation is attenuated; and (iii) multiplicative, where new mutations cause identical damage, independently of the number of previous mutations.
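The following is a minimal sketch, not the author's code, of a Wright-Fisher-type simulation of Muller's ratchet. The fitness function w(k) = exp(-alpha * k^beta), with the exponent beta acting as the epistasis parameter (beta > 1 synergistic, beta < 1 diminishing, beta = 1 multiplicative), is one common parameterization and is assumed here rather than taken from the thesis.

```python
# Minimal sketch (assumed parameterization, not the thesis code): a Wright-Fisher
# simulation of Muller's ratchet. Fitness w(k) = exp(-alpha * k**beta) encodes
# epistasis: beta > 1 synergistic, beta < 1 diminishing, beta = 1 multiplicative.
import numpy as np

def ratchet(N=500, U=0.5, alpha=0.02, beta=1.0, generations=2000, seed=1):
    rng = np.random.default_rng(seed)
    k = np.zeros(N, dtype=int)              # deleterious mutation count per individual
    least_loaded = []
    for _ in range(generations):
        w = np.exp(-alpha * k.astype(float) ** beta)    # replication probabilities
        parents = rng.choice(N, size=N, p=w / w.sum())  # selection (Wright-Fisher sampling)
        k = k[parents] + rng.poisson(U, size=N)         # inheritance plus new mutations
        least_loaded.append(k.min())        # loss of the best class = a click of the ratchet
    return np.array(least_loaded)

best = ratchet(beta=1.0)
print("clicks of the ratchet over the run:", best[-1] - best[0])
```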
|
103 |
Molecular dynamics study of the allosteric control mechanisms of the glycolytic pathway. Naithani, Ankita, January 2015 (has links)
There is growing interest in understanding the regulation of allosteric proteins. Allostery is a phenomenon of protein regulation whereby binding of an effector molecule at a remote site affects binding and activity at the protein's active site. Over the years, these sites have become popular drug targets as they provide advantages in terms of selectivity and saturability. Both experimental and computational methods are being used to study and identify allosteric sites. Although experimental methods provide us with detailed structures and have been relatively successful in identifying these sites, they are subject to time and cost limitations. In the present dissertation, Molecular Dynamics Simulations (MDS) and Principal Component Analysis (PCA) have been employed to enhance our understanding of allostery and protein dynamics. MD simulations generated trajectories which were then qualitatively assessed using PCA. Both of these techniques were applied to two important trypanosomatid drug targets and controlling enzymes of the glycolytic pathway: pyruvate kinase (PYK) and phosphofructokinase (PFK). Molecular Dynamics simulations were first carried out on both the effector-bound and unbound forms of the proteins. This provided a framework for direct comparison and inspection of the conformational changes at the atomic level. Following the MD simulations, PCA was run to further analyse the motions. The principal components thus captured are in quantitative agreement with previously published experimental data, which increased our confidence in the reliability of our simulations. Moreover, the binding of FBP affects the allosteric mechanism of PYK in an interesting way: inspection of the vibrational modes reveals patterns in the movement of the subunits which differ from the conventional symmetrical pattern. The lowering of B-factors on effector binding also provides evidence that the effector is not only locking the R-state but is also acting as a general heat sink to cool down the whole tetramer. This observation suggests that protein rigidity and intrinsic heat capacity are important factors in stabilizing allosteric proteins. Thus, this work provides new and promising insights into the classical Monod-Wyman-Changeux model of allostery.
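As background to the trajectory analysis described above, the sketch below shows how a principal component analysis of Cartesian fluctuations is typically carried out with plain NumPy. It assumes an already aligned trajectory and uses synthetic coordinates in place of the PYK/PFK simulations, so it illustrates the general technique rather than the author's workflow.

```python
# Generic sketch of PCA on an MD trajectory (not the author's scripts). Assumes the
# trajectory has already been fitted/aligned and is stored as an array of shape
# (n_frames, n_atoms, 3); synthetic data stands in for a real trajectory here.
import numpy as np

rng = np.random.default_rng(0)
n_frames, n_atoms = 1000, 50
traj = rng.normal(size=(n_frames, n_atoms, 3))          # placeholder coordinates

X = traj.reshape(n_frames, -1)                          # flatten to (frames, 3N)
X -= X.mean(axis=0)                                     # remove the mean structure
cov = (X.T @ X) / (n_frames - 1)                        # 3N x 3N covariance of fluctuations
evals, evecs = np.linalg.eigh(cov)                      # eigenmodes of the motion
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]

projections = X @ evecs[:, :2]                          # motion along the first two PCs
explained = evals[:2].sum() / evals.sum()
print(f"first two principal components explain {explained:.1%} of the variance")
```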
|
104 |
Cosmological simulations with AGN feedback. Taylor, Philip, January 2015 (has links)
We implement a model for, and study the effects of, AGN feedback in cosmological hydrodynamical simulations. In our model, black holes form from high-density, primordial gas, to imitate the likely channels of black hole formation in the early Universe. We find that a black hole seed mass of 10²⁻³ h⁻¹ M⊙ is required to produce simulations that match the cosmic star formation rate density, and the present-day black hole mass - velocity dispersion and galaxy size - velocity dispersion relations. We therefore suggest that Population III stars can be the progenitors of the super-massive black holes seen today. Using our fiducial model, we run two large simulations ((25 h⁻¹ Mpc)³), one with and one without AGN feedback. With these, we follow the population of galaxies that forms across cosmic time, and find that the inclusion of AGN feedback improves the agreement of simulated and observed galaxy properties, such as the mass and luminosity functions. This agreement is best at z = 0, and fairly good out to z = 2-3. Evidence for downsizing in the evolution of galaxies is found, both in the present-day colour-magnitude and [α/Fe]-velocity dispersion relations, and by the fact that high-mass galaxies attain their present-day metallicity earlier and faster than do low-mass ones. With our hydrodynamical simulations, we can also investigate the internal structure of galaxies, and look at the effects of galaxy mergers and AGN feedback on the stellar and gas-phase metallicity gradients of galaxies. Stellar metallicity gradients are found to be sensitive to galaxy mergers, while gas-phase metallicity gradients are more affected by AGN activity. This suggests that simultaneous measurements of these two quantities can help disentangle the actions of mergers and AGN feedback on a galaxy's history. Finally, we develop a new method to identify massive AGN-driven outflows from the most massive simulated galaxy. These events cause the intra-cluster medium to be hotter and more chemically enriched compared to the simulation without AGN feedback, and therefore AGN feedback may be required in order to attain the metallicities observed in clusters.
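As background to the claim that Population III remnants can seed today's super-massive black holes, the short calculation below works out the standard Eddington-limited growth time for a 10³ M⊙ seed. The radiative efficiency of 0.1 and the target mass of 10⁹ M⊙ are assumed illustrative values, not numbers taken from these simulations. The resulting growth time of roughly 0.7 Gyr is shorter than the ~0.9 Gyr age of the Universe at z ≈ 6, which is part of the usual plausibility argument for such seeds.

```python
# Background arithmetic (standard Eddington-limited growth, not taken from these
# simulations): how long a ~1e3 Msun seed needs to reach ~1e9 Msun if it accretes
# at the Eddington rate with radiative efficiency eps = 0.1 (an assumed value).
import numpy as np

G, m_p, c, sigma_T = 6.674e-11, 1.673e-27, 2.998e8, 6.652e-29   # SI units
eps = 0.1                                                        # radiative efficiency
t_efold = (eps / (1.0 - eps)) * sigma_T * c / (4.0 * np.pi * G * m_p)  # seconds
t_efold_myr = t_efold / 3.156e13                                 # convert to Myr

M_seed, M_final = 1e3, 1e9                                       # solar masses
n_efolds = np.log(M_final / M_seed)
print(f"e-folding (Salpeter) time ~ {t_efold_myr:.0f} Myr")
print(f"growth time ~ {n_efolds * t_efold_myr / 1e3:.2f} Gyr for {n_efolds:.1f} e-folds")
```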
|
105 |
The modification of a computer simulation for use in the professional training of South African secondary school teachers with specific reference to the probationary year. Marsh, Cecille Joan Anna, January 1989 (has links)
The topic of this thesis arose out of a desire to meet the need for a practical means of supplementing the preparation of Higher Diploma of Education (H.D.E.) students for their future role as first-year teachers. It was established that this need was not adequately filled by conventional university teacher-training methods. The literature on the computerised simulation of role-playing and teaching activities was reviewed, and it indicated that such simulations had been relatively successful. A published American computer simulation, TENURE, in which the student plays the role of a first-year teacher, was selected for modification to meet the needs of South African students. This program is implemented in the TUTOR computer language and runs on the Control Data South Africa PLATO system. In order to determine the needs of South African students, two groups of Rhodes University students worked through the simulation as it was being modified. The modifications were adapted according to the students' responses to a questionnaire. The simulation has been tested by 72 H.D.E. students and several educationists, and the response has been positive.
|
106 |
Univariate parametric and nonparametric statistical quality control techniques with estimated process parameters. Human, Schalk William, 17 October 2009 (has links)
Chapter 1 gives a brief introduction to statistical quality control (SQC) and provides background information regarding the research conducted in this thesis.

We begin Chapter 2 with the design of Shewhart-type Phase I S², S and R control charts for the situation where the mean and the variance are both unknown and are estimated on the basis of m independent rational subgroups, each of size n, available from a normally distributed process. The derivations recognize that in Phase I (with unknown parameters) the signaling events are dependent and that more than one comparison is made against the same estimated limits simultaneously; this leads to working with the joint distribution of a set of dependent random variables. Using intensive computer simulations, tables are provided with the charting constants for each chart for a given false alarm probability. Second, an overview of the literature on Phase I parametric control charts for univariate variables data is given, assuming that the form of the underlying continuous distribution is known. The overview presents the current state of the art and the challenges that still remain. It is pointed out that, because the Phase I signaling events are dependent and multiple signaling events have to be dealt with simultaneously (in making an in-control or not-in-control decision), the joint distribution of the charting statistics needs to be used, and the recommendation is to control the probability of at least one false alarm while setting up the charts.

In Chapter 3 we derive and evaluate expressions for the run-length distributions of the Phase II Shewhart-type p-chart and the Phase II Shewhart-type c-chart when the parameters are estimated. We then examine the effect of estimating p and c on the performance of the p-chart and the c-chart via their run-length distributions and associated characteristics such as the average run-length, the false alarm rate and the probability of a “no-signal”. An exact approach based on the binomial and the Poisson distributions is used to derive expressions for the Phase II run-length distributions and the related Phase II characteristics using expectation by conditioning (see e.g. Chakraborti (2000)). We first obtain the characteristics of the run-length distributions conditioned on the point estimates from Phase I and then find the unconditional characteristics by averaging over the distributions of the point estimators. Both the in-control and the out-of-control properties of the charts are examined. The results are used to discuss the appropriateness of the widely followed empirical rules for choosing the size of the Phase I sample used to estimate the unknown parameters; this includes the number of reference samples m and the sample size n.

Chapter 4 focuses on distribution-free control charts and considers a new class of nonparametric charts with runs-type signaling rules (i.e. runs of the charting statistics above and below the control limits), both for the scenario where the percentile of interest of the distribution is known and for that where it is unknown. In the former situation (Case K) the charts are based on the sign test statistic and enhance the sign chart proposed by Amin et al. (1995); in the latter scenario (Case U) the charts are based on the two-sample median test statistic and improve on the precedence charts of Chakraborti et al. (2004). A Markov chain approach (see e.g. Fu and Lou (2003)) is used to derive the run-length distributions, the average run-lengths, the standard deviations of the run-lengths, etc., for our runs-rule-enhanced charts. In some cases, we also draw on results for the geometric distribution of order k (see e.g. Chapter 2 of Balakrishnan and Koutras (2002)) to obtain closed-form and explicit expressions for the run-length distributions and/or their associated performance characteristics. Tables are provided for implementation of the charts, and examples are given to illustrate their application and usefulness. The in-control and the out-of-control performance of the charts are studied and compared to existing nonparametric charts using criteria such as the average run-length, the standard deviation of the run-length, the false alarm rate and some percentiles of the run-length, including the median run-length. It is shown that the proposed “runs rules enhanced” sign charts offer more practically desirable in-control average run-lengths and false alarm rates and perform better for some distributions.

Chapter 5 wraps up this thesis with a summary of the research carried out and offers concluding remarks concerning unanswered questions and/or future research opportunities. / Thesis (PhD)--University of Pretoria, 2009. / Mathematics and Applied Mathematics / unrestricted
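In the spirit of the Chapter 3 analysis, the sketch below estimates, by brute-force averaging over Phase I estimates, the unconditional in-control average run-length of a Shewhart c-chart whose parameter is estimated from m reference samples. The 3-sigma limits, the value c = 20 and m = 25 are illustrative assumptions; the thesis works with exact distributions rather than this Monte Carlo shortcut.

```python
# Illustrative sketch (assumptions, not the thesis derivations): effect of
# estimating the Poisson parameter c from m Phase I samples on the in-control
# Phase II average run-length (ARL) of a Shewhart c-chart with 3-sigma limits.
import numpy as np
from scipy import stats

def signal_prob(c_limits_from, c_true):
    """P(signal) for one in-control Poisson(c_true) count, with 3-sigma limits
    computed from the (possibly estimated) value c_limits_from."""
    lcl = max(c_limits_from - 3.0 * np.sqrt(c_limits_from), 0.0)
    ucl = c_limits_from + 3.0 * np.sqrt(c_limits_from)
    return (stats.poisson.cdf(np.ceil(lcl) - 1, c_true)
            + stats.poisson.sf(np.floor(ucl), c_true))

def unconditional_arl(c_true=20.0, m=25, reps=5000, seed=2):
    rng = np.random.default_rng(seed)
    c_hat = rng.poisson(c_true, size=(reps, m)).mean(axis=1)      # Phase I estimates
    cond_arl = [1.0 / max(signal_prob(c, c_true), 1e-12) for c in c_hat]
    return np.mean(cond_arl)                  # average the conditional ARLs over Phase I

print("in-control ARL, c known    :", round(1.0 / signal_prob(20.0, 20.0), 1))
print("in-control ARL, c estimated:", round(unconditional_arl(), 1))
```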
|
107 |
Dynamic aspects of a wind/diesel system with flywheel energy storage. Coonick, Alun Howard, January 1991 (has links)
No description available.
|
108 |
Highly Driven Polymer Translocation in the Presence of External Constraints: Simulations and Theory. Sean-Fortin, David, January 2017 (has links)
DNA sequencing via nanopore translocation was a pipe dream two decades ago. Today, biotech companies are releasing commercial devices. Yet many challenges still surround the simple concept of threading a long DNA molecule through a nanoscopic pore with the aim of extracting the DNA’s sequence in the process.
In this thesis I use computer simulations to create what are in essence virtual prototypes for testing design ideas for the improvement of nanopore translocation devices. These ideas are based on the general concept of modifying the average shape of the initial DNA conformations. This is done, for example, by introducing new geometrical features to the nanopore’s surroundings or by means of some external force.
The goal of these simulations is not just to test design improvements, but also to systematically deconstruct the physical mechanisms involved in the translocation process. The roles of pore friction, initial polymer conformations, monomer crowding on the trans side of the membrane, Brownian fluctuations, and polymer rigidity can, with careful consideration, be essentially muted at will. Computer simulations in this sense play the role of a sandbox in which the physics can be tinkered with in order to assess and evaluate the magnitude of certain approximations found in theoretical modelling of translocation. This enables me to construct theoretical models that contain the necessary features pertaining to the different designs tested by simulations.
The work presented here thus comprises both Langevin Dynamics simulations and adaptations of the Tension-Propagation theory of polymer translocation to the polymer under the various test conditions.
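The toy below is not the thesis model: it runs overdamped Langevin dynamics for a one-dimensional harmonic bead-spring chain with a constant force on the first bead, simply to visualise how tension propagates along the backbone during driven translocation. Excluded volume, the membrane and the pore geometry are deliberately omitted, and the temperature is set low so the tension front is easy to see; all parameter values are illustrative assumptions.

```python
# Toy sketch (not the thesis model): overdamped Langevin dynamics of a 1D harmonic
# bead-spring chain with a constant driving force on the first bead, used only to
# visualise tension propagation along the backbone during driven translocation.
import numpy as np

rng = np.random.default_rng(0)
N, k_spring, gamma, kT = 100, 50.0, 1.0, 0.01   # beads, spring constant, friction, (low) temperature
f_drive, dt, steps = 20.0, 1e-4, 20000          # driving force, time step, number of steps

x = np.arange(N, dtype=float)                   # beads initially at the bond rest length
front = []
for step in range(steps):
    stretch = np.diff(x) - 1.0                  # bond extensions (rest length 1)
    force = np.zeros(N)
    force[:-1] += k_spring * stretch            # each bond pulls its two beads together
    force[1:] -= k_spring * stretch
    force[0] -= f_drive                         # constant pull on the "pore-side" end
    noise = rng.normal(0.0, np.sqrt(2.0 * kT * dt / gamma), N)
    x += dt * force / gamma + noise             # overdamped Euler-Maruyama step
    if step % 2000 == 0:
        moved = np.nonzero(np.arange(N) - x > 1.0)[0]   # beads dragged > 1 bond length
        front.append(int(moved.max()) if moved.size else 0)
print("tension front (bead index) vs time:", front)
```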
|
109 |
Simulated Power Study of ANCOVA vs. Repeated Measure Analyses for Two-Way Designs with one Repeated Measure. Lemay, Julien, January 2017 (has links)
Whether one should use an analysis of covariance (ANCOVA) or a form of difference-score test (difference as an outcome, or repeated measures) is not always clear. The literature on the topic focused for a while on Lord's paradox, which led to the conclusion that both analyses are equally valid when there is true random assignment. Yet the issue of which analysis is best has been little explored. In an attempt to create a unifying simulation framework that allows for comparable results when exploring various data-structure variations, I tackle five such manipulations: varying the effect size and the relationship between time points, violating the homogeneity-of-regression-slopes assumption, exploring the effect of large systematic baseline differences, examining the impact of data missing at random, and comparing the sample size requirements for a given test power. The programs provided, which allow tens of millions of simulations to be run in a reasonable time frame (within a day), also put to rest any ambiguity about the stability of the results. By analyzing the Type I error rate and statistical power, I establish that ANCOVA respects the nominal Type I error rate of alpha and has more power than the repeated measures analysis in most cases, but should be avoided when there is a baseline imbalance. Hence, in cases where ANCOVA is applicable, it is preferable to use it over other difference-score tests.
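A minimal sketch of the kind of comparison described, not the thesis programs: it simulates a randomized two-group pre/post design and estimates the power of ANCOVA versus a change-score t-test. The sample size, pre/post correlation and effect size are illustrative assumptions.

```python
# Minimal sketch (not the thesis programs): power of ANCOVA vs. a change-score
# analysis for a randomized two-group pre/post design with assumed parameters.
import numpy as np
from scipy import stats

def one_trial(rng, n=50, rho=0.6, effect=0.4):
    pre = rng.normal(0.0, 1.0, 2 * n)
    group = np.repeat([0, 1], n)                       # randomized assignment
    post = rho * pre + np.sqrt(1 - rho**2) * rng.normal(0.0, 1.0, 2 * n) + effect * group

    # ANCOVA: regress post on group + baseline, test the group coefficient
    X = np.column_stack([np.ones(2 * n), group, pre])
    beta, res, _, _ = np.linalg.lstsq(X, post, rcond=None)
    df = 2 * n - X.shape[1]
    sigma2 = res[0] / df
    se_b = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    p_ancova = 2 * stats.t.sf(abs(beta[1] / se_b), df)

    # change-score analysis: two-sample t-test on (post - pre)
    diff = post - pre
    p_change = stats.ttest_ind(diff[group == 1], diff[group == 0]).pvalue
    return p_ancova < 0.05, p_change < 0.05

rng = np.random.default_rng(3)
results = np.array([one_trial(rng) for _ in range(5000)])
print("power, ANCOVA      :", results[:, 0].mean())
print("power, change score:", results[:, 1].mean())
```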
|
110 |
Molecular dynamics simulations of amphiphilic macromolecules at interfaces. Nawaz, Selina, January 2013 (links)
The aim of this thesis is to investigate the structural and thermodynamic properties of biologically and technologically relevant macromolecules when placed at soft interfaces. In particular, two amphiphilic macromolecules characterized by different topologies have been investigated, namely amphiphilic dendrimers and linear block copolymers. This goal is achieved using a multiscale approach which includes all-atom, united-atom and coarse-grained models by means of molecular dynamics simulations.

Amphiphilic dendrimers have been shown to be promising building blocks for a range of interfacial materials and can be used in applications such as surface-based sensors or surface nanopatterning. In this part of the thesis, by means of all-atom molecular dynamics simulations, we investigated the structure and stability of alkyl-modified polyamidoamine (PAMAM) dendrimers at the air/water interface as a function of the number and the relative position of the modified end groups. We found that the PAMAM dendrimer with all terminal groups functionalized is more stable at the interface than the Janus dendrimer, where only half the amine groups are modified. These results indicate that monolayers of fully functionalized molecules could be as stable as (or more stable than) those self-assembled from Janus molecules.

The second part of the thesis is devoted to modelling a particular family of amphiphilic triblock copolymers sold as Pluronics, consisting of poly(ethylene oxide) (PEO) and poly(propylene oxide) (PPO) arranged as PEO–PPO–PEO. There is evidence that this class of amphiphilic materials can be used for different biological applications. A fuller understanding of the molecular mechanisms underpinning their interactions with living cells is essential for ensuring the polymers' safety and efficacy in biomedical applications. Using united-atom molecular dynamics simulations and membrane lysis assays, we investigated the relationship between the molecular conformations of a subset of the Pluronic copolymers (L31, L61, L62 and L64) and their haemolytic activity. Our computational studies suggest that the hydrophilic blocks in these copolymers interact with the polar head groups of lipid molecules, resulting in a predicted modification of the structure of the membranes. Parallel membrane lysis assays in human erythrocytes indicate differences in the rates of haemolysis, as a result of incubation with these polymers, which correlate well with the interactions predicted from the atomistic simulations. The computational data thus provide a putative mechanism to rationalize the available experimental data on membrane lysis by these copolymers. The data quantitatively agree with haemoglobin release endpoints measured when copolymers with the same molecular weight and structure as those modelled are incubated with erythrocytes. The data further suggest some new structure–function relationships at the nanoscale that are likely to be of importance in determining the biological activity of these otherwise inert copolymers.

In order to visualise the effect of Pluronics at a length and time scale closer to the experimental ones, in the third part of the thesis we developed a coarse-grained model for the amphiphilic copolymers within the framework of the MARTINI force field (Marrink et al., J. Phys. Chem. B, 2007, 111, 7812). The MARTINI force field is usually parameterized by targeting thermodynamic properties. In addition to this, we further parameterized it based on atomistic simulations, validating the parameters against structural properties of the copolymers. The ability of the model to predict several structural and thermodynamic properties of the atomistic system has been explored. The aim of this work is to be able to simulate the polymer/lipid interface at polymer concentrations similar to the experimental ones.
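As an illustration of validating a coarse-grained model against structural properties of the atomistic one, the sketch below computes a per-frame radius of gyration for two trajectories. The random placeholder coordinates, equal masses and array shapes are assumptions standing in for real united-atom and MARTINI trajectories, so this shows the general comparison step rather than the author's actual analysis.

```python
# Generic sketch (not the thesis workflow): comparing a structural observable, the
# radius of gyration, between an atomistic and a coarse-grained trajectory of the
# same copolymer. Random arrays of shape (n_frames, n_particles, 3) stand in for
# real trajectories; masses are taken as equal for simplicity.
import numpy as np

def radius_of_gyration(traj):
    """Per-frame Rg for coordinates of shape (n_frames, n_particles, 3)."""
    com = traj.mean(axis=1, keepdims=True)             # centre of mass (equal masses)
    return np.sqrt(((traj - com) ** 2).sum(axis=2).mean(axis=1))

rng = np.random.default_rng(0)
atomistic = rng.normal(scale=0.8, size=(500, 120, 3))  # placeholder all-atom trajectory (nm)
coarse = rng.normal(scale=0.8, size=(500, 30, 3))      # placeholder coarse-grained trajectory (nm)

rg_aa, rg_cg = radius_of_gyration(atomistic), radius_of_gyration(coarse)
print(f"<Rg> atomistic      = {rg_aa.mean():.2f} +/- {rg_aa.std():.2f} nm")
print(f"<Rg> coarse-grained = {rg_cg.mean():.2f} +/- {rg_cg.std():.2f} nm")
```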
|