  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
101

Algorithmique parallèle du texte : du modèle systolique au modèle CGM

Garcia, Thierry 27 November 2003 (has links) (PDF)
We all share the intuition that a task can be completed in much less time if it is divided among several people or several machines. This is the notion of parallelism, which can be defined as the state of things that develop in the same direction or at the same time. It was natural to apply parallelism to computers, and doing so has made it possible to meet the computational power required by projects that are demanding in computing time and in memory. Combined with efficient algorithms, parallelism saves time and so answers substantial computational needs. It breaks with the classical approach, bounded by the laws of physics, of gaining speed by performing each individual operation faster. The notion of parallelism has thus contributed greatly to the multiplication of computational models.

We focus on the systolic model and on the coarse-grained parallel model known as the CGM (Coarse Grained Multicomputer). The CGM model was proposed by F. Dehne et al., and it has properties that make it very attractive from a practical point of view. It is well suited to modelling existing architectures in which the number of processors can reach several thousand and the data size several billion bytes. An algorithm developed for this model consists of local computations, using optimal sequential algorithms where possible, and of communication rounds whose number must be independent of the size of the input data. The CGM model is also attractive from an economic point of view: it is independent of actual architectures and allows efficient sequential algorithms to be reused, which makes it highly portable.

In this thesis we address problems of text algorithmics. These problems can improve data compression or be used in bioinformatics. We propose CGM solutions to the problems of finding the longest increasing subsequence, the longest common subsequence of two words, the longest repeated suffix ending at each character of a word, and repetitions. Our starting points are existing systolic solutions, which we adapt to the CGM model. The goal of this work is twofold. On the one hand, we give the first CGM solutions to these four problems. On the other hand, we show how systolic solutions can be turned into CGM algorithms. Many problems have been studied on systolic architectures, that is, dedicated machines that cannot be reused for other problems; the CGM model, by contrast, works with inexpensive machines that can be reused at will. Moreover, the experience gained in this work gives us a good idea of which systolic solutions can be adapted to the CGM model, which could help consolidate the bridge between fine-grained and coarse-grained models.

We close the thesis with a discussion of the load balancing of the proposed solutions and of how predictably other systolic solutions can be adapted to the CGM model.
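One of the four text problems treated here, the longest increasing subsequence, illustrates the division of labour a CGM algorithm relies on: each processor runs an optimal sequential kernel in its local computation phase. A minimal sketch of that kernel (patience sorting, O(n log n)); the function name is ours, and this is illustrative only, not the thesis's actual CGM algorithm:

```python
import bisect

def longest_increasing_subsequence_length(seq):
    """O(n log n) patience-sorting LIS: the kind of optimal sequential
    kernel a CGM algorithm reuses for its local computation phase."""
    # tails[k] = smallest possible tail of an increasing subsequence of length k+1
    tails = []
    for x in seq:
        i = bisect.bisect_left(tails, x)
        if i == len(tails):
            tails.append(x)   # x extends the longest subsequence found so far
        else:
            tails[i] = x      # x gives a smaller tail for length i+1
    return len(tails)
```

In a CGM setting, each of the p processors would run this on its n/p local elements and then combine partial results in a constant number of communication rounds.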
102

Modelling coarse-grained beach profile evolution

Jamal, Mohamad Hidayat January 2011 (has links)
Coarse-grained beaches are particularly prevalent in the UK, composed of accumulations of either gravel or mixed sand and gravel sediments. The aim of the work presented in this thesis is to improve capabilities for predicting coarse-grained beach 2D profile development. In particular, the effects of infiltration and sediment sorting are considered. In this study, the public domain numerical model XBeach (v12) is developed further. This model was initially developed for studying sandy environments, especially for the case of dune erosion. Here, the model is modified to enhance its capability to predict beach profile change on coarse-grained beaches. Improvements include: use of a Lagrangian interpretation of velocity in place of an Eulerian one for driving sediment movement; introduction of a new morphological module based upon Soulsby's sediment transport equation for waves and currents; incorporation of Packwood's infiltration approach in the unsaturated area of the swash region; and implementation of a multiple sediment fraction algorithm for sediment sorting of mixed sediments. These changes are suggested and justified in order to significantly improve the application of this model to gravel and mixed beaches, especially with regard to swash velocity asymmetry, which is responsible for the development of the steep accretionary berm above the waterline and for sediment sorting. A comparison between model simulation and large scale experiments is presented with particular regard to the tendency for onshore transport and profile steepening during calm conditions; offshore transport and profile flattening during storm conditions; and sediment sorting in the swash zone. Data used for this and for model calibration come from the Large Wave Channel (GWK) of the Coastal Research Centre (FZK) in Hannover, Germany. The results are found to agree well with the measured experimental data on gravel beach profile evolution. 
This is due to the inclusion of infiltration in the model, which weakens the backwash volume and velocity in a more satisfying manner than through the use of asymmetric swash friction and transport coefficients. The model also simulates sediment sorting of a mixed sediment beach. However, the profile comparisons were not satisfactory due to limitations of the numerical model, such as the constant permeability rate used throughout the simulation, and the non-conservation of sediment volume in the laboratory data by around 50%. From the simulation, it was found that the fine sediment moves offshore and the coarser sediment moves onshore. This is because infiltration weakens the backwash velocity: the coarser sediment moving onshore barely moves back offshore, while the fine sediment remains in motion. This pattern agrees with the pattern obtained from sediment sample analysis in the experiment and provides an explanation for the existence of composite beaches. The model is also shown to be capable of switching from accretionary to erosive conditions as the wave conditions become more storm-like. Again, the model simulations were in good agreement with the observations from the GWK dataset. Numerical model simulations on the effects of the tidal cycle on coarse-grained beach profile evolution were also carried out. This preliminary investigation showed that the model was able to predict the anticipated profile change associated with a coarse-grained beach under such wave and tidal forcing. Tidally forced accretion and erosion were compared with those predicted under similar beach sediments and wave conditions for constant water level. The main differences are that the affected area is wider and the berm is located on the upper beach during flood for both gravel and mixed beaches. 
Therefore, the model developed in this study can be seen to be a robust tool with which to investigate cross-shore beach profile change on coarse-grained beaches and sediment sorting on mixed beaches. Further work is also indicated.
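As a toy illustration of why infiltration matters (this is not the Packwood or XBeach formulation; the function name and numbers are hypothetical), a single-event mass balance shows how Darcy-type losses into an unsaturated coarse beach face shrink the backwash:

```python
def backwash_volume(uprush_volume, swash_area, hydraulic_conductivity, swash_duration):
    """Toy mass balance for one swash event: water infiltrating into the
    unsaturated beach face (Darcy-type loss, rate K over the wetted area)
    is unavailable to the backwash, which weakens offshore transport."""
    infiltrated = hydraulic_conductivity * swash_area * swash_duration
    return max(uprush_volume - infiltrated, 0.0)
```

With the high hydraulic conductivity typical of gravel, much of the uprush never returns as backwash, which is the mechanism behind the onshore transport and berm building described above; on sand, with low conductivity, the backwash stays close to the uprush volume.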
103

Pushing the boundaries : molecular dynamics simulations of complex biological membranes

Parton, Daniel L. January 2011 (has links)
A range of simulations have been conducted to investigate the behaviour of a diverse set of complex biological membrane systems. The processes of interest have required simulations over extended time and length scales, but without sacrifice of molecular detail. For this reason, the primary technique used has been coarse-grained molecular dynamics (CG MD) simulations, in which small groups of atoms are combined into lower-resolution CG particles. The increased computational efficiency of this technique has allowed simulations with time scales of microseconds, and length scales of hundreds of nm. The membrane-permeabilizing action of the antimicrobial peptide maculatin 1.1 was investigated. This short α-helical peptide is thought to kill bacteria by permeabilizing the plasma membrane, but the exact mechanism has not been confirmed. Multiscale (CG and atomistic) simulations show that maculatin can insert into membranes to form disordered, water-permeable aggregates, while CG simulations of large numbers of peptides resulted in substantial deformation of lipid vesicles. The simulations imply that both pore-forming and lytic mechanisms are available to maculatin 1.1, and that the predominance of either depends on conditions such as peptide concentration and membrane composition. A generalized study of membrane protein aggregation was conducted via CG simulations of lipid bilayers containing multiple copies of model transmembrane proteins: either α-helical bundles or β-barrels. By varying the lipid tail length and the membrane type (planar bilayer or spherical vesicle), the simulations display protein aggregation ranging from negligible to extensive; they show how this biologically important process is modulated by hydrophobic mismatch, membrane curvature, and the structural class or orientation of the protein. The association of influenza hemagglutinin (HA) with putative lipid rafts was investigated by simulating aggregates of HA in a domain-forming membrane. 
The CG MD study addressed an important limitation of model membrane experiments by investigating the influence of high local protein concentration on membrane phase behaviour. The simulations showed attenuated diffusion of unsaturated lipids within HA aggregates, leading to spontaneous accumulation of raft-type lipids (saturated lipids and cholesterol). A CG model of the entire influenza viral envelope was constructed in realistic dimensions, comprising the three types of viral envelope protein (HA, neuraminidase and M2) inserted into a large lipid vesicle. The study represents one of the largest near-atomistic simulations of a biological membrane to date. It shows how the high concentration of proteins found in the viral envelope can attenuate formation of lipid domains, which may help to explain why lipid rafts do not form on large scales in vivo.
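The CG MD technique used throughout this work rests on a simple mapping step: small groups of atoms (roughly four heavy atoms in MARTINI-style force fields) are replaced by a single bead at their centre of mass. The function below is an illustrative sketch of that mapping, not the tooling used in the thesis:

```python
def coarse_grain(positions, masses, groups):
    """Map atomistic coordinates to CG beads: each bead sits at the
    centre of mass of its assigned group of atom indices."""
    beads = []
    for group in groups:
        total_mass = sum(masses[i] for i in group)
        # centre of mass, component by component (x, y, z)
        com = tuple(
            sum(masses[i] * positions[i][d] for i in group) / total_mass
            for d in range(3)
        )
        beads.append(com)
    return beads
```

Reducing the particle count this way, together with the smoother CG energy landscape, is what makes microsecond, hundreds-of-nm simulations tractable.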
104

Um controle de versões refinado e flexível para artefatos de software / Flexible and fine-grained version control for software artifacts

Junqueira, Daniel Carnio 07 January 2008 (has links)
Version control has been considered essential to the maintenance of computer systems since the 1950s, when it was carried out by hand. The first version control tools, released in the 1970s, have not evolved significantly since their creation: version control is still generally performed on entire files, or even whole modules of software, using concepts launched more than three decades ago. With the popularization of computer systems came a marked increase both in the number of existing systems and in their complexity. Moreover, software development environments have changed considerably, and there is demand for systems that give developers increasingly automated control over what is being developed. In response to this demand, some approaches to fine-grained version control of software artifacts have been proposed, but they often do not provide the flexibility required by real software development environments. This work presents a system that aims to support flexible, fine-grained version control of software artifacts, based on a well-defined model for representing the structure of the files that make up a software project, whether program source code, documentation written in Latex, XML files, or others. The system was designed to be integrated with other solutions used in software development environments.
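To make the idea of fine-grained versioning concrete, here is a minimal sketch (all class and method names are ours, not the thesis's design) in which each structural element of a file, such as a function, section, or XML node, is versioned independently rather than the file as a whole:

```python
class FineGrainedStore:
    """Minimal sketch: each structural element (function, section, XML
    node) carries its own version history, instead of versioning files."""

    def __init__(self):
        self._history = {}  # node_id -> list of successive contents

    def commit(self, node_id, content):
        """Record a new version of one node; unchanged content is a no-op.
        Returns the node's current version number."""
        versions = self._history.setdefault(node_id, [])
        if not versions or versions[-1] != content:
            versions.append(content)
        return len(versions)

    def checkout(self, node_id, version=None):
        """Retrieve the latest (or a specific, 1-based) version of a node."""
        versions = self._history[node_id]
        return versions[-1] if version is None else versions[version - 1]
```

Because history is kept per node, editing one function advances only that node's version, and untouched elements of the same file keep their histories unchanged.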
105

Computer simulations exploring conformational preferences of short peptides and developing a bacterial chromosome model

Li, Shuxiang 15 December 2017 (has links)
Computer simulations provide a potentially powerful complement to conventional experimental techniques in elucidating the structures, dynamics and interactions of macromolecules. In this thesis, I present three applications of computer simulations to investigate important biomolecules with sizes ranging from two-residue peptides, to proteins, and to whole chromosome structures. First, I describe the results of 441 independent explicit-solvent molecular dynamics (MD) simulations of all possible two-residue peptides that contain the 20 standard amino acids with neutral and protonated histidine. 3JHNHα coupling constants and δHα chemical shifts calculated from the MD simulations correlated quite well with recently published experimental measurements for a corresponding set of two-residue peptides. Neighboring residue effects (NREs) on the average 3JHNHα and δHα values of adjacent residues were also reasonably well reproduced. The intrinsic conformational preferences of each residue, and their NREs on the conformational preferences of adjacent residues, were analyzed. Finally, these NREs were compared with corresponding effects observed in a coil library and the average β-turn preferences of all residue types were determined. Second, I compare the abilities of three derivatives of the Amber ff99SB force field to reproduce a recent report of 3JHNHα scalar coupling constants for hundreds of two-residue peptides. All-atom MD simulations of 256 two-residue peptides were performed and the results showed that a recently-developed force field (RSFF2) produced a dramatic improvement in the agreement with experimental 3JHNHα coupling constants. I further show that RSFF2 also modestly improved agreement with experimental 3JHNHα coupling constants of five model proteins. However, an analysis of NREs on the 3JHNHα coupling constants of the two-residue peptides indicated little difference between the force fields’ abilities to reproduce experimental NREs. 
I speculate that this might indicate limitations in the force fields’ descriptions of nonbonded interactions between adjacent side chains or with terminal capping groups. Finally, coarse-grained (CG) models and multi-scale modeling methods are used to develop structural models of entire E. coli chromosomes confined within the experimentally-determined volume of the nucleoid. The final resolution of the chromosome structures built here was one-nucleotide-per-bead (1 NTB), which represents a significant increase in resolution relative to previously published CG chromosome models, in which one bead corresponds to hundreds or even thousands of basepairs. Based on the high-resolution final 1 NTB structures, important physical properties such as major and minor groove widths, distributions of local DNA bending angles, and topological parameters (Linking Number (Lk), Twist (Tw) and Writhe (Wr)) were accurately computed and compared with experimental measurements or predictions from a worm-like chain (WLC) model. All these analyses indicated that the chromosome models built in this study are reasonable at a microscopic level. This chromosome model provides a significant step toward the goal of building a whole-cell model of a bacterial cell.
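The 3JHNHα coupling constants discussed above are conventionally obtained from the sampled backbone dihedral φ through a Karplus relation, 3J(φ) = A cos²(φ − 60°) + B cos(φ − 60°) + C, then ensemble-averaged over the trajectory. A sketch using one literature parameterization (Vuister-Bax coefficients, stated here as an assumption rather than the thesis's exact choice):

```python
import math

def karplus_3j_hnha(phi_degrees, A=6.51, B=-1.76, C=1.60):
    """Karplus relation for 3J(HN,Halpha) from the backbone dihedral phi.
    The default coefficients are one common literature parameterization
    (an assumption, not necessarily the set used in the thesis)."""
    theta = math.radians(phi_degrees - 60.0)
    return A * math.cos(theta) ** 2 + B * math.cos(theta) + C

def average_coupling(phi_samples):
    """MD-style ensemble average over phi dihedrals sampled along a trajectory."""
    return sum(karplus_3j_hnha(p) for p in phi_samples) / len(phi_samples)
```

Comparing such averages against measured couplings, residue by residue, is what allows force fields like the ff99SB derivatives above to be ranked.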
106

A Finite Domain Constraint Approach for Placement and Routing of Coarse-Grained Reconfigurable Architectures

Saraswat, Rohit 01 May 2010 (has links)
Scheduling, placement, and routing are important steps in Very Large Scale Integration (VLSI) design. Researchers have developed numerous techniques to solve placement and routing problems. As the complexity of Application Specific Integrated Circuits (ASICs) increased over the past decades, so did the demand for improved place and route techniques. The primary objective of these place and route approaches has typically been wirelength minimization due to its impact on signal delay and design performance. With the advent of Field Programmable Gate Arrays (FPGAs), the same place and route techniques were applied to FPGA-based design. However, traditional place and route techniques may not work for Coarse-Grained Reconfigurable Architectures (CGRAs), which are reconfigurable devices offering wider path widths than FPGAs and more flexibility than ASICs, due to the differences in architecture and routing network. Further, the routing network of several types of CGRAs, including the Field Programmable Object Array (FPOA), has deterministic timing as compared to the routing fabric of most ASICs and FPGAs reported in the literature. This necessitates a fresh look at alternative approaches to place and route designs. This dissertation presents a finite domain constraint-based, delay-aware placement and routing methodology targeting an FPOA. The proposed methodology takes advantage of the deterministic routing network of CGRAs to perform a delay aware placement.
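To make the placement objective concrete, here is a toy version of the problem: assign modules to distinct sites so that total Manhattan wirelength over the nets is minimized. A real finite domain constraint solver prunes the search with constraint propagation; this brute-force sketch (all names ours, and no delay model) only illustrates the objective being optimized:

```python
from itertools import permutations

def place(modules, nets, sites):
    """Toy placement: try every assignment of modules to distinct sites
    and keep the one with minimum total Manhattan wirelength."""
    best, best_cost = None, float("inf")
    for assignment in permutations(sites, len(modules)):
        pos = dict(zip(modules, assignment))
        cost = sum(
            abs(pos[a][0] - pos[b][0]) + abs(pos[a][1] - pos[b][1])
            for a, b in nets  # two-pin nets for simplicity
        )
        if cost < best_cost:
            best_cost, best = cost, pos
    return best, best_cost
```

A delay-aware variant for a deterministic routing fabric such as the FPOA's would replace the wirelength term with per-net timing constraints, which is precisely where a finite domain constraint formulation pays off.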
107

Resource variation and the evolution of phenotypic plasticity in fishes

Ruehl, Clifton Benjamin 30 September 2004 (has links)
Resource variation and species interactions require organisms to respond behaviorally, physiologically, and morphologically within and among generations to compensate for spatial and temporal environmental variation. One successful evolutionary strategy to mitigate environmental variation is phenotypic plasticity: the production of alternative phenotypes in response to environmental variation. Phenotypic plasticity yields multiple characters that may enable organisms to better optimize phenotypic responses across environmental gradients. In this thesis, I trace the development of thought on phenotypic plasticity and present two empirical studies that implicate phenotypic plasticity in producing morphological variation in response to resource variation. The first empirical study addresses trophic plasticity, population divergence, and the effect of fine-scale environmental variation in western mosquitofish (Gambusia affinis). Offspring from two populations were fed either attached or unattached food items under four treatments: (1) water surface, (2) mid-water, (3) benthic, and (4) a daily rotation of the three preceding orientations (fine-grained variation). Attached food induced wide heads, blunt snouts and rounded pectoral fins relative to morphology in the unattached treatment. Mid-water feeding induced elongated heads and deeper mid-bodies relative to morphologies induced by benthic and surface feeding. The rotating treatment produced intermediate morphologies. Population divergence seemed related to both trophic and predation ecology. Ecomorphological consequences of induced morphologies and the need for inclusion of greater ecological complexity in studies of plasticity are discussed. The second study examines induced morphological plasticity and performance in red drum (Sciaenops ocellatus). I fed hatchery fish either hard or soft food for two months. Performance trials were designed to measure their ability to manipulate and consume hard food items. 
External morphology and the mass of pharyngeal crushing muscles were assessed for variation among treatments. A hard food diet induced deeper bodies and larger heads, more massive pharyngeal muscles, and initially more efficient consumption of hard food than fish receiving soft food. The observed morphological variation is in accordance with variation among species. Determining evolutionary mechanisms operating within red drum populations should eventually aid in developing and optimizing conservation efforts and ease the transition from hatchery facilities to estuaries.
108

Objective Approaches to Single-Molecule Time Series Analysis

Taylor, James 24 July 2013 (has links)
Single-molecule spectroscopy has provided a means to uncover pathways and heterogeneities that were previously hidden beneath the ensemble average. Such heterogeneity, however, is often obscured by the artifacts of experimental noise and the occurrence of undesired processes within the experimental medium. This has in turn created the need for new analytical methodologies. It is particularly important that objectivity be maintained in the development of new analytical methodology so that bias is not introduced and the results are not improperly characterized. The research presented herein identifies two such sources of experimental uncertainty, and constructs objective approaches to reduce their effects in the experimental results. The first, photoblinking, arises from the occupation of dark electronic states within the probe molecule, resulting in experimental data that is distorted by its contribution. A method based on Bayesian inference is developed, and is found to nearly eliminate photoblinks from the experimental data while minimally affecting the remaining data and maintaining objectivity. The second source of uncertainty is electronic shot-noise, which arises as a result of Poissonian photon collection. A method based on wavelet decomposition is constructed and applied to simulated and experimental data. It is found that, under only one assumption, namely that photon collection is indeed a Poisson process, up to 75% of the shot-noise contribution may be removed from the experimental signal by the wavelet-based procedure. Lastly, in an effort to connect model-based approaches such as molecular dynamics simulation to model-free approaches that rely solely on the experimental data, a coarse-grained molecular model of an ionic fluorophore diffusing within an electrostatically charged polymer brush is constructed and characterized. 
It is found that, while the characteristics of the coarse-grained simulation compare well with atomistic simulations, the model is lacking in its representation of the electrostatically-driven behavior of the experimental system.
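The wavelet-based idea can be illustrated with a hand-rolled one-level Haar shrinkage: split the signal into pairwise averages (approximation) and differences (detail), zero the small detail coefficients attributed to shot noise, and reconstruct. This is a generic sketch, not the thesis's actual decomposition depth or threshold rule:

```python
def haar_denoise(signal, threshold):
    """One-level Haar wavelet shrinkage with hard thresholding of the
    detail coefficients; any trailing odd sample is passed through."""
    n = len(signal) - len(signal) % 2
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, n, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, n, 2)]
    # hard threshold: small differences are treated as shot noise
    detail = [d if abs(d) > threshold else 0.0 for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])  # inverse Haar step
    return out + list(signal[n:])
```

Large jumps (real intensity transitions) survive the threshold, while sub-threshold pair-to-pair fluctuations are smoothed away, which is why the method preserves genuine single-molecule dynamics.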
109

Investigation of Structural Behaviors of Methyl Methacrylate Oligomers within Confinement Space by Coarse-grained Configurational-bias Monte Carlo Simulation

Chang, Chun-Yi 16 August 2010 (has links)
Coarse-grained configurational-bias Monte Carlo (CG-CBMC) simulation was employed to study the structural behaviors of methyl methacrylate (MMA) oligomers adsorbed on a grooved substrate, because molecular dynamics (MD) simulation is prone to becoming trapped in local energy minima and is difficult to run long enough to allow relaxation of chain motion in a large polymeric system. In this study, chains are classified into three types according to their positions relative to the groove: Type 1, Type 2, and Type 3 represent an MMA oligomer entirely within the groove, partially within the groove, and outside the groove, respectively. The orientational order parameters of Type 1 and Type 2 oligomers decrease as the groove width increases, whereas the orientational order parameter of Type 3 oligomers is approximately 0.1. In addition, the orientational order parameters of Type 1 oligomers interacting with the substrate at different interaction strengths likewise decrease with increasing groove width. Furthermore, the orientational order parameters of Type 1 oligomers within the narrowest (20 Å) and the widest (35 Å) grooves were determined for different groove depths. For the narrowest groove, the arrangement of Type 1 oligomers is influenced by the groove depth; for the widest groove, the orientational order parameter of Type 1 oligomers remains approximately 0.2. This study can help engineers clarify the characteristics and phenomena of physical adsorption of molecules, and may contribute to related technological applications.
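The orientational order parameter reported above is the standard P2 quantity, P2 = (3⟨cos²θ⟩ − 1)/2, evaluated for chain-segment vectors against a director such as the groove axis. A small sketch (the director choice and function name are ours):

```python
import math

def orientational_order_parameter(bond_vectors, axis=(1.0, 0.0, 0.0)):
    """P2 = (3<cos^2 theta> - 1)/2 for segment vectors relative to a
    director: 1 means perfect alignment, 0 isotropic, -0.5 perpendicular."""
    ax_norm = math.sqrt(sum(c * c for c in axis))
    cos2 = []
    for v in bond_vectors:
        v_norm = math.sqrt(sum(c * c for c in v))
        dot = sum(vc * ac for vc, ac in zip(v, axis))
        cos2.append((dot / (v_norm * ax_norm)) ** 2)
    mean = sum(cos2) / len(cos2)
    return (3.0 * mean - 1.0) / 2.0
```

Values around 0.1 or 0.2, as found for the Type 3 oligomers and for Type 1 in the widest groove, thus indicate only weak alignment with the groove direction.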
