  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
101

Sequential Quantum-Dot Cellular Automata Design And Analysis Using Dynamic Bayesian Networks

Venkataramani, Praveen 29 October 2008 (has links)
The increasing need for low-power and very fast devices in Complementary Metal Oxide Semiconductor Very Large Scale Integration (CMOS VLSI) circuits drives the continued scaling of these circuits. However, scaling at the sub-micron and nano level introduces quantum mechanical effects that limit further scaling of CMOS circuits. Researchers are therefore investigating new nano-regime technologies that could resolve this quandary. One promising technology at the nano level is the quantum-dot cellular automaton (QCA). The basic operation of QCA is based on the transfer of charge configuration rather than of the electrons themselves. The wave nature of these electrons and the resulting uncertainty in device operation demand a probabilistic approach to studying their behavior. Data is assigned to a QCA cell by positioning two electrons among four quantum dots. The sites in which the electrons settle are uncertain, however, and depend on various factors. In the ideal state, the electrons position themselves diagonally to each other, through Coulombic repulsion, in a low-energy state. The quantum cell is said to be polarized to +1 or -1, based on the alignment of the electrons. In this thesis, we put forth a probabilistic model for designing sequential QCA with Bayesian networks. The timing constraints that feedback paths impose on sequential circuits make it difficult to assign clock zones so that the outputs arrive at the same time instant; designing circuits with many sequential elements is therefore time consuming. The model presented here is fast and can be used to design sequential QCA circuits without the need to align clock zones. A major advantage of our model lies in its ability to accurately capture the polarization of each cell of a sequential QCA circuit. We discuss the architecture of some basic sequential circuits, such as a J-K flip-flop (FF), a RAM memory cell, and the s27 benchmark circuit, designed in QCADesigner.
We analyze the circuits using state-of-the-art Dynamic Bayesian Networks (DBNs). To our knowledge, this is the first time sequential QCA circuits have been analyzed using DBNs, and the first time the Estimated Posterior Importance Sampling (EPIS) algorithm has been used to compute the probabilistic values needed to study the effect of variations in physical dimensions and operating temperature on output polarization in QCA circuits.
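The thesis's DBN model itself is not given in the abstract. As a rough illustration of the underlying physics, the two-state approximation commonly used for QCA gives each cell a steady-state polarization P = x / sqrt(1 + x^2), where x is proportional to the kink energy times the sum of neighbouring polarizations. The sketch below propagates a driver polarization along a QCA wire under that approximation; the parameter values `E_k` and `gamma` are illustrative assumptions, not taken from the thesis.

```python
import math

def propagate(polarizations, E_k=1.0, gamma=0.2, iters=50):
    """Iteratively update cell polarizations along a QCA wire using the
    two-state approximation P = x / sqrt(1 + x^2), with
    x = E_k * (sum of neighbour polarizations) / (2 * gamma).
    Cell 0 acts as a fixed driver cell."""
    P = list(polarizations)
    for _ in range(iters):
        for i in range(1, len(P)):
            neigh = P[i - 1] + (P[i + 1] if i + 1 < len(P) else 0.0)
            x = E_k * neigh / (2.0 * gamma)
            P[i] = x / math.sqrt(1.0 + x * x)
    return P

# A four-cell wire: the driver is polarized to +1, the rest start unpolarized.
wire = propagate([+1.0, 0.0, 0.0, 0.0])
```

With a strong kink energy relative to tunnelling (`E_k` large versus `gamma`), the driver's polarization propagates down the wire, which is the wire behaviour the probabilistic models above are meant to capture.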
102

Tree Restructuring Approach to the Mapping Problem in Cellular-Architecture FPGAs

Ramineni, Narahari 10 February 1995 (has links)
This thesis presents a new technique for mapping combinational circuits to fine-grain cellular-architecture FPGAs. We represent the netlist as a binary tree with decision variables associated with each node of the tree. The functionality of the tree nodes is chosen based on the target FPGA architecture. The proposed tree restructuring algorithms preserve local connectivity and allow direct mapping of the trees to the cellular array, thus eliminating the traditional routing phase. Predictability of signal delays is another important advantage of the developed approach. The developed bus-assignment algorithm efficiently utilizes the medium-distance routing resources (buses). The method is general and can be used for any fine-grain CA-type FPGA. To demonstrate our techniques, the ATMEL 6000 series FPGA was used as the target architecture. An area and delay comparison between our methods and commercial tools is presented using a set of MCNC benchmarks. Final layouts of the implemented designs are included. Results show that the proposed techniques outperform the available commercial tools for ATMEL 6000 FPGAs in both area and delay optimization.
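The restructuring algorithms themselves are in the thesis body, not the abstract, but the key invariant they must respect — a local tree rewrite may change the tree's shape, never the logic function it computes — can be sketched with a simple associativity rotation. The tuple-based tree encoding below is an illustrative assumption, not the thesis's representation.

```python
from itertools import product

def AND(a, b):
    return a & b

# An expression tree is either a variable name or a tuple (op, left, right).
def rotate_right(node):
    """Restructure ((a op b) op c) into (a op (b op c)).
    Valid only for associative operators such as AND."""
    op, left, c = node
    _, a, b = left
    return (op, a, (op, b, c))

def evaluate(node, env):
    if isinstance(node, str):
        return env[node]
    op, l, r = node
    return op(evaluate(l, env), evaluate(r, env))

chain = (AND, (AND, 'a', 'b'), 'c')       # (a AND b) AND c
restructured = rotate_right(chain)        # a AND (b AND c)

# The rewrite preserves the function on every input assignment.
for a, b, c in product([0, 1], repeat=3):
    env = {'a': a, 'b': b, 'c': c}
    assert evaluate(chain, env) == evaluate(restructured, env)
```

In a mapper, rotations like this are chosen so that the resulting tree shape matches the local connectivity of the cellular array, which is what lets the routing phase be skipped.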
103

Prilog razvoju metoda arhitektonskog projektovanja školskih zgrada / A contribution for the method of school architecture design

Ecet Dejan 09 May 2019 (has links)
<p>The central theme of this research is the application of the cellular automata computer model in the design of school buildings. The research examines a number of contemporary architectural design dilemmas, foremost the use of modern information technologies in the architectural design process, applied specifically to school buildings.</p>
104

Geographical vector agents

Hammam, Yasser, n/a January 2008 (has links)
Simulating geographic phenomena in a realistic and plausible way requires real-world entities to be abstracted based on the dynamic physical characteristics they exhibit, and treated as individuals in a simulation domain. These processes cannot be adequately supported by traditional spatial models based on cellular space, such as Cellular Automata (CA). Although this approach has received great attention as a favoured technique for simulating geographic phenomena from different aspects, the need has arisen for a generic spatial model that overcomes its limitations, particularly in the way real-world entities are represented in a simulation domain with respect to their physical characteristics and temporal aspects. In this thesis, a new computational approach for a spatial model suitable for simulating geographic phenomena is presented: the vector agents model. A vector agent is goal-oriented, adaptable, physically defined by a Euclidean geometry, and able to change its own geometric characteristics while interacting with other agents in its neighbourhood using a set of rules. The agent is modelled with sensors, state, and strategies. The successful implementation of the model's architecture allows the physical characteristics of real-world entities to be represented and their complex, dynamic behaviour to be observed in a simulation domain. Vector agents developed out of a need to create a systematic basis for the geometric components of Geographic Automata Systems (GAS), as outlined by Torrens and Benenson (2005). A generic vector agents model was built, then tested and validated from different aspects, and the results demonstrated the model's efficiency.
It is confirmed that vector agents are flexible in producing different complex shapes and patterns for recreating real geographic phenomena through the generic use of three geometric-manipulation algorithms: midpoint displacement using a relaxed Brownian motion (fractal-like) algorithm, edge displacement, and vertex displacement. The effectiveness of this was initially ascertained visually. A simple heuristic governing shape growth rate and complexity was derived from the interplay of the three algorithms. A further abstract comparison against a cellular-agents environment showed that vector agents can produce patterns similar to those produced by cellular agents, with the advantage of representing entities as individuals with their own attributes and realistic geometric boundaries. The city, as a complex geographic phenomenon, was then used as a specific domain for validating the model against a real-world system. The results of the urban land-use simulations (driven by simple rules based on three classical urban theories) confirmed that (a) the model is flexible enough to incorporate various external rules based on real-world systems, and (b) the model is capable of producing a variety of patterns, close to actual patterns, under several environments. The agent environment also proved to be an effective way of easily combining the rules associated with each urban theory (different agents behaved according to different theories). Finally, limitations encountered during the development of this work are addressed, leading to an outline of possible extensions of both the model's computation and its domain of applications.
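Of the three geometric algorithms named above, midpoint displacement is the easiest to sketch in isolation. The version below recursively perturbs segment midpoints perpendicular to the segment, producing a fractal-like boundary; the Gaussian noise model and the `roughness` parameter are illustrative assumptions, not the thesis's relaxed-Brownian-motion formulation.

```python
import random

def midpoint_displacement(p, q, depth, roughness=0.25, rng=random):
    """Recursively displace the midpoint of segment p-q perpendicular to it,
    producing a fractal-like polyline between the two endpoints."""
    if depth == 0:
        return [p, q]
    (x1, y1), (x2, y2) = p, q
    mx, my = (x1 + x2) / 2, (y1 + y2) / 2
    dx, dy = x2 - x1, y2 - y1
    length = (dx * dx + dy * dy) ** 0.5
    # Offset the midpoint along the segment's perpendicular direction,
    # scaled to the segment length so detail shrinks with recursion depth.
    offset = rng.gauss(0, roughness * length)
    m = (mx - dy / length * offset, my + dx / length * offset)
    left = midpoint_displacement(p, m, depth - 1, roughness, rng)
    right = midpoint_displacement(m, q, depth - 1, roughness, rng)
    return left[:-1] + right   # drop the duplicated midpoint

boundary = midpoint_displacement((0.0, 0.0), (10.0, 0.0), depth=5)
```

A depth-`d` call yields `2**d + 1` vertices with the original endpoints fixed, which is why the technique suits agents that must deform a boundary while keeping its anchor points.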
105

Organizing a Global Coordinate System from Local Information on an Amorphous Computer

Nagpal, Radhika 29 August 1999 (has links)
This paper demonstrates that it is possible to generate a reasonably accurate coordinate system on randomly distributed processors, using only local information and local communication. By a coordinate system we mean that each element assigns itself a logical coordinate that maps to its global physical location, starting with no a priori knowledge of position or orientation. The algorithm presented is inspired by biological systems that use chemical gradients to determine the position of cells. Extensive analysis and simulation results are presented. Two key results are: there is a critical minimum average neighborhood size of 15 for good accuracy, and there is a fundamental limit on the resolution of any coordinate system determined strictly from local communication. We also demonstrate that, using this algorithm, random distributions of processors produce significantly better accuracy than regular processor grids - such as those used by cellular automata. This has implications for discrete models of biology as well as for building smart sensor arrays.
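The paper's algorithm combines several gradients from different seed points into coordinates; the core primitive — a hop-count "gradient" that each processor learns through purely local communication — can be sketched as follows. The unit-disk connectivity model and all parameter values are illustrative assumptions; the global neighbour scan is only a simulation convenience.

```python
import random
from collections import deque

def hop_gradient(points, radius, seed_index):
    """BFS hop-count gradient from a seed node over a unit-disk graph:
    each node learns its minimum hop distance to the seed, mimicking a
    chemical gradient spread by local broadcasts."""
    n = len(points)

    def neighbors(i):
        xi, yi = points[i]
        return [j for j in range(n) if j != i
                and (points[j][0] - xi) ** 2 + (points[j][1] - yi) ** 2
                    <= radius ** 2]

    hops = [None] * n          # None = message never reached this node
    hops[seed_index] = 0
    q = deque([seed_index])
    while q:
        i = q.popleft()
        for j in neighbors(i):
            if hops[j] is None:
                hops[j] = hops[i] + 1
                q.append(j)
    return hops

rng = random.Random(1)
pts = [(rng.random() * 10, rng.random() * 10) for _ in range(300)]
hops = hop_gradient(pts, radius=1.0, seed_index=0)
```

Since one hop covers at most `radius`, the hop count times the radius upper-bounds the true Euclidean distance; the paper's resolution limit comes from how coarsely this integer count approximates that distance.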
106

Coupling, space and time mixing for parallel stochastic dynamics

Louis, Pierre-Yves January 2004 (has links)
We first introduce a coupling of a finite number of Probabilistic Cellular Automata (PCA) dynamics that preserves the stochastic ordering. Using this tool, for a general attractive PCA on S^(Z^d), where S is finite, we prove that a condition (A) is equivalent to exponentially fast (time-)convergence of this Markovian parallel dynamics towards equilibrium in the uniform norm. Condition (A) means exponential decay of the influence from the boundary for the invariant measures of the system restricted to finite 'box' volumes. For a class of reversible PCA dynamics on {−1, +1}^(Z^d) with a naturally associated Gibbsian potential ϕ, we prove that a weak mixing condition for ϕ implies the validity of assumption (A); thus the 'exponential ergodicity' of the dynamics towards the unique Gibbs measure associated with ϕ holds. On some particular examples of this PCA class, we verify that our assumption (A) is weaker than the Dobrushin-Vasershtein ergodicity condition. For some special PCA, 'exponential ergodicity' holds as soon as there is no phase transition.
107

Increasing coupling for probabilistic cellular automata

Louis, Pierre-Yves January 2005 (has links)
We give a necessary and sufficient condition for the existence of an increasing coupling of N (N >= 2) synchronous dynamics (PCA) on S^(Z^d). Increasing means that the coupling preserves the stochastic ordering. We first present our main construction theorem in the case where S is totally ordered; applications to attractive PCAs are given. When S is only partially ordered, we show with two examples that a coupling of more than two synchronous dynamics may not exist. We also prove an extension of our main result for a particular class of partially ordered spaces.
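The constructions in these two abstracts are measure-theoretic, but the idea of an increasing (order-preserving) coupling can be made concrete for a toy attractive PCA on a ring of ±1 spins: all copies of the dynamics are driven by the same per-site uniform random numbers, and because the update probability is monotone in the neighbourhood, sitewise ordering between configurations is preserved at every synchronous step. The sigmoid update rule and the value of `beta` are illustrative assumptions.

```python
import math
import random

def pca_step(config, beta, uniforms):
    """One synchronous update of an attractive PCA on a ring: each site
    becomes +1 with probability sigmoid(beta * sum of its two neighbours),
    all sites in parallel, using one shared uniform per site (the coupling)."""
    n = len(config)
    new = []
    for i in range(n):
        local = config[i - 1] + config[(i + 1) % n]
        p_plus = 1.0 / (1.0 + math.exp(-beta * local))
        new.append(+1 if uniforms[i] < p_plus else -1)
    return new

rng = random.Random(0)
n = 50
low = [-1] * n       # minimal configuration
high = [+1] * n      # maximal configuration
for _ in range(20):
    u = [rng.random() for _ in range(n)]   # shared noise = increasing coupling
    low = pca_step(low, beta=0.8, uniforms=u)
    high = pca_step(high, beta=0.8, uniforms=u)
    # Attractiveness + shared uniforms keep the stochastic order sitewise.
    assert all(a <= b for a, b in zip(low, high))
```

The monotonicity argument is one line: if `low <= high` sitewise, each site's neighbour sum, hence its `p_plus`, is no larger for `low`, so with the same uniform the `low` copy can never flip to +1 while the `high` copy flips to -1.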
108

Investigation of the implications of nitric oxide on biofilm development

Ulfenborg, Benjamin January 2008 (has links)
Biofilms are communities of sessile microorganisms attached to a surface and embedded in a matrix of extracellular polysaccharide substances. These communities can be found in diverse aquatic environments, such as in industrial pipes and in humans. By forming microcolony structures, which are highly resistant to adverse physical conditions as well as antimicrobial agents, biofilms are very problematic when associated with e.g. persistent infections. In order to find new ways of controlling biofilm growth, the processes involved in biofilm development must be investigated further. The main interest of this study is the occurrence of void formation inside biofilms. This phenomenon has been observed in several studies and has been correlated with cell death inside the microcolonies. The occurrence of cell death has recently been associated with the presence of nitric oxide in the biofilm. In this study, the implications of nitric oxide accumulation for biofilm development were investigated using an individual-based model. Specifically, the role of nitric oxide in void formation was considered. A large number of simulations were run using different parameter settings in order to determine whether nitric oxide could account for the void formation observed experimentally. The general predictions made by the model system agreed with some experimental data, but not with others. Sloughing, the detachment of chunks of cells from the biofilm, was observed in the majority of simulations. In some cases, the model also predicted the presence of live cells inside the voids, which has been observed experimentally.
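The thesis's individual-based model is not specified in the abstract. The toy sketch below — in which every rule, rate, and the NO threshold is invented for illustration — shows the qualitative mechanism being tested: cells divide into empty neighbouring sites, buried cells accumulate nitric oxide fastest, and cells whose accumulated NO passes a threshold die, leaving voids inside the colony.

```python
import random

def simulate_biofilm(size=15, steps=30, no_threshold=3.0, seed=1):
    """Toy individual-based biofilm sketch on a grid. Live cells divide into
    empty neighbours; each cell accumulates 'nitric oxide' in proportion to
    its number of live neighbours (how buried it is); cells above the NO
    threshold die, carving voids. Returns the set of live cell positions."""
    rng = random.Random(seed)
    live = {(size // 2, 0)}          # one founder cell attached to surface y=0
    no_level = {}
    for _ in range(steps):
        # Growth: each cell may divide into a random empty neighbour.
        for (x, y) in list(live):
            if rng.random() < 0.4:
                dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
                nx, ny = x + dx, y + dy
                if 0 <= nx < size and 0 <= ny < size and (nx, ny) not in live:
                    live.add((nx, ny))
                    no_level[(nx, ny)] = 0.0
        # NO accumulation: fully buried cells (4 live neighbours) gain fastest.
        for (x, y) in list(live):
            buried = sum((x + dx, y + dy) in live
                         for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)])
            no_level[(x, y)] = no_level.get((x, y), 0.0) + 0.25 * buried
            if no_level[(x, y)] > no_threshold:
                live.discard((x, y))   # cell death -> void
    return live

colony = simulate_biofilm()
```

Sweeping `no_threshold` and the accumulation rate in a model of this shape is the kind of parameter study the abstract describes, with voids and sloughing emerging (or not) depending on the settings.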
109

On the Convective-Scale Predictability of the Atmosphere

Bengtsson, Lisa January 2012 (has links)
A well-represented description of convection in weather and climate models is essential, since convective clouds strongly influence the climate system. Convective processes interact with radiation, redistribute sensible and latent heat and momentum, and impact hydrological processes through precipitation. Depending on a model's horizontal resolution, the representation of convection may look very different. The convective scales not resolved by the model are traditionally parameterized by an ensemble of non-interacting convective plumes within some area of uniform forcing representing the "large scale". A bulk representation of the mass flux associated with the individual plumes in the defined area provides the statistical effect of moist convection on the atmosphere. Studying the characteristics of the ECMWF ensemble prediction system, it is found that the control forecast of the ensemble system is not variable enough to yield sufficient spread using an initial-perturbation technique alone. Such insufficient variability may be addressed in the parameterizations of, for instance, cumulus convection, where sub-grid variability in space and time is traditionally neglected. Furthermore, horizontal transport due to gravity waves can organize deep convection into larger-scale structures, contributing to an upscale energy cascade; yet horizontal advection and numerical diffusion are the only ways in which adjacent model grid boxes interact in the models. The impact of flow-dependent horizontal diffusion on resolved deep convection is studied, and the organization of convective clusters is found to be very sensitive to the method of imposing horizontal diffusion. However, using numerical diffusion to represent lateral effects is undesirable.
To address these issues, a scheme that uses cellular automata to introduce lateral communication, memory, and a stochastic representation of the statistical effects of cumulus convection is implemented in two numerical weather models. The behaviour of the scheme is studied in cases of organized convective squall lines, and initial model runs show promising improvements. / <p>At the time of the doctoral defense, the following paper was unpublished and had a status as follows: Paper 4: Submitted.</p>
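The published scheme's actual rules are in the thesis papers; a toy one-dimensional sketch of the three ingredients named above — stochastic seeding where large-scale forcing is positive, lateral spreading to neighbouring grid boxes, and a finite memory — might look like this. All probabilities, the lifetime, and the forcing pattern are illustrative assumptions.

```python
import random

def ca_convection_step(active, lifetime, forcing, p_seed=0.02, p_spread=0.15,
                       max_life=4, rng=random):
    """One step of a toy stochastic cellular automaton for sub-grid convection
    on a ring of grid boxes: active cells persist a few steps (memory), spread
    to neighbours (lateral communication), and are seeded where the forcing
    is positive (stochastic triggering)."""
    n = len(active)
    new_active = [False] * n
    new_life = [0] * n
    for i in range(n):
        neigh_on = active[i - 1] or active[(i + 1) % n]
        if active[i] and lifetime[i] < max_life:
            new_active[i], new_life[i] = True, lifetime[i] + 1   # memory
        elif neigh_on and rng.random() < p_spread:
            new_active[i], new_life[i] = True, 1                 # lateral spread
        elif forcing[i] > 0 and rng.random() < p_seed:
            new_active[i], new_life[i] = True, 1                 # seeding
    return new_active, new_life

rng = random.Random(42)
n = 100
forcing = [1 if 40 <= i < 60 else 0 for i in range(n)]   # a forced region
active, life = [False] * n, [0] * n
ever_active = False
for _ in range(50):
    active, life = ca_convection_step(active, life, forcing, rng=rng)
    ever_active = ever_active or any(active)
```

The fraction of active sub-grid cells in each grid box can then modulate the mass flux of the bulk convection scheme, which is how a CA of this shape injects the missing sub-grid variability and neighbour-to-neighbour organization.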
110

Untersuchung und Lokalisierung künftiger urbaner Flächennutzungsänderungen in Flussauen - Eine Anwendung des rasterbasierten Simulationsmodells LuSIM in der Elbaue Dresdens / Investigation and localization of future urban land-use changes in floodplains - an application of the raster-based simulation model LuSIM in the Elbe floodplain of Dresden

Waha, Katharina 27 June 2012 (has links) (PDF)
Heavy rainfall leads to rising water levels and volumes in rivers and to flooding of their floodplains. When flooding reaches human settlements, damage occurs to private, public, and commercial property. It is therefore necessary to investigate the dynamics of land-use change, since land use determines both the type of elements at risk in the event of a flood and the amount of damage. The raster-based cellular automaton LuSIM (cf. STRÖBL et al. 2003), available as an extension for ArcView 3.x, is used as the simulation tool. The study area lies in the Elbe floodplain, partly within the city of Dresden and partly within the city of Radebeul in the district of Meißen. In this thesis, future urban land-use development in the study area is simulated under three scenarios, with the creation of new residential land as the driving force; the thesis thereby contributes to the ex-ante analysis and assessment of flood risks. The cellular automaton locates potential new residential sites based on suitability rules and on the demand for new residential land. I derive the rules for the suitability of each raster cell as a residential site from an analysis of historical land-use changes, and I determine the demand for future residential land from the new housing construction projected by the Bundesamt für Bauwesen und Raumordnung (2006) in its spatial planning forecast (Raumordnungsprognose) 2020/2050.
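LuSIM's actual transition rules are derived in the thesis from historical data, but the general mechanism — converting the most suitable empty raster cells to housing until a projected demand is met, with a neighbourhood bonus typical of cellular automata — can be sketched as follows. The grid, suitability scores, weights, and demand figure are all invented for illustration.

```python
def allocate_housing(grid, suitability, demand):
    """Toy suitability-driven land-use CA step: among empty cells, those with
    the highest scores (suitability plus a bonus per already-built neighbour)
    are converted to housing until the demand is met."""
    rows, cols = len(grid), len(grid[0])
    empty = [(r, c) for r in range(rows) for c in range(cols)
             if grid[r][c] == 'empty']

    def score(cell):
        r, c = cell
        neigh = sum(grid[rr][cc] == 'housing'
                    for rr, cc in [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
                    if 0 <= rr < rows and 0 <= cc < cols)
        return suitability[r][c] + 0.1 * neigh   # CA neighbourhood effect

    # sorted() evaluates all scores before any cell is converted.
    for r, c in sorted(empty, key=score, reverse=True)[:demand]:
        grid[r][c] = 'housing'
    return grid

grid = [['empty'] * 4 for _ in range(4)]
grid[0][0] = 'housing'                     # existing settlement
suit = [[0.9, 0.8, 0.1, 0.1],              # e.g. flood-safe, near roads...
        [0.7, 0.6, 0.1, 0.1],
        [0.2, 0.2, 0.1, 0.1],
        [0.2, 0.2, 0.1, 0.1]]
allocate_housing(grid, suit, demand=3)     # demand from a housing forecast
```

Running one such allocation per scenario year, with demand taken from the housing forecast and suitability from the fitted rules, is the shape of the simulation the abstract describes.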
