141

Modelling angiogenesis : a discrete to continuum approach

Pillay, Samara January 2017 (has links)
Angiogenesis is the process by which new blood vessels develop from existing vessels. Angiogenesis is important in a number of contexts, including embryogenesis, wound healing and cancer. It has been modelled phenomenologically at the macroscale, using the well-known 'snail-trail' approach in which trailing endothelial cells follow the paths of other, leading endothelial cells. In this thesis, we systematically determine the collective behaviour of endothelial cells from their behaviour at the cell level during corneal angiogenesis. We formulate an agent-based model, based on the snail-trail process, to describe the behaviour of individual cells. We incorporate cell motility through biased random walks, and include processes which produce (branching) and annihilate (anastomosis) cells to represent sprout and loop formation. We use the transition probabilities associated with the discrete model and a mean-field approximation to systematically derive a system of non-linear partial differential equations (PDEs) describing population behaviour; this system imposes physically realistic density restrictions and is structurally different from existing snail-trail models. We use this framework to evaluate the validity of a classical snail-trail model and elucidate implicit assumptions. We then extend our framework to explicitly account for cell volume. This generates non-linear PDE models which vary in complexity depending on the extent of volume exclusion incorporated on the microscale. By comparing discrete and continuum models, we assess the extent to which continuum models, including the classical snail-trail model, account for single and multi-species exclusion processes. We also distinguish macroscale exclusion effects introduced by each cell species. Finally, we compare the predictive power of different continuum models. In summary, we develop a microscale to macroscale framework for angiogenesis based on the snail-trail process, which provides a systematic way of deriving population behaviour from individual cell behaviour and can be extended to account for more realistic and/or detailed cell interactions.
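The discrete model itself is not reproduced here, but the following minimal Python sketch illustrates the ingredients the abstract names: tip cells performing a biased random walk on a lattice, branching with a small probability, and being removed on anastomosis. The one-dimensional setting and all parameter values are illustrative assumptions, not the thesis's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

L = 200          # lattice sites
BIAS = 0.2       # bias of the random walk up the chemoattractant gradient (to the right)
P_BRANCH = 0.02  # probability per tip per step of branching
STEPS = 500

tips = [0, 10, 20, 30, 40]   # tip-cell positions of sprouts leaving the parent vessel

for _ in range(STEPS):
    moved = []
    for x in tips:
        # biased random walk: step right with probability (1 + BIAS) / 2
        x += 1 if rng.random() < 0.5 * (1 + BIAS) else -1
        x = min(max(x, 0), L - 1)
        moved.append(x)
        # branching: a tip spawns a second tip at the same site
        if rng.random() < P_BRANCH:
            moved.append(x)
    # anastomosis (crudely): tips meeting at the same site merge into one
    occupied, tips = set(), []
    for x in moved:
        if x not in occupied:
            occupied.add(x)
            tips.append(x)

# the tip-cell density profile is the quantity a mean-field PDE would describe
density = np.bincount(tips, minlength=L)
print(len(tips), density.argmax())
```

Coarse-graining the transition probabilities of a rule set like this, rather than simulating it, is what yields the continuum PDE description discussed in the abstract.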
142

Sequential Quantum-Dot Cellular Automata Design And Analysis Using Dynamic Bayesian Networks

Venkataramani, Praveen 29 October 2008 (has links)
The increasing need for low-power and extremely fast devices in Complementary Metal Oxide Semiconductor Very Large Scale Integration (CMOS VLSI) circuits drives continued scaling of these circuits. However, scaling at the sub-micron and nano level introduces quantum mechanical effects and thereby limits further scaling of CMOS circuits. Researchers are therefore looking for new approaches in the nano regime that could effectively resolve this quandary. One such technology that looks promising at the nano level is quantum-dot cellular automata (QCA). The basic operation of QCA is based on the transfer of charge rather than on the flow of electrons. The wave nature of these electrons and the uncertainty in device operation demand a probabilistic approach to studying their operation. Data is assigned to a QCA cell by positioning two electrons among four quantum dots. However, the sites in which the electrons settle are uncertain and depend on various factors. In an ideal state, the electrons position themselves diagonally opposite each other, through coulombic repulsion, in a low-energy state. The cell is said to be polarized to +1 or -1, based on the alignment of the electrons. In this thesis, we put forth a probabilistic model to design sequential QCA in Bayesian networks. The timing constraints inherent in sequential circuits, due to the feedback path, make it difficult to assign clock zones so that the outputs arrive at the same time instant. Hence, designing circuits that have many sequential elements is time consuming. The model presented in this thesis is fast and can be used to design sequential QCA circuits without the need to align the clock zones. One of the major advantages of our model lies in its ability to accurately capture the polarization of each cell of a sequential QCA circuit. We discuss the architecture of some basic sequential circuits, such as a J-K flip-flop (FF), a RAM memory cell and the s27 benchmark circuit, designed in QCADesigner. We analyze the circuits using state-of-the-art Dynamic Bayesian Networks (DBNs). To our knowledge, this is the first time sequential circuits have been analyzed using DBNs. For the first time, the Estimated Posterior Importance Sampling (EPIS) algorithm is used to determine the probabilistic values and to study the effect of variations in physical dimensions and operating temperature on the output polarization of QCA circuits.
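As a small, hedged illustration of the polarization idea described above (not of the thesis's Dynamic Bayesian Network model), the Python sketch below uses the standard two-state approximation of a QCA cell's response to its neighbours found in the QCA literature; the kink-energy and tunnelling-energy values are placeholders.

```python
import math

def cell_polarization(neighbor_polarizations, kink_energy=1.0, gamma=0.25):
    """Two-state approximation of a QCA cell's steady-state polarization.

    neighbor_polarizations : polarizations (between -1 and +1) of neighbouring cells
    kink_energy            : electrostatic cost of anti-aligned neighbours (placeholder)
    gamma                  : inter-dot tunnelling energy (placeholder)
    """
    f = (kink_energy / (2.0 * gamma)) * sum(neighbor_polarizations)
    return f / math.sqrt(1.0 + f * f)

# Driving a device cell with three inputs, as in a QCA majority gate:
inputs = [+1.0, +1.0, -1.0]
print(round(cell_polarization(inputs), 3))   # ~0.89, i.e. close to +1 -> logic '1'
```

The nonlinearity of this response is why small variations in dimensions or temperature propagate probabilistically through a circuit, which is the effect the thesis studies with DBNs.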
143

The Design, Realization and Testing of the ILU of the CCM2 Using FPGA Technology

Foote, David W. 09 June 1994 (has links)
Most existing computers today are built upon a subset of the arithmetic system, which is itself based upon the foundation of set theory. All formal systems can be expressed in terms of arithmetic and logic on current arithmetic computers through an appropriate model, which is then manipulated in software. However, severe speed degradation is the price one must pay for this software-based approach, making several high-level formal systems impractical. To improve the speed at which computers can implement these high-level systems, one must either design special hardware that implements specific operations, much like math and image-processing coprocessors, or execute operations on multiple processors in parallel. The growing number of applications for the manipulation of logic functions has created interest in the logic machine. Many applications, such as logic optimization, simulation, pattern recognition and image processing, can be better implemented with a logic machine. This thesis proposes the design, hardware realization, and testing of the iterative logic unit (ILU) of the Cube Calculus Machine II (CCM2). The CCM2 is a general-purpose computer with an architecture that emphasizes a data path designed to execute operations of cube calculus, a popular algebraic model used in the minimization of Boolean functions. The ILU is an iterative logic array of cells (ITs) using internal distributed control, enabling the execution of basic cube operations, while the Control Unit (CU) handles global signals from the host computer. The ILU of the CCM2 has been realized in hardware using Xilinx Logic Cell Arrays (LCAs). FPGAs offer the logic density and versatility of gate arrays, with the off-the-shelf availability and time-to-market advantages of standard user-programmable devices. These devices can be reconfigured, allowing multiple revisions and future design generations to be accommodated by the same device, thus saving design and production costs, which makes them an ideal solution to the resource and financial constraints of the university environment.
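The thesis concerns a hardware realization, but the kind of cube-calculus operation the ILU accelerates can be illustrated in software. The Python sketch below shows cube intersection in positional-cube notation; it is a generic, textbook-style illustration under an assumed encoding, not a description of the CCM2's ILU.

```python
# Positional-cube notation: each variable is a pair of bits
#   01 -> positive literal, 10 -> negated literal, 11 -> don't care, 00 -> empty
POS, NEG, DC, EMPTY = (0, 1), (1, 0), (1, 1), (0, 0)

def cube_intersection(a, b):
    """Bitwise AND per variable; returns None if any field becomes 00 (empty cube)."""
    result = []
    for (a1, a0), (b1, b0) in zip(a, b):
        field = (a1 & b1, a0 & b0)
        if field == EMPTY:
            return None
        result.append(field)
    return result

# f = x1 * x3'  (x2 don't care)      g = x1 * x2
f = [POS, DC, NEG]
g = [POS, POS, DC]
print(cube_intersection(f, g))   # [(0, 1), (0, 1), (1, 0)]  ->  x1 * x2 * x3'
```

In the CCM2 this per-variable work is done by the iterative cells in parallel, with distributed control deciding how the partial results combine.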
144

Tree Restructuring Approach to Mapping Problem in Cellular Architecture FPGAs

Ramineni, Narahari 10 February 1995 (has links)
This thesis presents a new technique for mapping combinational circuits to Fine-Grain Cellular-Architecture FPGAs. We represent the netlist as a binary tree with decision variables associated with each node of the tree. The functionality of the tree nodes is chosen based on the target FPGA architecture. The proposed tree restructuring algorithms preserve local connectivity and allow direct mapping of the trees to the cellular array, thus eliminating the traditional routing phase; predictability of signal delays is another important advantage of the developed approach. The developed bus-assignment algorithm efficiently utilizes the medium-distance routing resources (buses). The method is general and can be used for any Fine-Grain CA-type FPGA. To demonstrate our techniques, the ATMEL 6000 series FPGA was used as a target architecture. An area and delay comparison between our methods and commercial tools is presented using a set of MCNC benchmarks, and final layouts of the implemented designs are included. Results show that the proposed techniques outperform the available commercial tools for ATMEL 6000 FPGAs in both area and delay optimization.
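As a minimal, hypothetical illustration of function-preserving tree restructuring (not the algorithms developed in the thesis), the Python sketch below rotates a node of a binary operator tree: the logic function is unchanged, but the tree shape, and hence how it would be placed on a cellular array, differs.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    op: str                        # 'AND', 'OR', or 'VAR'
    var: Optional[str] = None      # decision variable for leaves
    left: Optional['Node'] = None
    right: Optional['Node'] = None

def rotate_left(n: Node) -> Node:
    """Restructure  n = (a op (b op c))  into  ((a op b) op c).

    Valid only for associative ops; the logic function is unchanged,
    but the tree shape (and hence cell placement) differs.
    """
    assert n.right is not None and n.op == n.right.op
    pivot = n.right
    n.right = pivot.left
    pivot.left = n
    return pivot

# x1 AND (x2 AND x3)  ->  (x1 AND x2) AND x3
t = Node('AND', left=Node('VAR', 'x1'),
         right=Node('AND', left=Node('VAR', 'x2'), right=Node('VAR', 'x3')))
t = rotate_left(t)
print(t.left.left.var, t.left.right.var, t.right.var)   # x1 x2 x3
```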
145

Investigating Smoke Exposure and Chronic Obstructive Pulmonary Disease (COPD) with a Calibrated Agent-Based Model (ABM) of In Vitro Fibroblast Wound Healing

Ratti, James A 01 January 2018 (has links)
COPD is characterized by tissue inflammation and impaired remodeling, suggesting that fibroblast maintenance of structural homeostasis is dysregulated. Thus, we performed in vitro wound healing experiments on normal and diseased human lung fibroblasts and developed, in NetLogo, an ABM of fibroblasts closing a scratched monolayer, to evaluate differences due to COPD or cigarette smoke condensate exposure. The ABM consists of a rule set governing the healing response, accounting for cell migration, proliferation, death, activation and senescence rates, along with the effects of heterogeneous activation, phenotypic changes, serum deprivation and exposure to cigarette smoke condensate or bFGF. Simulations were performed to calibrate parameter sets for each cell type using in vitro data on scratch-induced migration, viability, senescence-associated beta-galactosidase and alpha-smooth muscle actin expression. Parameter sensitivities around each calibrated parameter set were analyzed. This model represents the prototype of a tool designed to explore fibroblast functions in the pathogenesis of COPD and to evaluate potential therapies.
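The thesis's model is built in NetLogo and calibrated to experimental data; the Python sketch below is only a schematic stand-in showing the kind of rule set a scratch-closure ABM uses (migration and proliferation into the cell-free gap, with an exposure factor scaling both rates). All rates and the exposure factor are invented placeholders, not calibrated values.

```python
import numpy as np

rng = np.random.default_rng(1)

W, H = 60, 30
grid = np.ones((H, W), dtype=bool)     # True = fibroblast present
grid[:, 25:35] = False                 # the "scratch" (cell-free wound)

P_MIGRATE, P_PROLIF = 0.5, 0.05        # per-cell, per-step rates (calibration targets)
EXPOSURE_FACTOR = 0.6                  # e.g. cigarette smoke condensate slowing both rates

p_m = P_MIGRATE * EXPOSURE_FACTOR
p_p = P_PROLIF * EXPOSURE_FACTOR

def neighbours(r, c):
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < H and 0 <= cc < W:
            yield rr, cc

for step in range(200):
    for r, c in zip(*np.nonzero(grid)):
        empty = [(rr, cc) for rr, cc in neighbours(r, c) if not grid[rr, cc]]
        if not empty:
            continue                   # contact inhibition: no free site to act on
        rr, cc = empty[rng.integers(len(empty))]
        if rng.random() < p_p:
            grid[rr, cc] = True        # proliferation: daughter fills the free site
        elif rng.random() < p_m:
            grid[rr, cc], grid[r, c] = True, False   # migration into the wound

print("wound closure:", grid[:, 25:35].mean())       # fraction of scratch re-covered
```

Calibration then amounts to adjusting such rates until the simulated closure matches the measured scratch-assay data for each cell type and exposure condition.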
146

Prilog razvoju metoda arhitektonskog projektovanja školskih zgrada / A contribution to the development of methods for the architectural design of school buildings

Ecet, Dejan 09 May 2019 (has links)
The central theme of this research is the application of the cellular automata computational model to the process of designing school buildings. The research predominantly addresses the use of contemporary computing technologies in the architectural design process, applied specifically to school buildings.
147

Geographical vector agents

Hammam, Yasser, n/a January 2008 (has links)
Simulating geographic phenomena in a realistic and plausible way requires real-world entities to be abstracted based on the dynamic physical characteristics they exhibit, and treated as individuals in a simulation domain. These processes cannot be adequately supported by the traditional spatial model based on cellular space, such as Cellular Automata (CA). Although this approach has received great attention as a favoured technique for simulating geographic phenomena from different aspects, the need for a generic spatial model to overcome the limitations encountered in such an approach has been raised. This applies particularly to the way real-world entities are represented in a simulation domain with regard to their physical characteristics and temporal aspects. In this thesis, a new computational approach for a spatial model suitable for simulating geographic phenomena is presented: the vector agents model. The vector agent is goal-oriented, adaptable, physically defined by a Euclidean geometry and able to change its own geometric characteristics while interacting with other agents in its neighbourhood using a set of rules. The agent is modelled with sensors, state, and strategies. The successful implementation of the model's architecture allows the representation of the physical characteristics of real-world entities and the observation of their complex and dynamic behaviour in a simulation domain. Vector agents developed out of a need to create a systematic basis for the geometric components of Geographic Automata Systems (GAS), as outlined by Torrens and Benenson (2005). A generic vector agents model was built, then tested and validated from different aspects, and the results demonstrated the model's efficiency. It is confirmed that vector agents are flexible in producing different complex shapes and patterns for recreating real geographic phenomena through the generic use of three algorithms of geometric manipulation: midpoint displacement using a relaxed Brownian Motion (fractal-like) algorithm, edge displacement and vertex displacement. The effectiveness of this was initially ascertained visually. A simple heuristic to govern shape growth rate and complexity was derived based on the interplay of the three algorithms. A further abstract model comparison against the cellular-agents environment showed that vector agents are able to produce patterns similar to those produced by cellular agents, with the advantage of representing entities as individuals with their own attributes and realistic geometric boundaries. On the other hand, the city, as a complex geographic phenomenon, was used as a specific domain for validating the model against a real-world system. The results of the urban land use simulations (driven by simple rules based on three classical urban theories) confirmed that: (a) the model is flexible enough to incorporate various external rules based on real-world systems, and (b) the model is capable of producing a variety of patterns, under several environments, that are close to actual patterns. The agent environment also proved to be an effective way of easily combining the rules associated with each urban theory (different agents behaved according to different theories). Finally, limitations raised through the development of this work are addressed, leading to an outline of possible extensions of both the model's computation and its domain of applications.
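Of the three geometric-manipulation algorithms listed above, midpoint displacement is the easiest to sketch. The Python fragment below is a generic midpoint-displacement routine on a polygon boundary, given only as a hedged illustration of the idea; it is not the thesis's relaxed Brownian Motion implementation, and the roughness parameter is an arbitrary assumption.

```python
import random

def midpoint_displace(polygon, roughness=0.3, rounds=3, seed=0):
    """Grow a polygon's boundary by repeatedly displacing edge midpoints.

    polygon   : list of (x, y) vertices in order
    roughness : maximum displacement as a fraction of edge length
    """
    rng = random.Random(seed)
    for _ in range(rounds):
        new_poly = []
        n = len(polygon)
        for i in range(n):
            (x1, y1), (x2, y2) = polygon[i], polygon[(i + 1) % n]
            new_poly.append((x1, y1))
            mx, my = (x1 + x2) / 2, (y1 + y2) / 2
            # displace the midpoint perpendicular to the edge by a random amount
            ex, ey = x2 - x1, y2 - y1
            length = (ex * ex + ey * ey) ** 0.5 or 1.0
            d = rng.uniform(-roughness, roughness) * length
            new_poly.append((mx - ey / length * d, my + ex / length * d))
        polygon = new_poly
    return polygon

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(len(midpoint_displace(square)))   # 4 * 2**3 = 32 vertices
```

Each round doubles the number of vertices, which is how an agent's boundary can become progressively more irregular while remaining an explicit vector geometry.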
148

Organizing a Global Coordinate System from Local Information on an Amorphous Computer

Nagpal, Radhika 29 August 1999 (has links)
This paper demonstrates that it is possible to generate a reasonably accurate coordinate system on randomly distributed processors, using only local information and local communication. By a coordinate system we mean that each element assigns itself a logical coordinate that maps to its global physical location, starting with no a priori knowledge of position or orientation. The algorithm presented is inspired by biological systems that use chemical gradients to determine the position of cells. Extensive analysis and simulation results are presented. Two key results are: there is a critical minimum average neighborhood size of 15 for good accuracy, and there is a fundamental limit on the resolution of any coordinate system determined strictly from local communication. We also demonstrate that, using this algorithm, random distributions of processors produce significantly better accuracy than regular processor grids, such as those used by cellular automata. This has implications for discrete models of biology as well as for building smart sensor arrays.
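The gradient idea can be sketched as follows: a few anchor processors flood hop-count gradients through the local communication graph, and every other processor converts the hop counts into distance estimates and multilaterates a logical coordinate. The Python code below is a rough stand-in under assumed constants (radio range, hops-to-distance factor, three anchors), not the paper's algorithm or its error analysis.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(2)

N, RADIUS = 400, 1.5
pts = rng.uniform(0, 10, size=(N, 2))          # randomly distributed processors
# local communication graph: neighbours within radio range
dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
adj = [np.nonzero((dist[i] < RADIUS) & (dist[i] > 0))[0] for i in range(N)]

def hop_counts(source):
    """Hop-count gradient flooded outward from one anchor processor (BFS)."""
    hops = np.full(N, -1)
    hops[source] = 0
    q = deque([source])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if hops[v] < 0:
                hops[v] = hops[u] + 1
                q.append(v)
    return hops

anchors = [0, 1, 2]                            # processors assumed to know their position
H = np.array([hop_counts(a) for a in anchors]) # one gradient per anchor
D = H * RADIUS * 0.7                           # crude hops -> distance conversion

# linearised multilateration of one node against the first anchor
node = 50
A = 2 * (pts[anchors[1:]] - pts[anchors[0]])
b = (D[0, node] ** 2 - D[1:, node] ** 2
     + np.sum(pts[anchors[1:]] ** 2, axis=1) - np.sum(pts[anchors[0]] ** 2))
est, *_ = np.linalg.lstsq(A, b, rcond=None)
print("true", pts[node], "estimated", est)
```

The paper's results about a minimum average neighborhood size and a resolution limit can be read as statements about how noisy this hop-to-distance conversion is for sparse or regular layouts.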
149

Coupling, space and time mixing for parallel stochastic dynamics

Louis, Pierre-Yves January 2004 (has links)
We first introduce a coupling of a finite number of Probabilistic Cellular Automata (PCA) dynamics that preserves stochastic ordering. Using this tool, for a general attractive probabilistic cellular automaton on S^{Z^d}, where S is finite, we prove that a condition (A) is equivalent to exponentially fast (time-) convergence towards equilibrium of this Markovian parallel dynamics, in the uniform norm. This condition (A) means exponential decay of the influence from the boundary for the invariant measures of the system restricted to finite ‘box’ volumes. For a class of reversible PCA dynamics on {−1, +1}^{Z^d} with a naturally associated Gibbsian potential ϕ, we prove that a Weak Mixing condition for ϕ implies the validity of assumption (A); thus the ‘exponential ergodicity’ of the dynamics towards the unique Gibbs measure associated with ϕ holds. On some particular examples of this PCA class, we verify that our assumption (A) is weaker than the Dobrushin-Vasershtein ergodicity condition. For some special PCA, the ‘exponential ergodicity’ holds as soon as there is no phase transition.
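For readers unfamiliar with the objects involved, the toy Python sketch below shows what a synchronous (parallel) update of an attractive PCA looks like on a finite torus: every site resamples its spin simultaneously and independently, with a probability that is monotone in the neighbouring spins. The local rule and parameter values are illustrative assumptions and are not taken from this thesis.

```python
import numpy as np

rng = np.random.default_rng(3)

L, BETA, STEPS = 64, 0.8, 200
sigma = rng.choice([-1, 1], size=(L, L))            # initial configuration on an L x L torus

def parallel_update(sigma):
    """One synchronous PCA step: every site resamples independently,
    given the *previous* configuration (heat-bath style local rule)."""
    field = (np.roll(sigma, 1, 0) + np.roll(sigma, -1, 0)
             + np.roll(sigma, 1, 1) + np.roll(sigma, -1, 1))
    p_plus = 1.0 / (1.0 + np.exp(-2.0 * BETA * field))  # increasing in the local field
    return np.where(rng.random((L, L)) < p_plus, 1, -1)

for _ in range(STEPS):
    sigma = parallel_update(sigma)
print("magnetisation:", sigma.mean())
```

Monotonicity of the update probability in the surrounding configuration is what ‘attractive’ refers to here, and it is the property that the order-preserving coupling in the abstract exploits.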
150

Increasing coupling for probabilistic cellular automata

Louis, Pierre-Yves January 2005 (has links)
We give a necessary and sufficient condition for the existence of an increasing coupling of N (N ≥ 2) synchronous dynamics on S^{Z^d} (PCA). Increasing means the coupling preserves stochastic ordering. We first present our main construction theorem in the case where S is totally ordered; applications to attractive PCAs are given. When S is only partially ordered, we show on two examples that a coupling of more than two synchronous dynamics may not exist. We also prove an extension of our main result for a particular class of partially ordered spaces.
