1

Performance driven design systems in practice

Joyce, Sam January 2016 (has links)
This thesis is concerned with the application of computation in the context of professional architectural practice, specifically towards defining complex buildings that are highly integrated with respect to design and engineering performance. The thesis represents applied research undertaken whilst in practice at Foster + Partners. It reviews the current state of the art of computational design techniques to quickly but flexibly model and analyse building options. The application of parametric design tools to active design projects is discussed with respect to real examples, as well as methods to link the geometric definitions to structural engineering analysis to provide performance data in near real time. The practical interoperability between design software and engineering tools is also examined. The role of performance data in design decision making is analysed by comparing manual workflows with methods assisted by computation. This extends to optimisation methods which, by making use of design automation, actively make design decisions to return optimised results. The challenges and drawbacks of using these methods effectively in real design situations are discussed, especially their limitations with respect to incomplete problem definitions and design exploration that results in modified performance requirements. To counter these issues, a performance-driven design workflow is proposed. This is a mixed-initiative approach whereby designer-centric understanding and decisions are computer assisted. Flexible meta-design descriptions that encapsulate the variability of the design space under consideration are explored and compared with existing optimisation approaches. Computation is used to produce and visualise the performance data from the large design spaces generated by parametric design descriptions and associated engineering analysis. Novel methods are introduced that define a design and performance space using cluster computing to speed up the generation of large numbers of options. Data visualisation is applied to design problems, showing how in real situations it can aid design orientation and decision making using the large amount of data produced. Strategies to enable these workflows are discussed and implemented, focusing on re-appropriating existing web design paradigms in a modular approach concentrating on scalable data creation and information display.
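As a purely illustrative aside (not code from the thesis), the sketch below shows, in Python, the general pattern of generating a parametric design space and evaluating the options in parallel to produce performance data for later visualisation. The parameter names, the evaluate_option function and the toy deflection metric are all invented assumptions, standing in for a parametric model plus structural analysis run.

```python
# Hedged sketch: parallel evaluation of a parametric design space.
# All names and numbers are hypothetical; not the thesis's tools.
from itertools import product
from multiprocessing import Pool

def evaluate_option(params):
    """Stand-in for one parametric model build plus structural analysis run."""
    span, depth = params
    deflection = span ** 3 / (max(depth, 1e-6) * 1e4)  # toy performance metric
    return {"span": span, "depth": depth, "deflection": deflection}

if __name__ == "__main__":
    spans = [20.0, 25.0, 30.0, 35.0]     # metres (hypothetical)
    depths = [0.5, 0.75, 1.0, 1.25]      # metres (hypothetical)
    design_space = list(product(spans, depths))

    with Pool() as pool:                 # distribute options across worker processes
        results = pool.map(evaluate_option, design_space)

    # The resulting performance data could feed an interactive visualisation
    best = min(results, key=lambda r: r["deflection"])
    print(f"{len(results)} options evaluated; stiffest option: {best}")
```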
2

In search of the DomoNovus : speculative designs for the computationally-enhanced domestic environment

Didakis, Stavros January 2017 (has links)
The home is a physical place that provides isolation, comfort and access to essential needs on a daily basis, and it has a strong impact on a person’s life. Computational and media technologies (digital and electronic objects, devices, protocols, virtual spaces, telematics, interaction, social media, and cyberspace) have become an important and vital part of the home ecology, and they have the ability to transform the domestic experience and the understanding of what a personal space is. For this reason, this work investigates the domestication of computational media technology: how objects, systems, and devices become part of the personal and intimate space of the inhabitants. To better understand this taming process, the home is studied and analysed from a range of perspectives (philosophy, sociology, architecture, art, and technology), and a methodological process is proposed for critically exploring the topic through the development of artworks, designs, and computational systems. The methodology of this research, which consists of five points (Context, Media Layers, Invisible Matter, Diffusion, and Symbiosis), suggests a procedure that is fundamental to the development and critical integration of the computationally enhanced home. Accordingly, the home is observed as an ecological system that contains numerous properties (organic, inorganic, hybrid, virtual, augmented) and is viewed on a range of scales (micro, meso and macro). To identify the “choreographies” that form between these properties and scales, case studies have been developed to suggest, provoke, and speculate on concepts, ideas, and alternative realities of the home. Part of this speculation proposes the concept of DomoNovus (the “New Home”), in which technological ubiquity supports the inhabitants’ awareness, perception, and imagination. DomoNovus intends to challenge our understanding of the domestic environment and demonstrates a range of possibilities, threats, and limitations in relation to the future of the home. This thesis thus presents methods, experiments, and speculations that intend to inform and inspire, as well as define creative and imaginative dimensions of the computationally-enhanced home, suggesting directions for the further understanding of domestic life.
3

A Computational Architecture Methodology For Design In Traditional Tissue: The Case Of Kalkan

Kutay, Karabag 01 September 2010 (has links)
This study aims to address the problem of 'new building in a traditional setting' using computational design tools. The intention is to provide a methodology for the analysis of the architectural features of a traditional tissue and, moreover, to propose computational design strategies that use algorithms to process the analytical data in the service of new building design. The introduction sets out this goal, together with a critical discussion, from a conservationist perspective, of contemporary examples of computational design. Contemporary digital tools and methods employed in the field of architecture are discussed with a focus on algorithmic approaches, followed in the next chapter by a brief history of the use of computational tools and of digital design philosophy. Organic architecture is also discussed as a complex entity composed of integral elements and their relations, as well as the designer
4

Computational workflow management for conceptual design of complex systems : an air-vehicle design perspective

Balachandran, Libish Kalathil January 2007 (has links)
The decisions taken during the aircraft conceptual design stage are of paramount importance, since these commit up to eighty percent of the product life cycle costs. Thus, in order to obtain a sound baseline which can then be passed on to the subsequent design phases, various studies ought to be carried out during this stage. These include trade-off analysis and multidisciplinary optimisation performed on computational processes assembled from hundreds of relatively simple mathematical models describing the underlying physics and other relevant characteristics of the aircraft. However, the growing complexity of aircraft design in recent years has prompted engineers to substitute the conventional algebraic equations with compiled software programs (referred to as models in this thesis) which still retain the mathematical models, but allow for a controlled expansion and manipulation of the computational system. This tendency has posed the research question of how to dynamically assemble and solve a system of non-linear models. In this context, the objective of the present research has been to develop methods which significantly increase the flexibility and efficiency with which the designer is able to operate on large-scale computational multidisciplinary systems at the conceptual design stage. In order to achieve this objective, a novel computational process modelling method has been developed for generating computational plans for a system of non-linear models. The computational process modelling was subdivided into variable flow modelling, decomposition and sequencing. A novel method, named the Incidence Matrix Method (IMM), was developed for variable flow modelling, which is the process of identifying the data flow between the models based on a given set of input variables. This method has the advantage of rapidly producing feasible variable flow models for a system of models with multiple outputs. In addition, criteria were derived for choosing the optimal variable flow model, which would lead to faster convergence of the system. Cont/d.
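By way of illustration only, and not the thesis's actual Incidence Matrix Method, the Python sketch below shows the general idea behind planning a computational process from variable dependencies: each model declares its input and output variables, and the models are sequenced so that each runs only once its inputs are available. The three models and their variables are hypothetical.

```python
# Hedged sketch: sequencing a system of models from input/output dependencies.
# Models and variables are invented for illustration; this is not IMM itself.
models = {
    "aero":    {"inputs": {"wing_area", "cruise_speed"}, "outputs": {"lift", "drag"}},
    "weights": {"inputs": {"wing_area", "fuel_mass"},    "outputs": {"mtow"}},
    "mission": {"inputs": {"lift", "drag", "mtow"},      "outputs": {"range"}},
}

def sequence(models, known_inputs):
    """Order the models so each runs only once all of its inputs are available."""
    known, plan, pending = set(known_inputs), [], dict(models)
    while pending:
        ready = [name for name, io in pending.items() if io["inputs"] <= known]
        if not ready:
            raise ValueError(f"Coupled models require iterative solution: {sorted(pending)}")
        for name in ready:
            plan.append(name)
            known |= pending.pop(name)["outputs"]
    return plan

print(sequence(models, {"wing_area", "cruise_speed", "fuel_mass"}))
# expected order: ['aero', 'weights', 'mission']
```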
5

A multi-paradigm modelling framework for simulating biocomplexity

Kaul, Himanshu January 2013 (has links)
The following thesis presents a computational framework that can capture inherently non-linear and emergent biocomplex phenomena. The main motivation behind the investigations undertaken was the absence of a suitable platform that can simulate both the continuous features and the discrete, interaction-based dynamics of a given biological system: in short, dynamic reciprocity. In order to determine the most powerful approach to achieve this, the efficacy of two modelling paradigms, transport phenomena and agent-based modelling, was evaluated, and the two were eventually combined. Computational Fluid Dynamics (CFD) was utilised to investigate optimal boundary conditions, in terms of meeting cellular glucose consumption requirements and exposure to physiologically relevant shear fields, that would support mesenchymal stem cell growth in a 3-dimensional culture maintained in a commercially available bioreactor. In addition to validating the default bioreactor configuration and operational parameter ranges as suitable for sustaining stem cell growth, the investigation underscored the effectiveness of CFD as a design tool. However, due to the homogeneity assumption, which is untenable for most biological systems, CFD often encounters difficulties in simulating the interaction-reliant evolution of cellular systems. Therefore, the efficacy of the agent-based approach was evaluated by simulating a morphogenetic event: the development of an in vitro osteogenic nodule. The novel model replicated most aspects observed in vitro, including the spatial arrangement of relevant players inside the nodule, the interaction-based development of the osteogenic nodules, and the dependence of nodule growth on its size. The model was subsequently applied to interrogate the various competing hypotheses on this process and identify the one that best captures the transformation of osteoblasts into osteocytes, a subject of great conjecture. The results from this investigation refuted one of the competing hypotheses, which posited a slow-down in the rate of matrix deposition by certain osteoblasts, and also suggested that the acquisition of polarity is a non-random event. The agent-based model, however, being inherently computationally expensive, cannot be recommended for modelling bulk phenomena. Therefore, the two approaches were integrated to create a modelling platform that was utilised to capture dynamic reciprocity in a bioreactor. As part of this investigation, an amended definition of dynamic reciprocity and its computational analogue, dynamic assimilation, were proposed. The multi-paradigm platform was validated by simulating melanoma chemotaxis under a foetal bovine serum gradient. With its CFD and agent-based modalities, the platform can be employed both as a design optimisation tool and as a hypothesis testing tool.
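As a hedged illustration of the multi-paradigm idea, and not code from the thesis, the sketch below couples a continuous 1-D concentration field with discrete agents whose movement is biased up the gradient, a toy analogue of chemotaxis under a serum gradient. The field, the step rule and all parameter values are invented assumptions.

```python
# Hedged sketch: discrete agents responding to a continuous field (toy chemotaxis).
# Field, rules and parameters are illustrative only.
import random

def serum_field(x, length=100):
    """Toy continuous field: a linear chemoattractant gradient along a 1-D domain."""
    return x / length

def step(agents, field, length=100, bias=0.4):
    """Each agent compares the field one step up and down and moves preferentially uphill."""
    moved = []
    for x in agents:
        up, down = field(min(x + 1, length)), field(max(x - 1, 0))
        p_up = 0.5 + bias * (1 if up > down else -1 if down > up else 0)
        direction = 1 if random.random() < p_up else -1
        moved.append(min(max(x + direction, 0), length))
    return moved

agents = [random.randint(0, 20) for _ in range(50)]   # cells seeded at low concentration
for _ in range(200):
    agents = step(agents, serum_field)

mean_pos = sum(agents) / len(agents)
print(f"Mean cell position after 200 steps: {mean_pos:.1f} (gradient source at x = 100)")
```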
6

Enhancing numerical modelling efficiency for electromagnetic simulation of physical layer components

Sasse, Hugh Granville January 2010 (has links)
The purpose of this thesis is to present solutions to overcome several key difficulties that limit the application of numerical modelling in communication cable design and analysis. In particular, the specific limiting factors are that simulations are time consuming, and that the process of comparison requires skill and is poorly defined and understood. When much of the process of design consists of optimisation of performance within a well-defined domain, the use of artificial intelligence techniques may reduce or remove the need for human interaction in the design process. The automation of human processes allows round-the-clock operation at a higher throughput. Achieving a speedup would permit greater exploration of the possible designs, improving understanding of the domain. This thesis presents work that relates to three facets of the efficiency of numerical modelling: minimising simulation execution time, controlling optimisation processes and quantifying comparisons of results. These topics are of interest because simulation times for most problems run into tens of hours. The design process for most systems being modelled may be considered an optimisation process in so far as the design is improved based upon a comparison of the test results with a specification. Development of software to automate this process permits the improvements to continue outside working hours, and produces decisions unaffected by the psychological state of a human operator. Improved performance of simulation tools would facilitate exploration of more variations on a design, which would improve understanding of the problem domain, promoting a virtuous circle of design. The minimisation of execution time was achieved through the development of a Parallel TLM Solver which did not use specialised hardware or a dedicated network. Its design was novel because it was intended to operate on a network of heterogeneous machines in a manner which was fault tolerant, and included a means to reduce the vulnerability of simulated data without encryption. Optimisation processes were controlled by genetic algorithms and particle swarm optimisation, which were novel applications in communication cable design. The work extended the range of cable parameters, reducing conductor diameters for twisted pair cables, and reducing the optical coverage of screens for a given shielding effectiveness. Work on the comparison of results introduced "colour maps" as a way of displaying three scalar variables over a two-dimensional surface, and comparisons were quantified by extending 1D Feature Selective Validation (FSV) to two dimensions, using an ellipse-shaped filter, in such a way that it could be extended to higher dimensions. In so doing, some problems with FSV were detected, and suggestions for overcoming these are presented, such as the special case of zero-valued DC signals. A re-description of Feature Selective Validation using Jacobians and tensors is proposed, in order to facilitate its implementation in higher-dimensional spaces.
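For readers unfamiliar with particle swarm optimisation, one of the techniques applied in this work, the following is a minimal generic PSO loop in Python. It minimises a toy sphere function rather than any cable model from the thesis, and the swarm size, inertia and acceleration coefficients are illustrative defaults.

```python
# Hedged sketch: a basic particle swarm optimiser on a toy objective.
# Not the thesis's implementation; all parameter values are illustrative.
import random

def pso(objective, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal PSO: minimises `objective` starting from a [-5, 5]^dim box."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # each particle's best position so far
    pbest_val = [objective(p) for p in pos]
    gbest = min(zip(pbest_val, pbest))[1][:]         # best position found by the swarm

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < objective(gbest):
                    gbest = pos[i][:]
    return gbest

# Toy objective (sphere function) standing in for a simulated cable performance metric
print(pso(lambda p: sum(x * x for x in p)))
```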
7

The identification & optimisation of endogenous signalling pathway modulators

Gianella-Borradori, Matteo Luca January 2013 (has links)
Chapter 1 provides an overview of drug discovery with particular emphasis on library selection and hit identification methods using virtual-based approaches. Chapter 2 gives an outline of the bone morphogenetic protein (BMP) signalling pathway and literature BMP pathway modulators. The association between regulation of the BMP pathway and cardiomyogenesis is also described. Chapter 3 describes the use of ligand-based virtual screening to discover small-molecule activators of the BMP signalling pathway. A robust cell-based BMP-responsive gene activity reporter assay was developed to test the libraries of small molecules selected. Hit molecules from the screen were synthesised to validate activity. It was found that a group of known histone deacetylase (HDAC) inhibitors displayed the most promising activity. These were evaluated in a secondary assay measuring the expression of two BMP pathway regulated genes, hepcidin and Id1, using reverse transcription polymerase chain reaction (RT-PCR). Compound 188 was discovered to increase expression of both BMP-responsive genes. Chapter 4 provides an overview of existing cannabinoid receptor (CBR) modulating molecules and their connection to the progression of atherosclerosis. Chapter 5 outlines the identification and optimisation of selective small-molecule agonists acting at the cannabinoid 2 receptor (CB2R). A ligand-based virtual screen was undertaken and promising hits were synthesised to allow a structure-activity relationship (SAR) to be developed around the hit molecule, providing further information on the functional groups tolerated at the active site. Subsequent studies led to the investigation and optimisation of physicochemical properties around compound 236, leading to the development of a suitable compound for in vivo testing. Finally, a CB2R-selective compound with favourable physicochemical properties was evaluated in vivo in a murine inflammation model and displayed reduced recruitment of monocytes to the site of inflammation.
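As a purely illustrative aside, the core ranking step of a ligand-based virtual screen can be sketched as scoring library members by Tanimoto similarity of binary fingerprints to a known active. The fingerprints and compound names below are invented bit sets, not chemistry from the thesis.

```python
# Hedged sketch: ranking a toy compound library by Tanimoto similarity to a reference.
def tanimoto(a, b):
    """Tanimoto coefficient between two fingerprints represented as sets of on-bits."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

reference = {1, 4, 7, 9, 15, 22}                 # fingerprint of a known active (invented)
library = {
    "cmpd_A": {1, 4, 7, 9, 15, 30},
    "cmpd_B": {2, 5, 11, 19},
    "cmpd_C": {1, 4, 9, 15, 22, 23},
}

ranked = sorted(library.items(), key=lambda kv: tanimoto(reference, kv[1]), reverse=True)
for name, fp in ranked:
    print(f"{name}: {tanimoto(reference, fp):.2f}")
```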
