1 |
DATA ARTICULATION. BARR, DAVID F. 09 October 2007 (has links)
No description available.
|
2 |
Metapolis : virtual reality vs. real virtuality in a digital art pavilion. Kruger, Leanne 30 November 2011 (has links)
This dissertation focuses on architecture in the information age. Information technology is evolving at an alarming rate, which opens up a vast landscape of possibilities within the architectural realm. These possibilities are discussed and implemented in an architectural intervention, with a specific focus on the relationship between the real and the virtual. A digital art pavilion is proposed on the corner of Proes and van der Walt streets in Pretoria CBD, where the Munitoria Complex (Tshwane Municipal Offices) is currently situated. This intervention should act as a catalyst for positive change by narrowing the digital divide that currently causes social and cultural segregation, and by providing a tool for upliftment that informs city dwellers, thereby countering the current "culture of ignorance" with a culture of knowledge. / Dissertation (MArch(Prof))--University of Pretoria, 2011. / Architecture / unrestricted
|
3 |
Architectures Numériques et Résurgence Baroque : Bernard Cache, Greg Lynn et le Pli de Deleuze / Digital Architectures and Baroque Resurgence : Bernard Cache, Greg Lynn and Deleuze's Fold. Plihon, Florence 07 October 2016 (has links)
Around the 1990s, the word baroque resurfaced in certain discourses on nascent digital architecture and on the changes brought about by new computer-aided design and production software. Can some architects be said to go so far as to extend a baroque impetus into new modes of exploring architectural form with digital tools? This thesis proposes an approach combining philosophy, anthropology and linguistics to analyse the discourses of two architects who pioneered the development of these computing technologies. Greg Lynn (USA, b. 1967) approaches design through algorithms that generate so-called complex forms, while Bernard Cache (France, b. 1958) explores and theorizes the file-to-factory, a continuous chain between design and production. The two converge in their interpretation of Gilles Deleuze's The Fold: Leibniz and the Baroque (1988), a philosophical work from which they draw numerous concepts to address the themes of continuity, the infinite variation of forms, and the non-standard. The analysis concentrates on the role of the baroque in their discursive strategies as a homology structuring their thought, and brings out its fictional dimension. Beyond constituting a founding archetype of these architects' architectural imaginary, the notion serves on the one hand as an operative tool for their theoretical and architectural production, and on the other is instrumentalized to respond to the intellectual challenges of their time.
|
4 |
The aesthetics of emergence. Ednie-Brown, Pia Hope, pia@rmit.edu.au January 2008 (has links)
Principles of design composition are commonly understood to pertain to geometrical systems for arranging parts in assembling a formal whole. Connection to socio-cultural 'meaning' and relevance arguably occurs primarily via the assumed divinity or universality of these systems. In the contemporary architectural world, where explicitly held beliefs in fundamental, geometrically defined principles or values have dissipated, guiding principles of composition appear to be obsolete. This seems particularly true in relation to work that highlights process - or change, responsiveness, interactivity and adaptability - since this implies that the composition remains in flux and unable to be grounded in the composition of form. While processually inflected architecture (referred to here as 'processual architecture') has been an active field since at least the 1960s, it has developed significantly since design experiments involving digital computation intensified in the 1990s. For this field of work, both highly celebrated and criticised as superficial or unethical, any connection to 'meaning' or value that might be offered by principles of composition would appear especially lost. This thesis reviews, counterpoises and reorients these assumptions, arguing a case for the value of processual architecture that has not previously been articulated. After the last 10 to 15 years of digital experimentation, it is clear that digital technology in itself is not the primary issue, but simply part of a complex equation. The thesis articulates this 'equation' through the model of emergence, which has been used in the field with increasing prominence in recent years. Through both practice-based research and theoretical development, a processually inflected theory of composition is proposed. This offers pathways through which the potential of processual architecture might be productively developed, aiming to open this field of work into a deeper engagement with pressing contemporary socio-political issues. The thesis demonstrates how the cultivation of particular modes of attention and engagement, found to hold an implicit but nevertheless amplified significance within processual architecture, makes it possible to develop an embodied awareness pertaining to an 'ethico-aesthetic know-how'. This know-how is acquired and matured through attention to the affective dimensions that arise through design activity. The thesis highlights aspects of design process and products that are routinely suppressed in architectural discourse, generating new insights into the importance of affect for design process, design products and the relations between them. The ethical dimensions of such an approach become especially poignant through the explicit connection made between design activity and the practices of everyday life. Relationships between architecture and the social become re-energised, in a radically alternative manner to the social agendas of modernism or the more literary critiques of post-modernism. Through detailed discussions of the specific, local conditions of a series of design projects I have undertaken, I argue how and why close attention to the affective dimensions of design process offers new and productive ways to approach research through design practice. This offers a response to calls for new 'post-critical' forms of research by empowering both sides of a previously held divide: theory and practice.
|
5 |
From Numbers To Digits: On The Changing Role Of Mathematics In Architecture. Koc, Betul 01 June 2008 (has links) (PDF)
This study is a critical reconsideration of architecture's affiliation with mathematics and geometry, both as a practical instrument and as a theoretical reference. The thesis claims that mathematics and its methodological structure have provided architects with an ultimate foundation and a strong reference outside architecture itself ever since the initial formations of architectural discourse. However, the definitive assumptions and epistemological consequences of this grounding in mathematical clarity, methodological certainty and instrumental precision take on a new significance with the introduction of digital technologies. Since digital technologies offer a new formation for this affiliation, whether through their claim to a better geometric representation or to mathematical controllability of physical reality (space), the specific focus on these newly emerging technologies is developed within a theoretical frame presenting the significant points of mathematics in architecture.
|
6 |
Digital Architecture As The Extension Of Physical Spaces: Asymptote. Ayoglu, Halil 01 January 2005 (has links) (PDF)
The aim of this thesis is to develop an understanding of digital architecture as an extension of physical spaces. The thesis claims that Virtual Reality Environments (VREs) coexist with, supplement, support and extend physical environments. VREs enable users to deal with a manipulable, multi-dimensional, interactive digital environment. Asymptote's New York Stock Exchange Three-Dimensional Trading Floor (NYSE 3DTF) VRE is a significant example for analyzing digital architecture in this perspective. The 3DTF is a project where architecture and information bring each other into a new meaning through the spatialization of information in the digital medium. The thesis analyzes how the 3DTF VRE becomes an extension of the existing actual NYSE in terms of four tools of analysis: visualization, navigation, interaction, and data integration. This thesis proposes to rethink architecture's relation with information through an understanding of extension.
|
7 |
Vertikální farma / Vertical Farm. Hurník, Václav Unknown Date (has links)
Vertical farm: the word "vertical" captures the essence of this type of cultivation. So why do we build vertical farms in sheet-metal halls, exploiting only the potential of racks stacked on top of one another? Let us instead use the potential of this technology, and the great adaptability of plants to new environments, to create a new food source that operates locally and continuously. This would not only make local and foreign foods available regardless of season or a changing climate; by narrowing the economic gap it would also substantially reduce intercontinental and intracontinental transport, cutting carbon emissions and freeing agricultural land that could be returned to forest, which is more environmentally beneficial.
|
8 |
Digital Hardware Architectures for Exact and Approximate DCT Computation Using Number Theoretic Techniques. Edirisuriya, Amila 21 May 2013 (has links)
No description available.
|
9 |
Numérisation rapide d'un système synchronisé en sortie d'antennes multi-réparties tel que le Radiohéliographe de Nançay / High speed digital synchronized system for antenna array such as Nançay Radioheliograph. Ait Mansour, El Houssain 19 January 2018 (links)
The Nançay Radioheliograph is the only instrument dedicated to imaging the Sun at decimetre-metre wavelengths (150-450 MHz). It operates on the principle of interferometry, using 47 antennas distributed mainly along east-west (3.2 km) and north-south (2.5 km) axes. This study explores a new digitization concept for future radio astronomy, applied here to solar interferometry: high-speed digitization of a synchronized system directly at the antenna outputs. High-speed digitization and accurate synchronization are of prime importance for future radio telescopes: they simplify the radio-frequency receiving chains and reduce power consumption as well as maintenance cost and complexity. Observing the Sun, however, brings specific constraints, such as the very large dynamic range of the signals, which are not taken into account in current studies for future radio telescopes. The present radioheliograph has an analogue receiving chain with centralized digitization. Switching between frequencies in the 150-450 MHz band (10 channels, each 1 MHz wide) is done in an analogue, time-multiplexed way; this requires extensive analogue calibration, freezes the set of observable frequencies, introduces latency in solar image processing and degrades the signal-to-noise ratio. In addition, in metric interferometry, the very long coaxial cables that carry the signals from the antennas to the receiver are costly and are a constant source of errors and fluctuations that degrade the radio-frequency information. Digitizing the full 300 MHz band instead brings flexibility in data processing and analysis (frequency resolution, simultaneous observation of several bands, interference mitigation), but it demands very accurate clocks (0.7 ps of phase error) to drive wide-band analog-to-digital converters (ADCs) running on a 1 GHz clock. The main objective of this thesis is therefore to study synchronization for distributed antenna arrays and to reach sub-nanosecond global time synchronization of networks such as the Nançay Radioheliograph. The technological leap involved, and the concepts studied, are a growing challenge in major European and international projects.
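A rough back-of-the-envelope sketch (not taken from the thesis) of why a sub-picosecond clock requirement follows from direct digitization of the band: the standard aperture-jitter bound, SNR ≈ -20·log10(2π·f_in·t_j), caps the signal-to-noise ratio and effective resolution of an ADC at a given input frequency. The 450 MHz evaluation point below is simply the top of the observing band quoted in the abstract, and the 0.7 ps figure is the clock phase error mentioned above.

```python
import math

def jitter_limited_snr_db(f_in_hz: float, jitter_s: float) -> float:
    """Upper bound on SNR (dB) imposed by sampling-clock jitter alone."""
    return -20.0 * math.log10(2.0 * math.pi * f_in_hz * jitter_s)

def effective_bits(snr_db: float) -> float:
    """Translate an SNR figure into an effective number of bits (ENOB)."""
    return (snr_db - 1.76) / 6.02

if __name__ == "__main__":
    # 0.7 ps of clock phase error, evaluated at the top of the 150-450 MHz band.
    snr = jitter_limited_snr_db(450e6, 0.7e-12)
    print(f"jitter-limited SNR : {snr:.1f} dB")    # roughly 54 dB
    print(f"jitter-limited ENOB: {effective_bits(snr):.1f} bits")
```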
|
10 |
Méthodes et outils pour l'analyse tôt dans le flot de conception de la sensibilité aux soft-erreurs des applications et des circuits intégrés / Methods and tools for the early analysis in the design flow of the sensitivity to soft-errors of applications and integrated circuits. Mansour, Wassim 31 October 2012 (links)
Shrinking transistor feature sizes increases the sensitivity of integrated circuits to soft errors induced by the energetic particles present in the environments in which they operate. An experiment presented in this thesis, studying the soft-error sensitivity in a real environment of SRAM memories from two successive technology generations, highlights how critical this issue has become. It shows the need to evaluate the sensitivity of circuits to radiation effects, especially commercial circuits, which are increasingly used in space and avionics applications and even at high altitudes, in order to identify appropriate hardening methodologies. Several fault-injection methods aimed at evaluating the soft-error sensitivity of integrated circuits have been the subject of previous research. The work carried out in this thesis developed an automatable method, together with its tool, for emulating radiation effects on circuits for which an HDL description is available. This method, called NETFI (NETlist Fault Injection), manipulates the netlist of the synthesized circuit to inject faults of SEU, SET and stuck-at types. NETFI was applied to several architectures in order to assess its potential and efficiency. A study of a fault-tolerant, so-called self-convergent algorithm executed by a LEON3 processor is also presented, in order to compare the results obtained with NETFI to those obtained with a state-of-the-art method called CEU (Code Emulated Upset).
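As a purely illustrative sketch of the general netlist-level fault-injection idea (a toy model in Python, not the NETFI tool itself, which operates on the HDL netlist of a synthesized circuit, and which also covers SET pulses not modelled here), a combinational netlist can be represented as a topologically ordered list of gates; a chosen net is then corrupted with an SEU-style bit flip or a stuck-at value and the faulty output is compared with a golden run:

```python
import random
from typing import Callable, Dict, List, Optional, Tuple

# Each gate: (output net, boolean function, input nets); the list is assumed
# to be in topological order, as a synthesized combinational netlist would be.
Gate = Tuple[str, Callable[..., int], List[str]]

def evaluate(netlist: List[Gate], inputs: Dict[str, int],
             fault: Optional[Tuple[str, str]] = None) -> Dict[str, int]:
    """Evaluate the netlist; optionally corrupt one net.

    `fault` is (net_name, kind) with kind in {"seu", "stuck0", "stuck1"}.
    """
    values = dict(inputs)
    for out, fn, ins in netlist:
        v = fn(*(values[i] for i in ins))
        if fault is not None and fault[0] == out:
            kind = fault[1]
            if kind == "seu":        # single-event upset: flip the bit
                v ^= 1
            elif kind == "stuck0":   # stuck-at-0
                v = 0
            elif kind == "stuck1":   # stuck-at-1
                v = 1
        values[out] = v
    return values

# Toy circuit: a 1-bit full adder.
full_adder: List[Gate] = [
    ("x",    lambda a, b: a ^ b,  ["a", "b"]),
    ("sum",  lambda x, c: x ^ c,  ["x", "cin"]),
    ("t1",   lambda a, b: a & b,  ["a", "b"]),
    ("t2",   lambda x, c: x & c,  ["x", "cin"]),
    ("cout", lambda p, q: p | q,  ["t1", "t2"]),
]

stimulus = {"a": 1, "b": 1, "cin": 0}
golden = evaluate(full_adder, stimulus)
target = random.choice([g[0] for g in full_adder])   # pick one net to corrupt
faulty = evaluate(full_adder, stimulus, fault=(target, "seu"))
print("golden:", golden)
print(f"SEU on net '{target}':", faulty)
```

Running many such injections over randomly chosen nets and stimuli, and classifying the runs whose outputs diverge from the golden reference, is the kind of campaign that a netlist-level injection flow automates on the real synthesized design.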
|