71.
A generic processing in memory cycle accurate simulator under hybrid memory cube architecture / Um simulador genérico ciclo-acurado para processamento em memória baseado na arquitetura da hybrid memory cube
Oliveira Junior, Geraldo Francisco de, January 2017
PIM, a technique in which computational elements are added close to, or ideally inside, memory devices, was one of the attempts made during the 1990s to mitigate the memory wall problem. Nowadays, with the maturation of 3D integration technologies, a new landscape for novel PIM architectures can be investigated. To exploit this new scenario, researchers rely on software simulators to navigate the design evaluation space. Today, most works targeting PIM implement in-house simulators to perform their experiments. However, this methodology can hurt overall productivity and preclude replicability. In this work, we show the development of a precise, modular, and parametrized PIM simulation environment. Our simulator, named CLAPPS, targets the HMC architecture, a popular 3D-stacked memory widely employed in state-of-the-art PIM accelerators. We have designed our mechanism using the SystemC programming language, which natively allows parallel simulation. The primary contribution of our work lies in developing a user-friendly interface that allows easy exploration of PIM architectures. To evaluate our system, we have implemented a PIM module that can perform vector operations with different operand sizes using the proposed set of tools.
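The abstract gives no implementation detail beyond the use of SystemC. As a rough illustration of the behavior a cycle-accurate PIM vector unit has to model, here is a minimal sketch; the class name, lane count, and latency below are illustrative assumptions, not taken from CLAPPS (which is written in SystemC/C++, not Python).

```python
# Minimal sketch of a cycle-accurate PIM vector unit, loosely in the spirit
# of a simulator like CLAPPS. The lane count and per-batch latency are
# illustrative assumptions only, not values from the thesis.

class PIMVectorUnit:
    """Executes element-wise vector ops near memory, one lane-wide batch per cycle."""

    def __init__(self, lanes=16, add_latency=1):
        self.lanes = lanes              # elements processed per cycle
        self.add_latency = add_latency  # cycles consumed per batch
        self.cycle = 0

    def vec_add(self, a, b):
        """Cycle-accurate element-wise add: advances the cycle counter
        according to how many lane-wide batches the operands require."""
        assert len(a) == len(b)
        result = []
        for start in range(0, len(a), self.lanes):
            batch = [x + y for x, y in zip(a[start:start + self.lanes],
                                           b[start:start + self.lanes])]
            result.extend(batch)
            self.cycle += self.add_latency  # one batch retires per cycle
        return result

unit = PIMVectorUnit(lanes=16)
out = unit.vec_add(list(range(64)), list(range(64)))
print(f"{len(out)} elements in {unit.cycle} cycles")  # 64 elements, 4 cycles
```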
72.
Design and Characterization of Circularly Polarized Cavity-Backed Slot Antennas in an In-House-Constructed Anechoic Chamber
Chandak, Mangalam, 01 August 2012
Small satellites are satellites that weigh less than 500 kg. Compared to larger satellites, a small satellite, especially a cube satellite, has limited surface area. The limited surface area poses challenges for allocating essential parts, such as antennas. Therefore, antennas that are conformal to the satellite surface have distinct advantages over antennas that need significant mounting area. One very effective conformal antenna is the cavity-backed slot antenna, which can be integrated around solar cells and does not compete for extra surface area. The previous study of cavity-backed slot antennas was mainly a feasibility study and did not address realistic concerns such as effective feeding methods. This thesis aims to provide a more detailed study on achieving high-quality circular polarization (CP) and a simplified feed design to enable effective integration of the antenna with the solar panel. In order to accurately characterize an antenna, an effective antenna range in an anechoic chamber is important. Utah State University had an effective near-field range; however, it did not have a fully shielded anechoic chamber. As another objective of this thesis, a state-of-the-art anechoic chamber has been constructed, calibrated, and utilized to measure different antenna parameters. This thesis also shows correct methods for measuring important antenna properties such as CP and antenna efficiency.
73.
Development of a Generic PDA Based Control Mechanism for In-House Fabricated Miniature Sensors
Kedia, Sunny, 19 November 2004
A novel method of controlling miniature sensors using a Handspring Visor Prism PDA has been implemented. A generic motherboard was developed to map the data and address lines from the Visor onto a Complex Programmable Logic Device (CPLD), which provides the basic electrical signals to the sensor board. The sensor board housed the sensor and contained application-specific circuitry. The PDA, the motherboard, and the sensor board together completed the control mechanism for the sensor. Miniature sensors and the PDA-based control mechanism scaled down the size of the complete system, making the unit portable and enabling faster analysis of data in the field. Two applications were targeted: a fluorometer (bio-sensor) and a corner cube retroreflector (CCR, an optical sensor for communication). A sensor board was developed to control a thermally regulated fluorometer running the Nucleic Acid Sequence-Based Amplification (NASBA) process, which detects fluorescence from a solution containing target RNA. NASBA runs were conducted using solutions containing K. brevis (red tide) organisms to validate the interface of the PDA with the fluorometer. A real-time fluorescence plot was obtained on the PDA indicating the presence or absence of the target RNA, successfully demonstrating the PDA-fluorometer interface. Additionally, a sensor board was developed to control the electrostatic actuation mechanism of the MEMS-based CCR. Efforts were made to fabricate the vertical mirrors of the CCR using wet and dry fabrication techniques.
74.
Deriving Statewide Freight Truck Flows from Global Positioning System (GPS) Data
Bakhshi Zanjani, Akbar, 02 July 2014
An accelerated growth in the volume of freight shipped on Florida's highways has led to a significant increase in truck traffic, affecting traffic operations, safety, and the state of repair of highway infrastructure. Traffic congestion in turn has impeded the speed and reliability of freight movement on the highway system. Appropriate planning and decision-making processes are necessary to address these issues. However, a main challenge in establishing such processes is the lack of adequate data on statewide freight movements. As traditional data sources on freight movement are either inadequate or no longer available, new sources of data must be investigated.
A recently available source of data on nationwide freight flows is based on a joint venture between the American Transportation Research Institute (ATRI) and the Federal Highway Administration (FHWA) to develop and test a national system for monitoring freight performance measures on key corridors in the nation. This data is obtained from trucking companies that use GPS-based technologies to remotely monitor their trucks. The database contains GPS traces of a large number of trucks as they travel through the national highway system, providing unprecedented amounts of data on freight truck movements throughout the nation (and in Florida). Such truck GPS data can potentially be used to support planning, operation, and management processes associated with freight movements, and can be put to better use when combined with freight data from other sources.
The overarching goal of this thesis is to investigate the use of large streams of truck-GPS data from the American Transportation Research Institute (ATRI) for the estimation of statewide freight truck flows in Florida. To this end, first, an algorithm was devised to convert large streams of raw GPS data into a database of truck trips. The algorithm was applied to four months of ATRI's truck-GPS data, comprising over 145 million GPS records, to derive a database of more than 1.2 million truck trips starting and/or ending in Florida. This database was used to analyze truck travel characteristics and origin-destination truck flow patterns for different geographical regions in Florida. The resulting database was also used in conjunction with the GPS data to analyze the extent to which ATRI's data represents observed truck traffic flows in the state. It was found that, at an aggregate level, almost 10% of heavy truck traffic flows in Florida are captured in the ATRI data.
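The trip-building algorithm itself is not reproduced in the abstract. A common approach to converting raw GPS pings into trips, and a plausible reading of the text, is to sort each truck's pings by time and split the trace wherever the truck dwells longer than a threshold. The sketch below illustrates that idea; the 30-minute threshold and record layout are assumptions, not the thesis's actual specification.

```python
from collections import defaultdict

# Hedged sketch of converting raw truck GPS pings into trips by splitting a
# truck's time-ordered trace at long stops. The dwell threshold and the
# (truck_id, timestamp, lat, lon) record layout are illustrative assumptions.

DWELL_THRESHOLD_S = 30 * 60  # a stop longer than this ends the current trip

def extract_trips(pings):
    """pings: iterable of (truck_id, unix_time, lat, lon) tuples.
    Returns {truck_id: [trip, ...]} where each trip is a list of pings."""
    by_truck = defaultdict(list)
    for p in pings:
        by_truck[p[0]].append(p)

    trips = defaultdict(list)
    for truck_id, trace in by_truck.items():
        trace.sort(key=lambda p: p[1])           # order by timestamp
        current = [trace[0]]
        for prev, cur in zip(trace, trace[1:]):
            if cur[1] - prev[1] > DWELL_THRESHOLD_S:
                trips[truck_id].append(current)  # long dwell: close the trip
                current = []
            current.append(cur)
        trips[truck_id].append(current)
    return trips
```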
Finally, the database of truck trips derived from ATRI's truck-GPS data was combined with observed heavy truck traffic volumes at different locations within and outside Florida to derive an origin-destination (OD) table of truck flows within, into, and out of the state. To this end, the truck trip database was first converted into a seed OD table at the traffic analysis zone (TAZ) level spatial resolution used in FLSWM. Subsequently, a mathematical procedure called origin-destination matrix estimation (ODME) was employed to combine the OD flow table generated from the ATRI data with observed truck traffic volume information at different locations within and outside Florida. The OD table of truck flows estimated from this procedure can be used for a variety of purposes, including the calibration and validation of the heavy truck modeling components of FLSWM.
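ODME is named but not stated in the abstract. In its common generalized-least-squares form, it adjusts a seed OD matrix so that assigned link volumes match observed counts; a generic statement, with notation assumed here rather than taken from the thesis, is:

```latex
% Generic ODME formulation (notation assumed, not from the thesis):
% T = estimated OD matrix, T^0 = seed OD table built from the ATRI trips,
% v_a(T) = truck volume assigned to count location a under matrix T,
% \hat{v}_a = observed truck count at a, A = set of count locations.
\min_{T \ge 0} \; \sum_{i,j} \frac{\bigl(T_{ij} - T^{0}_{ij}\bigr)^{2}}{T^{0}_{ij}}
\;+\; \gamma \sum_{a \in A} \frac{\bigl(v_a(T) - \hat{v}_a\bigr)^{2}}{\hat{v}_a}
```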
75.
Hello, ruel World
January 2003
The analytic component of the thesis approaches major questions in Cultural Studies, Philosophy and Social Theory through an investigation of various forms of creative practice. I approach the question of agency, for instance, through a study of stage actors, who must recite set lines, and yet feel empowered precisely by the opportunity to act through them. Investigation of the author's own work as a poet and novelist serves to cast light upon ideology, i.e., how one might use a constrained language and yet feel empowered to speak new things through it. I apply these investigations to Althusser, whose famous essay on the total power of ideological interpellation is permeated with theatrical metaphor. I suggest that Althusser is repressing the creative component of everyday life, something social theory has always found hard to theorise. I proceed to suggest that the place where such processes are analysed is the philosophy of science. The work of Charles Sanders Peirce on the experimental method is, my investigation uncovers, surprisingly geared toward the investigation of creativity. In science one has a method for, and an extensive literature on, discovering new phenomena. My thesis is that the experimental method of modern scientists and the creative method of modern writers, both geared toward the production of things that are at once new and true, are largely the same. I use Peircean semiotics to argue that creative composition is about listening to the languages spoken all round you, and transcribing their objective contours. So as to have effect on others. Which is just like science. And in both instances we are endlessly spoken through at every moment by the myriad languages which interpellate us. Whence creativity (for those who are open to it). My three portfolio pieces are: Cube Root of Book, a series of one hundred and thirty-two poems set at intervals along the descending spiral of a Fibonacci number sequence; The 14th Floor, an Hypothesis, an experimental novel written quite literally as an experiment, i.e. having written the novel, I then wrote up a prac-report detailing what I had learnt about the performance of writing; and Unaustralia, a Study of Heads, an attempt to show the relevance of these findings to Cultural Studies and other related practices, centred on my new reading of Althusser and flanked by mini-ethnographies of creative practice. The poetry is presented as a major new creative work, and the experimental novel/book of philosophy as a substantial contribution to knowledge.
76.
Modeling geological uncertainty by stochastic simulation of facies proportion cubes, with application to carbonate and siliciclastic petroleum reservoirs / Modélisation de l'incertitude géologique par simulation stochastique de cubes de proportions de faciès - Application aux réservoirs pétroliers de type carbonaté ou silico-clastique
Zerkoune, Abbas, 08 June 2009
After a hydrocarbon field is discovered, decisions about its development are made on the basis of uncertain representations of it: its characterization relies on spatial numerical models that carry the uncertainty tied to the complexity of the subsurface. Stochastic simulation methods, which generate equiprobable models of the subsurface, are usually assumed to quantify this uncertainty. However, these alternative images of the field are draws from a single probabilistic model; they ignore the uncertainty attached to the choice of the underlying probabilistic model itself, and therefore tend to underestimate the total uncertainty. This research aims to improve that quantification. It carries the doubt about our understanding of the medium's properties back to the probabilistic models, and proposes to integrate it at that level. The thesis first clarifies the notion of uncertainty in petroleum modeling, in particular for 3D geological models comprising several facies. Building such models first requires defining, at every point in space, the probability that each facies occurs: this is the proportion cube. Although these probabilities are poorly known, current methods for assessing sedimentary uncertainty generally keep them fixed, and thus ignore the uncertain character of the geological scenario and its impact on the proportion cube. Two stochastic simulation methods were developed to generate equiprobable models in terms of proportion cubes. They integrate the variability tied to the facies proportions and explore this domain of uncertainty in full. The first method stays relatively close to the geology: it directly integrates the uncertainty attached to the parameters making up the geological scenario, whether they take the form of well signals, maps, or more global reservoir-scale hypotheses. A Monte Carlo procedure samples the components of the sedimentary scheme; each draw yields a proportion cube via a geomodeler that integrates, more or less explicitly, the parameters of the geological scenario. The methodology is illustrated and applied to a novel process for modeling carbonate deposits in a marine environment. The second method is more geostatistical in character, focusing on the proportion cube itself, and aims to reconcile the different possible sedimentary models. In the gridded reservoir model, it estimates the distribution of facies proportions cell by cell, assumed to follow a Dirichlet distribution, from a few models built from distinct geological scenarios. It then simulates the proportions sequentially, cell after cell, introducing a spatial correlation (variogram) that can be deterministic or probabilistic. Several practical cases, involving both synthetic reservoirs and real fields, illustrate and detail the steps of the proposed method.
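For reference, the Dirichlet model mentioned for the second method is standard: in a cell with K facies, the proportion vector p lies on the simplex, and its density and mean are (standard results, stated here for the reader, not quoted from the thesis):

```latex
% Proportions p = (p_1,...,p_K) of K facies in one grid cell,
% with p_k >= 0 and sum_k p_k = 1, modeled as p ~ Dir(alpha_1,...,alpha_K):
f(p_1,\dots,p_K) \;=\; \frac{\Gamma\!\bigl(\sum_{k=1}^{K}\alpha_k\bigr)}
{\prod_{k=1}^{K}\Gamma(\alpha_k)} \prod_{k=1}^{K} p_k^{\alpha_k - 1},
\qquad
\mathbb{E}[p_k] \;=\; \frac{\alpha_k}{\sum_{j=1}^{K}\alpha_j}.
```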
77.
State encoding of automata and the theory of intersecting cubes / Codage d'automates et théorie des cubes intersectants
Duff, Christopher, 01 March 1991
This thesis proposes an optimized encoding method for synchronous automata in a silicon-compiler environment. In a first part, situations that predict minimization of the equations of the internal variables and of the outputs are sought on the state graph. This yields encoding constraints in the form of a list of adjacency groups of states to be embedded on faces, or cubes, of the hypercube. In a second part, the encoding is carried out by satisfying these face assignments and their intersections as well as possible. The basic principles of the approach are as follows. (i) For the first phase, the search for minimization-predicting situations is based on Hartmanis's theory of partition pairs. Situations are sought input by input; this local approach makes it possible to handle high complexities (note that input encoding is not addressed). Priority is given to potential merges of monomials in the equations; unlike other approaches (e.g. MUSTANG), we do not search indiscriminately for every possible minimization, including factorizations, since it is reasonably estimated that only monomial merging guarantees a gain in area and wiring. (ii) For the second phase, sophisticated hypercube embedding techniques are used, based on a representation of the hypercube as the lattice of subsets of an n-element set. To handle intersecting constraints, that is, constraints involving shared subsets of states, a theory called the theory of intersecting cubes is proposed. The results of this thesis led to software integrated into the ASYL system. The gains obtained in silicon area, critical path, and routing factors are the best currently known.
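For readers unfamiliar with the notation: a face (cube) of the n-dimensional hypercube can be written as a ternary word over {0, 1, -}, and two cubes intersect when they share vertices. A small illustration (not an example from the thesis):

```latex
% Faces (cubes) of the 3-cube written as ternary words over {0,1,-},
% where '-' marks a free coordinate:
0{-}{-} = \{000,\,001,\,010,\,011\}, \qquad {-}1{-} = \{010,\,011,\,110,\,111\}
% Two cubes intersect when they share vertices; the intersection is again a cube:
(0{-}{-}) \cap ({-}1{-}) = 01{-} = \{010,\,011\}
```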
78.
Analysis of a data cube: tensor decomposition and links between procedures for comparing rectangular data tables / Analyse d'un cube de données : décomposition tensorielle et liens entre procédures de comparaison de tableaux rectangulaires de données
Mizere, Dominique, 17 June 1981
.
79.
Bootstrap methods for finite populations / Méthodes de Bootstrap en population finie
Chauvet, Guillaume, 14 December 2007
This thesis is devoted to Bootstrap methods for a finite population. The first chapter reviews sampling theory and gives a synthetic presentation of the main methods for precision estimation. Chapter 2 reviews the Bootstrap methods proposed for simple random sampling and introduces two new methods. Chapter 3 gives a new Bootstrap algorithm that is consistent for estimating the variance of a substitution (plug-in) estimator under a high-entropy sampling design. In Chapter 4, we introduce the notion of balanced sampling and propose a fast algorithm; we show that the proposed Bootstrap algorithm is also consistent for variance estimation under maximum-entropy balanced sampling. The case of complex sampling designs and of weighting adjustments is treated in Chapter 5. An application to the French rolling population census (Nouveau Recensement de la population) is given in Chapter 6.
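As background to the setting of chapter 2, a naive with-replacement Bootstrap of a sample mean can be sketched as follows; it ignores the finite-population correction, which is exactly the kind of defect the corrected resampling schemes studied in the thesis address. The code illustrates the baseline, not one of the thesis's algorithms.

```python
import random

# Naive with-replacement Bootstrap of the variance of a sample mean.
# Under simple random sampling WITHOUT replacement from a finite population
# of size N, this overestimates the variance by roughly 1/(1 - n/N), since
# it ignores the finite-population correction. Illustration only.

def bootstrap_variance(sample, n_boot=2000, seed=42):
    rng = random.Random(seed)
    n = len(sample)
    means = []
    for _ in range(n_boot):
        resample = [rng.choice(sample) for _ in range(n)]  # n draws with replacement
        means.append(sum(resample) / n)
    grand = sum(means) / n_boot
    return sum((m - grand) ** 2 for m in means) / (n_boot - 1)

sample = [12.1, 9.8, 14.3, 11.0, 10.5, 13.2, 9.9, 12.7]
print(bootstrap_variance(sample))
```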
80.
Hedmans Kvadratrotsalgoritm / Hedman's square root algorithm
Hedman, Anders, January 2001
In this 10-credit thesis I describe how my self-devised square root algorithm works, both in practice and in theory. With this algorithm one can compute square roots to 50-60 significant digits by hand; with the previously known square root algorithms one can compute 5-6 significant digits.
My algorithm does not work in the same way as the previously used square root algorithms, but it is just as correct. Considerable weight has therefore been placed on showing that there are several different correct algorithms for our ordinary arithmetic operations.
The thesis also includes a brief account of the ongoing debate over whether algorithm-based computation in compulsory school hampers pupils' mathematical thinking or not.
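The thesis text is not reproduced here, so Hedman's own algorithm cannot be shown. For contrast, the previously known digit-by-digit schoolbook method the abstract alludes to can be sketched as follows (a standard algorithm, included for reference only, and explicitly not Hedman's):

```python
# The classical digit-by-digit (schoolbook) square root algorithm: process
# the radicand two digits at a time, at each step choosing the largest digit
# d with (20*y + d) * d <= remainder, where y is the root built so far.
# This is the traditional method the abstract contrasts against, NOT
# Hedman's algorithm, which the thesis says works differently.

def digit_sqrt(n, digits=10):
    """Square root of the non-negative integer n as a string with
    `digits` digits after the decimal point, computed digit by digit."""
    s = str(n)
    if len(s) % 2:                      # left-pad so digits group into pairs
        s = "0" + s
    pairs = [int(s[i:i + 2]) for i in range(0, len(s), 2)]
    int_pairs = len(pairs)
    pairs += [0] * digits               # extra 00-pairs give fractional digits

    y, remainder, out = 0, 0, []
    for i, pair in enumerate(pairs):
        remainder = remainder * 100 + pair
        d = 9
        while (20 * y + d) * d > remainder:   # largest feasible next digit
            d -= 1
        remainder -= (20 * y + d) * d
        y = 10 * y + d
        out.append(str(d))
        if i == int_pairs - 1:
            out.append(".")
    return "".join(out)

print(digit_sqrt(2, digits=10))  # -> 1.4142135623
```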