About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Particionamento e escalonamento de matrizes densas para processamento em hardware reconfigurável / Partitioning and scheduling of dense matrices for processing on reconfigurable hardware

de Oliveira Lima, Derci 31 January 2009 (has links)
Conselho Nacional de Desenvolvimento Científico e Tecnológico / The solution of complex problems in many areas of human knowledge, such as investment analysis in the banking sector, real-time analysis and visualization of medical images, and the oil and gas industry, often involves complex algorithms and/or very large data sets, and increasingly demands high-performance computing systems. Because of their large data volumes and long processing loops, most of these applications can take days or even months of work on sequential computers to deliver a final result. In some cases this excessive time can make a project unviable by missing a product's time to market. Different technologies and data structures have been proposed to deal with such problems, seeking better customization and trying to get the most out of the architecture and the system, in both software and hardware. Among these hardware/software architectures, this work studies a solution based on FPGAs (Field Programmable Gate Arrays) acting as a co-processor. This device enables a new approach to the problem: an application can be partitioned into two parts, where the control-oriented, sequential part continues to run on the host with general-purpose processors, while the parts with large processing loops run on the FPGA co-processors, achieving higher performance by exploiting parallelism. However, moving data between the host's main memory and the FPGA's external memory is considered a major bottleneck for hardware processing. Several authors demonstrate the performance achieved with hardware processing but assume the data are already in the FPGA's external memory; few discuss the performance loss once data movement is taken into account. This work studies techniques for partitioning large dense matrices, data reuse, and the strategies best suited to the architectures examined here. The latencies of this data movement between the host and the FPGA co-processor were also a focus of this work. We conclude with a case study in which we propose a strategy for partitioning and blocked matrix multiplication on a Xilinx Virtex-5 FPGA (XC5VLX50T-1 FF1136) mounted on a Xilinx ML 555 board.
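To make the block-partitioning idea concrete, here is a minimal NumPy sketch of blocked matrix multiplication. The block size and matrix sizes are illustrative assumptions, and the tiles only stand in for the host-to-FPGA transfers discussed in the abstract; none of this code comes from the thesis itself.

```python
import numpy as np

def blocked_matmul(A, B, block=64):
    """Multiply A (m x k) by B (k x n) block by block.

    Each (block x block) tile stands in for a transfer to the co-processor's
    external memory; looping j inside i reuses the same row of A tiles
    across a whole row of result blocks.
    """
    m, k = A.shape
    k2, n = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((m, n), dtype=A.dtype)
    for i in range(0, m, block):
        for j in range(0, n, block):
            for p in range(0, k, block):
                # In a hardware setting, these two slices would be the blocks
                # shipped from host memory to the FPGA board.
                C[i:i+block, j:j+block] += A[i:i+block, p:p+block] @ B[p:p+block, j:j+block]
    return C

if __name__ == "__main__":
    A = np.random.rand(256, 256)
    B = np.random.rand(256, 256)
    assert np.allclose(blocked_matmul(A, B), A @ B)
```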
2

Statistical considerations in the design and analysis of cross-over trials

Morrey, Gilbert Heneage January 1991 (has links)
No description available.
3

Implementation of CCSDS Telemetry and Command Standards for the Fast Auroral Snapshot (FAST) Small Explorer Mission

Olsen, Douglas 10 1900 (has links)
International Telemetering Conference Proceedings / October 25-28, 1993 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Recommendations of the Consultative Committee for Space Data Systems (CCSDS) provide a standard approach for implementing spacecraft packet telemetry and command interfaces. The Fast Auroral Snapshot (FAST) Small Explorer mission relies heavily on the CCSDS virtual channel and packetization concepts to achieve near real-time commanding and distribution of telemetry between separate spaceborne science and spacecraft processors and multiple ground stations. Use of the CCSDS recommendations allows the FAST mission to realize significant re-use of ground systems developed for the first Small Explorer mission, and also simplifies system interfaces and interactions between flight software developers, spacecraft integrators, and ground system operators.
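As a rough illustration of the packetization concept mentioned above, the sketch below packs the six-byte CCSDS Space Packet primary header (version, type, secondary-header flag, APID, sequence flags, sequence count, data length). The APID and payload values are made up for the example and are not taken from the FAST mission.

```python
import struct

def ccsds_primary_header(apid, seq_count, data, version=0, pkt_type=0,
                         sec_hdr=0, seq_flags=0b11):
    """Pack the 6-byte CCSDS Space Packet primary header.

    Layout: version (3 bits), type (1), secondary-header flag (1), APID (11),
    sequence flags (2), sequence count (14), and data length (number of
    octets in the data field minus one).
    """
    word1 = (version << 13) | (pkt_type << 12) | (sec_hdr << 11) | (apid & 0x7FF)
    word2 = (seq_flags << 14) | (seq_count & 0x3FFF)
    word3 = len(data) - 1
    return struct.pack(">HHH", word1, word2, word3)

# Example: an unsegmented telemetry packet on a made-up APID.
payload = b"\xde\xad\xbe\xef"
packet = ccsds_primary_header(0x1FB, 42, payload) + payload
assert len(packet) == 10
```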
4

The design and implementation of computer-based spreadsheets for teaching and learning data handling in mathematics and mathematical literacy

25 May 2010 (has links)
M.Ed. / Computer technology can be used both as a complementary teaching tool and as a learning tool. While it does not yet have the level of computer use found in developed countries, South Africa is increasingly becoming a society of technology users; one prime goal of education is therefore to develop a basic knowledge of the structure and operating principles behind the use of computers in teaching and learning. This study was conducted at a selected public secondary school in the Johannesburg West District, Gauteng Province. It focuses on the design and implementation of a Data Handling lesson through computer-based spreadsheets for Grade 10 learners. The study follows a qualitative design-experiment method with a small quantitative component. This method is regarded as an educational intervention developed as a way of carrying out formative research to test and refine educational designs on principles derived from earlier research. The method addressed the curricular content (Data Handling) in the classroom as well as the design of PowerPoint, Microsoft Word and spreadsheet materials. The design framework is derived from design principles generated from instructional system design theory and a constructivist perspective. These design principles are used as a framework for analysing the learners' and teacher's experiences of a Data Handling lesson taught through computer-based spreadsheets. Data were collected through observation, interviews and assessment activities. The findings suggest that the use of computer-based spreadsheets in teaching and learning had a positive effect on the learning of Data Handling. Both the teacher and her learners reported positive experiences of fun and interest in using computer-based spreadsheets in teaching and learning.
5

A reusable command and data handling system for university CubeSat missions

Johl, Shaina Ashley Mattu 24 March 2014 (has links)
A Command and Data Handling (C&DH) system is being developed as part of a series of CubeSat missions being built at The University of Texas at Austin’s Texas Spacecraft Laboratory (TSL). With concurrent development of four missions, and with more missions planned for the future, the C&DH team is developing a system architecture that can support many mission requirements. The presented research aims to establish itself as a reference for the development of the C&DH system architecture so that it can be reused for future university missions. The C&DH system is designed using a centralized architecture with one main flight computer controlling the actions and the state of the satellite. A Commercial Off-The-Shelf (COTS) system-on-module embedded computer running a Linux environment hosted on a custom interface board is used as the platform for the mission software. This design choice and the implementation details of the flight software are described in detail in this report. The design of the flight software and the associated hardware are integral components of the spacecraft for the current missions in the TSL which, when flown, will be some of the most operationally complex CubeSat missions attempted to date.
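As a toy illustration of a centralized C&DH architecture, the sketch below runs a single "flight computer" loop that owns the spacecraft state and dispatches uplinked commands to registered handlers. All class and command names are hypothetical and are not drawn from the TSL flight software.

```python
import queue

class FlightComputer:
    """Toy centralized command-and-data-handling loop: one computer owns the
    spacecraft state and dispatches every command to a registered handler."""

    def __init__(self):
        self.state = {"mode": "safe", "payload_on": False}
        self.commands = queue.Queue()
        self.handlers = {}

    def register(self, name, handler):
        self.handlers[name] = handler

    def uplink(self, name, *args):
        self.commands.put((name, args))

    def run_once(self):
        # Drain everything currently queued; in flight software this would be
        # one pass of the main scheduler loop.
        while not self.commands.empty():
            name, args = self.commands.get()
            handler = self.handlers.get(name)
            if handler is None:
                print(f"rejecting unknown command {name!r}")
                continue
            handler(self.state, *args)

def set_mode(state, mode):
    state["mode"] = mode

fc = FlightComputer()
fc.register("SET_MODE", set_mode)
fc.uplink("SET_MODE", "science")
fc.run_once()
assert fc.state["mode"] == "science"
```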
6

Détection de patterns d'activité bioélectrique simulée et modélisation de réseaux neuraux bioinspirés par l'expression génique / Detection of patterns of simulated bioelectric activity and modeling of bioinspired neural networks with genetic expression

Shaposhnyk, Vladyslav 12 September 2011 (has links)
Modular architecture is a hallmark of many brain circuits. In particular, it has been observed in the cerebral cortex that reciprocal connections are often present between functionally interconnected areas that are hierarchically organized. Evolutionary development is another distinctive characteristic of living species; even the simplest viruses are capable of adapting to better fit new environmental conditions. With hierarchical architectures and evolutionary features in mind, we built a unique and novel simulation framework that allows us to model and study evolving, hierarchically organized circuits of modules of spiking neural networks. Each module is characterized by embedded neural development and the expression of spike-timing-dependent plasticity. Cell death, synaptic plasticity and projection pruning, embedded in the neural model, drive the build-up of auto-associative links within each module, which generate an areal activity that reflects changes in the corresponding functional connectivity within and between neuronal modules. The bioelectric activity of each module is recorded by means of virtual electrodes, and these signals, called electrochipograms (EChG), are analyzed by time- and frequency-domain methods in order to find general patterns of emerging behavior. Beyond these methods, a novel robust non-linear structural regression approach is proposed to provide researchers with more powerful tools adapted to the data typically used in this domain. We tested the effect of an external stimulus at a fixed frequency fed to a sensory module, which projects its activity to two hierarchically organized parallel pathways. We found that the modeled circuits manifest behavior similar in certain respects to that of real brains, and we show evidence that all networks of modules are able to maintain long patterns of activity associated with the stimulus offset. These findings bring new insights to the understanding of EEG-like signals, both real and virtual, and suggest that the approach could be extended to model cognitive and behavioral processes in the brain.
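As a loose illustration of the kind of module being simulated, the sketch below runs a toy leaky integrate-and-fire network and records its mean activity as a stand-in for a virtual-electrode (EChG-like) signal. It omits the plasticity, pruning and evolutionary machinery of the actual framework, and all parameter values are assumptions.

```python
import numpy as np

def simulate_module(n=100, steps=1000, dt=1.0, tau=20.0, v_thresh=1.0,
                    v_reset=0.0, input_rate=0.05, seed=0):
    """Simulate one module of leaky integrate-and-fire neurons with random
    recurrent weights, recording the mean membrane potential per step as a
    crude stand-in for a virtual-electrode signal."""
    rng = np.random.default_rng(seed)
    w = rng.normal(0.0, 0.1, size=(n, n))       # fixed recurrent weights
    np.fill_diagonal(w, 0.0)
    v = np.zeros(n)
    echg = np.zeros(steps)
    for t in range(steps):
        external = (rng.random(n) < input_rate) * 0.5   # random external drive
        spikes = v >= v_thresh                          # who fired this step
        v[spikes] = v_reset                             # reset after spiking
        dv = (-v / tau) + external + w @ spikes.astype(float)
        v = v + dt * dv
        echg[t] = v.mean()                              # "electrode" recording
    return echg

signal = simulate_module()
# Frequency-domain view of the emergent activity, mirroring the abstract's analysis.
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
```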
7

A MODULAR APPROACH TO LANDSAT 7 GROUND PROCESSING

Mah, G. R., Pater, R., Alberts, K., O’Brien, M., Senden, T. 10 1900 (has links)
International Telemetering Conference Proceedings / October 23-26, 2000 / Town & Country Hotel and Conference Center, San Diego, California / Current Landsat 7 processing is based on a single-string, multifunction approach. A follow-on system has been designed that repartitions functions across multiple hardware platforms to provide increased flexibility and support for additional missions. Downlink bit stream acquisition has been moved to lower cost systems functioning as “capture appliances” with high-speed network interconnections to Level 0 processing on generic compute servers. This decouples serial data stream acquisition from the processing system to allow the addition or replacement of compute servers without the reintegration of specialized high-speed capture hardware. It also allows easy integration of new systems and missions without extensive system redesign or additional software.
8

Factors affecting atmospheric Polycyclic Aromatic Hydrocarbons in Kaohsiung coast by GMDH

Chiou, Guo-Yang 18 August 2010 (has links)
Coastal atmospheric concentrations of polycyclic aromatic hydrocarbons (PAHs) were measured at the top of a building on the campus of National Sun Yat-sen University in Kaohsiung. Concentrations of 52 PAH compounds were analyzed in both the gaseous and particulate phases of air samples collected from May 2008 through April 2009. PAH diagnostic ratios, Hierarchical Cluster Analysis (HCA) and Principal Component Analysis (PCA) were employed to determine the potential sources of PAHs. The Group Method of Data Handling (GMDH) was applied to relate atmospheric PAH concentrations to air quality variables such as SO2 and O3, as well as meteorological conditions such as precipitation and temperature. During the sampling period the mean total PAH concentration was 14.2 ng/m3. Overall, PAH concentrations in winter were higher than in summer, with the lowest concentration occurring in June (2.22 ng/m3) and the highest in January (32.4 ng/m3). Night-time PAH concentrations were higher than daytime concentrations. The 2- and 3-ring PAHs were mostly present in the gaseous phase, the 4-ring PAHs were dominant in the gaseous phase, while the 5-, 6- and 7-ring PAHs were mostly present in the particulate phase. During the Ghost Festival and Asian dust storm events, atmospheric PAH concentrations and PAHs/TSP ratios both increased. Meteorological conditions such as temperature and relative humidity may strongly affect PAH concentrations, and both gaseous and particulate PAHs correlate significantly with SO2, NOx and PM10. Results from the diagnostic ratios, HCA and PCA indicate that the major sources of PAHs include gasoline and diesel exhaust. Using GMDH, a reasonable forecasting model for PAH patterns was obtained from the meteorological and air quality variables; the GMDH model built from the 2008-2009 data was tested in prediction and compared with measurements from 2007-2008.
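For readers unfamiliar with GMDH, the sketch below implements one selection layer of the classic algorithm: quadratic polynomial models are fitted for every pair of inputs and ranked by error on held-out data. The synthetic predictors and response are placeholders, not the study's measurements.

```python
import itertools
import numpy as np

def gmdh_layer(X_train, y_train, X_val, y_val, keep=4):
    """One GMDH selection layer: fit the polynomial neuron
    y = a0 + a1*xi + a2*xj + a3*xi*xj + a4*xi^2 + a5*xj^2
    for every pair of inputs, score it on held-out data (the external
    criterion), and keep the best few as inputs to the next layer."""
    def design(xi, xj):
        return np.column_stack([np.ones_like(xi), xi, xj, xi * xj, xi**2, xj**2])

    candidates = []
    for i, j in itertools.combinations(range(X_train.shape[1]), 2):
        A = design(X_train[:, i], X_train[:, j])
        coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)
        pred_val = design(X_val[:, i], X_val[:, j]) @ coef
        rmse = np.sqrt(np.mean((pred_val - y_val) ** 2))
        candidates.append((rmse, i, j, coef))
    candidates.sort(key=lambda c: c[0])
    return candidates[:keep]

# Toy usage: relate a synthetic "PAH concentration" to made-up predictors.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))            # e.g. temperature, SO2, O3, humidity
y = 2 + X[:, 0] * X[:, 1] - 0.5 * X[:, 2] ** 2 + rng.normal(0, 0.1, 200)
best = gmdh_layer(X[:120], y[:120], X[120:], y[120:])
print("best pair:", best[0][1], best[0][2], "validation RMSE:", round(best[0][0], 3))
```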
9

Analýza USB rozhraní / USB communication protocol analysis

Zošiak, Dušan January 2009 (has links)
This thesis focuses on the processing and analysis of the USB communication protocol and the implementation of its individual parts in an FPGA device using the VHDL programming language. In its final form, the work is intended to be a comprehensive, self-contained document describing the principles of the USB interface and its communication, complemented by a practical design in VHDL capable of transferring data over USB.
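The thesis implements the protocol in VHDL; as a language-neutral illustration of one detail it covers, the Python sketch below computes the USB CRC5 that protects token packet fields (generator polynomial x^5 + x^2 + 1, bits processed least significant first). The address and endpoint values are arbitrary examples.

```python
def usb_crc5(data_bits):
    """Compute USB CRC5 over a sequence of bits given in LSB-first order.

    Uses the serial LFSR form: initial remainder 0b11111, polynomial
    x^5 + x^2 + 1, and a final complement matching what the transmitter
    would serialize onto the bus."""
    poly = 0b00101
    crc = 0b11111
    for bit in data_bits:
        top = (crc >> 4) & 1          # current MSB of the 5-bit register
        crc = (crc << 1) & 0b11111    # shift, keeping 5 bits
        if top ^ bit:
            crc ^= poly
    return crc ^ 0b11111

def field_bits(value, width):
    """Expand an integer field into its bits, least significant first,
    the order in which USB shifts them onto the bus."""
    return [(value >> i) & 1 for i in range(width)]

# Token packet fields: 7-bit address then 4-bit endpoint, both LSB first.
addr, endp = 0x3A, 0xA
token_bits = field_bits(addr, 7) + field_bits(endp, 4)
print("CRC5 =", bin(usb_crc5(token_bits)))
```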
10

Spatial Assessment of Soil Contamination through GIS Data Management

Sjödell, Ingrid January 2018 (has links)
Spatial data management within the environmental field has a wide range of possible applications and comes with great advantages. In this study, methods and technologies for spatial data management of soil contamination have been assessed in Geographical Information Systems (GIS), in order to identify how spatial data applications and tools can contribute valuable information to these types of projects. The spatial assessment has been applied to a case study site in Kagghamra, Stockholm, exposed to high levels of contaminants, arsenic in particular. The subjects evaluated are the arsenic contamination distribution pattern, the estimation of the volume of contaminated soil, and the number of samples needed for spatial analyses. Furthermore, two versions of an exploratory soil sensitivity estimation model, based on site-specific ground and landscape parameters as well as literature references, have been developed. The data management included large quantities of primary and secondary data on contamination levels as well as geological and ground properties. First-hand geophysical field data obtained from Electromagnetic (EM) and Induced Polarisation (IP) measurements were also interpreted, and the benefits of using geophysical measurements in soil contamination projects were investigated; in this case the benefits were few, due to difficult measuring conditions with disturbance noise. Spatial interpolation with the Natural Neighbour (NN) technique proved useful in transforming point contamination data into continuous layers. From the interpolation surfaces (the arsenic distribution map) a variety of information can be extracted, such as a first-pass estimate of the volume of contaminated soil, possibilities for reducing the amount of field sampling, and statistical information and relations to different site-specific ground conditions. The soil sensitivity estimation models are combined maps consisting of data layers relevant to the behaviour and interaction of arsenic in the subsurface. The site-specific Model 1 is based on the data layers Soil type, Iron level, Soil depth and Slope, and mainly marks areas exposed to high concentrations of arsenic as high-sensitivity areas. The more general, literature-supported Model 2 also includes Vegetation cover and the Topographic Wetness Index (TWI); it does not relate strongly to the arsenic distribution at the site, but could give general indications of sensitive areas if applied to another, larger site. The main contributions of the spatial data management are efficient handling of large data quantities, economic and time-saving benefits from less physical sampling, and good representation and visualisation of site conditions as a tool for stakeholder communication and decision-making.
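As a rough sketch of the interpolation-plus-volume-estimate workflow described above: SciPy does not ship a Natural Neighbour interpolator, so linear interpolation stands in for the NN surface here, and the sample points, guideline value, cell size and soil depth are all invented for the example.

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical sample data: x, y in metres and arsenic concentration in mg/kg.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 200, size=(60, 2))
arsenic = 5 + 40 * np.exp(-np.sum((xy - 100) ** 2, axis=1) / 2000)

# Interpolate the point samples onto a regular 5 m grid; 'linear' is used in
# place of a natural-neighbour surface.
cell = 5.0
gx, gy = np.meshgrid(np.arange(0, 200, cell), np.arange(0, 200, cell))
surface = griddata(xy, arsenic, (gx, gy), method="linear")

# First-pass volume estimate: cells above an assumed guideline value,
# times cell area, times an assumed depth of the contaminated layer.
guideline = 25.0       # mg/kg, placeholder threshold
depth = 0.5            # m, assumed depth
hot = np.nan_to_num(surface, nan=0.0) > guideline
volume = hot.sum() * cell * cell * depth
print(f"{int(hot.sum())} cells above guideline, approx. {volume:.0f} m3 of soil")
```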
