151

Transcription of the Bleek and Lloyd Collection using the Bossa Volunteer Thinking Framework

Munyaradzi, Ngoni 01 November 2013
The digital Bleek and Lloyd Collection is a rare collection containing artwork, notebooks and dictionaries of the earliest inhabitants of Southern Africa. Previous attempts were made to recognize the complex text in the notebooks using machine learning techniques, but owing to the complexity of the manuscripts the recognition accuracy was low. In this research, a crowdsourcing-based method is proposed to transcribe the historical handwritten manuscripts, in which volunteers transcribe the notebooks online. An online crowdsourcing transcription tool was developed and deployed, and experiments were conducted to determine the quality of the transcriptions and the accuracy of the volunteers against a gold standard. The results show that volunteers are able to produce reliable transcriptions of high quality: inter-transcriber agreement is 80% for |Xam text and 95% for English text, and when the volunteers' |Xam transcriptions are compared with the gold standard they achieve an average accuracy of 69.69%. The findings show a positive linear correlation between inter-transcriber agreement and transcription accuracy. A user survey revealed that volunteers found the transcription process enjoyable, though difficult. The results indicate that volunteer thinking can be used to crowdsource intellectually intensive tasks in digital libraries, such as the transcription of handwritten manuscripts, and that volunteer thinking outperforms machine learning techniques at transcribing notebooks from the Bleek and Lloyd Collection.
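The agreement figures reported above can be illustrated with a toy pairwise-agreement computation. This is a minimal sketch, not the thesis's actual metric: the `pairwise_agreement` helper and the sample transcriptions are hypothetical, and character-level similarity via `difflib` merely stands in for whatever alignment the study used.

```python
from difflib import SequenceMatcher
from itertools import combinations

def pairwise_agreement(transcriptions):
    """Mean pairwise character-level similarity across volunteer
    transcriptions of the same notebook page (0.0 to 1.0)."""
    pairs = list(combinations(transcriptions, 2))
    if not pairs:
        return 1.0
    return sum(SequenceMatcher(None, a, b).ratio() for a, b in pairs) / len(pairs)

# Three hypothetical volunteer transcriptions of one line of English text
volunteers = [
    "the lion went to the water",
    "the lion went to the water",
    "the lion want to the water",
]
score = pairwise_agreement(volunteers)
print(f"inter-transcriber agreement: {score:.2f}")
```

Averaging over all volunteer pairs, rather than comparing against a single reference, is what makes the metric usable even for pages that have no gold-standard transcription.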
152

Impact of Solar Resource and Atmospheric Constituents on Energy Yield Models for Concentrated Photovoltaic Systems

Mohammed, Jafaru 24 July 2013
Global economic trends suggest a need to generate sustainable renewable energy to meet growing global energy demands. Solar energy harnessed by concentrated photovoltaic (CPV) systems has the potential to contribute strongly to future energy supplies, but as CPV is a relatively new technology, considerable research is still needed into the relationship between the technology and the solar resource. Research into CPV systems was carried out at the University of Ottawa’s Solar Cells and Nanostructured Device Laboratory (SUNLAB), focusing on the acquisition and assessment of meteorological and local solar resource datasets as inputs to more complex system (cell) models for energy yield assessment. An algorithm was created to estimate the spectral profile of direct normal irradiance (DNI). It was designed to use easily sourced low-resolution meteorological datasets, temporal band-pass filter measurements and an atmospheric radiative transfer model to determine a location-specific solar spectrum; its core is an optical depth parameterization based on a published objective regression algorithm. Initial results showed a spectral agreement corresponding to a 0.56% photo-current difference in a modeled CPV cell compared with the measured spectrum. The common procedures and datasets used for long-term CPV energy yield assessment were then investigated, with the aim of quantitatively deconvolving the various factors, especially meteorological factors, responsible for error bias in CPV energy yield evaluation. Over the period from June 2011 to August 2012, the analysis found that neglecting spectral variations resulted in a ~2% overestimation of energy yields, and that clouds have the dominant impact on CPV energy yields, at the 60% level.
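The link between a spectral DNI profile and a modeled cell's photocurrent can be sketched as a simple integration of irradiance times spectral response over wavelength. This is an illustrative back-of-envelope calculation, not SUNLAB's model: the function, wavelength grid, irradiance values and spectral-response values are all hypothetical assumptions.

```python
def photocurrent(wavelengths_nm, irradiance, spectral_response):
    """Current density (A/m^2) from a spectral irradiance profile
    E(lambda) (W/m^2/nm) and a cell spectral response SR(lambda) (A/W),
    integrated by the trapezoidal rule."""
    j = 0.0
    for i in range(len(wavelengths_nm) - 1):
        dlam = wavelengths_nm[i + 1] - wavelengths_nm[i]
        f0 = irradiance[i] * spectral_response[i]
        f1 = irradiance[i + 1] * spectral_response[i + 1]
        j += 0.5 * (f0 + f1) * dlam
    return j

# Hypothetical coarse spectrum and response for a single-junction cell
lam = [400, 600, 800, 1000]   # nm
dni = [0.8, 1.2, 1.0, 0.6]    # W/m^2/nm (illustrative values)
sr  = [0.3, 0.45, 0.55, 0.0]  # A/W (drops to zero past the band gap)

j_ref = photocurrent(lam, dni, sr)
# A small shift in the red part of the spectrum changes the current:
dni_shifted = [0.8, 1.2, 1.02, 0.612]
delta = abs(photocurrent(lam, dni_shifted, sr) - j_ref) / j_ref
print(f"relative photocurrent difference: {delta:.2%}")
```

This is the sense in which a "photo-current difference" quantifies spectral agreement: two spectra agree to the extent that they produce the same current in the modeled cell.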
153

Dynamic pattern recognition and data storage using localized holographic recording

Karbaschi, Arash 05 May 2008
A new technique for optical pattern recognition with two-center recording of persistent holograms in a doubly doped LiNbO₃:Fe:Mn crystal is presented, by which the holograms are localized in separate slices along the recording medium. The localized recording method has the distinctive advantage of selective recording and erasure of individual holograms without affecting the entire holographic recording medium, which enables dynamic content modification of optical pattern recognition systems. In addition, the diffraction efficiency of localized holograms is much larger than that of normal volume-multiplexed holograms. It is shown theoretically that the localized holographic correlator (LHC) outperforms conventional volume holographic correlators in terms of crosstalk, shift invariance and capacity, and the LHC is demonstrated experimentally: several persistent holograms are localized within separate slices as close as 33 μm apart along the crystal. The excess diffraction efficiency of the localized holograms is used to enhance the robustness of the LHC by multiplexing several holograms per pattern within individual slices of the recording medium. A holographic data storage system based on two-center holographic recording in a doubly doped LiNbO₃:Fe:Mn crystal is also developed, with angular multiplexing capability. The associated imaging system has been optimized for pixel matching of the pixelated bit patterns generated by a spatial light modulator (SLM) through the recording medium onto a camera, and the initial multiplexed holograms show promising contrast between dark and bright pixels. With the optimized imaging system of the developed holographic memory, the implementation of a dynamic read/write data storage system with localized recording is envisioned, and the large diffraction efficiency of the localized holograms enables multilevel (M-ary) data coding to improve the storage density of the system.
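The capacity gain from multilevel coding mentioned above follows from the information content per pixel: a pixel that can take one of M distinguishable levels stores log₂(M) bits instead of 1. A minimal sketch under assumed page dimensions (the 1024x1024 SLM size is hypothetical, not from the thesis):

```python
import math

def capacity_gain(m_levels, n_pixels):
    """Bits stored per hologram page when each SLM pixel encodes one of
    m_levels gray levels, and the gain over binary (2-level) encoding."""
    bits_mary = n_pixels * math.log2(m_levels)
    bits_binary = n_pixels * 1.0
    return bits_mary, bits_mary / bits_binary

# Hypothetical 1024x1024-pixel SLM page with 4-level (M=4) coding
bits, gain = capacity_gain(4, 1024 * 1024)
print(f"{bits / 8 / 1024:.0f} KiB per page, {gain:.0f}x over binary")  # → 256 KiB per page, 2x over binary
```

The practical limit on M is the signal-to-noise ratio of the readout, which is why the large diffraction efficiency of localized holograms is what makes M-ary coding attractive here.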
154

Materials for Magnetic Recording Applications

Burkert, Till January 2005
In the first part of this work, the influence of hydrogen on the structural and magnetic properties of Fe/V(001) superlattices was studied. The local structure of the vanadium-hydride layers was determined by extended x-ray absorption fine structure (EXAFS) measurements. The magnetic ordering in a weakly coupled Fe/V(001) superlattice was investigated using the magneto-optical Kerr effect (MOKE). The interlayer exchange coupling is weakened upon alloying with hydrogen and a phase with short-range magnetic order was observed. The second part is concerned with first-principles calculations of magnetic materials, with a focus on magnetic recording applications. The uniaxial magnetic anisotropy energy (MAE) of Fe, Co, and Ni was calculated for tetragonal and trigonal structures. Based on an analysis of the electronic states of tetragonal Fe and Co at the center of the Brillouin zone, tetragonal Fe-Co alloys were proposed as a material that combines a large uniaxial MAE with a large saturation magnetization. This was confirmed by experimental studies on (Fe,Co)/Pt superlattices. The large uniaxial MAE of L10 FePt is caused by the large spin-orbit interaction on the Pt sites in connection with a strong hybridization between Fe and Pt. Furthermore, it was shown that the uniaxial MAE can be increased by alloying the Fe sublattice with Mn. The combination of the high-moment rare-earth (RE) metals with the high-TC 3d transition metals in RE/Cr/Fe multilayers (RE = Gd, Tb, Dy) gives rise to a strong ferromagnetic effective exchange interaction between the Fe layers and the RE layer. The MAE of hcp Gd was found to have two principal contributions, namely the dipole interaction of the large localized 4f spins and the band electron magnetic anisotropy due to the spin-orbit interaction. The peculiar temperature dependence of the easy axis of magnetization was reproduced on a qualitative level.
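The uniaxial MAE discussed above is commonly summarized by an energy density of the form E(θ) = K₁ sin²θ + K₂ sin⁴θ, where θ is the angle between the magnetization and the symmetry (c) axis; K₁ > 0 gives an easy axis along c. A minimal numerical sketch (the K₁ value is only an order-of-magnitude illustration of a hard magnet, not a computed result from this thesis):

```python
import math

def anisotropy_energy(theta, k1, k2=0.0):
    """Uniaxial magnetocrystalline anisotropy energy density (J/m^3)
    for magnetization at angle theta (radians) from the c axis."""
    s2 = math.sin(theta) ** 2
    return k1 * s2 + k2 * s2 ** 2

# Hypothetical K1 of order 5 MJ/m^3 (hard uniaxial magnets such as
# FePt-class materials are of this magnitude)
k1 = 5.0e6
easy = min(range(0, 181), key=lambda d: anisotropy_energy(math.radians(d), k1))
print(f"easy axis at {easy} degrees from c")  # → easy axis at 0 degrees from c
```

A negative K₁ would instead minimize the energy at θ = 90°, i.e. an easy plane, which is the qualitative distinction the electronic-structure analysis in the thesis is after.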
155

The impact of bus stop micro-locations on pedestrian safety in areas of main attraction

Kovacevic, Vlado S January 2005
From the safety point of view, the bus stop is perhaps the most important part of the bus public transport system, as it is the point where bus passengers may interact directly with other road users and create conflict situations leading to traffic accidents. For example, travellers can be struck walking to or from a bus, or while boarding or alighting. At these locations passengers become pedestrians, often crossing busy arterial roads at the bus stop in areas or at objects of main attraction, usually outside designated pedestrian facilities such as signal-controlled intersections and zebra or pelican crossings. Pedestrians are exposed to risk, or take risks, when they cross the road in front of a stopped bus, behind it, or between buses, particularly where bus stops are located on two-way roads (i.e. within the mid-block of the road with side streets, at non-signalised cross-sections). A better understanding of pedestrian road-crossing risk exposure (crossing distraction, obscurity and behaviour) within bus stop zones is therefore needed, so that it can be incorporated into new designs, bus stop placement and the evaluation of traffic management schemes in which bus stop locations will play an increasingly important role. A full range of possible incidental interactions is presented in a tabular model covering the most common interacting traffic movements within bus stop zones. The thesis focuses on pedestrian safety, discusses the theoretical foundations of bus stops, and determines the types of accident risk between bus travellers as pedestrians and motor vehicles within bus stop zones.
The objectives of this thesis can be summarized as follows: (I) classification of bus stops, particularly according to objects of main attraction (pedestrian-generating activities); (II) analysis of traffic movements and interactions as accident/risk exposure in the zone of bus stops with respect to that structure; (III) categorization of traffic accidents in the vicinity of bus stops, and analysis of the interacting movements that occur within bus stop zones in order to discover the nature of the problems; (IV) formulation of tabular (pedestrian traffic accident prediction) models/forms, based on the traffic interactions that create the possibility of accident conflicts, for practical statistical treatment of accidents related to bus stops; and (V) safety aspects related to the micro-location of bus stops, to assist in micro-location design, the operation of bus stop safety facilities and safer pedestrian crossing between the bus stop and nearby objects of attraction. The scope of this thesis is the theoretical foundation of bus stop micro-location in areas of main attraction or at objects of main attraction, and the types of traffic accident risk that arise between travellers as pedestrians and vehicle flow in the zone of the bus stop. Knowledge of the possible interactions leads to the identification of potential conflict situations between motor vehicles and pedestrians. The discussion of each conflict situation has great potential to increase the knowledge needed to prevent accidents, minimise pedestrian-vehicle conflict in this area, and aid the development and planning of safer bus stops.
156

The Analysis of Big Data on Cities and Regions - Some Computational and Statistical Challenges

Schintler, Laurie A., Fischer, Manfred M. 28 October 2018
Big Data on cities and regions bring new opportunities and challenges to data analysts and city planners. On the one hand, they hold great promise for combining increasingly detailed data on individual citizens with critical infrastructures to plan, govern and manage cities and regions, improve their sustainability, optimize processes and maximize the provision of public and private services. On the other hand, the massive sample size and high dimensionality of Big Data and their geo-temporal character introduce unique computational and statistical challenges. This chapter provides an overview of the salient characteristics of Big Data and of how these features drive a paradigm change in data management and analysis, as well as in the computing environment. / Series: Working Papers in Regional Science
157

Web Repository for sharing, reuse, versioning and evolution of binary content: modeling and analysis by colored Petri nets

Corneli Gomes Furtado Júnior 14 December 2011
The free availability of digital content has grown considerably on the Web. Much of this content can be modified, reused and adapted for specific purposes. Although different resources for providing and storing content are widely available, the Internet lacks repositories with tools suitable for version control of binary content. Most solutions for persisting data on the Web are based on relational databases (RDB). A simple approach is to apply modifications to the original data and store the result as new records in database tables; however, this can be inefficient due to a potentially large amount of redundant information. To overcome this issue, it is possible to adapt versioning tools, known as Version Control Systems (VCS), which specialize in storing only the modified regions of documents, a process known as "deltification". However, in access time and data retrieval a VCS is less efficient than a Database Management System (DBMS), which can reduce the overall performance of the application if a VCS is adopted as the data-persistence solution. Aiming to design a repository of versioned binary content on the Web, with efficient management of both access and storage, this work analyzes the performance of the most widely used free DBMSs and of the VCS considered most suitable for the addressed repository. The results obtained were the basis for specifying the architecture of a repository that relies on a hybrid approach, using a DBMS and a VCS simultaneously and taking into account the features and runtime performance of each tool for each operation required in the final application. The architecture was then modeled with Colored Petri Nets, which allowed its simulation and analysis and demonstrated the greater efficiency of the proposed architecture compared with a traditional storage approach.
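The storage saving from "deltification" can be sketched with a toy version store: the first version is kept in full and later versions are kept only as diffs. This is illustrative only: the thesis deals with binary content and a real VCS, whereas this sketch uses a line diff on text and hypothetical sample data.

```python
from difflib import unified_diff

def store_versions(versions):
    """Store version 1 in full and each later version as a line diff,
    mimicking the 'deltification' of a version control system."""
    repo = [("full", versions[0])]
    for prev, curr in zip(versions, versions[1:]):
        delta = "".join(unified_diff(prev.splitlines(keepends=True),
                                     curr.splitlines(keepends=True)))
        repo.append(("delta", delta))
    return repo

# Hypothetical content: 200 lines, one line changed in version 2
lines = [f"record {i}: payload payload payload\n" for i in range(200)]
v1 = "".join(lines)
lines[100] = "record 100: PATCHED\n"
v2 = "".join(lines)

repo = store_versions([v1, v2])
full_cost = len(v1) + len(v2)               # storing both versions in full
delta_cost = sum(len(p) for _, p in repo)   # full base + delta only
print(f"full copies: {full_cost} bytes, deltified: {delta_cost} bytes")
```

The trade-off the thesis quantifies is visible even here: the delta store is much smaller, but reconstructing version 2 requires replaying the diff, which is why pairing the VCS with a DBMS for fast access makes sense.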
158

The role of the segregation and oxidation tendencies of rare earths in establishing hard-magnetic intermetallic phases in SmCo-based nanoparticles

Schmidt, Frank 18 April 2018
Owing to its very high magnetocrystalline anisotropy constant, the SmCo5 phase is particularly suited for future hard-disk media with high storage density. The strong oxidation tendency and the chemical similarity of the other rare earths make it challenging to produce hard-magnetic SmCo-based nanoparticles by inert gas condensation. Moreover, the surface energy largely determines the properties of nanoparticles, so an element with a low surface energy preferentially forms the surface. This work shows how oxygen-driven oxidation and the different surface energies of the alloy-forming elements influence the structure, morphology and chemical distribution of the elements within the nanoparticles, and thereby control the formation of a hard-magnetic Sm(Pr)Co phase. Using aberration-corrected high-resolution transmission electron microscopy combined with electron energy-loss spectroscopy, the morphology, element distribution and structure of Sm(Pr)Co nanoparticles produced under different conditions are examined and analyzed. The observed segregation of the rare earths to the nanoparticle surface is attributed partly to oxygen-induced segregation and partly to intrinsic segregation, i.e. segregation caused by the different surface energies of the alloy-forming elements. A geometric model developed here distinguishes between these two causes of segregation. Understanding the causal relationships of the segregation enables the production of hard-magnetic intermetallic SmCo-based nanoparticles: nanoparticle agglomerates are specially formed and optically heated in a light furnace so that the primary particles in the agglomerates sinter together and the resulting spherical particle finally crystallizes.
HRTEM images and electron diffraction confirm the successful production of SmCo5- and Sm2Co17-based nanoparticles. The coercive field of these particle ensembles is 1.8 T, with a maximum in the switching-field distribution at 3.6 T. The magnetic properties reflect the analyzed structural, morphological and chemical properties of the nanoparticles.
159

Large scale data collection and storage using smart vehicles: An information-centric approach

Khan, Junaid 04 November 2016
The growth in the number of mobile devices today results in an increasing demand for large amounts of rich multimedia content to support numerous applications. It is challenging for current cellular networks, which are connection-centric by nature, to deal with this demand, both in cost and in bandwidth, for the "massive" content generated and consumed by mobile users in an urban environment. The technological advancement of modern vehicles allows us to harness their computing, caching and communication capabilities to supplement the infrastructure network. It is now possible to recruit smart vehicles to collect, store and share heterogeneous data on urban streets in order to provide citizens with different services. We therefore leverage the recent shift towards Information Centric Networking (ICN) to introduce two schemes, VISIT and SAVING, for the efficient collection and storage of content at vehicles, closer to the urban mobile user, avoiding bandwidth and cost overheads. VISIT is a platform that defines novel centrality metrics based on the social interest of urban users to identify and select the best set of candidate vehicles for urban data collection. SAVING is a social-aware data storage system that exploits complex networks to derive game-theoretic solutions for finding and recruiting vehicles adequate to perform collaborative content caching in an urban environment. VISIT and SAVING were simulated with realistic urban mobility traces of around 2986 vehicles; comparison with other schemes in the literature suggests that both are not only efficient but also scalable data collection and storage systems.
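The idea of selecting vehicles by centrality can be illustrated with a toy weighted-degree ranking over an encounter graph. The metric, edge weights and vehicle names here are hypothetical stand-ins, not the actual centrality definitions introduced by VISIT.

```python
from collections import defaultdict

def rank_vehicles(encounters, top_k=2):
    """Rank vehicles by weighted degree centrality in an encounter graph.
    encounters: iterable of (vehicle_a, vehicle_b, weight) edges, where the
    weight stands in for the social interest of the covered street."""
    score = defaultdict(float)
    for a, b, w in encounters:
        score[a] += w
        score[b] += w
    return sorted(score, key=score.get, reverse=True)[:top_k]

# Hypothetical encounter graph for five vehicles
edges = [
    ("v1", "v2", 3.0), ("v1", "v3", 2.0), ("v2", "v4", 1.0),
    ("v3", "v5", 0.5), ("v1", "v4", 4.0),
]
print(rank_vehicles(edges))  # → ['v1', 'v4']
```

In practice a recruitment platform would recompute such rankings as mobility traces evolve, so that the recruited set keeps covering the streets of highest social interest.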
160

Optimizing Main Memory Usage in Modern Computing Systems to Improve Overall System Performance

Campello, Daniel Jose 20 June 2016
Operating systems use fast, CPU-addressable main memory to maintain an application’s temporary data as anonymous data and to cache copies of persistent data stored in slower block-based storage devices. However, the use of this faster memory comes at a high cost, and several techniques have therefore been proposed in the literature to use main memory more efficiently. In this dissertation we introduce three distinct approaches to improve overall system performance by optimizing main memory usage. First, DRAM and host-side caching of file system data are used to speed up virtual machine performance in today’s virtualized data centers. Clustering VM images that share identical pages, coupled with data deduplication, has the potential to optimize main memory usage, since it provides more opportunity for sharing resources across processes and across different VMs. In our first approach, we study the use of content and semantic similarity metrics and a new algorithm to cluster VM images and place them on hosts where deduplication improves main memory usage. Second, while careful VM placement can improve memory usage by eliminating duplicate data, caches in current systems employ complex machinery to manage the cached data: writing to a page not present in the file system page cache causes the operating system to synchronously fetch the page into memory, blocking the writing process. We address this limitation with a new approach to managing page writes that buffers the written data elsewhere in memory and unblocks the writing process immediately; this buffering allows the system to service file writes faster and with fewer memory resources. In our last approach, we investigate the use of emerging byte-addressable persistent memory technology to extend main memory as a less costly alternative to exclusively using expensive DRAM.
We motivate and build a tiered memory system wherein persistent memory and DRAM co-exist and provide improved application performance at lower cost and power consumption with the goal of placing the right data in the right memory tier at the right time. The proposed approach seamlessly performs page migration across memory tiers as access patterns change and/or to handle tier memory pressure.
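The page-sharing opportunity exploited by the first approach above can be sketched with content-hash deduplication of fixed-size pages. The helper and the two sample "images" are hypothetical; the sketch only shows the principle that identical pages hash to the same key and so need to be kept in memory once.

```python
import hashlib

PAGE = 4096  # bytes per page

def dedup_pages(images):
    """Map content hash -> reference count across VM images; memory needed
    is one copy per unique page instead of one per reference."""
    refs = {}
    for img in images:
        for off in range(0, len(img), PAGE):
            h = hashlib.sha256(img[off:off + PAGE]).hexdigest()
            refs[h] = refs.get(h, 0) + 1
    return refs

# Two hypothetical two-page images sharing their first page
img_a = b"A" * PAGE + b"B" * PAGE
img_b = b"A" * PAGE + b"C" * PAGE
refs = dedup_pages([img_a, img_b])
unique, total = len(refs), sum(refs.values())
print(f"{total} page references, {unique} unique pages after dedup")  # → 4 page references, 3 unique pages after dedup
```

Placing similar VM images on the same host maximizes exactly this ratio of references to unique pages, which is what the content and semantic similarity metrics in the dissertation aim to predict.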
