681 |
Linked Data in Libraries: A Case Study of Harvesting and Sharing Bibliographic Metadata with BIBFRAME. Tharani, Karim, 30 April 2015 (has links)
By way of a case study, this paper illustrates and evaluates the Bibliographic Framework (or BIBFRAME) as a means for harvesting and sharing bibliographic metadata over the Web for libraries. BIBFRAME is an emerging framework developed by the Library of Congress for bibliographic description based on Linked Data. Much like the Semantic Web, the goal of Linked Data is to make the Web “data aware” and transform the existing Web of documents into a Web of data. Linked Data leverages the existing Web infrastructure and allows linking and sharing of structured data for human and machine consumption.
The BIBFRAME model attempts to contextualize the Linked Data technology for libraries. Library applications and systems contain high-quality structured metadata, but this data is generally static in its presentation and seldom integrated with other internal metadata sources or linked to external Web resources. With BIBFRAME, existing disparate library metadata sources such as catalogs and digital collections can be harvested and integrated over the Web. In addition, bibliographic data enriched with Linked Data could offer richer navigational control and access points for users. Under Linked Data principles, metadata from libraries could also become harvestable by search engines, transforming dormant catalogs and digital collections into active knowledge repositories. Experimenting with Linked Data using existing bibliographic metadata thus holds the potential to empower libraries to harness the reach of commercial search engines to continuously discover, navigate, and obtain new domain-specific knowledge resources on the basis of their verified metadata.
The initial part of the paper introduces BIBFRAME and discusses Linked Data in the context of libraries. The final part outlines a step-by-step process for implementing BIBFRAME with existing library metadata.
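To make the harvesting idea concrete, here is a minimal sketch, written with the Python rdflib library, of what expressing one harvested catalog record as BIBFRAME-style Linked Data could look like. It is not the paper's implementation: the record values, the example.org URIs, and the use of rdfs:label for the title are illustrative assumptions; bf: is the Library of Congress BIBFRAME ontology namespace.

# A minimal sketch (not the paper's implementation) of expressing one
# harvested catalog record as BIBFRAME-style Linked Data with rdflib.
# The record values and example.org URIs are invented for illustration.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, RDFS

BF = Namespace("http://id.loc.gov/ontologies/bibframe/")

g = Graph()
g.bind("bf", BF)

# Hypothetical identifiers for a Work and one of its Instances.
work = URIRef("http://example.org/works/123")
instance = URIRef("http://example.org/instances/123-1")

g.add((work, RDF.type, BF.Work))
g.add((work, RDFS.label, Literal("Linked Data in Libraries")))
g.add((instance, RDF.type, BF.Instance))
g.add((instance, BF.instanceOf, work))  # an Instance realizes a Work

# Serializing to Turtle yields triples other systems can harvest and link.
print(g.serialize(format="turtle"))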
|
682 |
Selection of maintenance policies for a data warehousing environment : a cost based approach to meeting quality of service requirements. Engström, Henrik, January 2002 (has links)
No description available.
|
683 |
Estimating the parameters of the truncated normal distribution. Akhter, A. S., January 1987 (has links)
No description available.
|
684 |
Frame alignment for digital transmission : Analysis and optimization. Al-Subbagh, M. N., January 1988 (has links)
No description available.
|
685 |
Lattice vector quantization for image coding. Sampson, Demetrios G., January 1995 (has links)
No description available.
|
686 |
Analysis and application of polarisation diversity radar data. McGuinness, R., January 1984 (has links)
No description available.
|
687 |
Optical buffering scheme for optical TCP/IP networks. Chin, Soon Wa, January 2003 (has links)
No description available.
|
688 |
A study of local area mortality rates in Greater Glasgow. Mohamed, Yasmin, January 1988 (has links)
No description available.
|
689 |
A study of the one-dimensional inverse problem in ultrasonic systems. Lewis, J. E., January 1987 (has links)
No description available.
|
690 |
Approaching “Big Data” in Biological Research: Imaging Spectroscopy with Novel Compression. Chen, Yixuan, 10 April 2014 (has links)
This research focuses on providing a fast and space-efficient compression method for answering information queries on spectroscopic data. Our primary hypothesis was that a conversion from decimal data to character/integer space could be done in a manner that enables the use of succinct structures and provides good compression. The compression algorithm is motivated by the need to handle queries on spectroscopic datasets that approach the limits of main computer memory.
The primary hypothesis is supported: the new compression method saves 79.20% to 94.07% of storage space on average. The average of the maximum error rates is also acceptable, ranging from 0.05% to 1.36% depending on the subject from which the data was collected. Additionally, compression rate and data entropy are negatively correlated, while compression rate and maximum error are positively correlated once the maximum error rates undergo a natural logarithm transformation. The effects of different types of data sources on compression rate were also studied: fungus datasets achieved the highest compression rates, while mouse brain datasets obtained the lowest among the four types of data sources. Finally, the effect of the studied compression method on integrating spectral bands was investigated. The spectral integrations for determining lipid, CH2, and dense core plaque yielded good image quality with inconsequential errors; the exception was the determination of creatine deposits, which remain recognizable in the reconstructed image, though image quality was reduced.
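As a rough illustration of the core idea in this abstract, and not the thesis's actual algorithm, the sketch below quantizes synthetic floating-point spectral intensities into a small integer space so that a generic byte-oriented compressor can exploit the redundancy, then reports the space saved and the maximum relative error introduced. Every parameter here, including the 16-bit code width, the zlib compressor, and the synthetic spectrum, is an assumption for demonstration only.

# A minimal sketch (not the thesis's method): quantize decimal spectral
# data to integers, compress the integer codes, and measure the space
# savings and maximum relative error of the lossy round trip.
import zlib
import numpy as np

rng = np.random.default_rng(0)
spectrum = np.abs(rng.normal(1.0, 0.2, 100_000)).astype(np.float64)

# Uniform quantization to 16-bit integer codes over the data range.
lo, hi = spectrum.min(), spectrum.max()
scale = (2**16 - 1) / (hi - lo)
codes = np.round((spectrum - lo) * scale).astype(np.uint16)

compressed = zlib.compress(codes.tobytes(), 9)
reconstructed = codes.astype(np.float64) / scale + lo

savings = 1 - len(compressed) / spectrum.nbytes
max_rel_error = np.max(np.abs(reconstructed - spectrum) / np.abs(spectrum))
print(f"space saved: {savings:.2%}, max relative error: {max_rel_error:.4%}")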
|