121 |
Metadaty řízený finanční reporting firmy / Metadata driven financial reporting of a company. Kunst, Jan. January 2013 (has links)
This diploma thesis analyses the typical financial-reporting requirements of mid-sized and large companies and describes the design and development of a reporting tool that meets them. The goal is a universal financial-reporting tool that lets users configure, easily and without touching the source code, those parts of the tool that usually differ between companies and therefore cannot be fixed both universally and precisely in advance. The tool is developed in the QlikView environment, which offers many advantages for high-quality reporting. My contribution lies in applying non-trivial techniques and exploiting the possibilities of the QlikView environment to the full, so that the resulting financial-reporting tool is modular, parameterizable, extensible, and usable in any company environment, and so that the task of setting up reports can be shifted to the user.
|
122 |
Automatic Language Identification for Metadata Records: Measuring the Effectiveness of Various Approaches. Knudson, Ryan Charles. 05 1900 (has links)
Automatic language identification has been applied to short texts such as queries in information retrieval, but it has not yet been applied to metadata records. Applying this technology to metadata records, particularly their title elements, would enable creators of metadata records to obtain a value for the language element, which is often left blank due to a lack of linguistic expertise, and would also allow a language value to be added to existing records that currently lack one. Titles are a natural target for language identification, but their shortness makes it harder to identify the language accurately. This study implemented four proven approaches to language identification, as well as one open-source approach, on a collection of multilingual titles of books and movies. Of the five approaches considered, a reduced n-gram frequency profile and distance measure approach outperformed all others, accurately identifying over 83% of the titles in the collection. Future plans are to offer this technology to curators of digital collections.
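For readers unfamiliar with the winning technique, a minimal sketch of an n-gram frequency profile approach with an out-of-place distance measure (in the spirit of the method the study evaluates) is shown below; the training texts and language labels are invented placeholders, not the study's corpus or code.

```python
from collections import Counter

def ngram_profile(text, n=3, top_k=300):
    """Build a rank-ordered profile of the most frequent character n-grams."""
    text = " " + text.lower() + " "
    grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    return [g for g, _ in grams.most_common(top_k)]

def out_of_place_distance(doc_profile, lang_profile):
    """Sum of rank differences; n-grams absent from the language profile get the maximum penalty."""
    max_penalty = len(lang_profile)
    rank = {g: i for i, g in enumerate(lang_profile)}
    return sum(abs(i - rank.get(g, max_penalty)) for i, g in enumerate(doc_profile))

def identify_language(title, lang_profiles):
    """Pick the language whose profile is closest to the title's profile."""
    doc = ngram_profile(title)
    return min(lang_profiles, key=lambda lang: out_of_place_distance(doc, lang_profiles[lang]))

# Hypothetical training data -- real profiles would be built from large corpora.
profiles = {
    "eng": ngram_profile("the quick brown fox jumps over the lazy dog " * 50),
    "deu": ngram_profile("der schnelle braune fuchs springt ueber den faulen hund " * 50),
}
print(identify_language("Der Zauberberg", profiles))  # expected: "deu"
```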
|
123 |
Exploration of RDA-Based MARC21 Subject Metadata in Worldcat Database and Its Readiness to Support Linked Data Functionality. Zavalin, Vyacheslav I. 08 1900 (has links)
The subject of an information entity is one of the fundamental concepts in information science. The subject of a document represents its intellectual content, the 'aboutness' of the document. Traditionally, subject (along with title and author) is one of the three major access points to information, so subject metadata plays a central role in information discovery, and that role keeps growing. Previous research concluded that the larger a bibliographic database is, the richer the subject vocabularies and classification schemes needed to support information discovery, and that a high proportion of information objects are unretrievable without subject headings in their metadata records. This exploratory study analyses the subject metadata in MARC 21 bibliographic records created in 2020 and develops an understanding of the level and patterns of 'aboutness' representation in those records. The study also examines how these records apply the recent RDA and MARC 21 guidelines and features intended to support functionality in a Linked Data environment. Methods of social network analysis were applied, along with content analysis, to answer the research questions of this study. Suggestions for future research, implications for education, and practical recommendations for library metadata creation and management are discussed.
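As an illustration of how network-analysis methods can be combined with subject metadata, the sketch below builds a co-occurrence graph of subject headings taken from invented MARC 21-style records and computes degree centrality with networkx; it is a sketch of the general technique, not the study's actual procedure or data.

```python
from itertools import combinations
import networkx as nx

# Hypothetical subject-heading sets, as extracted from MARC 21 6XX fields.
records = [
    {"Metadata", "Linked data", "Cataloging"},
    {"Metadata", "RDA (Resource description & access)"},
    {"Linked data", "Semantic Web", "Metadata"},
]

G = nx.Graph()
for headings in records:
    for a, b in combinations(sorted(headings), 2):
        # Weight edges by how often two headings co-occur in one record.
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1
        else:
            G.add_edge(a, b, weight=1)

# Degree centrality highlights headings that co-occur with many others.
for heading, score in sorted(nx.degree_centrality(G).items(), key=lambda x: -x[1]):
    print(f"{score:.2f}  {heading}")
```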
|
124 |
Einsatz und Bewertung komponentenbasierter Metadaten in einer föderierten Infrastruktur für Sprachressourcen am Beispiel der CMDI / Use and evaluation of component-based metadata in a federated infrastructure for language resources, using the example of CMDI. Eckart, Thomas. 29 July 2016 (has links)
The thesis examines the use of the Component Metadata Infrastructure (CMDI) within the federated infrastructure CLARIN and points out a number of concrete problem cases.
To develop corresponding solution strategies, several methods are adapted and applied to the quality analysis of metadata and to the optimisation of their use in a federated environment. In particular, this concerns the adoption of modelling strategies from the Linked Data community, the transfer of principles and quality metrics from object-oriented programming to CMD metadata components, and the use of centrality measures from graph and network analysis to assess the cohesion of the metadata federation as a whole.
In doing so, the thesis focuses on the analysis of the schemas and schema parts in use and on the instance vocabularies employed across all participating centres.
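To make the graph-analysis idea concrete, a small sketch is given below: CMD components and the profiles that reuse them are modelled as a directed graph, and centrality measures indicate how strongly individual components hold the federation together. The component names are invented and do not come from the CLARIN Component Registry.

```python
import networkx as nx

# Hypothetical reuse relations: an edge X -> Y means profile/component X embeds component Y.
reuse = [
    ("TextCorpusProfile", "GeneralInfo"),
    ("TextCorpusProfile", "Access"),
    ("LexiconProfile", "GeneralInfo"),
    ("LexiconProfile", "Contact"),
    ("Access", "Contact"),
]

G = nx.DiGraph(reuse)

# Betweenness centrality flags components that tie otherwise separate profiles together;
# in-degree shows how widely a component is reused across the federation.
betweenness = nx.betweenness_centrality(G)
for node in G.nodes:
    print(f"{node:18s} reuse={G.in_degree(node)}  betweenness={betweenness[node]:.3f}")
```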
|
125 |
MINING IRIG-106 CHAPTER 10 AND HDF-5 DATA. Lockard, Michael T., Rajagopalan, R., Garling, James A. 10 1900 (has links)
ITC/USA 2006 Conference Proceedings / The Forty-Second Annual International Telemetering Conference and Technical Exhibition / October 23-26, 2006 / Town and Country Resort & Convention Center, San Diego, California / Rapid access to ever-increasing amounts of test data is becoming a problem. The authors have developed a data-mining methodology to catalog test files, search metadata attributes to identify test data files of interest, and query test data measurements with a web-based engine that produces results in seconds. Generated graphs give the user an overview of the entire test for a selected set of measurements, with the areas where the query conditions were satisfied highlighted. The user can then zoom into areas of interest and export selected information.
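A minimal sketch of the cataloguing and query idea, assuming test results are stored as HDF-5 files whose root attributes carry the test metadata; the file pattern and attribute names are hypothetical, and the paper's actual web-based engine is not reproduced here.

```python
import glob
import h5py

def catalog(pattern="tests/*.h5"):
    """Scan HDF-5 files and collect their root-level metadata attributes."""
    entries = []
    for path in glob.glob(pattern):
        with h5py.File(path, "r") as f:
            entries.append({"path": path, **dict(f.attrs)})
    return entries

def query(entries, **conditions):
    """Return catalog entries whose metadata matches every given attribute value."""
    return [e for e in entries if all(e.get(k) == v for k, v in conditions.items())]

if __name__ == "__main__":
    cat = catalog()
    # 'aircraft' and 'test_type' are illustrative attribute names only.
    for hit in query(cat, aircraft="F-16", test_type="vibration"):
        print(hit["path"])
```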
|
126 |
iNET System Operational Flows. Grace, Thomas B., Abbott, Ben A., Moodie, Myron L. 10 1900 (links)
ITC/USA 2010 Conference Proceedings / The Forty-Sixth Annual International Telemetering Conference and Technical Exhibition / October 25-28, 2010 / Town and Country Resort & Convention Center, San Diego, California / The integrated Network-Enhanced Telemetry (iNET) project is transitioning from standards development to deployment of systems. In fielding a Telemetry Network System (TmNS) demonstration system, one must choose and integrate technological building blocks from the suite of standards to implement new test capabilities. This paper describes the operation of a TmNS and identifies the management, configuration, control, acquisition, and distribution of information and the associated operational flows. These items are discussed by walking through a notional system and the mechanisms identified by the iNET standards. Note that at the time of writing the efforts discussed are only at the very beginning of the design process and will likely evolve as that process continues.
|
127 |
IHAL and Web Service Interfaces to Vendor Configuration Engines. Hamilton, John, Darr, Timothy, Fernandes, Ronald, Sulewski, Joe, Jones, Charles. 10 1900 (has links)
ITC/USA 2010 Conference Proceedings / The Forty-Sixth Annual International Telemetering Conference and Technical Exhibition / October 25-28, 2010 / Town and Country Resort & Convention Center, San Diego, California / In this paper, we present an approach to achieving standards-based multi-vendor hardware configuration. This approach uses the Instrumentation Hardware Abstraction Language (IHAL) and a standardized web service Application Programming Interface (API) specification to allow any Instrumentation Support System (ISS) to control instrumentation hardware in a vendor-neutral way without requiring non-disclosure agreements or knowledge of proprietary information. Additionally, we describe a real-world implementation of this approach using KBSI's InstrumentMap application and an implementation of the web service API by L-3 Communications Telemetry East.
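To illustrate the kind of interaction the paper describes, here is a client-side sketch of a vendor-neutral configuration call; the endpoint URL, resource path, and JSON fields are hypothetical stand-ins and do not reflect the published IHAL web service API.

```python
import requests

BASE = "http://configserver.example.com/api"  # hypothetical endpoint

def set_measurement(device_id, name, sample_rate_hz):
    """Ask the vendor's configuration engine to program one measurement channel."""
    payload = {"measurement": name, "sampleRate": sample_rate_hz}
    resp = requests.post(f"{BASE}/devices/{device_id}/measurements", json=payload, timeout=10)
    resp.raise_for_status()
    return resp.json()

# The ISS only speaks this neutral API; the vendor service translates it into
# proprietary hardware settings behind the interface.
print(set_measurement("dau-01", "left_wing_strain", 2000))
```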
|
128 |
A Model-Based Methodology for Managing T&E Metadata. Hamilton, John, Fernandes, Ronald, Darr, Timothy, Graul, Michael, Jones, Charles, Weisenseel, Annette. 10 1900 (has links)
ITC/USA 2009 Conference Proceedings / The Forty-Fifth Annual International Telemetering Conference and Technical Exhibition / October 26-29, 2009 / Riviera Hotel & Convention Center, Las Vegas, Nevada / In this paper, we present a methodology for managing diverse sources of T&E metadata. Central to this methodology is the development of a T&E Metadata Reference Model, which serves as the standard model for T&E metadata types, their proper names, and their relationships to each other. We describe how this reference model can be mapped to a range's own T&E data and process models to provide a standardized view into each organization's custom metadata sources and procedures. Finally, we present an architecture that uses these models and mappings to support cross-system metadata management tasks and makes these capabilities accessible across the network through a single portal interface.
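The mapping idea can be sketched as a simple translation table from a range's local metadata field names to reference-model terms; the field names below are invented for illustration and are not taken from the T&E Metadata Reference Model.

```python
# Hypothetical mapping from one range's local metadata fields to reference-model terms.
RANGE_TO_REFERENCE = {
    "msmt_name": "Measurement.Name",
    "msmt_units": "Measurement.Units",
    "dau_id": "DataSource.Identifier",
}

def to_reference_model(local_record):
    """Re-key a local metadata record using the shared reference-model vocabulary."""
    return {RANGE_TO_REFERENCE.get(k, k): v for k, v in local_record.items()}

print(to_reference_model({"msmt_name": "LeftAileronPosition", "msmt_units": "deg", "dau_id": "DAU-3"}))
```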
|
129 |
METADATA MODELING FOR AIRBORNE DATA ACQUISITION SYSTEMS. Kupferschmidt, Benjamin, Pesciotta, Eric. 10 1900 (has links)
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / Many engineers express frustration with the multitude of vendor-specific tools required to describe measurements and configure data acquisition systems. In general, tools are incompatible between vendors, forcing the engineer to enter the same or similar data multiple times. With the emergence of XML technologies, user-centric data modeling for the flight test community is now possible. With this class of technology, a vendor-neutral, standard language for defining measurements and configuring systems may finally be realized. However, such a universal language can easily become too abstract, making it untenable for hardware configuration and resulting in low vendor adoption. Conversely, a language that caters too much to vendor-specific configuration defeats its purpose. Achieving this balance is not trivial, but it is possible, and doing so will produce a useful standard without putting it out of reach of equipment vendors. This paper discusses the concept, merits, and possible solutions for a standard measurement metadata model. Practical solutions using XML and related technologies are discussed.
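As a sketch of what a vendor-neutral measurement description might look like in XML (element and attribute names invented, not drawn from any published standard), the snippet below builds and re-reads one measurement definition with Python's standard library.

```python
import xml.etree.ElementTree as ET

# Hypothetical vendor-neutral measurement description.
xml_src = """
<Measurement name="EngineOilTemp">
  <DataType>float32</DataType>
  <Units>degC</Units>
  <SampleRate unit="Hz">100</SampleRate>
  <VendorExtensions>
    <!-- Vendor-specific settings live in a clearly fenced-off area. -->
    <Param vendor="AcmeDAU" key="filterCutoff" value="30"/>
  </VendorExtensions>
</Measurement>
"""

meas = ET.fromstring(xml_src)
print(meas.get("name"), meas.findtext("Units"), meas.findtext("SampleRate"))
```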
|
130 |
OntoStudyEdit. Uciteli, Alexandr, Herre, Heinrich. 10 February 2016 (has links) (PDF)
Background: Specifying metadata for clinical and epidemiological study projects requires considerable effort. The validity and quality of the collected data depend heavily on a precise and semantically correct representation of their metadata. In the various research organizations that plan and coordinate studies, the required metadata are specified differently, depending on many conditions, e.g., on the study management software in use. That software does not always meet the needs of a particular research organization, e.g., with respect to the relevant metadata attributes and structuring possibilities.

Methods: The objective of the research set forth in this paper is the development of a new approach to ontology-based representation and management of metadata. The basic features of this approach are demonstrated by the software tool OntoStudyEdit (OSE). The OSE is designed and developed according to the three-ontology method, a software development method based on the interaction of three kinds of ontologies: a task ontology, a domain ontology, and a top-level ontology.

Results: The OSE can easily be adapted to different requirements, and it supports an ontologically founded representation and efficient management of metadata. Metadata specifications can be imported from various sources, edited with the OSE, and exported to several formats used, e.g., by different study management software.

Conclusions: Advantages of this approach are the adaptability of the OSE by integrating suitable domain ontologies, the ontological specification of mappings between the import/export formats and the domain ontology, the specification of study metadata in a uniform manner and its reuse in different research projects, and intuitive data entry for non-expert users.
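A minimal sketch of the "specify once, export to several formats" idea, assuming a very simple item structure; the data model and export targets are illustrative assumptions, not OSE's actual implementation.

```python
from dataclasses import dataclass, asdict
import csv
import io
import json

@dataclass
class MetadataItem:
    """One study data item, described once and reused across export targets."""
    name: str
    datatype: str
    description: str

items = [
    MetadataItem("age", "integer", "Age of the participant in years"),
    MetadataItem("smoker", "boolean", "Current smoking status"),
]

# Export the same specification as JSON...
print(json.dumps([asdict(i) for i in items], indent=2))

# ...and as CSV, e.g. for a study management system that expects tabular input.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "datatype", "description"])
writer.writeheader()
writer.writerows(asdict(i) for i in items)
print(buf.getvalue())
```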
|