71. Applications of a Hardware Specification for Instrumentation Metadata. Hamilton, John; Fernandes, Ronald; Graul, Mike; Jones, Charles H. October 2007.
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / In this paper, we discuss the benefits of maintaining a neutral-format hardware specification alongside the telemetry metadata specification. We present several reasons and methods for maintaining the hardware specification, as well as several potential uses of it. These uses include cross-validation against the telemetry metadata and automatic generation of both metadata and instrumentation networks.
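The abstract contains no code; as a hedged illustration of the cross-validation idea it describes, the sketch below checks that every measurement in a telemetry metadata description refers to a channel defined in the hardware specification. All field names (`channels`, `measurements`, `source`) are invented for this example and do not come from the paper.

```python
# Hypothetical sketch of cross-validating telemetry metadata against a
# neutral-format hardware specification. All field names are invented
# for illustration.

def cross_validate(hardware_spec, telemetry_metadata):
    """Return measurement sources referenced in the telemetry metadata
    that have no matching channel in the hardware specification."""
    known_channels = {ch["name"] for ch in hardware_spec["channels"]}
    return [m["source"] for m in telemetry_metadata["measurements"]
            if m["source"] not in known_channels]

hardware_spec = {"channels": [{"name": "accel_x"}, {"name": "temp_1"}]}
telemetry_metadata = {"measurements": [{"source": "accel_x"},
                                       {"source": "strain_3"}]}

print(cross_validate(hardware_spec, telemetry_metadata))  # ['strain_3']
```

A dangling `strain_3` here would indicate that the two specifications have drifted apart, which is exactly the inconsistency cross-validation is meant to catch.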

72. Vendor Interoperability Through MDL. Willden, Greg C.; Seegmiller, Ray D.; Araujo, Maria S.; Abbott, Ben A.; Malatesta, William A. October 2011.
ITC/USA 2011 Conference Proceedings / The Forty-Seventh Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2011 / Bally's Las Vegas, Las Vegas, Nevada / Describing data formats has gone a long way in providing a common thread for moving test programs from one test range to another without incurring massive code rewrites. The introduction of the IRIG 106-93 standard provided the Telemetry Attributes Transfer Standard (TMATS) to achieve interoperability between the test article and ground processing system. The integrated Network Enhanced Telemetry (iNET) Metadata Description Language (MDL) extends the concept to include descriptions of the equipment configuration and setup. This MDL declarative language is both vendor neutral and vendor customizable (where needed) and extends interoperability down to the individual components of the instrumentation system. This paper describes the current state of MDL and its use across intended vendor lines.
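MDL is an XML-based declarative language, so the vendor-neutral/vendor-customizable split the paper describes can be illustrated with XML namespaces. The element names below are invented for illustration and are not taken from the actual MDL schema:

```python
# Illustrative sketch only: these element names are invented and do not
# reflect the real MDL schema defined by the iNET program.
import xml.etree.ElementTree as ET

doc = """
<Configuration xmlns:vendor="http://example.com/vendor-ext">
  <DataAcquisitionUnit name="DAU-1">
    <Channel id="ch1" rate="1000"/>
    <vendor:TuningParameter value="42"/>
  </DataAcquisitionUnit>
</Configuration>
"""

root = ET.fromstring(doc)
dau = root.find("DataAcquisitionUnit")
print(dau.get("name"))  # DAU-1

# Vendor-specific extensions live in their own namespace, so neutral
# tooling can ignore them while vendor tooling can still read them:
ext = dau.find("{http://example.com/vendor-ext}TuningParameter")
print(ext.get("value"))  # 42
```

Keeping vendor extensions in a separate namespace is the standard XML mechanism for the kind of "neutral where possible, customizable where needed" design the paper attributes to MDL.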

73. Savage Modeling and Analysis Language (SMAL) Metadata for Tactical Simulations and X3D Visualizations. Rauch, Travis M.
Visualizing operations environments in three dimensions (3D) supports the warfighters' ability to make rapid, well-informed decisions by presenting complex systems in a naturalistic, integrated display format. Unfortunately, constructing these environments is a time-consuming task requiring specific expertise not typically available in the command center. The future use of 3D visualization in military operations depends on the ability of personnel with minimal graphics experience to create virtual environments quickly and accurately by leveraging data-driven customization of content from model archives with the data available in the command center. Practical 3D visualization depends on standardized scene autogeneration. The Extensible 3D (X3D) Graphics family of specifications is approved by the International Organization for Standardization (ISO) as the Web-based format for the interchange and rendering of 3D scenes. Previous work has demonstrated that an archive of X3D scenes, such as the Scenario Authoring and Visualization for Advanced Graphical Environments (SAVAGE) library, can be used to autogenerate sophisticated 3D tactical environments. Assembling and making sense of the data necessary to autogenerate a 3D environment requires context and good documentation, best accomplished through metadata. Metadata also supports data-centric, component-based design, key philosophies in promoting interoperability of networked applications. Coupled with recent developments in X3D, the enhanced features of the Savage X3D Model archives are now sufficiently mature to support rapid generation of tactical environments. This thesis proposes an XML metadata standard to collect and organize the information necessary to create and populate a tactical 3D virtual environment: the Savage Modeling and Analysis Language (SMAL).
The logical extension of a well-designed standard is the ability to cross the boundaries of usage, allowing simulators to share data with command and control (C2) suites and mission planning tools based on the construction of a virtual scene. SMAL provides the informational "glue" necessary to perform tactical modeling, simulation, and analysis using networked, physics-based X3D virtual environments.

74. A Metadata Service for an Infrastructure of Large Scale Distributed Scientific Datasets. Adeleke, Oluwalani Aeoluwa. 12 June 2014.
In this constantly growing, information-technology-driven era, data migration and replication pose a serious bottleneck in the distributed database infrastructure environment. For large heterogeneous environments with domains such as geospatial science and high energy physics, where large arrays of scientific data are involved, diverse challenges are encountered with respect to dataset identification, location services, and efficient retrieval of information. These challenges include locating data sources, identifying effective transfer routes, and replication, to mention a few. As distributed systems aimed at constant delivery of data to the point of query origination continue to expand in size and functionality, efficient replication and data retrieval systems have become increasingly important and relevant. One such system is an infrastructure for large scale distributed scientific data management. Several data management systems have been developed to help manage these fast-growing datasets and their metadata. However, little work has been done on allowing cross-communication and data sharing between these different dataset management systems in a distributed, heterogeneous environment.
This dissertation addresses this problem, focusing particularly on metadata and the provenance service associated with it. We present the Virtual Unified Metadata architecture to establish communication between remote sites within a distributed heterogeneous environment using a client-server model. The system provides a framework that allows heterogeneous metadata services to communicate and share metadata and datasets through the implementation of a communication interface. It allows for metadata discovery and dataset identification by enabling remote queries between heterogeneous metadata repositories. The significant contributions of this system include:
- the design and implementation of a client/server based remote metadata query system for scientific datasets within distributed heterogeneous dataset repositories;
- the implementation of a caching mechanism for optimizing system performance;
- an analysis of the quality of service with respect to correct dataset identification, estimation of migration and replication time frames, and cache performance.
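A minimal sketch of the kind of client-side caching the dissertation describes, assuming a stand-in `remote_query` function in place of a real cross-repository metadata lookup:

```python
# Hedged sketch of a metadata-query cache; remote_query is a stand-in
# for an expensive lookup against a remote metadata repository
# (network round-trip, translation between heterogeneous schemas, ...).
import functools

def remote_query(dataset_id):
    remote_query.calls += 1  # count round-trips for demonstration
    return {"id": dataset_id, "location": "site-A"}
remote_query.calls = 0

@functools.lru_cache(maxsize=256)
def cached_query(dataset_id):
    return remote_query(dataset_id)

for ds in ["d1", "d2", "d1", "d1"]:
    cached_query(ds)

print(remote_query.calls)  # 2 -- repeated lookups served from the cache
```

Four queries trigger only two remote round-trips; this repeat-hit saving is what a cache-performance analysis like the one above would quantify.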

75. Linking Music Metadata. Macrae, Robert. January 2012.
The internet has facilitated music metadata production and distribution on an unprecedented scale. A contributing factor to this data deluge is a change in the authorship of this data from the expert few to the untrained crowd. The resulting unordered flood of imperfect annotations provides challenges and opportunities in identifying accurate metadata and linking it to the music audio in order to provide a richer listening experience. We advocate novel adaptations of Dynamic Programming for music metadata synchronisation, ranking and comparison. This thesis introduces Windowed Time Warping, Greedy, Constrained On-Line Time Warping for synchronisation, and the Concurrence Factor for automatically ranking metadata. We begin by examining the availability of various music metadata on the web. We then review Dynamic Programming methods for aligning and comparing two source sequences, and present novel, specialised adaptations for efficient, real-time synchronisation of music and metadata that improve on the speed and accuracy of existing algorithms. The Concurrence Factor, which measures the degree to which an annotation of a song agrees with its peers, is proposed in order to use the wisdom of the crowd to establish a ranking system. This attribute uses a combination of the standard Dynamic Programming methods Levenshtein Edit Distance, Dynamic Time Warping, and Longest Common Subsequence to compare annotations. We present a synchronisation application for applying the aforementioned methods, as well as a tablature-parsing application for mining and analysing guitar tablatures from the web. We evaluate the Concurrence Factor as a ranking system on a large-scale collection of guitar tablatures and lyrics, and show a correlation with accuracy that is superior to that of existing methods used in internet search engines, which are based on popularity and human ratings.
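As a toy sketch of the Concurrence Factor idea (the thesis combines Levenshtein edit distance, Dynamic Time Warping, and Longest Common Subsequence; Python's `difflib` similarity merely stands in for those measures here), each annotation is scored by its average agreement with its peers:

```python
# Toy sketch: rank crowd-sourced annotations of a song by how strongly
# each one agrees with the others, so widely-agreeing versions rise to
# the top. difflib's ratio stands in for the thesis's distance measures.
from difflib import SequenceMatcher

def concurrence(annotations):
    scores = []
    for i, a in enumerate(annotations):
        peers = [SequenceMatcher(None, a, b).ratio()
                 for j, b in enumerate(annotations) if j != i]
        scores.append(sum(peers) / len(peers))
    return scores

lyrics_versions = [
    "hello darkness my old friend",
    "hello darkness my old friend",
    "hello darkness my olde freind",
    "completely unrelated text here",
]
ranked = sorted(zip(concurrence(lyrics_versions), lyrics_versions),
                reverse=True)
print(ranked[0][1])   # hello darkness my old friend
print(ranked[-1][1])  # completely unrelated text here
```

The outlier annotation gets a low score without any ground truth being consulted, which is the "wisdom of the crowd" effect the thesis exploits for ranking.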

76. Butterfly: A Model of Provenance. Tang, Yaobin. 13 March 2009.
Semantically rich metadata is foreseen to be pervasive in tomorrow's cyber world. People are increasingly willing to store metadata in the hope that such extra information will enable a wide range of novel business intelligence applications. Provenance is metadata that describes the derivation history of data. It is considered to have great potential for supporting the reasoning about, analysis, validation, monitoring, integration and reuse of data. Although there are a few application-specific systems equipped with some degree of provenance-tracking functionality, few formal models of provenance exist. A general-purpose, formal model of provenance is desirable not only to promote the storage and inventive use of provenance, but also to prepare for the emergence of so-called provenance management systems. In this thesis, I propose Butterfly, a general-purpose provenance model, which offers the capability to model, store, and query provenance. It consists of a semantic model for describing provenance, and an extensible algebraic query model for querying provenance. An initial implementation of the provenance model is also briefly discussed.
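As a hedged sketch (not the Butterfly model itself), provenance can be represented as a derivation graph, with a query that walks an item's derivation history:

```python
# Minimal illustration of provenance as a derivation graph: each data
# item records which items it was derived from, and a query walks that
# history transitively. Item names are invented for the example.
def ancestors(provenance, item):
    """All items the given item was directly or transitively derived from."""
    seen = set()
    stack = list(provenance.get(item, []))
    while stack:
        p = stack.pop()
        if p not in seen:
            seen.add(p)
            stack.extend(provenance.get(p, []))
    return seen

# item -> list of items it was derived from
provenance = {
    "report.pdf": ["cleaned.csv"],
    "cleaned.csv": ["raw.csv", "schema.json"],
}
print(sorted(ancestors(provenance, "report.pdf")))
# ['cleaned.csv', 'raw.csv', 'schema.json']
```

Queries like this (ancestors, descendants, derivation paths) are the kind of operation an algebraic provenance query model would compose and generalise.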

77. Improvements in the Effectiveness of Information Management in Construction Organisations. Sheriff, Abdullahi. January 2011.
Managing information effectively is crucial for improving processes within organisations. To this end, a more holistic approach is required which aligns an organisation's information and technology needs with its people, processes and business model. This shifts the emphasis from collaborating on individual projects to a long-term view of how best to support an organisation's business strategy. It also shifts the emphasis from managing information (storing information to exploit it) to managing with information (harnessing information to gain competitive advantage across the organisation). This thesis is the culmination of four years of EngD research critically investigating improvements in the effectiveness of Information Management (IM) in construction industry-based organisations. It begins with an introduction to the research and its aims and objectives, then presents a detailed review of related literature and the methodology employed in conducting the research. The eight research tasks, carried out using extensive, varied and appropriate methods including literature reviews, case studies, interviews, surveys and workshops, are then explained, and a number of key recommendations are drawn from them. The findings define IM as distinct from Information Technology (IT) and Information Systems (IS) and provide a critical insight into IM in construction industry-based organisations, including its strategic drivers, key influencers and barriers.
It further highlights the four components of a holistic approach to IM: the Content Model - the nature and structure of the content, including metadata; the Enterprise Model - the operational structure of the company, including its business processes and information-dependent activities; Systems and Technology - the most appropriate technological solutions in light of the defined content and enterprise models; and Implementation and Change Management - support for people and managing the transition to improved ways of working. The findings show that a clear alignment between an organisation's information needs and its operational processes is key to developing and implementing an effective IM strategy. They show that a unified approach is more appropriate for organisations than a single approach, as certain multi-dimensional, context-specific factors influence the nature of each organisation's needs and therefore the most suitable solutions to address them. The findings also demonstrate the importance of metadata in integrating organisations and enabling a holistic IM strategy. A standard IM framework is developed and validated which provides a structured and systematic approach for organisations seeking to develop metadata standards suited to their needs. The holistic approach, applied to a number of case studies, provides novel insights into the use of metadata in information retrieval, the development of aligned information architecture, and the creation of holistic strategies for improving knowledge sharing and innovation. The thesis concludes with a critical review of the research and a number of key recommendations for organisations, IM research and the wider construction industry. All the findings and insights have been and continue to be disseminated through a number of peer-reviewed publications.

78. Using Analytic Tools to Measure Overall Trends and Growth Patterns in Digital Commons Collections. Mabry, Holly; Jolley, Daniel. 1 June 2018.
Digital Commons @ Gardner-Webb University was launched in Fall 2015 and currently holds over 1300 papers, including theses and dissertations; journals in Education, Psychology, and Undergraduate Research; University Archives; and faculty scholarship. The repository has a small but growing number of collections that continue to show significant year-to-year increases in document download counts, particularly the nursing and education theses and dissertations collections.
Digital Commons provides a number of ways to track collection statistics and identify repository access and download trends. This presentation will look at how we used the Digital Commons Dashboard report tool and Google Analytics to identify the most popular collections and where they’re being accessed on campus and globally. Using this data, we were able to write targeted metadata and include third party tools such as the Internet Archive BookReader in order to improve outreach to the campus and global scholarly community.
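The numbers below are hypothetical, but a small helper of this sort could be run over Dashboard or Google Analytics exports to compute the year-over-year download growth the presentation tracks:

```python
# Hypothetical example: compute year-over-year percentage growth in
# download counts per collection from exported yearly totals.
def yoy_growth(counts):
    """counts: {year: downloads} -> {year: % change from prior year}"""
    years = sorted(counts)
    return {y: round(100 * (counts[y] - counts[p]) / counts[p], 1)
            for p, y in zip(years, years[1:])}

# Invented figures for a nursing theses/dissertations collection:
nursing_etds = {2016: 800, 2017: 1200, 2018: 1500}
print(yoy_growth(nursing_etds))  # {2017: 50.0, 2018: 25.0}
```

Computing growth rates rather than raw counts makes small collections with accelerating use stand out, which matches the presentation's focus on identifying the most popular and fastest-growing collections.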

79. Metadaten in der Physik [Metadata in Physics]. Thomas Severiens. 20 January 1997.
No description available.

80. Meta Tag Usage and Credibility Factors in Alternative Medicine Websites. Andre S. Burton. 19 April 2004.
The wide range of health information sources on the World Wide Web has the potential to lead to the distribution of inaccurate medical information from unqualified sources, which carries considerable risk. Given the growing number of Internet users who access health-related information, the need for a more standard means to validate web site content is apparent. This paper examines how source, information, timeliness, accessibility, and design factors impact web document credibility in a narrower health topic: alternative medicine. It also examines how different levels of credibility contrast with metadata usage, as well as the relationships between metadata usage measures. These preliminary results give an overview of how metadata is currently being used in this subject area.