1

Semantic Web Technologies for T&E Metadata Verification and Validation

Darr, Timothy, Fernandes, Ronald, Hamilton, John, Jones, Charles, Weisenseel, Annette 10 1900 (has links)
ITC/USA 2009 Conference Proceedings / The Forty-Fifth Annual International Telemetering Conference and Technical Exhibition / October 26-29, 2009 / Riviera Hotel & Convention Center, Las Vegas, Nevada / The vision of the semantic web is to unleash the next generation of information sharing and interoperability by encoding meaning into the symbols that are used to describe various computational capabilities within the World Wide Web or other networks. This paper describes the application of semantic web technologies to Test and Evaluation (T&E) metadata verification and validation. Verification is a quality process used to evaluate whether a product, service, or system complies with a regulation, specification, or conditions imposed at the start of a development phase or which exist in the organization. Validation is the process of establishing documented evidence that provides a high degree of assurance that a product, service, or system accomplishes its intended requirements. While validation often involves acceptance and suitability testing with external customers, automation can provide significant assistance to these processes.
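The verification idea described in this abstract (checking metadata against a machine-readable rule) can be sketched minimally with plain subject-predicate-object triples. This is an illustrative sketch only, not the authors' system; all names (`te:Measurement`, `te:unit`) are hypothetical.

```python
# Toy T&E metadata as RDF-style triples. Predicate and class names
# are made up for illustration; a real system would use RDF/OWL tooling.
triples = [
    ("te:temp1", "rdf:type", "te:Measurement"),
    ("te:temp1", "te:unit", "degC"),
    ("te:alt1", "rdf:type", "te:Measurement"),  # violates the rule below
]

def verify_units(triples):
    """Verification rule: every Measurement must declare a unit.
    Returns the subjects that violate the rule."""
    measurements = {s for s, p, o in triples
                    if p == "rdf:type" and o == "te:Measurement"}
    with_unit = {s for s, p, o in triples if p == "te:unit"}
    return sorted(measurements - with_unit)

print(verify_units(triples))  # ['te:alt1']
```

In practice such rules would be expressed in a semantic-web language (e.g. SPARQL or OWL constraints) rather than hand-coded, but the check itself reduces to set logic over the metadata graph as above.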
2

Where Next for XidML

Cooke, Alan 10 1900 (has links)
ITC/USA 2011 Conference Proceedings / The Forty-Seventh Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2011 / Bally's Las Vegas, Las Vegas, Nevada / XidML is an open, vendor-neutral, XML-based standard for the FTI community used to capture the metadata associated with flight test instrumentation and data acquisition networks. This paper discusses the difference between metadata and meta-metadata, and how these concepts apply to XidML and an optional companion schema, XdefML. The paper also describes how Settings and their validation may be applied to a Generic package definition. Some examples are given of how the current version of XdefML can be leveraged to auto-generate graphical user interfaces that are both feature-rich and incorporate sophisticated error checking and data validation.
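The Settings-validation idea mentioned above (a definition document constraining values in an instance document) can be sketched with the standard library's XML parser. The element and attribute names below are simplified stand-ins, not the real XidML/XdefML schemas.

```python
import xml.etree.ElementTree as ET

# XdefML-like constraint: the valid range for a sampling-rate setting.
# (Element names are hypothetical simplifications of the real schema.)
xdef = ET.fromstring('<Setting name="SamplingRate" min="1" max="1000"/>')

# XidML-like instance document carrying a concrete, out-of-range value.
xid = ET.fromstring(
    '<Package><Setting name="SamplingRate" value="1500"/></Package>'
)

def validate(xdef, xid):
    """Check every matching Setting in the instance against the
    min/max range declared in the definition; return error strings."""
    lo, hi = float(xdef.get("min")), float(xdef.get("max"))
    errors = []
    for s in xid.iter("Setting"):
        if s.get("name") == xdef.get("name"):
            v = float(s.get("value"))
            if not lo <= v <= hi:
                errors.append(f"{s.get('name')}={v} outside [{lo}, {hi}]")
    return errors

print(validate(xdef, xid))
```

A GUI auto-generated from such definitions, as the paper describes, would run the same range checks as the user types, which is how the meta-metadata layer pays for itself.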
3

Metadata Validation Using a Convolutional Neural Network : Detection and Prediction of Fashion Products

Nilsson Harnert, Henrik January 2019 (has links)
In the e-commerce industry, importing data from third-party clothing brands requires validation of that data. If this validation step is done manually, it is a tedious and time-consuming task. Part of the task can be replaced or assisted by using computer vision to automatically find clothing types, such as T-shirts and pants, within imported images. After a clothing type is detected, it is possible to recommend, with a certain accuracy, the products most likely to correspond to the imported data. This was done alongside a prototype interface that can be used to start training, find clothing types in an image, and mask annotations of products. Annotations are areas describing different clothing types and are used to train an object detector model. A model for finding clothing types is trained with the Mask R-CNN object detector and achieves 0.49 mAP. A detection takes just over one second on an Nvidia GTX 1070 8 GB graphics card. Recommending one or several products based on a detection takes 0.5 seconds, using the k-nearest neighbors algorithm. If prediction is done on products that were used to build the prediction model, almost perfect accuracy is achieved, while images of other products do not achieve nearly as good results.
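The recommendation step described here (k-nearest neighbors over features extracted from a detection) can be sketched in a few lines. The catalog names and feature values below are invented for illustration; the thesis's actual feature space comes from the trained detector.

```python
import math

# Hypothetical catalog of products with precomputed feature vectors.
catalog = {
    "tshirt-red": [0.9, 0.1, 0.2],
    "tshirt-blue": [0.85, 0.15, 0.7],
    "jeans-dark": [0.1, 0.9, 0.4],
}

def knn(query, catalog, k=2):
    """Rank catalog products by Euclidean distance to the query
    feature vector and return the k nearest product names."""
    ranked = sorted(catalog, key=lambda name: math.dist(query, catalog[name]))
    return ranked[:k]

# Feature vector for a new detection (e.g. the detector's embedding).
detection_features = [0.88, 0.12, 0.3]
print(knn(detection_features, catalog))  # ['tshirt-red', 'tshirt-blue']
```

The near-perfect accuracy on training products versus weaker results on unseen ones follows directly from this design: a query that was itself used to populate the catalog sits at distance zero from its own entry.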
4

History and Evolution of Metadata Standards for the FTI Community

Cooke, Alan 10 1900 (has links)
ITC/USA 2014 Conference Proceedings / The Fiftieth Annual International Telemetering Conference and Technical Exhibition / October 20-23, 2014 / Town and Country Resort & Convention Center, San Diego, CA / The paper discusses the history and background of metadata standards for the FTI community over the last 20 years and speculates on how they may develop in the future. It starts by highlighting the deficiencies of proprietary formats and the resulting problems. It then discusses the characteristics and features of specific industry-standard metadata descriptions such as TMATS, iHAL, MDL and XidML, in addition to their levels of maturity. The attributes of a fully mature FTI metadata standard are then discussed. It is suggested that any standard must serve at least two functions, Configuration and Validation, and the paper outlines exactly what each means. Finally, it is argued that there is now a significant level of convergence and consensus in both the scope and application of metadata, and in the associated concept of operations (ConOps). The details of this Concept of Operations are then discussed, along with suggestions as to how it may evolve in the coming years.
5

Beyond Curation: A Validation and Classification Infrastructure for an Educational Content Catalog

Aina, Adeyemi Babatunde 21 January 2025 (has links)
To address the challenge of discovering computer science learning resources, the Smart Learning Content (SLC) catalog is designed to simplify access to the growing body of educational content. As part of the Standards, Protocols, and Learning Infrastructure for Computing Education (SPLICE) research community's efforts, the catalog functions as a centralized platform supporting SPLICE's objectives of improving interoperability, enabling comprehensive data collection, and facilitating data analysis in computer science education. The SLC catalog stands out from previous catalogs through its ontology-based content organization and validation services. Additionally, it serves as a platform where educators can contribute, access, and share a wide range of resources, including slideshows, interactive exercises, programming tasks, and Learning Tools Interoperability (LTI)-integrated content from various learning tools. While the primary goal of the catalog is to disseminate high-quality learning materials, its extensive and varied content requires robust organization and validation mechanisms to ensure educators can efficiently locate and utilize resources. The catalog is designed to further support diverse content types, including both standalone resources and content bundles. For one of our key contributors, OpenDSA, an e-textbook system, we have adopted the latest LTI 1.3 standard. This implementation enables the catalog to disseminate content in both the LTI 1.1 and LTI 1.3 standards, ensuring compatibility. One key improvement in LTI 1.3 is its security features, incorporating robust authentication methods to ensure stronger protection of sensitive student information. This updated standard enables learning tools to meet the evolving demands of digital education, providing educators and learners with more secure, flexible, and effective resources. / Master of Science / Finding quality computer science learning materials online can be difficult.
The Smart Learning Content catalog helps by collecting resources such as slideshows, exercises, and other learning content in a catalog. This makes it easier for instructors and students to find what they need quickly. The catalog is organized by a set of keywords, so instructors can understand how different topics relate to each other. The catalog also verifies that materials meet high-quality standards. Since instructors rely on a variety of online tools, the catalog supports multiple content types and learning technology standards. In addition, the catalog aims to support multiple Learning Tools Interoperability (LTI) standards, including both older and newer versions. For one of our key contributors, OpenDSA, an e-textbook system, we have adopted the latest LTI 1.3 standard. This implementation enables the catalog to disseminate content in both the LTI 1.1 and LTI 1.3 standards, ensuring compatibility. The LTI 1.3 standard enhances security in OpenDSA, protecting student data and maintaining privacy. By aligning with modern standards, the catalog provides a pathway for discovering and sharing computer science learning materials.
