1

UNLEASHING THE POWER OF XML

Corry, Diarmuid
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / Over the last few years XML has been growing in importance as a language for describing the meta-data associated with a complete flight test. Three years ago ACRA CONTROL introduced XidML as an open, published XML standard describing flight test data acquisition from the air to the ground. Recently, XML has been adopted by the TMATS RCC committee and is currently being studied by iNET. While many papers have focused on what XML is and why it is a powerful language, few have related this to practical benefits for the end user. This paper attempts to address that gap. The paper describes simple, cost-effective tools for generating XML through an intuitive GUI, validating XML information against a schema, and transforming XML into useful reports. In particular, a suite of value-added tools for XidML is described.
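The validate-and-transform workflow the abstract describes can be suggested in a few lines. The following Python fragment is a minimal sketch using the lxml library; the file names (xidml.xsd, flight_test_setup.xml, report.xsl) are hypothetical placeholders, and this is not one of the vendor tools the paper presents.

```python
from lxml import etree

# File names are hypothetical placeholders, not actual XidML artifacts.
schema = etree.XMLSchema(etree.parse("xidml.xsd"))
doc = etree.parse("flight_test_setup.xml")

if schema.validate(doc):
    # Transform the validated setup into a human-readable report via XSLT.
    transform = etree.XSLT(etree.parse("report.xsl"))
    with open("report.html", "w") as out:
        out.write(str(transform(doc)))
else:
    # Schema violations are collected in the error log for review.
    for error in schema.error_log:
        print(error.line, error.message)
```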
2

Introduction to XidML 3.0: An Open XML Standard for Flight Test Instrumentation Description

Cooke, Alan, Herbepin, Christian
ITC/USA 2010 Conference Proceedings / The Forty-Sixth Annual International Telemetering Conference and Technical Exhibition / October 25-28, 2010 / Town and Country Resort & Convention Center, San Diego, California / A few years ago XidML was introduced as an open XML standard for capturing the meta-data associated with flight test instrumentation (FTI). This meta-data schema was broken down into elements for Parameter (name, range, units, offset-binary), Instrument (name, serial number, misses-to-loss), Package (bits per word, words per minor-frame, rate), Link (name, type), and so on. XidML remains one of the only published schemas for FTI meta-data, and XidML 3.0 introduces many simplifications along with support for nested tree structures and a single instrument schema that allows anyone to define the validation for instruments from any vendor. This paper introduces the XidML schema and describes the benefits of XidML 3.0 in particular. It begins by giving a brief description of what XidML is, together with its history and motivation. The paper then outlines the main differences between XidML 3.0 and earlier versions, and how the XidML schema has been further refined to meet the challenges faced by the FTI community. As an example of usage, the FTIManager software developed at Eurocopter is briefly presented to illustrate XidML's ability to describe a multi-vendor FTI configuration.
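To make the element breakdown above concrete, here is a small invented fragment parsed with Python's standard library. The element and attribute names paraphrase the abstract's examples (Parameter, Instrument, Package) and are not copied from the actual XidML 3.0 schema.

```python
import xml.etree.ElementTree as ET

# Invented fragment: names follow the abstract's examples, not XidML 3.0 itself.
fragment = """
<FlightTestSetup>
  <Instrument Name="ADC-1" SerialNumber="0042" MissesToLoss="3"/>
  <Parameter Name="EngineTemp1" Units="degC" Format="OffsetBinary">
    <Range Min="-50" Max="150"/>
  </Parameter>
  <Package BitsPerWord="16" WordsPerMinorFrame="64" Rate="64"/>
</FlightTestSetup>
"""

root = ET.fromstring(fragment)
for param in root.iter("Parameter"):
    rng = param.find("Range")
    print(param.get("Name"), param.get("Units"), rng.get("Min"), rng.get("Max"))
```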
3

Integrating Heterogeneous Systems in an FTI Environment

Cooke, Alan
ITC/USA 2008 Conference Proceedings / The Forty-Fourth Annual International Telemetering Conference and Technical Exhibition / October 27-30, 2008 / Town and Country Resort & Convention Center, San Diego, California / Typically, FTI projects utilise acquisition hardware from multiple vendors. There are at least three ways of facilitating their integration. The first option is to implement a series of ad hoc mechanisms customised to the software interfaces provided by each specific FTI vendor. The second option is to define a meta-data format that can be used to define hardware setup and configuration in a common way. The final option is to define a common software architecture that prescribes a set of interfaces and services through which vendor hardware can be configured, and measurement data retrieved. This paper discusses the pros and cons of each approach and outlines the level of difficulty associated with each.
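The third option, a common software architecture, can be suggested with a minimal Python sketch: a prescribed interface behind which each vendor adapter hides its native calls. The interface and method names here are assumptions chosen for illustration, not an API from the paper.

```python
from abc import ABC, abstractmethod

# Sketch of a common architecture: every vendor adapter implements the
# same prescribed interface, so client code never sees native APIs.
class AcquisitionDevice(ABC):
    @abstractmethod
    def configure(self, settings: dict) -> None:
        """Push a setup (channels, rates, formats) to the hardware."""

    @abstractmethod
    def read_measurements(self) -> dict:
        """Return the latest sample for each configured channel."""

class VendorADevice(AcquisitionDevice):
    def configure(self, settings: dict) -> None:
        # A real adapter would translate these settings into vendor A's
        # native configuration calls; here we just keep them.
        self._settings = settings

    def read_measurements(self) -> dict:
        # Placeholder: a real adapter would poll vendor A's driver.
        return {name: 0.0 for name in self._settings.get("channels", [])}

# Client code works against the common interface only.
device: AcquisitionDevice = VendorADevice()
device.configure({"channels": ["EngineTemp1", "OilPressure"]})
print(device.read_measurements())
```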
4

META-DATA VERSIONING

Adamski, Greg
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / Telemetry missions spanning multiple years of tests often require access to archived configuration data for replay and analysis purposes. Versioning needs vary with the scale and complexity of the mission, from simple file-naming conventions to advanced global database versioning. This paper focuses on a flexible approach that allows access to current and past versions of multiple test article configurations. Specifically, it discusses the characteristics of a user-friendly, feature-rich versioning system, analyzes the tradeoffs of various versioning options against the needs of a given mission, and provides a simple framework for users to identify their versioning requirements and implementation.
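At the simple end of the spectrum the abstract mentions, a file-naming convention alone can pin a configuration to a test article and version. The sketch below assumes a directory layout invented for illustration; it is not the framework the paper proposes.

```python
from pathlib import Path

# Assumed layout: archive/<test_article>/config_v001.xml, config_v002.xml, ...
def versioned_path(root: Path, test_article: str, version: int) -> Path:
    return root / test_article / f"config_v{version:03d}.xml"

def latest_version(root: Path, test_article: str) -> int:
    # Missing directories simply yield no matches, hence version 0.
    versions = [int(p.stem.split("_v")[1])
                for p in (root / test_article).glob("config_v*.xml")]
    return max(versions, default=0)

def save_new_version(root: Path, test_article: str, payload: str) -> Path:
    version = latest_version(root, test_article) + 1
    path = versioned_path(root, test_article, version)
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(payload)
    return path

# Replay of an old mission reads a pinned version; new tests append.
print(save_new_version(Path("archive"), "tail_507", "<Setup/>"))
```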
5

Visualization for Verification Driven Learning in Database Studies

Kallem, Aditya 17 December 2010
This thesis aims at developing a data visualization tool to enhance database learning based on the Verification Driven Learning (VDL) model. The goal of the VDL model is to present abstract concepts in the context of real-world systems to students in the early stages of a computer science program. In this project, a personnel/training management system has been turned into a learning platform by adding a number of features for visualization and quizzing. We have implemented various tactics to visualize data manipulation and data retrieval operations in the database, as well as the message contents in data messaging channels. The results of our development have been utilized in eight learning cases illustrating the applications of our visualization tool. Each of these learning cases was made by systematically implanting bugs in a functioning component; the students are assigned to identify the bugs and, at the same time, to learn the structure of the software system.
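A toy example gives a flavor of such a learning case (the thesis's actual system and bugs are not reproduced here): a working query is broken in a controlled way and the student must spot the implanted defect.

```python
import sqlite3

# Toy learning case in the spirit of the thesis; not its actual system.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE training (emp TEXT, course TEXT, passed INTEGER)")
con.executemany("INSERT INTO training VALUES (?, ?, ?)",
                [("ann", "db101", 1), ("bob", "db101", 0), ("ann", "db201", 1)])

# Intended behavior: count the courses each employee has PASSED.
# Implanted bug for the student to find: the filter on passed = 1 is
# missing, so failed attempts are counted too.
rows = con.execute("SELECT emp, COUNT(*) FROM training GROUP BY emp").fetchall()
print(rows)  # bob shows a count of 1 despite passing nothing
```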
6

THE USE OF IEEE P1451.3 SMART SENSORS IN A DATA ACQUISITION NETWORK

Eccles, Lee H.
International Telemetering Conference Proceedings / October 20-23, 2003 / Riviera Hotel and Convention Center, Las Vegas, Nevada / This paper describes the use of an IEEE P1451.3 smart sensor bus as part of a network-centric data acquisition system. IEEE P1451.3 provides for synchronized data acquisition from a number of transducers on a bus. The standard provides for Transducer Electronic Data Sheets (TEDS) that the manufacturer can use to describe the function and capabilities of the sensor module. The standard also provides for TEDS where the user can store information relevant to a particular application. The information in these TEDS can be used to generate much of the information required to process the data during or after a test. The use of this information to configure and operate a network-based data acquisition system is described.
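The role a TEDS plays in downstream processing can be suggested with a simplified stand-in. The fields below are illustrative only; a real IEEE P1451.3 TEDS is a packed binary structure with many more entries.

```python
from dataclasses import dataclass

# Simplified stand-in for the information a TEDS carries; field names
# are assumptions, not the actual IEEE P1451.3 layout.
@dataclass
class TransducerTEDS:
    manufacturer: str
    model: str
    serial_number: str
    measurand: str        # e.g. "temperature"
    units: str            # e.g. "degC"
    min_value: float
    max_value: float
    user_notes: str = ""  # the user-writable TEDS area described above

def engineering_units(teds: TransducerTEDS, raw: int, full_scale: int = 65535) -> float:
    """Scale a raw ADC count into the range the TEDS declares."""
    span = teds.max_value - teds.min_value
    return teds.min_value + span * raw / full_scale

probe = TransducerTEDS("Acme", "T-100", "0042", "temperature", "degC", -50.0, 150.0)
print(engineering_units(probe, 32768))  # roughly mid-range, about 50 degC
```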
7

Tool Support and Data Management for Business Analytics

Azarm, Mana 20 June 2011
The data delivery architectures in most enterprises are complex and under-documented. Conceptual business models and business analytics applications are created to provide a simplified, easy-to-navigate view of enterprise data for analysts. But the construction of such interfaces is tedious and manually intensive, requires specialized technical expertise, and it is especially difficult to map exactly where data came from in the organization. In this paper we investigate how two aspects, lineage and requests for data (i.e. semantics and new reports), can be addressed by tying metadata documentation to a systematic data delivery architecture in order to support business analytics applications. We propose a tool framework that includes a metadata repository for each step in the data delivery architecture, a web-based interface to access and manage that repository, and mapping tools that capture data lineage to support step-by-step automation of data delivery.
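The lineage half of the proposal can be suggested with a minimal sketch: each derived field records its upstream fields, so a report column can be traced back through every step of the delivery architecture. The field names and flat-dictionary representation are assumptions for illustration, not the paper's repository design.

```python
# Each derived field maps to the upstream fields it was computed from.
# Names are invented for illustration.
lineage = {
    "report.revenue_by_region": ["mart.fact_sales.amount", "mart.dim_region.name"],
    "mart.fact_sales.amount": ["staging.orders.line_total"],
    "staging.orders.line_total": ["source.erp.order_lines.qty",
                                  "source.erp.order_lines.unit_price"],
}

def trace(field: str, depth: int = 0) -> None:
    """Walk the lineage graph from a report field back to source systems."""
    print("  " * depth + field)
    for parent in lineage.get(field, []):
        trace(parent, depth + 1)

trace("report.revenue_by_region")
```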
8

Experimental Frame Structuring For Automated Model Construction: Application to Simulated Weather Generation

Cheon, Saehoon January 2007
The source system is the real or virtual environment that we are interested in modeling. It is viewed as a source of observable data, in the form of time-indexed trajectories of variables. The data gathered from observing or experimenting with a system is called the system behavior database. The time-indexed trajectories of variables provide an important clue for composing the DEVS (discrete event specification) model. Once an event set is derived from the time-indexed trajectories of variables, the DEVS model formalism can be extracted from that event set. The process must not be simple model generation but a meaningful model structuring of a request. The source data and queries designed with the SES (System Entity Structure) are converted to XML meta-data by an XML conversion process. The SES serves as a compact representation for organizing all possible hierarchical compositions of a system, so it plays an important role in designing the structural representation of the query and of the source data to be saved. As a real-data application, model structuring with the US Climate Normals is introduced. Moreover, complex systems can be developed at different levels of resolution. When the large volume of source data in the US Climate Normals is used for the DEVS model, model complexity is unavoidable. This issue is dealt with by creating an equivalent lumped model based on the concept of morphism. Two methods of defining the resolution level are discussed: fixed and dynamic. Aggregation is also discussed as one approach to model abstraction. Finally, this paper introduces the process of integrating the DEVSML (DEVS Modeling Language) engine with the DEVS model creation engine for the Web Service Oriented Architecture.
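The first step the abstract describes, deriving an event set from a time-indexed trajectory, can be suggested with a minimal sketch (an illustration of the idea, not the paper's actual construction).

```python
# Turn a time-indexed trajectory of a variable into an event set: the
# timestamps at which the value changes, the raw material for a DEVS model.
def event_set(trajectory):
    """trajectory: list of (time, value) samples, time-ordered."""
    events, last = [], object()  # sentinel so the first sample is an event
    for t, v in trajectory:
        if v != last:
            events.append((t, v))
            last = v
    return events

# A toy daily-weather trajectory in the spirit of the US Climate Normals.
samples = [(0, "dry"), (1, "dry"), (2, "rain"), (3, "rain"), (4, "dry")]
print(event_set(samples))  # [(0, 'dry'), (2, 'rain'), (4, 'dry')]
```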
