About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Study on Telemetry Data Authentication Protocol in Arms Control Verification

Qiang, Huang, Fan, Yang 10 1900 (has links)
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Arms control verification takes place between countries, so a variety of telemetry data must be transmitted remotely over public channels, where it can easily be tampered with. To secure the authenticity and integrity of this data, the paper establishes a Multi-layer Data Authentication Protocol (MDAP) whose key cryptographic technologies are digital signatures and authentication. Overall evaluations of MDAP are also presented, and MDAP is proved secure.
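The core pattern the abstract describes — authenticating each telemetry frame before it crosses a public channel, then verifying it on receipt — can be sketched as follows. The paper's MDAP uses digital signatures; this minimal illustration substitutes a standard-library HMAC for self-containment, and all names and frame contents are hypothetical, not taken from the protocol itself.

```python
import hashlib
import hmac

def tag_frame(key: bytes, frame: bytes) -> bytes:
    """Produce an authentication tag for one telemetry frame."""
    return hmac.new(key, frame, hashlib.sha256).digest()

def verify_frame(key: bytes, frame: bytes, tag: bytes) -> bool:
    """Accept only frames whose tag matches; a tampered frame fails."""
    return hmac.compare_digest(tag_frame(key, frame), tag)

key = b"shared-verification-key"          # hypothetical pre-shared key
frame = b"telemetry:temp=21.4,press=101"  # hypothetical frame payload
tag = tag_frame(key, frame)

assert verify_frame(key, frame, tag)            # authentic frame accepted
assert not verify_frame(key, frame + b"!", tag)  # tampering detected
```

In a signature-based scheme like MDAP, the sender's tag would instead be produced with a private key and checked against the matching public key, so receivers cannot forge tags — but the accept/reject flow is the same.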

DATA REDUCTION AND PROCESSING SYSTEM FOR FLIGHT TEST OF NEXT GENERATION BOEING AIRPLANES

Cardinal, Robert W. 10 1900 (has links)
International Telemetering Conference Proceedings / October 25-28, 1993 / Riviera Hotel and Convention Center, Las Vegas, Nevada / This paper describes the recently developed Loral Instrumentation ground-based equipment used to select and process post-flight test data from the Boeing 777 airplane as it is played back from a digital tape recorder (e.g., the Ampex DCRSi II) at very high speeds. Gigabytes (GB) of data, stored on recorder cassettes in the Boeing 777 during flight testing, are played back on the ground at a 15-30 MB/sec rate into ten multiplexed Loral Instrumentation System 500 Model 550s for high-speed decoding, processing, time correlation, and subsequent storage or distribution. The ten Loral 550s are multiplexed for independent data-path processing from ten separate tape sources simultaneously. This system features a parallel multiplexed configuration that allows Boeing to perform critical 777 flight-test processing at unprecedented speeds. Boeing calls this system the Parallel Multiplexed Processing Data (PMPD) System. The key advantage of the ground station's design is that Boeing engineers can add their own application-specific control and setup software. The Loral 550 VMEbus allows Boeing to add VME modules when needed, ensuring system growth through the addition of other LI-developed products, Boeing-developed products, or purchased VME modules. With hundreds of third-party VME modules available, system expansion is essentially unlimited. The final system has the capability to input data at 15 MB/sec per source. The present aggregate throughput capability of all ten 24-bit Decoders is 150 MB/sec from ten separate tape sources. A 24-bit Decoder was designed to support the 30 MB/sec DCRSi III so that the system can eventually support a total aggregate throughput of 300 MB/sec. Clearly, such high-speed data selection, rejection, and processing will significantly accelerate flight certification and production testing of today's state-of-the-art aircraft.
This system was supplied with low-level software interfaces so that the customer could develop their own application-specific code and displays. The Loral 550 lends itself to this kind of application thanks to its VME chassis, VxWorks operating system, and modular software.
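The aggregate-throughput figures quoted in the abstract follow directly from the per-source rates; as a quick sanity check:

```python
# Back-of-envelope check of the PMPD throughput figures quoted above.
sources = 10              # ten multiplexed Loral 550s / tape sources
dcrsi_ii_rate = 15        # MB/sec per source (DCRSi II playback)
dcrsi_iii_rate = 30       # MB/sec per source (DCRSi III playback)

present_aggregate = sources * dcrsi_ii_rate
future_aggregate = sources * dcrsi_iii_rate

assert present_aggregate == 150  # MB/sec, as stated for the ten decoders
assert future_aggregate == 300   # MB/sec, the eventual system target
```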

APPLICATION OF DATA COMPRESSION TO FRAME AND PACKET TELEMETRY

Horan, Stephen, Horan, Sheila B. 10 1900 (has links)
International Telemetering Conference Proceedings / October 20-23, 2003 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Reducing the volume of transmitted signals is of paramount concern to many in the telemetry and wireless industry. One available technique is compressing the data before transmission. With telemetry-type data, many approaches can be used to achieve compression. Data compression of the Advanced Range Telemetry (ARTM) PCM data sets in the frame and packet modes, and for the entire data file, will be considered and compared. The technique of differencing will also be applied to the data files by subtracting the previous major frame and then applying compression techniques. It will be demonstrated that telemetry compression is a viable option for reducing the amount of data to be transmitted, and hence the bandwidth. However, this compression produces variable-length data segments, with implications for real-time data synchronization.
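The differencing step described above — subtract the previous major frame, then compress the residual — can be sketched in a few lines. This is an illustrative reconstruction, not the ARTM tooling: the frame contents are made up, and `zlib` stands in for whichever compressor the study used. Because consecutive telemetry frames are usually similar, the differenced frames are mostly zeros and compress well, but the outputs are variable-length, exactly the synchronization issue the abstract flags.

```python
import zlib

def difference(frame: bytes, prev: bytes) -> bytes:
    """Byte-wise difference against the previous major frame (mod 256)."""
    return bytes((a - b) % 256 for a, b in zip(frame, prev))

def compress_stream(frames: list[bytes]) -> list[bytes]:
    """Difference each frame with its predecessor, then compress."""
    prev = bytes(len(frames[0]))  # all-zero reference for the first frame
    out = []
    for frame in frames:
        out.append(zlib.compress(difference(frame, prev)))
        prev = frame
    return out

# Three hypothetical 64-byte major frames that differ only slightly.
frames = [bytes([10] * 64), bytes([10] * 63 + [11]), bytes([10] * 64)]
compressed = compress_stream(frames)
sizes = [len(c) for c in compressed]  # note: variable-length segments
```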

NATO ADVANCED DATA STORAGE STANDARD STANAG 4575

Feuer, Gary 10 1900 (has links)
International Telemetering Conference Proceedings / October 23-26, 2000 / Town & Country Hotel and Conference Center, San Diego, California / NATO NAFAG Air Group IV (AG IV) established the NATO Advanced Data Storage Technical Support Team (NADS TST) to investigate the technology and to develop an interface Standardization Agreement (STANAG) for recording, storage, and exchange of imagery data. Government agencies and industry involved in these technologies are participating in this effort.

A new approach to systems integration in the mechatronic engineering design process of manufacturing systems

Proesser, Malte January 2014 (has links)
Creating flexible and automated production facilities is a complex process that requires close cooperation across all mechatronics disciplines, and the software tools being utilised have to work together as closely as their users do. Some of these tools are well integrated, but others can hardly exchange any data. This research aims to integrate the software systems applied by the mechatronic engineering disciplines in order to enable an enhanced design process characterised by a more parallel and iterative workflow. This thesis approaches systems integration from a data modelling point of view because it regards information transfer between heterogeneous data models as a key element of systems integration. A new approach has been developed, called the middle-in data modelling strategy because it combines the currently applied top-down and bottom-up approaches: data is separated into core design data, which is modelled top-down, and detailed design data modules, which are modelled bottom-up. The effectiveness of the integration approach has been demonstrated in a case study of the mechatronic engineering design process for body-shop production lines in the automotive industry. However, the application of the middle-in data modelling strategy is not limited to this use case: it can be used to enhance a variety of systems integration tasks. The middle-in data modelling strategy is tested and evaluated against present top-down and bottom-up data modelling strategies on the basis of three test cases. These test cases simulated how the systems integration solutions based on the different data modelling strategies react to disturbances in the data exchange process such as would likely occur during industrial engineering design work.
The result is that the top-down data modelling strategy is best in maintaining data integrity and consistency while the bottom-up strategy is most flexibly adaptable to further developments of systems integration solutions. The middle-in strategy combines the advantages of top-down and bottom-up approaches while their weaknesses and disadvantages are kept at a minimum. Hence, it enables the maintenance of data modelling consistency while being responsive to multidisciplinary requirements and adaptive during its step-by-step introduction into an industrial engineering process.
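The middle-in split — a small, top-down core model that every discipline shares, plus bottom-up detail modules each tool attaches without touching the core — can be illustrated with a minimal sketch. All class names, fields, and values here are hypothetical, invented to show the shape of the idea rather than the thesis's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class CoreComponent:
    """Top-down core design data: a stable identity shared by all disciplines."""
    component_id: str
    name: str
    # Bottom-up detail modules, keyed by discipline; the core never
    # prescribes their contents, so tools can evolve them independently.
    details: dict = field(default_factory=dict)

    def attach(self, discipline: str, module: dict) -> None:
        """A discipline-specific tool adds or replaces its own detail module."""
        self.details[discipline] = module

# Each tool enriches the shared core with its own view of the component.
robot = CoreComponent("ST010-R1", "welding robot")
robot.attach("electrical", {"bus": "PROFINET", "io_points": 48})
robot.attach("mechanical", {"payload_kg": 210})
```

The design choice this reflects is the one evaluated in the thesis's test cases: the core keeps cross-discipline data consistent, while the loosely coupled detail modules leave each discipline free to change its data without breaking the others.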

CHART RECORDERS EVOLVE INTO DATA MANAGEMENT SYSTEMS

Smith, Grant M. 11 1900 (has links)
International Telemetering Conference Proceedings / November 04-07, 1991 / Riviera Hotel and Convention Center, Las Vegas, Nevada / A synopsis of the very recent evolution of telemetry chart recorders from “closed” chart-paper output devices to powerful “open” Data Management Systems. A Data Management System (DMS) is defined as one that incorporates a video screen for waveform preview and monitoring, direct connection of hard or optical disks via SCSI for real-time data archiving, and DR11 digital interfacing. The DMS concept of providing real-time waveform monitoring independent of hard-copy recording is discussed, as are the capabilities of the hard-copy recorder. The realities of budget shortfalls make wholesale system upgrades that eliminate DACs entirely difficult at best. These concerns, and a potential remedy (a DMS that accepts any mix of analog and digital waveforms), are reviewed. Objectives: how DMSs can be integrated with existing telemetry systems, encompass the functionality of conventional recorders, and add new capabilities, with an emphasis on how data can be digitally pre-formatted in real time, simplifying or even eliminating post-mission reduction and analysis; and a demonstration of how a video display allows real-time trace viewing, a major weakness of conventional thermal-array recorders.

UTTR BEST TELEMETRY SOURCE SELECTOR

Rigley, Kenneth H., Wheelwright, David H., Fowers, Brandt H. 10 1900 (has links)
International Telemetering Conference Proceedings / October 23-26, 2000 / Town & Country Hotel and Conference Center, San Diego, California / The UTTR (Utah Test & Training Range) offers the largest over land test and training airspace in the continental United States. It provides excellent telemetry data processing capability through a number of TM (telemetry) sites. Selecting the best source of telemetry data for optimum coverage from these many sites can be very involved and challenging for ground station personnel. Computer-based best source selection automates this process, thereby increasing accuracy and efficiency. This paper discusses the capabilities of the BTSS (Best Telemetry Source Selector), its background, design and development, applications, and future at the UTTR.
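The selection step a system like the BTSS automates can be sketched simply: for each frame interval, pick the site whose stream currently reports the best quality. The actual BTSS design is not detailed in the abstract, so the quality metric, site names, and data below are hypothetical; real selectors typically weigh sync-lock status and bit-error measurements.

```python
def best_source(site_frames: dict[str, tuple[float, bytes]]) -> tuple[str, bytes]:
    """Pick the telemetry site with the highest quality metric.

    site_frames maps a site name to (quality, frame_bytes); higher
    quality wins. Returns the chosen site and its frame.
    """
    site, (_, frame) = max(site_frames.items(), key=lambda kv: kv[1][0])
    return site, frame

# One frame interval's candidates from three hypothetical TM sites.
candidates = {
    "TM-north": (0.91, b"frame-A"),
    "TM-south": (0.97, b"frame-B"),
    "TM-west":  (0.42, b"frame-C"),
}
site, frame = best_source(candidates)  # TM-south's frame is selected
```

Running this per frame interval yields a single merged stream built from whichever site has the cleanest reception at each moment, which is the "optimum coverage" the abstract describes.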

INCREASING DATA DENSITY ON 1/2 INCH CASSETTE TAPE

Buschbach, Charles W. 11 1900 (has links)
International Telemetering Conference Proceedings / October 30-November 02, 1995 / Riviera Hotel, Las Vegas, Nevada / Half-inch Super VHS (S-VHS) tape is an exceptional medium for recording telemetry instrumentation data. Thanks to high-volume mass production for the commercial and industrial video markets, high-quality S-VHS media is available at low cost. Advances in head technology as well as data-path electronics contribute to increased data densities recorded on this medium. Present system capabilities of 50,000 bits per inch will soon be surpassed by newer-technology systems that will record at linear densities of up to 100,000 bits per inch.

Emerging Role of Social Media in Data Sharing and Management

Ram, Sudha 23 October 2012 (has links)
This presentation was given at the 2012 Open Access Week program, “The Open Data Revolution: Challenges and Innovations” on October 23, 2012. Open Access Week is a world-wide event where academic institutions explore Open Access – the ideal of free, full-text, immediate, online access to peer-reviewed scholarship and research results so new ideas and information can be obtained rapidly and freely by everyone. Many funding agencies, such as the National Science Foundation, and journal publishers, such as Nature, require researchers to share data produced during the course of their research. When researchers share their data, other researchers can reuse it to answer new questions, opening up new interpretations and discoveries. Sharing data may also lead to sharing research processes, workflows and tools and may make research articles and papers more useful and citable by others.

A More Open Future for the Past

Kansa, Eric 25 October 2011 (has links)
This presentation was given at the 2011 Open Access Week program, “The Future of Data: Open Access and Reproducibility” on October 25, 2011. Open Access Week is a world-wide event where academic institutions explore Open Access – the ideal of free, full-text, immediate, online access to peer-reviewed scholarship and research results so new ideas and information can be obtained rapidly and freely by everyone. Open Data is the idea that data should be freely available to anyone to use and reuse without access restrictions, licenses, copyright, patents and charges for use.
