171

DATA VALIDATION: A PREREQUISITE TO PERFORMING DATA UNCERTAINTY ANALYSIS

Walter, Patrick L. October 2005
ITC/USA 2005 Conference Proceedings / The Forty-First Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2005 / Riviera Hotel & Convention Center, Las Vegas, Nevada / There are increasing demands, particularly from government agencies, to perform uncertainty analysis in order to assign accuracy bounds to telemetered data from environmental measuring transducers (pressure, acceleration, force, strain, temperature, etc.). Several requirements must be fulfilled before measurement uncertainty analysis is justified. These include good measurement system design practices such as adequate low- and high-frequency response and data-sampling rates, appropriate anti-aliasing filter selection, proper grounding and shielding, and more. In addition, there are applications (e.g., flight test) in which the environment of the transducer varies with time and/or location. In these applications, data validation must be performed to establish that an individual transducer responds only to the environmental stimulus it is intended to measure. Without this validation component designed into the telemetry system, assigned accuracy bounds can be totally meaningless. This paper presents examples and describes techniques for data validation of signals from environmental measuring transducers.
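To make the idea concrete, the sketch below (Python with NumPy; the function name, band edge, and 10% threshold are illustrative assumptions, not taken from the paper) flags a channel whose spectral energy falls substantially outside its intended measurement band, one simple form of the validation the abstract describes:

```python
import numpy as np

def out_of_band_fraction(samples, fs, passband_hz):
    """Return the fraction of signal power above the intended band.

    A large value suggests the channel is responding to something other
    than the stimulus it was designed to measure, or is aliasing.
    """
    power = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    total = power.sum()
    return power[freqs > passband_hz].sum() / total if total > 0 else 0.0

# Hypothetical accelerometer channel sampled at 10 kHz with a 2 kHz
# measurement band: a 500 Hz tone plus broadband noise.
rng = np.random.default_rng(0)
t = np.arange(0.0, 1.0, 1e-4)
signal = np.sin(2 * np.pi * 500 * t) + 0.05 * rng.standard_normal(t.size)
fraction = out_of_band_fraction(signal, fs=10_000, passband_hz=2_000)
print(f"out-of-band power fraction: {fraction:.3f}")
if fraction > 0.1:  # threshold is an illustrative assumption
    print("channel suspect: responds outside its measurement band")
```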
172

CURRENT STATUS OF DATA COMPRESSION IN TELEMETRY

Horan, Sheila B. October 2004
International Telemetering Conference Proceedings / October 18-21, 2004 / Town & Country Resort, San Diego, California / Reduction of bandwidth for signal transmission is of paramount concern to many in the telemetry and wireless industry. One way to reduce bandwidth is to reduce the amount of data being sent, and several techniques are available to do this. This paper reviews the various types of data compression currently in use for telemetry data and how much compression each achieves.
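As a rough illustration of measuring achieved compression on a telemetry channel, here is a minimal Python sketch; the delta-plus-DEFLATE pipeline and the sample data are illustrative assumptions, not techniques surveyed in the paper:

```python
import struct
import zlib

def compression_ratio(samples):
    """Compare raw vs. delta-encoded + DEFLATE size for 16-bit words.

    Slowly varying telemetry channels usually have small sample-to-sample
    deltas, which generic lossless coders exploit well.
    """
    raw = struct.pack(f"<{len(samples)}h", *samples)
    deltas = [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]
    delta_bytes = struct.pack(f"<{len(deltas)}h", *deltas)
    compressed = zlib.compress(delta_bytes, 9)
    return len(raw) / len(compressed)

# Hypothetical slowly varying channel (e.g., temperature counts).
channel = [1000 + i // 50 for i in range(4096)]
print(f"achieved ratio: {compression_ratio(channel):.1f}:1")
```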
173

FROM THE STRIP-CHART RECORDER TO THE TELEMETRY RECORDER-WORKSTATION

Kortick, David N. October 2000
International Telemetering Conference Proceedings / October 23-26, 2000 / Town & Country Hotel and Conference Center, San Diego, California / The strip-chart recorder has evolved from a simple pen-writing instrument into an instrumentation platform. Today's Telemetry Recording Workstations not only provide a permanent hard copy of telemetry data but also offer high-resolution videographic displays with real-time point-of-writing representation, efficient digital data storage, and customizable user interfaces. Host control and digital data transfer can be achieved over Ethernet networks, making the instrument an integral part of a telemetry system. The Telemetry Recording Workstation is a fundamental instrument for any telemetry application or installation because of its ability to display, print, and store real-time data.
174

Challenges and Solutions for Complex Gigabit FTI Networks

Cranley, Nikki. October 2011
ITC/USA 2011 Conference Proceedings / The Forty-Seventh Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2011 / Bally's Las Vegas, Las Vegas, Nevada / This paper presents a case study of an FTI system with complex requirements for data acquisition, recording, and post-analysis. Gigabit Ethernet was the technology of choice to facilitate such a system. Recording in a Gigabit Ethernet environment poses a fresh challenge: performing fast data reduction and data mining for post-flight analysis. This paper describes the Quick Access Recorder used in this system and how it addresses this challenge.
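One common form of post-flight data reduction is extracting a single parameter stream from a bulk network recording. The sketch below uses only the Python standard library; the classic little-endian pcap layout and plain Ethernet/IPv4/UDP framing are assumptions for illustration, and real FTI recorder formats differ:

```python
import struct

def extract_stream(pcap_path, out_path, udp_port):
    """Copy only packets addressed to one UDP port into a smaller pcap."""
    with open(pcap_path, "rb") as src, open(out_path, "wb") as dst:
        dst.write(src.read(24))        # pcap global header, copied as-is
        while True:
            rec_hdr = src.read(16)     # ts_sec, ts_usec, incl_len, orig_len
            if len(rec_hdr) < 16:
                break
            _, _, incl_len, _ = struct.unpack("<IIII", rec_hdr)
            pkt = src.read(incl_len)
            # Ethernet (14 bytes) + IPv4 without options (20 bytes) puts
            # the UDP destination port at bytes 36..38 of the packet.
            if len(pkt) >= 38 and struct.unpack("!H", pkt[36:38])[0] == udp_port:
                dst.write(rec_hdr + pkt)
```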
175

Computerisation in Hong Kong professional engineering firms

Chee, Sung-thong, Andrew (徐順通). January 1985
Published or final version / Management Studies / Master of Business Administration
176

Data mining algorithms for genomic analysis

Ao, Sio-iong (區小勇). January 2007
Published or final version, with abstract / Mathematics / Doctor of Philosophy
177

A DATA ACQUISITION SYSTEM FOR CT SCANNING DURING RADIOTHERAPY.

Strinka, Stephen Andrew. January 1984
No description available.
178

PROCESSING AND INTERPRETATION OF AEROMAGNETIC DIGITAL IMAGES.

Coulter, David William. January 1984
No description available.
179

A NEW COMBINED ROUTING AND FLOW CONTROL ALGORITHM FOR DATA COMMUNICATION NETWORKS

Yamout, Salam, 1963-. January 1987
This research evaluates the combined routing and flow control scheme proposed by Muralidhar and Sundareshan. A slightly modified design, drawing on results from positive system theory, is proposed in an attempt to improve performance. A detailed performance evaluation of Muralidhar and Sundareshan's scheme and of the modified version is then conducted using discrete-event simulation. Applying positive system theory to the combined routing and flow control algorithm proved effective in improving overall performance, but only during the initial transient period. That the improvement is restricted to this period appears attributable to poor buffer management and certain other fundamental problems with Muralidhar and Sundareshan's scheme; these problems are identified and described in detail.
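To illustrate the kind of discrete-event simulation used for such evaluations, here is a minimal single-node sketch in Python; the finite-buffer FIFO model and all parameters are illustrative assumptions, not Muralidhar and Sundareshan's network model:

```python
import heapq
import random

def simulate_queue(arrival_rate, service_rate, buffer_size, horizon=10_000.0):
    """Event-driven simulation of one node with a finite buffer.

    Packets arriving to a full buffer are dropped, which is one way
    poor buffer management shows up in a flow-control evaluation.
    """
    random.seed(1)
    events = [(random.expovariate(arrival_rate), "arrival")]
    in_system, busy_until, served, dropped = 0, 0.0, 0, 0
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            # Schedule the next arrival (Poisson process).
            heapq.heappush(events, (t + random.expovariate(arrival_rate), "arrival"))
            if in_system >= buffer_size:
                dropped += 1
            else:
                in_system += 1
                start = max(t, busy_until)          # FIFO service start
                busy_until = start + random.expovariate(service_rate)
                heapq.heappush(events, (busy_until, "departure"))
        else:
            in_system -= 1
            served += 1
    return served, dropped

print(simulate_queue(arrival_rate=0.9, service_rate=1.0, buffer_size=8))
```

Sweeping buffer_size and arrival_rate in such a model is one simple way to separate transient from steady-state behaviour of a routing and flow control policy.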
180

LHCb data management on the computing grid

Smith, Andrew Cameron. January 2009
The LHCb detector is one of the four experiments being built to harness the proton-proton collisions provided by the Large Hadron Collider (LHC) at the European Organisation for Nuclear Research (CERN). The data rate expected when the LHC experiments are fully operational eclipses that of any previous scientific experiment and has motivated the adoption of a grid computing paradigm to store and process the data. Managing petabytes of data in a distributed environment presents a rich set of challenges related to scalability, reliability, and performance. This thesis presents the data management requirements for executing the workload of the LHCb collaboration. We present the systems designed to support all aspects of grid data management for LHCb, from data transfer to data integrity and efficient data access. The distributed computing environment is inherently unstable, so much focus has been placed on providing systems that are robust and resilient to observed failures.
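As a small illustration of one data-integrity building block in such a system, the Python sketch below verifies a freshly transferred replica against a catalogued checksum. The choice of Adler-32 and the function names are assumptions for illustration, not the thesis's actual implementation:

```python
import zlib

def adler32_of(path, chunk_size=1 << 20):
    """Streaming Adler-32 of a file, read in 1 MiB chunks."""
    value = 1  # Adler-32 initial value
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            value = zlib.adler32(chunk, value)
    return value & 0xFFFFFFFF

def verify_replica(local_path, catalogued_checksum):
    """Compare a transferred replica against the catalogued checksum.

    In a real grid system the reference checksum would come from a
    file catalogue; here it is simply a parameter.
    """
    return adler32_of(local_path) == catalogued_checksum
```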
