1071 |
AN IP-BASED RECORDING SYSTEM / Roach, John; Hildin, John / October 2006
ITC/USA 2006 Conference Proceedings / The Forty-Second Annual International Telemetering Conference and Technical Exhibition / October 23-26, 2006 / Town and Country Resort & Convention Center, San Diego, California / Traditionally, acquired instrumentation data on a non-destructive test article is recorded to a nonvolatile memory recorder. The data acquisition system usually samples and formats its inputs before transmitting the data to the recorder (also known in this paper as a data sink) via a PCM serial data stream (i.e., clock and data). In a network-based data acquisition architecture, the inclusion of an IP-based recorder adds a new dimension to the data acquisition process. Any IP network inherently allows for the bi-directional exchange of data. In this environment, the IP-based recorder can be treated as both a data sink for parameter recording and a data source for parameter extraction, data rate statistics, and recorder status reporting. The network model recasts the data recorder’s function as a file server to which multiple clients can simultaneously request services. Clients that represent the data acquisition nodes request storage of their acquired parameters. Clients such as transmitters or test engineers request access to archived data or status information for further processing. This paper presents the advantages of using an IP-based recorder in a network-based data acquisition system. The availability of an IP interface, along with the intelligence built into the recorder, expands its capabilities beyond those of a conventional PCM recorder. These capabilities include real-time health monitoring, support for the Simple Network Management Protocol (SNMP), data mining, and reporting of real-time performance and network statistics.
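A minimal sketch of the SNMP-based health-monitoring idea described above, assuming the pysnmp library (4.x hlapi) and the generic sysUpTime OID; the recorder address and community string are placeholders, and a real IP-based recorder would expose vendor-specific status OIDs not given in this abstract.

```python
# Minimal sketch: poll an SNMP-capable recorder for a status variable.
# Assumptions: pysnmp 4.x hlapi is installed; 192.0.2.10 is a placeholder
# recorder address; sysUpTime.0 stands in for vendor-specific status OIDs.
from pysnmp.hlapi import (
    SnmpEngine, CommunityData, UdpTransportTarget, ContextData,
    ObjectType, ObjectIdentity, getCmd,
)

def query_recorder_uptime(host="192.0.2.10", community="public"):
    error_indication, error_status, _, var_binds = next(
        getCmd(
            SnmpEngine(),
            CommunityData(community, mpModel=1),   # SNMPv2c
            UdpTransportTarget((host, 161)),
            ContextData(),
            ObjectType(ObjectIdentity("1.3.6.1.2.1.1.3.0")),  # sysUpTime.0
        )
    )
    if error_indication or error_status:
        raise RuntimeError(f"SNMP query failed: {error_indication or error_status}")
    return var_binds[0][1]  # timeticks since the agent restarted

if __name__ == "__main__":
    print("Recorder uptime (timeticks):", query_recorder_uptime())
```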
|
1072 |
REAL TIME DATA WAREHOUSING AND ON LINE ANALYTICAL PROCESSING AT ABERDEEN TEST CENTER / Reil, Michael J.; Bartlett, T. George; Henry, Kevin / October 2006
ITC/USA 2006 Conference Proceedings / The Forty-Second Annual International Telemetering Conference and Technical Exhibition / October 23-26, 2006 / Town and Country Resort & Convention Center, San Diego, California / This paper is a follow-on to a paper presented at the 2005 International Telemetering Conference by Dr. Samuel Harley et al., titled Data, Information, and Knowledge Management. It describes new techniques and provides further detail into the inner workings of the VISION (Versatile Information System – Integrated, Online) Engineering Performance Data Mart.
|
1073 |
MINING IRIG-106 CHAPTER 10 AND HDF-5 DATA / Lockard, Michael T.; Rajagopalan, R.; Garling, James A. / October 2006
ITC/USA 2006 Conference Proceedings / The Forty-Second Annual International Telemetering Conference and Technical Exhibition / October 23-26, 2006 / Town and Country Resort & Convention Center, San Diego, California / Rapid access to ever-increasing amounts of test data is becoming a problem. The authors have developed a data-mining methodology to catalog test files, search metadata attributes to identify test data files of interest, and query test data measurements using a web-based engine, producing results in seconds. Generated graphs give the user an overview of the entire test for a selected set of measurements, with areas highlighted where the query conditions were satisfied. The user can then zoom into areas of interest and export selected information.
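The paper's catalog and query engine are not specified here; the following is a rough stand-in that keeps per-file metadata in SQLite, selects files of interest by attribute, and flags the sample regions of one measurement where a query condition holds (the "highlighted areas"). All table, attribute, and measurement names are hypothetical.

```python
# Minimal sketch of a test-data catalog: metadata search plus condition flagging.
# All names (tables, attributes, thresholds) are illustrative, not from the paper.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE test_files (path TEXT, aircraft TEXT, test_date TEXT)")
con.executemany(
    "INSERT INTO test_files VALUES (?, ?, ?)",
    [("flt001.ch10", "T-1", "2006-05-02"), ("flt002.ch10", "T-2", "2006-06-11")],
)

# Step 1: metadata query to derive the test data files of interest.
files = [row[0] for row in con.execute(
    "SELECT path FROM test_files WHERE aircraft = ? AND test_date >= ?",
    ("T-1", "2006-01-01"),
)]

# Step 2: within a selected measurement, highlight samples that satisfy the query.
altitude = [1000, 1200, 1450, 1600, 1550, 1300]   # placeholder measurement samples
highlighted = [i for i, v in enumerate(altitude) if v > 1400]

print("Files of interest:", files)
print("Sample indices where condition holds:", highlighted)
```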
|
1074 |
A LOOK AT CELLULAR PACKET DATA PERFORMANCE FOR APPLICATION IN iNET / Webley, Kayonne / October 2005
ITC/USA 2005 Conference Proceedings / The Forty-First Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2005 / Riviera Hotel & Convention Center, Las Vegas, Nevada / The integrated Network Enhanced Telemetry (iNET) environment is meant to build and improve on the existing unidirectional legacy telemetry links. The optimized network would have to be capable of providing bi-directional, spectrally efficient, reliable, dynamically allocated, real-time or near-real-time access to video and other types of test data over a shared-bandwidth, high-capacity network. Developed specifically to provide a reliable means of communication for large numbers of users, cellular technology seems particularly suited to addressing iNET’s needs. This paper investigates the creation of a cellular model for enhanced throughput for data users, wherein a user would be dynamically allocated high data rates dependent on parameters such as the received signal-to-noise ratio (SNR). Our future work will develop the average data performance, comparing both the time division multiple access (TDMA) and code division multiple access (CDMA) environments for potential application in iNET.
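As a rough numerical illustration of SNR-dependent rate allocation (not the cellular model developed in the paper), the sketch below converts a received SNR into a Shannon-capacity bound for an assumed channel bandwidth and then selects the highest supported rate tier; the bandwidth and rate tiers are invented placeholders.

```python
# Illustrative only: map received SNR to an allocated data rate.
# Assumptions: 1.25 MHz channel and invented discrete rate tiers; the actual
# iNET/cellular parameters are not given in the abstract.
import math

BANDWIDTH_HZ = 1.25e6
RATE_TIERS_BPS = [64e3, 256e3, 1e6, 2e6, 4e6]   # hypothetical supported rates

def shannon_capacity_bps(snr_db: float, bandwidth_hz: float = BANDWIDTH_HZ) -> float:
    snr_linear = 10 ** (snr_db / 10.0)
    return bandwidth_hz * math.log2(1.0 + snr_linear)

def allocate_rate(snr_db: float) -> float:
    cap = shannon_capacity_bps(snr_db)
    feasible = [r for r in RATE_TIERS_BPS if r <= cap]
    return max(feasible) if feasible else 0.0

for snr in (0, 5, 10, 20):
    print(f"SNR {snr:2d} dB -> capacity {shannon_capacity_bps(snr)/1e6:5.2f} Mb/s,"
          f" allocated {allocate_rate(snr)/1e6:4.2f} Mb/s")
```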
|
1075 |
MERGING TELEMETRY DATA FROM MULTIPLE RECEIVERS / Wilson, Michael J. / October 2005
ITC/USA 2005 Conference Proceedings / The Forty-First Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2005 / Riviera Hotel & Convention Center, Las Vegas, Nevada / Multiple receiver telemetry systems are common in the aeroballistics test and evaluation community. These systems typically record telemetry data independently, requiring post-flight data processing to produce the most accurate combination of the available data. This paper addresses the issues of time synchronization between multiple data sources and determination of the best choice for each data word. Additional filtering is also developed for the case when all available data are corrupted. The performance of the proposed algorithms is presented.
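The selection and filtering algorithms themselves are not given in this abstract; the sketch below only illustrates the general best-source idea under assumed inputs: time-aligned word streams with per-word quality flags from each receiver, a pick of any validated word, and a hold-last-good fallback when every receiver is corrupted.

```python
# Sketch of per-word best-source selection across time-aligned receiver streams.
# Quality flags (True = word passed sync/CRC checks) and the hold-last-good
# fallback are assumptions for illustration, not the algorithms from the paper.

def merge_streams(words_by_rx, good_by_rx):
    """words_by_rx / good_by_rx: lists (one per receiver) of equal-length lists."""
    merged, last_good = [], None
    n_words = len(words_by_rx[0])
    for i in range(n_words):
        candidates = [words_by_rx[rx][i]
                      for rx in range(len(words_by_rx)) if good_by_rx[rx][i]]
        if candidates:
            last_good = candidates[0]       # any validated word will do here
            merged.append(last_good)
        else:
            merged.append(last_good)        # all receivers corrupted: hold last good
    return merged

rx_a = [10, 11, 99, 13]; ok_a = [True, True, False, True]
rx_b = [10, 55, 12, 13]; ok_b = [True, False, True, True]
print(merge_streams([rx_a, rx_b], [ok_a, ok_b]))   # -> [10, 11, 12, 13]
```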
|
1076 |
EXTENSION OF A COMMON DATA FORMAT FOR REAL-TIME APPLICATIONS / Wegener, John A.; Davis, Rodney L. / October 2004
International Telemetering Conference Proceedings / October 18-21, 2004 / Town & Country Resort, San Diego, California / The HDF5 (Hierarchical Data Format) data storage family is an industry-standard format that allows data to be stored in a common form and retrieved by a wide range of common tools. HDF5 is a widely accepted container for data storage developed by the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign. The
HDF5 data storage family includes HDF-Time History, intended for data processing, and HDF-Packet,
intended for real-time data collection; each of these is an extension to the basic HDF5
format, which defines data structures and associated interrelationships, optimized for that particular
purpose. HDF-Time History, developed jointly by Boeing and NCSA, is in the process of being
adopted throughout the Boeing test community and by its external partners. The Boeing/NCSA team
is currently developing HDF-Packet to support real-time streaming applications, such as airborne
data collection and recording of received telemetry. The principal advantage is a significant cost reduction that comes from storing the data in its final format, thus avoiding conversion between a myriad of recording and intermediate formats. In addition, by eliminating intermediate file translations and conversions, data integrity is maintained from recording through processing and archival storage. Furthermore, HDF5 is a general-purpose wrapper into which processed data and supporting documentation (such as calibrations) can be stored, making the final data file self-documenting.
This paper describes the basics of the HDF-Time History, the extensions required to support real-time
acquisition with HDF-Packet, and implementation issues unique to real-time acquisition. It
also describes potential future implementations for data acquisition systems in different segments of
the test data industry.
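The HDF-Time History and HDF-Packet layouts are Boeing/NCSA definitions not detailed in this abstract; as a generic illustration of the self-documenting idea, the sketch below writes a time-history channel with h5py and attaches calibration information as attributes. Group, dataset, and attribute names are placeholders.

```python
# Generic HDF5 sketch (not the HDF-Time History/HDF-Packet layouts themselves):
# store a time-history channel and make the file self-documenting via attributes.
import h5py
import numpy as np

time_s = np.linspace(0.0, 1.0, 1000)
strain = 0.002 * np.sin(2 * np.pi * 5 * time_s)          # placeholder measurement

with h5py.File("flight_demo.h5", "w") as f:
    grp = f.create_group("time_history/strain_gauge_01")  # hypothetical channel name
    grp.create_dataset("time", data=time_s)
    dset = grp.create_dataset("value", data=strain)
    dset.attrs["units"] = "strain"
    dset.attrs["calibration_slope"] = 1.0e-6              # example calibration record
    dset.attrs["calibration_date"] = "2004-09-15"

with h5py.File("flight_demo.h5", "r") as f:
    d = f["time_history/strain_gauge_01/value"]
    print(d.shape, dict(d.attrs))
```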
|
1077 |
Mental Health Readmissions Among Veterans: An Exploratory Endeavor Using Data Mining / Price, Lauren Emilie / January 2015
The purpose of this research is to inform the understanding of mental health readmissions by identifying associations between individual and environmental attributes and readmissions, with consideration of the impact of time-to-readmission within the Veterans Health Administration (VHA). Mental illness affects one in five adults in the United States (US). Mental health disorders are among the highest all-cause readmission diagnoses. The VHA is one of the largest national service providers of specialty mental health care. VHA's clinical practices and patient outcomes can be traced to US policy, and may be used to forecast national outcomes should these same policies be implemented nationwide. In this research, we applied three different data mining techniques to clinical data from over 200,000 patients across the VHA. Patients in this cohort consisted of adults receiving VHA inpatient mental health care between 2008 and 2013. The data mining techniques employed included k-means cluster analysis, association-rule mining, and decision tree analysis. K-means was used during cluster analysis to identify four statistically distinct clusters based on the combination of admission count, comorbidities, prescription (RX) count, age, casualty status, travel distance, and outpatient encounters. The association-rule mining analysis yielded multiple frequently occurring attribute values and sets consisting of service connection type, diagnoses/problems, and pharmaceuticals. Using the CHAID algorithm, the best decision tree model achieved 80% predictive accuracy when no readmissions were compared to 30-day readmissions. The strongest predictors of readmissions based on this algorithm were outpatient encounters, prescription count, VA Integrated Service Network (VISN), number of comorbidities, region, service connection, and period of service. Based on evidence from all three techniques, individuals with higher rates of system-wide utilization, more comorbidities, and longer medication lists are the most likely to have a 30-day readmission. These individuals represented 25% of this cohort, are sicker in general and may benefit from enrollment in a comprehensive nursing case management program.
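As a hedged sketch of the clustering step only, the example below runs k-means with four clusters over synthetic stand-ins for a few of the attributes named above; the real VHA data, feature engineering, and scaling choices are not reproduced.

```python
# Illustrative k-means over synthetic stand-ins for the named attributes
# (admission count, comorbidities, prescription count, outpatient encounters).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = np.column_stack([
    rng.poisson(2, 500),     # admission count
    rng.poisson(4, 500),     # comorbidity count
    rng.poisson(8, 500),     # prescription count
    rng.poisson(12, 500),    # outpatient encounters
]).astype(float)

X_scaled = StandardScaler().fit_transform(X)
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X_scaled)

for label in range(4):
    members = X[km.labels_ == label]
    print(f"cluster {label}: n={len(members)}, "
          f"mean utilization={members.mean(axis=0).round(1)}")
```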
|
1078 |
Introduction to XidML 3.0: An Open XML Standard for Flight Test Instrumentation Description / Cooke, Alan; Herbepin, Christian / October 2010
ITC/USA 2010 Conference Proceedings / The Forty-Sixth Annual International Telemetering Conference and Technical Exhibition / October 25-28, 2010 / Town and Country Resort & Convention Center, San Diego, California / A few years ago XidML was introduced as an open XML standard for capturing the meta-data associated with flight test instrumentation (FTI). This meta-data schema is broken down into elements for Parameter (name, range, units, offset-binary), Instrument (name, serial number, misses-to-loss), Package (bits per word, words per minor-frame, rate), Link (name, type), and so on. XidML remains one of the few published schemas for FTI meta-data, and XidML 3.0 introduces many simplifications along with support for nested tree structures and a single instrument schema that allows anyone to define the validation for instruments from any vendor. This paper introduces the XidML schema and describes the benefits of XidML 3.0 in particular. It begins by giving a brief description of what XidML is and describes its history and motivation. The paper then outlines the main differences between XidML 3.0 and earlier versions, and how the XidML schema has been further refined to meet the challenges faced by the FTI community. As an example of usage, the FTIManager software developed at Eurocopter is briefly presented to illustrate XidML's ability to describe a multi-vendor FTI configuration.
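As a rough illustration of parsing such a description, the sketch below reads an XML fragment whose element and attribute names (Parameter, Instrument, Link and their fields) are inferred from the abstract's wording; they are not taken from the actual XidML 3.0 schema, which should be consulted for the real structure.

```python
# Rough illustration of parsing FTI meta-data from an XML description.
# Element/attribute names are inferred from the abstract, not from XidML 3.0.
import xml.etree.ElementTree as ET

xml_text = """
<FTIDescription>
  <Instrument name="DAU-1" serialNumber="SN1234">
    <Parameter name="EngineTemp" units="degC" rangeMin="-50" rangeMax="650"/>
    <Parameter name="FuelFlow"  units="kg/h" rangeMin="0"   rangeMax="2000"/>
  </Instrument>
  <Link name="PCM-1" type="PCM"/>
</FTIDescription>
"""

root = ET.fromstring(xml_text)
for instrument in root.findall("Instrument"):
    print("Instrument:", instrument.get("name"), instrument.get("serialNumber"))
    for param in instrument.findall("Parameter"):
        print("  Parameter:", param.get("name"), param.get("units"),
              f"[{param.get('rangeMin')}, {param.get('rangeMax')}]")
```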
|
1079 |
DESIGN OF A CONFIGURATION AND MANAGEMENT TOOL FOR INSTRUMENTATION NETWORKS / Roach, John / October 2007
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / The development of network-based data acquisition systems has resulted in a new architecture for
supporting flight instrumentation that has the potential to revolutionize the way we test our aircraft.
However, the inherent capability and flexibility in a networked test architecture can only be
realized by the flight engineer if a sufficiently powerful toolset is available that can configure and
manage the system. This paper introduces the concept of an instrumentation configuration and
management system (ICMS) that acts as the central resource for configuring, controlling, and
monitoring the instrumentation network. Typically, the ICMS supports a graphical user interface
into the workings of the instrumentation network, providing the user with a friendly and efficient
way to verify the operation of the system. Statistics being gathered at different peripherals within
the network would be collected by this tool and formatted for interpretation by the user. Any error
conditions or out-of-bounds situations would be detected by the ICMS and signaled to the user.
Changes made to the operation of any of the peripherals in the network (if permitted) would be
managed by the ICMS to ensure consistency of the system. Furthermore, the ICMS could
guarantee that the appropriate procedures were being followed and that the operator had the
required privileges needed to make any changes. This paper describes the high-level design of a
modular and multi-platform ICMS and its use within the measurement-centric aircraft
instrumentation network architecture under development by the Network Products Division at
Teletronics.
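A minimal sketch of the monitoring role described above, with invented statistic names and limits rather than the actual ICMS design: per-peripheral statistics are gathered and any out-of-bounds condition is flagged for the user.

```python
# Minimal monitoring sketch in the spirit of an ICMS: gather peripheral
# statistics and flag out-of-bounds conditions. Names and limits are invented.
PERIPHERAL_STATS = {
    "dau_01":    {"packet_loss_pct": 0.02, "buffer_fill_pct": 41.0, "temp_c": 47.5},
    "switch_01": {"packet_loss_pct": 0.00, "buffer_fill_pct": 12.0, "temp_c": 39.0},
    "recorder":  {"packet_loss_pct": 1.75, "buffer_fill_pct": 96.0, "temp_c": 55.0},
}

LIMITS = {"packet_loss_pct": 1.0, "buffer_fill_pct": 90.0, "temp_c": 70.0}

def check_network(stats, limits):
    alarms = []
    for node, node_stats in stats.items():
        for stat, value in node_stats.items():
            if value > limits[stat]:
                alarms.append(f"{node}: {stat}={value} exceeds limit {limits[stat]}")
    return alarms

for alarm in check_network(PERIPHERAL_STATS, LIMITS):
    print("ALARM:", alarm)
```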
|
1080 |
Personlig integritet som informationspolitik : debatt och diskussion i samband med tillkomsten av Datalag (1973:289) / Privacy as information policy : debate and discussion concerning the first Swedish data protection law, Datalag (1973:289) / Söderlind, Åsa / January 2009
The dissertation explores the field of information policy in a historical setting in Sweden, namely the early 1970s. At the time, the question of privacy in relation to databanks, data systems and personal records was intensively discussed in all public media, starting in the fall of 1970 when a large-scale population census was carried out in Sweden. The political discussions and public debate resulted in the first Swedish data protection law, Datalag (1973:289), counted as one of the first pieces of national legislation of this type in the world. The focus of the empirical study lies in the analysis of the lines of argument, political reasoning and debates concerning privacy, data protection, information and technology in documents such as official reports, committee reports, proposals and parliamentary records and publications that were produced in the policy process preceding the new legislation. The public debate itself is investigated through the editorials and reports in the daily press of the time. A combination of discourse analysis and agenda-setting theory, as it is presented and used by the political scientist John W. Kingdon, constitutes the theoretical framework of the thesis. The study is introduced with a discussion concerning discourse and language use in politics, and here Norman Fairclough's CDA (Critical Discourse Analysis) has been the main inspiration. Kingdon's agenda-setting model contributes an interesting theoretical perspective on the social and political context of the discourses under study. The research questions also draw upon library and information science and theoretical work within the area of information policy, with issues concerning notions of information and technology, for example information as a public good versus a private good in the market, and information as a free or restricted/protected resource. The main findings of the study imply that the political discussion and debate on databanks and privacy were heavily influenced by a public-oriented discourse focusing mainly on governmental authorities’ own use of information systems holding personal data. The new legislation, Datalag (1973:289), could also be seen as a tool that sanctions governmental authorities’ extensive use of and dependence on new data technologies and automatic data-processing in building up the welfare state and the growing public sector. The discourse was also based on a mixed notion of the new technology, perceiving data technology mainly as the “big machine” containing a vast amount of personal information, at a time when the technology itself was transforming rapidly from bulky machines to personal computers. The practical effects of this discourse could be seen, for example, in the serious underestimation of the overall use of automatic data-processing in society as a whole, the use of which the legislation was set to regulate. When it comes to agenda-setting, the public debate, together with the activities of different actors in parliament, had a major influence on the outcome of the work of the commission of inquiry that was set up. The public debate affected how the problem area of databanks and privacy was considered, but the commission formulated the actual legislation independently, without interference or adjustments by the social democratic government. / Academic dissertation which, by permission of the Faculty of Social Sciences at Göteborgs universitet, is presented for public examination for the degree of Doctor at 13.15 on Friday, 11 September 2009, in lecture hall C204, Högskolan i Borås, Allégatan 1. The Swedish School of Library and Information Science (Institutionen Biblioteks- och Informationsvetenskap/Bibliotekshögskolan), Högskolan i Borås and Göteborgs universitet.
|