41

CAPS: AN EGLIN RANGE STANDARD FOR PC-BASED TELEMETRY DATA REDUCTION

Thomas, Tim 10 1900 (has links)
International Telemetering Conference Proceedings / October 21, 2002 / Town & Country Hotel and Conference Center, San Diego, California / A need exists to provide a flexible data reduction tool that minimizes software development costs and reduces analysis time for telemetry data. The Common Airborne Processing System (CAPS), developed by the Freeman Computer Sciences Center at Eglin AFB, Florida, provides a general-purpose data reduction capability for digitally recorded data on a PC. Data from virtually any kind of MIL-STD-1553 message or Pulse Code Modulation (PCM) frame can be extracted and converted to engineering units using a parameter dictionary that describes the data format. The extracted data can then be written to an ASCII or binary file, with a great deal of flexibility in the output format. CAPS has become the standard for digitally recorded data reduction on a PC at Eglin. New features, such as composing derived parameters using mathematical expressions, are being added to CAPS to make it an even more productive data reduction tool. This paper provides a conceptual overview of the CAPS version 2.3 software.
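The parameter-dictionary approach described above can be sketched in a few lines: a dictionary entry tells the tool where a parameter lives in the frame and how to scale it to engineering units. The dictionary layout, parameter names, and scale factors below are hypothetical illustrations, not the actual CAPS format.

```python
# Hypothetical parameter dictionary: name -> (word offset, scale, offset, unit).
# This is a sketch of the idea, not the real CAPS dictionary schema.
PARAMETER_DICTIONARY = {
    "altitude_ft": (3, 2.5, 0.0, "ft"),
    "airspeed_kts": (5, 0.125, -10.0, "kts"),
}

def extract_eu(frame, dictionary):
    """Extract raw words from a PCM minor frame and convert them to
    engineering units using the parameter dictionary."""
    results = {}
    for name, (word, scale, offset, unit) in dictionary.items():
        raw = frame[word]
        results[name] = (raw * scale + offset, unit)
    return results

# A toy 8-word frame with raw counts at the dictionary's word offsets.
frame = [0] * 8
frame[3] = 1000   # raw altitude count
frame[5] = 2000   # raw airspeed count
print(extract_eu(frame, PARAMETER_DICTIONARY))
```

Because the format knowledge lives entirely in the dictionary, supporting a new telemetry stream means writing a new dictionary rather than new extraction code, which is the cost saving the abstract describes.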
42

On-Board Data Processing and Filtering

Faber, Marc 10 1900 (has links)
ITC/USA 2015 Conference Proceedings / The Fifty-First Annual International Telemetering Conference and Technical Exhibition / October 26-29, 2015 / Bally's Hotel & Convention Center, Las Vegas, NV / One of the requirements resulting from mounting pressure on flight test schedules is the reduction of the time needed for data analysis, in pursuit of shorter test cycles. This requirement has ramifications such as the demand to record and process not just raw measurement data but also data converted to engineering units in real time, to make optimized use of the bandwidth available for the telemetry downlink, and ultimately to shorten the procedures that disseminate pre-selected recorded data among different analysis groups on the ground. A promising way to address these needs is to implement more CPU intelligence and processing power directly in the on-board flight test equipment, which provides the ability to process complex data in real time. For instance, data acquired at different hardware interfaces (which may be compliant with different standards) can be directly converted to easier-to-handle engineering units, leading to faster extraction and analysis of the actual data contents of the on-board signals and buses. Another central goal is the efficient use of the available telemetry bandwidth. Real-time data reduction via intelligent filtering is one approach to this challenging objective. The filtering process should be performed simultaneously with an all-data-capture recording, and the user should be able to easily select the interesting data without building PCM formats on board or carrying out decommutation on the ground. This data selection should be as easy as possible for the user, and the on-board FTI devices should provide a seamless and transparent data transmission, making quick data analysis viable.
On-board data processing and filtering has the potential to become the main path for handling the challenges of FTI data acquisition and analysis in a more convenient and effective way.
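The selective-filtering idea above can be illustrated with a minimal sketch: given an all-data-capture stream of (parameter ID, value) samples, the user simply names the parameters of interest, with no on-board PCM format building. The stream layout and the decimation option are assumptions for illustration, not the paper's actual mechanism.

```python
def filter_stream(samples, selected_ids, decimation=1):
    """Pass through only samples whose parameter ID was selected by the
    user, optionally decimating the result to save downlink bandwidth."""
    kept = [s for s in samples if s[0] in selected_ids]
    return kept[::decimation]

# Toy all-data capture: (parameter ID, value) pairs from several sources.
samples = [("p1", 1), ("p2", 2), ("p1", 3), ("p3", 4), ("p1", 5)]
print(filter_stream(samples, {"p1"}))  # -> [('p1', 1), ('p1', 3), ('p1', 5)]
```

The full capture is still recorded on board; only the filtered subset competes for telemetry bandwidth, which is the dual-path design the abstract advocates.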
43

Analyzing and sharing data for surface combat weapons systems

Wilhelm, Gary L. 12 1900 (has links)
Approved for public release, distribution is unlimited / Test and evaluation of system performance has been a critical part of the acceptance of combat weapon systems for the Department of Defense. As combat weapon systems have become more complex, evaluation of system performance has relied more heavily on recorded test data. As part of the ongoing transformation of the Department of Defense, Commercial-Off-The-Shelf (COTS) technology is being integrated into the acquisition of combat weapon systems. An Analysis Control Board (ACB) was created in response to these factors to support the AEGIS Weapon System Program Office. The focus of this ACB was to investigate and provide potential solutions to Data Dictionary, Data Recording and Data Reduction (R2D2) issues for the AEGIS Program Manager. This thesis discusses the history of the R2D2 ACB and its past, present and future directions. Additionally, this thesis examines how the R2D2 ACB concept could be applied to the DD(X) Next Generation Destroyer program. / Civilian, United States Navy
44

Scalable Performance Assessment of Industrial Assets: A Data Mining Approach

Dagnely, Pierre 21 June 2019 (has links) (PDF)
Nowadays, more and more industrial assets are continuously monitored and generate vast amounts of event logs and sensor data. Data mining is the field concerned with the exploration and exploitation of these data. Although data mining has been researched for decades, event log data are still underexploited in most data mining workflows, even though they could provide valuable insights into asset behavior, as they represent the internal processes of an asset. However, exploitation of event log data is challenging, mainly because: 1) event labels are not consistent across manufacturers; 2) assets report vast amounts of data, of which only a small part may be relevant; 3) textual event logs and numerical sensor data are usually processed by methods dedicated to one type or the other, and methods combining both types of data are still missing; 4) industrial data are rarely labelled, i.e. there is no indication of the actual performance of the asset, which has to be derived from other sources; 5) the meaning of an event may vary depending on the events sent before or after it. Concretely, this thesis is concerned with the conception and validation of an integrated data processing framework for scalable performance assessment of industrial asset portfolios.
This framework is composed of several advanced methodologies facilitating the exploitation of both event logs and time series sensor data: 1) an ontology model describing the photovoltaic (the validation domain) event system, allowing the integration of heterogeneous events generated by various manufacturers; 2) a novel and computationally scalable methodology enabling automatic calculation of event relevancy scores without any prior knowledge; 3) a semantically enriched multi-level pattern mining methodology enabling data exploration and hypothesis building across heterogeneous assets; 4) an advanced workflow extracting performance profiles by combining textual event logs and numerical sensor values; 5) a scalable methodology allowing rapid annotation of new asset runs with a known performance label based only on the event log data. The framework has been exhaustively validated on real-world data from PV plants, provided by our industrial partner 3E. However, the framework has been designed to be domain agnostic and can be adapted to other industrial assets reporting event logs and sensor data. / Doctorat en Sciences de l'ingénieur et technologie / info:eu-repo/semantics/nonPublished
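As an illustration of relevancy scoring without prior knowledge, one simple proxy is inverse-frequency weighting: event labels that appear in fewer assets' logs are assumed to carry more information, exactly as rare terms do in text retrieval. This is only a sketch of the general idea under that assumption; the thesis's actual scoring methodology is more elaborate.

```python
import math
from collections import Counter

def event_relevancy(logs):
    """Score each event label by its inverse document frequency across a
    portfolio of assets: a label reported by every asset scores 0, and
    rarer labels score higher. A toy proxy for relevancy, not the
    thesis's metric."""
    n_assets = len(logs)
    df = Counter()                      # in how many assets' logs each label appears
    for log in logs:
        for label in set(log):
            df[label] += 1
    return {label: math.log(n_assets / df[label]) for label in df}

# Toy portfolio: each inner list is one asset's event log.
logs = [["boot", "fault_42"], ["boot"], ["boot", "comm_loss"]]
scores = event_relevancy(logs)
```

Here "boot" appears in every asset's log and scores 0.0, while "fault_42" and "comm_loss" each appear once and score log(3), so a downstream workflow would surface them first.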
45

Data security and reliability in cloud backup systems with deduplication.

January 2012 (has links)
Cloud storage is an emerging service model that enables individuals and enterprises to outsource the storage of data backups to remote cloud providers at a low cost. This thesis presents methods to ensure the data security and reliability of cloud backup systems. / In the first part of this thesis, we present FadeVersion, a secure cloud backup system that serves as a security layer on top of today's cloud storage services. FadeVersion follows the standard version-controlled backup design, which eliminates the storage of redundant data across different versions of backups. On top of this, FadeVersion applies cryptographic protection to data backups. Specifically, it enables fine-grained assured deletion: cloud clients can assuredly delete particular backup versions or files on the cloud and make them permanently inaccessible to anyone, while other versions that share the common data of the deleted versions or files remain unaffected. We implement a proof-of-concept prototype of FadeVersion and conduct empirical evaluation atop Amazon S3. We show that FadeVersion only adds minimal performance overhead over a traditional cloud backup service that does not support assured deletion.
/ In the second part of this thesis, we present CFTDedup, a distributed proxy system designed to provide storage efficiency via deduplication in cloud storage, while ensuring crash fault tolerance among proxies. It synchronizes deduplication metadata among proxies to provide strong consistency. It also batches metadata updates to mitigate synchronization overhead. We implement a preliminary prototype of CFTDedup and evaluate its runtime performance in deduplication storage for virtual machine images via testbed experiments. We also discuss several open issues on how to provide reliable, high-performance deduplication storage. Our CFTDedup prototype provides a platform to explore such issues. / Rahumed, Arthur. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2012. / Includes bibliographical references (leaves 47-51). / Abstracts also in Chinese. / Contents:
1. Introduction (p.1): Cloud Based Backup and Assured Deletion; Crash Fault Tolerance for Backup Systems with Deduplication; Outline of Thesis
2. Background and Related Work (p.7): Deduplication; Assured Deletion; Policy Based Assured Deletion; Convergent Encryption; Cloud Based Backup Systems; Fault Tolerant Deduplication Systems
3. Design of FadeVersion (p.12): Threat Model and Assumptions; Motivation; Main Idea; Version Control; Assured Deletion; Assured Deletion for Multiple Policies; Key Management
4. Implementation of FadeVersion (p.20): System Entities; Metadata Format in FadeVersion
5. Evaluation of FadeVersion (p.24): Setup; Backup/Restore Time; Storage Space; Monetary Cost; Conclusions
6. CFTDedup Design (p.31): Failure Model; System Overview; Distributed Deduplication; Crash Fault Tolerance; Implementation
7. Evaluation of CFTDedup (p.37): Setup; Experiment 1 (Archival); Experiment 2 (Restore); Experiment 3 (Recovery); Summary
8. Future Work and Conclusions of CFTDedup (p.43): Future Work; Conclusions
9. Conclusion (p.45)
Bibliography (p.47)
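The fine-grained assured deletion idea — each backup version's unique data is encrypted under its own control key, and deleting a version means discarding that key — can be sketched as follows. The SHA-256 counter-mode stream cipher here is a toy construction for illustration only, not FadeVersion's actual cryptography or key-management scheme.

```python
import hashlib
import os

def keystream(key, length):
    """Toy stream cipher: SHA-256 in counter mode (illustration only,
    not a vetted cryptographic construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key, data):
    """XOR data with the keystream; applying it twice decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# Each backup version gets its own control key. Assured deletion of a
# version = discarding its key: the stored ciphertext becomes
# unreadable, while other versions (under their own keys) are unaffected.
keys = {1: os.urandom(32), 2: os.urandom(32)}
ct_v1 = encrypt(keys[1], b"version 1 unique data")
ct_v2 = encrypt(keys[2], b"version 2 unique data")

del keys[1]  # assured deletion of version 1
assert encrypt(keys[2], ct_v2) == b"version 2 unique data"  # v2 still readable
```

In the real system the shared data across versions must also be handled, which is why FadeVersion combines this per-version keying with its version-controlled deduplicating layout.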
46

Wavelet-based Data Reduction and Mining for Multiple Functional Data

Jung, Uk 12 July 2004 (has links)
Advances in technology, such as automatic data acquisition, management, and networking systems, have created a tremendous capability for managers to access valuable production information to improve their operation quality and efficiency. Signal processing and data mining techniques are more popular than ever in many fields, including intelligent manufacturing. As data sets increase in size, their exploration, manipulation, and analysis become more complicated and resource consuming. Timely synthesized information such as functional data is needed for product design, process trouble-shooting, quality/efficiency improvement, and resource allocation decisions. A major obstacle in such intelligent manufacturing systems is that tools for processing the large volume of information coming from the numerous stages of manufacturing operations are not available. Thus, the underlying theme of this thesis is to reduce the size of data in a mathematically rigorous framework, and to apply existing or new procedures to the reduced-size data for various decision-making purposes. This thesis first proposes a Wavelet-based Random-effect Model, which can generate multiple functional data signals with wide fluctuations (between-signal variations) in the time domain. The random-effect wavelet atom position in the model has a locally focused impact, which distinguishes it from other traditional random-effect models in the biological field. For data-size reduction, in order to deal with heterogeneously selected wavelet coefficients for different single curves, this thesis introduces the newly defined Wavelet Vertical Energy metric of multiple curves and utilizes it for an efficient data reduction method.
The newly proposed method selects important positions for the whole set of multiple curves by comparing each vertical energy metric against a threshold (the Vertical Energy Threshold, VET), which is decided optimally based on an objective function. The objective function balances the reconstruction error against a data reduction ratio. Based on the class membership information of each signal, this thesis then proposes the Vertical Group-Wise Threshold method to increase the discriminative capability of the reduced-size data, so that the reduced data set retains salient differences between classes as much as possible. A real-life example (tonnage data) shows that the proposed method is promising.
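The vertical-energy selection idea can be sketched minimally: transform each curve, sum the squared coefficients position by position across all curves, and keep the positions whose pooled energy exceeds the VET. This sketch uses a one-level Haar transform and a fixed threshold rather than the optimized VET described above; the function names and layout are illustrative assumptions.

```python
def haar_dwt(signal):
    """One-level Haar transform of an even-length signal, returning
    approximation coefficients followed by detail coefficients."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx + detail

def vertical_energy(curves):
    """Sum of squared coefficient values at each position, pooled
    'vertically' across all curves in the set."""
    coeffs = [haar_dwt(c) for c in curves]
    return [sum(c[i] ** 2 for c in coeffs) for i in range(len(coeffs[0]))]

def select_positions(curves, vet):
    """Keep coefficient positions whose vertical energy exceeds the VET,
    so every curve is reduced to the same shared positions."""
    return [i for i, e in enumerate(vertical_energy(curves)) if e > vet]

# Two toy curves of length 4; both concentrate energy in the first
# approximation coefficient.
curves = [[4, 4, 0, 0], [3, 5, 1, 1]]
```

Selecting one common set of positions for all curves is what distinguishes this from thresholding each curve independently, which would keep different coefficients for different curves.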
47

Wavelet-Based Methodology in Data Mining for Complicated Functional Data

Jeong, Myong-Kee 04 April 2004 (has links)
To handle potentially large and complicated nonstationary functional data, we present a wavelet-based methodology in data mining for process monitoring and fault classification. Since traditional wavelet shrinkage methods for data de-noising are ineffective for the more demanding data reduction goals, this thesis presents data reduction methods based on the discrete wavelet transform. Our new methods minimize objective functions that balance the tradeoff between data reduction and modeling accuracy. Several evaluation studies, with four popular testing curves from the literature and with two real-life data sets, demonstrate the superiority of the proposed methods over the engineering data compression and statistical data de-noising methods currently used to achieve data reduction goals. Further experimentation, applying a classification tree-based data mining procedure to the reduced-size data to identify process fault classes, also demonstrates the strength of the proposed methods: compared with analysis of the original large-size data, they yield lower misclassification rates with much better computational efficiency. This thesis also extends the scalogram's ability to handle noisy and possibly massive data that show time-shifted patterns. The proposed thresholded scalogram is built on the fast wavelet transform, which can effectively and efficiently capture non-stationary changes in data patterns. Finally, we present an SPC procedure that adaptively determines which wavelet coefficients to monitor, based on their shift information, which is estimated from process data. By adaptively monitoring the process, we can improve the performance of control charts for functional data. Using a simulation study, we compare the performance of some of the recommended approaches.
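The tension between shrinkage and reduction that the abstract describes can be illustrated with standard soft thresholding plus a toy objective that penalizes both reconstruction error and the fraction of coefficients retained. The exact form of the objective below is an assumption for illustration, not the thesis's actual formulation.

```python
def soft_threshold(coeffs, t):
    """Soft-threshold wavelet coefficients: shrink each toward zero by t,
    zeroing those with magnitude below t (standard wavelet shrinkage)."""
    return [max(abs(c) - t, 0.0) * (1 if c > 0 else -1) for c in coeffs]

def reduction_objective(coeffs, t, lam=0.5):
    """Toy objective balancing relative reconstruction error against the
    fraction of coefficients retained; lam weighs reduction vs. accuracy."""
    kept = soft_threshold(coeffs, t)
    err = sum((c - k) ** 2 for c, k in zip(coeffs, kept))
    total = sum(c ** 2 for c in coeffs) or 1.0
    ratio = sum(1 for k in kept if k != 0.0) / len(coeffs)
    return err / total + lam * ratio

# Sweeping t and picking the minimizer of reduction_objective is the
# shape of the optimization the data reduction methods perform.
print(reduction_objective([4.0, 0.0], 0.0))
```

De-noising alone would pick t purely for statistical risk; adding the retained-fraction term is what pushes the solution toward the more aggressive reduction the thesis targets.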
48

The incorporation of World War II experiences in the life stories of alumni from the Vrije University in Amsterdam: an exploration at the crossroads between narrative, identity, and culture

Visser, Roemer Maarten Sander 15 May 2009 (has links)
For this study, twelve life stories of alumni from the Vrije Universiteit in Amsterdam, who were enrolled during the Nazi Occupation between 1940 and 1945, were collected and analyzed. Besides exploring the extent to which the interviews were co-constructed jointly by the interviewer and interviewees, this study addresses three questions. First, it acknowledges methodological concerns associated with an overabundance of narrative data, and suggests a new method for arriving at a core narrative based on the distribution of time. This core narrative can then be analyzed further. Second, it is suggested that early memories serve as identity claims; because of their congruency with the remainder of the story, they appear to foreshadow what is to come. As a result, it is argued that childhood memories merit special attention in the analysis of narratives. Third, and finally, the constraints on narratives imposed by cultural conventions, or master narratives, are explored. Narrators use a variety of strategies in order to satisfy sometimes competing demands on their narratives. It is argued that culture makes its influence felt in ways that are not always obvious, particularly if the interviewee and interviewer share the same culture.
49

Boolean factor analysis: a review of a novel method of matrix decomposition and neural network Boolean factor analysis

Upadrasta, Bharat. January 2009 (has links)
Thesis (M.S.)--State University of New York at Binghamton, Thomas J. Watson School of Engineering and Applied Science, Department of Systems Science and Industrial Engineering, 2009. / Includes bibliographical references.
50

Dimensionality Reduction for Commercial Vehicle Fleet Monitoring

Baldiwala, Aliakbar 25 October 2018 (has links)
A variety of new features have been added to present-day vehicles, such as pre-crash warning, vehicle-to-vehicle communication, semi-autonomous driving systems, telematics, and drive-by-wire. These features demand very high bandwidth from in-vehicle networks. The various electronic control units inside the vehicle transmit useful information via automotive multiplexing, which allows information to be shared among the intelligent modules of an automotive electronic system. Optimum functionality is achieved by transmitting this data in real time. The high-bandwidth, high-speed requirement can be met either by using multiple buses or by implementing a higher-bandwidth bus, but doing so increases the cost of the network and the complexity of the wiring in the vehicle. Another option is to implement a higher-layer protocol that reduces the amount of data transferred using data reduction (DR) techniques, thus reducing bandwidth usage. The implementation cost is minimal, as changes are required only in software, not in hardware. In our work, we present a new data reduction algorithm termed the Comprehensive Data Reduction (CDR) algorithm. The proposed algorithm is used to minimize CAN bus utilization for a future vehicle. The busload is reduced by compressing the parameters, so that more messages, including lower-priority messages, can be sent efficiently on the CAN bus. The proposed work also presents a performance analysis of the proposed algorithm against the Boundary of Fifteen Compression (BFC) and Compression Area Selection (CAS) algorithms (existing data reduction algorithms). The results of the analysis show that the proposed CDR algorithm provides better data reduction than the earlier algorithms, with promising results in terms of reduction in bus utilization, compression efficiency, and percent peak load of the CAN bus.
This reduction in bus utilization permits a larger number of network nodes (ECUs) in the existing system without increasing its overall cost. The proposed algorithm was developed for the automotive environment, but it can also be used in any application where extensive information transmission among various control units is carried out via a multiplexing bus.
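One classic scheme in this family of data reduction techniques is deadband (send-on-delta) transmission: a signal value is put on the bus only when it deviates from the last transmitted value by more than a deadband, so slowly changing parameters consume almost no bandwidth. This is a generic illustration of DR on a multiplexed bus, not the CDR algorithm itself.

```python
def deadband_reduce(samples, deadband):
    """Keep a (time, value) sample only when it deviates from the last
    transmitted value by more than the deadband; receivers treat the
    last transmitted value as current in between."""
    out = []
    last = None
    for t, v in samples:
        if last is None or abs(v - last) > deadband:
            out.append((t, v))
            last = v
    return out

# Toy signal: small jitter around 10.0, then a real step to 11.0.
samples = [(0, 10.0), (1, 10.1), (2, 10.2), (3, 11.0), (4, 11.05)]
print(deadband_reduce(samples, 0.5))  # keeps (0, 10.0) and (3, 11.0)
```

The deadband trades reconstruction accuracy for busload exactly as the abstract describes: a wider deadband frees more bandwidth for additional ECUs at the cost of coarser signal tracking.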
