  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

On the integration of qualitative and quantitative methods in data fusion

Gao, Yang January 1994 (has links)
No description available.
2

Design, Implementation and Use of In-Process Sensor Data for Monitoring Broaching and Turning Processes: A Multi - Sensor Approach

Rathinam, Arvinth Chandar 02 June 2013 (has links)
Real-time quality monitoring continues to gain interest within the manufacturing domain as new and faster sensors are developed. Unfortunately, most quality monitoring solutions are still based on collecting data from the end product. From a process improvement point of view, it is more advantageous to proactively monitor quality directly in the process rather than in the product, so that the consequences of a defective part can be minimized or even eliminated. In this dissertation, new methods for in-line process monitoring are explored using multiple sensors. In the first case, a new cutting force-based monitoring methodology was developed to detect out-of-control conditions in a broaching operation. The second part of this thesis focuses on the development of a test bed for monitoring tool condition in a turning operation. The constructed test bed combines signals from multiple sensors, including temperature, vibration, and energy measurements. Here, the proposed SPC strategy integrates sensor data with engineering knowledge to produce quick, reliable results using proven profile monitoring techniques, whereas existing methods are based on raw process data and require more features to monitor the process without loss of information. The proposed technique is straightforward and monitors the process comprehensively with fewer features. Consequently, it also adds to the set of tools available to the practitioner. / Master of Science
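The abstract does not detail the SPC strategy itself; as a generic illustration of detecting out-of-control conditions in a force signal, a minimal Shewhart individuals chart could look like the sketch below. The 3-sigma rule, the baseline window, and all force values are assumptions for illustration, not taken from the thesis.

```python
# Minimal sketch of out-of-control detection on a cutting-force signal
# using a Shewhart individuals chart. Limits and data are illustrative.

def control_limits(baseline):
    """Estimate center line and 3-sigma limits from in-control data."""
    n = len(baseline)
    mean = sum(baseline) / n
    var = sum((x - mean) ** 2 for x in baseline) / (n - 1)
    sigma = var ** 0.5
    return mean - 3 * sigma, mean, mean + 3 * sigma

def out_of_control(samples, lcl, ucl):
    """Return indices of samples falling outside the control limits."""
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]

baseline = [100.2, 99.8, 100.5, 99.9, 100.1, 100.3, 99.7, 100.0]
lcl, cl, ucl = control_limits(baseline)
alarms = out_of_control([100.1, 99.9, 104.5, 100.2], lcl, ucl)
print(alarms)  # prints [2]: the 104.5 reading exceeds the upper limit
```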
3

SensAnalysis: A Big Data Platform for Vibration-Sensor Data Analysis

Kumar, Abhinav 26 June 2019 (has links)
The Goodwin Hall building on the Virginia Tech campus is the most instrumented building for vibration monitoring. It houses 225 hard-wired accelerometers which record vibrations arising from internal as well as external activities. The recorded vibration data can be used to develop real-time applications for monitoring the health of the building or detecting human activity in it. However, the lack of infrastructure to handle the massive scale of the data and the steep learning curve of the tools required to store and process it are major deterrents for researchers performing their experiments. Additionally, researchers want to explore the data to determine the type of experiments they can perform. This work addresses these problems by providing a system to store and process the data using existing big data technologies. The system simplifies big data analysis by supporting code re-usability and multiple programming languages. The effectiveness of the system was demonstrated by four case studies. Additionally, three visualizations were developed to help researchers with initial data exploration. / Master of Science / The Goodwin Hall building on the Virginia Tech campus is an example of a ‘smart building.’ It uses sensors to record the response of the building to various internal and external activities. The recorded data can be used by algorithms to facilitate understanding of the properties of the building or to detect human activity. Accordingly, researchers in the Virginia Tech Smart Infrastructure Lab (VTSIL) run experiments using a part of the complete data. Ideally, they want to run their experiments continuously as new data is collected. However, the massive scale of the data makes it difficult to process new data as soon as it arrives and to make it available immediately to the researchers. The technologies that can handle data at this scale have a steep learning curve, and starting to use them requires much time and effort. This project involved building a system to handle these challenges so that researchers can focus on their core area of research. The system provides visualizations depicting various properties of the data to help researchers explore the data before running an experiment. The effectiveness of this work was demonstrated using four case studies, based on actual experiments conducted by VTSIL researchers in the past. The first three case studies help in understanding the properties of the building, whereas the final case study deals with detecting and locating human footsteps on one of the floors in real time.
4

The iLog methodology for fostering valid and reliable Big Thick Data

Busso, Matteo 29 April 2024 (has links)
Nowadays, the apparent promise of Big Data is that of being able to understand people's behavior in their daily lives in real time. However, as big as these data are, many useful variables describing a person's context (e.g., where she is, with whom, what she is doing, and her feelings and emotions) are still unavailable. Therefore, people are, at best, thinly described. One solution is to collect Big Thick Data via blending techniques, combining sensor data sources with high-quality ethnographic data to generate a dense representation of the person's context. As attractive as the proposal is, the approach is difficult to integrate into research paradigms dealing with Big Data, given the high cost of data collection and integration and the expertise needed to manage them. Starting from a quantified approach to Big Thick Data, based on the notion of situational context, this thesis proposes a methodology to design, collect, and prepare reliable and valid quantified Big Thick Data for the purposes of reuse. Furthermore, the methodology is supported by a set of services to foster its replicability. The methodology has been applied in 4 case studies involving many domain experts and 10,000+ participants from 10 countries. The diverse applications of the methodology and the reuse of the data for multiple purposes demonstrate its internal validity and reliability.
5

Evaluation of Spectrum Data Compression Algorithms for Edge-Applications in Industrial Tools

Ring, Johanna January 2024 (has links)
Data volume grows each day as more and more is digitalized, which puts data management to the test. The smart tools developed by Atlas Copco save and transmit data to the cloud as a service, allowing their customers to review errors in tightenings. A problem is the amount of data lost in this process. A tightening cycle usually contains thousands of data points, and storing them all exceeds the tool's hardware capacity. Today many of the data points are deleted, and only a small portion of scattered data from the cycle is saved and transmitted. To avoid overfilling the storage space, the data needs to be minimized. This study focuses on comparing data compression algorithms that could solve this problem.   In an initial literature study, numerous data compression algorithms were identified, along with their advantages and disadvantages. Two types of compression algorithms are defined: lossy compression, where data is compressed by losing data points or precision, and lossless compression, where no data is lost. Two lossy and two lossless algorithms were selected to be evaluated with respect to compression ratio, speed, and error tolerance. Poor Man's Compression - Midrange (PMC-MR) and SWING-filter are the lossy algorithms, while Gorilla and Fixed-Point Compression (FPC) are the lossless ones.   The achieved compression ratios ranged from 39% to 99%. As combinations of a lossy and a lossless algorithm yield the best compression ratios at lower error tolerance, PMC-MR with Gorilla is suggested as best suited for Atlas Copco's needs.
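PMC-MR, one of the lossy algorithms evaluated above, replaces each run of values by the midrange of that run, opening a new segment whenever the running min-max spread would exceed twice the error bound. A minimal sketch of the standard formulation follows; the error bound and sample series are invented, and the thesis's actual implementation may differ.

```python
# Sketch of Poor Man's Compression - Midrange (PMC-MR): a lossy,
# piecewise-constant compressor with a guaranteed max error of eps.
# Each segment is stored as (length, midrange). Values are illustrative.

def pmc_mr(values, eps):
    """Compress values into (count, midrange) segments, error <= eps."""
    segments = []
    lo = hi = values[0]
    count = 0
    for v in values:
        if max(hi, v) - min(lo, v) > 2 * eps:
            segments.append((count, (lo + hi) / 2))  # close segment
            lo = hi = v
            count = 1
        else:
            lo, hi = min(lo, v), max(hi, v)
            count += 1
    segments.append((count, (lo + hi) / 2))          # flush last segment
    return segments

def decompress(segments):
    """Expand each segment back to `count` copies of its midrange."""
    return [mid for count, mid in segments for _ in range(count)]

data = [1.0, 1.1, 0.9, 5.0, 5.2, 5.1]
segs = pmc_mr(data, eps=0.5)
restored = decompress(segs)
# Reconstruction error never exceeds the chosen bound:
assert all(abs(a - b) <= 0.5 for a, b in zip(data, restored))
```

The 2*eps test is what guarantees the error bound: if the spread of a segment is at most 2*eps, every value lies within eps of the midrange.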
6

A visual query language served by a multi-sensor environment

Camara (Silvervarg), Karin January 2007 (has links)
A problem in modern command and control situations is that much data is available from different sensors. Several sensor data sources also require that the user have knowledge about the specific sensor types to be able to interpret the data. To alleviate the working situation for a commander, we have designed and constructed a system that takes input from several different sensors and presents the relevant combined information to the user. The users specify what kind of information is of interest at the moment by means of a query language. The main issues when designing this query language have been that (a) the users should not need any knowledge about sensors or sensor data analysis, and (b) the query language should be powerful and flexible, yet easy to use. The solution has been to (a) use sensor data independence and (b) provide a visual query language. A visual query language was developed with a two-step interface. First, the users pose a “rough”, simple query that is evaluated by the underlying knowledge system. The system returns the relevant information that can be found in the sensor data. Then, the users have the possibility to refine the result by setting conditions, formulated by specifying attributes of objects or relations between objects. The problem of uncertainty in spatial data (i.e., location and area) has been considered, including the question of how to represent potential uncertainties. An investigation was carried out to find which relations are practically useful when dealing with uncertain spatial data. The query language has been evaluated by means of a scenario. The scenario was inspired by real events and was developed in cooperation with a military officer to ensure that it was fairly realistic. The scenario was simulated using several tools, of which the query language was one of the more central ones. It proved that the query language can be of use in realistic situations. / Report code: LiU-Tek-Lic-2007:42.
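The two-step interface described above (a rough, sensor-independent query followed by user-defined refinement conditions) can be sketched as follows. The object fields, predicates, and sample detections are invented for illustration; the thesis's query language is visual, not programmatic.

```python
# Toy sketch of the two-step query pattern: step 1 selects candidate
# objects from already-fused sensor data without exposing any sensor
# details; step 2 refines the result with attribute conditions.
# All fields and sample data are hypothetical.

detections = [
    {"type": "vehicle", "speed": 65, "area": "north"},
    {"type": "vehicle", "speed": 20, "area": "south"},
    {"type": "person",  "speed": 5,  "area": "north"},
]

def rough_query(objects, obj_type):
    """Step 1: simple query, independent of any sensor specifics."""
    return [o for o in objects if o["type"] == obj_type]

def refine(objects, **conditions):
    """Step 2: keep objects whose attributes satisfy all predicates."""
    return [o for o in objects
            if all(pred(o[attr]) for attr, pred in conditions.items())]

vehicles = rough_query(detections, "vehicle")
fast_north = refine(vehicles, speed=lambda v: v > 50,
                    area=lambda a: a == "north")
print(len(fast_north))  # prints 1
```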
8

Interactive Web-based Visualization Tool to Support Inquiry-based Science Learning

Johansson, Emil January 2010 (has links)
This thesis introduces the idea of an interactive web-based visualization tool to support inquiry-based science learning. A problem that occurs when teachers and students discuss collected data is that they lack a tool to display such large quantities of data, which are often hard to fully understand. This educational tool makes use of different visualization approaches to support students in gaining insights from their collected data. In this thesis I propose and implement an interactive web-based visualization tool that was used at a prototype level during educational activities. Requirements and user needs led the development of this prototype; requirement elicitation was done as part of the research project conducted by CeLeKT.

For the development of this tool, input from teachers and students was necessary in order to understand the requirements. The initial inquiry with teachers and students shows the necessity and usefulness of an interactive web-based visualization tool to support learning practices.
9

Wireless sensor data processing for on-site emergency response

Yang, Yanning January 2011 (has links)
This thesis is concerned with the problem of processing data from Wireless Sensor Networks (WSNs) to meet the requirements of emergency responders (e.g. Fire and Rescue Services). A WSN typically consists of spatially distributed sensor nodes that cooperatively monitor physical or environmental conditions. Sensor data about these conditions can then be used as part of the input to predict, detect, and monitor emergencies. Although WSNs have demonstrated their great potential in facilitating Emergency Response, sensor data cannot be interpreted directly due to its large volume, noise, and redundancy. In addition, emergency responders are not interested in raw data; they are interested in the meaning it conveys. This thesis presents research on processing and combining data from multiple types of sensors, and combining sensor data with other relevant data, for the purpose of obtaining data of greater quality and information of greater relevance to emergency responders. The current theory and practice in Emergency Response and the existing technology aids were reviewed to identify the requirements from both application and technology perspectives (Chapter 2). The process of information extraction from sensor data and sensor data fusion techniques were reviewed to identify what constitutes suitable sensor data fusion techniques and the challenges present in sensor data processing (Chapter 3). A study of Incident Commanders' requirements utilised a goal-driven task analysis method to identify gaps in current means of obtaining relevant information during response to fire emergencies, and a list of opportunities for WSN technology to fill those gaps (Chapter 4). A high-level Emergency Information Management System Architecture was proposed, including the main components needed, the interaction between components, and system function specifications at different incident stages (Chapter 5). A set of state-awareness rules was proposed and integrated with a Kalman filter to improve filtering performance. The proposed data pre-processing approach achieved both improved outlier removal and quick detection of real events (Chapter 6). A data storage mechanism was proposed to support timely response to queries regardless of the increase in the volume of data (Chapter 7). What can be considered “meaning” (e.g. events) for emergency responders was identified, and a generic emergency event detection model was proposed to identify patterns present in sensor data and associate those patterns with events (Chapter 8). In conclusion, the added benefits that the technical work can provide to current Emergency Response are discussed, and specific contributions and future work are highlighted (Chapter 9).
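The thesis's state-awareness rules are specific to its emergency-response context, but the underlying idea of combining a Kalman filter with outlier rejection can be illustrated with a generic innovation gate: measurements that deviate from the prediction by more than a few standard deviations are treated as outliers. All parameters and the sensor readings below are invented.

```python
# Illustrative 1-D Kalman filter with a simple g-sigma innovation gate.
# This shows only the generic filter-plus-gate pattern, not the thesis's
# state-awareness rules. Noise parameters and data are made up.

def kalman_with_gate(zs, q=1e-3, r=0.5, g=3.0):
    """Filter readings zs; reject measurements beyond g sigma."""
    x, p = zs[0], 1.0          # initial state estimate and its variance
    out = [x]
    for z in zs[1:]:
        p = p + q              # predict (random-walk state model)
        s = p + r              # innovation variance
        if abs(z - x) > g * s ** 0.5:
            out.append(x)      # gated: treat measurement as an outlier
            continue
        k = p / s              # Kalman gain
        x = x + k * (z - x)    # measurement update
        p = (1 - k) * p
        out.append(x)
    return out

readings = [20.0, 20.1, 19.9, 85.0, 20.2, 20.0]  # 85.0 is a sensor spike
est = kalman_with_gate(readings)
# The spike at index 3 is rejected; the estimate stays near 20.
```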
10

Ground Target Tracking with Multi-Lane Constraint

Chen, Yangsheng 15 May 2009 (has links)
Knowledge of the lane in which a target is located is of particular interest in on-road surveillance and target tracking systems. We formulate the problem and propose two approaches for on-road target estimation with lane tracking. The first approach is lane identification based on a Hidden Markov Model (HMM) framework. Two identifiers are developed according to different optimality goals: optimality of the whole lane sequence, and optimality of the target's current lane given the whole observation sequence. The second approach is on-road target tracking with lane estimation. We propose a 2D road representation that additionally allows the lateral motion of the target to be modeled. For fusion of radar and image sensor measurement data we develop three IMM-based estimators that use different fusion schemes: centralized, distributed, and sequential. Simulation results show that the two proposed methods provide new capabilities and achieve improved estimation accuracy for on-road target tracking.
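Identifying the optimal whole lane sequence in an HMM is classically done with the Viterbi algorithm. The toy decoder below illustrates that framing with two lanes and binned lateral-position observations; all probabilities and observations are invented and much simpler than the thesis's models.

```python
# Toy Viterbi decoder for lane identification: hidden states are lanes,
# observations are noisy lateral-position bins ("L"/"R"). Transition and
# emission probabilities are invented for illustration.

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely hidden-state (lane) sequence for obs."""
    v = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        v.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (v[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            v[t][s] = prob
            back[t][s] = prev
    last = max(v[-1], key=v[-1].get)      # best final state
    path = [last]
    for t in range(len(obs) - 1, 0, -1):  # backtrack
        path.append(back[t][path[-1]])
    return path[::-1]

lanes = ["left", "right"]
start = {"left": 0.5, "right": 0.5}
trans = {"left": {"left": 0.9, "right": 0.1},
         "right": {"left": 0.1, "right": 0.9}}
emit = {"left": {"L": 0.8, "R": 0.2},
        "right": {"L": 0.2, "R": 0.8}}
print(viterbi(["L", "L", "R", "R"], lanes, start, trans, emit))
# prints ['left', 'left', 'right', 'right']: a single lane change
```

The sticky transition probabilities (0.9 to stay in lane) are what smooth out observation noise, mirroring the sequence-level optimality goal described above.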