  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Robust Facility Location With Mobile Customers

Gul, Evren 01 June 2011 (has links) (PDF)
In this thesis, we study the dynamic facility location problem with mobile customers, in which permanently located facilities must serve customers who move over time. Our general aim is to locate facilities while accounting for these movements. The problem is studied under three objectives: P-median, P-center, and MINMAX P-median. We show that, for the P-median and P-center objectives, the dynamic facility location problem reduces to a large instance of a static facility location problem. We represent each customer's movement over time as a time series and develop a clustering-based heuristic for the P-median objective: the K-means algorithm is used for clustering, with dynamic time warping defining the similarity between customer time series. The solution method is tested on several experimental settings; the results deviate from the optimum by at most 2% and are obtained in short computation times. In the literature, MINMAX P-median is generally solved with a scenario-planning heuristic (see Serra and Marianov, 1998), which finds an initial solution from the scenarios and then attempts to improve it. We provide a bounding procedure for the solution of the problem; the bounds allow the decision maker to judge solution quality before proceeding. The bounding procedure is also analyzed in different experimental settings.
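The similarity measure named in the abstract, dynamic time warping, can be sketched with the standard dynamic program (an illustrative implementation, not the thesis' code; the function name is ours):

```python
def dtw_distance(x, y):
    """Dynamic time warping distance between two 1-D sequences,
    computed with the standard O(len(x) * len(y)) dynamic program."""
    n, m = len(x), len(y)
    INF = float("inf")
    # cost[i][j] = minimal accumulated cost of aligning x[:i] with y[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(x[i - 1] - y[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # x[i-1] repeats
                                 cost[i][j - 1],      # y[j-1] repeats
                                 cost[i - 1][j - 1])  # one-to-one match
    return cost[n][m]
```

Unlike a pointwise Euclidean comparison, DTW lets one series stretch locally against the other, so two customer trajectories that follow the same route at different speeds still come out as similar.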
22

A Design and Applications of Mandarin Keyword Spotting System

Hou, Cheng-Kuan 11 August 2003 (has links)
A Mandarin keyword spotting system based on MFCCs, discrete-time HMMs, and the Viterbi algorithm combined with DTW is proposed in this thesis. Coupled with a dialogue system, the keyword spotting platform is further refined into a prototype natural-speech patient registration system for Kaohsiung Veterans General Hospital. After the computer-dialogue attendant asks for the ID number, the user can complete all the relevant steps of the registration process in one sentence. Functions for looking up clinic doctors and for making and canceling registrations are all built into the system. In a laboratory environment, the recognition accuracy of this speaker-independent patient registration system reaches 97%, and the whole registration process can be completed within 75 seconds.
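The Viterbi decoding that a discrete-HMM recognizer of this kind relies on can be sketched as follows (a generic textbook illustration, not this system's implementation; the toy two-state model in the test is invented):

```python
def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for a discrete HMM.
    obs: observation symbol indices; pi: initial state probabilities;
    A: transition matrix A[r][s]; B: emission matrix B[state][symbol]."""
    n_states = len(pi)
    # delta[s] = probability of the best path ending in state s so far
    delta = [pi[s] * B[s][obs[0]] for s in range(n_states)]
    psi = []  # backpointers, one list per time step after the first
    for t in range(1, len(obs)):
        prev, delta = delta, []
        back = []
        for s in range(n_states):
            best_prev = max(range(n_states), key=lambda r: prev[r] * A[r][s])
            delta.append(prev[best_prev] * A[best_prev][s] * B[s][obs[t]])
            back.append(best_prev)
        psi.append(back)
    # backtrack from the best final state
    last = max(range(n_states), key=lambda s: delta[s])
    path = [last]
    for back in reversed(psi):
        path.append(back[path[-1]])
    path.reverse()
    return path, delta[last]
```

In a keyword spotter the same recursion is run per keyword model, and the path probability serves as the detection score.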
23

Spatial, Temporal and Spatio-Temporal Correspondence for Computer Vision Problems

Zhou, Feng 01 September 2014 (has links)
Many computer vision problems, such as object classification, motion estimation, or shape registration, rely on solving the correspondence problem. Existing algorithms for spatial or temporal correspondence are usually NP-hard, difficult to approximate, and lack flexible models and mechanisms for feature weighting. This proposal addresses the correspondence problem in computer vision, posing two new spatio-temporal correspondence problems and three algorithms to solve spatial, temporal, and spatio-temporal matching between video and other sources. The main contributions of the thesis are: (1) Factorial graph matching (FGM). FGM extends existing work on graph matching (GM) by finding an exact factorization of the affinity matrix. Four benefits follow from this factorization: (a) there is no need to compute the costly (in space and time) pairwise affinity matrix; (b) it provides a unified framework that reveals commonalities and differences between GM methods, and gives a clean connection to other matching algorithms such as iterative closest point; (c) it allows the use of a path-following optimization algorithm, leading to improved optimization strategies and matching performance; (d) it makes it straightforward to incorporate geometric transformations (rigid and non-rigid) into the GM problem. (2) Canonical time warping (CTW). CTW is a technique to temporally align multiple multi-dimensional and multi-modal time series. CTW extends DTW by incorporating a feature weighting layer to adapt different modalities, allows a more flexible warping as a combination of monotonic functions, and has linear complexity (unlike DTW's quadratic complexity). We applied CTW to align human motion captured with different sensors (e.g., audio, video, accelerometers). (3) Spatio-temporal matching (STM). Given a video and a 3D motion capture model, STM finds the correspondence between subsets of video trajectories and the motion capture model. STM is efficiently and robustly solved using linear programming. We illustrate the performance of STM on the problem of human detection in video, and show how STM achieves state-of-the-art performance.
24

Speech Recognition under Stress

Wang, Yonglian 01 December 2009 (has links)
ABSTRACT OF THE DISSERTATION OF Yonglian Wang, for the Doctor of Philosophy degree in Electrical and Computer Engineering, presented on May 19, 2009, at Southern Illinois University Carbondale. TITLE: SPEECH RECOGNITION UNDER STRESS. MAJOR PROFESSOR: Dr. Nazeih M. Botros. In this dissertation, three techniques, Dynamic Time Warping (DTW), Hidden Markov Models (HMM), and the Hidden Control Neural Network (HCNN), are utilized to realize talker-independent isolated word recognition. DTW measures the distance between two input patterns or vectors; HMM models speech signals as a five-state stochastic process to compare the similarity between signals; and HCNN calculates the error between actual and target output and is built mainly for stress-compensated speech recognition. When stress (Angry, Question, or Soft speaking styles) is induced into normal speech, recognition performance degrades greatly. Therefore, a hypothesis-driven stress compensation technique is introduced to cancel the distortion caused by stress. The primary database for this research is SUSAS (Speech under Simulated and Actual Stress), which includes five domains encompassing a wide variety of stress conditions, with 16,000 isolated-word speech samples from 44 speakers. Another database, TIMIT (10 speakers and 6,300 sentences in total), is used as a secondary resource for the DTW algorithm. The words used for speech recognition are speaker-independent. Characteristic feature analysis has been carried out in three domains: pitch, intensity, and glottal spectrum. The results showed that speech spoken under Angry and Question stress exhibits extremely wide fluctuations, with higher average pitch, higher RMS intensity, and more energy than neutral speech. In contrast, the Soft talking style has lower pitch, lower RMS intensity, and less energy than neutral.
Linear Predictive Coding (LPC) cepstral feature analysis is used to obtain the observation vectors and the input vectors for DTW, HMM, and stress compensation. Both HMM and HCNN consist of a training stage and a recognition stage: training forms the reference models, while recognition compares an unknown word against all of them, and the unknown word is assigned to the model with the highest similarity. Our results showed that the HMM technique can achieve a 91% recognition rate for Normal speech; however, the rate dropped to 60% under Angry stress, 65% under Question stress, and 76% under Soft stress. After compensation was applied for the cepstral tilts, the recognition rate increased by 10% for Angry, 8% for Question, and 4% for Soft stress. Finally, the HCNN technique increased the recognition rate to 90% for Angry stress and also differentiated Angry stress from the other stress groups.
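The recognition stage described above, scoring an unknown word against every reference model and keeping the most likely one, can be sketched with the standard forward algorithm for discrete HMMs (a generic illustration only; the toy one-state models in the test are invented and are not from the dissertation):

```python
def forward_likelihood(obs, pi, A, B):
    """P(obs | model) for a discrete HMM, via the forward recursion.
    obs: observation symbol indices; pi: initial state probabilities;
    A: transition matrix; B: emission matrix B[state][symbol]."""
    n_states = len(pi)
    # alpha[s] = P(obs[:t+1], state at t == s)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n_states)]
    for t in range(1, len(obs)):
        alpha = [sum(alpha[r] * A[r][s] for r in range(n_states)) * B[s][obs[t]]
                 for s in range(n_states)]
    return sum(alpha)

def recognize(obs, models):
    """Return the name of the reference model with the highest likelihood.
    models: dict mapping word name -> (pi, A, B)."""
    return max(models, key=lambda name: forward_likelihood(obs, *models[name]))
```

In practice log-probabilities are used to avoid underflow on long utterances; the plain products above are kept for readability.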
25

Interpolation strategy based on Dynamic Time Warping

Operti, Felipe Gioachino January 2015 (has links)
OPERTI, Felipe Gioachino. Interpolation strategy based on Dynamic Time Warping. 2015. 53 f. Dissertação (Mestrado em Física) - Programa de Pós-Graduação em Física, Departamento de Física, Centro de Ciências, Universidade Federal do Ceará, Fortaleza, 2015. / In the oil industry, knowledge of the lithology of stratified rock, and consequently of where oil and natural gas reserves are located, is essential for drilling efficiently and without major expense. In this context, the analysis of seismological data is highly relevant to the extraction of such hydrocarbons, producing profile predictions through the reflection of mechanical waves in the soil. The image of the seismic mapping produced by wave refraction and reflection in the soil can be analysed to find geological formations of interest. In 1978, H. Sakoe et al. defined a model called Dynamic Time Warping (DTW) [23] for the local detection of similarity between two time series. We apply a Dynamic Time Warping Interpolation (DTWI) strategy to interpolate and simulate a seismic landscape formed by 129 depth-dependent sequences of length 201, using different numbers m of known sequences, where m = 2, 3, 5, 9, 17, 33, 65. For comparison, we performed the same interpolation using Standard Linear Interpolation (SLI).
The results show that the DTWI strategy outperforms SLI when m = 3, 5, 9, 17, that is, when the distance between the known series is of the same order as the thickness of the soil layers.
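One simple reading of a DTW-based interpolation is to align two known neighbouring sequences with DTW and average the aligned samples along the optimal warping path, so that a layer boundary sitting at different depths in the two sequences is interpolated along the boundary rather than across it. The sketch below follows that reading; it is our own illustration under that assumption, not the DTWI algorithm of the dissertation:

```python
def dtw_path(x, y):
    """Optimal DTW alignment path between two 1-D sequences, recovered
    by backtracking through the accumulated-cost matrix."""
    n, m = len(x), len(y)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost[i][j] = abs(x[i - 1] - y[j - 1]) + min(
                cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
    # backtrack from (n, m); ties prefer the diagonal step
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        steps = {(i - 1, j - 1): cost[i - 1][j - 1],
                 (i - 1, j): cost[i - 1][j],
                 (i, j - 1): cost[i][j - 1]}
        i, j = min(steps, key=steps.get)
    path.reverse()
    return path

def dtw_midpoint(x, y):
    """Sequence 'halfway' between x and y: the average of the aligned
    sample pairs along the DTW warping path."""
    return [(x[i] + y[j]) / 2.0 for i, j in dtw_path(x, y)]
```

A linear interpolation would average samples at equal depths instead, which smears dipping layers; averaging along the warping path preserves them.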
27

SHECARE: Shared Haptic Environment on the Cloud for Arm Rehabilitation Exercises

Hoda, Mohamad January 2016 (has links)
It is well established that home exercise can be as effective as exercise at a rehabilitation center. Unfortunately, passive devices such as dumbbells, elastic bands, stress balls, and tubing, which have been widely used for home-based arm rehabilitation, do not provide therapists with the information needed to monitor a patient's progress, identify impairments, and suggest treatments. Moreover, the lack of interactivity of these devices turns the rehabilitation exercises into a boring, unpleasant task. In this thesis, we introduce a family of home-based post-stroke rehabilitation systems aimed at solving these problems, which we call "Shared Haptic Environment on the Cloud for Arm Rehabilitation Exercises" (SHECARE). The systems combine recent rehabilitation approaches with efficient yet affordable skeleton-tracking input technologies and a multimodal interactive computer environment. In addition, the systems provide real-time feedback to stroke patients, summarize the feedback after each session, and predict overall recovery progress. Moreover, they embody a new style of home-based rehabilitation that motivates patients by engaging the whole family and friends in the rehabilitation process, and they allow therapists to remotely assess patients' progress and adjust the training strategy accordingly. Two mathematical models are presented in this thesis. The first is developed to find the relationship between upper-extremity kinematics and the associated forces/strength. The second is used to evaluate the medical condition of stroke patients and predict their recovery progress from their performance history. Objective assessments (clinical tests) and subjective assessments (usability studies) have shown the feasibility of the proposed systems for rehabilitation of stroke patients with upper-limb motor dysfunction.
28

Understanding Traffic Cruising Causation: Via Parking Data Enhancement

Jasarevic, Mirza January 2021 (has links)
Background. Some computer scientists have recently argued that the community may gain more by focusing on data preparation for performance improvements than by exclusively comparing modeling techniques. To test how useful this shift in focus is, this paper chooses a particular data extraction technique and examines the resulting differences in data model performance. Objectives. Five recent (2016-2020) studies modeling parking congestion used a rationalized rather than a measured approach to feature extraction; their main focus was selecting the best-performing modeling technique. This study instead picks a feature common to all of them and attempts to improve it, then compares the result against the feature as it appeared in the related studies. Rather than trying several modeling techniques, weights are applied to the selected features and varied, specifically for time-series parking data, where the opportunity arose. The reusability of the data is also gauged. Methods. An experimental case study is designed in three parts. The first tests the importance of weighted-sum configurations relative to drivers' expectations. The second analyzes how much data can be recycled from the real data, and whether spatial or temporal comparisons are better suited to synthesizing parking data. The third compares the performance of the best configuration against the default configuration using the k-means clustering algorithm with dynamic time warping distance. Results. The experiments show performance improvements at all levels, with improvement increasing with sample size: up to 9% average improvement per category, and 6.2% for the entire city. The popularity of a parking lot turned out to be as important as its occupancy rate (50% importance each), while volatility was obstructive.
A few months of data were recyclable, and a few small parking lots could substitute for each other's datasets. Temporal aspects proved better than spatial aspects for parking data simulations. Conclusions. The results support the data scientists' belief that improving the quality and quantity of data matters more than creating new types of models. The score can be used as a better metric for parking congestion rates, for both drivers and managers, and can be employed in the public sphere provided higher-quality, richer data are available.
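The weighted-sum scoring that the study tunes can be sketched as follows (the function names, the min-max normalization, and the use of visit counts as a popularity proxy are our assumptions for the example; only the equal 50/50 weighting of popularity and occupancy comes from the abstract):

```python
def normalize(values):
    """Min-max normalize raw feature values to the 0..1 range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def congestion_scores(occupancy, visits, w_occ=0.5, w_pop=0.5):
    """Weighted-sum congestion score per parking lot.
    occupancy: occupancy rates already in 0..1; visits: raw visit
    counts used here as a popularity proxy. The default weights give
    occupancy and popularity equal importance."""
    popularity = normalize(visits)
    return [w_occ * o + w_pop * p for o, p in zip(occupancy, popularity)]
```

Varying `w_occ` and `w_pop` and re-clustering the resulting score series is the kind of weight sweep the experiment describes.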
29

Klasifikace srdečních cyklů / Classification of cardiac cycles

Lorenc, Patrik January 2013 (has links)
This work deals with the classification of cardiac cycles using dynamic time warping and cluster analysis. Dynamic time warping is one of the older methods, but thanks to its simplicity compared with other methods it is still widely used and achieves good results in practice. Cluster analysis is used in many fields, from marketing to biological signals. The aim of this work is a general introduction to the ECG signal and to the dynamic time warping method, followed by the implementation of the dynamic time warping algorithm, then cluster analysis, and finally the creation of a user interface for these algorithms.
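A minimal sketch of DTW-based classification of cardiac cycles, here reduced to a nearest-template rule (our illustration; the labels and template beats are invented, and the thesis itself combines DTW with cluster analysis rather than this exact rule):

```python
def dtw(x, y):
    """DTW distance between two 1-D sequences (standard dynamic program)."""
    n, m = len(x), len(y)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost[i][j] = abs(x[i - 1] - y[j - 1]) + min(
                cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
    return cost[n][m]

def classify_cycle(cycle, templates):
    """Label of the nearest labelled template under DTW distance.
    templates: list of (label, sequence) pairs."""
    return min(templates, key=lambda t: dtw(cycle, t[1]))[0]
```

Because DTW absorbs timing differences between beats, two cycles with the same morphology but different heart rates still match the same template.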
30

Detekce klíčových slov v mluvené řeči / Keyword spotting

Zemánek, Tomáš January 2011 (has links)
This thesis is aimed at the design of a keyword detector. The work contains a description of the methods used for this purpose and the design of an algorithm for keyword detection. The proposed detector is based on DTW (Dynamic Time Warping). The analysis of the problem was performed with a module programmed in ANSI C, created as part of the thesis. The results of the detector were evaluated using the WER (word error rate) and AUC (area under the curve) metrics.
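The WER metric used in the evaluation is standardly computed as the word-level edit distance between a reference and a hypothesis transcript, normalized by the reference length. A sketch (not the thesis' ANSI C module):

```python
def word_error_rate(reference, hypothesis):
    """WER = (substitutions + insertions + deletions) / reference length,
    computed via Levenshtein distance over word lists."""
    ref, hyp = reference.split(), hypothesis.split()
    # dist[i][j] = edit distance between ref[:i] and hyp[:j]
    dist = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dist[i][0] = i
    for j in range(len(hyp) + 1):
        dist[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            dist[i][j] = min(dist[i - 1][j] + 1,        # deletion
                             dist[i][j - 1] + 1,        # insertion
                             dist[i - 1][j - 1] + sub)  # substitution/match
    return dist[len(ref)][len(hyp)] / len(ref)
```

Note that WER can exceed 1.0 when the hypothesis contains many insertions, which is why it is usually reported alongside a threshold-free metric such as AUC.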
