311

Detection and handling of overlapping speech for speaker diarization

Zelenák, Martin 31 January 2012
For the last several years, speaker diarization has been attracting substantial research attention as one of the spoken language technologies applied for the improvement, or enrichment, of recording transcriptions. Recordings of meetings, compared to other domains, exhibit increased complexity due to the spontaneity of speech, reverberation effects, and the presence of overlapping speech. Overlapping speech refers to situations in which two or more speakers are speaking simultaneously. In meeting data, a substantial portion of the errors of conventional speaker diarization systems can be ascribed to speaker overlaps, since usually only one speaker label is assigned per segment. Furthermore, simultaneous speech included in training data can lead to corrupted single-speaker models and thus to worse segmentation. This thesis concerns the detection of overlapping speech segments and its application to the improvement of speaker diarization performance. We propose the use of three spatial cross-correlation-based parameters for overlap detection on distant microphone channel data. Spatial features from different microphone pairs are fused by means of principal component analysis, linear discriminant analysis, or a multi-layer perceptron. In addition, we investigate the possibility of employing long-term prosodic information. The most suitable subset from a set of candidate prosodic features is determined in two steps: first, a ranking according to the mRMR criterion is obtained, and then a standard hill-climbing wrapper approach is applied to determine the optimal number of features. The novel spatial and prosodic parameters are used in combination with spectral-based features suggested previously in the literature. In experiments conducted on AMI meeting data, we show that the newly proposed features contribute to the detection of overlapping speech, especially on data originating from a single recording site.
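The two-step prosodic feature selection described above can be sketched generically; the scoring function and feature names below are placeholders for illustration, not the thesis's actual features or classifier:

```python
def hill_climb_feature_count(ranked_features, score_fn):
    """Greedily grow the feature subset one feature at a time, in mRMR
    rank order, and stop at the first score drop (hill climbing)."""
    best_score = float("-inf")
    best_k = 0
    for k in range(1, len(ranked_features) + 1):
        score = score_fn(ranked_features[:k])
        if score <= best_score:
            break  # stop climbing once adding a feature stops helping
        best_score, best_k = score, k
    return ranked_features[:best_k]

# Toy example: an invented wrapper score that peaks at three features.
ranked = ["xcorr_peak", "coherence", "pitch", "energy", "jitter"]
toy_scores = {1: 0.50, 2: 0.70, 3: 0.80, 4: 0.75, 5: 0.60}
chosen = hill_climb_feature_count(ranked, lambda s: toy_scores[len(s)])
```

In a real wrapper, `score_fn` would train and evaluate the overlap detector on each candidate subset, which is what makes the prior mRMR ranking valuable: it avoids searching all subsets.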
In speaker diarization, a second speaker label is assigned to segments with detected speaker overlap, and such segments are also excluded from model training. The proposed overlap labeling technique is integrated into Viterbi decoding, a part of the diarization algorithm. During system development it was discovered that it is favorable to optimize overlap exclusion and overlap labeling independently with respect to the overlap detection system. We report improvements over the baseline diarization system on both single- and multi-site AMI data. Preliminary experiments with NIST RT data show DER improvement on the RT '09 meeting recordings as well. The addition of beamforming and a TDOA feature stream to the baseline diarization system, aimed at improving the clustering process, results in slightly higher effectiveness of the overlap labeling algorithm. A more detailed analysis of the overlap exclusion behavior reveals large contrasts in improvement between individual meeting recordings, as well as between various settings of the overlap detection operating point. However, high performance variability across recordings is also typical of the baseline diarization system without any overlap handling.
312

Preserving Texture Boundaries for SAR Sea Ice Segmentation

Jobanputra, Rishi January 2004
Texture analysis has been used extensively in the computer-assisted interpretation of SAR sea ice imagery. Providing maps that distinguish relevant ice types is significant for monitoring global warming and for ship navigation. Due to the abundance of SAR imagery available, there exists a need to develop an automated approach to SAR sea ice interpretation. Grey level co-occurrence probability (<i>GLCP</i>) texture features are very popular for SAR sea ice classification. Although these features are used extensively in the literature, they have a tendency to erode and misclassify texture boundaries. Proposed is an advancement of the <i>GLCP</i> method that preserves texture boundaries during image segmentation. This method exploits the relationship a pixel has with its closest neighbors and weights the texture measurement accordingly. These texture features are referred to as <i>WGLCP</i> (weighted <i>GLCP</i>) texture features. In this research, the <i>WGLCP</i> and <i>GLCP</i> feature sets are compared in terms of boundary preservation, unsupervised segmentation ability, robustness to increasing boundary density, and computation time. The <i>WGLCP</i> method outperforms the <i>GLCP</i> method in all aspects except computation time, where it is slower. The comparative analysis revealed an inconsistency with the <i>GLCP</i> correlation statistic, which motivated an investigative study into using this statistic for image segmentation. As the overall goal of the thesis is to improve SAR sea ice segmentation accuracy, the concepts developed from the study are applied to the image segmentation problem. The results indicate that for images with high-contrast boundaries, the <i>GLCP</i> correlation statistic decreases segmentation accuracy. When comparing <i>WGLCP</i> and <i>GLCP</i> features for segmentation, the <i>WGLCP</i> features provide higher segmentation accuracy.
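The GLCP statistics are derived from a grey level co-occurrence matrix; a minimal, unweighted sketch of that matrix and one common statistic (contrast), assuming a pre-quantized image and a single pixel offset, might look like:

```python
import numpy as np

def glcm(img, dx=1, dy=0):
    """Co-occurrence probability matrix for one pixel offset (dx, dy);
    img must already be quantized to a small number of grey levels."""
    levels = int(img.max()) + 1
    P = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            P[img[y, x], img[y + dy, x + dx]] += 1  # count the pair
    return P / P.sum()                              # normalize to probabilities

def contrast(P):
    """GLCP contrast statistic: sum over (i, j) of (i - j)^2 * P(i, j)."""
    i, j = np.indices(P.shape)
    return float(((i - j) ** 2 * P).sum())
```

A flat image yields zero contrast, while vertical stripes with offset (1, 0) yield maximal contrast; the WGLCP variant described above would additionally weight each pixel's contribution by its relationship to its closest neighbors.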
313

Road Sign Recognition based on Invariant Features using Support Vector Machine

Gilani, Syed Hassan January 2007
For the last two decades, researchers have been working on developing systems that can assist drivers in the best way possible and make driving safe. Computer vision has played a crucial part in the design of these systems. With the introduction of vision techniques, various autonomous and robust real-time traffic automation systems have been designed, such as traffic monitoring, traffic-related parameter estimation, and intelligent vehicles. Among these, automatic detection and recognition of road signs has become an interesting research topic. Such a system can inform drivers about signs they do not recognize before passing them. The aim of this research project is to present an intelligent road sign recognition system based on a state-of-the-art technique, the Support Vector Machine. The project is an extension of the work done at the ITS research platform at Dalarna University [25]. The focus of this research work is on the recognition of road signs. When classifying an image, its location, size, and orientation in the image plane are irrelevant features, and one way to remove this ambiguity is to extract features that are invariant under the above-mentioned transformations. These invariant features are then used in a Support Vector Machine for classification. The Support Vector Machine is a supervised learning machine that solves problems in higher dimensions with the help of kernel functions and is best known for classification problems.
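One classical family of features that is invariant to position, scale, and in-plane orientation is the Hu moment set; the sketch below shows only the first Hu moment of a binary shape (an illustration of the idea, not necessarily the invariants used in this project), with such feature vectors then fed to an SVM classifier such as scikit-learn's `SVC`:

```python
import numpy as np

def hu_phi1(img):
    """First Hu moment phi_1 = eta20 + eta02 of a binary image:
    invariant to translation, scale, and in-plane rotation."""
    ys, xs = np.nonzero(img)                  # coordinates of shape pixels
    m00 = float(len(xs))                      # area (zeroth moment)
    cx, cy = xs.mean(), ys.mean()             # centroid (translation invariance)
    mu20 = ((xs - cx) ** 2).sum()             # second-order central moments
    mu02 = ((ys - cy) ** 2).sum()
    return (mu20 + mu02) / m00 ** 2           # normalized: eta20 + eta02

# A 2x6 bar and the same bar rotated 90 degrees give the same phi_1.
bar = np.zeros((10, 10))
bar[2:4, 2:8] = 1
phi_a, phi_b = hu_phi1(bar), hu_phi1(bar.T)
```

Because `phi_a == phi_b` up to floating-point error, a classifier trained on such features does not need to see every pose of every sign, which is the motivation given in the abstract.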
314

DEVELOPMENT OF A MANUFACTURING CELL IN COMPLIANCE WITH IEC 61499: Function Blocks networks implementation in a Gantry Robot system

Diaz Rios, Raul January 2012
Standards are nowadays a useful, and sometimes essential, tool in the world of automated industry. Since the development of the standard IEC 61131-3, automation has made progress in a consistent manner: the standard allows an engineer to develop control programs for any type of system using the languages it defines. However, this standard was not enough, because it did not cover control over distributed systems. Therefore, it was necessary to establish rules in this field. In order to fill this gap, the new standard IEC 61499 was developed. Its main objective is to describe how to develop, design, and implement distributed control systems, using the new concept of function blocks. With function blocks, it is possible to control distributed systems in a quick and easy way, adding advantages such as reusability, portability, and much easier maintenance.

This project involves the preparation of a manufacturing cell composed of a gantry robot interacting with a CNC machine. The project is developed in two sections: one focuses on the programming of function blocks for the gantry robot and HMI, and the other on the programming of function blocks for the CNC machine and HMI. The communications between the PLC, PC, gantry robot, and CNC machine are developed using the standard IEC 61499. This document covers the survey and research made for the first section of the project. The new standard IEC 61499 provides the methodologies and the appropriate tools to achieve good control of distributed systems. The basic tools offered by this standard are the new function blocks, which are the main tool used in this project. Equally important, it is necessary to take into account the different assembly and machining features in order to design better function blocks to control the system. An HMI has to be developed in order to provide a good interface for a worker in this cell. The aim of this project is to investigate how the new standard IEC 61499 works in a real manufacturing cell and how the new function blocks interact in a real distributed system. Moreover, it is important to see how the event flow works, controlling all the sequences required in the manufacturing cell.
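The event-driven behavior of IEC 61499 function blocks can be loosely mimicked in ordinary code; the following is a rough Python analogy (class, event, and data names are my invention, not IEC 61499 syntax or any function block runtime's API):

```python
class BasicFB:
    """Loose software analogy of an IEC 61499 basic function block:
    event inputs trigger algorithms that read data inputs, write
    data outputs, and fire event outputs to connected blocks."""
    def __init__(self):
        self.data_in, self.data_out = {}, {}
        self.wires = []  # (our event output, target block, its event input)

    def connect(self, event_out, target, event_in):
        self.wires.append((event_out, target, event_in))

    def fire(self, event_out):
        for ev, target, ev_in in self.wires:
            if ev == event_out:
                target.receive(ev_in)   # propagate the event downstream

    def receive(self, event_in):
        algo = getattr(self, "on_" + event_in.lower(), None)
        if algo:
            algo()                      # run the algorithm bound to this event

class MoveCommand(BasicFB):
    def on_req(self):                   # algorithm bound to the REQ event
        self.data_out["pos"] = self.data_in.get("target", 0)
        self.fire("CNF")                # confirm completion

class AckLogger(BasicFB):
    def on_cnf(self):
        self.data_out["acked"] = True

# Wire the mover's CNF output to the logger's CNF input, then trigger REQ.
mover, logger = MoveCommand(), AckLogger()
mover.connect("CNF", logger, "CNF")
mover.data_in["target"] = 42
mover.receive("REQ")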
316

Case Study of Feature-Oriented Requirements Modelling, Applied to an Online Trading System

Krulec, Ana 12 1900
Feature-Oriented Requirements Modelling (FORM) combines the requirements-engineering-style structuring of requirements documents with the feature orientation of Feature-Oriented Software Development, resulting in a feature-oriented model of the functional requirements of a system-under-development (SUD). A feature is a distinguishable unit of added value to the SUD. The objectives of FORM are to model features as independent modules, to allow the addition of new features with minimal changes to existing features, and to enable automatic generation and checking of properties like correctness, consistency, and non-determinism. FORM structures requirements into three models: a domain model, a collection of behavioural models, and a collection of functional models. A feature is modelled by a distinct behavioural model. This dissertation evaluates FORM by applying it to a new application that can be thought of in terms of features, namely an online trading system (OTS) that receives requests from customers about buying or selling securities on a stock market. The OTS offers variability in terms of the types of orders that customers can request (e.g., market orders, limit orders, and stop orders). The case study revealed six deficiencies of the FORM notation, three of which were easily overcome. The dissertation presents the results of the case study, resolutions to three of the six deficiencies, and an outline of an approach to resolve the other three.
317

Mold Feature Recognition using Accessibility Analysis for Automated Design of Core, Cavity, and Side-Cores and Tool-Path Generation of Mold Segments

Bassi, Rajnish January 2012
Injection molding is widely used to manufacture plastic parts with good surface finish, dimensional stability, and low cost. Common examples of parts manufactured by injection molding include toys, utensils, and the casings of various electronic products. The process of mold design to generate these complex shapes is iterative and time consuming, and requires great expertise. As a result, a significant amount of the final product cost can be attributed to expenses incurred during the product's design. After designing the mold segments, it is necessary to machine these segments at minimum cost using an efficient tool-path; the tool-path planning process also adds to the overall mold cost. The process of injection molding can be simplified and made more cost effective if mold design and tool-path generation can be automated. This work focuses on the automation of mold design from a given part design and the automation of tool-path generation for manufacturing mold segments. The hypothesis examined in this thesis is that the automatic identification of mold features can reduce the human effort required to design molds. It is further hypothesised that the human effort required in many downstream processes, such as mold component machining, can also be reduced with algorithmic automation of otherwise time-consuming decisions. Automatic design of dies and molds begins with the part design being provided as a solid model. The solid model of a part is a database of its geometry and topology. The automatic mold design process uses this database to identify an undercut-free parting direction, to recognize mold features and identify parting lines for a given parting direction, and to generate entities such as parting surfaces, core, cavity, and side-cores. The methods presented in this work are analytical in nature and work with an extended set of part topologies and geometries, unlike those found in the literature.

Moreover, the methods do not require discretizing the part geometry to design its mold segments, unlike those in the literature that lose the part definition as a result. Once the mold features are recognized and the parting lines are defined, the core, cavity, and side-cores are generated. This work presents algorithms that recognize the entities in the part solid model that contribute to the design of the core, cavity, and side-cores; extract those entities; and use them in the design of these elements. The developed algorithms are demonstrated on a variety of parts covering a wide range of features. The work also presents a method for automatic tool-path generation that takes the designed core/cavity and produces a multi-stage tool-path to machine it from raw stock. The tool-path generation process begins by determining tool-path profiles and tool positions for the rough machining of the part in layers. Typically, roughing is done with large aggressive tools to reduce machining time, and it leaves uncut material. After generating a roughing tool-path for each layer, the machining is simulated and the areas left uncut are identified in order to generate a clean-up tool-path for smaller tools. The tool-path planning is demonstrated using a part having obstacles within the machining region, and the simulated machining is presented in this work.

This work extends accessibility analysis by retaining the topology information and using it to recognize a larger domain of features, including intersecting features, filling a void in the literature regarding a method that can recognize complex intersecting features during an automated mold design process. Using this information, a larger variety of new intersecting mold features are classified and recognized in this approach. The second major contribution of the work is to demonstrate that downstream operations can also benefit from algorithmic decision making. This is shown by automatically generating roughing and clean-up tool-paths while reducing machining time by machining only those areas that have uncut material. The algorithm can handle cavities with obstacles in them. The methodology has been tested on a number of parts.
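At the heart of accessibility analysis is whether each face can be withdrawn along the parting direction; a heavily simplified sketch (faces reduced to outward unit normals, a single candidate direction, no occlusion or topology checks, so far cruder than the thesis's method) could look like:

```python
import numpy as np

def classify_faces(normals, d, eps=1e-9):
    """Label each face by which mold half can demold it along parting
    direction d; faces with no component along +/-d are flagged as
    side-core candidates (potential undercuts)."""
    d = np.asarray(d, dtype=float)
    d = d / np.linalg.norm(d)            # normalize the parting direction
    labels = []
    for n in normals:
        s = float(np.dot(n, d))          # signed alignment with d
        if s > eps:
            labels.append("cavity")      # accessible from +d
        elif s < -eps:
            labels.append("core")        # accessible from -d
        else:
            labels.append("side-core")   # inaccessible along +/-d
    return labels
```

A real accessibility analysis must also account for faces blocked by other geometry and for the part's topology, which is precisely what the thesis retains in order to recognize intersecting features.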
318

Prognostic factors associated with disease progression in Parkinson's disease

Ferguson, Leslie Wayne 27 February 2006
This thesis examined the factors correlated with rapid and benign progression of disease in a group of 1452 Parkinson's disease (PD) patients. The data were collected in a movement disorders clinic at the Royal University Hospital, University of Saskatchewan, run by Dr. Alex Rajput and Dr. Ali Rajput. The data form a clinical dataset of PD patients collected from 1970 through February 2005. This was a retrospective case-only study, with anticipated analytical follow-up if any correlations were detected between progression type of PD and the many independent variables available in the dataset.

Rapid progression was defined as reaching Hoehn and Yahr (H&Y) stage 3 within three years, or H&Y stage 4 or 5 within five years. Subjects who remained in H&Y stage 1 or 2 ten years after onset of disease were defined as having benign progression. The study analyzed demographic and clinical findings at first visit to the clinic associated with rapid and benign progression of PD.

Analysis revealed that, at first clinic visit, benign progression was positively associated with disease duration (OR=1.41; 95% CI 1.27, 1.57), male sex (OR=3.23; 95% CI 1.70, 6.16), and current smoking habit (OR=2.33; 95% CI 0.67, 8.11). Benign progression was negatively associated with older age of onset (OR=0.36; 95% CI 0.25, 0.50), past history of smoking (OR=0.46; 95% CI 0.24, 0.89), current or past use of levodopa (OR=0.45; 95% CI 0.21, 0.98), and mild to severe rigidity (OR=0.43; 95% CI 0.23, 0.80).

Analysis also revealed that, at first clinic visit, rapid progression was positively associated with older age of onset (OR=2.45; 95% CI 1.80, 3.33) and mild to severe rigidity (OR=1.73; 95% CI 1.02, 2.94). Rapid progression was negatively associated with disease duration (OR=0.52; 95% CI 0.44, 0.62), male sex (OR=0.58; 95% CI 0.35, 0.95), and mild to severe resting tremor (OR=0.47; 95% CI 0.28, 0.77).

The results of this study indicate that age of onset, disease duration, male sex, and rigidity are good potential predictors of disease progression in PD because they have opposite associations with rapid and benign progression. History of levodopa use was negatively associated with benign progression and as such may be a good indicator of non-benign progression. Although previous studies found no predictive value for smoking history, the current study reported a unique association between smoking history and benign progression: past smoking history was negatively associated with benign progression, while the positive association with current smoking was not statistically significant. Resting tremor was negatively associated with rapid progression and as such may be a good indicator of non-rapid progression.

Disease characteristics collected at first clinic visit are useful in predicting the course of progression of PD. With more rapid progression of PD, closer and more frequent follow-up of patients may be necessary.
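The odds ratios and 95% confidence intervals quoted above are conventionally obtained from a 2x2 exposure-outcome table; a minimal sketch using the Wald interval (the counts below are invented for illustration, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a/b = exposed/unexposed among cases,
    c/d = exposed/unexposed among controls."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR): sqrt of summed reciprocal cell counts.
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi
```

A CI that excludes 1 (e.g. the rigidity OR of 1.73, CI 1.02-2.94) is statistically significant, while one that straddles 1 (the current-smoking OR of 2.33, CI 0.67-8.11) is not, which is exactly the distinction the abstract draws.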
319

Modelling reflected polarized light from exoplanetary atmospheres

Aronson, Erik January 2011
I present numerical simulations of the intensity and degree of polarization of light reflected by Earth-like exoplanets. The results are presented as a function of wavelength, for a few different phase angles and a few different points on the planet. At this stage, the aim is to demonstrate the working code and test a few different setups of the star-planet system in order to find preferable configurations for observations. Not surprisingly, a phase angle of 90° shows the largest degree of polarization. Regarding beneficial wavelength regions, visual light shows a larger overall degree of polarization, while the NIR shows very clear absorption patterns in the degree of polarization, making detection of the atmospheric composition possible.
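The degree of polarization reported in such simulations is conventionally defined from the Stokes parameters (I, Q, U, V); a minimal sketch of the standard definitions (not the abstract's radiative transfer code):

```python
import math

def degree_of_polarization(I, Q, U, V=0.0):
    """Total degree of polarization from the Stokes vector (I, Q, U, V)."""
    return math.sqrt(Q * Q + U * U + V * V) / I

def linear_dop(I, Q, U):
    """Degree of linear polarization, the quantity usually measured
    for reflected starlight (circular V is typically negligible)."""
    return math.sqrt(Q * Q + U * U) / I
```

At a 90° phase angle, singly scattered light is strongly polarized (Q dominates relative to I), which is why that geometry maximizes the degree of polarization in the simulations.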
320

Why Not China? A study of organizational features behind Swedish SMEs' internationalization towards China

Bashir, Salman, Sarakinis, Mikael January 2011
In today's global market, China attracts great attention due to its rapidly growing economy. Organizations from different countries take advantage of this and move production to China. The noteworthy aspect of this situation is that most of these companies are Multinational Enterprises (MNEs). These MNEs are aggressive in their expansion owing to their major capabilities and their ability to confront barriers and take economic risks. However, there are smaller companies with fewer resources that are more limited and choose not to move production to China. What drives these companies to forgo the well-documented advantages of a production process there? This research aims to fill that gap. This deductive research is based on Swedish SMEs that were asked to rank the most influential drivers behind their decision to move, or not to move, production to China. The investigation is conducted quantitatively via a survey. Another aspect of the survey that strengthens the result is the SMEs' core strategies, which respondents were asked to rank in order to reveal the most dominant one. The analysis of the results signifies that the key drivers and the core strategy together influence the decision to either move or not to move. However, the generalizability is negatively affected by the low number of participants. Therefore, an in-depth analysis has been conducted, which highlights that the results do reveal a connection between the drivers and the core strategy and how they influence the decision. This research reveals the most influential processes for Swedish SMEs, which can further be considered by other SMEs that are in the process of deciding whether to move production to China.
