41

Biomechanical online signature modeling applied to verification

Coutinho Canuto, Jânio 08 December 2014 (has links)
This thesis deals with the modelling and verification of online signatures. The first part focuses on the biomechanical modelling of the hand movements associated with the signing gesture. A model based on the Minimum Jerk (MJ) criterion was chosen from among the available motor control theories. Next, the problem of segmenting signature trajectories into strokes that better fit the chosen kinematic model is studied, leading to the development of an iterative segmentation method. Both the choice of model and the segmentation method are based on the tradeoff between reconstruction quality and compression. In the second part, the polynomial model provided by the MJ criterion is intentionally degraded: the non-real zeros of the polynomials are discarded, and the effects of this degradation are studied from a biometric verification perspective. This degradation is equivalent to the signal processing technique known as infinity clipping, originally applied to speech signals. For signatures, as for speech, the preservation of essential information was observed on verification tasks. In fact, using only the Levenshtein distance over the infinitely clipped representation, verification error rates comparable to those of more elaborate methods were obtained. Furthermore, the symbolic representation yielded by infinity clipping allows a conceptual relationship to be established between the number of polynomial segments obtained through the MJ-based iterative segmentation and the Lempel-Ziv complexity. This relationship is potentially useful for the analysis of online signature signals and for the improvement of recognition systems.
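As a rough illustration of the two ingredients named in this abstract, infinity clipping and the Levenshtein distance, the following Python sketch reduces a 1-D signal to its sign sequence and compares two such sequences. The sinusoidal test signals are synthetic stand-ins for real pen-trajectory data; this is not the thesis's actual pipeline.

```python
import numpy as np

def infinity_clip(signal):
    """Reduce a 1-D signal to a symbolic sequence of its sign changes.

    Infinity clipping keeps only the sign of each sample (1-bit
    quantization); collapsing runs of identical symbols yields a short
    string that still carries the signal's zero-crossing structure.
    """
    symbols = np.where(np.asarray(signal) >= 0, "+", "-")
    collapsed = [symbols[0]]
    for s in symbols[1:]:          # collapse repeats: '+++--+' -> '+-+'
        if s != collapsed[-1]:
            collapsed.append(s)
    return "".join(collapsed)

def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

# Toy comparison of two "signature velocity profiles" (synthetic data).
t = np.linspace(0, 1, 200)
genuine = np.sin(2 * np.pi * 3 * t) + 0.1 * np.random.randn(200)
forgery = np.sin(2 * np.pi * 4 * t) + 0.1 * np.random.randn(200)
print(levenshtein(infinity_clip(genuine), infinity_clip(forgery)))
```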
42

New signal processing approaches to peak-to-average power ratio reduction in multicarrier systems

Bae, Ki-taek 06 December 2010 (has links)
Multi-carrier systems based on orthogonal frequency division multiplexing (OFDM) are efficient technologies for the implementation of broadband wireless communication systems. OFDM is widely used and has been adopted for current mobile broadband wireless communication systems such as IEEE 802.11a/g wireless LANs, WiMAX, 3GPP LTE, and DVB-T/H digital video broadcasting systems. Despite their many advantages, however, OFDM-based systems suffer from potentially high peak-to-average power ratio (PAR). Since communication systems typically include nonlinear devices such as RF power amplifiers (PAs) and digital-to-analog converters (DACs), high PAR results in increased symbol error rates and spectral radiation. To mitigate these nonlinear effects and to avoid nonlinear saturation effects of the PA, the operating point of a signal with high peak power must be backed off into the linear region of the PA. This so-called output backoff (OBO) results in a reduced power conversion efficiency which limits the battery life for mobile applications, reduces the coverage range, and increases both the cost of the PA and power consumption in the cellular base station. With the increasing demand for high energy efficiency, low power consumption, and greenhouse gas emission reduction, PAR reduction is a key technique in the design of practical OFDM systems. Motivated by the PAR reduction problem associated with multi-carrier systems, such as OFDM, this research explores the state of the art of PAR reduction techniques and develops new signal processing techniques that can achieve a minimum PAR for given system parameters and that are compatible with the appropriate standards. The following are the three principal contributions of this dissertation research. First, we present and derive the semi-analytical results for the output of asymptotic iterative clipping and filtering. This work provides expressions and analytical techniques for estimating the attenuation factor, error vector magnitude, and bit-error-rate (BER), using a noise enhancement factor that is obtained by simulation. With these semi-analytical results, we obtain a relationship between the BER and the target clipping level for asymptotic iterative clipping and filtering. These results serve as a performance benchmark for designing PAR reduction techniques using iterative clipping and filtering in OFDM systems. Second, we analyze the impact of the selected mapping (SLM) technique on BER performance of OFDM systems in an additive white Gaussian noise channel in the presence of nonlinearity. We first derive a closed-form expression for the envelope power distribution in an OFDM system with SLM. Then, using this derived envelope power distribution, we investigate the BER performance and the total degradation (TD) of OFDM systems with SLM under the existence of nonlinearity. As a result, we obtain the TD-minimizing peak backoff (PBO) and clipping ratio as functions of the number of candidate signals in SLM. Third, we propose an adaptive clipping control algorithm and a pilot-aided algorithm to address a fundamental issue associated with two low-complexity PAR reduction techniques, namely, tone reservation (TR) and active constellation extension (ACE). Specifically, we discovered that the existing low-complexity algorithms have a low clipping ratio problem in that they cannot achieve the minimum PAR when the target clipping level is set below the initially unknown optimum value.
Using our proposed algorithms, we overcome this problem and demonstrate that additional PAR reduction is obtained for any low value of the initial target clipping ratio.
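For readers unfamiliar with the benchmark technique named in this abstract, here is a minimal Python sketch of iterative clipping and filtering for a single OFDM symbol. The QPSK mapping, oversampling factor, iteration count, and clipping target are illustrative assumptions, not parameters from the dissertation.

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def iterative_clip_filter(X, clip_db=5.0, iters=8, oversample=4):
    """Iterative clipping and filtering (ICF) for one OFDM symbol.

    X holds the frequency-domain symbols on N subcarriers. Each pass
    clips the oversampled time-domain envelope to the target level and
    then zeroes the out-of-band distortion; filtering regrows the peaks
    slightly, which is why the procedure is iterated.
    """
    N = len(X)
    # Oversample by zero-padding the middle of the spectrum.
    Xo = np.concatenate([X[:N // 2],
                         np.zeros((oversample - 1) * N, complex),
                         X[N // 2:]])
    x = np.fft.ifft(Xo)
    for _ in range(iters):
        a = np.sqrt(np.mean(np.abs(x) ** 2)) * 10 ** (clip_db / 20)
        mag = np.abs(x)
        over = mag > a
        x[over] *= a / mag[over]           # soft-limit the envelope
        Xc = np.fft.fft(x)
        Xc[N // 2: -(N // 2)] = 0          # keep only the in-band bins
        x = np.fft.ifft(Xc)
    return x

# QPSK on 64 subcarriers; compare PAPR before and after ICF.
X = (np.random.choice([-1, 1], 64) + 1j * np.random.choice([-1, 1], 64)) / np.sqrt(2)
x0 = np.fft.ifft(np.concatenate([X[:32], np.zeros(192, complex), X[32:]]))
print(f"PAPR before: {papr_db(x0):.1f} dB, after: {papr_db(iterative_clip_filter(X)):.1f} dB")
```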
43

The Reduction of Signal Dynamics in Orthogonal Frequency Division Multiplex Systems

Urban, Josef January 2010 (has links)
This doctoral thesis focuses on multicarrier radio communication systems. These systems are promising for current and upcoming mobile communications and wireless networks. The advantages of multicarrier systems, such as better resistance to multipath propagation, come at the cost of some disadvantages. The high peak-to-average power ratio of the transmitted signal is one of these disadvantages, as it hinders highly efficient power amplification. The thesis deals with peak-to-average power ratio reduction methods for OFDM, one of the most widely used multicarrier systems. One of the main objectives is the modification of existing methods with the aim of reducing complexity. A further subject of interest is the analysis of possible combinations of methods for more significant peak-to-average power ratio reduction. One part of the thesis investigates the influence of these methods on OFDM signals with different parameters.
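The standard yardstick for comparing such reduction methods and their combinations is the complementary cumulative distribution function (CCDF) of the PAPR: a method helps if it shifts the curve toward lower thresholds. Below is a small Monte Carlo sketch of the baseline (unreduced) CCDF; QPSK symbols and a critically sampled IFFT are assumptions for illustration.

```python
import numpy as np

def papr_ccdf(n_symbols=2000, n_sc=64, thresholds_db=np.arange(4, 12, 0.5)):
    """Empirical CCDF of OFDM PAPR: Pr(PAPR > threshold).

    Generates random QPSK OFDM symbols, measures each symbol's PAPR,
    and reports the fraction exceeding each threshold.
    """
    paprs = np.empty(n_symbols)
    for i in range(n_symbols):
        X = (np.random.choice([-1, 1], n_sc)
             + 1j * np.random.choice([-1, 1], n_sc)) / np.sqrt(2)
        x = np.fft.ifft(X)
        p = np.abs(x) ** 2
        paprs[i] = 10 * np.log10(p.max() / p.mean())
    return [(t, np.mean(paprs > t)) for t in thresholds_db]

for t, pr in papr_ccdf():
    print(f"Pr(PAPR > {t:4.1f} dB) = {pr:.4f}")
```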
44

Angiographic classification of cerebral aneurysms treated surgically or endovascularly: systematic review and inter- and intra-rater agreement study

Benomar, Anass 08 1900 (has links)
INTRODUCTION Intracranial aneurysms are a medical condition associated with serious morbidity and mortality when ruptured. They are mainly treated with surgical clipping or endovascular coiling, and randomized clinical trials comparing both treatments often use angiographic results as a surrogate outcome of treatment success. A standardized, reliable method of reporting angiographic results, applicable to both treatments and to any angiographic modality (conventional [CA], computed tomography [CTA], or magnetic resonance [MRA] angiography), is needed. We sought to systematically review existing methods of classifying treated aneurysms and to assess the inter-/intra-rater reliability of a new angiographic grading scale, as well as its clinical relevance.
METHODS We conducted two separate systematic reviews in 5 databases to collect post-clipping and post-coiling grading scales, along with their reliability when reported. The CA results of 60 cases of clipped (n=30) and coiled (n=30) aneurysms were independently evaluated by 19 raters of various backgrounds and experience levels using a new 3-grade scale. Raters were also asked how they would manage each case (delayed follow-up, close follow-up, or retreatment). Fifteen raters performed a second evaluation of the same cases (in a permuted order) at least one month apart. The same methodology was used to study the reliability of the scale when 60 noninvasive angiography cases of clipped (30 CTA) or coiled (30 MRA) aneurysms were evaluated by 24 raters. Inter- and intra-rater agreement was assessed using Krippendorff's alpha statistic (aK), and the relationship between occlusion grade and clinical management using Cramér's V test. RESULTS The systematic reviews found 70 grading scales (44 post-clipping, 26 post-coiling) with significant heterogeneity in terms of reliability (kappa values from 0.12 to 1.00), raters, and patients. Overall inter-rater agreement for the grading scale was substantial for CA results (aK=0.76; 0.67–0.83) as well as for noninvasive results (aK=0.63; 0.55–0.70), regardless of rater background, experience, or treatment used. A strong relationship was found between angiographic grades and management decisions (V=0.80±0.12 for CA; V=0.77±0.14 for CTA/MRA; p<0.001). CONCLUSION A standardized 3-grade scale was found to be a reliable and clinically meaningful tool to compare the results of clipping and coiling of aneurysms using any angiographic modality. It could serve as a valid tool to adjudicate the results of comparative randomized trials and to standardize the reporting of aneurysm treatments.
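As an illustration of the agreement statistic used in this study, the following Python sketch implements Krippendorff's alpha for nominal data in its coincidence-matrix form. The rating matrix is invented toy data, not the study's.

```python
import numpy as np

def krippendorff_alpha_nominal(ratings):
    """Krippendorff's alpha for nominal data: alpha = 1 - Do/De.

    ratings: raters x units, with None marking a missing rating.
    Builds the coincidence matrix over all pairable units, then
    compares observed to expected disagreement.
    """
    units = [[r for r in col if r is not None] for col in zip(*ratings)]
    units = [u for u in units if len(u) >= 2]          # pairable units only
    cats = sorted({r for u in units for r in u})
    idx = {c: i for i, c in enumerate(cats)}
    o = np.zeros((len(cats), len(cats)))               # coincidence matrix
    for u in units:
        m = len(u)
        for a in range(m):
            for b in range(m):
                if a != b:
                    o[idx[u[a]], idx[u[b]]] += 1.0 / (m - 1)
    n_c = o.sum(axis=1)
    n = n_c.sum()
    d_o = (o.sum() - np.trace(o)) / n                  # observed disagreement
    d_e = (n ** 2 - (n_c ** 2).sum()) / (n * (n - 1))  # expected disagreement
    return 1 - d_o / d_e

# Three raters grading five cases on a 1/2/3 occlusion scale (toy data).
ratings = [
    [1, 2, 3, 1, 2],
    [1, 2, 3, 2, 2],
    [1, 3, 3, 1, None],   # rater 3 skipped the last case
]
print(f"alpha = {krippendorff_alpha_nominal(ratings):.3f}")
```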
45

Forage Yield and Chemical Composition of an Orchardgrass-Bromegrass Pasture Mixture as Influenced by Clipping Frequency, Nitrogen Fertilization and Irrigation Regime

Gawai, Vinayak G. 01 May 1967 (has links)
The influence of agronomic practices on forage production and chemical composition of an orchardgrass-bromegrass pasture mixture was studied at the Greenville Farm, Logan, Utah, during 1960-1964. The soil is a well-drained Millville silt loam with about a one percent slope, occurring on an alluvial fan. It is high in potash, phosphorus, and calcium, and is alkaline, with a pH of 7.9 to 8.2. Analysis of clipping frequency showed that the forage yield from four harvests was greater than from five harvests. This difference was greater on plots receiving high rates of nitrogen fertilization and frequent irrigation. Nitrogen, phosphorus, calcium, copper, iron, and zinc contents were lower with five harvests. Forage production increased significantly as the available moisture in the soil increased, producing the highest yields for the 5-day irrigation interval. Nitrogen, copper, iron, and zinc in the foliage decreased, while phosphorus and calcium increased, with increasing soil moisture. Potassium content, by contrast, showed no specific trend with increasing soil moisture. Nitrogen fertilization increased forage dry matter production significantly, giving the highest yield for 200 pounds of applied nitrogen per acre per season. The percentage of nitrogen in harvested forage decreased up to 100 pounds per acre of applied nitrogen and slightly increased at 200 pounds. Calcium, iron, and manganese contents decreased, and zinc content slightly increased, with increasing nitrogen fertilization. On the other hand, phosphorus, potassium, and copper contents did not show a consistent trend with increasing nitrogen fertilization. A difference was noted between the amounts of forage produced in different years; however, there was a tendency toward decreasing forage yield with increasing age of stand. Nitrogen and phosphorus contents increased, and copper, iron, and manganese decreased significantly, with increasing age of stand. Potassium, calcium, and zinc contents showed a gradual decrease with the age of the forage, but fluctuated from year to year.
46

Simulation and evaluation of a DVB System using Simulink (Vol. II)

Prieto, Alberto Pastor January 2005 (has links)
DVB (Digital Video Broadcasting) is the digital television system. It is, however, much more than a simple replacement for existing analogue television transmission: it offers better picture quality and a range of new features and services, including subtitling, multiple audio tracks, and interactive and multimedia content. The system is based on OFDM technology, which allows DVB to exploit the frequency spectrum more efficiently, but OFDM has an important drawback: the peak-to-average power ratio (PAPR) problem. OFDM is based on FFT algorithms that generate orthogonal subcarriers. This thesis addresses the PAPR problem using the soft compression method. The simulation has been developed in Simulink® and Matlab®. With the information presented in this thesis, any user can simulate the system, and the work can be extended with other techniques to address the PAPR problem.
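The abstract does not define the "soft compression method"; one common form of soft compression for PAPR reduction is mu-law companding of the signal envelope, sketched below in Python as an assumption rather than a reproduction of the thesis's Simulink model.

```python
import numpy as np

def mu_law_compand(x, mu=8.0):
    """Compress the envelope of a complex OFDM signal with a mu-law curve.

    The peak sample is held fixed while smaller samples are boosted,
    which raises the average power and thus lowers the PAPR; the
    receiver applies the inverse expansion. mu sets the strength.
    """
    a = np.abs(x).max()
    mag = np.abs(x)
    compressed = a * np.log1p(mu * mag / a) / np.log1p(mu)
    return compressed * np.exp(1j * np.angle(x))

def mu_law_expand(y, mu=8.0):
    """Inverse of mu_law_compand (peak value is preserved, so it is
    recoverable from the companded signal itself)."""
    a = np.abs(y).max()
    mag = np.abs(y)
    expanded = a * np.expm1(mag / a * np.log1p(mu)) / mu
    return expanded * np.exp(1j * np.angle(y))

X = (np.random.choice([-1, 1], 64) + 1j * np.random.choice([-1, 1], 64)) / np.sqrt(2)
x = np.fft.ifft(X)
y = mu_law_compand(x)
assert np.allclose(mu_law_expand(y), x)   # round trip is lossless
for s, lbl in [(x, "original"), (y, "companded")]:
    p = np.abs(s) ** 2
    print(f"{lbl}: PAPR = {10 * np.log10(p.max() / p.mean()):.1f} dB")
```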
48

Corrosion Behaviors Of Stainless Steels In Molten Zinc Aluminum Alloy

Ozcan, Emre 01 July 2012 (has links)
High-grade galvanized steel is needed in large amounts to meet the increasing demand of the automotive industry, both in our country and in the world. Stainless steels used in the fabrication of zinc bath hardware for continuous galvanizing lines lose their corrosion resistance through various mechanisms in media containing molten metals such as zinc and aluminum. Consequently, they corrode to the point where they must be taken in for maintenance or replaced. In this study, the corrosion performance of four newly developed austenitic stainless steels and AISI 316L grade stainless steel, and the effect of typical galvanizing and aging heat treatments on their mechanical properties, were investigated and compared. Experimental studies involved immersion corrosion tests for 168 and 504 hours, followed by weight loss determinations and comparisons of the corrosion performance of age-treated and solution-annealed stainless steels. In parallel with the corrosion testing, delta ferrite content determinations with three different methods, tensile tests, and V-notch impact tests at four different heat exposure conditions were carried out and discussed. Two new stainless steel compositions were selected for use in the fabrication of galvanizing hardware, based on comparisons of the corrosion and mechanical performance of the candidate steels.
49

Low-complexity list detection algorithms for the multiple-input multiple-output channel

Milliner, David Louis 20 October 2009 (has links)
Modern communication systems demand ever-increasing data rates. Meeting this increased demand is not easy due to regulation and fundamental physical constraints. The utilization of more than one antenna at both the transmitter and receiver produces a multiple-input multiple-output (MIMO) channel, thereby enabling (under certain channel conditions) increased data rates without the need for increased bandwidth or transmission power. Concurrent with this increase in data rate is an increase in the receiver's computational complexity which, for a brute-force detector, increases exponentially. For receivers that possess error correcting capabilities, the problem of constructing a detector with low computational complexity that allows for near-exact a posteriori detection is challenging for transmission schemes employing even a modest number of transmit antennas and modulation alphabet sizes. The focus of this dissertation is on the construction of MIMO detection algorithms with low and fixed computational complexity. Specifically, the detection algorithms in this dissertation generate a list of potential transmission vectors, resulting in realizable communication receivers with low and fixed computational complexity combined with low error rate performance in both coded and uncoded systems. A key contribution in this dissertation is a breadth-first fixed-complexity algorithm known as the smart-ordered and candidate-adding algorithm that achieves a desirable performance-complexity tradeoff. This algorithm requires only a single pass of a search tree to find its list of transmission vectors. We then construct a framework within which we classify a large class of breadth-first detection algorithms. The design of receiver algorithms for MIMO systems employing space-time codes and error correction is an important area of study. In this dissertation we propose a low and fixed computational complexity algorithm for an increasingly significant algebraic space-time code known as the golden code. The notion of computational complexity is critical in the design of practical MIMO receivers. We provide an analysis of computational complexity in relation to list-based soft-output detection where, in some instances, bounds are placed on the computational complexity of MIMO detection. For this analysis we utilize a metric known as the number of branch metric computations. The value at which the log-likelihood ratio (LLR) of conditional probabilities for a transmitted bit being either a 1 or a 0 is 'clipped' has an impact on a system's error rate performance. We propose a new approach for determining LLR clipping levels that, in contrast to prior approaches which clip to a predetermined fixed LLR clipping level, exploits channel state information to improve the error rate performance of suboptimal detection algorithms. Orthogonal frequency-division multiplexing (OFDM) is an effective technique for combating frequency-selective wideband communication channels. It is common practice for MIMO-OFDM detectors to implement the same detector at each subcarrier, in which case the overall performance is dominated by the weakest subcarrier. We propose a hard-output list detection receiver strategy for MIMO-OFDM channels called nonuniform computational complexity allocation, whereby the receiver adapts the computational resources of the MIMO detector at each subcarrier to match a metric of the corresponding channel quality. The proposed nonuniform algorithm improves performance over uniform allocation.
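To make the LLR-clipping issue concrete, here is a small Python sketch of max-log LLR computation from a list detector's candidate set. A fixed clipping level stands in for the channel-adaptive level the dissertation proposes, and the 2x2 real-valued BPSK setup is illustrative only.

```python
import numpy as np

def list_llrs(y, H, candidates, bits, sigma2, clip_level=8.0):
    """Max-log LLRs from a list detector's candidate set, with clipping.

    candidates: list of candidate transmit vectors; bits: matching 0/1
    labels per candidate. When the list holds no candidate for one bit
    hypothesis, the max-log LLR is unbounded and must be clipped to
    +/- clip_level. Scaling assumes real-valued AWGN with per-sample
    variance sigma2 (hence the factor 1 / (2 * sigma2)).
    """
    metrics = np.array([np.linalg.norm(y - H @ s) ** 2 for s in candidates])
    bits = np.asarray(bits)
    llrs = np.empty(bits.shape[1])
    for b in range(bits.shape[1]):
        m0 = metrics[bits[:, b] == 0]    # candidates with bit b = 0
        m1 = metrics[bits[:, b] == 1]    # candidates with bit b = 1
        if m0.size == 0 or m1.size == 0:
            # One hypothesis is missing from the list: clip outright.
            llrs[b] = clip_level if m1.size == 0 else -clip_level
        else:
            # LLR = log P(b=0|y) / P(b=1|y), max-log approximation.
            llrs[b] = np.clip((m1.min() - m0.min()) / (2 * sigma2),
                              -clip_level, clip_level)
    return llrs

# Toy 2x2 real channel, BPSK (+1 encodes bit 0, -1 encodes bit 1).
H = np.array([[1.0, 0.3], [0.2, 0.9]])
y = H @ np.array([1.0, -1.0]) + 0.1 * np.random.randn(2)
cands = [np.array([a, b]) for a in (1.0, -1.0) for b in (1.0, -1.0)]
labels = [[int(a < 0), int(b < 0)] for a in (1.0, -1.0) for b in (1.0, -1.0)]
print(list_llrs(y, H, cands, labels, sigma2=0.01))   # clipped to +8, -8
```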
50

Volume Sculpting with 2D and 3D Interaction Techniques

Huff, Rafael January 2006 (has links)
Visualization of volumetric datasets is common in many fields and has been an active area of research in the past two decades. In spite of developments in volume visualization techniques, interacting with large datasets still demands research efforts due to perceptual and performance issues. The support of graphics hardware for texture-based visualization allows efficient implementation of rendering techniques that can be combined with interactive sculpting tools to enable interactive inspection of 3D datasets. Many studies regarding performance optimization of sculpting tools have been reported, but very few are concerned with the interaction techniques employed. The purpose of this work is the development of interactive, intuitive, and easy-to-use sculpting tools. Initially, a review of the main techniques for direct volume visualization and sculpting is presented, and the solution that best guarantees the required interactivity is highlighted. Afterwards, in order to identify the most user-friendly interaction technique for volume sculpting, several interaction techniques, metaphors, and taxonomies are presented. Based on that, this work presents the development of three generic sculpting tools implemented using two different interaction metaphors often used in 3D applications: the virtual pointer and the virtual hand. Interactive rates for these sculpting tools are obtained by running special fragment programs on the graphics hardware which specify regions within the volume to be discarded from rendering based on geometric predicates. After development, the performance, precision, and user preference of the sculpting tools were evaluated to compare the interaction metaphors. The tools were then evaluated by comparing the use of a 3D mouse against a conventional wheel mouse for volume and tool manipulation. Two-handed input was also tested with both types of mouse. The results from the evaluation experiments are presented and discussed.
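As a CPU-side analogue of the fragment programs described above, the following numpy sketch evaluates a spherical geometric predicate per voxel and discards the voxels it selects. The real tools run this test per fragment on the GPU during texture-based rendering; the random volume and sphere parameters here are placeholders.

```python
import numpy as np

def sphere_sculpt_mask(shape, center, radius):
    """Boolean mask marking voxels inside a spherical sculpting tool.

    Mimics, on the CPU, what a sculpting fragment program does on the
    GPU: evaluate a geometric predicate per sample and discard the
    samples for which it holds, so the cut region drops out of rendering.
    """
    zz, yy, xx = np.indices(shape)
    d2 = ((xx - center[0]) ** 2
          + (yy - center[1]) ** 2
          + (zz - center[2]) ** 2)
    return d2 <= radius ** 2

volume = np.random.rand(64, 64, 64).astype(np.float32)   # stand-in dataset
mask = sphere_sculpt_mask(volume.shape, center=(32, 32, 32), radius=10)
sculpted = np.where(mask, 0.0, volume)    # zero opacity inside the tool
print(f"{mask.sum()} of {volume.size} voxels discarded")
```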
