61

Active Minimization of Acoustic Energy Density in a Mock Tractor Cab

Faber, Benjamin Mahonri 17 March 2004 (has links) (PDF)
An active noise control (ANC) system has been applied to the problem of attenuating low-frequency tonal noise inside small enclosures. The intended target application of the system was the reduction of the engine firing frequency inside heavy equipment cabins. The ANC system was based on a version of the filtered-x LMS adaptive algorithm, modified for the minimization of acoustic energy density (ED), rather than the more traditional minimization of squared acoustic pressure (SP). Three loudspeakers produced control signals within a mock cabin composed of a steel frame with plywood sides and a Plexiglas® front. An energy density sensor, capable of measuring acoustic pressure as well as acoustic particle velocity, provided the error signal to the control system. The ANC system operated on a single reference signal, which, for experiments involving recorded tractor engine noise, was derived from the engine's tachometer signal. For the low frequencies at which engine firing occurs, experiments showed that ANC systems minimizing ED and SP both provided significant attenuation of the tonal noise near the operator's head and globally throughout the small cabin. The tendency was for ED control to provide a more spatially uniform amount of reduction than SP control, especially at the higher frequencies investigated (up to 200 Hz). In dynamic measurement conditions, with a reference signal swept in frequency, the ED control often provided superior results, struggling less at frequencies for which the error sensor was near nodal regions for acoustic pressure. A single control channel often yielded performance comparable to that of two control channels, and sometimes produced superior results in dynamic tests. Tonal attenuation achieved by the ANC system was generally in excess of 20 dB, and reduction in equivalent sound level for dynamic tonal noise often exceeded 4 dB at the error sensor. It was shown that temperature changes likely to be encountered in practice have little effect on the initial delay through the secondary control path, and are therefore unlikely to significantly impact ANC system stability in the event that a fixed set of system identification filter coefficients is employed.
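As a rough illustration of the control structure this abstract describes, here is a minimal single-channel filtered-x LMS sketch in Python/NumPy. The plant model, filter lengths, step size and signals are all invented for illustration, and this pressure-only sketch omits the particle-velocity error terms that the thesis's energy-density variant minimizes.

```python
import numpy as np

fs = 2000                       # sample rate (Hz), arbitrary for this sketch
n = 20000
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 120 * t)                    # tonal reference (e.g. firing frequency)
d = np.convolve(x, [0.0, 0.8, -0.3], 'full')[:n]   # noise at the error sensor (toy primary path)

s = np.array([0.0, 0.6, 0.2])   # assumed secondary-path impulse response (toy model)
L = 32                          # adaptive filter length
w = np.zeros(L)                 # control filter coefficients
mu = 0.005                      # step size

xbuf = np.zeros(L)              # reference history for the control filter
fxbuf = np.zeros(L)             # filtered-reference history for the update
ybuf = np.zeros(len(s))         # control-signal history through the secondary path
e = np.zeros(n)

for k in range(n):
    xbuf = np.roll(xbuf, 1); xbuf[0] = x[k]
    y = w @ xbuf                            # anti-noise sample
    ybuf = np.roll(ybuf, 1); ybuf[0] = y
    e[k] = d[k] + s @ ybuf                  # residual at the error sensor
    fx = s @ xbuf[:len(s)]                  # reference filtered by the secondary-path model
    fxbuf = np.roll(fxbuf, 1); fxbuf[0] = fx
    w -= mu * e[k] * fxbuf                  # filtered-x LMS update

print("residual power, first vs last second:",
      np.mean(e[:fs]**2), np.mean(e[-fs:]**2))
```

For a pure tone and an accurate secondary-path model, the residual power in the final second should drop well below its initial value, which is the behaviour the thesis exploits at the engine firing frequency.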
62

Efficient finite-state algorithms for the application of local grammars

Sastre, Javier M. 11 July 2011 (has links) (PDF)
Our work concerns the development of efficient algorithms for the application of local grammars, taking as a reference those of existing open-source software: the top-down parser of Unitex and the Earley-like parser of Outilex. Local grammars are a finite-state formalism for representing the syntax of natural languages. They are a model for building precise, large-scale descriptions of natural-language syntax through systematic observation and the methodical accumulation of data. The suitability of local grammars for this task has been tested in numerous works. Because of the ambiguous nature of natural languages and the properties of local grammars, classical parsing algorithms such as LR, CYK and Tomita cannot be used in the context of this work. Top-down and Earley parsers are possible alternatives; however, they have exponential worst-case costs for local grammars. We first designed an algorithm for the application of local grammars with polynomial worst-case cost. We then designed efficient data structures for representing sets of elements and of sequences, which improved the speed of our algorithm in the general case. We implemented our algorithm and those of the Unitex and Outilex systems with the same tools in order to test them under the same conditions. Moreover, we implemented different versions of each algorithm, using both our own data structures and algorithms for set representation and those provided by the GNU implementation of the Standard Template Library (STL). We compared the performance of the different algorithms and their variants in the context of an industrial project proposed by the company Telefónica I+D: increasing the comprehension capability of a conversational agent that provides online services, such as sending SMS messages to mobile phones, as well as games and other digital content. Conversations with the agent are held in Spanish over Windows Live Messenger. Despite the limited domain and the simplicity of the grammars applied, the execution times of our algorithm, coupled with our data structures and set-representation algorithms, were shorter. Thanks to the improved asymptotic cost, significantly lower execution times can be expected, compared with the algorithms used in the Unitex and Outilex systems, for complex, large-coverage grammars.
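For readers unfamiliar with the formalism, the following toy Python sketch (not the thesis's algorithm) applies a local grammar, represented as a finite-state automaton over word labels, to a token sequence. The grammar, states and sentences are invented; real local grammars, as in Unitex and Outilex, also support subgraph calls and transduction outputs.

```python
# Toy application of a local grammar given as a finite-state automaton.
grammar = {
    0: [("the", 1)],
    1: [("black", 1), ("cat", 2), ("dog", 2)],   # adjectives may repeat
    2: [("sleeps", 3)],
}
final_states = {3}

def matches(tokens):
    """Return True if some path of the automaton covers the whole sentence."""
    active = {0}                    # a *set* of active states: merging paths
    for tok in tokens:              # that reach the same state is what keeps
        nxt = set()                 # the cost polynomial despite ambiguity
        for q in active:
            for label, q2 in grammar.get(q, []):
                if label == tok:
                    nxt.add(q2)
        active = nxt
        if not active:
            return False
    return bool(active & final_states)

print(matches("the black black cat sleeps".split()))  # True
print(matches("the cat the".split()))                 # False
```

The set-of-active-states representation is the crux: a naive top-down parser explores each path separately and can blow up exponentially on ambiguous grammars, which is the cost problem the thesis's polynomial algorithm and set data structures address.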
63

Image Reconstruction Based On Hilbert And Hybrid Filtered Algorithms With Inverse Distance Weight And No Backprojection Weight

Narasimhadhan, A V 08 1900 (has links) (PDF)
Filtered backprojection (FBP) reconstruction algorithms are very popular in the field of X-ray computed tomography (CT) because they offer advantages in terms of numerical accuracy and computational complexity. Ramp-filter-based fan-beam FBP reconstruction algorithms have a position-dependent weight in the backprojection, which is responsible for a spatially non-uniform distribution of noise and resolution, and for artifacts. Many algorithms based on shift-variant filtering or spatially-invariant interpolation in the backprojection step have been developed to deal with this issue. However, these algorithms are computationally demanding. Recently, fan-beam algorithms based on Hilbert filtering with inverse distance weight, and with no weight, in the backprojection have been derived using Hamaker's relation. These fan-beam reconstruction algorithms have been shown to improve noise uniformity and uniformity in resolution. In this thesis, fan-beam FBP reconstruction algorithms with inverse distance backprojection weight and with no backprojection weight for 2D image reconstruction are presented and discussed for the two fan-beam scan geometries: equi-angular and equi-space detector arrays. Based on these fan-beam reconstruction algorithms, new 3D cone-beam FDK reconstruction algorithms with circular and helical scan trajectories for curved and planar detector geometries are proposed. To start with, three rebinning formulae from the literature are presented, and it is shown that one can derive all fan-beam FBP reconstruction algorithms from them. Specifically, two fan-beam algorithms with no backprojection weight based on Hilbert filtering for the equi-space linear detector array, and one new fan-beam algorithm with inverse distance backprojection weight based on hybrid filtering for both equi-angular and equi-space linear detector arrays, are derived. Simulation results for these algorithms, in terms of uniformity of noise and resolution in comparison to the standard (ramp-filter-based) fan-beam FBP reconstruction algorithm, are presented. It is shown through simulation that the fan-beam reconstruction algorithm with inverse distance weight in the backprojection gives better noise performance while retaining the resolution properties. A comparison between the above-mentioned reconstruction algorithms is given in terms of computational complexity. State-of-the-art 3D X-ray imaging systems in medicine with cone-beam (CB) circular and helical computed tomography scanners use non-exact (approximate) FBP-based reconstruction algorithms. They are attractive because of their simplicity and low computational cost. However, they produce sub-optimal reconstructed images with respect to cone-beam artifacts, noise, and, in the case of circular-trajectory scan imaging, axial intensity drop. The axial intensity drop in the reconstructed image is due to the insufficient data acquired by a circular-scan-trajectory CB CT. This thesis investigates improving image quality by means of Hilbert and hybrid filtering based algorithms using redundant data for Feldkamp, Davis and Kress (FDK) type reconstruction algorithms.
In this thesis, new FDK-type reconstruction algorithms for cylindrical and planar detectors in CB circular CT are developed, obtained by extending to three dimensions (3D) an exact Hilbert filtering based FBP algorithm for 2D fan-beam geometries: a fan-beam algorithm with no position-dependent backprojection weight and a fan-beam algorithm with inverse distance backprojection weight. The proposed FDK reconstruction algorithm with inverse distance weight in the backprojection requires full-scan projection data, while the FDK reconstruction algorithm with no backprojection weight can handle partial-scan data, including very-short-scan data. The FDK reconstruction algorithms with no backprojection weight for circular CB CT are compared with Hu's, FDK and T-FDK reconstruction algorithms in terms of axial intensity drop and computational complexity. Simulation results on noise, CB artifact performance and execution timing, as well as the partial-scan reconstruction abilities, are presented. We show that FDK reconstruction algorithms with no backprojection weight have better noise performance characteristics than the conventional FDK reconstruction algorithm, whose backprojection weight is known to result in spatial non-uniformity of the noise characteristics. We also present an efficient method to reduce the axial intensity drop in circular CB CT. The method consists of two steps: the first is reconstruction of the object using the FDK reconstruction algorithm with no backprojection weight, and the second is estimation of the missing term. The method is comparable to Zhu et al.'s method in terms of reduction in axial intensity drop, noise and computational complexity. The helical scanning trajectory satisfies the Tuy-Smith condition, hence an exact and stable reconstruction is possible. However, the helical FDK reconstruction algorithm produces cone-beam artifacts, since it is approximate in its derivation. In this thesis, helical FDK reconstruction algorithms based on Hilbert filtering with no backprojection weight, and an FDK reconstruction algorithm based on hybrid filtering with inverse distance backprojection weight, are presented to reduce the CB artifacts. These algorithms are compared with the standard helical FDK algorithm in terms of noise, CB artifacts and computational complexity.
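As background for the FBP family discussed in this abstract, below is a minimal parallel-beam FBP sketch in Python/NumPy with an FFT-implemented ramp filter. It illustrates only the generic filter-then-backproject structure; the thesis's fan-beam Hilbert/hybrid-filtered algorithms with modified backprojection weights are substantially more involved. The grid sizes and the disk phantom are invented.

```python
import numpy as np

def fbp_parallel(sinogram, thetas):
    """Minimal parallel-beam FBP: ramp-filter each projection, then backproject."""
    n_det = sinogram.shape[1]
    ramp = np.abs(np.fft.fftfreq(n_det))        # ideal ramp filter |f|
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

    xs = np.arange(n_det) - n_det / 2
    X, Y = np.meshgrid(xs, xs)
    recon = np.zeros((n_det, n_det))
    for proj, th in zip(filtered, thetas):
        s = X * np.cos(th) + Y * np.sin(th)     # detector coordinate of each pixel
        idx = np.clip((s + n_det / 2).astype(int), 0, n_det - 1)
        recon += proj[idx]                      # unweighted backprojection
    return recon * np.pi / len(thetas)

# toy sinogram of a centered disk phantom of radius r and unit density:
# the line integral at detector offset s is 2*sqrt(r^2 - s^2)
n, n_views, r = 128, 180, 30
thetas = np.linspace(0, np.pi, n_views, endpoint=False)
xs = np.arange(n) - n / 2
sino = np.array([2 * np.sqrt(np.maximum(r**2 - xs**2, 0.0)) for _ in thetas])
img = fbp_parallel(sino, thetas)
print("centre vs corner value:", img[n // 2, n // 2], img[0, 0])
```

The centre value should come out near the disk's unit density and the corner near zero; the fan-beam variants the thesis studies differ precisely in which filter is applied along the detector and which weight, if any, multiplies each backprojected sample.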
64

The Diamond Lemma for Power Series Algebras

Hellström, Lars January 2002 (has links)
The main result in this thesis is the generalisation of Bergman's diamond lemma for ring theory to power series rings. This generalisation makes it possible to treat problems in which infinite descending chains arise. Several results in the literature are shown to be special cases of this diamond lemma, and examples are given of interesting problems which could not previously be treated. One of these examples provides a general construction of a normed skew field in which a custom commutation relation holds.
There is also a general result on the structure of totally ordered semigroups, demonstrating that every semigroup with an archimedean element has a unique (up to a scaling factor) order-preserving homomorphism to the real numbers. This helps analyse the concept of a filtered structure. It is shown that whereas filtered structures can be used to induce pretty much any zero-dimensional linear topology, a real-valued norm suffices for the definition of those topologies that have a reasonable relation to the multiplication operation.
The thesis also contains elementary results on degree functions (as of polynomials), norms on algebras (in particular ultranorms), (Birkhoff) orthogonality in modules, and the construction of semigroup partial orders from ditto quasiorders.
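For context, the classical result being generalised can be stated roughly as follows. This is a from-memory paraphrase of Bergman's formulation for free associative algebras, not the thesis's power-series version; the generalisation is precisely what allows the descending chain condition below to be relaxed.

```latex
\textbf{Diamond lemma (Bergman, classical form, paraphrased).}
Let $S=\{(w_\sigma,f_\sigma)\}$ be a reduction system on the free associative
algebra $k\langle X\rangle$, each rule rewriting the word $w_\sigma$ to the
polynomial $f_\sigma$, and let $\le$ be a semigroup partial order on the free
monoid $\langle X\rangle$, compatible with $S$ (each $f_\sigma$ is a linear
combination of words $< w_\sigma$) and satisfying the descending chain
condition. Then the following are equivalent:
\begin{enumerate}
  \item all overlap and inclusion ambiguities of $S$ are resolvable;
  \item every element of $k\langle X\rangle$ reduces under $S$ to a unique
        normal form;
  \item the irreducible words form a $k$-basis of $k\langle X\rangle/I$,
        where $I$ is the two-sided ideal generated by
        $\{w_\sigma - f_\sigma\}$.
\end{enumerate}
```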
66

Design And Implementation Of A Fixed Point Digital Active Noise Controller Headphone

Erkan, Fatih 01 July 2009 (has links) (PDF)
In this thesis, the design and implementation of a portable feedback active noise controller (ANC) headphone system, based on the Texas Instruments TMS320VC5416PGE120 fixed-point DSP, is described. Problems resulting from the fixed-point implementation of the LMS algorithm and from the delays present in a digital ANC implementation are identified, and effective solutions to overcome them are proposed based on a literature survey. The design of the DSP-based control card is explained, and crucial points about analog-digital mixed board design for noise-sensitive applications are discussed. The filtered-input LMS, filtered-input normalized LMS and filtered-input sign-sign LMS algorithms are implemented as adaptation algorithms, and the advantages and disadvantages of these modified LMS algorithms are indicated. The selection of their parameters is based on theoretical results and experiments. The real-time performance of the different adaptation algorithms is compared against each other, as well as against a commercial analog ANC headphone, under different types of artificial and natural noise signals. Moreover, practical conditions such as putting the headphone on or taking it off, and dynamic-range overflow, are handled with additional software. It is shown that adaptive ANC systems improve noise reduction significantly when the noise is within a narrow frequency range, and that this reduction can be extended to a wider frequency range. It is also shown that the problems of digitally implemented adaptive filters (tracking capability, stability, dynamic range and portability) can be resolved well enough to compete with commercial analog ANC systems.
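A minimal sketch comparing the three LMS update rules named above on a toy system-identification task follows. The filter length, step sizes and signals are invented, and the thesis's filtered-input variants additionally pass the reference through a secondary-path model, which is omitted here; the point is only the relative cost of the updates, the sign-sign rule being the cheapest in fixed-point hardware since it needs no multiplications by the data.

```python
import numpy as np

rng = np.random.default_rng(1)

h = np.array([0.5, -0.3, 0.1])        # unknown toy system to identify
n, L = 5000, 8
x = rng.standard_normal(n)
d = np.convolve(x, h, 'full')[:n] + 0.01 * rng.standard_normal(n)

def run(update, mu):
    """Adapt an L-tap filter with the given update rule; return late-run MSE."""
    w = np.zeros(L)
    xbuf = np.zeros(L)
    err = np.zeros(n)
    for k in range(n):
        xbuf = np.roll(xbuf, 1); xbuf[0] = x[k]
        e = d[k] - w @ xbuf
        w += update(mu, e, xbuf)
        err[k] = e
    return np.mean(err[-500:] ** 2)

lms       = lambda mu, e, xb: mu * e * xb
nlms      = lambda mu, e, xb: mu * e * xb / (1e-6 + xb @ xb)   # power-normalized step
sign_sign = lambda mu, e, xb: mu * np.sign(e) * np.sign(xb)    # cheap in fixed point

for name, upd, mu in [("LMS", lms, 0.01), ("NLMS", nlms, 0.5),
                      ("sign-sign", sign_sign, 0.001)]:
    print(name, "steady-state MSE:", run(upd, mu))
```

NLMS buys robustness to input-power variation at the cost of a division per sample, while sign-sign trades convergence speed and steady-state error for arithmetic simplicity; those are the trade-offs the thesis weighs on the fixed-point DSP.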
67

Kvantitativní hodnocení kvality CT RTG zobrazení / CT X-ray quantitative evaluation

Novotný, Lukáš January 2009 (has links)
X-ray computed tomography is an irreplaceable medical imaging system, and quantitative evaluation is a day-to-day routine used to keep such a system running properly. This master's thesis focuses on the quantitative evaluation of first- and third-generation X-ray CT. It first addresses the subjective and objective evaluation of spatial and energy resolution: spatial resolution is evaluated in both the spatial and frequency domains, while energy resolution is represented by a low-contrast resolution method. The application "Kvantitativní hodnocení kvality CT RTG zobrazení", created for this thesis, is used for the creation of the reconstructed image and for the quantitative evaluation, and was designed with its use in image-processing courses in mind. The thesis contains the results of the quantitative evaluation of X-ray CT obtained with this application, together with a proposal for a lab exercise.
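One common way to evaluate spatial resolution objectively in the frequency domain, not necessarily the exact procedure implemented in the thesis's application, is to compute the modulation transfer function (MTF) as the normalised Fourier magnitude of a measured line spread function. A minimal sketch with an invented Gaussian LSF and pixel pitch:

```python
import numpy as np

# Hypothetical line spread function (LSF) sampled across the detector;
# in practice it would be measured from a thin-wire or edge phantom.
dx = 0.5                                  # pixel pitch in mm (assumed)
xs = np.arange(-32, 32) * dx
sigma = 1.2                               # toy blur width in mm
lsf = np.exp(-xs**2 / (2 * sigma**2))

mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]                             # normalise to 1 at zero frequency
f = np.fft.rfftfreq(len(lsf), d=dx)       # spatial frequency in cycles/mm

# resolution figure of merit: frequency where the MTF drops to 10 %
f10 = f[np.argmax(mtf < 0.1)]
print(f"MTF10 = {f10:.2f} cycles/mm")
```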
68

Kumulace biologických signálů / Averaging of biological signals

Kubík, Adam January 2012 (has links)
The main aim of this thesis is to introduce the issue of averaging of biological signals. The first part deals with the principles of the individual averaging methods (constant, moving and exponential window) and describes their basic features. Moreover, the principle of the filtered residue, the detection of the QRS complex, and the stretching/shrinking of the RR interval to a standardized length are explained. In the second part, the averaging methods, practically realized in Matlab with a GUI, are evaluated by the final signal-to-noise ratio. Signals from the MIT-BIH database are used.
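A minimal sketch of the constant-window (ensemble) averaging idea: align repetitions of a quasi-periodic signal, for example at detected QRS positions, and average them; with uncorrelated noise, the SNR of the average grows with the number of repetitions. Everything below is synthetic, not the MIT-BIH data used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)

# synthetic "cardiac cycle" template repeated with additive white noise
template = np.sin(np.linspace(0, 2 * np.pi, 200)) ** 3
n_beats = 64
epochs = template + 0.5 * rng.standard_normal((n_beats, template.size))

avg = epochs.mean(axis=0)     # constant-window ensemble average

def snr_db(sig):
    noise = sig - template
    return 10 * np.log10(np.sum(template**2) / np.sum(noise**2))

print("SNR of one beat  :", round(snr_db(epochs[0]), 1), "dB")
print("SNR after average:", round(snr_db(avg), 1), "dB")  # ~10*log10(64) = 18 dB gain
```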
69

Využití kumulací pro biologické signály / Averaging of biological signals

Němeček, Tomáš January 2014 (has links)
The main objectives of this thesis are to study the theory of signal averaging, the filtered-residue method, and methods of stretching/shrinking a signal, and to test the functionality of those methods. The thesis contains a theoretical analysis, an explanation of the principles, and tests of the behaviour of the methods used.
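The stretching/shrinking step mentioned in both of these theses is, in essence, a resampling of each beat to a standard length so that beats of different RR intervals can be averaged sample by sample. A minimal linear-interpolation sketch, with invented toy beats:

```python
import numpy as np

def normalize_length(beat, target_len):
    """Stretch or shrink one beat to target_len samples by linear interpolation."""
    src = np.linspace(0.0, 1.0, len(beat))
    dst = np.linspace(0.0, 1.0, target_len)
    return np.interp(dst, src, beat)

beat_fast = np.sin(np.linspace(0, np.pi, 180))   # short RR interval (toy beat)
beat_slow = np.sin(np.linspace(0, np.pi, 260))   # long RR interval (toy beat)

std_len = 220                                    # chosen standardized length
aligned = [normalize_length(b, std_len) for b in (beat_fast, beat_slow)]
print([a.shape for a in aligned])                # both (220,), now averageable
avg = np.mean(aligned, axis=0)
```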
70

Algoritmo de reconstrucción analítico para el escáner basado en cristales monolíticos MINDView / Analytic reconstruction algorithm for the monolithic-crystal-based MINDView scanner

Sánchez Góez, Sebastián 17 January 2021 (has links)
In this thesis, we have achieved two main goals related to the measurement of the spatial resolution of the MINDView PET scanner: the adaptation of an STIR algorithm for Filtered BackProjection 3D Reproyected (FBP3DRP) to a scanner based on monolithic crystals, and the implementation of a BackProjection then Filtered algorithm (BPF). Regarding the FBP algorithm adaptation, we achieved resolutions ranging in the intervals [2 mm, 3.4 mm], [2.3 mm, 3.3 mm] and [2.2 mm, 2.3 mm] for the radial, tangential and axial directions, respectively. On the an acquisition of a derenzo phantom was performed to measure the spacial resolution, which was obtained using three reconstruction algorithms: the BPF-type algorithm, the FBP3DRP algorithm and an implementation of the list-mode ordered subsets algorithm (LMOS). Regarding the BPF-type algorithm, a peak-to-valley value of 2.4 were obtain along rod of 1.6 mm, in contrast to the measurements of 1.34 and 1.44 obtained for the FBP3DRP and LMOS algorithms, respectively. This means that, by means of the BPF-type algorithm, it is possible to improve the resolution to obtain an average value of 1.6 mm. / Sánchez Góez, S. (2020). Algoritmo de reconstrucción analítico para el escáner basado en cristales monolíticos MINDView [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/159259 / TESIS
