71

Výukový video kodek / Educational video codec

Dvořák, Martin January 2012
The first goal of this diploma thesis is to study the basic principles of video signal compression and to introduce the techniques used to reduce irrelevancy and redundancy in the video signal. The second goal is, building on this knowledge of compression tools, to implement the individual compression tools in the Matlab programming environment and to assemble a simple model of a video codec. The thesis describes the three basic blocks of the MPEG-2 standard, namely intraframe coding, interframe coding, and variable-length coding.
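The thesis implements these blocks in Matlab; as a rough illustration of the intraframe path only, the sketch below (Python with NumPy/SciPy rather than Matlab; the uniform quantization step and function names are illustrative assumptions, and the zigzag scan and Huffman variable-length coding of real MPEG-2 are omitted) applies a 2-D DCT, quantization, and a simple run-length pass to one 8x8 block.

```python
import numpy as np
from scipy.fft import dctn, idctn

Q = 16  # illustrative uniform quantization step (not the MPEG-2 quantization matrix)

def encode_block(block):
    """Intraframe coding of one 8x8 block: DCT -> quantize -> run-length pairs."""
    coeffs = dctn(block.astype(float) - 128, norm="ortho")   # level-shifted 2-D DCT
    q = np.round(coeffs / Q).astype(int)                     # uniform quantization
    # Simple run-length coding of the flattened block: (zero-run, value) pairs.
    rle, run = [], 0
    for v in q.flatten():
        if v == 0:
            run += 1
        else:
            rle.append((run, v))
            run = 0
    rle.append((run, 0))  # end-of-block marker
    return rle

def decode_block(rle):
    """Inverse of encode_block (no entropy coding, for illustration only)."""
    vals = []
    for run, v in rle[:-1]:
        vals.extend([0] * run + [v])
    vals.extend([0] * (64 - len(vals)))
    q = np.array(vals).reshape(8, 8)
    return idctn(q * Q, norm="ortho") + 128

block = np.random.randint(0, 256, (8, 8))
print(np.abs(decode_block(encode_block(block)) - block).max())  # reconstruction error
```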
72

Komprese obrazu pomocí vlnkové transformace / Image Compression Using the Wavelet Transform

Urbánek, Pavel January 2013
This thesis focuses on image compression using the wavelet transform. The first part provides the reader with background on image compression, presents well-known contemporary algorithms, and examines wavelet compression and the subsequent encoding schemes in detail; both the JPEG and JPEG 2000 standards are introduced. The second part analyzes and describes the implementation of an image compression tool, including innovations and optimizations. The third part is dedicated to the comparison and evaluation of the results achieved.
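As a minimal illustration of the wavelet compression principle covered in the thesis, the sketch below performs one level of a 2-D Haar decomposition and zeroes small detail coefficients. This is only a toy: JPEG 2000, which the thesis discusses, uses the CDF 5/3 or 9/7 filters, multiple decomposition levels, and EBCOT entropy coding, and the threshold here is an arbitrary assumption.

```python
import numpy as np

def haar2d(img):
    """One level of an orthonormal 2-D Haar wavelet transform."""
    img = img.astype(float)
    # Rows: average and difference of adjacent pixel pairs.
    lo = (img[:, 0::2] + img[:, 1::2]) / np.sqrt(2)
    hi = (img[:, 0::2] - img[:, 1::2]) / np.sqrt(2)
    # Columns: repeat on both row outputs, giving the LL, LH, HL, HH subbands.
    ll = (lo[0::2, :] + lo[1::2, :]) / np.sqrt(2)
    lh = (lo[0::2, :] - lo[1::2, :]) / np.sqrt(2)
    hl = (hi[0::2, :] + hi[1::2, :]) / np.sqrt(2)
    hh = (hi[0::2, :] - hi[1::2, :]) / np.sqrt(2)
    return ll, lh, hl, hh

def compress(img, threshold=10.0):
    """Zero out small detail coefficients; the LL band is kept untouched."""
    ll, lh, hl, hh = haar2d(img)
    details = [np.where(np.abs(b) < threshold, 0, b) for b in (lh, hl, hh)]
    kept = sum(int(np.count_nonzero(b)) for b in details)
    return (ll, *details), kept

img = np.random.randint(0, 256, (256, 256))
(_, lh, hl, hh), kept = compress(img)
print("non-zero detail coefficients kept:", kept)
```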
73

Unsupervised Detection of Interictal Epileptiform Discharges in Routine Scalp EEG : Machine Learning Assisted Epilepsy Diagnosis

Shao, Shuai January 2023
Epilepsy affects more than 50 million people, making it one of the most prevalent neurological disorders, and it has a high impact on the quality of life of those suffering from it. However, 70% of epilepsy patients can live seizure-free with proper diagnosis and treatment. Patients are evaluated using scalp EEG recordings, which are cheap and non-invasive, but the diagnostic yield is low and qualified personnel must process large amounts of data to assess patients accurately. MindReader is an unsupervised classifier that detects spectral anomalies and generates a hypothesis of the underlying patient state over time. The aim is to highlight abnormal, potentially epileptiform states, which could expedite the analysis of patients and let qualified personnel attest the results. It was used to evaluate 95 scalp EEG recordings from healthy adults and adult patients with epilepsy. Interictal epileptiform discharges (IEDs) occurring in the samples had been retroactively annotated, along with the patient state and maneuvers performed by personnel, to enable characterization of the classifier's detection performance. The performance was slightly worse than previous benchmarks on pediatric scalp EEG recordings, with a 7% and 33% drop in specificity and sensitivity, respectively. Electrode positioning and the partial spatial extent of events had a notable impact on performance; however, no correlation between annotated disturbances and reduced performance could be found. Additional exploratory analysis was performed on serialized intermediate data to evaluate the analysis design. Hyperparameters and electrode montage options were exposed to optimize the average Matthews correlation coefficient (MCC) per electrode per patient on a subset of the patients with epilepsy. An increased window length and a lowered amount of training, together with a common average montage, proved most successful. The Euclidean distance of cumulative spectra (ECS), a metric suited to spectral analysis, and homologous L2 and L1 loss functions were implemented, of which the ECS further improved the average performance for all samples. Four additional analyses, featuring new time-frequency transforms and multichannel convolutional autoencoders, were evaluated; an analysis using the continuous wavelet transform (CWT) and a convolutional autoencoder (CNN) performed best, with an average MCC score of 0.19 and 56.9% sensitivity at approximately 13.9 false positives per minute.
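The detection performance above is optimized and reported through the Matthews correlation coefficient. For reference, the standard formula is MCC = (TP·TN − FP·FN) / √((TP+FP)(TP+FN)(TN+FP)(TN+FN)); the short sketch below computes it from a binary confusion matrix (the counts are made-up placeholders, not results from the thesis).

```python
import math

def matthews_corrcoef(tp, fp, tn, fn):
    """Matthews correlation coefficient from a binary confusion matrix."""
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0

# Illustrative counts only (not results from the thesis).
print(matthews_corrcoef(tp=40, fp=60, tn=900, fn=30))
```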
74

On the design of fast and efficient wavelet image coders with reduced memory usage

Oliver Gil, José Salvador 06 May 2008
Image compression is of great importance in multimedia systems and applications because it drastically reduces the bandwidth required for transmission and the memory required for storage. Although earlier standards for image compression were based on the Discrete Cosine Transform (DCT), a more recently developed mathematical technique, the Discrete Wavelet Transform (DWT), has been found to be more efficient for image coding. Despite the improvement in compression efficiency, wavelet image coders significantly increase memory usage and complexity compared with DCT-based coders. A major reason for the high memory requirements is that the usual algorithm for computing the wavelet transform requires the entire image to be in memory. Although some proposals reduce memory usage, they present problems that hinder their implementation. In addition, some wavelet image coders, like SPIHT (which has become a benchmark for wavelet coding), always need to hold the entire image in memory. Regarding complexity, SPIHT can be considered quite complex because it performs bit-plane coding with multiple image scans. The wavelet-based JPEG 2000 standard is even more complex because it improves coding efficiency through time-consuming methods, such as an iterative optimization algorithm based on the Lagrange multiplier method and high-order context modeling. In this thesis, we aim to reduce memory usage and complexity in wavelet-based image coding while preserving compression efficiency. To this end, a run-length encoder and a tree-based wavelet encoder are proposed. In addition, a new algorithm to efficiently compute the wavelet transform is presented. This algorithm achieves low memory consumption through line-by-line processing, and it employs recursion to automatically establish the order in which the wavelet transform is computed, solving synchronization problems that had not been tackled by previous proposals. The proposed encode / Oliver Gil, JS. (2006). On the design of fast and efficient wavelet image coders with reduced memory usage [Doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/1826
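The thesis' low-memory transform processes the image line by line, with recursion scheduling the computation order; that scheduling is not reproduced here. As a hedged sketch of the per-line filtering step only, the code below applies one lifting level of the Le Gall 5/3 wavelet (the reversible filter of lossless JPEG 2000) to a single image line, with simplified edge handling.

```python
import numpy as np

def lifting_53_line(line):
    """One lifting level of the Le Gall 5/3 wavelet on a single image line.

    Line-by-line processing is the kind of strategy the thesis uses to avoid
    keeping the whole image in memory; this sketch shows only the per-line
    filtering, not the recursive scheduling across subbands, and uses a
    simplified symmetric extension at the edges.
    """
    x = line.astype(int)
    even, odd = x[0::2], x[1::2]
    # Predict step: detail coefficients d[i] = odd[i] - floor((even[i] + even[i+1]) / 2).
    right = np.append(even[1:], even[-1])
    d = odd - ((even + right) >> 1)
    # Update step: approximation s[i] = even[i] + floor((d[i-1] + d[i] + 2) / 4).
    left = np.append(d[0], d[:-1])
    s = even + ((left + d + 2) >> 2)
    return s, d

line = np.random.randint(0, 256, 512)
s, d = lifting_53_line(line)
print(s.shape, d.shape)  # (256,) (256,)
```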
75

Development of a novel high resolution and high throughput biosensing technology based on a Monolithic High Fundamental Frequency Quartz Crystal Microbalance (MHFF-QCM). Validation in food control

Calero Alcarria, María del Señor 02 May 2022
Thesis by compendium / Currently, society demands greater control over the safety and quality of the food consumed. This concern is reflected in the various national and European plans for scientific research, which establish the need to innovate and develop new analytical techniques that meet current requirements. This document addresses the problem of the presence of chemical residues in honey. Their origin is fundamentally due to the veterinary treatments used against diseases and parasites in bees, and to the agricultural treatments with which the bees come into contact when they collect nectar in crops close to the hives. The European Food Safety Authority (EFSA) confirms this reality by notifying numerous health alerts in honey.
In recent years, analysis methods based on piezoelectric immunosensors have been positioned as the basis of a very promising screening technique, which can be used as a complement to the classic chromatography techniques thanks to its simplicity, speed and low cost. High-Fundamental Frequency Quartz Crystal Microbalance with Dissipation (HFF-QCMD) resonator technology combines direct real-time detection, high sensitivity and selectivity with easy handling and low cost compared to other techniques. In addition, this technology makes it possible to increase the throughput of the analysis by designing arrays of resonators on the same substrate (Monolithic HFF-QCMD). This document presents the design of an array of 24 HFF-QCMD sensors, together with a microfluidic cartridge that routes several microchannels over the different sensor elements to deliver the diluted honey sample to be analyzed. The cartridge also acts as the interface that connects the resonator array to the characterization instrument. To get the most out of the designed array, a robust and reliable measurement method is developed that raises the data acquisition rate, making it possible to read the electrical parameters of a large number of resonators simultaneously, even at several harmonics of the fundamental resonance mode. The great sensitivity of HFF-QCMD technology to the biochemical events to be characterized also extends to other types of external events, such as changes in temperature or pressure, whose impact on the stability and reliability of the measurement must be minimized. To this end, a signal processing algorithm based on the Discrete Wavelet Transform (DWT) is developed. Finally, all the technological developments carried out are validated through the implementation of an immunoassay for the simultaneous detection, in real honey samples, of chemical residues of very different chemical nature, namely the fungicide thiabendazole and the antibiotic sulfathiazole. / The authors would also like to thank Jorge Martínez from the Laboratory of High Frequency Circuits (LCAF) of the Universitat Politècnica de València (UPV) for assistance with profilometry, and Manuel Planes, José Luis Moya, Mercedes Tabernero, Alicia Nuez and Joaquin Fayos from the Electron Microscopy Services of the UPV for helping with the AFM and SEM measurements. M. Calero is the recipient of the doctoral fellowship BES-2017-080246 from the Spanish Ministry of Economy, Industry and Competitiveness (Madrid, Spain). This research was funded by the Spanish Ministry of Economy and Competitiveness with FEDER funds (AGL 2016-77702-R) and the European Commission Horizon 2020 Programme (Grant Agreement number H2020-FETOPEN-2016-2017/737212-CATCH-U-DNA - Capturing non-Amplified Tumor Circulating DNA with Ultrasound Hydrodynamics), for which the authors are grateful. Román Fernández is with the Center for Research and Innovation in Bioengineering (Ci2B), Universitat Politècnica de València, València, Spain, and with Advanced Wave Sensors S.L., Paterna, València, Spain.
(e-mail: rfernandez@awsensors.com); Yolanda Jiménez, Antonio Arnau and María Calero are with the Center for Research and Innovation in Bioengineering (Ci2B), Universitat Politècnica de València, València, Spain; Ilya Reiviakine is with Advanced Wave Sensors S.L., Paterna, Valencia, Spain, and with the Department of Bioengineering, University of Washington, Seattle, WA 98150, USA; María Isabel Rocha-Gaso and José Vicente García are with Advanced Wave Sensors S.L., Paterna, València, Spain. / Calero Alcarria, MDS. (2022). Development of a novel high resolution and high throughput biosensing technology based on a Monolithic High Fundamental Frequency Quartz Crystal Microbalance (MHFF-QCM). Validation in food control [Doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/182652 / Compendium
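The abstract above mentions a signal-processing algorithm based on the discrete wavelet transform for suppressing disturbances such as temperature or pressure changes. The thesis' actual algorithm is not reproduced here; the sketch below shows generic wavelet denoising of a frequency-shift trace by soft-thresholding detail coefficients, assuming the PyWavelets package, with the wavelet choice, universal threshold, and synthetic signal all being illustrative assumptions.

```python
import numpy as np
import pywt  # PyWavelets

def denoise(signal, wavelet="db4", level=4):
    """Suppress broadband disturbances in a frequency-shift trace by
    soft-thresholding the detail coefficients of its wavelet decomposition."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Universal (VisuShrink) threshold estimated from the finest detail band.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

# Synthetic example: a slow binding-like drift plus noise (illustrative only).
t = np.linspace(0, 1, 2048)
clean = -50 * (1 - np.exp(-5 * t))            # made-up frequency shift in Hz
noisy = clean + np.random.normal(0, 3, t.size)
print(np.std(noisy - clean), np.std(denoise(noisy) - clean))
```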
76

Atrial Fibrillation Detection Algorithm Evaluation and Implementation in Java / Utvärdering av algoritmer för detektion av förmaksflimmer samt implementation i Java

Dizon, Lucas, Johansson, Martin January 2014
Atrial fibrillation is a common heart arrhythmia characterized by a missing or irregular contraction of the atria. The disease is a risk factor for other, more serious diseases, and its total medical costs to society are extensive; it would therefore be beneficial to improve and optimize its prevention and detection. Pulse palpation and heart auscultation can facilitate the detection of atrial fibrillation clinically, but the diagnosis is generally confirmed by an ECG examination. Today there are several algorithms that detect atrial fibrillation by analysing an ECG. A common method is to study the heart rate variability (HRV) and, through different types of statistical calculations, find episodes of atrial fibrillation that deviate from normal sinus rhythm. Two algorithms for the detection of atrial fibrillation have been evaluated in Matlab: one based on the coefficient of variation and one using a logistic regression model. Training and testing of the algorithms were done with data from the Physionet MIT database. Several signal-processing steps were used to remove different types of noise and artefacts before the data could be used. In testing, the coefficient-of-variation algorithm achieved a sensitivity of 91.38%, a specificity of 93.93%, and an accuracy of 92.92%, while the logistic regression algorithm achieved a sensitivity of 97.23%, a specificity of 93.79%, and an accuracy of 95.39%. The logistic regression algorithm performed better and was chosen for implementation in Java, where it achieved a sensitivity of 97.31%, a specificity of 93.47%, and an accuracy of 95.25%.
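The first of the two evaluated algorithms thresholds the coefficient of variation (standard deviation over mean) of RR intervals. The abstract does not give the window length, preprocessing, or threshold used, so the sketch below (Python rather than the thesis' Matlab/Java; window and threshold values are illustrative assumptions) only shows the basic idea.

```python
import numpy as np

def cv_af_detector(rr_intervals, window=30, threshold=0.10):
    """Flag windows of RR intervals whose coefficient of variation (std/mean)
    exceeds a threshold, a simple marker of the irregular rhythm seen in AF."""
    rr = np.asarray(rr_intervals, dtype=float)
    flags = []
    for start in range(0, len(rr) - window + 1, window):
        w = rr[start:start + window]
        flags.append(w.std() / w.mean() > threshold)
    return np.array(flags)

# Illustrative RR series: regular sinus rhythm followed by an irregular segment.
rng = np.random.default_rng(0)
sinus = rng.normal(0.80, 0.02, 300)     # ~75 bpm, low variability
af_like = rng.normal(0.70, 0.15, 300)   # irregular intervals
print(cv_af_detector(np.concatenate([sinus, af_like])))
```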
