1

Combined robust and fragile watermarking algorithms for still images: design and evaluation of combined blind discrete wavelet transform-based robust watermarking algorithms for copyright protection using mobile phone numbers and fragile watermarking algorithms for content authentication of digital still images using hash functions

Jassim, Taha Dawood January 2014 (has links)
This thesis deals with copyright protection and content authentication for still images. New blind, transform-domain, block-based algorithms using one-level and two-level Discrete Wavelet Transform (DWT) were developed for copyright protection. A mobile phone number with its international code is used as the watermarking data. The robust algorithms embed the watermarking information in the low-low (LL) frequency coefficients of the DWT. The watermark is embedded in the green channel of RGB colour images and the Y channel of YCbCr images, and is scrambled with a secret key to increase the security of the algorithms. Because the watermark is small compared to the host image, the embedding process is repeated several times, which increases the robustness of the algorithms. A shuffling process is applied during the multiple embedding passes to avoid spatial correlation between the host image and the watermarking information. The effects of using one-level and two-level DWT on robustness and image quality have been studied. The Peak Signal-to-Noise Ratio (PSNR), the Structural Similarity Index Measure (SSIM) and the Normalized Correlation Coefficient (NCC) are used to evaluate the fidelity of the images. Several greyscale and colour still images are used to test the new robust algorithms. The new algorithms offered better robustness against attacks such as JPEG compression, scaling, salt-and-pepper noise, Gaussian noise and filtering than DCT-based algorithms. The authenticity of the images was assessed with a fragile watermarking algorithm that embeds a hash (MD5) of the image as watermarking information in the spatial domain. The new algorithm showed high sensitivity to any tampering with the watermarked images. The combined fragile and robust watermarking caused minimal distortion to the images, and the combined scheme achieved both copyright protection and content authentication.
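The abstract does not give the exact embedding rule, but the overall scheme it describes (one-level DWT, keyed scrambling, repeated embedding in the LL band) can be sketched in a few lines. The Python sketch below uses PyWavelets and an assumed simple additive rule; the Haar wavelet, the strength `alpha`, and the permutation-based scrambling are illustrative placeholders, not the thesis's actual parameters.

```python
# Minimal sketch of blind LL-band DWT watermark embedding (assumed
# additive rule; the thesis's exact embedding scheme is not stated).
import numpy as np
import pywt

def embed_watermark(channel, bits, key=42, alpha=8.0):
    """channel: 2-D float array (e.g. the green channel of an RGB image);
    bits: 1-D array of {0, 1} watermark bits (e.g. an encoded phone number);
    key: secret seed used to scramble embedding positions;
    alpha: embedding strength (robustness vs. PSNR trade-off)."""
    LL, (LH, HL, HH) = pywt.dwt2(channel.astype(float), "haar")

    # Scramble positions with a keyed RNG and repeat the short watermark
    # to fill the sub-band (the multi-embedding step).
    rng = np.random.default_rng(key)
    positions = rng.permutation(LL.size)
    repeated = np.resize(np.asarray(bits), LL.size)

    flat = LL.ravel().copy()
    flat[positions] += alpha * (2 * repeated - 1)   # +alpha for 1, -alpha for 0

    return pywt.idwt2((flat.reshape(LL.shape), (LH, HL, HH)), "haar")
```

Blind extraction would repeat the keyed permutation on the suspect image's LL band and take a majority vote over the repetitions, which is what makes the repetition step buy robustness.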
2

Objective Perceptual Quality Assessment of JPEG2000 Image Coding Format Over Wireless Channel

Chintala, Bala Venkata Sai Sundeep January 2019 (has links)
A dominant source of Internet traffic today is compressed images, and image compression plays an important role in modern multimedia communications. The image compression standards set by the Joint Photographic Experts Group (JPEG) include JPEG and JPEG2000. The group created the JPEG standard so that still pictures could be compressed for e-mail, displayed on web pages, and support high-resolution digital photography. That standard was originally based on the Discrete Cosine Transform (DCT), a mathematical method for converting a sequence of data to the frequency domain. In 2000, the group proposed a new standard, JPEG2000, which provides better compression efficiency, at the cost of more computation to achieve the same compression as the original JPEG format. JPEG is a lossy compression standard, discarding less important information without causing noticeable perceptual differences; lossless compression, by contrast, reduces the number of bits required to represent the original image samples without any loss of information. Applications of the JPEG standard include the Internet, digital cameras, and printing and scanning peripherals.

In this thesis, a simulator-style setup is built for the objective quality assessment. An input image is JPEG2000-compressed at varying data sizes (e.g. 5%, 10%, 15%) for a given Signal-to-Noise Ratio (SNR), passed through a JPEG encoder, and transmitted over a Rayleigh fading channel. The received image is decoded and the inverse discrete wavelet transform (IDWT) is applied to invert the JPEG2000 compression; the coefficients are scalar-quantized to reduce the number of bits needed to represent them without visible loss of quality, and the final image is displayed. The decoded images of varying data size are then compared with the original input image at each receiver SNR. In particular, objective perceptual quality assessment through the Structural Similarity (SSIM) index is performed in MATLAB.
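As a rough illustration of the measurement step, the sketch below passes an image through a flat Rayleigh fading channel with AWGN and scores the result with SSIM. It stands in for the thesis's full JPEG2000 encode/transmit/decode chain (implemented there in MATLAB); the per-sample channel model and the assumption of ideal channel-state information at the receiver are simplifications.

```python
# Sketch: SSIM of an image degraded by flat Rayleigh fading plus AWGN.
# Replaces the full JPEG2000 pipeline with a per-sample channel model,
# purely to illustrate the objective-quality measurement.
import numpy as np
from skimage.metrics import structural_similarity

def rayleigh_awgn(signal, snr_db, rng):
    """Pass a real-valued signal through |h|*x + n with Rayleigh |h|."""
    h = np.abs(rng.normal(size=signal.shape)
               + 1j * rng.normal(size=signal.shape)) / np.sqrt(2)
    sig_power = np.mean(signal ** 2)
    noise = rng.normal(scale=np.sqrt(sig_power / 10 ** (snr_db / 10)),
                       size=signal.shape)
    # Assume ideal channel-state information: equalize by h at the receiver.
    return (h * signal + noise) / np.maximum(h, 1e-3)

rng = np.random.default_rng(0)
image = rng.uniform(0, 255, size=(256, 256))       # stand-in test image
received = rayleigh_awgn(image, snr_db=15, rng=rng)
print(f"SSIM at 15 dB SNR: "
      f"{structural_similarity(image, received, data_range=255.0):.3f}")
```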
3

Hand (Motor) Movement Imagery Classification of EEG Using Takagi-Sugeno-Kang Fuzzy-Inference Neural Network

Donovan, Rory Larson 01 June 2017 (has links) (PDF)
Approximately 20 million people in the United States suffer from irreversible nerve damage and would benefit from a neuroprosthetic device modulated by a Brain-Computer Interface (BCI). These devices restore independence by replacing lost peripheral nervous system functions. Although devices are currently under investigation, contemporary methods fail to offer adaptability and proper signal recognition for output devices. Human anatomical differences prevent a fixed model from providing consistent classification performance across subjects. Furthermore, notoriously noisy signals such as electroencephalography (EEG) require complex measures for signal detection, so there remains a tremendous need to explore and improve new algorithms. This report investigates a signal-processing model better suited for BCI applications because it incorporates machine learning and fuzzy logic. Whereas traditional machine-learning techniques use precise functions to map the input into the feature space, fuzzy-neuro systems apply imprecise membership functions to account for uncertainty and can be updated via supervised learning; the method is therefore better equipped to tolerate uncertainty and improve performance over time. Moreover, the variation of the algorithm used in this study has a higher convergence speed. The proposed two-stage signal-processing model consists of feature extraction and feature translation, with an emphasis on the latter. The feature-extraction phase includes Blind Source Separation (BSS) and the Discrete Wavelet Transform (DWT); the feature-translation stage uses a Takagi-Sugeno-Kang Fuzzy-Neural Network (TSKFNN). The proposed model achieves an average classification accuracy of 79.4% over 40 subjects, higher than the standard literature value of 75%.
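A minimal sketch of the forward pass of a first-order Takagi-Sugeno-Kang fuzzy-inference system may help fix ideas. The membership parameters and consequents below are random placeholders; in the thesis's TSKFNN they would be tuned by supervised learning, and the input would be BSS/DWT features of an EEG epoch.

```python
# Sketch: first-order TSK fuzzy inference in plain NumPy
# (parameters are illustrative, not the thesis's learned values).
import numpy as np

def tsk_forward(x, centers, sigmas, consequents):
    """x: (d,) feature vector; centers, sigmas: (r, d) Gaussian membership
    parameters, one row per rule; consequents: (r, d+1) = [weights | bias]."""
    # Rule firing strengths: product of Gaussian memberships over inputs.
    mu = np.exp(-0.5 * ((x - centers) / sigmas) ** 2)        # (r, d)
    firing = mu.prod(axis=1)                                 # (r,)
    # First-order consequents: y_i = w_i . x + b_i.
    rule_outputs = consequents[:, :-1] @ x + consequents[:, -1]
    # Defuzzify: normalized weighted average of rule outputs.
    return firing @ rule_outputs / (firing.sum() + 1e-12)

rng = np.random.default_rng(1)
d, r = 8, 4                                    # 8 features, 4 fuzzy rules
score = tsk_forward(rng.normal(size=d),
                    rng.normal(size=(r, d)), np.full((r, d), 1.0),
                    rng.normal(size=(r, d + 1)))
print("class score:", score)                   # threshold for left/right hand
```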
4

A Model Study For The Application Of Wavelet And Neural Network For Identification And Localization Of Partial Discharges In Transformers

Vaidya, Anil Pralhad 10 1900 (has links) (PDF)
No description available.
5

Investigation of New Techniques for Face Detection

Abdallah, Abdallah Sabry 18 July 2007 (has links)
The task of detecting human faces within either a still image or a video frame is one of the most popular object-detection problems. For the last twenty years researchers have shown great interest in this problem because it is an essential pre-processing stage for computing systems that process human faces as input data. Example applications include face recognition systems, vision systems for autonomous robots, human-computer interaction (HCI) systems, surveillance systems, biometric authentication systems, video transmission and video compression systems, and content-based image retrieval systems. In this thesis, non-traditional methods are investigated for detecting human faces within color images or video frames. The methods are chosen such that the required computing power and memory consumption are adequate for real-time hardware implementation.

First, a standard color image database is introduced to allow fair evaluation and benchmarking of face detection and skin segmentation approaches. Next, a new pre-processing scheme based on skin segmentation is presented to prepare the input image for feature extraction; it requires relatively little computing power and memory. Then, several feature extraction techniques are evaluated: the thesis introduces feature extraction based on the two-dimensional Discrete Cosine Transform (2D-DCT), the two-dimensional Discrete Wavelet Transform (2D-DWT), geometrical moment invariants, and edge detection. It also constructs hybrid feature vectors by fusing 2D-DCT coefficients with edge information, and 2D-DWT coefficients with geometrical moments. A self-organizing map (SOM) based classifier is used in all experiments to distinguish between facial and non-facial samples, and two strategies are tried for making the final decision from the output of a single SOM or multiple SOMs. Finally, an FPGA-based framework that implements the presented techniques is presented, along with a partial implementation. Every technique has been evaluated consistently on the same dataset, and the experiments show very promising results: the highest detection rate of 89.2% was obtained using the fusion of DCT coefficients and edge information, the second highest, 88.7%, using the fusion of DWT coefficients and geometrical moments, and the third highest, 85.2%, by calculating the moments of edges. / Master of Science
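As an illustration of one of the feature extractors, the sketch below builds a low-frequency 2D-DCT feature vector from a skin-segmented candidate window. The 8×8 coefficient block and orthonormal normalization are assumed choices, not taken from the thesis.

```python
# Sketch: low-frequency 2D-DCT features for a face/non-face classifier
# (block size and normalization are illustrative assumptions).
import numpy as np
from scipy.fft import dctn

def dct_features(window, k=8):
    """window: 2-D grayscale array (candidate face region); returns k*k features."""
    coeffs = dctn(window.astype(float), norm="ortho")   # 2-D DCT-II
    return coeffs[:k, :k].ravel()                       # top-left = low frequencies

rng = np.random.default_rng(2)
window = rng.uniform(0, 255, size=(32, 32))             # stand-in region
features = dct_features(window)                         # 64-D vector for the SOM
print(features.shape)
```

A hybrid vector like the thesis's best performer would concatenate these coefficients with an edge descriptor of the same window before feeding the SOM.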
6

Intelligent Processing of Pressure and Temperature Signals Acquired by Permanent Downhole Gauges in Oil Wells

Pires, Paulo Roberto da Motta 06 February 2012 (has links)
Originally aimed at operational objectives, the continuous measurement of well bottomhole pressure and temperature, recorded by permanent downhole gauges (PDGs), finds vast applicability in reservoir management. It contributes to the monitoring of well performance and makes it possible to estimate reservoir parameters over the long term. However, notwithstanding its unquestionable value, data from PDGs are characterized by a large noise content, and the presence of outliers among valid measurements is a major problem as well. In this work, the initial treatment of PDG signals is addressed, based on curve smoothing, self-organizing maps and the discrete wavelet transform. Additionally, a system based on the coupling of fuzzy clustering with feed-forward neural networks is proposed for detecting the transients relevant for analysis in the long record history. The results obtained were quite satisfactory for offshore wells and met real-world requirements for using the signals recorded by PDGs.
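One plausible reading of the wavelet-based treatment stage is classical soft-threshold denoising, sketched below with PyWavelets. The `db4` wavelet, the decomposition level, and the universal threshold are assumptions; the abstract does not state the thesis's actual settings.

```python
# Sketch: DWT soft-threshold denoising of a noisy pressure record
# (wavelet, level, and threshold rule are assumed, not the thesis's).
import numpy as np
import pywt

def denoise_pdg(signal, wavelet="db4", level=5):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise scale from the finest detail band (median absolute deviation).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))    # universal threshold
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

t = np.linspace(0, 1, 4096)
clean = 200 - 30 * np.exp(-5 * t)                        # drawdown-like trend
noisy = clean + np.random.default_rng(3).normal(scale=2.0, size=t.size)
print(np.std(denoise_pdg(noisy) - clean))                # residual error
```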
7

Predictability of Nonstationary Time Series using Wavelet and Empirical Mode Decomposition Based ARMA Models

Lanka, Karthikeyan January 2013 (has links) (PDF)
The premise of time series forecasting is that the past carries information about the future; how that information is encoded in the past, and how it can be interpreted and used to extrapolate future events, constitutes the crux of time series analysis and forecasting. Several families of methods have been developed, including qualitative techniques (e.g., the Delphi method), causal techniques (e.g., least-squares regression) and quantitative techniques (e.g., smoothing methods and time series models), all of which establish a model, theoretically or mathematically, from past observations and estimate the future from it. Among these, time series methods such as the autoregressive moving average (ARMA) process have gained popularity because of their simplicity of implementation and accuracy of forecasts. But these models were formulated on the basis of properties a time series is assumed to possess. Classical decomposition techniques were developed to supplement the requirements of time series models: they describe a time series in terms of simple patterns, trend, cyclical and seasonal components, plus noise. Decomposing a time series into component patterns, modeling each component with a forecasting process, and combining the component forecasts to predict the actual series yields superior performance over standard forecasting techniques. All these methods rest on the basic principle of moving-average computation. However, classical decomposition methods have drawbacks: they impose a fixed number of components on any time series, the decompositions are data-independent, and during moving-average computation the edges of the series may not be modeled properly, which affects long-range forecasting. These issues call for more efficient and advanced decomposition techniques such as wavelets and Empirical Mode Decomposition (EMD), two of the most innovative concepts in time series analysis, focused on processing nonlinear and nonstationary series. Hence, this research was undertaken to ascertain the predictability of nonstationary time series using wavelet- and EMD-based ARMA models.

The development of wavelets builds on the concepts of Fourier analysis and the windowed Fourier transform. Accordingly, the thesis first motivates the advent of wavelets and then discusses the advantages they provide. Wavelets were originally defined for continuous time series; to match real-world requirements, wavelet analysis was later defined in the discrete setting as the Discrete Wavelet Transform (DWT), which this thesis uses for time series decomposition. A detailed discussion of the theory behind time series decomposition is presented, followed by the mathematical view of decomposition using the DWT, including the decomposition algorithm. EMD belongs to the same class as wavelets with respect to time series decomposition: it grew out of the observation that most natural time series contain multiple frequencies, so that different scales exist simultaneously. Compared to standard Fourier analysis and wavelet algorithms, EMD adapts better to various nonstationary time series. The method decomposes any complicated time series into a small number of finite empirical modes (Intrinsic Mode Functions, IMFs), each of which carries information about the original series. The EMD decomposition algorithm is presented after the conceptual elucidation, followed by the proposed forecasting algorithm, which couples EMD with an ARMA model and accounts for the number of time steps ahead for which forecasts are needed.

To test the wavelet- and EMD-based prediction of nonstationary series, streamflow data from the USA and rainfall data from India are used: four nonstationary streamflow sites (USGS data resources) with monthly total volumes and two nonstationary gridded rainfall sites (IMD) with monthly total rainfall. Predictability is checked in two scenarios, six-months-ahead and twelve-months-ahead forecasts, with the Normalized Root Mean Square Error (NRMSE) and the Nash-Sutcliffe Efficiency Index (Ef) as performance measures. The results indicate that the wavelet-based analyses generate good variability in the six-months-ahead forecasts, maintaining harmony with the observed values at most sites. Although both methods capture the minima of the series effectively for six- and twelve-months-ahead predictions, the wavelet-based method gives better forecasts than the EMD-based method in the twelve-months-ahead case. It is therefore inferred that the wavelet-based method has better prediction capability than the EMD-based method, despite some limitations of time series methods and of the manner in which the decomposition takes place. Finally, the study concludes that the wavelet-based algorithm could model events such as droughts with reasonable accuracy, and suggests modifications to the model that could extend its applicability to other areas of hydrology.
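The decompose-model-recombine idea can be sketched with a DWT and per-component ARMA models, as below. The wavelet, decomposition level, and ARMA orders are illustrative assumptions, and the thesis's EMD variant would substitute IMFs for the wavelet components; the additive components simply make the forecasts summable.

```python
# Sketch: DWT decomposition + per-component ARMA forecasting
# (orders and wavelet choice are illustrative, not the thesis's).
import numpy as np
import pywt
from statsmodels.tsa.arima.model import ARIMA

def wavelet_arma_forecast(series, steps=6, wavelet="db4", level=3):
    coeffs = pywt.wavedec(series, wavelet, level=level)
    total = np.zeros(steps)
    for i in range(len(coeffs)):
        # Keep only band i, zero the rest, reconstruct that component.
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        component = pywt.waverec(kept, wavelet)[: len(series)]
        fit = ARIMA(component, order=(2, 0, 1)).fit()
        total += fit.forecast(steps=steps)
    return total                       # components are additive, so forecasts sum

rng = np.random.default_rng(4)
flow = 50 + 20 * np.sin(np.arange(240) * 2 * np.pi / 12) \
       + rng.normal(scale=5, size=240)               # 20 years of monthly data
print(wavelet_arma_forecast(flow, steps=6))          # six months ahead
```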
8

Development of a novel high resolution and high throughput biosensing technology based on a Monolithic High Fundamental Frequency Quartz Crystal Microbalance (MHFF-QCM). Validation in food control

Calero Alcarria, María del Señor 02 May 2022 (has links)
Currently, society demands greater control over the safety and quality of the food consumed. This concern is reflected in national and European scientific research plans, which establish the need to innovate and develop new analytical techniques that meet current requirements. This thesis addresses the problem of chemical residues in honey. Their origin lies mainly in the veterinary treatments used against diseases and parasites in bees, and in the agricultural treatments that bees come into contact with when collecting nectar from crops close to the hives. The European Food Safety Authority (EFSA) confirms this reality, having notified numerous health alerts for honey.

In recent years, analysis methods based on piezoelectric immunosensors have emerged as the basis of a very promising screening technique, which can complement classical chromatography thanks to its simplicity, speed and low cost. High-Fundamental-Frequency Quartz Crystal Microbalance with Dissipation (HFF-QCMD) resonator technology combines direct real-time detection, high sensitivity and selectivity with easy handling and low cost compared with other techniques. In addition, this technology increases analysis throughput through the design of resonator arrays on a single substrate (Monolithic HFF-QCMD). This thesis presents the design of an array of 24 HFF-QCMD sensors, together with a microfluidic cartridge that routes microchannels over the sensor elements to deliver the diluted honey sample to be analyzed; the cartridge also acts as the interface between the resonator array and the characterization instrument. To get the most out of the designed array, a robust and reliable measurement method is developed that raises the data-acquisition rate, allowing electrical readouts from a large number of resonators simultaneously, even at several harmonics of the fundamental resonance mode. The great sensitivity of HFF-QCMD technology to the biochemical events being characterized extends to external events such as changes in temperature or pressure, whose impact on the stability and reliability of the measurement must be minimized. To this end, a signal-processing algorithm based on the Discrete Wavelet Transform (DWT) is developed. Finally, all the technological developments are validated by implementing an immunoassay for the simultaneous detection, in real honey samples, of chemical residues of very different natures, namely the fungicide thiabendazole and the antibiotic sulfathiazole.
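The abstract describes the disturbance-rejection algorithm only as DWT-based, but one way a DWT can suppress slow temperature- or pressure-induced drift in a resonance-frequency record is sketched below: zeroing the coarse approximation band acts as a high-pass filter that keeps the faster binding kinetics. The wavelet, level, and this particular suppression rule are assumptions; note that the level choice trades drift rejection against distortion of slow binding transients.

```python
# Sketch: DWT-based suppression of slow drift in a resonance-frequency
# series (assumed approach; the thesis's algorithm is not detailed here).
import numpy as np
import pywt

def remove_drift(freq_series, wavelet="sym8", level=6):
    coeffs = pywt.wavedec(freq_series, wavelet, level=level)
    coeffs[0] = np.zeros_like(coeffs[0])      # drop the slow baseline band
    return pywt.waverec(coeffs, wavelet)[: len(freq_series)]

t = np.arange(8192, dtype=float)
drift = 0.5 * np.sin(t / 2000)                    # slow thermal drift (Hz)
binding = -2.0 / (1 + np.exp(-(t - 4000) / 200))  # sigmoidal binding step
print(np.ptp(remove_drift(drift + binding)))      # step preserved, drift reduced
```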
Calero Alcarria, MDS. (2022). Development of a novel high resolution and high throughput biosensing technology based on a Monolithic High Fundamental Frequency Quartz Crystal Microbalance (MHFF-QCM). Validation in food control [Doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/182652
