11 |
A Wavelet-based Approach to Electrocardiogram (ECG) and Phonocardiogram (PCG) Subject Recognition
Fatemian, Seyedeh Zahra, 18 January 2010
This thesis studies the applicability of two cardiac traits, the electrocardiogram (ECG) and the phonocardiogram (PCG), as biometrics. There is strong evidence that cardiac electrical activity (ECG) embeds highly distinctive characteristics, suitable for applications such as the recognition of human subjects. Since the PCG shares its physiological origin with the ECG, the PCG signal is likewise believed to convey distinctive information about an individual that can be exploited in biometric applications. Such recognition systems traditionally provide two modes of functionality, identification and authentication; frameworks for subject recognition are proposed and analyzed here in both scenarios.
Moreover, the expression of these cardiac signals is subject to alteration with heart rate and noise components. The central consideration of this thesis is therefore the design and evaluation of robust recognition approaches that can compensate for these effects. A recognition system based on each signal, the ECG and the PCG, is developed and evaluated. Furthermore, a fusion of the two signals in a multimodal biometric system is investigated.
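The abstract does not reproduce the thesis's wavelet pipeline; purely as a minimal sketch of the general idea of wavelet-based cardiac template matching in identification mode, assuming pre-segmented, equal-length heartbeats and the PyWavelets and NumPy packages, the following example extracts discrete wavelet transform coefficients and matches a probe beat against enrolled templates by correlation.

```python
# Minimal sketch of wavelet-based cardiac biometric matching, NOT the thesis's
# actual pipeline. Assumes pre-segmented, equal-length beats and the
# PyWavelets (pywt) and NumPy packages.
import numpy as np
import pywt

def wavelet_features(beat, wavelet="db4", level=4):
    """Concatenate DWT approximation/detail coefficients into one feature vector."""
    coeffs = pywt.wavedec(beat, wavelet, level=level)
    return np.concatenate(coeffs)

def enroll(beats):
    """Build a subject template by averaging feature vectors over several beats."""
    return np.mean([wavelet_features(b) for b in beats], axis=0)

def identify(probe_beat, gallery):
    """Return the gallery label whose template correlates best with the probe."""
    probe = wavelet_features(probe_beat)
    scores = {label: np.corrcoef(probe, template)[0, 1]
              for label, template in gallery.items()}
    return max(scores, key=scores.get)
```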
|
12 |
ECG in Biometric Recognition: Time Dependency and Application Challenges
Agrafioti, Foteini, 05 January 2012
As biometric recognition becomes increasingly popular, the fear of circumvention, obfuscation and replay attacks is a rising concern. Traditional biometric modalities such as the face, the fingerprint or the iris are vulnerable to such attacks, which defeats the purpose of biometric recognition, namely to employ physiological characteristics for secure identity recognition.
This thesis advocates the use of the electrocardiogram (ECG) signal for human identity recognition. The ECG is a vital signal of the human body, and as such, it naturally provides liveness detection, robustness to attacks, universality and permanence. In addition, the ECG inherently satisfies uniqueness requirements, because the morphology of the signal is highly dependent on the particular anatomical and geometrical characteristics of the myocardium.
However, the ECG is a continuous signal, and this presents a great challenge to biometric recognition. With this modality, instantaneous variability is expected even within recordings of the same individual, due to factors including recording noise and physical or psychological activity. While the noise and heart rate variations due to physical exercise can be addressed with appropriate feature extraction, the effects of emotional activity on the ECG signal are more obscure.
This thesis deals with this problem from an affective computing point of view. First, the psychological conditions that affect the ECG and endanger biometric accuracy are identified. Experimental setups designed to provoke active and passive arousal, as well as positive and negative valence, are presented. The empirical mode decomposition (EMD) is used as the basis for the detection of emotional patterns, after adapting the algorithm to the particular needs of the ECG signal. Instantaneous frequency and oscillation features are used for state classification in various clustering setups. The result of this analysis is the designation of psychological states that affect the ECG signal to such an extent that biometric matching may not be feasible. An updating methodology is proposed to address this problem, wherein the signal is monitored for instantaneous changes that require the design of a new template.
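The thesis's adapted EMD algorithm is not spelled out in the abstract; the sketch below is only a generic illustration, under assumed sampling settings, of how instantaneous-frequency features could be computed from already-extracted intrinsic mode functions (IMFs) and grouped into candidate affective states with an off-the-shelf clustering step.

```python
# Generic illustration, not the thesis's adapted EMD: given IMFs already produced
# by some EMD implementation, derive instantaneous-frequency features via the
# Hilbert transform and cluster the resulting windows.
import numpy as np
from scipy.signal import hilbert
from sklearn.cluster import KMeans

def instantaneous_frequency(imf, fs):
    """Instantaneous frequency (Hz) of one IMF via its analytic signal."""
    phase = np.unwrap(np.angle(hilbert(imf)))
    return np.diff(phase) * fs / (2.0 * np.pi)

def window_features(imfs, fs):
    """Mean and spread of instantaneous frequency for each IMF in one window."""
    feats = []
    for imf in imfs:
        inst_f = instantaneous_frequency(imf, fs)
        feats.extend([inst_f.mean(), inst_f.std()])
    return np.array(feats)

def cluster_states(feature_matrix, n_states=2):
    """Unsupervised grouping of windows into putative affective states."""
    return KMeans(n_clusters=n_states, n_init=10).fit_predict(feature_matrix)
```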
Furthermore, this thesis presents the enhanced Autocorrelation/Linear Discriminant Analysis (AC/LDA) algorithm for feature extraction, which incorporates a signal quality assessment module based on the periodicity transform. Three deployment scenarios are considered, namely (a) small-scale recognition systems, (b) large-scale recognition systems and (c) recognition in distributed systems. The enhanced AC/LDA algorithm is adapted to each setting, and the advantages and disadvantages of each scenario are discussed.
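A bare-bones sketch of the plain AC/LDA idea, windowed autocorrelation features fed to Linear Discriminant Analysis, is given below. The enhanced algorithm's periodicity-transform quality module is omitted, and the window and lag settings are assumptions rather than values taken from the thesis.

```python
# Bare-bones AC/LDA sketch (no signal-quality module; assumed window/lag settings).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def ac_features(ecg_window, max_lag=60):
    """Normalized autocorrelation of a raw ECG window up to max_lag samples."""
    x = ecg_window - ecg_window.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    return ac[:max_lag] / ac[0]

def train_ac_lda(windows, labels):
    """Fit LDA on autocorrelation features of labelled training windows."""
    X = np.vstack([ac_features(w) for w in windows])
    lda = LinearDiscriminantAnalysis()
    lda.fit(X, labels)
    return lda

def identify(lda, ecg_window):
    """Predict the enrolled identity for a new ECG window."""
    return lda.predict(ac_features(ecg_window).reshape(1, -1))[0]
```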
Overall, this thesis attempts to provide the necessary algorithmic and practical framework for the real-life deployment of the ECG signal in biometric recognition.
|
15 |
Biometrics - Evaluation of Current Situation
Zahidi, Salman, January 2011
Information security has always been a worldwide concern, and new techniques for securing the identity of a legitimate user are regarded as a top priority. Traditional authentication relies on the factors "what you have" and "what you know", realized as smart cards and passwords respectively, whereas biometrics is based on the factor "who you are", analyzing physical or behavioral human characteristics. Biometrics has long been an efficient means of authentication and is now considered a $1,500 million industry, in which fingerprints dominate while the iris is quickly emerging as the most desirable biometric technique.
The main goal of this thesis is to compare and evaluate different biometric techniques in terms of their purpose, recognition mechanism, market value and application areas. Since there are no established evaluation criteria, my method of evaluation was a literature survey drawing on the internet, books, IEEE papers and technical surveys. Chapter 3 briefly discusses the different biometric techniques, while Chapter 4 goes deeper into iris, fingerprint and facial recognition, the most prominent techniques in the biometrics world. Lastly, I give a general assessment of biometrics and its future growth, and suggest specific techniques for different environments such as access control, e-commerce, national IDs and surveillance.
|
16 |
Automatic gait recognition via statistical approaches
Huang, Ping Sheng, January 1999
No description available.
|
17 |
The Interior-Point Method in Radiotherapy Planning (O método de pontos interiores no planejamento da radioterapia)
Martins, Andréa Camila dos Santos [UNESP], 25 February 2011
Cancer treatment by radiotherapy aims to eliminate tumour cells while preserving healthy cells, thereby achieving a more homogeneous administered dose and a lower chance of clinical complications during treatment. The success of the treatment depends on good planning. For optimal planning, mathematical techniques are used to maximize the radiation delivered to the tumour while minimizing the radiation in the surrounding regions, and linear programming models have proved to be excellent tools for building radiotherapy treatment plans. This work therefore aims to: study the key concepts involved in planning cancer treatment by radiotherapy; study linear programming (LP) models applied to optimal planning; carry out a broad study of interior-point techniques for LP; and present an application of these techniques to solving an optimal-planning problem for cancer treatment by radiotherapy. (Funded by Coordenação de Aperfeiçoamento de Pessoal de Nível Superior, CAPES.)
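As a toy illustration only (the thesis's actual dose model and data are not reproduced here), a linear program of this general shape, maximizing the dose delivered to tumour voxels subject to caps on the dose received by healthy tissue, can be solved with scipy.optimize.linprog using its HiGHS interior-point solver; every matrix and bound below is a made-up placeholder.

```python
# Toy radiotherapy-planning LP, not the thesis's model: choose beam weights w >= 0
# to maximize total tumour dose while capping the dose to healthy tissue.
# The dose-deposition matrices below are made-up placeholders.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n_beams = 5
A_tumour = rng.uniform(0.5, 1.0, size=(3, n_beams))   # dose to 3 tumour voxels per unit beam weight
A_healthy = rng.uniform(0.0, 0.4, size=(4, n_beams))  # dose to 4 healthy voxels per unit beam weight
healthy_cap = np.full(4, 1.0)                          # maximum tolerated dose per healthy voxel

# linprog minimizes, so minimize the negative total tumour dose.
c = -A_tumour.sum(axis=0)
res = linprog(c, A_ub=A_healthy, b_ub=healthy_cap,
              bounds=[(0, None)] * n_beams, method="highs-ipm")
print("beam weights:", res.x, "tumour dose:", A_tumour @ res.x)
```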
|
18 |
D-optimal Designs for the Michaelis-Menten and Hill Models (Delineamentos D-ótimos para os modelos de Michaelis-Menten e de Hill)
Ferreira, Iuri Emmanuel de Paula [UNESP], 16 February 2010
The results of many experiments in biological fields such as pharmacology, biochemistry and agronomy are usually analysed by fitting nonlinear models that are expected to describe the response to the factors pre-specified in the experiment. The estimates of the parameters, or of functions of interest, can be imprecise if the factor levels are not chosen adequately, preventing the researcher from obtaining the desired information about the object of study. Constructing an optimal design, one that maximizes the information about some aspect of interest, is crucial to the success of the experiment. The aim of this work was to construct exact D-optimal designs for nonlinear models commonly used in studies of enzyme kinetics and of mineral transport in organisms, namely the Michaelis-Menten and Hill models. Two approaches were considered: locally optimal and pseudo-Bayesian designs. Genetic and exchange algorithms were used to obtain exact designs for the Michaelis-Menten model, for the Hill model, each one separately, and for both models under a composite criterion, with different parameter values and prior distributions with several coefficients of variation considered as prior information. (Funded by Coordenação de Aperfeiçoamento de Pessoal de Nível Superior, CAPES.)
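As a rough, self-contained illustration of the locally D-optimal idea for the Michaelis-Menten model (not the thesis's genetic or exchange algorithms, and with made-up prior parameter values), candidate two-point designs can be scored by the determinant of the Fisher information built from the model's parameter sensitivities, as in the sketch below.

```python
# Rough illustration of a locally D-optimal search for the Michaelis-Menten
# model v = V*x / (K + x); not the thesis's genetic/exchange algorithms.
# Prior guesses V0, K0 and the candidate grid are made-up placeholders.
import itertools
import numpy as np

V0, K0 = 1.0, 2.0                        # assumed local parameter guesses
candidates = np.linspace(0.1, 10.0, 50)  # candidate substrate concentrations

def sensitivity(x, V=V0, K=K0):
    """Gradient of V*x/(K+x) with respect to (V, K)."""
    return np.array([x / (K + x), -V * x / (K + x) ** 2])

def d_criterion(design):
    """log-determinant of the Fisher information (equal weights, unit variance)."""
    F = np.array([sensitivity(x) for x in design])
    sign, logdet = np.linalg.slogdet(F.T @ F)
    return logdet if sign > 0 else -np.inf

# Exhaustive search over two-point designs (the model has two parameters).
best = max(itertools.combinations(candidates, 2), key=d_criterion)
print("locally D-optimal support points:", best)
```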
|
19 |
Privacy-Preserving Facial Recognition Using Biometric-Capsules
Phillips, Tyler S.
Indiana University-Purdue University Indianapolis (IUPUI)
In recent years, developers have used the proliferation of biometric sensors in smart devices, along with recent advances in deep learning, to implement an array of biometrics-based recognition systems. Though these systems demonstrate remarkable performance and have seen wide acceptance, they present unique and pressing security and privacy concerns. One proposed method which addresses these concerns is the elegant, fusion-based Biometric-Capsule (BC) scheme. The BC scheme is provably secure, privacy-preserving, cancellable and interoperable in its secure feature fusion design.
In this work, we demonstrate that the BC scheme is uniquely fit to secure state-of-the-art facial verification, authentication and identification systems. We compare the performance of the unsecured, underlying biometric systems to the performance of the BC-embedded systems in order to directly demonstrate the minimal effect of the privacy-preserving BC scheme on underlying system performance. Notably, we demonstrate that, when seamlessly embedded into state-of-the-art FaceNet and ArcFace verification systems that achieve accuracies of 97.18% and 99.75% on the benchmark LFW dataset, the BC-embedded systems achieve accuracies of 95.13% and 99.13% respectively. Furthermore, we demonstrate that the BC scheme outperforms, or performs as well as, several other proposed secure biometric methods.
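The Biometric-Capsule construction itself is not described in the abstract and is not reproduced here; purely to illustrate the broader notion of a cancellable, key-dependent template wrapped around a fixed face embedder, a generic sketch (with an assumed embedding size and a hypothetical random-projection key) might look as follows.

```python
# Generic cancellable-template illustration, NOT the Biometric-Capsule scheme:
# a user-specific random projection of a fixed face embedding that can be
# revoked by issuing a new key. Embeddings are stand-ins for FaceNet/ArcFace output.
import numpy as np

EMB_DIM, PROJ_DIM = 512, 256  # assumed embedding and template sizes

def user_key(seed):
    """Revocable, user-specific projection matrix derived from a stored seed."""
    rng = np.random.default_rng(seed)
    return rng.standard_normal((PROJ_DIM, EMB_DIM)) / np.sqrt(PROJ_DIM)

def protect(embedding, key):
    """Transform a raw embedding into a cancellable template."""
    t = key @ embedding
    return t / np.linalg.norm(t)

def verify(probe_emb, stored_template, key, threshold=0.6):
    """Match in the protected domain via cosine similarity."""
    return float(protect(probe_emb, key) @ stored_template) >= threshold
```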
|
20 |
The Design and Implementation of the Facial Recognition Vendor Test 2000 Evaluation Methodology
Blackburn, Duane Michael, 13 September 2001
The biggest change in the facial recognition community since the completion of the FacE REcognition Technology (FERET) program has been the introduction of facial recognition products to the commercial market. Open market competitiveness has driven numerous technological advances in automated face recognition since the FERET program and significantly lowered system costs. Today there are dozens of facial recognition systems available that have the potential to meet performance requirements for numerous applications. But which of these systems best meet the performance requirements for given applications?
Repeated inquiries from numerous government agencies on the current state of facial recognition technology prompted the DoD Counterdrug Technology Development Program Office to establish a new set of evaluations. The Facial Recognition Vendor Test 2000 (FRVT 2000) was co-sponsored by the DoD Counterdrug Technology Development Program Office, the National Institute of Justice, and the Defense Advanced Research Projects Agency, and was administered in May-June 2000.
The sponsors of the FRVT 2000 had two major goals for the evaluation. The first was a technical assessment of the capabilities of commercially available facial recognition systems. The sponsors wanted to know the strengths and weaknesses of each individual system, as well as obtain an understanding of the current state of the art for facial recognition.
The second goal of the evaluation was to educate the biometrics community and the general public on how to present and analyze results. The sponsors have seen vendors and would-be customers quoting outstanding performance specifications without understanding that these specifications are virtually useless without first knowing the details of the test that was used to produce the quoted results.
The Facial Recognition Vendor Test 2000 was a worthwhile endeavor. It will help numerous readers evaluate facial recognition systems for their own uses and will serve as a benchmark for all future evaluations of biometric technologies.
The FRVT 2000 evaluations were not designed, and the FRVT 2000 Evaluation Report was not written, to be a buyer's guide for facial recognition. No one will be able to open the report to a specific page to determine which facial recognition system is best because there is not one system for all applications. The only way to determine the best facial recognition system for any application is to follow the three-step evaluation methodology described in the FRVT 2000 Evaluation Report and analyze the data as it pertains to each individual application.
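As a small, generic illustration of the application-specific analysis this paragraph describes (not code from FRVT 2000), the sketch below reads a verification rate off genuine and impostor similarity scores at whatever false-accept rate a particular application can tolerate; the score distributions are synthetic.

```python
# Generic illustration of application-specific analysis (not from FRVT 2000):
# given genuine and impostor similarity scores, report the verification rate
# at the false-accept rate a particular application can tolerate.
import numpy as np

def verification_rate_at_far(genuine, impostor, target_far):
    """Pick the threshold meeting target_far on impostor scores, then score genuines."""
    threshold = np.quantile(impostor, 1.0 - target_far)
    return float(np.mean(genuine >= threshold))

# Synthetic scores purely for demonstration.
rng = np.random.default_rng(1)
genuine = rng.normal(0.8, 0.1, 1000)
impostor = rng.normal(0.4, 0.1, 10000)
for far in (0.01, 0.001):
    print(f"FAR={far:.2%}: verification rate = {verification_rate_at_far(genuine, impostor, far):.3f}")
```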
This thesis explains the design and implementation of the FRVT 2000 evaluations, and discusses how the FRVT 2000 Evaluation Report met the author's objectives for the evaluation. (Master of Science thesis)
|