  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

Orientation Invariance Methods for Inertial Gait

Subramanian, Ravichandran 29 June 2018 (has links)
Intelligent devices such as smart phones, smart watches, virtual reality (VR) headsets and personal exercise devices have become integral accessories for many people. The ability of these devices to verify or identify the user could be applied to enhance security and customize the user experience, among other things. Almost all of these devices have built-in inertial sensors such as an accelerometer and a gyroscope. These inertial sensors respond to the movements the user makes while performing day-to-day activities like walking, getting up and sitting down. The response depends on the activity being performed and thus can be used for activity recognition. The response also captures the user's unique way of doing the activity and can be used as a behavioral biometric for verification or identification. The acceleration (accelerometer) and rate of rotation (gyroscope) are recorded in the device coordinate frame, but to determine the user's motion they need to be converted to a coordinate frame relative to the user. In most situations the orientation of the device relative to the user can neither be controlled nor determined reliably. The solution to this problem requires methods that remove the dependence on device orientation when comparing signals collected at different times. In the vast majority of research to date, the performance of authentication algorithms using inertial sensors has been evaluated on small datasets with a few tens of subjects, collected under controlled placement of the sensors. Very often stand-alone inertial sensors have been used to collect the data. Stand-alone sensors afford better calibration, while the sensors built into smart devices offer little or no means of calibration. Due to these limitations of the datasets used, it is difficult to extrapolate the results of this research to realistic performance with a large number of subjects and natural placement of off-the-shelf smart devices. This dissertation describes the Kabsch algorithm, which is used to achieve orientation invariance of the recorded inertial data, enabling better authentication independent of device orientation. It also presents the Vector Cross Product (VCP) method, developed to achieve orientation invariance. Details are given of a realistic inertial dataset (the USF-PDA dataset) collected with commercial smart phones placed in natural positions and orientations on 101 subjects. The data includes sessions from different days for a subset of 56 subjects, enabling realistic evaluation of authentication algorithms. This dataset has been made publicly available. The performance of five methods that address the orientation dependence of the signals is compared to a baseline that performs no compensation for device orientation. The methods, as part of an overall gait authentication algorithm, are evaluated on the USF-PDA dataset mentioned above and on another large dataset with more than 400 subjects. The results show that the orientation compensation methods can bring the authentication performance on data with uncontrolled orientation close to the performance on data collected with controlled orientation. The Kabsch method shows the highest improvement.
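The Kabsch algorithm named in this abstract is a standard least-squares procedure for finding the rotation that best aligns one set of 3-axis samples with another. The sketch below is a minimal NumPy illustration of that generic procedure, not the dissertation's implementation; the synthetic reference samples, the 30-degree rotation and the tolerance are all invented for the example.

```python
import numpy as np

def kabsch_rotation(P, Q):
    """Rotation matrix that best maps point set P onto Q (both N x 3)
    in the least-squares sense, computed via SVD of the cross-covariance."""
    Pc = P - P.mean(axis=0)                      # center both sets
    Qc = Q - Q.mean(axis=0)
    H = Pc.T @ Qc                                # 3 x 3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

# Synthetic example: the same gait accelerations recorded with the phone held
# in a different orientation are rotated back into the reference frame.
rng = np.random.default_rng(0)
reference = rng.normal(size=(200, 3))            # samples in the reference frame
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
device = reference @ R_true.T                    # same motion, device frame
R_est = kabsch_rotation(device, reference)       # estimate device -> reference
print(np.allclose(R_est, R_true.T, atol=1e-8))   # True
```

Applying the estimated rotation to the device-frame samples before template comparison removes the dependence on how the phone was held, which is the role orientation compensation plays in the gait pipeline described above.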
42

Hierarchical fingerprint verification

Yager, Neil Gordon, Computer Science & Engineering, Faculty of Engineering, UNSW January 2006 (has links)
Fingerprints have been an invaluable tool for law enforcement and forensics for over a century, motivating research into automated fingerprint-based identification in the early 1960s. More recently, fingerprints have found an application in the emerging industry of biometric systems. Biometrics is the automatic identification of an individual based on physiological or behavioral characteristics. Due to its security-related applications and the current world political climate, biometrics is presently the subject of intense research by private and academic institutions. Fingerprints are emerging as the most common and trusted biometric for personal identification. However, despite decades of intense research, there are still significant challenges for the developers of automated fingerprint verification systems. This thesis includes an examination of all major stages of the fingerprint verification process, with contributions made at each step. The primary focus is on fingerprint registration, the challenging problem of aligning two prints so that their corresponding features can be compared for verification. A hierarchical approach consisting of three stages is proposed, each of which employs novel features and techniques for alignment. Experimental results show that the hierarchical approach is robust and outperforms competing state-of-the-art registration methods from the literature. However, despite its power, like most algorithms it has limitations. Therefore, a novel method of information fusion at the registration level has been developed. The technique dynamically selects registration parameters from a set of competing algorithms using a statistical framework, allowing the relative advantages of different approaches to be exploited. The results show a significant improvement in alignment accuracy for a wide variety of fingerprint databases. Given a robust alignment of two fingerprints, it still remains to be verified whether or not they originate from the same finger. This is a non-trivial problem, and a close examination of the fingerprint features available for this task is conducted, with extensive experimental results.
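The registration-level fusion described above selects among competing alignment hypotheses using a statistical framework whose details are not given in the abstract. Purely as a hypothetical illustration of the selection idea, the sketch below scores each candidate rigid transform by how many query minutiae it brings within a fixed tolerance of a template minutia and keeps the best-scoring candidate; the minutiae coordinates, the candidate transforms, the tolerance and the simple inlier-count criterion are all assumptions made for the example, not the thesis's method.

```python
import numpy as np

def match_count(R, t, query, template, tol=6.0):
    """Number of query minutiae that land within `tol` units of some
    template minutia after applying the candidate transform (R, t)."""
    aligned = query @ R.T + t
    dists = np.linalg.norm(aligned[:, None, :] - template[None, :, :], axis=2)
    return int((dists.min(axis=1) < tol).sum())

def select_registration(candidates, query, template):
    """Keep the candidate (R, t) pair that explains the most minutiae.
    `candidates` holds hypotheses produced by competing registration algorithms."""
    return max(candidates, key=lambda rt: match_count(rt[0], rt[1], query, template))

# Hypothetical minutiae (x, y) from two impressions of the same finger.
query = np.array([[10.0, 20.0], [35.0, 40.0], [50.0, 15.0], [22.0, 60.0]])
theta = np.deg2rad(12)
R_good = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
template = query @ R_good.T + np.array([5.0, -3.0])
candidates = [(np.eye(2), np.zeros(2)),           # hypothesis 1: no alignment
              (R_good, np.array([5.0, -3.0]))]    # hypothesis 2: the true transform
R, t = select_registration(candidates, query, template)
print(np.allclose(R, R_good) and np.allclose(t, [5.0, -3.0]))   # True
```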
43

Cryptographic Credentials with Privacy-preserving Biometric Bindings

Bissessar, David 22 January 2013 (has links)
Cryptographic credentials allow user authorizations to be granted and verified, and have applications such as e-Passports, e-Commerce, and electronic cash. This thesis proposes a privacy-protecting approach for binding biometrically derived keys to cryptographic credentials in order to prevent unauthorized lending. Our approach builds on the 2011 work of Adams, offering the additional benefits of privacy protection of biometric information, generality across biometric modalities, and performance. Our protocol integrates into Brands' Digital Credential scheme and the Anonymous Credentials scheme of Camenisch and Lysyanskaya. We describe a detailed integration with the Digital Credential scheme and sketch the integration into the Anonymous Credentials scheme. Security proofs for non-transferability, correctness of ownership, and unlinkability are provided for the protocol's instantiation into Digital Credentials. Our approach uses specialized biometric devices in both the issue and show protocols. These devices are configured with our proposed primitive, the fuzzy extractor indistinguishability adaptor, which uses a traditional fuzzy extractor to create and regenerate cryptographic keys from biometric data and IND-CCA2 secure encryption to protect the generated public data against multiplicity attacks. Pedersen commitments are used to hold the key at issue and show time, and a zero-knowledge proof of knowledge is used to ensure correspondence between the key created at issue time and the key regenerated at show time. All of this is done in a manner which preserves biometric privacy and delivers non-transferability of digital credentials. The biometric itself is not stored or divulged to any of the parties involved in the protocol. Privacy protection in multiple-enrollment scenarios is achieved by the fuzzy extractor indistinguishability adaptor. The zero-knowledge proof of knowledge is used in the showing protocol to prove knowledge of values without divulging them.
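The commitments mentioned above follow the standard Pedersen construction: C = g^m · h^r in a prime-order group, hiding the committed key m behind a random blinding factor r. The sketch below shows only that arithmetic, with deliberately tiny toy parameters; the group size, the generators, the integer standing in for the biometric-derived key, and the comments indicating where the zero-knowledge proof would go are all illustrative assumptions rather than the thesis's actual instantiation.

```python
import secrets

# Toy group parameters purely for illustration: p = 2q + 1 is a small safe
# prime and g, h generate its order-q subgroup.  A real deployment would use
# a standardized large group or an elliptic curve.
q = 1019                       # small prime
p = 2 * q + 1                  # 2039, also prime, so p is a safe prime
g = 4                          # a quadratic residue mod p, so it generates the order-q subgroup
x = secrets.randbelow(q - 1) + 1
h = pow(g, x, p)               # second generator; in practice log_g(h) must remain unknown

def commit(m, r):
    """Pedersen commitment C = g^m * h^r (mod p); exponents are taken mod q."""
    return (pow(g, m % q, p) * pow(h, r % q, p)) % p

key = 123456789                # stand-in for an integer encoding of the biometric-derived key
r = secrets.randbelow(q)       # blinding factor held by the credential owner
C = commit(key, r)

# At show time the owner re-derives the same key from a fresh biometric sample
# (via the fuzzy extractor) and proves in zero knowledge that (key, r) opens C,
# without revealing either value.
print(C == commit(key, r))     # True: the same opening reproduces the commitment
```

Because a Pedersen commitment is perfectly hiding in r and computationally binding in m, the verifier never learns the key, yet the holder cannot later open the commitment to a different key, which is what ties the credential to its legitimate owner.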
44

The Identity Myth: Constructing the Face in Technologies of Citizenship

Ferenbok, Joseph 13 April 2010 (has links)
Over the last century, images of faces have become integral components of many institutional identification systems. A driver’s licence, a passport and often even a health care card all usually prominently feature images representing the face of their bearer as part of the mechanism for linking real-world bodies to institutional records. Increasingly, the production, distribution and inspection of these documents are becoming computer-mediated. As photo ID documents become ‘enhanced’ by computerization, the design challenges and compromises become increasingly coded into the hierarchy of gazes aimed at individual faces and their technologically mediated surrogates. In Western visual culture, representations of faces have been incorporated into identity documents since the 15th century, when Renaissance portraits were first used to visually and legally establish the social and institutional positions of particular individuals. However, it was not until the 20th century that official identity documents and infrastructures began to include photographic representations of individual faces. This work explores photo ID documents within the context of “the face”, a theoretical model for understanding relationships of power coded using representations of particular human faces as tokens of identity. “The face” is a product of mythology for linking ideas of stable identity with images of particular human beings. This thesis extends the panoptic model of the body and contributes to the understanding of the changes posed by computerization to the norms of constructing institutional identity and interaction based on surrogates of faces. The exploration is guided by four key research questions: What is “the face”? How does it work? What are its origins (or mythologies)? And how is “the face” being transformed through digitization? To address these questions, this thesis weaves together ideas from theorists including Foucault, Deleuze and Lyon to explore the rise of “the face” as a strategy for governing, sorting, and classifying members of constituent populations. The work re-examines the techno-political value of captured faces as identity data. By tracing the cultural and techno-political genealogies tying faces to ideas of stable institutional identities, it demonstrates how face-based identity practices are being improvised and reconfigured by computerization, and why these practices are significant for understanding the changing norms of interaction between individuals and institutions.
45

Facial Feature Point Detection

Chen, Fang 06 December 2011 (has links)
Facial feature point detection is a key problem in facial image processing. One main challenge is the variation of facial structure due to expressions. This thesis aims to explore more accurate and robust facial feature point detection algorithms, which can facilitate research on facial image processing, in particular facial expression analysis. The thesis introduces a facial feature point detection system in which Multilinear Principal Component Analysis (MPCA) is applied to extract highly descriptive features of the facial feature points. In addition, to improve the accuracy and efficiency of the system, a skin-color-based face detection algorithm is studied. The experimental results indicate that the system is effective in detecting 20 facial feature points in frontal faces with different expressions, and that it achieves higher accuracy than the state-of-the-art method BoRMaN in a comparison.
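The skin-color-based face detection step is not detailed in the abstract. As a hedged sketch of one common way such a step works (not necessarily the thesis's algorithm), the code below labels pixels as skin by thresholding the Cb and Cr channels after a BT.601 full-range RGB-to-YCbCr conversion; the threshold ranges are widely cited heuristics and the tiny test image is synthetic.

```python
import numpy as np

def skin_mask(rgb):
    """Boolean mask of likely skin pixels from an H x W x 3 RGB array,
    using fixed Cb/Cr thresholds in the YCbCr color space."""
    rgb = rgb.astype(np.float64)
    R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    Cb = 128.0 - 0.168736 * R - 0.331264 * G + 0.5 * B
    Cr = 128.0 + 0.5 * R - 0.418688 * G - 0.081312 * B
    return (Cb >= 77) & (Cb <= 127) & (Cr >= 133) & (Cr <= 173)

# Synthetic 4 x 4 frame: a black background with a small skin-toned patch.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[1:3, 1:3] = (224, 172, 140)            # typical skin tone
mask = skin_mask(frame)
print(mask.sum())                            # 4: only the skin-toned patch survives
```

A feature-point detector can then restrict its search to the bounding box of the largest skin-colored region, which is one way such a pre-processing step can improve both accuracy and efficiency.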
48

Establishing Confidence Level Measurements for Remote User Authentication in Privacy-Critical Systems

Robertson, Matthew January 2009 (has links)
User authentication is the process of establishing confidence in the user identities presented to an information system. This thesis establishes a method of assigning a confidence level to the output of a user authentication process based on the attacks and threats it is vulnerable to. Additionally, the thesis describes the results of an analysis in which the method was applied to several different authentication systems and the confidence level in the authentication process of each system was determined. The final conclusions were that most systems lack confidence in their ability to authenticate users, as they were unable to operate in the face of compromised authenticating information. The final recommendation was to address this inadequacy, and thus improve the confidence in the output of the authentication process, through the verification of both static and dynamic attributes of the authenticating information. A system utilizing voice verification that operates confidently in the face of compromised authenticating information is described, demonstrating the ability of an authentication system to have complete confidence in its ability to authenticate a user through submitted data.
50

Biometrics in practice: The security technology of tomorrow's airports

Salavati, Sadaf January 2006 (has links)
Biometric technology is a method of authentication that has been in use for several centuries. It offers several different techniques in which a human's unique characteristics are used for identification and verification of the individual. Biometrics is today at a stage of development that is pointing upwards, and many people well acquainted with the biometric field believe that this is the technology that will take over from the security systems used today. Since the terror attacks against the USA in 2001, the USA has demanded that the 45 countries whose citizens are not required to hold a visa when entering the United States must, by the end of 2006, introduce passports that contain biometric information. The UN's air traffic group, on the other hand, considers that all countries in the world should use passports with biometric data. The biometric data in the passports will be stored on a chip and consists primarily of an image of the individual's face in an encrypted JPEG format, but it can also be complemented with fingerprints or even signature recognition. Sweden currently uses passports that contain biometric data, but so far no machines that can read these passports have been purchased. Ulf Hägglund at Precise Biometrics AB believes that as soon as biometric passports come into real use, biometric technology will also be used to a greater extent in airports. Even though several Swedish airports consider today's airport security technology sufficient, biometrics can increase security and at the same time simplify many security processes. Falsification can be reduced while at the same time ensuring that the passenger who has checked in is the passenger who boards the airplane, and the employee security control can be fully automated. Generally it can be said that “biometrics is a decent way to increase security in different areas”.
