101 |
Integrating biometric authentication into multiple applications
Breedt, Morne, 28 August 2007
The Internet has grown from its modest academic beginnings into an important, global communication medium. It has become a significant, intrinsic part of our lives, of how we distribute information and how we transact. It is used for a variety of purposes, including banking, home shopping, commercial trade using EDI (Electronic Data Interchange), and gathering information for market research and other activities. Owing to its academic origins, the early developers of the Internet did not focus on security. However, now that it has rapidly evolved into an extensively used, global commercial transaction and distribution channel, security has become a major concern. Fortunately, the field of information security has started to evolve in response and is fast becoming an important discipline with a sound theoretical basis. The discipline views the twin processes of identification and authentication as crucial aspects of information security. An individual access attempt must be identifiable before access is authorised; otherwise, system confidentiality cannot be enforced, nor can integrity be safeguarded. Similarly, non-repudiation becomes impossible to enforce, since the system is unable to log an identity against specific transactions. Consequently, identification and authentication should always be viewed as the first step towards successfully enforcing information security.

The process of identification and authentication is, in essence, the ability to prove or verify an identity. This is usually accomplished using one, or a combination, of the following three traditional identification techniques: something you possess, something you know, or something you are. A critical consideration when designing an application is which of these identification methods, or combination of methods, to use. Each method offers its own pros and cons, and there are many ways to compare and contrast them. The comparison made in this study identifies biometrics as the best solution in a distributed application environment. There are, however, two over-arching hindrances to its widespread adoption. The first is the environment’s complexity, with multiple applications being accessed by both the public and the private sectors; the second is that not all biometrics are popular and no single method has universal appeal. The more significant hindrance of the two is the latter, that of acceptance and trust, because it matters little how good or efficient a system is if nobody is willing to use it. This observation suggests that the identification system needs to be made as flexible as possible. In a democratic society, it could be argued that the best way of ensuring the successful adoption of a biometric system would be to allow maximum freedom of choice and let users decide which biometric method they would like to use. Although this approach is likely to go a long way towards solving the acceptance issue, it increases the complexity of the environment significantly.

This study attempts to solve this problem by reducing the environment’s complexity while simultaneously ensuring that the user retains maximum biometric freedom of choice. This is achieved by creating a number of central biometric repositories, each responsible for maintaining a biometric template data store for one type of biometric. These repositories, or “Biometric Authorities”, would act as authentication facilitators for a wide variety of applications and free those applications from that responsibility.
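The closing idea of the abstract, a central repository per biometric type acting as an authentication facilitator for many applications, can be sketched as a minimal interface. The class name, the enrol/verify operations, and the byte-for-byte template comparison below are illustrative assumptions rather than details taken from the dissertation; a real authority would run a proper biometric matcher behind the same facade.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class BiometricAuthority:
    """Hypothetical central repository for one biometric type (e.g. fingerprint)."""
    biometric_type: str
    templates: Dict[str, bytes] = field(default_factory=dict)

    def enrol(self, user_id: str, template: bytes) -> None:
        # Store the user's reference template for this biometric type.
        self.templates[user_id] = template

    def verify(self, user_id: str, sample: bytes) -> bool:
        # A real authority would run a biometric matcher; here we only
        # illustrate the facilitation role with a placeholder comparison.
        reference = self.templates.get(user_id)
        return reference is not None and reference == sample

# Applications delegate authentication instead of storing templates themselves.
fingerprint_authority = BiometricAuthority("fingerprint")
fingerprint_authority.enrol("alice", b"template-bytes")
assert fingerprint_authority.verify("alice", b"template-bytes")
```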
/ Dissertation (MSc (Computer Engineering))--University of Pretoria, 2005. / Electrical, Electronic and Computer Engineering / MSc / unrestricted
|
102 |
Två olika biometriska system – en jämförelse av säkerheten / Two different biometric systems – a comparison of their security
Imamovic, Edi, January 2019
A constantly growing number of people around the world now use biometric systems instead of passwords to keep their assets safe. There are several reasons why biometric systems have become such a widespread phenomenon today, among them their high degree of security: because every person has unique biometric attributes, the result is a system that is extremely secure. This study carries out a systematic literature review, limited to examining how the security of two of today's most widely used biometric systems, fingerprint scanners and face recognition, can be measured. Among other findings, the review shows that both systems are very secure and that there is not just one way to measure their security, but several. / Biometric systems are becoming increasingly popular around the world, and it has become more common for people to use biometric systems instead of passwords to keep their assets safe. There are many reasons why biometric systems have become such a phenomenon today, among them their high degree of security. Because no two humans share the same biometric attributes, the result is a very secure system. This study addresses two of the most widely used biometric systems, fingerprint scanners and face recognition, and their security aspects. A systematic literature review was carried out to examine how the security of a biometric system can be measured according to previous research, and which of the two selected biometric solutions shows the greatest security under the test methods found. Based on the answers obtained from the literature review, we concluded that both systems are very secure and, moreover, that there is not just one way to measure the security of biometric systems but several.
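One of the several ways of measuring security that such a literature review typically surveys is error-rate estimation. The sketch below is a generic illustration rather than a method taken from the study: it computes the false acceptance rate (FAR) and false rejection rate (FRR) of a matcher at a chosen decision threshold.

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """Estimate false rejection and false acceptance rates at a given
    decision threshold; scores at or above the threshold count as a match."""
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return far, frr

# Toy example: similarity scores from genuine and impostor comparison trials.
far, frr = far_frr([0.92, 0.65, 0.78, 0.95], [0.30, 0.75, 0.61, 0.20], threshold=0.7)
print(far, frr)  # 0.25 0.25 for this toy data
```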
|
103 |
Représentations redondantes et hiérarchiques pour l'archivage et la compression de scènes sonores / Sparse and hierarchical representations for archival and compression of audio scenes
Moussallam, Manuel, 18 December 2012
The subject of this thesis is the analysis and automatic processing of large volumes of audio data. More specifically, we are interested in archiving, a task that combines at least two problems: compressing the data and indexing its content. Each of these problems defines its own objectives, which are sometimes in competition, so taking them into account simultaneously proves difficult. At the heart of this thesis is therefore the aim of building a coherent framework for both the compression and the indexing of audio archives. Sparse representations of signals in redundant dictionaries have recently shown their ability to fulfil such a role. Their properties, together with the methods and algorithms for obtaining them, are therefore studied in the first part of this thesis. The relatively demanding application setting (the volume of data) leads us to choose, among these, iterative algorithms, also known as greedy algorithms. A first contribution of this thesis is the proposal of variants of the well-known Matching Pursuit algorithm based on random, dynamic sub-sampling of dictionaries. Adapting them to structured time-frequency dictionaries (unions of local cosine bases) offers the prospect of a significant improvement in audio scene compression performance. These new algorithms are accompanied by an original statistical modelling of their convergence properties using tools borrowed from extreme value theory. The other contributions of this thesis address the second part of the archiving problem: indexing. The same framework is this time used to bring out the different levels of structure in the data, first and foremost the detection of redundancies and repetitions. At large scale, a robust system for detecting recurring patterns in a radio stream by fingerprint comparison is proposed; its comparative performance in an evaluation campaign of the QUAERO project confirms the relevance of this approach. Exploiting these structures in contexts other than compression is also considered; in particular, we propose an application to redundancy-informed source separation to illustrate the variety of processing that the chosen framework allows. Bringing these elements together then makes it possible to envisage an archiving system that meets the constraints by organising its objectives and processing hierarchically. / The main goal of this work is the automated processing of large volumes of audio data. More specifically, we are interested in archiving, a process that encompasses at least two distinct problems: data compression and data indexing. Jointly addressing these problems is a difficult task since many of their objectives may be concurrent. Building a consistent framework for audio archival is therefore the matter of this thesis. Sparse representations of signals in redundant dictionaries have recently been found of interest for many sub-problems of the archival task. Sparsity is a desirable property both for compression and for indexing. Methods and algorithms to build such representations are the first topic of this thesis. Given the dimensionality of the data considered, greedy algorithms are studied in particular.
A first contribution of this thesis is the proposal of variants of the well-known Matching Pursuit algorithm that exploit randomness and sub-sampling of very large time-frequency dictionaries. We show that audio compression (especially at low bit rates) can be improved using this method. These new algorithms come with an original modelling of asymptotic pursuit behaviour, using order statistics and tools from extreme value theory. Other contributions deal with the second part of the archival problem: indexing. The same framework is used and applied to different layers of signal structure. First, redundancy and musical repetition detection is addressed. At larger scale, we investigate audio fingerprinting schemes and apply them to on-line segmentation of radio broadcasts. Performance was evaluated during an international campaign within the QUAERO project. Finally, the same framework is used to perform source separation informed by redundancy. All these elements validate the proposed framework for the audio archiving task. The layered structures of audio data are accessed hierarchically by greedy decomposition algorithms, which allows the different objectives of archival to be processed at different steps and thus addressed within the same framework.
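As a rough illustration of the kind of algorithm described above, the sketch below implements plain Matching Pursuit in which each iteration searches only a random subset of dictionary atoms. It is a simplified, assumption-laden reading of the abstract: the thesis works with structured unions of local cosine bases and dynamic sub-sampling, whereas this toy uses a generic unit-norm dictionary and a fixed sub-sampling ratio.

```python
import numpy as np

def random_subsampled_mp(signal, dictionary, n_iter=50, keep_ratio=0.25, seed=0):
    """Sketch of Matching Pursuit where, at each iteration, only a random
    subset of dictionary atoms is searched (keep_ratio of the columns).
    Atoms are assumed to be unit-norm columns of `dictionary`."""
    rng = np.random.default_rng(seed)
    residual = signal.astype(float).copy()
    n_atoms = dictionary.shape[1]
    coeffs = np.zeros(n_atoms)
    for _ in range(n_iter):
        # Draw a random sub-dictionary and pick its best-correlated atom.
        subset = rng.choice(n_atoms, size=max(1, int(keep_ratio * n_atoms)), replace=False)
        correlations = dictionary[:, subset].T @ residual
        best = subset[np.argmax(np.abs(correlations))]
        # Subtract the projection of the residual onto the chosen atom.
        weight = dictionary[:, best] @ residual
        coeffs[best] += weight
        residual -= weight * dictionary[:, best]
    return coeffs, residual

# Toy usage with a random unit-norm dictionary standing in for a time-frequency one.
D = np.random.randn(128, 512)
D /= np.linalg.norm(D, axis=0)
x = 2.0 * D[:, 10] + 0.05 * np.random.randn(128)
coeffs, res = random_subsampled_mp(x, D)
```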
|
104 |
Modeling, analysis, and simulation of Muzima fingerprint module based on ordinary and time Petri nets
Eadara, Archana, 15 April 2016
Indiana University-Purdue University Indianapolis (IUPUI) / In the healthcare industry, several modern patient identification and patient matching systems have been introduced. Most of these identify patients by their first, middle, and last names, and also use the Social Security Number and other similar national identifiers. These methods may not work in many developing and underdeveloped countries, where identifying a patient is a challenge because patients' first and last names are highly redundant and interchangeable; this is aggravated by the absence of a national identification system. In order to make patient identification more efficient, Muzima, an interface of OpenMRS (an open-source medical records system), introduced an additional identifier, the fingerprint, through a module added to the system. Ordinary and Time Petri nets are used to analyze this module. Chapter 1 introduces the Muzima fingerprint module and describes the workflow of this interface, followed by related work and the importance and applications of Petri nets. Chapter 2 introduces ordinary and Time Petri nets using examples. Chapter 3 discusses the mathematical modeling of the Muzima fingerprint module using Petri nets. Chapter 4 explains the qualitative and quantitative analysis done on the Muzima fingerprint module. Chapter 5 discusses the programming and simulation done to confirm the theoretical results obtained. Chapter 6 provides the conclusion and future work for the thesis.
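For readers unfamiliar with the formalism, a minimal sketch of an ordinary Petri net (places, transitions, and the firing rule) is given below. The three-place workflow in the usage example is a hypothetical fragment invented for illustration, not the actual Muzima fingerprint model from the thesis.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class PetriNet:
    """Minimal ordinary Petri net: places hold tokens, a transition is
    enabled when every input place holds at least one token, and firing
    moves one token along each input and output arc."""
    marking: Dict[str, int]
    transitions: Dict[str, tuple]  # name -> (input_places, output_places)

    def enabled(self, t: str) -> bool:
        inputs, _ = self.transitions[t]
        return all(self.marking[p] >= 1 for p in inputs)

    def fire(self, t: str) -> None:
        if not self.enabled(t):
            raise ValueError(f"transition {t} is not enabled")
        inputs, outputs = self.transitions[t]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] += 1

# Hypothetical fragment of a fingerprint workflow: capture -> match -> identified.
net = PetriNet(
    marking={"patient_arrived": 1, "fingerprint_captured": 0, "patient_identified": 0},
    transitions={
        "capture": (["patient_arrived"], ["fingerprint_captured"]),
        "match": (["fingerprint_captured"], ["patient_identified"]),
    },
)
net.fire("capture")
net.fire("match")
```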
|
105 |
Att mäta och kommunicera hållbart: en analys av ett svenskt jordbruk / Measuring and communicating sustainability: an analysis of a Swedish farm
Levin, Anna, January 2011
Tools that highlight the human impact on ecosystems and the accelerating depletion of natural resources are essential in the effort towards a more sustainable way of living. Emergy analysis is a scientific and robust method for assessing the degree of sustainability of human as well as natural systems. Despite the advantages of the method, its public breakthrough has been slow. One reason could be that the results of an emergy analysis are difficult to grasp. In contrast, the ecological footprint is a concept that has had widespread impact, largely because of its pedagogical presentation; it was developed with the explicit aim of communicating clearly the magnitude of the human effect on nature. Another, more recently created method suitable in this context is ecosystem services. As a concept, ecosystem services is not yet methodologically well developed, but its use of mainstream concepts points towards promising applications. The main purpose of this study is to make the emergy analysis and ecosystem services methods more accessible and to facilitate effective communication of their results. The second part of the study aims to assess the degree of sustainability of an agricultural system in central Sweden by means of emergy analysis and ecosystem services. By presenting the results of the emergy analysis as a foot- and fingerprint, a better understanding of the outcome may be achieved. The footprint, here called the emergy-based footprint, visualizes all resources used in the production system, while an emergy-based fingerprint identifies the most important resource items in the system. Furthermore, ecosystem services are evaluated from a data matrix and presented by means of a radar diagram. Alternative scenarios for the agricultural system were created in the study, each presented as an emergy-based foot- and fingerprint as well as a radar diagram visualizing the values of the ecosystem services. Together, these methods demonstrate the sustainability characteristics of the different production systems. Results from this study suggest that the agricultural system analysed, as well as the developed scenarios, is not sustainable. The use of emergy analysis combined with ecosystem services, and the visualization methods developed in this study, provides accessible and effective means of communication when aiming to transform agricultural systems towards sustainability. The communication methods developed in this study are also applicable in systems other than agriculture.
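A minimal sketch of the arithmetic behind an emergy-based foot- and fingerprint is given below. All item names and unit emergy values are invented for illustration and are not data from the analysed farm; the point is only that each input is converted to solar emergy, summed, and then reported either in full (footprint) or restricted to its dominant items (fingerprint).

```python
# Illustrative only: item names and unit emergy values are invented, not thesis data.
inputs = {
    "sunlight":   {"amount": 4.0e13, "unit_emergy": 1.0},      # J   * seJ/J
    "diesel":     {"amount": 2.5e11, "unit_emergy": 1.1e5},    # J   * seJ/J
    "fertilizer": {"amount": 1.2e4,  "unit_emergy": 6.4e9},    # g   * seJ/g
    "labour":     {"amount": 3.0e9,  "unit_emergy": 1.2e6},    # J   * seJ/J
}

# Convert each input to solar emergy and total it.
emergy = {name: d["amount"] * d["unit_emergy"] for name, d in inputs.items()}
total = sum(emergy.values())

# "Footprint": every resource's share; "fingerprint": only the dominant items.
footprint = {name: value / total for name, value in emergy.items()}
fingerprint = {name: share for name, share in footprint.items() if share >= 0.10}
print(footprint, fingerprint, sep="\n")
```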
|
106 |
Assessing and Evaluating Biomarkers and Chemical Markers by Targeted and Untargeted Mass Spectrometry-based Metabolomics
Yang, Kundi, 11 November 2020
No description available.
|
107 |
A Personal Place Awareness System
Snow, Bradford Jason, 20 April 2005
No description available.
|
108 |
Low-cost and Robust Countermeasures against Counterfeit Integrated Circuits
Zheng, Yu, 09 February 2015
No description available.
|
109 |
Fingerprint Identification by Improved Method of Minutiae Matching
Li, Tuo, 18 January 2017
No description available.
|
110 |
Population Genetics of Hudson Bay Beluga Whales (Delphinapterus leucas): An Analysis of Population Structure and Gene Flow using Mitochondrial DNA Sequences and Multilocus DNA Fingerprinting / Population Genetics of Hudson Bay Beluga Whales
Mancuso, Samuel, 09 1900
Beluga whales in Canadian waters are subdivided into at least six genetically distinct stocks, maintained by geographic separation and philopatry to estuaries in summer. Belugas in eastern and western Hudson Bay have previously been shown, using mitochondrial restriction analysis, to be genetically distinct populations. It is not known whether these stocks are further subdivided on the basis of specific estuarine use. Mitochondrial DNA control region sequences were used to investigate variation among belugas sampled at several sites along eastern Hudson Bay, Hudson Strait and Ungava Bay. In 126 belugas, 320 bp were sequenced, including the highly variable 5' region of the control region. Seventeen variable sites and 17 haplotypes, which clustered into two related groups, were detected among the whales sequenced. Haplotypes of group A were found mostly at eastern Hudson Bay sites, while group B haplotypes were predominant in northern populations. Significant differences in the frequencies of haplotype groups were found between the eastern Hudson Bay and southern Hudson Strait/Ungava Bay populations, indicating that they are genetically distinct populations. Haplotype distribution patterns also suggested possible differences between belugas using different estuaries along eastern Hudson Bay. The presence of both groups in each population indicated some exchange of individuals between populations, and/or between eastern and western Hudson Bay. Multilocus DNA fingerprinting was used to investigate the extent of gene flow between eastern and western Hudson Bay belugas via interbreeding on common wintering grounds in Hudson Strait. Belugas from the St. Lawrence estuary and the Mackenzie Delta were also analyzed to measure their genetic relatedness to Hudson Bay whales, as well as for comparison with earlier fingerprinting analyses. While the results supported lower genetic diversity within the St. Lawrence population, the range of band-sharing within and between populations was otherwise low (0.09-0.17 for Jeffreys 33.15 and 0.12-0.22 for Jeffreys 33.6). Mantel tests showed differences among the St. Lawrence, Hudson Bay, and Mackenzie Delta populations, but not within Hudson Bay. The conflicting nature of the data did not allow conclusions regarding gene flow. Therefore, DNA fingerprinting was not considered to have provided sufficient resolution in addressing this issue. / Thesis / Master of Science (MS)
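The band-sharing values quoted above are similarity coefficients of the form 2 × (shared bands) / (total bands scored in the two fingerprints). A minimal sketch of that calculation is shown below; the band labels are invented for illustration, and scoring real autoradiographs requires matching bands by fragment size within a tolerance.

```python
def band_sharing(bands_a, bands_b):
    """Band-sharing similarity between two multilocus DNA fingerprints,
    computed as 2 * (shared bands) / (bands in A + bands in B)."""
    shared = len(set(bands_a) & set(bands_b))
    return 2 * shared / (len(bands_a) + len(bands_b))

# Toy example: two individuals sharing two of their scored bands.
print(band_sharing(["b1", "b3", "b7", "b9"], ["b1", "b4", "b7", "b8", "b10"]))  # ~0.44
```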
|