101

Real-time Object Recognition on a GPU

Pettersson, Johan January 2007 (has links)
Shape-based matching (SBM) is a well-known method for 2D object recognition that is rather robust against illumination variations, noise, clutter and partial occlusion. The objects to be recognized can be translated, rotated and scaled. The translation of an object is determined by evaluating a similarity measure for all possible positions (similar to cross-correlation). The similarity measure is based on dot products between normalized gradient directions in edges. Rotation and scale are determined by evaluating all possible combinations, spanning a huge search space. A resolution pyramid is used as a heuristic for the search, which then attains real-time performance. In standard SBM, a model consisting of normalized edge gradient directions is constructed for every possible combination of rotation and scale. We avoid this by using (bilinear) interpolation in the search gradient map, which greatly reduces the amount of storage required. SBM is highly parallelizable by nature, and with our suggested improvements it becomes well suited for running on a GPU. This has been implemented and tested, and the results clearly outperform those of our reference CPU implementation (by factors in the hundreds). It is also very scalable and easily benefits from future devices without extra effort. Extensive evaluation material and tools for evaluating object recognition algorithms have been developed, and the implementation is evaluated and compared to two commercial 2D object recognition solutions. The results show that the method is very powerful when dealing with the distortions listed above and competes well with the commercial alternatives.
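The core of the similarity measure described above can be illustrated with a short sketch. The snippet below is not the thesis implementation; it is a minimal NumPy illustration, assuming a search gradient map of shape (H, W, 2) and a model given as edge points with unit gradient directions, of how bilinear interpolation of the gradient map replaces precomputed rotated and scaled models.

```python
import numpy as np

def bilinear(grad, y, x):
    """Bilinearly interpolate a gradient map of shape (H, W, 2) at a sub-pixel
    position (y, x); the caller is assumed to keep coordinates in range."""
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    dy, dx = y - y0, x - x0
    return (grad[y0,     x0]     * (1 - dy) * (1 - dx) +
            grad[y0 + 1, x0]     * dy       * (1 - dx) +
            grad[y0,     x0 + 1] * (1 - dy) * dx +
            grad[y0 + 1, x0 + 1] * dy       * dx)

def sbm_score(model_pts, model_dirs, search_grad, ty, tx, angle=0.0, scale=1.0):
    """Similarity at one pose: mean dot product between the model's unit gradient
    directions and the normalized, interpolated search-image gradients."""
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, -s], [s, c]])
    total = 0.0
    for p, d in zip(model_pts, model_dirs):
        qy, qx = scale * (R @ p) + np.array([ty, tx])   # transformed model point
        g = bilinear(search_grad, qy, qx)               # no per-pose model needed
        norm = np.linalg.norm(g)
        if norm > 1e-9:
            total += abs(np.dot(R @ d, g / norm))       # ignore contrast polarity
    return total / len(model_pts)
```

A full search would evaluate this score over a coarse-to-fine resolution pyramid rather than at every pose of the finest level.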
102

Evaluation of biometric security systems against artificial fingers

Blommé, Johan January 2003 (has links)
Verification of users’ identities is normally carried out via PIN codes or ID cards. Biometric identification, the identification of unique body features, offers an alternative to these methods. Fingerprint scanning is the most common biometric identification method used today. It is a simple and quick method of identification and has therefore been favored over other biometric identification methods such as retina scanning or signature verification. In this report, biometric security systems based on fingerprint scanners have been evaluated. The evaluation focuses on copies of real fingers, artificial fingers, as the intrusion method, but it also covers currently used identification algorithms and the strengths and weaknesses of the hardware solutions used. The artificial fingers used in the evaluation were made of gelatin, as it resembles the surface of human skin in moisture, electrical resistance and texture. The artificial fingers were based on ten subjects, whose real fingers and artificial counterparts were tested on three different fingerprint scanners. All scanners tested accepted artificial fingers as substitutes for real fingers. Results varied between users and scanners, but the artificial fingers were accepted in roughly one quarter to one half of the attempts. Techniques used in image enhancement, minutiae analysis and pattern matching are analyzed. Normalization, binarization, quality markup and low-pass filtering are described within image enhancement. In minutiae analysis, connectivity numbers, point identification and skeletonization (thinning algorithms) are analyzed. Within pattern matching, direction field analysis and principal component analysis are described. Finally, hybrid models combining minutiae analysis and pattern matching are mentioned. Based on the experiments made and the analysis of the techniques used, a recommendation for the future use and development of fingerprint scanners is given.
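As a concrete illustration of the minutiae analysis mentioned above, the following sketch computes the classic connectivity (crossing) number on a thinned fingerprint skeleton. It is a generic textbook formulation, not the specific algorithms evaluated in the report.

```python
import numpy as np

def crossing_number(skel, y, x):
    """Crossing number at pixel (y, x) of a thinned, one-pixel-wide binary
    skeleton: half the number of 0/1 transitions around the 8-neighbourhood.
    CN == 1 marks a ridge ending, CN == 3 a bifurcation."""
    nbrs = [skel[y-1, x], skel[y-1, x+1], skel[y, x+1], skel[y+1, x+1],
            skel[y+1, x], skel[y+1, x-1], skel[y, x-1], skel[y-1, x-1]]
    nbrs.append(nbrs[0])  # close the cycle around the pixel
    return sum(abs(int(a) - int(b)) for a, b in zip(nbrs, nbrs[1:])) // 2

def find_minutiae(skel):
    """Return (ridge endings, bifurcations) found in the skeleton interior."""
    endings, bifurcations = [], []
    for y in range(1, skel.shape[0] - 1):
        for x in range(1, skel.shape[1] - 1):
            if skel[y, x]:
                cn = crossing_number(skel, y, x)
                if cn == 1:
                    endings.append((y, x))
                elif cn == 3:
                    bifurcations.append((y, x))
    return endings, bifurcations
```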
103

A framework for the explicit use of specific systems thinking methodologies in data-driven decision support system development

Goede, Roelien 13 May 2005 (has links)
Data-driven decision support systems, such as data warehouses, are extremely costly to develop. Forty-one per cent of data warehouse development practitioners have experienced project failures: projects that were either completed only after exceeding budget and time limits, or not completed at all. Some influential data warehousing authors advocate user involvement as a solution, while others focus on technical factors to improve data warehouse success. This study proposes a framework for data warehousing success based on systems thinking methodology. Systems thinking implies a holistic approach to problem solving. A system is a set of interrelated elements. A systems approach represents a broad view, taking all aspects into account and concentrating on the interactions between different parts of the problem. This study investigates the practices of data warehousing professionals from a systems thinking point of view before proposing a framework for the explicit use of specific systems thinking methodologies in data warehouse development. Interpretive case study research is used to investigate the practices of data warehousing professionals in three different organisations. Pattern matching is used to analyse the collected data, by mapping practices to different systems thinking perspectives. However, the theory component of the thesis is not a description of current data warehousing practices from a systems thinking point of view, as in typical interpretive research. It relates more to critical research in that it is meant to change data warehousing practices towards specific systems thinking methodologies. The proposed framework incorporates three sources of information: a literature study on systems thinking philosophy, methodology and practice; a literature study on data warehousing and data warehousing success factors; and the results of case studies on the current practices of data warehousing professionals, analysed from a systems thinking perspective. The framework gives a methodological foundation for a holistic approach to data warehousing with maximum user involvement. It views a data warehouse as a system with typical systems characteristics, such as specified objectives relating to the organisation’s objectives, an environment, available resources, specified components and effective management. / Thesis (PhD (Information Technology))--University of Pretoria, 2006. / Informatics / unrestricted
104

Suivi d'objets à partir d'images issues de caméras mobiles non calibrées / Object tracking from images acquired by uncalibrated mobile cameras

Llucia, Ludovic 14 February 2011 (has links)
The industrial goal of this work is a simulator that helps football coaches interpret certain game situations better by reconstructing a 3D view of them from video sequences, and that also presents game plans, based on these real situations, in the most ergonomic way possible. In addition, and contrary to common practice in this setting, we chose for the sake of flexibility of use to stay in a fairly general framework and to do without a (tedious) prior camera calibration phase. This raised scientific and technological challenges that we sought to resolve and for which we propose solutions in the first part, while the second part is devoted to the industrialisation of the simulator. The problems to be solved are of two kinds, computational on the one hand and ergonomic on the other: for example, characterizing the homographic transformation that maps the image onto the model, recovering the position of the camera, and automatically determining which area of the pitch the acquired image corresponds to. On the ergonomic side, the aim was to reproduce as faithfully as possible the way football coaches apprehend, model and communicate what happens on the pitch.
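For instance, the image-to-model homography mentioned above can be estimated from a handful of point correspondences (e.g. pitch line intersections) with the standard direct linear transform. The sketch below is a generic illustration under that assumption, not the calibration-free method developed in the thesis.

```python
import numpy as np

def estimate_homography(img_pts, model_pts):
    """Direct Linear Transform: estimate H mapping image points onto the model
    plane from at least four correspondences (x, y) -> (u, v)."""
    A = []
    for (x, y), (u, v) in zip(img_pts, model_pts):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)       # right singular vector of the smallest singular value
    return H / H[2, 2]             # assumes H[2, 2] is non-zero (typical case)

def image_to_model(H, x, y):
    """Map an image point onto the model (pitch) plane."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

In practice such an estimate would be made robust to mismatched correspondences, for example with RANSAC, before being refined.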
105

Árvores de Ukkonen: caracterização combinatória e aplicações / Ukkonen's tree: combinatorial characterization and applications

Gustavo Akio Tominaga Sacomoto 08 February 2011 (has links)
The suffix tree is a data structure that represents, in linear space, all factors of a given word, with several examples of practical applications. In this work, we define a more general structure: the Ukkonen's tree. We prove many combinatorial properties for it, among them its minimality in a precise sense. We believe that this presentation, besides being more general than suffix trees, has the advantage of offering an explicit description of the tree topology, its vertices, edges and labels, which we have not seen in any other work. As applications, we also present the sparse suffix tree (which stores only a subset of the suffixes) and the k-factor tree (which stores only the substrings of length k, instead of the suffixes), both defined as special cases of the Ukkonen's tree. We propose a new construction algorithm for sparse suffix trees with time O(n) and space O(m), where n is the size of the word and m is the number of suffixes. For k-factor trees, we propose a new online algorithm with time and space O(n), where n is the size of the word.
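To make the sparse variant concrete, here is a deliberately naive sketch of a trie over a chosen subset of suffixes. It only illustrates what the structure stores and how it answers queries, not the O(n)-time, O(m)-space construction proposed in the thesis.

```python
class Node:
    __slots__ = ("children", "index")
    def __init__(self):
        self.children = {}   # single character -> child node
        self.index = None    # starting position of a stored suffix, if any

def build_sparse_suffix_trie(word, positions):
    """Naive trie over the chosen suffixes of `word`; O(n*m) time and space."""
    root = Node()
    terminator = "\0"        # sentinel so no stored suffix is a prefix of another
    for i in positions:
        node = root
        for ch in word[i:] + terminator:
            node = node.children.setdefault(ch, Node())
        node.index = i
    return root

def occurrences(root, pattern):
    """Starting positions (among the stored suffixes) where `pattern` occurs."""
    node = root
    for ch in pattern:
        if ch not in node.children:
            return []
        node = node.children[ch]
    out, stack = [], [node]          # collect all suffix indices below this node
    while stack:
        n = stack.pop()
        if n.index is not None:
            out.append(n.index)
        stack.extend(n.children.values())
    return sorted(out)
```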
106

Efficient and Secure Equality-based Two-party Computation

Javad Darivandpour (11190051) 27 July 2021 (has links)
Multiparty computation refers to a scenario in which multiple distinct yet connected parties aim to jointly compute a functionality. Over recent decades, with the rapid spread of the internet and digital technologies, multiparty computation has become an increasingly important topic. In addition to the integrity of computation in such scenarios, it is essential to ensure that the privacy of sensitive information is not violated. Thus, secure multiparty computation aims to provide sound approaches for the joint computation of desired functionalities in a secure manner: not only must the integrity of computation be guaranteed, but also each party must not learn anything about the other parties' private data. In other words, each party learns no more than what can be inferred from its own input and its prescribed output.

This thesis considers secure two-party computation over arithmetic circuits based on additive secret sharing. In particular, we focus on efficient and secure solutions for fundamental functionalities that depend on the equality of private comparands. The first direction we take is providing efficient protocols for two major problems of interest. Specifically, we give novel and efficient solutions for private equality testing and multiple variants of secure wildcard pattern matching over any arbitrary finite alphabet. These problems are of vital importance: private equality testing is a basic building block in many secure multiparty protocols, and secure pattern matching is frequently used in various data-sensitive domains, including (but not limited to) private information retrieval and healthcare-related data analysis. The second direction we take towards a performance improvement in equality-based secure two-party computation is via introducing a generic functionality-independent secure preprocessing that results in an overall computation and communication cost reduction for any subsequent protocol. We achieve this by providing the first precise functionality formulation and secure protocols for replacing original inputs with much smaller inputs such that this replacement neither changes the outcome of subsequent computations nor violates the privacy of sensitive inputs. Moreover, our input-size reduction opens the door to a new approach for efficiently solving Private Set Intersection. The protocols we give in this thesis are typically secure in the semi-honest adversarial threat model.
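The additive secret sharing underlying such protocols can be sketched in a few lines. The snippet below is only a toy illustration; the modulus choice and the direct opening of the difference are assumptions for exposition, and a secure equality test would never reveal the difference itself.

```python
import secrets

P = 2**61 - 1  # a prime modulus, chosen here only for the sketch

def share(x):
    """Split x into two additive shares: x = s0 + s1 (mod P)."""
    s0 = secrets.randbelow(P)
    return s0, (x - s0) % P

def reconstruct(s0, s1):
    return (s0 + s1) % P

# Each party locally subtracts its shares of x and y to obtain a share of
# d = x - y; d == 0 exactly when x == y. A real private equality test would
# not open d directly (that would leak the difference) but would instead
# reveal only whether a suitably masked value is zero.
x0, x1 = share(42)
y0, y1 = share(42)
d0, d1 = (x0 - y0) % P, (x1 - y1) % P   # purely local computation
assert reconstruct(d0, d1) == 0          # equal inputs -> shared difference is zero
```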
107

Detekce P2P sítí / Detection of P2P Networks

Březina, Matej January 2008 (has links)
This document deals with the design, implementation and testing of a software system for detecting p2p (peer-to-peer) networks, based on a combination of BPF prefiltering and POSIX regular-expression matching of packet payloads against known p2p protocol communications. The proposed detection system includes a database with rules for the most widespread p2p protocols, in a format resembling the definitions used by the L7-filter classifier. The application is implemented in C, runs in userspace and targets all POSIX-compatible platforms. Combining the detector with user-attached QoS control gives a complete solution for reducing the traffic of common p2p protocols.
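A minimal sketch of the payload-matching idea (in Python rather than the thesis's C/POSIX implementation) might look as follows. The two signatures are illustrative stand-ins, not the rule database shipped with the detector.

```python
import re

# Illustrative payload signatures in the spirit of L7-filter rules; the actual
# rules used by the detector differ.
SIGNATURES = {
    "bittorrent": re.compile(rb"^\x13BitTorrent protocol"),   # handshake prefix
    "gnutella":   re.compile(rb"^GNUTELLA CONNECT/"),
}

def classify_payload(payload: bytes):
    """Return the first protocol whose signature matches the packet payload."""
    for name, pattern in SIGNATURES.items():
        if pattern.search(payload):
            return name
    return None

print(classify_payload(b"\x13BitTorrent protocol" + b"\x00" * 8))  # -> 'bittorrent'
```

In the real system, BPF prefilters cheaply discard traffic that cannot match before the more expensive regular-expression stage runs.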
108

DebEAQ - debugging empty-answer queries on large data graphs

Lehner, Wolfgang, Vasilyeva, Elena, Heinze, Thomas, Thiele, Maik 12 January 2023 (has links)
The large volume of freely available graph data sets impedes users in analyzing them. For this purpose, they usually pose plenty of pattern matching queries and study their answers. Without deep knowledge about the data graph, users can create ‘failing’ queries, which deliver empty answers. Analyzing the causes of these empty answers is a time-consuming and complicated task, especially for graph queries. To help users debug these ‘failing’ queries, there are two common approaches: one focuses on discovering missing subgraphs of the data graph, the other tries to rewrite the queries so that they deliver some results. In this demonstration, we combine both approaches and give users an opportunity to discover why the requested queries delivered empty results. To this end, we propose DebEAQ, a debugging tool for pattern matching queries, which allows users to compare both approaches and also provides functionality to debug queries manually.
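The query-rewriting side of such a debugger can be sketched as a simple relaxation loop. This is a generic illustration with an assumed `has_answer` evaluator (e.g. a call to the graph database), not DebEAQ's actual algorithm.

```python
from itertools import combinations

def relax_query(edges, has_answer):
    """Why-empty debugging sketch: if the full pattern has no answer, return the
    largest subsets of its edge constraints that do; the constraints missing from
    those subsets point at likely culprits."""
    edges = tuple(edges)
    if has_answer(edges):
        return [edges]
    for size in range(len(edges) - 1, 0, -1):
        hits = [sub for sub in combinations(edges, size) if has_answer(sub)]
        if hits:
            return hits          # maximal succeeding sub-queries
    return []
```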
109

Memory Efficient Regular Expression Pattern Matching Architecture For Network Intrusion Detection Systems

Kumar, Pawan 08 1900 (has links) (PDF)
The rampant growth of the Internet has been coupled with an equivalent growth in cyber crime over the Internet. With our increased reliance on the Internet for commerce, social networking, information acquisition, and information exchange, intruders have found financial, political, and military motives for their actions. Network Intrusion Detection Systems (NIDSs) intercept the traffic at an organization’s periphery and try to detect intrusion attempts. Signature-based NIDSs compare each packet to a signature database consisting of known attacks and malicious packet fingerprints. The signatures use regular expressions to model these intrusion activities. This thesis presents a memory-efficient pattern matching system for the class of regular expressions appearing frequently in NIDS signatures. The proposed Cascaded Automata Architecture is based on two-stage automata. The first stage recognizes the sub-strings and character classes present in the regular expression. The second stage consumes the symbols generated by the first stage upon receiving input traffic symbols. The basic idea is to utilize the research done on the string matching problem for regular expression pattern matching. We formally model the class of regular expressions mostly found in NIDS signatures. The challenges involved in using string matching algorithms for regular expression matching are presented. We introduce length-bound transitions, counter-based states, and associated counter arrays in the second-stage automata to address these challenges. The system uses length information along with counter arrays to keep track of overlapped sub-strings and character-class-based transitions. We present efficient implementation techniques for the counter arrays. The evaluation of the architecture on practical expressions from the Snort rule set showed a compression in the number of states of between 50% and 85%. Because of its smaller memory footprint, our solution is suitable both for software-based implementations on network chips and for FPGA-based designs.
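To give a flavour of the counter-based idea, the sketch below matches patterns of the form prefix.{lo,hi}suffix by tracking gap lengths explicitly instead of unrolling the bounded repetition into many automaton states. It is a simplification for illustration, not the proposed cascaded architecture.

```python
def match_bounded_gap(text, prefix, suffix, lo, hi):
    """Match prefix.{lo,hi}suffix by remembering the end positions of prefix
    occurrences (a counter-array stand-in) rather than expanding the bounded
    repetition into lo..hi explicit states. Returns match end positions."""
    pending, hits = [], []
    for i in range(len(text)):
        end = i + 1
        if end >= len(prefix) and text.startswith(prefix, end - len(prefix)):
            pending.append(end)                      # start counting the gap here
        if end >= len(suffix) and text.startswith(suffix, end - len(suffix)):
            gap_end = end - len(suffix)              # position where the gap closes
            if any(lo <= gap_end - p <= hi for p in pending):
                hits.append(end)
    return hits

# e.g. match_bounded_gap("abcXXXXXdef", "abc", "def", 3, 10) -> [11]
```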
110

A Bayesian learning approach to inconsistency identification in model-based systems engineering

Herzig, Sebastian J. I. 08 June 2015 (has links)
Designing and developing complex engineering systems is a collaborative effort. In Model-Based Systems Engineering (MBSE), this collaboration is supported through the use of formal, computer-interpretable models, allowing stakeholders to address concerns using well-defined modeling languages. However, because concerns cannot be separated completely, implicit relationships and dependencies among the various models describing a system are unavoidable. Given that models are typically co-evolved and only weakly integrated, inconsistencies in the agglomeration of the information and knowledge encoded in the various models are frequently observed. The challenge is to identify such inconsistencies in an automated fashion. In this research, a probabilistic (Bayesian) approach to abductive reasoning about the existence of specific types of inconsistencies, and, in the process, of semantic overlaps (relationships and dependencies) in sets of heterogeneous models, is presented. A prior belief about the manifestation of a particular type of inconsistency is updated with evidence, which is collected by extracting specific features from the models by means of pattern matching. Inference results are then utilized to improve future predictions by means of automated learning. The effectiveness and efficiency of the approach are evaluated through a theoretical complexity analysis of the underlying algorithms and through application to a case study. Insights gained from the experiments conducted, as well as the results of a comparison to the state of the art, demonstrate that the proposed method is a significant improvement over the status quo of inconsistency identification in MBSE.
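The belief update at the heart of such an approach can be illustrated with a naive-Bayes style sketch. The feature names and likelihood figures below are invented for illustration and have no relation to the case study.

```python
def update_belief(prior, evidence, likelihoods):
    """Posterior probability that an inconsistency exists, given independent
    binary features extracted by pattern matching. likelihoods[f] is a pair
    (P(f | inconsistent), P(f | consistent))."""
    p_inc, p_con = prior, 1.0 - prior
    for feature, observed in evidence.items():
        l_inc, l_con = likelihoods[feature]
        if not observed:
            l_inc, l_con = 1.0 - l_inc, 1.0 - l_con
        p_inc *= l_inc
        p_con *= l_con
    return p_inc / (p_inc + p_con)

# Hypothetical features and figures, purely for illustration.
likelihoods = {"same_name_diff_value": (0.9, 0.2), "unit_mismatch": (0.7, 0.1)}
evidence    = {"same_name_diff_value": True,       "unit_mismatch": False}
print(update_belief(0.3, evidence, likelihoods))   # updated belief the models conflict
```

Learning then amounts to re-estimating the likelihood pairs from confirmed inconsistencies, so that future predictions improve.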
