About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

A software restructuring tool for Oberon

Eloff, Johannes J.
Thesis (MComm)--University of Stellenbosch, 2001. / ENGLISH ABSTRACT: Software restructuring is a form of perfective maintenance that modifies the structure of a program's source code. Its goal is increased maintainability to better facilitate other maintenance activities, such as adding new functionality or correcting previously undetected errors. The modification of structure is achieved by applying transformations to the source code of a software system. Software engineers often attempt to restructure software by manually transforming the source code. This approach may lead to undesirable and undetectable changes in its behaviour. Ensuring that manual transformations preserve functionality during restructuring is difficult; guaranteeing it is almost impossible. One solution to the problem of manual restructuring is automation through the use of a restructuring tool. The tool becomes responsible for examining each transformation and determining its impact on the software's behaviour. If a transformation preserves functionality, it may be applied to produce new source code. The tool only automates the application of transformations; the decision regarding which transformation to apply in a specific situation still resides with the maintainer. This thesis describes the design and implementation of a restructuring tool for the Oberon language, a successor of Pascal and Modula-2, under the PC Native Oberon operating system. The process of creating an adequate abstraction of a program's structure, and its use to apply transformations and generate new source code, is investigated. Transformations can be divided into four classes: Scoping, Syntactic, Control flow and Abstraction transformations. The restructuring tool described in this thesis contains implementations from all four classes. Informal arguments regarding the correctness of each transformation are also presented.
/ AFRIKAANSE OPSOMMING: Die herstrukturering van programmatuur is daarop gemik om die struktuur van 'n program se bronkode te wysig. Hierdie strukturele veranderings dien in die algemeen as voorbereiding vir meer omvangryke onderhoudsaktiwiteite, soos byvoorbeeld die toevoeging van nuwe funksionaliteit of die korrigering van foute wat voorheen verskuil was. Die verandering in struktuur word teweeggebring deur die toepassing van transformasies op die bronkode. Programmatuur-ontwikkelaars voer dikwels sulke transformasies met die hand uit. Sulke optrede kan problematies wees indien 'n transformasie die funksionaliteit, in terme van programgedrag, van die programmatuur beïnvloed. Dit is moeilik om te verseker dat bogenoemde metode funksionaliteit sal behou; om dit te waarborg is so te sê onmoontlik. 'n Oplossing vir bogenoemde probleem is die outomatisering van die herstruktureringsproses deur die gebruik van gespesialiseerde programmatuur. Hierdie programmatuur is in staat om die nodige transformasies toe te pas en terselfdertyd funksionaliteit te waarborg. Die keuse vir die toepassing van 'n spesifieke transformasie lê egter steeds by die programmeerder. Hierdie tesis bespreek die ontwerp en implementering van programmatuur om bronkode, geskryf in Oberon (die opvolger van Pascal en Modula-2), te herstruktureer. Die skep van 'n voldoende abstrakte voorstelling van bronkode, die gebruik van sodanige voorstelling in die toepassing van transformasies en die reprodusering van nuwe bronkode, word bespreek. Transformasies kan in vier breë klasse verdeel word: Bestek, Sintaks, Kontrolevloei en Abstraksie. Die programmatuur wat ontwikkel is vir hierdie tesis bevat voorbeelde uit elkeen van die voorafgenoemde klasse. Informele argumente word aangebied om die korrektheid van die onderskeie transformasies te staaf.
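The scoping transformations this abstract describes can be sketched outside Oberon. The snippet below uses Python's `ast` module to apply a rename transformation that refuses to run when the new name would collide with an existing identifier, since such a collision could change behaviour. It is an illustrative stand-in for the idea, not the tool built in the thesis, and the sample function is made up.

```python
import ast

class RenameLocal(ast.NodeTransformer):
    """Rename every occurrence of one identifier in a parsed module.

    A simplified scoping transformation: behaviour is preserved only if
    the new name is not already in use, which the caller checks first.
    """
    def __init__(self, old, new):
        self.old, self.new = old, new

    def visit_Name(self, node):
        if node.id == self.old:
            node.id = self.new
        return node

def rename_in_source(src, old, new):
    tree = ast.parse(src)
    used = {n.id for n in ast.walk(tree) if isinstance(n, ast.Name)}
    if new in used:                      # a collision would change behaviour
        raise ValueError(f"name {new!r} already in use")
    return ast.unparse(RenameLocal(old, new).visit(tree))

print(rename_in_source("def f(x):\n    y = x + 1\n    return y", "y", "total"))
```

A full restructuring tool would of course also respect nested scopes and attribute names; the point here is only that the tool, not the maintainer, performs the check before applying the transformation.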
2

Minimization of symmetric difference finite automata

Muller, Graham
Thesis (MSc (Computer Science))--University of Stellenbosch, 2006. / The minimization of a Finite Automaton (FA) deals with the construction of an equivalent FA with the least number of states. Traditional FAs and their minimization are a well-defined and well-researched topic in the academic literature. Recently a generalized form of the FA, namely the generalized FA (*-FA), has been derived from these traditional FAs. This thesis investigates the minimization and reduction of one case of ...
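The classical DFA minimization this abstract starts from can be sketched as partition refinement: begin with accepting versus non-accepting states and keep splitting blocks until every pair of states in a block moves into the same blocks on every symbol. This is a generic illustration, not the *-FA construction the thesis develops.

```python
def minimize_dfa(states, alphabet, delta, start, accepting):
    """Return the state count of the minimal DFA equivalent to the input
    (assuming all states are reachable), via partition refinement."""
    partition = [b for b in (set(accepting), set(states) - set(accepting)) if b]
    changed = True
    while changed:
        changed = False
        new_partition = []
        for block in partition:
            # Group states by the tuple of blocks their transitions reach.
            groups = {}
            for s in block:
                key = tuple(
                    next(i for i, b in enumerate(partition) if delta[s][a] in b)
                    for a in alphabet
                )
                groups.setdefault(key, set()).add(s)
            new_partition.extend(groups.values())
            if len(groups) > 1:
                changed = True
        partition = new_partition
    return len(partition)

# Example: a 3-state DFA over {0, 1} in which states 1 and 2 are equivalent.
delta = {0: {"0": 1, "1": 2}, 1: {"0": 1, "1": 1}, 2: {"0": 2, "1": 2}}
print(minimize_dfa([0, 1, 2], ["0", "1"], delta, 0, {1, 2}))  # → 2
```

Hopcroft's algorithm does the same refinement in O(n log n); the quadratic version above keeps the invariant visible.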
3

Automated program generation: bridging the gap between model and implementation

Bezuidenhout, Johannes Abraham
Thesis (MSc)--University of Stellenbosch, 2007. / ENGLISH ABSTRACT: The general goal of this thesis is the investigation of a technique that allows model checking to be directly integrated into the software development process, preserving the benefits of model checking while addressing some of its limitations. A technique was developed that allows a complete executable implementation to be generated from an enhanced model specification. This included the development of a program, the Generator, that completely automates the generation process. In addition, it is illustrated how structuring the specification as a transition system formally separates the control flow from the details of manipulating data. This simplifies the verification process, which is focused on checking control flow in detail. By combining this structuring approach with automated implementation generation we ensure that the verified system behaviour is preserved in the actual implementation. An additional benefit is that data manipulation, which is generally not suited to model checking, is restricted to separate, independent code fragments that can be verified using verification techniques for sequential programs. These data manipulation code segments can also be optimised for the implementation without affecting the verification of the control structure. This technique was used to develop a reactive system, an FTP server, and this experiment illustrated that efficient code can be automatically generated while preserving the benefits of model checking. / AFRIKAANSE OPSOMMING: Hierdie tesis ondersoek ’n tegniek wat modeltoetsing laat deel uitmaak van die sagteware-ontwikkelingsproses, en sodoende betroubaarheid verbeter terwyl sekere tekortkominge van die tradisionele modeltoetsingsproses aangespreek word. Die tegniek wat ontwikkel is maak dit moontlik om ’n volledige uitvoerbare implementasie vanaf ’n gespesialiseerde modelspesifikasie te genereer.
Om die implementasie-generasie stap ten volle te outomatiseer is ’n program, die Generator, ontwikkel. Daarby word dit ook gewys hoe die kontrolevloei op ’n formele manier geskei kan word van data-manipulasie deur gebruik te maak van ’n staatoorgangsstelsel-struktureringsbenadering. Dit vereenvoudig die verifikasieproses, wat fokus op kontrolevloei. Deur dié struktureringsbenadering te kombineer met outomatiese implementasie-generasie, word verseker dat die geverifieerde stelsel se gedrag behou word in die finale implementasie. ’n Bykomende voordeel is dat data-manipulasie, wat gewoonlik nie geskik is vir modeltoetsing nie, beperk word tot aparte, onafhanklike kodesegmente wat geverifieer kan word deur gebruik te maak van verifikasietegnieke vir sekwensiële programme. Hierdie data-manipulasie kodesegmente kan ook geoptimeer word vir die implementasie sonder om die verifikasie van die kontrolestruktuur te beïnvloed. Hierdie tegniek word gebruik om ’n reaktiewe stelsel, ’n FTP-bediener, te ontwikkel, en dié eksperiment wys dat doeltreffende kode outomaties gegenereer kan word terwyl die voordele van modeltoetsing behou word.
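The separation this abstract describes — verified control flow in a transition system, data manipulation confined to separate fragments — can be sketched with a transition table driving a simple interpreter. The states and events below are hypothetical FTP-flavoured names; the thesis's Generator and its FTP-server case study are far more elaborate.

```python
# Control flow lives in a table a model checker could inspect exhaustively;
# data manipulation would be confined to small, separately verified actions.
TRANSITIONS = {
    ("idle", "login"): "authenticated",
    ("authenticated", "retr"): "transferring",
    ("transferring", "done"): "authenticated",
    ("authenticated", "quit"): "idle",
}

def run(events, state="idle"):
    """Drive the state machine over a sequence of events, returning the
    visited states. An event undefined in the current state raises KeyError,
    which corresponds to behaviour the model checker would have ruled out."""
    trace = [state]
    for ev in events:
        state = TRANSITIONS[(state, ev)]
        trace.append(state)
    return trace

print(run(["login", "retr", "done", "quit"]))
# → ['idle', 'authenticated', 'transferring', 'authenticated', 'idle']
```

Because the implementation is generated directly from the same table the checker verified, the control-flow properties established during verification carry over to the running code by construction.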
4

Random generation of finite automata over the domain of the regular languages

Raitt, Lesley Anne
Thesis (MSc)--University of Stellenbosch, 2007. / ENGLISH ABSTRACT: The random generation of finite automata over the domain of their graph structures is a well-known problem. However, random generation of finite automata over the domain of the regular languages has not been studied in such detail. Random generation algorithms designed for this domain would be useful for the investigation of the properties of the regular languages associated with the finite automata. We studied the existing enumerations and algorithms to randomly generate UDFAs and binary DFAs as they pertain to the domain of the regular languages. We evaluated the algorithms experimentally across the domain of the regular languages for small values of n and found the distributions non-uniform. Therefore, for UDFAs, we derived an algorithm for the random generation of UDFAs over the domain of the regular languages from Domaratzki et al.'s [9] enumeration of the domain of the regular languages. Furthermore, for binary DFAs, we concluded that for large values of n, the bijection method is a viable means of randomly generating binary DFAs over the domain of the regular languages. We also examined the random generation of union-UNFAs and -UNFAs across the domain of the regular languages. Our study of these UNFAs took all possible variables for the generation of UNFAs into account. The random generation of UNFAs over the domain of the regular languages is an open problem. / AFRIKAANSE OPSOMMING: Die ewekansige generasie van eindige toestand outomate (eto’s) oor die domein van hul grafiekstrukture is ’n bekende probleem. Nieteenstaande het die ewekansige generasie van eindige toestand outomate oor die domein van die regulêre tale nie soveel aandag gekry nie. Algoritmes wat eindige toestand outomate ewekansig genereer oor die domein van die regulêre tale sal nuttig wees om die ondersoek van die eienskappe van regulêre tale, wat met eto’s verbind is, te bewerkstellig.
Ons het die bestaande aftellings en algoritmes vir die ewekansige generasie van deterministiese eindige toestand outomate (deto’s) met een en twee alfabetsimbole bestudeer soos dit betrekking het op die domein van die regulêre tale. Ons het die algoritmes eksperimenteel beoordeel oor die domein van die regulêre tale vir outomate met min toestande en bevind dat die verspreiding nie eenvormig is nie. Daarom het ons ’n algoritme afgelei vir die ewekansige generasie van deto’s met een alfabetsimbool oor die domein van die regulêre tale van Domaratzki et al. [9] se aftelling. Bowendien, in die geval van deto’s met twee alfabetsimbole met ’n groot hoeveelheid toestande is die ‘bijeksie’-metode ’n goeie algoritme om te gebruik vir die ewekansige generasie van hierdie deto’s oor die domein van die regulêre tale. Ons het ook die ewekansige generasie van union-nie-deterministiese eindige toestand outomate en -nie-deterministiese eindige toestand outomate oor die domein van die regulêre tale bestudeer. Ons studie van hierdie neto’s het alle moontlike veranderlikes in ag geneem. Die ewekansige generering van neto’s oor die domein van die regulêre tale is ’n ope probleem.
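The baseline the thesis compares against — random generation over graph structures — can be sketched by drawing a uniformly random transition function and acceptance set. Uniformity over structures does not give uniformity over the regular languages those automata accept (many structures collapse to the same minimal DFA), which is the non-uniformity the experiments in the abstract measured. The sketch below is generic, not the enumeration-based algorithm derived in the thesis.

```python
import random

def random_dfa(n, k=2, seed=None):
    """Draw a complete DFA uniformly over transition structures:
    each of the n*k transitions and each acceptance bit is independent."""
    rng = random.Random(seed)
    delta = {s: {a: rng.randrange(n) for a in range(k)} for s in range(n)}
    accepting = {s for s in range(n) if rng.random() < 0.5}
    return delta, accepting

def accepts(delta, accepting, word, start=0):
    """Run the DFA over a word of symbols drawn from range(k)."""
    s = start
    for a in word:
        s = delta[s][a]
    return s in accepting

d, acc = random_dfa(4, seed=7)
print(accepts(d, acc, [0, 1, 0]))
```

Sampling many such DFAs, minimizing each, and tallying the resulting languages is the kind of experiment that exposes the skewed distribution over the language domain.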
5

Inductive machine learning bias in knowledge-based neurocomputing

Snyders, Sean
Thesis (MSc)--Stellenbosch University, 2003. / ENGLISH ABSTRACT: The integration of symbolic knowledge with artificial neural networks is becoming an increasingly popular paradigm for solving real-world problems. This paradigm, named knowledge-based neurocomputing, provides the means for using prior knowledge to determine the network architecture, to program a subset of weights to induce a learning bias which guides network training, and to extract refined knowledge from trained neural networks. The role of neural networks then becomes that of knowledge refinement. It thus provides a methodology for dealing with uncertainty in the initial domain theory. In this thesis, we address several advantages of this paradigm and propose a solution for the open question of determining the strength of this learning, or inductive, bias. We develop a heuristic for determining the strength of the inductive bias that takes the network architecture, the prior knowledge, the learning method, and the training data into consideration. We apply this heuristic to well-known synthetic problems as well as published difficult real-world problems in the domains of molecular biology and medical diagnosis. We found that not only do the networks trained with this adaptive inductive bias show superior performance over networks trained with the standard method of determining the strength of the inductive bias, but the refined knowledge extracted from these trained networks delivers more concise and accurate domain theories. / AFRIKAANSE OPSOMMING: Die integrasie van simboliese kennis met kunsmatige neurale netwerke word 'n toenemend gewilde paradigma om reële-wêreldse probleme op te los. Hierdie paradigma, genoem kennis-gebaseerde neurokomputasie, verskaf die vermoë om vooraf kennis te gebruik om die netwerkargitektuur te bepaal, om 'n subversameling van gewigte te programmeer om 'n leersydigheid te induseer wat netwerkopleiding lei, en om verfynde kennis van geleerde netwerke te kan ontsluit.
Die rol van neurale netwerke word dan dié van kennisverfyning. Dit verskaf dus 'n metodologie vir die behandeling van onsekerheid in die aanvangsdomeinteorie. In hierdie tesis adresseer ons verskeie voordele van hierdie paradigma en stel ons 'n oplossing voor vir die oop vraag om die gewig van hierdie leer-, of induktiewe, sydigheid te bepaal. Ons ontwikkel 'n heuristiek vir die bepaling van die induktiewe sydigheid wat die netwerkargitektuur, die aanvangskennis, die leermetode, en die data vir die leerproses in ag neem. Ons pas hierdie heuristiek toe op bekende sintetiese probleme sowel as op gepubliseerde moeilike reële-wêreldse probleme in die gebied van molekulêre biologie en mediese diagnostiek. Ons bevind dat nie alleenlik vertoon die netwerke wat geleer is met die adaptiewe induktiewe sydigheid superieure verrigting bo die netwerke wat geleer is met die standaardmetode om die gewig van die induktiewe sydigheid te bepaal nie, maar ook dat die verfynde kennis wat ontsluit is uit hierdie geleerde netwerke meer bondige en akkurate domeinteorieë lewer.
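The idea of programming a subset of weights from prior symbolic knowledge can be sketched in the classic KBANN style: a conjunctive rule becomes a unit whose programmed weights and bias make it fire only when all antecedents hold. This is the textbook construction the paradigm builds on, not the thesis's heuristic for tuning the bias strength; the weight magnitude `omega` is an assumed constant.

```python
import math

def rule_to_unit(n_antecedents, omega=4.0):
    """Encode 'conclusion :- a1, ..., an' as programmed weights and a bias:
    the bias puts the threshold between n-1 and n true antecedents."""
    weights = [omega] * n_antecedents
    bias = -(n_antecedents - 0.5) * omega
    return weights, bias

def activate(weights, bias, inputs):
    """Standard logistic unit over binary inputs."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

w, b = rule_to_unit(3)
print(activate(w, b, [1, 1, 1]))   # above 0.5: all antecedents hold, rule fires
print(activate(w, b, [1, 1, 0]))   # below 0.5: a missing antecedent blocks it
```

Because `omega` is finite, gradient training can still revise these programmed weights; the magnitude of `omega` is exactly the "strength of the inductive bias" whose adaptive choice the thesis investigates.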
6

Computer-controlled human body coordination

Hakl, Henry
Thesis (MSc)--University of Stellenbosch, 2003. / ENGLISH ABSTRACT: A need for intelligent robotic machines is identified. Research and experiments have focussed on stable, or relatively stable, dynamic simulated systems to demonstrate the feasibility of embedding advanced AI into dynamic physical systems. This thesis presents an attempt to scale the techniques to a dynamically highly unstable system - the coordination of movements in a humanoid model. Environmental simulation, articulated systems and artificial intelligence methods are identified as three essential layers for a complete and unified approach to embedding AI into robotic machinery. The history of the physics subsystem for this project is discussed, leading to the adoption of the Open Dynamics Engine as the physics simulator of choice. An approach to articulated systems is presented along with the EBNF of a hierarchical articulated system that was used to describe the model. A revised form of evolution is presented and justified. An AI model that makes use of this new evolutionary paradigm is introduced. A variety of AI variants are defined and simulated. The results of these simulations are presented and analysed. Based on these results recommendations for future work are made. / AFRIKAANSE OPSOMMING: Die beheer van dinamiese masjiene, soos intelligente robotte, is tans beperk tot fisies stabiele - of relatief stabiele - sisteme. In hierdie tesis word die tegnieke van kunsmatige intelligensie (KI) toegepas op die kontrole en beheer van 'n dinamies hoogs onstabiele sisteem: 'n Humanoïede model. Fisiese simulasie, geartikuleerde sisteme en kunsmatige intelligensie-metodes word geïdentifiseer as drie noodsaaklike vereistes vir 'n volledige en eenvormige benadering tot KI-beheer in robotte. Die implementasie van 'n fisiese simulator word beskryf, en 'n motivering vir die gebruik van die sogenaamde "Open Dynamics Engine" as fisiese simulator word gegee.
'n Benadering tot geartikuleerde sisteme word beskryf, tesame met die EBNF van 'n hiërargiese geartikuleerde sisteem wat gebruik is om die model te beskryf. 'n Nuwe interpretasie vir evolusie word voorgestel, wat die basis vorm van 'n KI-model wat in die tesis gebruik word. 'n Verskeidenheid van KI-variasies word gedefinieer en gesimuleer, en die resultate word beskryf en ontleed. Voorstelle vir verdere navorsing word gemaak.
7

A bandwidth market for traffic engineering in telecommunication networks

Combrink, J. J. (Jacobus Johannes)
Thesis (MSc)--University of Stellenbosch, 2004. / ENGLISH ABSTRACT: Traffic engineering determines the bandwidth allocation required to meet the traffic loads in a network. Similarly an economic market determines the resource allocation required to meet the demand for resources. The term bandwidth market denotes traffic engineering methods that use economic market methodology to determine the bandwidth allocation required to meet the traffic loads. A bandwidth market is an attractive traffic engineering method because of its distributed nature and ability to respond quickly to changes in network architecture or traffic loads. Network terminology is frequently used to define bandwidth markets. Our approach is to use the concepts of microeconomics to define a bandwidth market. The result is that our bandwidth markets are similar to economic markets, which is advantageous for applying economic principles correctly. This thesis presents the theoretical basis for two bandwidth markets. The first bandwidth market is a framework for building bandwidth markets. The second bandwidth market represents a society of cooperating individuals. The society distributes resources via a mechanism based on economic principles. An implementation of the bandwidth market is presented in the form of an optimisation algorithm, followed by its application to several test networks. We show that, in the test networks examined, the optimisation algorithm reduces the network loss. For all test networks, the network loss achieved by the optimisation algorithm compares well with the network loss achieved by a centralised optimisation algorithm. / AFRIKAANSE OPSOMMING: Verkeersingenieurswese bepaal die nodige bandwydtetoekenning om die verkeersvolume in 'n netwerk te dra. Op 'n soortgelyke wyse bepaal 'n ekonomiese mark die nodige hulpbrontoekenning om die aanvraag vir hulpbronne te bevredig. 
Die term bandwydtemark stel verkeersingenieurswesetegnieke voor wat ekonomiese-mark metodes gebruik om die bandwydtetoekenning vir die verkeersvolume in 'n netwerk te bepaal. 'n Bandwydtemark is 'n aantreklike verkeersingenieurswesetegniek omdat dit verspreid van aard is en vinnig kan reageer op veranderinge in netwerkargitektuur en verkeersvolume. Netwerkterminologie word gereeld gebruik om bandwydtemarkte te definieer. Ons benadering is om mikro-ekonomiese begrippe te gebruik om 'n bandwydtemark te definieer. Die resultaat is dat ons bandwydtemarkte soortgelyk aan ekonomiese markte is, wat voordelig is vir die korrekte toepassing van ekonomiese beginsels. Hierdie tesis lê die teoretiese grondwerk vir twee bandwydtemarkte. Die eerste bandwydtemark is 'n raamwerk vir die ontwikkeling van bandwydtemarkte. Die tweede bandwydtemark stel 'n vereniging van samewerkende individue voor. Die vereniging versprei bandwydte deur middel van 'n meganisme wat gebaseer is op ekonomiese beginsels. 'n Implementasie van hierdie bandwydtemark word voorgestel in die vorm van 'n optimeringsalgoritme, gevolg deur die toepassing van die optimeringsalgoritme op 'n aantal toetsnetwerke. Ons wys dat die bandwydtemark die netwerkverlies verminder in die toetsnetwerke. In terme van netwerkverlies vaar die bandwydtemark goed vergeleke met 'n gesentraliseerde optimeringsalgoritme.
8

On providing an efficient and reliable virtual block storage service

Esterhuyse, Eben
Thesis (MComm)--Stellenbosch University, 2001. / ENGLISH ABSTRACT: This thesis describes the design and implementation of a data storage service. Many clients can be served simultaneously in an environment where processes execute on different physical machines and communicate via message passing primitives. The service is provided by two separate servers: one that functions at the disk block level and another that maintains files. A prototype system was developed first in the form of a simple file store. The prototype served two purposes: (1) it extended the single-user Oberon system to create a multiuser system suitable to support group work in laboratories, and (2) it provided a system that could be measured to obtain useful data to design the final system. Clients access the service from Oberon workstations. The Oberon file system (known as the Ceres file system) normally stores files on a local disk. This system was modified to store files on a remote Unix machine. Heavily used files are cached to improve the efficiency of the system. In the final version of the system disk blocks are cached, not entire files. In this way the disks used to store the data are unified and presented as a separate virtual block service to be used by file systems running on client workstations. The virtual block server runs on a separate machine and is accessed via a network. The simplicity of the block server is appealing and should in itself improve reliability. The main concern is efficiency and the goal of the project was to determine whether such a design can be made efficient enough to serve its purpose. / AFRIKAANSE OPSOMMING: Hierdie tesis omskryf die ontwerp en implementasie van 'n data stoor diens. Verskeie gebruikers word bedien deur die diens wat funksioneer in 'n verspreide omgewing: 'n omgewing waar prosesse uitvoer op verskillende masjiene en met mekaar kommunikeer met behulp van boodskappe wat rondgestuur word.
Die diens word verskaf deur twee bedieners: die eerste wat funksioneer op 'n blokvlak en die ander wat lêers onderhou. 'n Prototipe lêerdiens is ontwikkel deur middel van 'n basiese lêerstoor. Die prototipe het twee funksies verrig: (1) die enkelgebruiker Oberon-stelsel is uitgebrei na 'n veelvoudige-gebruiker stelsel bruikbaar vir groepwerk in 'n laboratoriumomgewing, en (2) 'n stelsel is verskaf wat betroubare en akkurate data kon verskaf vir die ontwerp van die finale stelsel. Oberon-werkstasies word gebruik met die lêerdiens. Die Oberon-lêerstelsel (ook bekend as die Ceres-lêerstelsel) stoor normaalweg lêers op 'n lokale skyf. Hierdie bestaande stelsel is verander om lêers te stoor op 'n eksterne Unix-masjien. Lêers wat die meeste in gebruik is word in geheue aangehou vir effektiwiteitsredes. Die finale weergawe van die stelsel berg skyfblokke in geheue, nie lêers nie. Hierdie metode laat dit toe om data te stoor op 'n standaard metode, bruikbaar deur verskillende tipes lêerstelsels wat uitvoer op verskeie gebruikers se werkstasies. Die virtuele blokstoor voer uit op 'n aparte masjien en is bereikbaar via 'n netwerk. Die eenvoudige ontwerp van die diens is opsigself aanloklik en behoort betroubaarheid te verbeter. Die hoofbekommernis is effektiwiteit en die hoofdoel van die projek was om te bepaal of hierdie ontwerp effektief genoeg gemaak kon word.
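Caching heavily used disk blocks rather than whole files, as the final design in this abstract does, can be sketched with a bounded least-recently-used block cache in front of a slow backing store. This is an illustrative sketch, not the thesis's implementation; the backing store here is just a dictionary standing in for the remote block server.

```python
from collections import OrderedDict

class BlockCache:
    """Fixed-capacity read cache of disk blocks, evicting the least
    recently used block when full."""
    def __init__(self, capacity, read_block):
        self.capacity = capacity
        self.read_block = read_block       # backing-store read, e.g. a network call
        self.cache = OrderedDict()
        self.hits = self.misses = 0

    def get(self, block_no):
        if block_no in self.cache:
            self.hits += 1
            self.cache.move_to_end(block_no)    # mark as most recently used
        else:
            self.misses += 1
            if len(self.cache) >= self.capacity:
                self.cache.popitem(last=False)  # evict least recently used
            self.cache[block_no] = self.read_block(block_no)
        return self.cache[block_no]

store = {n: f"block-{n}" for n in range(10)}   # stand-in for the remote server
cache = BlockCache(2, store.__getitem__)
cache.get(1); cache.get(2); cache.get(1); cache.get(3)   # the last get evicts block 2
print(cache.hits, cache.misses)  # → 1 3
```

Caching at the block level keeps the server oblivious to file structure, which is what lets one virtual block service back several different client file systems.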
9

Resource management in IP networks

Wahabi, Abdoul Rassaki
Thesis (MSc)--University of Stellenbosch, 2001. / ENGLISH ABSTRACT: IP networks offer scalability and flexibility for rapid deployment of value-added IP services. However, with the increased demand and explosive growth of the Internet, carriers require a network infrastructure that is dependable, predictable, and offers consistent network performance. This thesis examines the functionality, performance and implementation aspects of the MPLS mechanisms to minimize the expected packet delay in MPLS networks. Optimal path selection and the assignment of bandwidth to those paths for minimizing the average packet delay are investigated. We present an efficient flow deviation algorithm (EFDA) which assigns a small amount of flow from a set of routes connecting each OD pair to the shortest path connecting the OD pair in the network. The flow is assigned in such a way that the network average packet delay is minimized. Bellman's algorithm is used to find the shortest routes between all OD pairs. The thesis studies the problem of determining the routes between an OD pair and assigning capacities to those routes. The EFDA algorithm iteratively determines the global minimum of the objective function. We also use the optimal flows to compute the optimal link capacities in both single- and multirate networks. The algorithm has been applied to several examples and to different models of networks. The results are used to evaluate the performance of the EFDA algorithm and compare the optimal solutions obtained with different starting topologies and different techniques. They all fall within a close cost-performance range. They are all within the same range from the optimal solution as well. / AFRIKAANSE OPSOMMING: IP-netwerke voorsien die skaleerbaarheid en buigsaamheid vir die vinnige ontplooiing van toegevoegde-waarde IP-dienste. Die vergrote aanvraag en eksplosiewe uitbreiding van die Internet benodig betroubare, voorspelbare en bestendige netwerkprestasie.
Hierdie tesis ondersoek die funksionaliteit, prestasie en implementering van die MPLS (multiprotokol-etiketskakel)-meganismes om die verwagte pakketvertraging te minimeer. Ons bespreek 'n doeltreffende algoritme vir vloei-afwyking (EFDA) wat 'n klein hoeveelheid vloei toewys uit die versameling van roetes wat elke OT (oorsprong-teiken)-paar verbind aan die kortste pad wat die OT-paar koppel. Die vloei word toegewys sodanig dat die netwerk se gemiddelde pakketvertraging geminimeer word. Bellman se algoritme word gebruik om die kortste roetes tussen alle OT-pare te bepaal. Die tesis bespreek die probleem van die bepaling van roetes tussen 'n OT-paar en die toewysing van kapasiteite aan sulke roetes. Die EFDA-algoritme bepaal die globale minimum iteratief. Ons gebruik ook optimale vloeie vir die berekening van die optimale skakelkapasiteite in beide enkel- en multikoersnetwerke. Die algoritme is toegepas op verskeie voorbeelde en op verskillende netwerkmodelle. Die skakelkapasiteite word aangewend om die prestasie van die EFDA-algoritme te evalueer en dit te vergelyk met die optimale oplossings verkry met verskillende aanvangstopologieë en tegnieke. Die resultate val binne klein koste-prestasie perke wat ook na aan die optimale oplossing lê.
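The shortest routes between OD pairs, which the abstract computes with Bellman's algorithm, can be sketched as Bellman-Ford relaxation over link weights. The four-node network and its weights below are made up; in the thesis's setting the weights would be (derivatives of) expected link delays.

```python
def bellman_ford(n, edges, source):
    """Shortest-path distances from source over directed edges (u, v, weight),
    assuming no negative cycles: n-1 rounds of relaxation suffice."""
    INF = float("inf")
    dist = [INF] * n
    dist[source] = 0.0
    for _ in range(n - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:   # relax edge u -> v
                dist[v] = dist[u] + w
    return dist

# Hypothetical 4-node network; weights stand in for expected link delay.
edges = [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 5.0), (2, 3, 1.0), (1, 3, 7.0)]
print(bellman_ford(4, edges, 0))  # → [0.0, 1.0, 3.0, 4.0]
```

A flow deviation step then shifts a small fraction of each OD pair's flow from its current routes onto the shortest path found here, recomputes the delays, and repeats until the average packet delay stops improving.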
10

Hidden Markov models for on-line signature verification

Wessels, Tiaan
Thesis (MSc)--University of Stellenbosch, 2002. / ENGLISH ABSTRACT: The science of signature verification is concerned with identifying individuals by their handwritten signatures. It is assumed that the signature as such is a unique feature amongst individuals and that its creation requires a substantial amount of hidden information, which makes it difficult for another individual to reproduce the signature. Modern technology has produced devices which are able to capture information about the signing process beyond what is visible to the naked eye. A dynamic signature verification system is concerned with utilizing not only visible, i.e. shape-related, information but also invisible, hidden dynamical characteristics of signatures. These signature characteristics need to be subjected to analysis and modelling in order to automate use of signatures as an identification metric. We investigate the applicability of hidden Markov models to the problem of modelling signature characteristics and test their ability to distinguish between authentic signatures and forgeries. / AFRIKAANSE OPSOMMING: Die wetenskap van handtekeningverifikasie is gemoeid met die identifisering van individue deur gebruik te maak van hulle persoonlike handtekening. Dit berus op die aanname dat 'n handtekening as sulks uniek is tot elke individu en die generering daarvan 'n genoegsame mate van verskuilde inligting bevat om die duplisering daarvan moeilik te maak vir 'n ander individu. Moderne tegnologie het toestelle tevoorskyn gebring wat die opname van eienskappe van die handtekeningproses buite die bestek van visuele waarneming moontlik maak. Dinamiese handtekeningverifikasie is gemoeid met die gebruik nie alleen van die sigbare manifestering van 'n handtekening nie, maar ook van die verskuilde dinamiese inligting daarvan om dit sodoende 'n lewensvatbare tegniek vir die identifikasie van individue te maak.
Hierdie sigbare en onsigbare eienskappe moet aan analise en modellering onderwerp word in die proses van outomatisering van persoonidentifikasie deur handtekeninge. Ons ondersoek die toepasbaarheid van verskuilde Markov-modelle tot die modelleringsprobleem van handtekeningkarakteristieke en toets die vermoë daarvan om te onderskei tussen egte en vervalste handtekeninge.
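HMM-based verification of the kind this abstract investigates typically scores a test signature's feature sequence under the claimed writer's model and accepts if the likelihood clears a threshold. The core computation is the forward algorithm, sketched below for a discrete-observation HMM with made-up two-state parameters; real systems use continuous pen-dynamics features and per-writer trained models.

```python
import math

def forward_loglik(pi, A, B, obs):
    """Log-likelihood of an observation sequence under a discrete HMM.
    pi[i]: initial state probs, A[i][j]: transition probs, B[i][o]: emission probs."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        alpha = [
            sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
            for j in range(n)
        ]
    return math.log(sum(alpha))

# Toy model: state 0 mostly emits symbol 0, state 1 mostly emits symbol 1.
pi = [0.9, 0.1]
A = [[0.8, 0.2], [0.2, 0.8]]
B = [[0.9, 0.1], [0.1, 0.9]]
genuine = forward_loglik(pi, A, B, [0, 0, 0])   # pattern the model expects
forgery = forward_loglik(pi, A, B, [1, 0, 1])   # pattern it does not
print(genuine > forgery)  # → True
```

In practice the scores are length-normalized and compared against a writer-specific threshold estimated from enrollment signatures; long sequences also call for scaling or log-space recursion to avoid underflow.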
