781

Novel approaches to biometric security with an emphasis on liveness and coercion detection

Matthew, Peter January 2016 (has links)
Chapter One contains the introduction, sets the context and aims of the research, and highlights the original contribution to knowledge along with the publications arising from this research. Chapter Two will cover the methodological choice, which is grounded theory, as well as highlighting other potential methodologies that, while viable, have not been used. Chapter Three will present the background to biometric security and identify some of the key areas that are currently lacking, such as an appropriate way of measuring liveness detection techniques, as well as the entire coercion detection sub-discipline. Chapter Four discusses the development of a new taxonomy that classifies liveness detection while moving away from the ordinal measurement system currently used within the research area. Analysis of these liveness classifiers will then follow, leading into the adaptation of the taxonomy to coercion detection techniques. Further development of these new techniques will follow, identifying metrics for coercion detection and an analysis of the proposed classifiers. After the taxonomy development, Chapter Five analyses coercion and liveness techniques by applying the taxonomy across a selection of liveness and coercion techniques. This is followed by the development of an algorithm to denote the level of security an individual technique has achieved; an explanation of the algorithm's development, components and testing is included. Finally, Chapter Six contains the concluding remarks and covers some areas for future work, alongside some focuses for article and conference submission.
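The abstract does not reproduce the algorithm itself. As a purely illustrative sketch, and not the author's method, a security level for a technique could be computed as a weighted aggregate over taxonomy classifiers; the classifier names, weights and scale below are hypothetical.

```python
# Purely illustrative: a hypothetical weighted aggregate over taxonomy
# classifiers. The classifier names, weights and 0-1 scale are NOT from the
# thesis; they only show the general shape such a scoring algorithm might take.

CLASSIFIER_WEIGHTS = {          # hypothetical classifiers, each scored 0-1
    "involuntary_signal": 0.4,  # e.g. pulse or pupil response
    "randomised_challenge": 0.35,
    "sensor_independence": 0.25,
}

def security_level(scores: dict[str, float]) -> float:
    """Return a 0-1 security level as a weighted sum of classifier scores."""
    return sum(CLASSIFIER_WEIGHTS[name] * min(max(score, 0.0), 1.0)
               for name, score in scores.items()
               if name in CLASSIFIER_WEIGHTS)

# Example: a liveness technique that is strong on involuntary signals only.
print(security_level({"involuntary_signal": 0.9,
                      "randomised_challenge": 0.2,
                      "sensor_independence": 0.5}))  # ~0.555
```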
782

Using cultural familiarity for usable and secure recognition-based graphical passwords

Aljahdali, Hani Moaiteq January 2015 (has links)
Recognition-based graphical passwords (RBGPs) are a promising alternative to alphanumeric passwords for user authentication. The literature has presented several schemes aimed at finding the best types of pictures in terms of usability and security. This thesis contributes the positive use of cultural familiarity with pictures for usable and secure recognition-based graphical passwords in two different countries: Scotland and Saudi Arabia. This thesis presents an evaluation of a culturally-familiar graphical password scheme (CFGPS). This scheme is based on pictures that represent daily life in different cultures. Those pictures were selected from a database containing 797 pictures representing the cultures of 30 countries. This database was created as the first step in this thesis from the responses to 263 questionnaires. The evaluation of the scheme goes through five phases: registration phase, usability phase, security phase, interviews phase, and guidelines phase. In the registration phase, a web-based study was conducted to determine the impact of cultural familiarity on the choice of pictures for GPs. A large number of participants (Saudi and Scottish) registered their GPs. The results showed that users were highly affected by their culture when they chose pictures for their GPs; however, the Saudis were significantly more affected by their culture than the Scottish. This study showed developers the importance of having a selection of pictures that are as familiar as possible to users in order to create suitable GPs. In the usability phase, the participants were asked to log in with their GPs three months after the registration phase. The main results showed that the memorability rate for GPs consisting only of pictures belonging to participants' culture was higher than the memorability rate for GPs consisting of pictures that did not belong to participants' culture. However, there was no evidence of a cultural familiarity effect on login time. In the security phase, a within-subject user study was conducted to examine the security of culturally-familiar GPs against educated guessing attacks. This study was also the first attempt to investigate the risk of using personal information shared by users on social networks to guess their GPs. The results showed high guessability for CFGPs. The interviews phase evaluated the qualitative aspects of the CFGP scheme in order to improve its performance. In-depth interviews with the users of the scheme suggested guidelines for both developers and users to increase the usability and security of the scheme. Those guidelines are not exclusive to the culturally-familiar scheme, as they can be used for all RBGP schemes. Finally, as one of the instructions stated in the developers' guidelines, different challenge set designs were evaluated based on their cultural familiarity to users. The results showed high usability for the culturally-familiar challenge set, while the security target was met in the culturally-unfamiliar challenge set. To balance these two factors, following the user guidelines covered the weaknesses of both designs.
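For readers unfamiliar with recognition-based schemes, the sketch below shows the basic mechanic assumed throughout this abstract: the user registers a small set of pass-images and later picks them out of a grid that mixes them with decoys. The grid size, image names and round structure here are illustrative, not the thesis's parameters.

```python
import random

# Illustrative recognition-based graphical password round (not the thesis's
# exact scheme). Grid size and the example image names are made up.

def build_challenge(pass_images: list[str], decoys: list[str],
                    grid_size: int = 9) -> list[str]:
    """Return a shuffled grid containing one pass-image plus decoys."""
    target = random.choice(pass_images)
    grid = [target] + random.sample(decoys, grid_size - 1)
    random.shuffle(grid)
    return grid

def authenticate(pass_images: list[str], selections: list[str]) -> bool:
    """Succeed only if every selection is one of the user's pass-images."""
    return all(choice in pass_images for choice in selections)

# Example round: a user registered three culturally familiar pictures.
user_pass = ["mosque.jpg", "date_palm.jpg", "souq.jpg"]
pool = [f"decoy_{i}.jpg" for i in range(40)]
grid = build_challenge(user_pass, pool)
chosen = [img for img in grid if img in user_pass]  # what the genuine user picks
print(authenticate(user_pass, chosen))  # True
```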
783

Flow effect on thrombus formation in stenosed coronary arteries : a computational study

Papadopoulos, Konstantinos January 2015 (has links)
Thrombotic complications often occur in stenosed coronary arteries, causing permanent damage and even death. Although the exact factors and events leading to thrombus formation are not entirely known, the local rheological conditions, as well as each individual's tendency for blood clotting, are considered to play an important role both in the progress of the disease and in the related complications. Recently, a statistical correlation between the exact location of the stenosis and the evolution of the disease has been reported [1]. In this work we investigated whether this connection can be predicted by computational simulations. For this purpose, a simplified model for blood coagulation, focusing mainly on thrombin activation, was formulated in three steps. A phenomenological sub-model for thrombin generation was developed and calibrated, based on clinical tests (not laboratory experiments). The model proved capable of reproducing with acceptable accuracy the rate of thrombin generation for blood samples from individuals with different thrombogenic potential, including haemophilia cases. A second sub-model for platelet aggregation on a reacting surface was developed and calibrated in order to reproduce experimental data [2]. Finally, the whole coagulation model was adapted for application under flow conditions, based on the threshold behaviour of blood coagulation under flow with respect to wall shear rate and reacting-surface stream-wise length [3]. In order to test the developed models, three groups of left anterior descending (LAD) artery geometry models were constructed based on the location of the stenotic lesion. Each group consisted of geometries with different degrees of stenosis. Two of these groups (named MI1 and MI2) were statistically assessed as being at higher risk for complications, with the third (STA) considered safer. Transient flow simulations were performed for these three groups under representative coronary flow conditions. Flow was resolved by employing the incompressible Navier-Stokes equations for a Newtonian fluid, which are considered to describe blood motion in arteries with acceptable accuracy. Processing of the results showed that appropriate surface quantities can distinguish between high and low complication-risk cases. Based on the results of the healthy model and on previous works, a set of flow-based risk indices was proposed in order to distinguish, among arbitrary geometries, the ones that are more likely to lead to coronary artery disease complications. Finally, the developed coagulation model was applied in selected geometries (chosen on the basis of the flow simulation results) from the three groups, under the previously calculated pulsating blood flow conditions. The simulation results indicated that, in geometries with a higher degree of stenosis and higher wall shear values (regardless of the geometric group considered), the propagation of thrombin is slower. All MI2 models behaved similarly: low average thrombin concentration and production rate, with high thrombin concentration restricted to specific sites. In contrast, in the STA and MI1 models we could not identify a uniform pattern. In most cases, high-risk sites (with elevated concentration and production rate of thrombin) were found near the wall in the areas of the recirculation vortices formed after the stenotic lesion.
Although this study showed that computational simulations can be used for the assessment of stenosed LAD arteries, and probably coronary arteries in general, it also showed that, in order to obtain results that can be safely trusted for diagnosis, the method should be applied to a large number of real geometries from patients with known disease outcomes.
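For reference, the incompressible Navier-Stokes equations for a Newtonian fluid mentioned above take the standard textbook form below (velocity u, pressure p, density rho, kinematic viscosity nu); this is the generic formulation, not a statement of the thesis's boundary conditions or solver settings.

```latex
% Incompressible Navier-Stokes equations for a Newtonian fluid
% (standard form; boundary conditions and solver details are specific
%  to the thesis and not reproduced here).
\begin{aligned}
  \frac{\partial \mathbf{u}}{\partial t}
    + (\mathbf{u}\cdot\nabla)\mathbf{u}
    &= -\frac{1}{\rho}\nabla p + \nu \nabla^{2}\mathbf{u}, \\
  \nabla \cdot \mathbf{u} &= 0 .
\end{aligned}
```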
784

Constraint specification by example in a meta-CASE tool

Qattous, Hazem Kathem January 2011 (has links)
Meta-CASE tools offer the ability to specialise and customise diagram-based software modelling editors. Constraints play a major role in these specialisation and customisation tasks. However, constraint definition is complicated. This thesis addresses the problem of constraint specification complexity in meta-CASE tools. Constraint Specification by Example (CSBE), a novel variant of Programming by Example, is proposed as a technique that can simplify and facilitate constraint specification in meta-CASE tools. CSBE involves a user presenting visual examples of diagrams to the tool, which engages in a synergistic interaction with the user, based on system inference and additional user input, to arrive at the user's intended constraint. A prototype meta-CASE tool has been developed that incorporates CSBE. This prototype was used to perform several empirical studies to investigate the feasibility and potential advantages of CSBE. An empirical study was conducted to evaluate the performance, in terms of effectiveness, efficiency and user satisfaction, of CSBE compared to a typical form-filling technique. Results showed that users using CSBE correctly specified significantly more constraints and required less time to accomplish the task. Users reported higher satisfaction when using CSBE. A second empirical online study was conducted with the aim of discovering participants' preference for positive or negative natural language polarity when expressing constraints. Results showed that subjects preferred positive constraint expression over negative expression. A third empirical study aimed to discover the effect of example polarity (negative vs. positive) on the performance of CSBE. A multi-polarity tool offering both positive and negative examples achieved significantly higher correctness, in a significantly shorter time, and with significantly higher user satisfaction than a tool offering only one example polarity. A fourth empirical study examined user-based addition of new example types and inference rules into the CSBE technique. Results demonstrated that users are able to add example types and that performance is improved when they do so. Overall, CSBE has been shown to be feasible and to offer potential advantages compared to other commonly-used constraint specification techniques.
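The abstract does not spell out the inference machinery; as a hypothetical illustration of the by-example idea only, and not the tool's actual algorithm, the sketch below infers a maximum-cardinality constraint on edges from positive (accepted) and negative (rejected) example diagrams.

```python
# Hypothetical illustration of constraint inference from examples; this is
# NOT the CSBE tool's algorithm, only the general "by example" idea.
# Each example is the number of outgoing edges drawn from a node of some type,
# labelled as a valid (positive) or invalid (negative) diagram.

def infer_max_edges(positives: list[int], negatives: list[int]):
    """Guess 'a node may have at most N outgoing edges' from examples.

    Returns the largest N consistent with all examples, or None if the
    examples cannot be explained by a simple maximum-cardinality rule.
    """
    limit = max(positives)              # must allow every positive example
    if any(n <= limit for n in negatives):
        return None                     # a rejected diagram would be allowed
    return limit

# Positive examples: diagrams with 1, 2 or 3 edges were accepted by the user;
# negative example: a diagram with 4 edges was rejected.
print(infer_max_edges([1, 2, 3], [4]))  # -> 3
print(infer_max_edges([1, 3], [2]))     # -> None (not a simple max rule)
```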
785

Sound for the exploration of space physics data

Diaz Merced, Wanda Liz January 2013 (has links)
Current analysis techniques for space physics 2D numerical data are based on scrutinising the data with the eyes. Space physics data sets acquired from the natural laboratory of the interstellar medium may contain events that are masked by noise, making them difficult to identify. This thesis presents research on the use of sound as an adjunct to current data visualisation techniques to explore, analyse and augment signatures in space physics data. This research presents a new sonification technique to decompose a space physics data set into different components of interest (frequency, oscillatory modes, etc.), and its use as an adjunct to data visualisation to explore and analyse space science data sets which are characterised by non-linearity (a system which does not satisfy the superposition principle, or whose output is not proportional to its input). Integrating aspects of multisensory perceptualization and human attention mechanisms, the question addressed by this dissertation is: does sound, used as an adjunct to current data visualisation, augment the perception of signatures in space physics data masked by noise? To answer this question, the following additional questions had to be answered: a) Is sound used as an adjunct to visualisation effective in increasing sensitivity to signals occurring at attended, unattended or unexpected locations, extended in space, when the signal occurs in the presence of a dynamically changing competing cognitive load (noise) that makes the signal visually ambiguous? b) How can multimodal perceptualization (sound as an adjunct to visualisation) and attention control mechanisms be combined to help allocate attention to identify visually ambiguous signals? One aim of these questions is to investigate the effectiveness of using sound together with a visual display to increase sensitivity to signal detection in the presence of visual noise in the data, as compared to a visual display only. Radio, particle, wave and high-energy data are explored using a sonification technique developed as part of this research. The sonification technique developed as part of this research, its application and results are numerically validated and presented. This thesis presents the results of three experiments and of a training experiment. In all four experiments, the volunteers used sound as an adjunct to data visualisation to identify changes in graphical visual and audio representations, and these results are compared with those of using audio rendering only and visual rendering only. In the first experiment, audio rendering did not result in significant benefits when used alone or with a visual display. In the second and third experiments, audio as an adjunct to visual rendering became significant when a fourth cue was added to the spectra. The fourth cue consisted of a red line sweeping across the visual display at the rate the sound was played, to synchronise the audio and visual presentations. The results prove that a third congruent multimodal stimulus in synchrony with the sound helps space scientists identify events masked by noise in 2D data. Results of the training experiments are reported.
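The abstract does not describe the mapping used. As a generic parameter-mapping sonification sketch, and not the author's technique, a 1-D data series can be rendered by mapping sample values to pitch; the frequency range, note duration and sample rate below are arbitrary choices.

```python
import wave
import numpy as np

# Generic parameter-mapping sonification sketch (NOT the thesis's technique):
# each data sample is mapped to a pitch between f_min and f_max and played
# for a fixed duration. Frequency range, duration and rate are arbitrary.

def sonify(data, f_min=220.0, f_max=880.0, note_s=0.05, rate=44100):
    data = np.asarray(data, dtype=float)
    span = float(data.max() - data.min()) or 1.0
    norm = (data - data.min()) / span                     # scale to 0..1
    freqs = f_min + norm * (f_max - f_min)                # value -> pitch
    t = np.linspace(0, note_s, int(rate * note_s), endpoint=False)
    tones = [np.sin(2 * np.pi * f * t) for f in freqs]
    signal = np.concatenate(tones)
    return np.int16(signal / np.abs(signal).max() * 32767)

# Example: a noisy series with a buried oscillation, written to a WAV file.
x = np.random.normal(0, 1, 400) + np.sin(np.linspace(0, 20 * np.pi, 400))
pcm = sonify(x)
with wave.open("sonified.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)          # 16-bit samples
    w.setframerate(44100)
    w.writeframes(pcm.tobytes())
```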
786

The information rate and other parameters of probabilistic context free grammars and their parsers

Kestner, Simon January 1974 (has links)
Probabilistic context-free languages are defined by giving predetermined probabilities (preprobabilities) for the choices that their grammars make when generating. Chapter 1 shows how to carry out the above definition, and how to calculate some parameters of the language; for instance: average length of word, mean square length, digraph probabilities, entropy. Chapter 2 introduces generating functions related to grammars. It uses them to derive a condition for which preprobabilities give rise to well-formed probability spaces. Two functions, the length and entropy generating functions, are studied in detail. They are algebraic functions, can in general only be defined implicitly, but can be used to give unified explicit methods of calculating all the parameters of Chapter 1 (and more). Chapter 3 defines and shows how to calculate the information rate of a language. As a by-product, McMillan's theorem is extended (for a small class of processes) to an analogue of the Central Limit Theorem. Chapter 4 tries to compare the efficiencies of different parsing algorithms. In a reasonable sense, all deterministic parsers take equal average time to parse, any backtracking parser is slower, but there is no general algorithm for calculating the speed of a backtracking parser.
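As a small illustration of one of the parameters mentioned above (not taken from the thesis), the average word length of a probabilistic context-free grammar satisfies the expectation equations E[A] = sum over rules A -> rhs of p * (number of terminals in rhs + sum of E[B] for nonterminals B in rhs), which for a subcritical grammar can be solved by fixed-point iteration. The grammar below is a made-up example.

```python
# Illustrative only: average word length of a tiny PCFG by fixed-point
# iteration on E[A] = sum_r p_r * (terminals in r + sum of E[B] for B in r).

grammar = {
    # nonterminal: list of (probability, right-hand side as a list of symbols)
    "S": [(0.4, ["a", "S", "b"]),   # S -> a S b  with probability 0.4
          (0.6, ["c"])],            # S -> c      with probability 0.6
}

def expected_lengths(grammar, iters=200):
    nt = set(grammar)                           # the nonterminal symbols
    E = {A: 0.0 for A in grammar}
    for _ in range(iters):                      # fixed-point iteration
        E = {A: sum(p * (sum(1 for s in rhs if s not in nt)
                         + sum(E[s] for s in rhs if s in nt))
                    for p, rhs in rules)
             for A, rules in grammar.items()}
    return E

print(expected_lengths(grammar)["S"])  # ~2.333 = (0.4*2 + 0.6*1) / (1 - 0.4)
```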
787

Evaluating computer-based teaching and learning situations : theoretical approaches to TILT-E's work

Henderson, Fiona P. January 1999 (has links)
A methodology and a framework for evaluating computer-based learning were produced by the TLTP Teaching with Independent Learning Technologies (TILT) Project's dedicated Evaluation Group, known as TILT-E, of which the author was a member. TILT-E's evaluation work was the result of over twenty evaluation case studies conducted by the group from 1993 to 1998, the majority of which were performed during 1994 and 1995. The TILT-E studies can be viewed as examples of good practice in the evaluation of a range of computer-based teaching situations. While TILT-E's method and measures provide a useful framework for evaluating computer-based learning, they did not fully exploit the strengths and history of the theoretical approaches underpinning the TILT-E work. In fact, research traditions are seldom mentioned in the TILT-E literature, yet they are central not only to understanding the value of the TILT-E work, but also to assisting future research in the search for a model of the computer-based teaching and learning situation. By examining the evolution of the TILT-E methodologies through three of the first TILT-E studies, referred to in this thesis as the Pilot Studies, and then assessing several later case studies, much is learnt about the need for a pluralist approach to evaluation in the computer-based teaching and learning context. While TILT-E advocates the use of different methods, the group failed to justify this approach and to recommend when and why such a mix would be appropriate. This thesis aims to rectify this imbalance through the detailed examination of eight evaluation episodes covering three different computer-based teaching and learning situations, all of which were carried out by the author. Firstly, the Fast Frac case study is considered, which involved evaluating the replacement of a lecture with the Fast Frac software. The Fast Frac study consisted of three evaluation episodes over a period of four years. The study found that the package could replace the lecture, and noted not only that a comparative design does not necessarily constrain the researcher to empirical methods alone, but also that such an approach need not disadvantage the students in either the control or the experimental groups.
788

Using program behaviour to exploit heterogeneous multi-core processors

McIlroy, Ross January 2010 (has links)
Multi-core CPU architectures have become prevalent in recent years. A number of multi-core CPUs consist not only of multiple processing cores, but of multiple different types of processing core, each with different capabilities and specialisations. These heterogeneous multi-core architectures (HMAs) can deliver exceptional performance; however, they are notoriously difficult to program effectively. This dissertation investigates the feasibility of ameliorating many of the difficulties encountered in application development on HMA processors by employing a behaviour-aware runtime system. This runtime system provides applications with the illusion of executing on a homogeneous architecture, by presenting a homogeneous virtual machine interface. The runtime system uses knowledge of a program's execution behaviour, gained through explicit code annotations, static analysis or runtime monitoring, to inform its resource allocation and scheduling decisions, such that the application makes best use of the HMA's heterogeneous processing cores. The goal of this runtime system is to enable non-specialist application developers to write applications that can exploit an HMA, without the developer requiring in-depth knowledge of the HMA's design. This dissertation describes the development of a Java runtime system, called Hera-JVM, aimed at investigating this premise. Hera-JVM supports the execution of unmodified Java applications on both processing core types of the heterogeneous IBM Cell processor. An application's threads of execution can be transparently migrated between the Cell's different core types by Hera-JVM, without requiring the application's involvement. A number of real-world Java benchmarks are executed across both of the Cell's core types, to evaluate the efficacy of abstracting a heterogeneous architecture behind a homogeneous virtual machine. By characterising the performance of each of the Cell processor's core types under different program behaviours, a set of influential program behaviour characteristics is uncovered. A set of code annotations is presented, which enables program code to be tagged with these behaviour characteristics, so that a runtime system can track a program's behaviour throughout its execution. This information is fed into a cost function, which Hera-JVM uses to automatically estimate whether the executing program's threads of execution would benefit from being migrated to a different core type, given their current behaviour characteristics. The use of history, hysteresis and trend tracking by this cost function is explored as a means of increasing its stability and limiting detrimental thread migrations. The effectiveness of a number of different migration strategies is also investigated under real-world Java benchmarks, with the most effective found to be a strategy that can target code, such that a thread is migrated whenever it executes this code. This dissertation also investigates the use of runtime monitoring to enable a runtime system to automatically infer a program's behaviour characteristics, without the need for explicit code annotations. A lightweight runtime behaviour monitoring system is developed, and its effectiveness at choosing the most appropriate core type on which to execute a set of real-world Java benchmarks is examined. Combining explicit behaviour characteristic annotations with characteristics monitored at runtime is also explored.
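The cost function itself is not reproduced in this abstract. As a hypothetical sketch of the general idea, and not Hera-JVM's actual function, a migration decision could weigh per-core-type run-time estimates derived from behaviour counters, with hysteresis so that a thread only moves when the predicted benefit clearly exceeds the migration cost; the characteristics, weights and thresholds below are invented.

```python
# Hypothetical sketch of a behaviour-based migration decision with hysteresis.
# NOT Hera-JVM's cost function; behaviour names, weights and thresholds are
# invented for illustration only.

# Estimated relative cost per unit of each behaviour on each core type.
CORE_COSTS = {
    "general_purpose": {"branchy_code": 1.0, "vector_arithmetic": 1.0},
    "accelerator":     {"branchy_code": 2.5, "vector_arithmetic": 0.3},
}
MIGRATION_COST = 50.0   # fixed penalty for moving a thread (arbitrary units)
HYSTERESIS = 1.2        # require a 20% predicted improvement before moving

def predicted_cost(core: str, behaviour: dict) -> float:
    """Weighted sum of observed behaviour counters under a core's cost model."""
    return sum(CORE_COSTS[core][k] * v for k, v in behaviour.items())

def should_migrate(current: str, other: str, behaviour: dict) -> bool:
    here = predicted_cost(current, behaviour)
    there = predicted_cost(other, behaviour) + MIGRATION_COST
    return here > there * HYSTERESIS

# A thread dominated by vector arithmetic, currently on the general-purpose core.
counters = {"branchy_code": 100.0, "vector_arithmetic": 900.0}
print(should_migrate("general_purpose", "accelerator", counters))  # True
```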
Finally, an initial investigation is performed into the use of behaviour characteristics to improve application performance under a different type of heterogeneous architecture, specifically, a non-uniform memory access (NUMA) architecture. Thread teams are proposed as a method of automatically clustering communicating threads onto the same NUMA node, thereby reducing data access overheads. Evaluation of this approach shows that it is effective at improving application performance, if the application's threads can be partitioned across the available NUMA nodes of a system. The findings of this work demonstrate that a runtime system with a homogeneous virtual machine interface can reduce the challenge of application development for HMA processors, whilst still being able to exploit such a processor by taking program behaviour into account.
789

WISPA : a system for word identification in speech by phonetic analysis

Yardley, John P. January 1981 (has links)
No description available.
790

Mass participation user trials

McMillan, Donald Calum January 2012 (has links)
This thesis investigates how researchers can take advantage of the rapid adoption of mobile technology, which has brought with it transformations in social and cultural practice: the expectations of what computers are, what they can do, and the role of digital objects in everyday life. In particular, this thesis presents and discusses the use of new App Store-style software distribution methods, which reduce the cost, in terms of researcher time and hardware, of recruiting a large group of participants for a trial 'in the wild' while increasing the potential diversity of users, and which are becoming an attractive option for researchers pursuing the ubicomp vision. It examines the procedures for running large-scale trials through the deployment of three applications, released to a combined user base of over 135,000, in such a way as to keep the qualitative detail necessary to inform design while gaining the diversity of users needed for claims of generalisability. More generally, it discusses the results that can be expected from this 'mass participation' approach, and the ethical responsibilities it places upon researchers. The contributions of this thesis for mobile HCI show that in large-scale trials relatively rich qualitative data can be collected along with substantial quantitative data, and that a hybrid trial methodology combining a large-scale deployment with a local trial can be a powerful tool in addressing the shortcomings of trials that are either solely local or solely global. This thesis also contributes guidelines for researchers running large-scale user trials that give consideration to established research norms and practices, in an attempt to strike a new balance between invasiveness and utility.
