1

Towards a framework for trust negotiations in composite web services

Thomas, Anitta 08 1900 (has links)
Web Services propose a framework for the standardisation of interfaces and interaction, and for publishing software components as services on the Internet. By using this framework, composite services that make use of more than one Web service can be created. Although a composite Web service may provide a unified service to the service requestor, it cannot be viewed as a single unified entity when its trustworthiness is evaluated, since its constituent services may differ in their nonfunctional attributes. Based on the context of trust (which includes security, reliability, quality of service, etc.), information has to be collected from the constituent services of a composite Web service. Trust negotiation is performed to gather information from these services, with the ultimate goal of establishing trust relationships with them. In this dissertation, the Web Services framework is analysed to determine its support for trust negotiation in any trust context. A trust negotiation procedure is subsequently presented, which can be applied in a composite as well as an elementary Web service. / Computing / M.Sc. (Computer Science)
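As a loose illustration of the negotiation idea sketched in this abstract (not taken from the dissertation itself), the following Python fragment shows a requestor disclosing credentials to each constituent service of a composite until each service's disclosure policy is satisfied; the composite is trusted only if every constituent negotiation succeeds. All service names, policies and credentials are invented.

```python
# Hypothetical disclosure policies of the constituent services of one
# composite Web service. Names and required credentials are invented.
CONSTITUENTS = {
    "payment":  {"requires": {"pci_cert", "tls"}},
    "shipping": {"requires": {"tls"}},
}

REQUESTOR_CREDENTIALS = {"tls", "pci_cert", "business_license"}


def negotiate(policy, requestor_credentials):
    """Disclose credentials one at a time until the service's policy is
    satisfied; report whether negotiation succeeded and what was disclosed."""
    disclosed = set()
    for credential in requestor_credentials:
        if credential in policy["requires"]:
            disclosed.add(credential)          # incremental disclosure
        if policy["requires"] <= disclosed:    # policy satisfied, stop early
            return True, disclosed
    return False, disclosed


# The composite is trusted only if every constituent negotiation succeeds.
results = {name: negotiate(policy, REQUESTOR_CREDENTIALS)
           for name, policy in CONSTITUENTS.items()}
composite_trusted = all(ok for ok, _ in results.values())
print(results)
print("composite trusted:", composite_trusted)
```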
2

Practical applications of performance modelling of security protocols using PEPA

Zhao, Yishi January 2011 (has links)
Trade-off between security and performance has become an intriguing area in recent years in both the security and performance communities. As the security aspects of security protocol research are fully fledged, this thesis is devoted to conducting a performance study of these protocols. The long-term objective is to translate formal definitions of security protocols into formal performance models automatically, and then to analyse them with relevant techniques. In this thesis, we take a preliminary step by studying five typical security protocols, and exploring the methodology of constructing and analysing their models using the Markovian process algebra PEPA. Through these case studies, an initial framework for the performance analysis of security protocols is established. Firstly, a key distribution centre is investigated. The basic model suffers from the commonly encountered state space explosion problem, and so we apply some efficient solution techniques, which include model reduction techniques and ordinary differential equation (ODE) based fluid flow analysis. Finally, we evaluate a utility function for this secure key exchange model. Then, we explore two non-repudiation protocols. Mean value analysis has been applied here for a class of PEPA models, and it is compared with an ODE approximation. After that, an optimistic non-repudiation protocol with an off-line trusted third party is studied. The PEPA model has been formulated using a concept of multi-threaded servers with functional rates. The final case study is a cross-realm Kerberos protocol. A simplified technique of aggregation with an ODE approximation is performed to enable efficient analysis. All these modelling and analysis methods are illustrated through numerical examples.
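To give a flavour of the ODE-based fluid-flow analysis mentioned in this abstract, here is a minimal Python sketch (not drawn from the thesis) of a client population contending for a small pool of key-distribution servers. The model structure, rates and population sizes are illustrative assumptions; the min() term plays the role of the capacity constraint that a PEPA cooperation imposes.

```python
# Fluid-flow (ODE) approximation of N clients cycling between an idle
# state and a queue served by K servers. All parameters are invented.
import numpy as np
from scipy.integrate import solve_ivp

N, K = 200.0, 5.0          # client population, server pool size
lam, mu = 0.2, 3.0         # per-client request rate, per-server service rate

def fluid(t, y):
    idle, waiting = y
    served = mu * min(waiting, K)      # cooperation capacity: min of both sides
    return [served - lam * idle,       # clients returning to the idle state
            lam * idle - served]       # clients joining the queue

sol = solve_ivp(fluid, (0.0, 50.0), [N, 0.0], t_eval=np.linspace(0, 50, 11))
for t, w in zip(sol.t, sol.y[1]):
    print(f"t={t:5.1f}  waiting={w:7.2f}")
```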
3

Usable, secure and deployable graphical passwords

Dunphy, Paul Michael January 2013 (has links)
Evaluations of the usability and security of alphanumeric passwords and Personal Identification Numbers (PINs) have shown that users cannot remember credentials considered to be secure. However, the continued reliance upon these methods of user authentication has placed end-users and system designers in a coevolutionary struggle, with each defending competing concerns of usability and security. Graphical passwords have been proposed as an alternative, and their use is supported by cognitive theories such as the picture superiority effect, which suggest that pictures, rather than words or numbers, could provide a stronger foundation upon which to design usable and secure knowledge-based authentication. Indeed, early usability studies of novel systems harnessing this effect appear to show promise; however, the uptake of graphical passwords in real-world systems is low. This inertia is likely related to uncertainty regarding the challenges that novel systems might bring to the already delicate interplay between usability and security; particularly the new challenges faced in scaffolding user behaviours that comply with context-specific security policies, uncertainty regarding the nature of new socio-technical attacks, and the impact of images themselves upon usability and security. In this thesis we present a number of case studies incorporating new designs, empirical methods and results, that begin to explore these aspects of representative graphical password systems. Specifically, we explore: (i) how we can implicitly support security-focused behaviours such as choosing high entropy graphical passwords and defending against observation attack; (ii) how to capture the likely extent of insecure behaviour in the social domain, such as graphical password sharing and observation attack; and (iii) how, through the selection of appropriate properties of the images themselves, we can provide security and usability benefits. In doing so, we generate new insights into the potential of graphical passwords to provide usable, secure and deployable user authentication.
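As a hedged illustration of the "high entropy" concern raised in this abstract (not code from the thesis), the sketch below computes the Shannon entropy of a distribution of user image choices; skewed choices yield fewer effective bits than uniform choice over the same set. The image labels and counts are invented.

```python
# Shannon entropy of observed graphical-password choices. A skewed
# distribution (users favouring faces, say) lowers the effective entropy.
import math
from collections import Counter

# Hypothetical tally of which image users picked first from a grid.
choices = Counter({"face_1": 40, "face_2": 25, "landscape": 20,
                   "abstract_1": 10, "abstract_2": 5})

total = sum(choices.values())
entropy = -sum((n / total) * math.log2(n / total) for n in choices.values())
max_entropy = math.log2(len(choices))   # uniform choice over the same images

print(f"observed entropy: {entropy:.2f} bits "
      f"(uniform would give {max_entropy:.2f})")
```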
4

Advanced cryptographic system : design, architecture and FPGA implementation

Al-Gailani, Mohammed Falih January 2012 (has links)
The field programmable gate array (FPGA) is a powerful technology, and since its introduction broad prospects have opened up for engineers to creatively design and implement complete systems in various fields. One such area that has a long history in information and network security is cryptography, which is considered in this thesis. The challenge for engineers is to design secure cryptographic systems, which should work efficiently on different platforms with the levels of security required. In addition, cryptographic functionalities have to be implemented with acceptable degrees of complexity while demanding lower power consumption. The present work is devoted to the design of an efficient block cipher that meets contemporary security requirements, and to implementing the proposed design on a configurable hardware platform. The cipher has been designed according to Shannon's principles and modern cryptographic theorems. It is an iterated symmetric-key block cipher based on the substitution-permutation network and the number theoretic transform, with variable key length, block size and word length. These parameters can be undisclosed when determined by the system, making cryptanalysis almost impossible. The aim is to design a more secure, reliable and flexible system that can run as a ratified standard, with reasonable computational complexity for a sufficient service time. Analyses are carried out on the transforms concerned, which belong to the number theoretic transform family, to evaluate their diffusion power, and the results confirm good performance in this respect, mostly at a minimum of 50%. The new Mersenne number transform and the Fermat number transform were included in the design because their characteristics meet the basic requirements of modern cryptographic systems. A new 7×7 substitution box (S-box) is designed and its non-linear properties are evaluated, resulting in values of 2^-6 for the maximum difference propagation probability and 2^-2.678 for the maximum input-output correlation. In addition, these parameters are calculated for all S-boxes belonging to the previous and current standard algorithms. Moreover, three extra S-boxes are derived from the new S-box and another three from the current standard, preserving the same non-linear properties by reordering the output elements. The robustness of the proposed cipher in terms of differential and linear cryptanalysis is then considered, and it is proven that the algorithm is secure against such well-known attacks from round three onwards, regardless of block or key length. A number of test vectors are run to verify the correctness of the algorithm's implementation, and all results were promising. Tests included the known answer test, the multi-block message test, and the Monte Carlo test. Finally, efficient hardware architectures for the proposed cipher have been designed and implemented using the FPGA system generator platform. The implementations are run on the target device, a Xilinx Virtex 6 (XC6VLX130T-2FF484). Using a parallel loop-unrolling architecture, a high throughput of 44.9 Gbits/sec is achieved, with a power consumption of 1.83 W and 8030 slices for implementing the encryption module with key and block lengths of 16×7 bits. Outcomes vary when the cipher is implemented on different FPGA devices, as well as for different block and key lengths.
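The maximum difference propagation probability quoted in this abstract is a standard property read off an S-box's difference distribution table (DDT). The following Python sketch (not the thesis's code, and using a toy 3-bit S-box in place of the 7×7 design) shows how such a value is computed.

```python
# Build the difference distribution table of an S-box and report the
# worst-case (maximum) difference propagation probability.
import math
from itertools import product

SBOX = [3, 6, 1, 4, 7, 0, 5, 2]          # toy 3-bit permutation
n = len(SBOX)

# ddt[dx][dy] = number of inputs x with S(x) ^ S(x ^ dx) == dy
ddt = [[0] * n for _ in range(n)]
for x, dx in product(range(n), repeat=2):
    ddt[dx][SBOX[x] ^ SBOX[x ^ dx]] += 1

# Skip the trivial dx = 0 row; the rest gives the cryptanalytic bound.
worst = max(ddt[dx][dy] for dx in range(1, n) for dy in range(n))
print(f"max difference propagation probability: "
      f"{worst}/{n} = 2^{math.log2(worst / n):.2f}")
```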
5

Building a conceptual framework for analyzing, comparing and evaluating identity management systems

Chehab, Maya Ismail January 2011 (has links)
The thesis proposes a framework to objectively analyze, compare and evaluate identity management systems. The central focus is on the integration of all essential aspects of identity management into the evaluation process of an identity management system. The aspects reflect the generic requirements that organizations seek in the design of an identity management system in order to attain their objectives. These include attributes, binding, functionalities, trust, scope, assurance, privacy and usability. For each aspect, the framework analyzes and discusses the prominent role that the aspect plays in an identity management system, highlighting the potential advantages it may bring and emphasizing its influence on the security of information. A whole identity management system can thus be viewed as an assembly of the essential aspects, reflecting the potential benefits that the system may offer along with the effects on information security that it entails. In an identity management system, some aspects are viewed through the interaction of the core components of the system. An abstraction is given of the underlying architectures of identity management systems, allowing a clearer understanding of the features provided by the systems. Additionally, the framework views the trust inherent in identity management systems as a relationship that entails significant risk to information security. An approach to evaluating trust by measuring the risk it entails is provided in this framework. The measure can be used as input knowledge to aid analysis and decision-making about whether or not to trust. The framework also characterizes authentication assurance and measures it according to a scale that reflects the risk to the online services that are intended to be used. This gives service providers confidence that their critical services are only granted to users who have gone through stringent authentication mechanisms in the identity management system. Moreover, privacy and usability principles are customized to identity management systems in this framework. The principles of an aspect can be used as the foundation for criteria for evaluating that aspect in an identity management system. Finally, the framework introduces a model that presents every system as an integration of all the aspects of identity management, and uses a chart to depict the quality of each aspect in the system. By drawing such charts for multiple identity management systems, organizations can compare, contrast and choose with confidence the system that best fits their objectives.
6

Investigation into reduction and/or prevention of cybercrime victimisation among the user individuals

Askerniya, Imran January 2010 (has links)
Much of today's information is stored in digital form in computers, and information in digital form is much more easily stored, duplicated and manipulated. Businesses have to use information effectively in order to remain competitive, and storing information in digital form allows them to do so. As most computers are permanently connected to the Internet, an attacker can gain easy (unauthorised) access to a computer through the network. The contribution to knowledge that this research brings is the development of models for the reduction and prevention of cybercrime, an area in which little mainstream research exists. The model concentrates on the point of view of the victims: what makes each victim vulnerable, and how cybercrime prevention strategies should be adapted according to the unique characteristics of the victim. The aim of this research is to help minimise cybercrime victimisation in Russian cyberspace. It is worth mentioning that no such model exists; this work has therefore attempted to fill this significant gap in the research by proposing a four-dimensional grid model. The model is based on the existing traditional crime prevention model and situational crime prevention theory. The grid model provides three preventive strategies: education, training, and awareness. The model and these strategies were evaluated qualitatively and quantitatively by distributing questionnaires and interviewing Moscow cybercrime experts.
7

Investigating the factors that influence the effectiveness of E-commerce security context

Halaweh, Mohanad Ali January 2009 (has links)
Of all the factors restricting their engagement and participation in e-commerce, security is the main concern for customers and businesses alike. Existing research has dealt with the issue of e-commerce security by focusing separately on the customer or on the business, but little research has attempted to investigate both as a single phenomenon. This research considers the two perspectives jointly in order to establish a comprehensive viewpoint, and to reduce the gap between the solutions and security being implemented by organisations, on the one hand, and the perceptions of customers, on the other, thus allowing a more in-depth understanding of customers' needs and priorities to be taken into account by organisations. This research is undertaken in Jordan, an environment where no other research into security perception in e-commerce has taken place. The choice of the Jordanian context is justified because most existing research conducted in Jordan confirms the security concern in e-commerce and Internet banking without exploring the issue in depth. This barrier (i.e. security) makes both Jordanian organisations and individual customers hesitant to participate in e-commerce transactions, and thus reduces the growth of e-commerce. An interpretive case study method has been adopted, guiding a grounded theory analysis of the perceptions of customers and of business and IT personnel. This research, which is qualitative and subjective in nature, involves examining and identifying the meanings given by the participants in order to gain an understanding of the phenomenon under study. Qualitative research enables the researcher to understand the phenomenon in depth without being limited to predetermined hypotheses and factors defined in the literature, so that issues are allowed to emerge from the natural setting of the context (i.e. Jordan). Grounded theory procedures and techniques have been used to explore the influence of relevant factors on the effectiveness of e-commerce security and to identify the relationships between them. The empirical findings reveal seven factors: cooperative responsibility, implementation concerns, commitment of management, users' characteristics, social communication, the psychological aspect of security, and tangible and intangible features of security. Finally, a critical discussion of the research implications is presented, which extends the theoretical base of e-commerce security within the IS domain and provides broad insight for decision-makers and practitioners in Jordan.
8

Quantitative analysis of the release order of defensive mechanisms

Alsuhibany, Suliman Abdullah January 2014 (has links)
Dependency on information technology (IT) and computer and information security (CIS) has become a critical concern for many organizations. This concern centres on protecting the secrecy, confidentiality, integrity and availability of information. To address it, defensive mechanisms, which encompass a variety of services and protections, have been proposed to protect system resources from misuse. Most of these defensive mechanisms, such as CAPTCHAs and spam filters, rely in the first instance on a single algorithm. Attackers eventually break each mechanism, so each algorithm ultimately becomes useless and the system is no longer protected. Although a broken algorithm will be replaced by a new one, no prior work has shed light on a set of algorithms as a defensive mechanism. This thesis looks at a set of algorithms as a holistic defensive mechanism. Our hypothesis is that the order in which a set of defensive algorithms is released has a significant impact on the time taken by attackers to break the combined set. The rationale behind this hypothesis is that attackers learn from their attempts, and that the release schedule of defensive mechanisms can be adjusted so as to impair the learning process. To demonstrate the correctness of our hypothesis, an experimental study involving forty participants was conducted to evaluate the effect of algorithm order on the time taken to break the algorithms. In addition, this experiment explores how the learning process of attackers could be observed. The results showed that the order in which algorithms are released has a statistically significant impact on the time attackers take to break all of them. Based on these results, a model has been constructed using Stochastic Petri Nets, which facilitates theoretical analysis of the release-order approach. Moreover, a tailored optimization algorithm is proposed using a Markov Decision Process model, in order to efficiently obtain the optimal release strategy for any given model by maximizing the time taken to break the set of algorithms. As our hypothesis is based on the learning ability of attackers while interacting with the system, the Attacker Learning Curve (ALC) concept is developed. Based on empirical results of the ALC, an attack strategy detection approach is introduced and evaluated, achieving a detection success rate higher than 70%. The empirical findings of this detection approach provide a new understanding not only of how to detect the attack strategy used, but also of how to track the attack strategy through the probabilities of the classification results, which may provide an advantage in optimising the release order of defensive mechanisms.
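The hypothesis lends itself to a toy model. The sketch below, an invented illustration rather than the thesis's Petri-net or MDP formulation, assumes an attacker whose speed grows with every algorithm already broken, and enumerates release orders to find those that maximise and minimise the total break time. The difficulties and the learning rate are assumptions.

```python
# If the attacker speeds up after each break, the release order changes
# the total time to break the whole set. All numbers are invented.
from itertools import permutations

difficulties = {"captcha_v1": 5.0, "captcha_v2": 8.0,
                "filter_a": 3.0, "filter_b": 13.0}
LEARNING = 0.5   # each broken algorithm speeds the attacker up by 50%

def time_to_break_all(order):
    total, speed = 0.0, 1.0
    for name in order:
        total += difficulties[name] / speed   # time spent on this algorithm
        speed *= 1.0 + LEARNING               # attacker learns from the break
    return total

best = max(permutations(difficulties), key=time_to_break_all)
worst = min(permutations(difficulties), key=time_to_break_all)
print("slowest-to-break order:", best, f"{time_to_break_all(best):.1f}")
print("fastest-to-break order:", worst, f"{time_to_break_all(worst):.1f}")
```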
9

A fully distributed security service for ad hoc networks based on dynamic general access structures

Obi, Obowoware O. January 2009 (has links)
In this thesis, a design for fully distributed security support for self-initializing and self-organizing ad hoc networks, based on dynamic general access structures, is presented. The design is scalable and intrusion-tolerant, even against an adaptive adversary, and seamlessly supports networks with dynamically changing topology in terms of enrolment and de-enrolment. Generalizations of the Paillier cryptosystem and an on-line/off-line commitment scheme are developed in this thesis and, together with the required secret sharing concepts, yield new results in the two areas critical to the design: cryptography and secret sharing. The thesis employs a top-down approach to define the overall structure and the design components, and a bottom-up (evolutionary) approach to develop the design gradually, letting the experience gained at each stage inform the next stage of development. Finally, two open questions are resolved in the affirmative: whether a protocol can be obtained for generating RSA keys without a trusted dealer for general access structures, and whether other cryptosystems could be employed in the key distribution scheme.
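For readers unfamiliar with the cryptosystem being generalised, here is a minimal Python sketch of the textbook Paillier scheme, using g = n + 1 and deliberately tiny primes. It is an illustration only, not the thesis's generalisation to general access structures.

```python
# Textbook Paillier with g = n + 1. Real deployments need large random
# primes and a cryptographically secure source of randomness.
import math
import random

p, q = 293, 433                      # toy primes; far too small in practice
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)         # Carmichael function of n = p*q
mu = pow(lam, -1, n)                 # valid simplification when g = n + 1

def encrypt(m):
    r = random.randrange(1, n)       # r should be coprime to n (whp here)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    L = (pow(c, lam, n2) - 1) // n   # L(x) = (x - 1) / n
    return (L * mu) % n

m1, m2 = 42, 17
c1, c2 = encrypt(m1), encrypt(m2)
assert decrypt(c1) == m1
assert decrypt((c1 * c2) % n2) == (m1 + m2) % n   # additive homomorphism
print("Paillier round-trip and homomorphic addition OK")
```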
10

Vulnerability identification errors in security risk assessments

Taubenberger, Stefan January 2014 (has links)
At present, companies rely on information technology systems to achieve their business objectives, making them vulnerable to cybersecurity threats. Information security risk assessments help organisations to identify their risks and vulnerabilities. Accurate identification of risks and vulnerabilities is a challenge, because the input data is uncertain. So-called 'vulnerability identification errors' can occur if false positive vulnerabilities are identified, or if vulnerabilities remain unidentified (false negatives). 'Accurate identification' in this context means that all vulnerabilities identified do indeed pose a risk of a security breach for the organisation. An experiment performed with German IT security professionals in 2011 confirmed that vulnerability identification errors do occur in practice; in particular, false positive vulnerabilities were identified by participants. In information security (IS) risk assessments, security experts analyse the organisation's assets in order to identify vulnerabilities. Methods such as brainstorming, checklists, scenario analysis, impact analysis, and cause analysis (ISO, 2009b) are used to identify vulnerabilities. These methods use uncertain input data, because the probabilities, effects and losses of vulnerabilities cannot be determined exactly (Fenz and Ekelhart, 2011). Furthermore, business security needs are not considered properly; the security checklists and standards used to identify vulnerabilities do not consider company-specific security requirements (Siponen and Willison, 2009). In addition, the intentional behaviour of an attacker exploiting vulnerabilities for malicious purposes further increases the uncertainty, because predicting human behaviour means preparing for future attacks rather than merely cataloguing existing vulnerabilities and their consequences (Pieters and Consoli, 2009). As a result, current approaches determine risks and vulnerabilities under a high degree of uncertainty, which can lead to errors. This thesis proposes an approach to resolving vulnerability identification errors using security requirements and business process models. Security requirements represent the business security needs and determine whether any given vulnerability is a security risk for the business. The security requirements of information assets are evaluated in the context of the business process model, in order to determine whether security functions are implemented and operating correctly. Systems, personnel and physical parts of business processes, as well as IT processes, are considered in the security requirement evaluation, and the approach is validated in three steps. Firstly, the systematic procedure is compared to two best-practice approaches. Secondly, the accuracy of the risk results is compared to that of a best-practice risk-assessment approach, as applied to several real-world examples within an insurance company. Thirdly, the capability to determine risk more accurately by using business processes and security requirements is tested in a quasi-experiment with security professionals. This thesis demonstrates that risk assessment methods can benefit from explicit evaluation of security requirements in the business context during risk identification, in order to resolve vulnerability identification errors and to provide a decision criterion for security.
