91. Changing the way the world thinks about computer security. Ford, Sarah Gordon. January 2006.
Small changes in an established system can result in larger changes in the overall system (e.g. network effects, emergence, criticality, broken windows theory). However, in an immature discipline such as computer security, such changes can be difficult to envision and even more difficult to implement, as an immature discipline is likely to lack the scientific framework that would allow for the introduction of even minute changes. Cairns and Thimbleby (2003) describe three of the signs of an immature discipline as postulated by Kuhn (1970): (a) squabbles over what are legitimate tools for research, (b) disagreement over which phenomena are legitimate to study, and (c) inability to scope the domain of study. The research presented in this document demonstrates how the computer security field, at the time this research began, was the embodiment of these characteristics. It presents a cohesive analysis of the intentional introduction of a series of small changes chosen to aid the maturation of the discipline. In summary, it builds upon existing theory, exploring the combined effect of coordinated and strategic changes in an immature system and establishing a scientific framework by which the impact of the changes can be quantified. By critically examining the nature of the computer security system overall, this work establishes the need for both increased scientific rigor and a multidisciplinary approach to the global computer security problem. For these changes to take place, many common assumptions related to computer security had to be questioned; as the discipline was immature and controlled by relatively few entities, questioning the status quo was not without difficulty. For the discipline to mature, more feedback into the overall computer security (and in particular, the computer malware/virus) system was needed, requiring a shift from a mostly closed system to one forced to undergo greater scrutiny from various other communities. The input from these communities resulted in long-term changes and increased maturation of the system. Figure 1 illustrates the specific areas in which the research presented herein addressed these needs, provides an overview of the research context, and outlines the specific impact of the research, specifically the development of new and significant scientific paradigms within the discipline.

92. Counter intrusion software: malware detection using structural and behavioural features and machine learning. Rabaiotti, Joseph. January 2007.
Over the past twenty-five years, malicious software has evolved from a minor annoyance to a major security threat. Authors of malicious software are now more likely to be organised criminals than bored teenagers, and modern malicious software is more likely to be aimed at stealing data (and hence money) than trashing data. The arms race between malware authors and manufacturers of anti-malware software continues apace, but despite this, the majority of anti-malware solutions still rely on relatively old technology such as signature scanning, which works well enough in the majority of cases but which has long been known to be ineffective if signatures are not updated regularly. The need for regular updating means there is often a critical window between the publication of a flaw exploitable by malware and the distribution of the appropriate countermeasures or signatures, during which a user system is open to attack by hitherto unseen malware. The object of this thesis is to determine whether it is practical to use machine learning techniques to abstract generic structural or behavioural features of malware which can then be used to recognise hitherto unseen examples. Although a sizeable amount of research has been done on various ways in which malware detection might be automated, most of the proposed methods are burdened by excessive complexity. This thesis looks specifically at the possibility of using learning systems to classify software as malicious or non-malicious based on easily collectable structural or behavioural data. On the basis of the experimental results presented herein, it may be concluded that classification based on such structural data is certainly possible, and on behavioural data is at least feasible.
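To illustrate the general approach at its simplest, the sketch below trains a decision tree on a handful of invented structural features (section count, import table size, entry-point entropy) and uses it to classify an unseen sample. The features, values, and choice of scikit-learn are illustrative assumptions, not the thesis's actual pipeline.

```python
# A minimal sketch (not the thesis's actual method) of classifying
# executables as malicious or benign from easily collectable structural
# features, using a decision-tree learner from scikit-learn.
from sklearn.tree import DecisionTreeClassifier

# Hypothetical structural features per executable:
# [number of sections, size of import table, entry-point section entropy]
X_train = [
    [3, 120, 5.1],   # benign
    [4,  98, 4.8],   # benign
    [9,   2, 7.6],   # malicious (packed: few imports, high entropy)
    [8,   5, 7.9],   # malicious
]
y_train = ["benign", "benign", "malicious", "malicious"]

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X_train, y_train)

# Classify a previously unseen sample by its structural features alone.
print(clf.predict([[7, 3, 7.7]]))  # -> ['malicious']
```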

93. Advanced access control in support of distributed collaborative working and de-perimeterization. Burnap, Peter Richard. January 2009.
This thesis addresses the problem of achieving fine-grained and sustained control of access to electronic information shared in distributed collaborative environments. It presents an enhanced approach to distributed information security architecture, driven by the risks, guidelines and legislation emerging from the growth of collaborative working and the often associated increase in storage of information outside of a secured information system perimeter. Traditional approaches to access control are based on applying controls at or within the network perimeter of an information system. One issue with this approach when applied to shared information is that, outside of the perimeterized zone, the owner loses control of their information. This loss of control could dissuade collaborating parties from sharing their information resources. Information resources can be thought of as a collection of related content stored in a container. Another issue with current approaches to access control, particularly for unstructured resources such as text documents, is the coarse granularity of control they provide: controls can only apply to a resource in its entirety. In reality, the content within a resource could have varying levels of security requirements with different levels of control. For example, some of the content may be completely free from any access restriction, while other parts may be too sensitive to share outside of an internal organisation. The consequence is that the entire resource is restricted with the controls relevant to the highest level content, so a substantial amount of information that could feasibly be shared in collaborative environments is prevented from being shared because it is part of a highly restricted resource. The primary focus of this thesis is to address these two issues by investigating the appropriateness and capability of perimeter security and entire-resource protection to provide access control for information shared in collaborative distributed environments. To overcome these problems, the thesis develops an access control framework, based on which several formulae are defined to clarify the problems and allow them to be contextualised. The formulae are then developed and improved to create a potential solution, which has been implemented and tested to demonstrate that access control technology can be enhanced to drill down into the content of an information resource and apply more fine-grained controls, based on the security requirements of the content within. Furthermore, it is established that part of the controls that protect information resources within a secure network perimeter can be shifted to the body of the resources themselves, so that they become, to some extent, self-protecting. This enables the same controls to be enforced outside of the secure perimeter. The implementation is based on the structuring of information and embedding of metadata within the body of an information resource. The metadata effectively wraps sections of content within a resource into containers that define fine-grained levels of access control requirement, protecting confidentiality and integrity. The granularity afforded by this approach could be page, paragraph, line or even word level in a text document. Once metadata has been embedded, it is bound to a centrally controlled access control policy for the lifetime of the resource.
Information can then be shared, copied, distributed and accessed in support of collaborative working, while a link between the metadata and the centrally controlled policy is sustained, meaning that previously assigned access privileges to different sections of content can be modified or revoked at any time in the future. The result of this research is to allow information sharing to reach a greater level of acceptance and usage due to: (i) the enhanced level of access control made possible through finer-grained controls, allowing the content of a single resource to be classified and restricted at different levels, and (ii) the ability to retain sustained control over information through modifiable controls that can be enforced both while the information is stored on local information systems and after the information has been shared outside the local environment.
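As a rough illustration of the per-section control this enables, the sketch below wraps each section of a document in a metadata container carrying a classification label bound to a central policy. The labels, clearance model and data structures are invented for the example, not taken from the thesis's implementation.

```python
# A minimal sketch, under assumed data structures, of wrapping sections of
# a resource in metadata containers bound to a central policy, so that
# controls apply per section rather than to the whole document.

POLICY = {  # centrally controlled: can be modified or revoked at any time
    "public":       0,
    "internal":     1,
    "confidential": 2,
}

document = [  # each section carries its own access-control metadata
    {"label": "public",       "text": "Project overview ..."},
    {"label": "internal",     "text": "Timeline and staffing ..."},
    {"label": "confidential", "text": "Unpatched vulnerability details ..."},
]

def readable_sections(doc, clearance):
    """Return only the sections this clearance level may see."""
    level = POLICY[clearance]
    return [s["text"] for s in doc if POLICY[s["label"]] <= level]

# A collaborator with 'internal' clearance sees two of the three sections;
# the confidential part stays protected even after the document is shared.
print(readable_sections(document, "internal"))
```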

94. Towards fit for purpose security in high assurance and Agile environments. Finch, Linda. January 2009.
This thesis examines the problems that arise when military organisations and communities seek to exploit the opportunities provided by Network Enabled Capability for resource sharing and collaboration. The research shows that current methods of assessing security do not take account of the temporal nature of real communications, with their ad hoc and policy-driven connections. It is contended that this results in an overly conservative approach, and that explicit consideration (and implementation) of non-persistent and alternative connectivity would provide better assessments on which to base risk decisions for 'fit for purpose' security, enabling more flexible and responsive security in rapidly changing environmental conditions.

95. Agile security for Web applications. Ge, Xiaocheng. January 2007.
Web-based applications (or more concisely, Web applications) are a kind of information system with a particular architecture. They have progressively evolved from Internet browser-based, read-only information repositories to Web-based distributed systems. Today, increasing numbers of businesses rely on their Web applications. At the same time, Web applications are facing many security challenges and, as a result, are exposing businesses to many risks. This thesis proposes a novel approach to building secure Web applications using agile software development methods.

96. Distress detection. Vella, Mark Joseph. January 2012.
Web attacks pose a prime concern for cybersecurity, and whilst attackers are leveraging modern technologies to launch unpredictable attacks with serious consequences, web attack detectors are still restricted to the classical misuse and anomaly detection methods. As a result, web attack detectors have limited resilience to novel attacks or produce impractical amounts of daily false alerts. Advances in intrusion detection techniques have so far only partly alleviated the problem, as they are still tied to existing methods. This thesis proposes Distress Detection (DD), a detection method providing resilience to novel web attacks while suppressing false alerts. It is partly inspired by the workings of the human immune system, which is capable of responding to previously unseen infections. The premise is that, within the scope of an attack objective (the attack's end result), attack HTTP requests are associated with features that are necessary to reach that objective, rendering them suspicious. Their eventual execution must generate system events associated with the successful attainment of their objective, called the attack symptoms. Suspicious requests and attack symptoms are modelled on the generic signs of ongoing infection that enable the immune system to respond to novel infections; however, they are not exclusive to attacks. The suppression of false alerts is therefore left to an alert correlation process, based on the premise that attack requests can be distinguished from the rest through a link connecting their features with their consequent attack symptoms. The provision of novel attack resilience and false alert suppression is demonstrated through three prototype distress detectors, identifying DD as promising for effective web attack detection, despite some concerns about the difficulty of their implementation process.
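A minimal sketch of the correlation premise follows: generically suspicious requests are only raised as alerts when linked, here by a shared token, to a subsequent attack symptom. The feature tests, symptom format and linking rule are illustrative assumptions rather than the prototypes' actual mechanisms.

```python
# A minimal sketch of the correlation idea: a request is alerted on only
# when a generically suspicious feature can be linked to a subsequent
# attack symptom; suspicious-but-inconsequential requests are suppressed.
import re

def suspicious(request):
    """Generic signs of an injection-style objective (not a signature)."""
    return bool(re.search(r"(\.\./|<script|UNION\s+SELECT)", request, re.I))

def correlate(requests, symptoms):
    """Alert only when a suspicious request shares a token with an
    observed system event (the attack symptom)."""
    alerts = []
    for req in filter(suspicious, requests):
        for sym in symptoms:
            if any(tok and tok in sym for tok in re.split(r"\W+", req)):
                alerts.append((req, sym))
    return alerts

requests = ["GET /page?id=1 UNION SELECT passwd FROM users",
            "GET /search?q=<script>probe</script>"]   # second one fails
symptoms = ["db_read: table=users column=passwd"]      # observed event

print(correlate(requests, symptoms))  # only the first request is alerted
```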

97. Distributed protocols for digital signatures and public key encryption. Kuchta, Veronika. January 2016.
Distributed protocols allow a cryptographic scheme to distribute its operation among a group of participants (servers). This concept of cryptosystems was introduced by Desmedt [56]. We consider two different flavours of distributed protocols. One of them considers a distributed model with n parties, all of which are honest. The other allows up to t − 1 parties to be faulty; such cryptosystems are called threshold cryptosystems. The distribution of the cryptographic process is based on secret sharing techniques and is usually applicable to public-key cryptosystems. In this thesis we consider distributed protocols for digital signatures and public key encryption schemes. First we consider two flavours of digital signatures, aggregate signatures and multisignatures, and explore the uniqueness property of these constructions. We show that it gives rise to generic constructions of distributed verifiable unpredictable functions (DVUF), whose outputs can be made pseudorandom in the shared random string model using the techniques from [120]. This gives us the first generic construction of distributed verifiable random functions (DVRF) that do not impose assumptions on trusted generation of secret keys and whose outputs remain pseudorandom even in the presence of up to n − 1 corrupted servers. We provide a DVRF construction which follows immediately from the proof of uniqueness for the multisignature scheme [26]. We then consider blind signatures as another flavour of digital signatures, and propose the first standard-model construction of (re-randomizable) threshold blind signatures (TBS), where signatures can be obtained in a blind way through interaction with n signers, of which t are required to provide their signature shares. The stronger security notions for TBS schemes formalized in our work extend the definitions from [144] to the threshold setting. We further show how our TBS construction can be used to realize a distributed e-voting protocol following the template from [158] that guarantees privacy, anonymity, democracy, conjectured soundness and individual verifiability in the presence of distributed voting authorities. The important applications of distributed digital signatures, threshold e-voting and distributed e-cash, motivated us to consider the increasingly important techniques of cloud data storage. We realize the idea of distributed cloud data storage, which becomes possible as an application of threshold public key encryption with keyword search. First, we model the concept of Threshold Public Key Encryption with Keyword Search (TPEKS) and define its security properties: indistinguishability and consistency under chosen-ciphertext attacks. Our definition of indistinguishability includes protection against keyword guessing attacks, to which all single-server-based PEKS constructions have been shown to be vulnerable. We provide a transformation for obtaining secure TPEKS constructions from an anonymous Identity-Based Threshold Decryption (IBTD) scheme, following the conceptual idea behind the transformation from [2] for building PEKS from anonymous IBE. A concrete instantiation of a secure TPEKS scheme can be obtained from our direct anonymous IBTD construction, based on the classical Boneh-Franklin IBE [31], for which we prove security under the BDH assumption in the random oracle model.
Finally, we highlight the use of TPEKS schemes for better privacy and availability in distributed cloud storage and provide a comparison with the dual-server PEKS (DS-PEKS) [50] regarding the functionalities of both schemes.
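For readers unfamiliar with the underlying machinery, the sketch below shows Shamir secret sharing, the classical t-out-of-n technique on which such threshold cryptosystems are built. The parameters are toy values and the code is purely illustrative, not a construction from the thesis.

```python
# A minimal sketch of Shamir t-out-of-n secret sharing over a prime field.
# `random` is used for brevity; a real implementation needs a CSPRNG
# (e.g. the `secrets` module) and a field sized for the application.
import random

P = 2**127 - 1  # a Mersenne prime used as the field modulus

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P); pow(d, P-2, P) is the
    modular inverse of d, by Fermat's little theorem."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(123456789, t=3, n=5)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares -> 123456789
```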

98. Advances in string algorithms for information security applications. Aljamea, Mudhi Mohammed. January 2016.
This thesis focuses on introducing novel algorithms in information security by studying successful algorithms in bioinformatics and adapting them to solve open problems in information security. Although the problems in bioinformatics and information security are different, they can be considered very similar when it comes to identifying and solving them using stringology techniques. Several successful bioinformatics algorithms have been studied and adapted to solve different security problems, such as malware detection, biometrics and big data. Firstly, we present a dynamic computer malware detection model: a novel approach for detecting malware code embedded in different types of computer files, with consistency, accuracy and high speed, and without excessive memory usage. This model was inspired by REAL, an efficient read aligner used in next generation sequencing for processing biological data. In addition, we introduce a novel algorithmic approach to detect malicious URLs in image secret communications. Secondly, we focus on biometrics, specifically fingerprints, which are considered one of the most reliable and widely used techniques for identifying individuals. In particular, we introduce a new fingerprint matching technique, which matches fingerprint information using circular approximate string matching to solve the rotation problem, overcoming the obstacles of previous methods. Finally, we conclude with an algorithmic approach to analysing big data readings from smart meters to confirm some privacy concerns.
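The rotation-invariant matching idea can be conveyed with a short sketch: compare a pattern against every rotation of a circular text, tolerating up to k mismatches. The alphabet and tolerance are illustrative assumptions; real fingerprint matching operates on encoded minutiae rather than characters.

```python
# A minimal sketch of circular approximate string matching: the pattern is
# compared against every rotation of a circular text, tolerating up to k
# mismatches, which illustrates how rotation-invariant matching can
# sidestep fingerprint alignment.

def hamming(a, b):
    """Number of mismatching positions between equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def circular_match(pattern, text, k):
    """Return rotations of `text` matching `pattern` with <= k mismatches."""
    n = len(text)
    doubled = text + text  # every rotation is a length-n window of text+text
    return [(i, doubled[i:i + n]) for i in range(n)
            if hamming(pattern, doubled[i:i + n]) <= k]

# 'cdeab' is a rotation of 'abcde'; one substituted symbol is tolerated.
print(circular_match("cdeab", "abcdx", k=1))  # -> [(2, 'cdxab')]
```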

99. Applicability of neural networks to software security. Adebiyi, Adetunji B. January 2013.
Software design flaws account for 50% of software security vulnerabilities today. As attacks on vulnerable software continue to increase, the demand for secure software is also increasing, putting software developers under more pressure. This is especially true for those developers whose primary aim is to produce their software quickly under tight deadlines in order to release it onto the market early. While there are many tools focusing on implementation problems during the software development lifecycle (SDLC), these do not provide a complete solution to software security problems. Designing software with security in mind will therefore go a long way towards developing secure software. However, most of the current approaches used for evaluating software designs require the involvement of security experts, because many software developers lack the required expertise in making their software secure. In this research, the current approaches used to integrate security at the design level are discussed, and a new method of evaluating software designs using a neural network as the evaluation tool is presented. With the aid of the proposed neural network tool, this research found that software design scenarios can be matched to attack patterns that identify the security flaws in the design scenarios, and that the identified attack patterns can be matched to security patterns that can provide mitigation of the threat in the attack pattern.
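As a rough sketch of the evaluation idea, the example below trains a small neural network to map invented design-scenario feature vectors onto attack-pattern labels. The features, labels and network shape are assumptions for illustration, not the tool developed in the research.

```python
# A minimal sketch, with invented features and labels, of using a neural
# network to match software-design scenarios to known attack patterns.
# Real inputs would encode design artefacts, not three booleans.
from sklearn.neural_network import MLPClassifier

# Hypothetical design-scenario features:
# [unvalidated user input, builds SQL from strings, renders raw HTML]
X = [[1, 1, 0], [1, 0, 1], [0, 0, 0], [1, 1, 0], [0, 0, 1], [1, 0, 1]]
y = ["sql_injection", "xss", "none", "sql_injection", "none", "xss"]

net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=1)
net.fit(X, y)

# A new design scenario is matched to the attack pattern it most resembles,
# which in turn points to the mitigating security pattern.
print(net.predict([[1, 1, 1]]))
```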

100. Implementation and analysis of the generalised new Mersenne number transforms for encryption. Rutter, Nick. January 2015.
Encryption is a vast subject, covering myriad techniques to conceal and safeguard data and communications. Of the techniques available, methodologies that incorporate number theoretic transforms (NTTs) have gained recognition, specifically the new Mersenne number transform (NMNT). Recently, two new transforms have been introduced that extend the NMNT to a new generalised suite of transforms referred to as the generalised NMNT (GNMNT). These two new transforms are termed the odd NMNT (ONMNT) and the odd-squared NMNT (O2NMNT). Being based on the Mersenne numbers, the GNMNTs are extremely versatile with respect to vector lengths. The GNMNTs are also capable of being implemented using fast algorithms, employing multiple and combinational radices over one or more dimensions. Algorithms for both the decimation-in-time (DIT) and decimation-in-frequency (DIF) methodologies using radix-2, radix-4 and split-radix are presented, including their respective complexity and performance analyses. Whilst the original NMNT has seen a significant amount of encryption research applied to it, the ONMNT and O2NMNT can utilise similar techniques and are shown to have stronger characteristics when measured using established methodologies for defining diffusion. Diffusion in the GNMNTs is exhaustively assessed using a small but reasonably sized vector space, and a comparison with the Rijndael cipher, the current advanced encryption standard (AES) algorithm, is presented that confirms strong diffusion characteristics. Implementation techniques using general-purpose computing on graphics processing units (GPGPU) have been applied, and are further assessed and discussed. Focus is drawn to the future of cryptography, and in particular cryptology, as a consequence of the emergence and rapid progress of GPGPU and consumer-based parallel processing.
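To ground the terminology, the sketch below implements a radix-2 decimation-in-time (DIT) number theoretic transform over a small prime field, together with its inverse. This is the classical NTT family that the NMNT generalises, not the GNMNT itself; the modulus and root are toy values fixed for length-8 vectors.

```python
# A minimal sketch of a radix-2 DIT number theoretic transform (NTT) over
# GF(257), with its inverse. The NMNT's kernel and Mersenne-number
# arithmetic differ; this only illustrates the DIT butterfly recursion.

P = 257      # prime modulus with 8 | P - 1
OMEGA = 64   # a primitive 8th root of unity mod 257 (64**8 % 257 == 1)

def ntt(a, omega=OMEGA, p=P):
    """Recursive radix-2 DIT transform of a length-2^k vector."""
    n = len(a)
    if n == 1:
        return a
    # Half-length subtransforms use omega^2, a primitive (n/2)th root.
    even = ntt(a[0::2], omega * omega % p, p)
    odd = ntt(a[1::2], omega * omega % p, p)
    out, w = [0] * n, 1
    for i in range(n // 2):
        t = w * odd[i] % p
        out[i] = (even[i] + t) % p           # butterfly: top half
        out[i + n // 2] = (even[i] - t) % p  # butterfly: bottom half
        w = w * omega % p
    return out

def intt(a, p=P):
    """Inverse transform: use omega^-1 and scale by n^-1 mod p."""
    n_inv = pow(len(a), p - 2, p)
    return [x * n_inv % p for x in ntt(a, pow(OMEGA, p - 2, p), p)]

x = [1, 2, 3, 4, 5, 6, 7, 8]
print(intt(ntt(x)) == x)  # True: the transform is invertible
```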