131

GPF : a framework for general packet classification on GPU co-processors / GPU Packet Filter : framework for general packet classification on Graphics Processing Unit co-processors

Nottingham, Alastair January 2012
This thesis explores the design and experimental implementation of GPF, a novel protocol-independent, multi-match packet classification framework. The framework is targeted and optimised for flexible, efficient execution on NVIDIA GPU platforms through the CUDA API, but should not be difficult to port to other platforms, such as OpenCL, in the future. GPF was conceived and developed in order to accelerate the classification of large packet capture files, such as those collected by network telescopes. It uses a multiphase SIMD classification process which exploits both the parallelism of packet sets and the redundancy in filter programs in order to classify packet captures against multiple filters at extremely high rates. The resultant framework - composed of classification, compilation and buffering components - efficiently leverages GPU resources to classify arbitrary protocols and return multiple filter results for each packet. The classification functions described were verified and evaluated by testing an experimental prototype implementation against several filter programs of varying complexity, on devices from three GPU platform generations. In addition to the significant speedup achieved in processing results, analysis indicates that the prototype classification functions perform predictably and scale linearly with respect to both packet count and filter complexity. Furthermore, classification throughput (packets/s) remained essentially constant regardless of the underlying packet data; the effective data rate achieved with a particular filter was thus heavily influenced by the average size of the packets in the processed capture. For example, in the trivial case of classifying all IPv4 packets ranging in size from 70 bytes to 1KB, the observed data rate achieved by the GPU classification kernels ranged from 60Gbps to 900Gbps on a GTX 275, and from 220Gbps to 3.3Tbps on a GTX 480. In the less trivial case of identifying all ARP, TCP, UDP and ICMP packets for both the IPv4 and IPv6 protocols, the effective data rates ranged from 15Gbps to 220Gbps (GTX 275) and from 50Gbps to 740Gbps (GTX 480), for 70B and 1KB packets respectively.
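The packet-size dependence of the effective data rate reported above follows directly from a near-constant classification rate in packets per second. A minimal Python sketch of that arithmetic (the ~400 million packets/s figure is back-derived from the quoted GTX 480 results, not a number taken from the thesis):

```python
# Effective data rate = packet rate (packets/s) x packet size (bits).
# If the classifier sustains a near-constant packet rate, the data rate
# scales linearly with average packet size, as the abstract reports.

def effective_rate_gbps(packets_per_second: float, avg_packet_bytes: float) -> float:
    """Convert a classification rate into an effective data rate in Gbps."""
    return packets_per_second * avg_packet_bytes * 8 / 1e9

# ~400M packets/s is back-derived from the quoted GTX 480 figures:
# 220 Gbps at 70 B and ~3.3 Tbps at 1 KB both imply roughly this rate.
rate_pps = 400e6
for size_bytes in (70, 1024):
    print(f"{size_bytes:>5} B packets -> {effective_rate_gbps(rate_pps, size_bytes):8.1f} Gbps")
```

Running this gives roughly 224 Gbps at 70 B and 3.28 Tbps at 1 KB, consistent with the 220 Gbps and 3.3 Tbps endpoints quoted for the trivial IPv4 case.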
132

An analysis of the risk exposure of adopting IPv6 in enterprise networks

Berko, Istvan Sandor January 2015
The vastly increased address pool of IPv6 changes the resource impact on the enterprise; if these changes are not adequately addressed, risks that are locally significant under IPv4 can become risks that impact the enterprise in its entirety. The expected conclusion is that the IPv6 environment will impose significant changes on the enterprise environment, which may negatively affect organisational security if the nuances of IPv6 are not adequately addressed. This thesis reviews the risks related to the operation of enterprise networks following the introduction of IPv6. Global trends are discussed to provide insight into, and background for, the IPv6 research space. An analysis of the current state of readiness in enterprise networks quantifies the value of this research. The base controls that should be deployed in enterprise networks to prevent the abuse of IPv6 through tunnelling, and to protect the enterprise access layer, are discussed. A series of case studies is presented which identifies and analyses the impact of certain changes in the IPv6 protocol on enterprise networks. The case studies also identify mitigation techniques to reduce risk.
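As a concrete illustration of the tunnelling controls discussed above, the following sketch (my own illustration, not code from the thesis) flags two common IPv6 transition tunnels in a packet capture using scapy; the capture filename is hypothetical:

```python
# Audit a packet capture for IPv6 transition-tunnel traffic that may
# bypass IPv4-only controls. Thresholds and filename are illustrative.
from scapy.all import rdpcap, IP, UDP  # assumes scapy is installed

TEREDO_PORT = 3544        # Teredo tunnels IPv6 inside UDP
PROTO_IPV6_IN_IPV4 = 41   # 6in4 / 6to4 / ISATAP use IP protocol 41

def flag_ipv6_tunnels(pcap_path: str) -> list[str]:
    findings = []
    for pkt in rdpcap(pcap_path):
        if IP not in pkt:
            continue
        ip = pkt[IP]
        if ip.proto == PROTO_IPV6_IN_IPV4:
            findings.append(f"6in4 tunnel: {ip.src} -> {ip.dst}")
        elif UDP in pkt and TEREDO_PORT in (pkt[UDP].sport, pkt[UDP].dport):
            findings.append(f"possible Teredo: {ip.src} -> {ip.dst}")
    return findings

if __name__ == "__main__":
    for line in flag_ipv6_tunnels("enterprise_capture.pcap"):  # hypothetical file
        print(line)
```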
133

Towards a framework for the integration of information security into undergraduate computing curricula

Gomana, Lindokuhle Gcina, Thomson, Kerry-Lynn January 2017
Information is an important and valuable asset, both in our everyday lives and in various organisations. Information is subject to numerous threats; these can originate internally or externally to the organisation and may be accidental, intentional or caused by natural disasters. As an important organisational asset, information should be appropriately protected from threats and threat agents regardless of their origin. Organisational employees are, however, often cited as the “weakest link” in the attempt to protect organisational information systems and related information assets. In addition, employees are among the biggest and closest threat agents to an organisation’s information systems and its security. Upon graduation, computing (Computer Science, Information Systems and Information Technology) students typically become organisational employees. Within organisations, computing graduates often take on roles and responsibilities that involve designing, developing, implementing, upgrading and maintaining the information systems that store, process and transmit organisational information assets. It is, therefore, important that these computing graduates possess the necessary information security skills, knowledge and understanding that could enable them to perform their roles and responsibilities in a secure manner. These skills, knowledge and understanding can be acquired through information security education obtained through a qualification offered at a higher education institution. At many higher education institutions where information security is taught, it is taught as a single, isolated module at the fourth year level of study. The problem with this is that some computing students do not advance to this level of study, and many of those who do reach it do not elect information security as a module. This means that these students may graduate and be employed by organisations while lacking the necessary information security skills, knowledge and understanding to perform their roles and responsibilities securely. Consequently, this could increase the number of employees who are the “weakest link” in securing organisational information systems and related information assets. The ACM, as a key role player providing educational guidelines for the development of computing curricula, recommends that information security should be pervasively integrated into computing curricula. However, these guidelines and recommendations do not provide sufficient guidance on “how” computing educators can pervasively integrate information security into their modules. Therefore, the problem identified by this research is that “currently, no generally used framework exists to aid the pervasive integration of information security into undergraduate computing curricula”. The primary research objective of this study, therefore, is to develop a framework to aid the pervasive integration of information security into undergraduate computing curricula. In order to meet this objective, secondary objectives were met, namely: to develop an understanding of the importance of information security; to determine the importance of information security education as it relates to undergraduate computing curricula; and to determine computing educators’ perspectives on information security education in a South African context. Various research methods were used to achieve these objectives.
These research methods included a literature review, which was used to define and discuss in depth the domains in which this study is contained, namely information security and information security education. Furthermore, a survey, in the form of semi-structured interviews supported by a questionnaire, was used to elicit computing educators’ perspectives on information security education in a South African context. Argumentation was used to argue towards the proposed framework to aid the pervasive integration of information security into undergraduate computing curricula. In addition, modelling techniques were used to model the proposed framework, and scenarios were used to demonstrate how a computing department could implement it. Finally, elite interviews supported by a questionnaire were conducted to validate the proposed framework. It is envisaged that the proposed framework could assist computing departments and undergraduate computing educators in integrating information security into their curricula. Furthermore, the pervasive integration of information security into undergraduate computing curricula could ensure that computing graduates exit higher education institutions possessing the necessary information security skills, knowledge and understanding to perform their roles and responsibilities securely. It is hoped that this would enable computing graduates to become a stronger link in securing organisational information systems and related assets.
134

Guidelines to address the human factor in the South African National Research and Education Network beneficiary institutions

Mjikeliso, Yolanda January 2014
Even if all the technical security solutions appropriate for an organisation’s network, such as firewalls, antivirus programs and encryption, are implemented, they will serve little purpose if the human factor is neglected. The greatest challenge to network security is probably not the technological solutions that organisations invest in, but the human factor (the non-technical side), which most organisations neglect. The human factor is often ignored even though humans are an organisation’s most important resource: they perform all the physical tasks, configure and manage equipment, enter data, manage people and operate the systems and networks. The same people who manage and operate networks and systems have vulnerabilities of their own. They are not perfect, and there will always be an element of error. In other words, humans make mistakes that can result in security vulnerabilities, and the exploitation of these vulnerabilities can in turn result in network security breaches. Human vulnerabilities are driven by many factors, including insufficient security education, training and awareness; a lack of security policies and procedures in the organisation; a limited attention span; and negligence. Network security may thus be compromised by this human vulnerability. In the context of this dissertation, both physical and technological controls should be implemented to ensure the security of the SANReN network. However, if the human factors are not adequately addressed, the network becomes vulnerable to risks posed by the human factor that could threaten its security. Accordingly, the primary research objective of this study is to formulate guidelines that address the information-security-related human factors in the roll-out and continued management of the SANReN network. An analysis of the existing policies and procedures governing the SANReN network was conducted, and it was determined that there are currently no guidelines addressing the human factor in the SANReN beneficiary institutions. The aim of this study is therefore to provide guidelines for addressing the human factor threats in the SANReN beneficiary institutions.
135

Critical success factors of information security projects

Tshabalala, Obediant January 2016
This research identifies the critical success factors for implementing information security projects. Many information security projects in the past have not been successful because these factors were not identified and emphasised effectively. By identifying these factors, the research presents a model by which information security projects can be executed successfully. The factors identified during the study cover the following streams: top management commitment; accountability; responsibility; awareness; and an information security policy, each as a factor of success. For the empirical study, a physical questionnaire was administered to a pool of experts in project management and information security. The study consisted of 60 participants, who were verified to meet the minimum requirements for questionnaire completion. The questionnaire requested biographical information about the participants and their perceived relations (based on their experience) between information security project success and, respectively, accountability, responsibilities, training and awareness, top management commitment, and information security policy. The participants’ responses were structured according to a Likert-type scale: participants had to indicate the extent to which they agreed with each of the statements in the questionnaire. The responses obtained from the survey were presented and analysed. The researcher observed in this study that information security projects are so specific that critical success factors need to be emphasised from project inception. Given the identified critical success factors, the researcher recommends that a project methodology be structured to include these factors, so that there is a standard for running information security projects successfully. The researcher also identified that, among the critical success factors, some need to be emphasised more than others owing to their level of importance in such projects.
136

Evolving a secure grid-enabled, distributed data warehouse : a standards-based perspective

Li, Xiao-Yu January 2007
As digital data collections have increased in scale and number, they have become an important type of resource serving a wide community of researchers. Cross-institutional data sharing and collaboration offer a suitable approach to support research institutions that lack data and the related IT infrastructure. Grid computing has become a widely adopted approach to enabling cross-institutional resource sharing and collaboration; it integrates a distributed and heterogeneous collection of locally managed users and resources. This project proposes a distributed data warehouse system which uses Grid technology to enable data access, integration and collaborative operations across multiple distributed institutions in the context of HIV/AIDS research. The study is based on wider research into an OGSA-based Grid services architecture, comprising a data-analysis system which utilises a data warehouse, data marts, and near-line operational databases hosted by distributed institutions. Within this framework, specific patterns for collaboration, interoperability, resource virtualisation and security are included. The heterogeneous and dynamic nature of the Grid environment introduces a number of security challenges. The study therefore also addresses a set of particular security aspects, including PKI-based authentication, single sign-on, dynamic delegation and attribute-based authorisation. These mechanisms, as supported by the Globus Toolkit’s Grid Security Infrastructure, are used to enable interoperability and establish trust relationships between the various security mechanisms and policies of different institutions; to manage credentials; and to ensure secure interactions.
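To make the dynamic-delegation idea concrete, the sketch below models a delegation chain in the simplest possible terms. It is a conceptual stand-in only: real GSI deployments use X.509 proxy certificates issued via the Globus Toolkit, not HMAC tags, and all names and key material here are illustrative.

```python
# Conceptual model of dynamic delegation: an issuer signs a short-lived
# credential for a subject, and a relying party verifies it later.
import hmac, hashlib, time

def issue(signing_key: bytes, subject: str, lifetime_s: int = 3600) -> dict:
    """Issue a delegated credential for `subject`, valid for lifetime_s seconds."""
    expiry = int(time.time()) + lifetime_s
    payload = f"{subject}|{expiry}".encode()
    tag = hmac.new(signing_key, payload, hashlib.sha256).hexdigest()
    return {"subject": subject, "expiry": expiry, "tag": tag}

def verify(signing_key: bytes, cred: dict) -> bool:
    """Check the credential's integrity and that it has not expired."""
    payload = f"{cred['subject']}|{cred['expiry']}".encode()
    expected = hmac.new(signing_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cred["tag"]) and time.time() < cred["expiry"]

root_key = b"institution-A-secret"              # illustrative key material
cred = issue(root_key, "researcher@inst-b", lifetime_s=600)
assert verify(root_key, cred)                   # relying party accepts the credential
```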
137

A framework for information security governance in SMMEs

Coertze, Jacques Jacobus January 2012
It has been found that many small, medium and micro-sized enterprises (SMMEs) do not comply with sound information security governance principles, specifically those involved in drafting information security policies and monitoring compliance, mainly as a result of restricted resources and expertise. Research suggests that this problem occurs worldwide and that its impact on SMMEs is great. The problem is further compounded by the fact that, in the modern information technology environment, many larger organisations provide SMMEs with access to their networks. This exposes not only the SMMEs to security risks, but the larger organisations as well. In previous research, an information security management framework and toolbox was developed to assist SMMEs in drafting information security policies. Although this research was of some help to SMMEs, further work has shown that an even greater problem exists with the governance of information security, in light of the advancements identified in the information security literature. The aim of this dissertation is therefore to establish an information security governance framework that requires minimal effort and little expertise to apply, in order to alleviate these governance problems. It is believed that such a framework would be useful for SMMEs and would result in the improved implementation of information security governance.
138

A framework for assuring conformance of cloud-based email at higher education institutions

Willett, Melanie January 2013
Cloud computing is a relatively immature computing paradigm that could significantly benefit its users. Cloud computing solutions are often associated with potential benefits such as cost reduction, less administrative hassle, flexibility and scalability. For organisations to realise these potential benefits, cloud computing solutions need to be chosen, implemented, managed and governed in a way that is secure, compliant with internal and external requirements, and indicative of due diligence. This can be a challenge, given the many concerns and risks commonly associated with cloud computing solutions. One cloud computing solution being widely adopted around the world is cloud-based email, and higher education institutions are among its foremost adopters. These institutions stand to benefit greatly from using such services: cloud-based email can be provisioned to staff and students for free, and cloud service providers (CSPs) are often able to provide a better email service, with larger inboxes and many extra services, than some higher education institutions could provide in-house. Cloud-based email is therefore a clear example of a cloud computing solution with the potential to benefit organisations. There are, however, risks and challenges associated with its use. Two of these challenges relate to ensuring conformance to internal and external requirements (legal, regulatory and contractual obligations) and to providing a mechanism for assuring that cloud-based email related activities are sound. The lack of structured guidelines for assuring the conformance of cloud-based email is putting this service at risk at higher education institutions in South Africa. This work addresses the problem by promoting a best-practice-based approach to assuring the conformance of cloud-based email at higher education institutions. To accomplish this, components of applicable standards and best practice guidelines for IT governance, IT assurance and IT conformance are used to construct a framework for assuring the conformance of cloud-based email. The framework is designed and verified using sound design science principles, and its utility and value have been demonstrated at a higher education institution in South Africa. The framework can be used to assist higher education institutions in demonstrating due diligence in assuring that they conform to legal and best practice requirements for the management and governance of cloud-based email. This is a significant contribution in the relatively new field of cloud computing governance.
139

SecMVC : a model for secure software design based on the model-view-controller pattern

Colesky, Michael Robert January 2014
Advances in the software development industry make software more ubiquitous by the day. This has made security, not only in the broader sense but specifically within the design and overall development of software itself, all the more important. An evidently prevalent problem in the domain of software development is that software security is not consistently addressed during design, which undermines core security concerns and leads to the development of insecure software. This research seeks to address the issue via a model for secure software design based on a software design pattern, namely the Model-View-Controller (MVC) pattern. The use of a pattern to convey knowledge is not a new notion; however, the ability of software design patterns to convey secure software design is an idea worth investigating. Following the identification of secure software design principles and concepts, as well as software design patterns, specifically those relating to the MVC pattern, a model was designed and developed. With the MVC pattern argued to be a suitable foundation, the security-conscious MVC (SecMVC) model combines secure software design principles and concepts with the MVC pattern. In addition, the components of the MVC Compound pattern, namely the Observer pattern, the Strategy pattern and the Composite pattern, provide further sub-models offering less abstraction and greater detail. These sub-models were developed as a result of the SecMVC model’s evaluation during the validation for this study, an expert review. Argued in the light of similar research methods, the expert review was chosen, with a process that included two expert participants, to validate the SecMVC model. It was determined through the expert review that the SecMVC model is of sufficient utility, quality and efficacy to constitute research value. The research methodology followed was design science, in which the SecMVC model, including its related sub-models, serves as the artefact and research output of this study. This research contributes evidence of the feasibility of integrating security knowledge into software design patterns, the SecMVC model itself included. In addition, it argues for the use of an expert review as an evaluative research method for such an artefact.
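For readers unfamiliar with the underlying pattern, a compact sketch of the idea follows: the classic MVC split, with the Observer pattern linking model to views and a secure-design concern (validating untrusted input at the controller boundary) made explicit. All class and method names are illustrative, not taken from the thesis.

```python
# Minimal MVC with Observer, plus input validation in the controller.

class Model:
    def __init__(self):
        self._observers = []
        self._data = {}

    def attach(self, observer):            # Observer pattern: views subscribe
        self._observers.append(observer)

    def set(self, key, value):
        self._data[key] = value
        for obs in self._observers:        # notify all registered views
            obs.update(key, value)

class View:
    def update(self, key, value):
        print(f"view refresh: {key} = {value!r}")

class Controller:
    def __init__(self, model):
        self._model = model

    def handle(self, key, raw_value):
        # Secure-design concern: validate and bound untrusted input at the
        # controller boundary before it reaches the model.
        if not key.isidentifier():
            raise ValueError("rejected: key is not a safe identifier")
        value = str(raw_value)[:256]       # bound input size
        self._model.set(key, value)

model = Model()
model.attach(View())
Controller(model).handle("username", "alice")
```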
140

The ISO/IEC 27002 and ISO/IEC 27799 information security management standards : a comparative analysis from a healthcare perspective

Ngqondi, Tembisa Grace January 2009
Technological change has become significant in the health sector, and securing health information assets is an area of growing concern. Health information systems hosting personal health information expose these information assets to ever-evolving threats. This information includes aspects of an extremely sensitive nature; for example, a particular patient may have a history of drug abuse, which would be reflected in the patient’s medical record. The private nature of patient information places a higher demand on the need to ensure privacy. Ensuring that the security and privacy of health information remain intact is therefore vital in the healthcare environment. In order to protect information appropriately and effectively, good information security management practices should be followed. To this end, the International Organization for Standardization (ISO) published a code of practice for information security management, namely the ISO 27002 (2005). This standard is widely used in industry, but it is a generic standard aimed at all industries and therefore does not consider the unique security needs of a particular environment. Because of the unique nature of personal health information and its security and privacy requirements, the need for a healthcare-sector-specific standard for information security management was identified. The ISO 27799 was therefore published as an industry-specific variant of the ISO 27002, geared towards addressing security requirements in health informatics; it serves as an implementation guide for the ISO 27002 in the health sector. The publication of the ISO 27799 is considered a positive development in the quest to improve health information security. However, the question arises whether the ISO 27799 addresses the security needs of the healthcare domain sufficiently. The extensive use of the ISO 27002 implies that many proponents of this standard in healthcare now have to ensure that they meet the (assumed) increased requirements of the ISO 27799. The purpose of this research is therefore to conduct a comprehensive comparison of the ISO 27002 and ISO 27799 standards, to determine whether the ISO 27799 serves the specific needs of the health sector from an information security management point of view.
