421

GPF : a framework for general packet classification on GPU co-processors / GPU Packet Filter : framework for general packet classification on Graphics Processing Unit co-processors

Nottingham, Alastair January 2012 (has links)
This thesis explores the design and experimental implementation of GPF, a novel protocol-independent, multi-match packet classification framework. This framework is targeted and optimised for flexible, efficient execution on NVIDIA GPU platforms through the CUDA API, but should not be difficult to port to other platforms, such as OpenCL, in the future. GPF was conceived and developed in order to accelerate classification of large packet capture files, such as those collected by Network Telescopes. It uses a multiphase SIMD classification process which exploits both the parallelism of packet sets and the redundancy in filter programs, in order to classify packet captures against multiple filters at extremely high rates. The resultant framework - comprised of classification, compilation and buffering components - efficiently leverages GPU resources to classify arbitrary protocols, and return multiple filter results for each packet. The classification functions described were verified and evaluated by testing an experimental prototype implementation against several filter programs, of varying complexity, on devices from three GPU platform generations. In addition to the significant speedup achieved in processing results, analysis indicates that the prototype classification functions perform predictably, and scale linearly with respect to both packet count and filter complexity. Furthermore, classification throughput (packets/s) remained essentially constant regardless of the underlying packet data, and thus the effective data rate when classifying a particular filter was heavily influenced by the average size of packets in the processed capture. For example: in the trivial case of classifying all IPv4 packets ranging in size from 70 bytes to 1KB, the observed data rate achieved by the GPU classification kernels ranged from 60Gbps to 900Gbps on a GTX 275, and from 220Gbps to 3.3Tbps on a GTX 480. In the less trivial case of identifying all ARP, TCP, UDP and ICMP packets for both IPv4 and IPv6 protocols, the effective data rates ranged from 15Gbps to 220Gbps (GTX 275), and from 50Gbps to 740Gbps (GTX 480), for 70B and 1KB packets respectively.
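The throughput claim can be sanity-checked with simple arithmetic: if the kernel classifies a fixed number of packets per second, the effective data rate scales linearly with packet size. The minimal sketch below is ours, not the thesis's, and assumes 1KB means 1024 bytes; it recovers the packet rates implied by the figures quoted above.

```python
# Back-of-envelope check of the quoted figures: constant packets/s implies
# a data rate proportional to packet size.
def packets_per_second(data_rate_bps: float, packet_bytes: int) -> float:
    """Packet throughput implied by a data rate at a fixed packet size."""
    return data_rate_bps / (packet_bytes * 8)

# (device and packet size, data rate in bits/s, packet size in bytes)
quoted = [
    ("GTX 275, 70B packets", 60e9, 70),
    ("GTX 275, 1KB packets", 900e9, 1024),   # assumes 1KB = 1024 bytes
    ("GTX 480, 70B packets", 220e9, 70),
    ("GTX 480, 1KB packets", 3.3e12, 1024),
]

for label, rate, size in quoted:
    print(f"{label}: ~{packets_per_second(rate, size) / 1e6:.0f} Mpackets/s")
# Prints ~107 and ~110 Mpackets/s for the GTX 275 and ~393 and ~403 for the
# GTX 480: near-constant per device, consistent with the claim that
# throughput in packets/s is independent of packet size.
```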
422

An investigation of ISO/IEC 27001 adoption in South Africa

Coetzer, Christo January 2015 (has links)
The research objective of this study is to investigate the low adoption of the ISO/IEC 27001 standard in South African organisations. This study does not differentiate between the ISO/IEC 27001:2005 and ISO/IEC 27001:2013 versions, as the focus is on adoption of the ISO/IEC 27001 standard. A survey-based research design was selected as the data collection method. The research instruments used in this study include a web-based questionnaire and in-person interviews with the participants. Based on the findings of this research, the organisations that participated in this study have an understanding of the ISO/IEC 27001 standard; however, fewer than a quarter of these have fully adopted the ISO/IEC 27001 standard. Furthermore, the main business objectives for organisations that have adopted the ISO/IEC 27001 standard were to ensure legal and regulatory compliance, and to fulfil client requirements. Finally, a management guide for an Information Security Management System (ISMS), based on the ISO/IEC 27001 Plan-Do-Check-Act model, is developed to help organisations interested in the standard move towards ISO/IEC 27001 compliance.
423

Amber : a zero-interaction honeypot with distributed intelligence

Schoeman, Adam January 2015 (has links)
For the greater part, security controls are based on the principle of Decision through Detection (DtD). The exception to this is a honeypot, which analyses interactions between a third party and itself while occupying a piece of unused information space. As a honeypot is not located on productive information resources, any interaction with it can be assumed to be non-productive. This allows the honeypot to make decisions based simply on the presence of data, rather than on the behaviour of the data. However, due to limited human capital, the uptake of honeypots in the South African market has been underwhelming. Amber attempts to change this by offering a zero-interaction security system which uses the honeypot approach of Decision through Presence (DtP) to generate a blacklist of third parties that can be passed on to a network enforcer. Empirical testing has proved the usefulness of this alternative, low-cost approach to defending networks. The functionality of the system was also extended by installing nodes in different geographical locations and streaming their detections into the central Amber hive.
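The decision-through-presence idea reduces to very little code. The sketch below is illustrative only - it is not Amber's implementation, and the address is hypothetical: a listener occupies an unused address and treats the mere arrival of a connection as grounds for blacklisting, without inspecting behaviour at all.

```python
# Zero-interaction sensor sketch: presence alone drives the decision.
import socket

UNUSED_ADDR = ("192.0.2.10", 8080)  # hypothetical unused ("dark") address
blacklist: set[str] = set()

def run_sensor() -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(UNUSED_ADDR)
        srv.listen()
        while True:
            conn, (src_ip, _src_port) = srv.accept()
            conn.close()           # no interaction: the connection itself is the signal
            blacklist.add(src_ip)  # candidate for hand-off to a network enforcer
            print(f"blacklisted {src_ip}")
```

In a distributed deployment of the kind described, each node would stream its blacklist additions to the central hive rather than printing them.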
424

An analysis of the risk exposure of adopting IPv6 in enterprise networks

Berko, Istvan Sandor January 2015 (has links)
The increased address pool of IPv6 presents changes in resource impact to the enterprise that, if not adequately addressed, can turn risks that are locally significant in IPv4 into risks that impact the enterprise in its entirety. The expected conclusion is that the IPv6 environment will impose significant changes on the enterprise environment - which may negatively impact organisational security if the nuances of IPv6 are not adequately addressed. This thesis reviews the risks related to the operation of enterprise networks with the introduction of IPv6. Global trends are discussed to provide insight and background to the IPv6 research space. Analysing the current state of readiness in enterprise networks quantifies the value of developing this thesis. The base controls that should be deployed in enterprise networks to prevent the abuse of IPv6 through tunnelling, and to protect the enterprise access layer, are discussed. A series of case studies is presented; these identify and analyse the impact of certain changes in the IPv6 protocol on enterprise networks, and identify mitigation techniques to reduce risk.
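As one concrete illustration of the base controls mentioned above, the most common IPv4-to-IPv6 transition tunnels are easy to recognise at the access layer: 6in4, 6to4 and ISATAP all ride on IPv4 protocol number 41, while Teredo wraps IPv6 in UDP on port 3544. The sketch below is ours, not the thesis's; it shows the classification logic a control point could apply to already-parsed packet metadata.

```python
# Flag common IPv6-in-IPv4 tunnelling from parsed packet metadata.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PacketMeta:
    ip_protocol: int                    # IPv4 header protocol field
    udp_dst_port: Optional[int] = None  # set when ip_protocol == 17 (UDP)

def is_tunnelled_ipv6(pkt: PacketMeta) -> bool:
    if pkt.ip_protocol == 41:           # 6in4 / 6to4 / ISATAP encapsulation
        return True
    if pkt.ip_protocol == 17 and pkt.udp_dst_port == 3544:  # Teredo
        return True
    return False

assert is_tunnelled_ipv6(PacketMeta(ip_protocol=41))
assert is_tunnelled_ipv6(PacketMeta(ip_protocol=17, udp_dst_port=3544))
assert not is_tunnelled_ipv6(PacketMeta(ip_protocol=6))     # plain TCP
```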
425

Pro-active visualization of cyber security on a national level : a South African case study

Swart, Ignatius Petrus January 2015 (has links)
The need for increased national cyber security situational awareness is evident from the growing number of published national cyber security strategies. Governments are progressively seen as responsible for cyber security, but at the same time increasingly constrained by legal, privacy and resource considerations. Infrastructure and services that form part of the national cyber domain are often not under the control of government, necessitating information sharing between governments and commercial partners. While sharing of security information is necessary, it typically requires considerable time to be implemented effectively. In an effort to decrease the time and effort required for cyber security situational awareness, this study considered commercially available data sources relating to a national cyber domain. Open source information is typically used by attackers to gather information with great success. An understanding of the data provided by these sources can also afford decision makers the opportunity to set priorities more effectively. Through the use of an adapted Joint Directors of Laboratories (JDL) fusion model, an experimental system was implemented that visualized the potential that open source intelligence could have for cyber situational awareness. Datasets used in the validation of the model contained information obtained from eight different data sources over a two-year period, with a focus on the South African .co.za subdomain. Over a million infrastructure devices were examined in this study, along with information pertaining to a potential 88 million vulnerabilities on these devices. During the examination of data sources, a severe lack of information regarding the human aspect in cyber security was identified, which led to the creation of a novel Personally Identifiable Information (PII) detection sensor. The resultant two million records pertaining to PII in the South African domain were incorporated into the data fusion experiment for processing. The results of this processing are discussed in the three case studies. The results offered in this study aim to highlight how data fusion and effective visualization can serve to move national cyber security from a primarily reactive undertaking to a more pro-active model.
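The thesis's PII sensor is not described in detail here, but the idea is straightforward to sketch. The example below is our illustration, not the actual sensor; it matches two record types plausible for the South African domain: email addresses, and 13-digit South African ID numbers, whose final digit is a Luhn check digit.

```python
# Minimal PII detector sketch: emails plus Luhn-validated SA ID numbers.
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
SA_ID_RE = re.compile(r"\b\d{13}\b")

def luhn_valid(digits: str) -> bool:
    """Standard Luhn checksum over a digit string."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:   # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_pii(text: str) -> dict[str, list[str]]:
    return {
        "emails": EMAIL_RE.findall(text),
        "sa_ids": [m for m in SA_ID_RE.findall(text) if luhn_valid(m)],
    }
```

A production sensor would add context checks (for example, date-of-birth plausibility in the ID number's first six digits, or surrounding keywords) to cut false positives.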
426

Towards a framework for the integration of information security into undergraduate computing curricula

Gomana, Lindokuhle Gcina, Thomson, Kerry-Lynn January 2017 (has links)
Information is an important and valuable asset, in both our everyday lives and in various organisations. Information is subject to numerous threats; these can originate internally or externally to the organisation and could be accidental, intentional or caused by natural disasters. As an important organisational asset, information should be appropriately protected from threats and threat agents regardless of their origin. Organisational employees are, however, often cited as the “weakest link” in the attempt to protect organisational information systems and related information assets. In addition, employees are one of the biggest and closest threat agents to an organisation’s information systems and its security. Upon graduating, computing (Computer Science, Information Systems and Information Technology) graduates typically become organisational employees. Within organisations, computing graduates often take on roles and responsibilities that involve designing, developing, implementing, upgrading and maintaining the information systems that store, process and transmit organisational information assets. It is, therefore, important that these computing graduates possess the necessary information security skills, knowledge and understanding that could enable them to perform their roles and responsibilities in a secure manner. These information security skills, knowledge and understanding can be acquired through information security education obtained through a qualification that is offered at a higher education institution. At many higher education institutions where information security is taught, it is taught as a single, isolated module at the fourth-year level of study. The problem with this is that some computing students do not advance to this level, and many of those who do choose not to elect information security as a module. This means that these students may graduate and be employed by organisations lacking the necessary information security skills, knowledge and understanding to perform their roles and responsibilities securely. Consequently, this could increase the number of employees who are the “weakest link” in securing organisational information systems and related information assets. The ACM, as a key role player that provides educational guidelines for the development of computing curricula, recommends that information security should be pervasively integrated into computing curricula. However, these guidelines and recommendations do not provide sufficient guidance on “how” computing educators can pervasively integrate information security into their modules. Therefore, the problem identified by this research is that “currently, no generally used framework exists to aid the pervasive integration of information security into undergraduate computing curricula”. The primary research objective of this study, therefore, is to develop a framework to aid the pervasive integration of information security into undergraduate computing curricula. In order to meet this objective, secondary objectives were met, namely: to develop an understanding of the importance of information security; to determine the importance of information security education as it relates to undergraduate computing curricula; and to determine computing educators’ perspectives on information security education in a South African context. Various research methods were used to achieve this study’s research objectives.
These research methods included a literature review, which was used to define and provide an in-depth discussion of the domain in which this study is contained, namely information security and information security education. Furthermore, a survey, which took the form of semi-structured interviews supported by a questionnaire, was used to elicit computing educators’ perspectives on information security education in a South African context. Argumentation was used to argue towards the proposed framework to aid the pervasive integration of information security into undergraduate computing curricula. In addition, modelling techniques were used to model the proposed framework, and scenarios were used to demonstrate how a computing department could implement the proposed framework. Finally, elite interviews supported by a questionnaire were conducted to validate the proposed framework. It is envisaged that the proposed framework could assist computing departments and undergraduate computing educators in the integration of information security into their curricula. Furthermore, the pervasive integration of information security into undergraduate computing curricula could ensure that computing graduates exit higher education institutions possessing the necessary information security skills, knowledge and understanding to enable them to perform their roles and responsibilities securely. It is hoped that this could enable computing graduates to become a stronger link in securing organisational information systems and related assets.
427

The development of a risk prevention safety and security program and its application into selected Miami hotels

Cochran, John 01 August 1984 (has links)
The purpose of this thesis is to develop a risk prevention safety and security program for the major problem areas of hotel operations. These include general hotel safety and security, personnel, lock and key control, lighting and fire prevention. It will then evaluate randomly selected hotels in the Miami area to determine how well they meet the criteria set forth in this program. This thesis will use related texts, periodicals and published articles to develop the risk prevention safety and security program. The data used to determine how well selected Miami hotels responded to this program was gathered through a detailed questionnaire. The major finding was that the majority of hotels have adequate risk prevention safety and security programs set forth in writing as part of the overall hotel management function. However, the hotels surveyed failed to implement these programs into daily operations. Survey members agreed that if the hotels could consolidate the risk prevention safety and security program into a singular management function, then a valuable management tool would be created.
428

Guidelines to address the human factor in the South African National Research and Education Network beneficiary institutions

Mjikeliso, Yolanda January 2014 (has links)
Even if all the technical security solutions appropriate for an organisation’s network - for example, firewalls, antivirus programs and encryption - are implemented, they will serve no purpose if the human factor is neglected. The greatest challenge to network security is probably not the technological solutions that organisations invest in, but the human factor (non-technical solutions), which most organisations neglect. The human factor is often ignored even though humans are the most important resources of organisations: they perform all the physical tasks, configure and manage equipment, enter data, manage people and operate the systems and networks. The same people who manage and operate networks and systems have vulnerabilities. They are not perfect and there will always be an element of mistake-making or error. In other words, humans make mistakes that could result in security vulnerabilities, and the exploitation of these vulnerabilities could in turn result in network security breaches. Human vulnerabilities are driven by many factors, including insufficient security education, training and awareness; a lack of security policies and procedures in the organisation; a limited attention span; and negligence. Network security may thus be compromised by this human vulnerability. In the context of this dissertation, both physical and technological controls should be implemented to ensure the security of the SANReN network. However, if the human factors are not adequately addressed, the network will remain vulnerable to risks posed by the human factor which could threaten its security. Accordingly, the primary research objective of this study is to formulate guidelines that address the information security related human factors in the rolling out and continued management of the SANReN network. An analysis of existing policies and procedures governing the SANReN network was conducted, and it was determined that there are currently no guidelines addressing the human factor in the SANReN beneficiary institutions. Therefore, the aim of this study is to provide guidelines for addressing the human factor threats in the SANReN beneficiary institutions.
429

Critical success factors of information security projects

Tshabalala, Obediant January 2016 (has links)
This research identifies the critical success factors for implementing information security projects. Many information security projects have failed in the past because these factors were not identified and emphasised effectively. By identifying these factors, the research presents a model by which information security projects can be executed with a far greater chance of success. The factors identified during the study cover the following streams: top management commitment; accountability; responsibility; awareness; and an information security policy. For the empirical study, a physical questionnaire was administered to a pool of experts in project management and information security. The study consisted of 60 participants who were verified to meet the minimum requirements for questionnaire completion. The questionnaire requested biographical information from the participants, as well as their perceived relations (based on their experience) between information security project success and each of accountability, responsibility, training and awareness, top management commitment, and an information security policy. The participants’ responses were structured according to a Likert-type scale: participants had to indicate the extent to which they agreed with each of the statements in the questionnaire. The responses obtained from the survey were presented and analysed. The researcher observed in this study that information security projects are so specific that critical success factors need to be emphasised from project inception. With the identified critical success factors, the researcher recommends that a project methodology be structured to include these factors so that there is a standard for running information security projects successfully. The researcher also identified that, amongst the critical success factors identified, some need to be emphasised more than others due to their level of importance in such projects.
430

Evolving a secure grid-enabled, distributed data warehouse : a standards-based perspective

Li, Xiao-Yu January 2007 (has links)
As digital data collection has increased in scale and number, such collections have become an important type of resource serving a wide community of researchers. Cross-institutional data-sharing and collaboration offer a suitable approach to support research institutions that suffer from a lack of data and related IT infrastructure. Grid computing has become a widely adopted approach to enable cross-institutional resource-sharing and collaboration: it integrates a distributed and heterogeneous collection of locally managed users and resources. This project proposes a distributed data warehouse system which uses Grid technology to enable data access and integration, and collaborative operations across multiple distributed institutions, in the context of HIV/AIDS research. This study is based on wider research into OGSA-based Grid services architecture, comprising a data-analysis system which utilizes a data warehouse, data marts, and a near-line operational database hosted by distributed institutions. Within this framework, specific patterns for collaboration, interoperability, resource virtualization and security are included. The heterogeneous and dynamic nature of the Grid environment introduces a number of security challenges. This study therefore also addresses a set of particular security aspects, including PKI-based authentication, single sign-on, dynamic delegation, and attribute-based authorization. These mechanisms, as supported by the Globus Toolkit’s Grid Security Infrastructure, are used to enable interoperability and establish trust relationships between the various security mechanisms and policies within different institutions; to manage credentials; and to ensure secure interactions.
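Of the mechanisms listed, attribute-based authorization is the easiest to illustrate in isolation. The following sketch is generic - it is not the Globus GSI API, and the policy and names are hypothetical: the access decision keys on attributes asserted about the subject rather than on identity alone.

```python
# Generic attribute-based authorization check (illustrative only).
from dataclasses import dataclass, field

@dataclass
class Subject:
    distinguished_name: str                           # e.g. from an X.509 credential
    attributes: dict[str, str] = field(default_factory=dict)

# Hypothetical policy: only researchers at partner institutions may query
# the shared data marts.
POLICY = {"role": "researcher", "institution_class": "partner"}

def authorize(subject: Subject, policy: dict[str, str]) -> bool:
    # Grant access only if every required attribute is asserted with the
    # required value.
    return all(subject.attributes.get(k) == v for k, v in policy.items())

alice = Subject("CN=Alice, O=PartnerInstitution",
                {"role": "researcher", "institution_class": "partner"})
assert authorize(alice, POLICY)
assert not authorize(Subject("CN=Bob, O=Unknown"), POLICY)
```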
