
Search engine poisoning and its prevalence in modern search engines

Blaauw, Pieter January 2013
The prevalence of Search Engine Poisoning in trending topics and popular search terms on the web within search engines is investigated. Search Engine Poisoning is the act of manipulating search engines in order to display search results from websites infected with malware. Research done between February and August 2012, using both manual and automated techniques, shows how easily the criminal element manages to insert malicious content into web pages related to popular search terms within search engines. In order to provide the reader with a clear overview and understanding of the motives and methods of the operators of Search Engine Poisoning campaigns, an in-depth review of automated and semi-automated web exploit kits is presented, together with an examination of the motives for running these campaigns. Three high-profile case studies are examined, and the various Search Engine Poisoning campaigns associated with them are discussed in detail. From February to August 2012, data was collected from the top trending topics on Google's search engine, along with the top listed sites related to these topics, and passed through various automated tools to discover whether these results had been infiltrated by the operators of Search Engine Poisoning campaigns; the results of these automated scans are discussed in detail. During the research period, manual searching for Search Engine Poisoning campaigns was also carried out, using high-profile news events and popular search terms. These results are analysed in detail to determine the methods of attack, the purpose of the attack and the parties behind it.
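To make the automated survey concrete, below is a minimal sketch of the kind of pipeline the abstract describes: pull trending terms, pull the top results for each, and scan the result URLs. Every function here is a hypothetical placeholder (the thesis does not specify its tooling), and the blocklist check stands in for a real URL-reputation or malware scanner.

def fetch_trending_terms():
    # Hypothetical placeholder for a trending-topics feed.
    return ["breaking news story", "celebrity scandal"]

def fetch_top_results(term, limit=10):
    # Hypothetical placeholder for the top-ranked result URLs of a search.
    slug = term.replace(" ", "-")
    return [f"http://example.com/{slug}/{i}" for i in range(limit)]

def is_malicious(url, blocklist):
    # Stand-in for an automated scan such as a URL-reputation lookup.
    return any(marker in url for marker in blocklist)

def survey(blocklist):
    # Collect (term, url) pairs whose top results appear poisoned.
    hits = []
    for term in fetch_trending_terms():
        for url in fetch_top_results(term):
            if is_malicious(url, blocklist):
                hits.append((term, url))
    return hits

if __name__ == "__main__":
    print(survey(blocklist={"malware", "exploit"}))

In a real deployment the placeholders would be backed by a trends feed, a search API and a scanning service, with hits logged per day so that campaigns can be tracked over the research period.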

An exploratory study of techniques in passive network telescope data analysis

Cowie, Bradley January 2013
Careful examination of the composition and concentration of malicious traffic in transit on the channels of the Internet provides network administrators with a means of understanding and predicting damaging attacks directed towards their networks. This allows for action to be taken to mitigate the effect that these attacks have on the performance of their networks and the Internet as a whole, by readying network defences and providing early warning to Internet users. One approach to malicious traffic monitoring that has garnered some success in recent times, as exhibited by the study of fast-spreading Internet worms, involves analysing data obtained from network telescopes. While some research has considered using measures derived from network telescope datasets to study large-scale network incidents such as Code-Red, SQLSlammer and Conficker, there is very little documented discussion on the merits and weaknesses of approaches to analysing network telescope data. This thesis is an introductory study in network telescope analysis and aims to consider the variables associated with the data received by network telescopes and how these variables may be analysed. The core research of this thesis considers both novel and previously explored analysis techniques from the fields of security metrics, baseline analysis, statistical analysis and technical analysis as applied to network telescope datasets. These techniques were evaluated as approaches to recognising unusual behaviour by observing their ability to identify notable incidents in network telescope datasets.
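As a concrete illustration of one of the techniques named above, here is a minimal baseline-analysis sketch, assuming hourly packet counts from a telescope dataset; the window and threshold are illustrative values, not figures from the thesis.

from statistics import mean, stdev

def flag_anomalies(counts, window=24, k=3.0):
    # Flag any interval whose count exceeds the rolling baseline
    # (mean of the previous `window` intervals) by more than k
    # standard deviations.
    anomalies = []
    for i in range(window, len(counts)):
        base = counts[i - window:i]
        mu, sigma = mean(base), stdev(base)
        if counts[i] > mu + k * sigma:
            anomalies.append(i)
    return anomalies

# Hourly packet counts with a simulated worm outbreak in the last hour.
hourly = [100, 97, 105, 99, 102] * 5 + [104, 1800]
print(flag_anomalies(hourly))  # -> [26]

The security-metric and technical-analysis approaches differ in what they compute over the series, but share this shape: derive a summary from past traffic and flag departures from it.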

Proximity-based attacks in wireless sensor networks

Subramanian, Venkatachalam 29 March 2013
The nodes in wireless sensor networks (WSNs) utilize the radio frequency (RF) channel to communicate. Given that the RF channel is the primary communication channel, many researchers have developed techniques for securing it. However, the RF channel is not the only interface into a sensor. The sensing components, which are primarily designed to sense characteristics of the outside world, can also be used (or misused) as a communication (side) channel. In our work, we aim to characterize the side channels for various sensory components (i.e., light sensor, acoustic sensor, and accelerometer). While previous work has focused on the use of these side channels to improve the security and performance of a WSN, we seek to determine whether the side channels have enough capacity to be used for malicious activity. Specifically, we evaluate the feasibility and practicality of the side channels using today's sensor technology and illustrate that these channels have enough capacity to enable the transfer of common, well-known malware. Given that a significant number of modern robotic systems depend on external side channels for navigation and environment sensing, they become potential targets for side-channel attacks. Therefore, we demonstrate this relatively new form of attack, which exploits these uninvestigated but widely used side channels to trigger malware residing in real-time robotic systems such as the iRobot Create. The ultimate goal of our work is to show the impact of this new class of attack and to motivate the need for an intrusion detection system (IDS) that not only monitors the RF channel, but also monitors the values returned by the sensory components.
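The sensory side channel reduces to a simple modulation problem; a minimal sketch (on-off keying over a light sensor, with made-up sample values rather than the thesis's measurements) is shown below.

def encode(bits, high=900, low=50):
    # Transmitter side: one sensor sample per bit; bright = 1, dark = 0.
    return [high if b else low for b in bits]

def decode(samples, threshold=500):
    # Receiver side: threshold raw light-sensor readings back into bits.
    return [1 if s > threshold else 0 for s in samples]

payload = [0, 1, 1, 0, 1, 0, 0, 1]  # one byte of an attacker's payload
readings = encode(payload)           # what the victim's light sensor sees
assert decode(readings) == payload

The capacity question then becomes how fast the sensor can be sampled and how noisy the readings are, which determines whether a payload the size of common malware can be transferred in practice.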

A framework for the application of network telescope sensors in a global IP network

Irwin, Barry Vivian William January 2011
The use of network telescope systems has become increasingly popular amongst security researchers in recent years. This study provides a framework for the utilisation of the data they collect. The research is based on a primary dataset of 40 million events spanning 50 months, collected using a small (/24) passive network telescope located in African IP space. This research presents a number of differing ways in which the data can be analysed, ranging from low-level protocol-based analysis to higher-level analysis at the geopolitical and network topology level. Anomalous traffic and illustrative anecdotes are explored in detail and highlighted, and a discussion of the bogon traffic observed is also presented. Two novel visualisation tools are presented, which were developed to aid in the analysis of large network telescope datasets. The first is a three-dimensional visualisation tool which allows for live, near-real-time analysis, and the second is a two-dimensional fractal-based plotting scheme which allows plots of the entire IPv4 address space to be produced and manipulated. Using the techniques and tools developed for the analysis of this dataset, a detailed analysis of traffic recorded as destined for port 445/tcp is presented, including an evaluation of the traffic surrounding the outbreak of the Conficker worm in November 2008. A number of metrics relating to the description and quantification of network telescope configuration and the resultant traffic captures are described; it is hoped that their use will facilitate greater and easier collaboration among researchers utilising this network security technology. The research concludes with suggestions relating to other applications of the data and intelligence that can be extracted from network telescopes, and their use as part of an organisation's integrated network security systems.
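The abstract does not name the fractal scheme; a Hilbert-curve layout is the usual choice for whole-IPv4 plots because it keeps adjacent address blocks spatially close, so the following sketch assumes an order-16 Hilbert curve (a 65536 x 65536 grid, one cell per IPv4 address).

import ipaddress

def hilbert_d2xy(order, d):
    # Convert distance d along a Hilbert curve of the given order into
    # (x, y) grid coordinates, using the classic bit-manipulation method.
    x = y = 0
    t = d
    s = 1
    while s < (1 << order):
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:  # rotate/flip the quadrant as needed
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def ip_to_xy(ip):
    # Map a dotted-quad address to its plot coordinates.
    return hilbert_d2xy(16, int(ipaddress.IPv4Address(ip)))

print(ip_to_xy("196.21.0.1"))  # an address in African IP space

Plotting per-address packet counts at these coordinates yields the kind of whole-address-space map the abstract describes, with /8 and /16 blocks appearing as compact square regions.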

DNS traffic based classifiers for the automatic classification of botnet domains

Stalmans, Etienne Raymond January 2014
Networks of maliciously compromised computers, known as botnets, consisting of thousands of hosts, have emerged as a serious threat to Internet security in recent years. These compromised systems, under the control of an operator, are used to steal data, distribute malware and spam, launch phishing attacks and mount Distributed Denial-of-Service (DDoS) attacks. The operators of these botnets use Command and Control (C2) servers to communicate with the members of the botnet and send commands. The communication channels between the C2 nodes and endpoints employ numerous detection-avoidance mechanisms to prevent the shutdown of the C2 servers. Two prevalent detection-avoidance techniques used by current botnets are algorithmically generated domain names and DNS Fast-Flux. The use of these mechanisms can, however, be observed and used to create distinct signatures that in turn can be used to detect DNS domains being used for C2 operation. This report details research conducted into the implementation of three classes of classification techniques that exploit these signatures in order to accurately detect botnet traffic. The techniques described make use of the traffic from DNS query responses created when members of a botnet try to contact the C2 servers; traffic observation and categorisation is passive from the perspective of the communicating nodes. The first set of classifiers employs frequency analysis to detect the algorithmically generated domain names used by botnets, and was found to have a high degree of accuracy with a low false positive rate. The characteristics of Fast-Flux domains are used in the second set of classifiers; it is shown that using these characteristics Fast-Flux domains can be accurately identified and differentiated from legitimate domains (such as Content Distribution Networks, which exhibit similar behaviour). The final set of classifiers uses spatial autocorrelation to detect Fast-Flux domains based on the geographic distribution of the botnet C2 servers to which the detected domains resolve; it is shown that botnet C2 servers can be detected solely on the basis of their geographic location, and that this technique clearly distinguishes between malicious and legitimate domains. The implemented classifiers are lightweight and use existing network traffic to detect botnets, and thus do not require major architectural changes to the network. The performance impact of classifying DNS traffic is examined and shown to be at an acceptable level.
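To illustrate the first class of classifier, here is a minimal frequency-analysis sketch using character entropy as the lexical feature; the thesis's actual features and thresholds are not reproduced here, and the cutoff below is illustrative.

import math
from collections import Counter

def entropy(label):
    # Shannon entropy of the character distribution; algorithmically
    # generated labels tend toward uniformly random characters.
    n = len(label)
    return -sum(c / n * math.log2(c / n) for c in Counter(label).values())

def looks_generated(domain, threshold=3.5):
    # Score only the leftmost label, not the TLD.
    return entropy(domain.split(".")[0]) > threshold

for d in ["facebook.com", "xjw3qpa7tkzr4mnd.com"]:
    print(d, looks_generated(d))  # facebook.com False, the DGA-like name True

A Fast-Flux counterpart would instead score DNS response features (TTL, number of A records, ASN diversity), and the spatial-autocorrelation classifier would test statistics such as Moran's I over the coordinates of the resolved servers.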

Legal and policy aspects to consider when providing information security in the corporate environment

Dagada, Rabelani 11 1900
E-commerce is growing rapidly due to the massive use of the Internet to conduct commercial transactions. This growth has presented both customers and merchants with many advantages. However, one of the challenges in E-commerce is information security. In order to mitigate e-crime, the South African government promulgated laws that contain information security legal aspects that should be integrated into the establishment of information security. Although several authors have written about legal and policy aspects regarding information security in the South African context, it has not yet been explained how these aspects are used in the provision of information security in the South African corporate environment. This is the premise upon which the study was undertaken. Forty-five South African organisations participated in this research. Data-gathering methods included individual interviews, website analysis and document analysis. The findings of this study indicate that most organisations in South Africa are not integrating legal aspects into their information security policies. One of the most important outcomes of this study is the proposed Concept Model of Legal Compliance in the Corporate Environment. This Concept Model embodies the contribution of this study and demonstrates how legal requirements can be incorporated into information security endeavours. The fact that the proposed Concept Model is technology-independent, and that it can be implemented in a real corporate environment regardless of the organisation's governance and management structure, holds great promise for the future of information security in South Africa and abroad. Furthermore, this thesis has generated a typology for linking legislation to the provision of information security, which can be used by any academic or practitioner who intends to implement information security measures in line with the provisions of the law. On this basis, practitioners can, to some extent, infer that legislation can likewise be integrated into the information security policies of other South African organisations that did not participate in this study. Although this study has yielded theoretical, methodological and practical contributions, more research remains to be done in this area. / School of Computing / D. Phil. (Information Systems)
