181

A model for security incident response in the South African National Research and Education network

Mooi, Roderick David January 2014 (has links)
This dissertation addresses the lack of a formal incident response capability in the South African National Research and Education Network (SA NREN). While investigating alternatives, it was found that no clear method exists to solve this problem. A second problem is therefore identified: the lack of a definitive method for establishing a Computer Security Incident Response Team (CSIRT) or Computer Emergency Response Team (CERT) in general. Solving the second problem is important as it provides a means of knowing how to start when building a CSIRT. This sets the basis for addressing the initial problem, resulting in a prepared, improved and coordinated response to IT security incidents affecting the SA NREN. To commence, the requirements for establishing a CSIRT are identified via a comprehensive literature review. These requirements are categorised into five areas: the basic business requirements, followed by the four Ps of the IT Infrastructure Library (ITIL), that is People, Processes, Products and Partners, adapted to suit the CSIRT context. Through argumentation, the relationships between the areas are uncovered and explored. Thereafter, a Design Science Research-based process is utilised to develop a generic model for establishing a CSIRT. The model is based on the interactions uncovered between the business requirements and the adapted four Ps. These are summarised through two views, strategic and tactical, which together form a holistic model for establishing a CSIRT. The model highlights the decisions required for the business requirements, services, team model and staff, policies and processes, tools and technologies, and partners of a CSIRT respectively. Finally, to address the primary objective, the generic model is applied to the SA NREN environment. The second artefact is thus an instantiation, a specific model, which can be implemented to create a CSIRT for the SA NREN.
To produce the specific model, insight into the nature of the SA NREN environment was required. The status quo was revealed through a survey and an argumentative analysis of the results. The specific decisions required in each area to establish an SA NREN CSIRT are explored throughout the development of the model. The result is a comprehensive framework for implementing a CSIRT in the SA NREN, detailing the decisions required in each of the areas. This model additionally demonstrates the utility of the generic model. The implications of this research are twofold. Firstly, the generic model is useful as a basis for anyone wanting to establish a CSIRT. It helps to ensure that all factors are considered and that no important decisions are neglected, thereby enabling a holistic view. Secondly, the specific model for the SA NREN CSIRT serves as a foundation for implementing the CSIRT going forward. It accelerates the process by addressing the important considerations and highlighting the concerns that must be addressed while establishing the CSIRT.
182

Digital forensic model for computer networks

Sanyamahwe, Tendai January 2011 (has links)
The Internet has become important since information is now stored in digital form and transported in large volumes, both within and between organisations, through computer networks. Nevertheless, some individuals or groups use the Internet to harm businesses because they can remain relatively anonymous. To prosecute such criminals, forensic practitioners must follow a well-defined procedure to convict responsible cyber-criminals in a court of law. Log files provide significant digital evidence in computer networks when tracing cyber-criminals. Network log mining is an evolution of typical digital forensics that utilises evidence from network devices such as firewalls, switches and routers. Network log mining is a process supported by prevailing South African laws such as the Computer Evidence Act, 57 of 1983; the Electronic Communications and Transactions (ECT) Act, 25 of 2002; and the Electronic Communications Act, 36 of 2005. International laws and regulations supporting network log mining include the Sarbanes-Oxley Act and the Foreign Corrupt Practices Act (FCPA) of the USA, and the Bribery Act. A digital forensic model for computer networks focusing on network log mining was developed based on the literature reviewed and critical thought. The development of the model followed the Design Science methodology. However, this research project argues that some important aspects are not fully addressed by the South African legislation supporting digital forensic investigations. With that in mind, this research project proposes a set of Forensic Investigation Precautions, developed as part of the proposed model. The Diffusion of Innovations (DOI) Theory is the framework underpinning the development of the model and its assimilation into the community.
The model was sent to IT experts for validation; this provided the qualitative element and the primary data of this research project. From these experts, the study found that the proposed model is unique and comprehensive and adds new knowledge to the field of Information Technology. A paper was also produced from this research project.
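As a hedged illustration of the network log mining idea described above (not the dissertation's actual model), the sketch below parses hypothetical firewall deny-log lines and flags source IPs with repeated denied attempts as candidate leads for investigation. The log format, field names and threshold are all illustrative assumptions.

```python
import re
from collections import Counter

# Hypothetical firewall log format: "<date> <time> DENY <src>:<port> -> <dst>:<port>"
LOG_LINE = re.compile(
    r"(?P<ts>\S+ \S+) DENY (?P<src>\d+\.\d+\.\d+\.\d+):\d+ -> "
    r"(?P<dst>\d+\.\d+\.\d+\.\d+):(?P<port>\d+)"
)

def mine_denied_sources(lines, threshold=3):
    """Count denied connection attempts per source IP; sources at or above
    the threshold become candidate leads for a forensic investigation."""
    hits = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m:
            hits[m.group("src")] += 1
    return {ip: n for ip, n in hits.items() if n >= threshold}

sample = [
    "2011-03-01 10:00:01 DENY 203.0.113.9:4444 -> 10.0.0.5:22",
    "2011-03-01 10:00:02 DENY 203.0.113.9:4445 -> 10.0.0.5:22",
    "2011-03-01 10:00:03 DENY 203.0.113.9:4446 -> 10.0.0.5:23",
    "2011-03-01 10:00:07 DENY 198.51.100.2:5050 -> 10.0.0.8:80",
]
print(mine_denied_sources(sample))  # {'203.0.113.9': 3}
```

In a real investigation the mined counts would only be one input; chain-of-custody and the legal precautions discussed above still apply.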
183

A model to measure the maturity of smartphone security at software consultancies

Allam, Sean January 2009 (has links)
Smartphones are proliferating into the workplace at an ever-increasing rate, and the threats they pose are increasing similarly. In an era of constant connectivity and availability, information is freed from the constraints of time and place. This research project delves into the risks introduced by smartphones and, through multiple case studies, formulates a maturity measurement model. The model is based on recommendations from two leading information security frameworks: the COBIT 4.1 framework and the ISO 27002 code of practice. Ultimately, a combination of smartphone-specific risks is integrated with key control recommendations to provide a set of key measurable security maturity components. The subjective opinions of case study respondents are considered a key component in achieving a solution. The solution addresses the concerns not only of policy makers, but also of the employees subject to the security policies. Nurturing security awareness into organisational culture through reinforcement and employee acceptance is highlighted in this research project. Software consultancies can use this model to mitigate risks while harnessing the potential strategic advantages of mobile computing through smartphone devices. In addition, this research project identifies the critical components of a smartphone security solution. As a result, a model is provided for software consultancies owing to the intense reliance on information within these types of organisations. The model can be effectively applied to any information-intensive organisation.
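To make the idea of "key measurable security maturity components" concrete, here is a minimal, hypothetical sketch: each component is rated on a 0-5 scale (in the spirit of COBIT-style maturity levels) and the overall score is the mean. The component names and scale are illustrative assumptions, not the model produced by this research.

```python
def maturity_score(component_ratings):
    """component_ratings: dict mapping a measurable security component
    (e.g. 'device encryption') to a 0-5 maturity rating. Returns the
    mean rating as an overall maturity indicator."""
    if not component_ratings:
        raise ValueError("no components rated")
    return sum(component_ratings.values()) / len(component_ratings)

ratings = {
    "device encryption": 3,       # hypothetical component names
    "password policy": 4,
    "remote wipe capability": 2,
    "security awareness training": 1,
}
print(round(maturity_score(ratings), 2))  # 2.5
```

A weighted mean, or reporting the per-component profile rather than a single number, would be an equally defensible design choice; the point is only that each component is independently measurable.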
184

The Impact of Information Security Awareness on Compliance with Information Security Policies: a Phishing Perspective

Hanus, Bartlomiej T. 08 1900 (has links)
This research seeks to derive and examine a multidimensional definition of information security awareness, investigate its antecedents, and analyze its effects on compliance with organizational information security policies. These research goals are tested through the theoretical lens of technology threat avoidance theory and protection motivation theory. Information security awareness is defined as a second-order construct composed of the elements of threat and coping appraisals, supplemented by a responsibilities construct to account for the organizational environment. The study is executed in two stages. First, the participants (employees of a municipality) are exposed to a series of phishing and spear-phishing messages to assess whether the phishing victims share any common characteristics. The differences between the phished and non-phished groups are assessed through multiple discriminant analysis. Second, the same individuals are asked to participate in a survey designed to examine their security awareness. The research model is tested using the PLS-SEM approach. The results indicate that security awareness is in fact a second-order formative construct composed of six components. There are significant differences in security awareness levels between the victims of the phishing experiment and the employees who maintain compliance with security policies. The study extends theory by proposing and validating a universal definition of security awareness. It provides practitioners with an instrument to examine awareness in a plethora of settings and to design customized security training activities.
185

Using Spammers' Computing Resources for Volunteer Computing

Bui, Thai Le Quy 13 March 2014 (has links)
Spammers continually look to circumvent counter-measures that seek to slow them down. An immense amount of time and money is currently devoted to hiding spam, but not enough is devoted to effectively preventing it. One approach to preventing spam is to force the spammer's machine to solve a computational problem of varying difficulty before granting access. The idea is that suspicious or problematic requests are given difficult problems to solve, while legitimate requests are allowed through with minimal computation. Unfortunately, most systems that employ this model waste the computing resources being used, as they are directed towards solving cryptographic problems that provide no societal benefit. While systems such as reCAPTCHA and FoldIt have allowed users to contribute solutions to useful problems interactively, an analogous solution for non-interactive proof-of-work does not exist. Towards this end, this paper describes MetaCAPTCHA and reBOINC, an infrastructure for supporting useful proof-of-work that is integrated into a web spam throttling service. The infrastructure dynamically issues CAPTCHAs and proof-of-work puzzles while ensuring that malicious users solve challenging puzzles. Additionally, it provides a framework that enables the computational resources of spammers to be redirected towards meaningful research. To validate the efficacy of our approach, prototype implementations based on OpenCV and BOINC are described that demonstrate the ability to harvest spammers' resources for beneficial purposes.
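The proof-of-work idea above can be sketched with a classic hashcash-style hash puzzle. This is an assumption for illustration only: MetaCAPTCHA's contribution is precisely that it replaces such useless hash puzzles with useful work. The server hands out a seed and a difficulty, the client must find a nonce whose SHA-256 digest starts with a given number of zero hex digits, and verification is cheap regardless of difficulty.

```python
import hashlib
import itertools

def solve(seed: str, difficulty: int) -> int:
    """Brute-force the smallest nonce whose digest has `difficulty`
    leading zero hex digits; expected work grows as 16**difficulty."""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{seed}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

def verify(seed: str, difficulty: int, nonce: int) -> bool:
    """One hash: checking a solution is cheap for the server."""
    digest = hashlib.sha256(f"{seed}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

# A suspicious request would be issued a higher difficulty than a
# legitimate one, throttling the sender proportionally.
nonce = solve("request-123", 2)
assert verify("request-123", 2, nonce)
```

The asymmetry (expensive to solve, one hash to verify) is what lets the throttling service scale while the spammer's machine burns cycles.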
186

Secure data aggregation protocol for sensor networks

Shah, Kavit 20 February 2015 (has links)
Indiana University-Purdue University Indianapolis (IUPUI)
We propose a secure in-network data aggregation protocol with internal verification, to increase the lifespan of the network by preserving bandwidth. For secure internal distributed operations, we show an algorithm for securely computing the sum of sensor readings in the network. Our algorithm can be generalized to any random tree topology and applied to any combination of mathematical functions. In addition, we present an efficient way of performing statistical analysis for the protocol. Furthermore, we propose a novel, distributed and interactive algorithm to trace the adversary and remove it from the network. Finally, we analyze the bandwidth of the protocol and prove its efficiency.
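As a toy sketch of the in-network summation idea (the protocol's security machinery, internal verification and adversary tracing are omitted entirely), each node combines its own reading with the partial sums from its children, so only one aggregate value travels up each link instead of every raw reading:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A sensor node in an arbitrary tree topology."""
    reading: int
    children: list["Node"] = field(default_factory=list)

def aggregate_sum(node: Node) -> int:
    """Each node forwards one partial sum rather than all raw readings;
    this in-network aggregation is where the bandwidth saving comes from."""
    return node.reading + sum(aggregate_sum(c) for c in node.children)

# A small tree: root with two children, one of which has its own child.
tree = Node(1, [Node(2, [Node(4)]), Node(3)])
print(aggregate_sum(tree))  # 10
```

Because the combining step is just addition, the same traversal generalizes to any associative function (min, max, count), which is consistent with the abstract's claim about arbitrary tree topologies and function combinations.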
187

The Next Generation Botnet Attacks And Defenses

Wang, Ping 01 January 2010 (has links)
A "botnet" is a network of compromised computers (bots) controlled by an attacker (the botmaster). Botnets are one of the most serious threats to today's Internet; they are the root cause of many current Internet attacks, such as email spam, distributed denial of service (DDoS) attacks and click fraud. There has been much research on how to detect, monitor, and defend against the botnets that have appeared and their attack techniques. However, it is equally important to investigate possible attack techniques that could be used by the next generation of botnets, and to develop effective defense techniques accordingly, in order to be well prepared for future botnet attacks. In this dissertation, we focus on two areas of next-generation botnet attacks and defenses: peer-to-peer (P2P) structured botnets and the possible honeypot detection techniques used by future botnets. Currently, most botnets have a centralized command and control (C&C) architecture. However, P2P structured botnets have gradually emerged as a new advanced form of botnet. Without C&C servers, P2P botnets are more resilient to defense countermeasures than traditional centralized botnets. Therefore, we first systematically study P2P botnets along multiple dimensions: bot candidate selection, network construction, C&C mechanisms and communication protocols. As a further illustration of P2P botnets, we then present the design of an advanced hybrid P2P botnet, which could be developed by botmasters in the near future. Compared with current botnets, the proposed botnet is harder to shut down, monitor, and hijack. It provides robust network connectivity, individualized encryption and control traffic dispersion, limited botnet exposure by each bot, and easy monitoring and recovery by its botmaster. We suggest and analyze several possible defenses against this advanced botnet. Building on our understanding of P2P botnets, we then turn our focus to P2P botnet countermeasures.
We provide mathematical analysis of two P2P botnet mitigation approaches, index poisoning defense and Sybil defense, and one monitoring technique, passive monitoring, and give analytical results to evaluate their performance. Simulation-based experiments show that our analysis is accurate. Besides P2P botnets, we also investigate honeypot-aware botnets. Because honeypot techniques have been widely used in botnet defense systems, botmasters will have to find ways to detect honeypots in order to protect and secure their botnets. We point out a general honeypot-aware principle: security professionals deploying honeypots have a liability constraint, in that they cannot allow their honeypots to participate in real attacks that could cause damage to others, while attackers need not follow this constraint. Based on this principle, a hardware- and software-independent honeypot detection methodology is proposed. We present possible honeypot detection techniques that can be used in both centralized and P2P botnets. Our experiments show that current standard honeypot and honeynet programs are vulnerable to the proposed detection techniques. We also discuss some guidelines for defending against general honeypot-aware botnet attacks.
188

Implementation business-to-business electronic commerce website using active server pages

Teesri, Sumuscha 01 January 2000 (has links)
E-commerce is the current approach for conducting any type of business online. It uses the power of digital information to understand the requirements and preferences of each client and each partner, to adapt products and services to them, and then to deliver those products and services as swiftly as possible.
190

Investigation and development of a system for secure synchronisation in a wireless mesh network

De Bruyn, Daniel Nicholas January 2010 (has links)
Thesis (M. Tech. (Electrical Engineering)) -- Central University of Technology, Free State, 2010
This dissertation gives an overview of the research done in developing a protocol to synchronise information in a secure wireless mesh network. Alternative methods of controlling wireless devices in the non-controlled frequency spectrum were investigated. The aim of the research was to develop a protocol that can be loaded on a micro-controller with limited intelligence that controls endpoints. The protocol minimises human interference and automatically negotiates which device becomes the master controller. A device is able to discover and locate neighbouring devices in range, can be stationary or mobile, and can host multiple control endpoints. Control endpoints can be digital or analogue, input or output, and belong to a group such as security, lighting or irrigation. These capabilities can change according to the solution's requirements. Control endpoints with the same capabilities must be able to establish a connection with each other. An endpoint has a user-friendly name and can update the remote endpoint with its description. When a connection is established, both endpoints update each other with their user-friendly names and status. A local endpoint can trigger a certain action on a receiving control point. The system was tested with a building monitoring system because such a system is static and less expensive, making it suitable for the evaluation. A simulator for a personal computer was developed to evaluate the new protocol. Finally, the protocol was implemented and tested on a micro-controller platform.
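As a hypothetical illustration of automatic master negotiation (the abstract does not specify the dissertation's actual election rule), a deterministic rule applied to the discovered neighbour set lets every device reach the same conclusion independently, with no human interference:

```python
def elect_master(device_ids):
    """Deterministic negotiation: every device applies the same rule to
    the set of discovered neighbours (including itself), so all devices
    agree on the master without any coordinator. The rule here is simply
    'lowest ID wins', an assumption for illustration."""
    if not device_ids:
        raise ValueError("no devices discovered")
    return min(device_ids)

# Every node runs the same computation over the same discovered set,
# so the outcome is consistent across the mesh.
discovered = {"node-17", "node-03", "node-42"}
print(elect_master(discovered))  # node-03
```

A real mesh protocol must also handle partial discovery and the master disappearing (re-running the election), which is where most of the protocol complexity lives.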
