  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
171

Physical-layer security

Bloch, Matthieu 05 May 2008
As wireless networks continue to flourish worldwide and play an increasingly prominent role, it has become crucial to provide effective solutions to the inherent security issues associated with a wireless transmission medium. Unlike traditional solutions, which usually handle security at the application layer, the primary concern of this thesis is to analyze and develop solutions based on coding techniques at the physical layer. First, an information-theoretically secure communication protocol for quasi-static fading channels was developed and its performance with respect to theoretical limits was analyzed. A key element of the protocol is a reconciliation scheme for secret-key agreement based on low-density parity-check codes, which is specifically designed to operate on non-binary random variables and offers high reconciliation efficiency. Second, the fundamental trade-offs between cooperation and security were analyzed by investigating the transmission of confidential messages to cooperative relays. This information-theoretic study highlighted the importance of jamming as a means to increase secrecy and confirmed the importance of carefully chosen relaying strategies. Third, other applications of physical-layer security were investigated. Specifically, the use of secret-key agreement techniques for alternative cryptographic purposes was analyzed, and a framework for the design of practical information-theoretic commitment protocols over noisy channels was proposed. Finally, the benefit of using physical-layer coding techniques beyond the physical layer was illustrated by studying security issues in client-server networks. A coding scheme exploiting packet losses at the network layer was proposed to ensure reliable communication between clients and servers and security against colluding attackers.
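The secrecy gains studied in this thesis rest on the wiretap-channel model, where information-theoretic security is possible whenever the legitimate receiver's channel is better than the eavesdropper's. As a minimal illustration (not taken from the thesis), the secrecy capacity of a Gaussian wiretap channel is the positive part of the difference between the two channel capacities:

```python
import math

def secrecy_capacity(snr_main, snr_eve):
    """Secrecy capacity (bits per channel use) of a Gaussian wiretap
    channel: the positive part of the difference between the main and
    eavesdropper channel capacities."""
    c_main = math.log2(1 + snr_main)
    c_eve = math.log2(1 + snr_eve)
    return max(0.0, c_main - c_eve)

# Secrecy is possible only when the legitimate channel is better:
print(secrecy_capacity(15.0, 3.0))  # 2.0 (main SNR 15, eavesdropper SNR 3)
print(secrecy_capacity(3.0, 15.0))  # 0.0 (degraded main channel)
```

At equal SNRs the secrecy capacity is zero, which is why fading matters to the thesis: channel variation creates instants where the legitimate channel is the better one, and jamming can be used to degrade the eavesdropper further.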
172

Source authentication in group communication

Al-Ibrahim, Mohamed Hussain January 2005
Title from screen page; viewed 10 Oct 2005. / Thesis (PhD)--Macquarie University, Division of Information and Communication Sciences, Dept. of Computing, 2004. / Bibliography: leaves 163-175. / Introduction -- Cryptographic essentials -- Multicast: structure and security -- Authentication of multicast streams -- Authentication of concast communication -- Authentication of transit flows -- One-time signatures for authenticating group communication -- Authentication of anycast communication -- Authentication of joining operation -- Conclusion and future directions. / Electronic publication; full text available in PDF format. / Multicast is a relatively new and emerging communication mode in which a sender sends a message to a group of recipients in just one connection establishment... reducing broadband overhead and increasing resource utilization in the already congested and contended network... The focus of the research in this area has been in two directions: first, building an efficient routing infrastructure, and second, building a sophisticated security infrastructure. The focus of this work is on the second issue. / An ideal authenticated multicast environment ... provides authenticity for all the communication operations in the system... We ... propose a comprehensive solution to the problem ... for all its possible operations... 1. one-to-one (or joining mode) 2. one-to-many (or broadcast mode) 3. many-to-one (or concast mode) 4. intermediate (or transit mode) ... We study the ... mode known as anycast, in which a server is selected from a group of servers. Further, we develop ... schemes for group-based communication exploiting the distinct features of one-time signatures... cover situations when a threshold number of participants are involved and ... where a proxy signer is required. / Electronic reproduction. / Mode of access: World Wide Web. / Also available in print form.
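One-time signatures, which this thesis adapts for group communication, can be sketched with Lamport's classic construction: the signer commits to hashed preimages and then reveals one preimage per bit of the message digest. The sketch below is illustrative only, not one of the thesis's schemes; it also shows why each key pair may sign only a single message, since every signature reveals half the private key.

```python
import hashlib
import secrets

def keygen(bits=256):
    # Private key: two random 32-byte preimages per digest bit;
    # public key: their SHA-256 hashes.
    sk = [[secrets.token_bytes(32) for _ in range(2)] for _ in range(bits)]
    pk = [[hashlib.sha256(x).digest() for x in pair] for pair in sk]
    return sk, pk

def _digest_bits(msg, n):
    digest = hashlib.sha256(msg).digest()
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(n)]

def sign(msg, sk):
    # Reveal one preimage per bit of the message digest.
    return [sk[i][b] for i, b in enumerate(_digest_bits(msg, len(sk)))]

def verify(msg, sig, pk):
    return all(hashlib.sha256(s).digest() == pk[i][b]
               for i, (s, b) in enumerate(zip(sig, _digest_bits(msg, len(pk)))))

sk, pk = keygen()
sig = sign(b"join group", sk)
print(verify(b"join group", sig, pk))   # True
print(verify(b"leave group", sig, pk))  # False
```

The thesis builds on such primitives for threshold and proxy settings, where several participants, or a delegated signer, must be able to authenticate group traffic.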
173

Social media risks in large and medium enterprises in the Cape Metropole : the role of internal auditors

Gwaka, Leon Tinashe January 2015
Thesis (MTech (Internal Auditing))--Cape Peninsula University of Technology, 2015. / Social media has undoubtedly emerged as one of the greatest developments of this technology-driven generation. Despite existing for years, its popularity has recently surged drastically, with billions of users worldwide reported to be on at least one social media platform. This increase in users has been further facilitated by governmental and private-sector initiatives to boost Internet connectivity and bridge the digital divide globally. Mobile Internet access has also fuelled the use of social media, as it allows easy and economical connectivity anytime, anywhere. The availability of hundreds of social media platforms has presented businesses with several opportunities to conduct business activities on social media. The use of social media has been reported to offer businesses numerous benefits when it is strategically aligned with business objectives. On the flip side, these platforms have also emerged as new hunting grounds for fraudsters and other information-technology-related criminals. As with any invention, engaging social media for business carries its own inherent risks; this compounds existing information-technology risks and presents businesses with new ones. Despite blossoming into a global phenomenon, social media has no universally accepted frameworks or approaches (and thus no safeguards) for engaging with it, resulting in almost unlimited risk exposure. This uncertainty, that is, the risks surrounding social media platforms, makes it problematic to determine the optimum social media platform to use in business. Furthermore, organisations face challenges in deciding whether to formally adopt social media or to ignore it entirely, with the latter emerging as an unviable option.
The complex nature of social media has made it difficult for enterprises to place a monetary value on these platforms and to determine a return on investment. From a governance perspective, it remains a challenge for most enterprises to identify and appoint individuals responsible for social media management, although social media strategist positions have recently been surfacing. By its nature, social media triggers matters relating to governance, risk and compliance, which implies that internal auditors are expected to champion its adoption in enterprises. Because social media is a relatively new concept, the role that internal auditors should play towards it appears not to be well defined. Through examination of existing guidelines, an attempt is made to define the role of internal auditors towards social media.
174

An exploratory study of techniques in passive network telescope data analysis

Cowie, Bradley January 2013
Careful examination of the composition and concentration of malicious traffic in transit on the channels of the Internet provides network administrators with a means of understanding and predicting damaging attacks directed towards their networks. This allows action to be taken to mitigate the effect that these attacks have on the performance of their networks, and on the Internet as a whole, by readying network defences and providing early warning to Internet users. One approach to malicious-traffic monitoring that has garnered some success in recent times, as exhibited by the study of fast-spreading Internet worms, involves analysing data obtained from network telescopes. While some research has considered using measures derived from network telescope datasets to study large-scale network incidents such as Code-Red, SQL Slammer and Conficker, there is very little documented discussion on the merits and weaknesses of approaches to analysing network telescope data. This thesis is an introductory study in network telescope analysis and considers the variables associated with the data received by network telescopes and how these variables may be analysed. The core research of this thesis considers both novel and previously explored analysis techniques from the fields of security metrics, baseline analysis, statistical analysis and technical analysis as applied to network telescope datasets. These techniques were evaluated on their ability to recognise unusual behaviour by identifying notable incidents in network telescope datasets.
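Baseline analysis, one of the technique families the thesis evaluates, can be as simple as a moving-window control chart over daily packet counts: flag any day whose count exceeds the recent mean by several standard deviations. The sketch below uses hypothetical telescope counts, not data from the thesis:

```python
from statistics import mean, stdev

def flag_anomalies(counts, window=7, k=3.0):
    """Flag indices whose count exceeds the moving-baseline mean by
    more than k standard deviations (a simple control-chart rule)."""
    flagged = []
    for i in range(window, len(counts)):
        baseline = counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and counts[i] > mu + k * sigma:
            flagged.append(i)
    return flagged

# Hypothetical daily packet counts; day 10 spikes (e.g. the onset of
# worm scanning activity).
daily = [120, 115, 130, 125, 118, 122, 127, 121, 119, 124, 560, 540]
print(flag_anomalies(daily))  # [10]
```

Note that day 11 escapes detection because day 10's spike inflates its own baseline, exactly the kind of weakness an evaluation of analysis techniques, as conducted in this thesis, is meant to expose.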
175

Towards an evaluation and protection strategy for critical infrastructure

Gottschalk, Jason Howard January 2015
Critical Infrastructure is often overlooked from an Information Security perspective, despite its high importance to protect, which may leave Critical Infrastructure at risk of Cyber-related attacks with potentially dire consequences. Furthermore, what constitutes Critical Infrastructure is often a complex discussion, with varying opinions across audiences. Traditional Critical Infrastructure includes power stations, water and sewage pump stations, gas pipelines and power grids, with the “internet of things” a new entrant. This list is not complete, and a constant challenge exists in identifying Critical Infrastructure and its interdependencies. The purpose of this research is to highlight the importance of protecting Critical Infrastructure and to propose a high-level framework aiding in its identification and securing. To achieve this, key case studies involving Cyber crime and Cyber warfare were identified and discussed, along with attack vectors and their impact on Critical Infrastructure where applicable. Furthermore, industry-related material was researched to identify key controls that would aid in protecting Critical Infrastructure. Initiatives that countries are pursuing to aid in the protection of Critical Infrastructure were also identified and discussed. Research was conducted into the various standards, frameworks and methodologies available to aid in the identification, remediation and, ultimately, the protection of Critical Infrastructure. A key output of the research was the development of a hybrid approach to identifying Critical Infrastructure and its associated vulnerabilities, together with an approach for remediation with specific metrics (based on the research performed). 
The conclusion based on the research is that there is often a need to identify and protect Critical Infrastructure; however, this is usually initiated or driven by non-owners of Critical Infrastructure (governments, governing bodies, standards bodies and security consultants). Furthermore, where there are active initiatives by owners, the suggested approaches are often very high level in nature, with little direct guidance available for very immature environments.
176

A model for security incident response in the South African National Research and Education network

Mooi, Roderick David January 2014
This dissertation addresses the lack of a formal incident response capability in the South African National Research and Education Network (SA NREN). While investigating alternatives, it was found that no clear method exists to solve this problem. Therefore, a second problem is identified: the lack of a definitive method for establishing a Computer Security Incident Response Team (CSIRT) or Computer Emergency Response Team (CERT) in general. Solving the second problem is important, as we then have a means of knowing how to start when building a CSIRT. This sets the basis for addressing the initial problem, resulting in a prepared, improved and coordinated response to IT security incidents affecting the SA NREN. To commence, the requirements for establishing a CSIRT are identified via a comprehensive literature review. These requirements are categorised into five areas, namely the basic business requirements followed by the four Ps of the IT Infrastructure Library (ITIL) -- People, Processes, Products and Partners -- adapted to suit the CSIRT context. Through the use of argumentation, the relationships between the areas are uncovered and explored. Thereafter, a Design Science Research-based process is utilised to develop a generic model for establishing a CSIRT. The model is based on the interactions uncovered between the business requirements and the adapted four Ps. These are summarised through two views -- strategic and tactical -- together forming a holistic model for establishing a CSIRT. The model highlights the decisions required for the business requirements, services, team model and staff, policies and processes, tools and technologies, and partners of a CSIRT respectively. Finally, to address the primary objective, the generic model is applied to the SA NREN environment. Thus, the second artefact is an instantiation: a specific model which can be implemented to create a CSIRT for the SA NREN. 
To produce the specific model, insight into the nature of the SA NREN environment was required. The status quo was revealed through the use of a survey and argumentative analysis of the results. The specific decisions in each area required to establish an SA NREN CSIRT are explored throughout the development of the model. The result is a comprehensive framework for implementing a CSIRT in the SA NREN, detailing the decisions required in each of the areas. This model additionally acts as a demonstration of the utility of the generic model. The implications of this research are twofold. Firstly, the generic model is useful as a basis for anyone wanting to establish a CSIRT. It helps to ensure that all factors are considered and that no important decisions are neglected, thereby enabling a holistic view. Secondly, the specific model for the SA NREN CSIRT serves as a foundation for implementing the CSIRT going forward. It accelerates the process by addressing the important considerations and highlighting the concerns that must be addressed while establishing the CSIRT.
177

Digital forensic model for computer networks

Sanyamahwe, Tendai January 2011
The Internet has become important since information is now stored in digital form and transported in large volumes, both within and between organisations, through computer networks. Nevertheless, there are individuals or groups who utilise the Internet to harm other businesses because they can remain relatively anonymous. To prosecute such criminals, forensic practitioners have to follow a well-defined procedure to convict responsible cyber-criminals in a court of law. Log files provide significant digital evidence in computer networks when tracing cyber-criminals. Network log mining is an evolution of typical digital forensics that utilises evidence from network devices such as firewalls, switches and routers. Network log mining is a process supported by applicable South African laws such as the Computer Evidence Act, 57 of 1983; the Electronic Communications and Transactions (ECT) Act, 25 of 2002; and the Electronic Communications Act, 36 of 2005. International laws and regulations supporting network log mining include the Sarbanes-Oxley Act and the Foreign Corrupt Practices Act (FCPA) of the USA, and the Bribery Act of the UK. A digital forensic model for computer networks, focusing on network log mining, has been developed based on the literature reviewed and critical thought. The development of the model followed the Design Science methodology. However, this research project argues that some important aspects are not fully addressed by the South African legislation supporting digital forensic investigations. With that in mind, this research project proposes some Forensic Investigation Precautions, developed as part of the proposed model. The Diffusion of Innovations (DOI) Theory is the framework underpinning the development of the model and how it can be assimilated into the community. 
The model was sent to IT experts for validation, which provided the qualitative element and the primary data of this research project. These experts found the proposed model to be unique and comprehensive, and to add new knowledge to the field of Information Technology. A paper was also produced from this research project.
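Network log mining of the kind the model formalises typically begins by parsing device logs and aggregating events per source, so that repeated probes can be traced back to an offending host. A hypothetical sketch follows; the log format and field layout are invented for illustration, as real formats vary by vendor:

```python
import re
from collections import Counter

# Hypothetical firewall log lines (format invented for illustration).
LOGS = [
    "2011-06-01 10:02:11 DENY TCP 203.0.113.7:4432 -> 10.0.0.5:22",
    "2011-06-01 10:02:12 ALLOW TCP 198.51.100.3:5820 -> 10.0.0.5:80",
    "2011-06-01 10:02:13 DENY TCP 203.0.113.7:4433 -> 10.0.0.5:23",
    "2011-06-01 10:02:15 DENY TCP 203.0.113.7:4434 -> 10.0.0.5:3389",
    "2011-06-01 10:02:16 DENY UDP 192.0.2.44:9999 -> 10.0.0.5:161",
]

LINE = re.compile(r"(DENY|ALLOW)\s+\w+\s+([\d.]+):\d+\s+->\s+[\d.]+:(\d+)")

def denied_sources(lines):
    """Count DENY events per source IP, a first step in tracing
    repeated probes back to an offending host."""
    hits = Counter()
    for line in lines:
        m = LINE.search(line)
        if m and m.group(1) == "DENY":
            hits[m.group(2)] += 1
    return hits

print(denied_sources(LOGS).most_common(1))  # [('203.0.113.7', 3)]
```

In a forensic setting the precautions the model proposes would apply around such processing, for example preserving the original log files and documenting the chain of custody before any mining begins.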
178

A model to measure the maturity of smartphone security at software consultancies

Allam, Sean January 2009
Smartphones are proliferating into the workplace at an ever-increasing rate, and the threats they pose are increasing similarly. In an era of constant connectivity and availability, information is freed from the constraints of time and place. This research project delves into the risks introduced by smartphones and, through multiple case studies, formulates a maturity measurement model. The model is based on recommendations from two leading information security frameworks: the COBIT 4.1 framework and the ISO 27002 code of practice. Ultimately, smartphone-specific risks are integrated with key control recommendations to provide a set of key measurable security maturity components. The subjective opinions of case-study respondents are considered a key component in achieving a solution. The solution addresses the concerns not only of policy makers, but also of the employees subject to the security policies. Nurturing security awareness into organisational culture through reinforcement and employee acceptance is highlighted in this research project. Software consultancies can use this model to mitigate risks while harnessing the potential strategic advantages of mobile computing through smartphone devices. In addition, this research project identifies the critical components of a smartphone security solution. The model is aimed at software consultancies because of the intense reliance on information within these organisations, but it can be effectively applied to any information-intensive organisation.
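A maturity measurement of this kind typically rates each measurable component on a 0-5 scale, as in COBIT 4.1's maturity model, and aggregates the ratings. A hypothetical sketch, with component names and scores invented for illustration rather than drawn from the thesis:

```python
# Hypothetical smartphone-security components rated on a 0-5
# COBIT-style maturity scale (names and scores invented).
components = {
    "device encryption": 3,
    "remote wipe": 2,
    "password policy": 4,
    "patch management": 1,
    "user awareness": 2,
}

def maturity(scores):
    """Overall maturity as the mean component rating; the weakest
    component is reported separately, since a single gap (e.g. no
    patching) can undermine otherwise mature controls."""
    overall = sum(scores.values()) / len(scores)
    weakest = min(scores, key=scores.get)
    return overall, weakest

overall, weakest = maturity(components)
print(round(overall, 1), weakest)  # 2.4 patch management
```

Reporting the weakest component alongside the mean reflects the model's emphasis on identifying critical components rather than a single headline score.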
179

The Impact of Information Security Awareness on Compliance with Information Security Policies: a Phishing Perspective

Hanus, Bartlomiej T. 08 1900
This research seeks to derive and examine a multidimensional definition of information security awareness, investigate its antecedents, and analyze its effects on compliance with organizational information security policies. These research goals are tested through the theoretical lenses of technology threat avoidance theory and protection motivation theory. Information security awareness is defined as a second-order construct composed of the elements of threat and coping appraisals, supplemented by a responsibilities construct to account for the organizational environment. The study is executed in two stages. First, the participants (employees of a municipality) are exposed to a series of phishing and spear-phishing messages to assess whether there are any common characteristics shared by the phishing victims. The differences between the phished and non-phished groups are assessed through multiple discriminant analysis. Second, the same individuals are asked to participate in a survey designed to examine their security awareness. The research model is tested using the PLS-SEM approach. The results indicate that security awareness is in fact a second-order formative construct composed of six components, and that there are significant differences in security awareness levels between the victims of the phishing experiment and the employees who maintain compliance with security policies. The study extends theory by proposing and validating a universal definition of security awareness, and it provides practitioners with an instrument to examine awareness in a plethora of settings and to design customized security training activities.
180

Using Spammers' Computing Resources for Volunteer Computing

Bui, Thai Le Quy 13 March 2014
Spammers are continually looking to circumvent the counter-measures seeking to slow them down. An immense amount of time and money is currently devoted to hiding spam, but not enough is devoted to effectively preventing it. One approach to preventing spam is to force the spammer's machine to solve a computational problem of varying difficulty before granting access. The idea is that suspicious or problematic requests are given difficult problems to solve, while legitimate requests are allowed through with minimal computation. Unfortunately, most systems that employ this model waste the computing resources being used, as they are directed towards solving cryptographic problems that provide no societal benefit. While systems such as reCAPTCHA and FoldIt have allowed users to contribute solutions to useful problems interactively, an analogous solution for non-interactive proof-of-work does not exist. Towards this end, this paper describes MetaCAPTCHA and reBOINC, an infrastructure for supporting useful proof-of-work that is integrated into a web spam-throttling service. The infrastructure dynamically issues CAPTCHAs and proof-of-work puzzles while ensuring that malicious users solve challenging puzzles. Additionally, it provides a framework that enables the computational resources of spammers to be redirected towards meaningful research. To validate the efficacy of our approach, prototype implementations based on OpenCV and BOINC are described that demonstrate the ability to harvest spammers' resources for beneficial purposes.
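The conventional proof-of-work that MetaCAPTCHA replaces with useful computation can be illustrated by a hashcash-style puzzle: the server demands a nonce whose hash has a prescribed number of leading zero digits, scaling the difficulty to how suspicious the request looks. A minimal sketch, not the paper's implementation:

```python
import hashlib
from itertools import count

def solve_pow(challenge: bytes, difficulty: int) -> int:
    """Find a nonce such that SHA-256(challenge || nonce) starts with
    `difficulty` zero hex digits. Expected work grows as 16**difficulty,
    so the service can scale cost to how suspicious a request looks."""
    for nonce in count():
        digest = hashlib.sha256(challenge + str(nonce).encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce

def verify_pow(challenge: bytes, nonce: int, difficulty: int) -> bool:
    # Verification costs a single hash, regardless of difficulty.
    digest = hashlib.sha256(challenge + str(nonce).encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = solve_pow(b"mail-from:spammer@example.com", 4)
print(verify_pow(b"mail-from:spammer@example.com", nonce, 4))  # True
```

The asymmetry between solving and verifying is what makes throttling work; the paper's contribution is to redirect the solving side from such throwaway hashing towards computation with research value.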
