11

A dynamic distributed trust model to control access to resources over the Internet

Lei, Hui. 10 April 2008
The access control mechanisms used in traditional security infrastructures, such as ACLs and password applications, have proven inadequate, inflexible, and difficult to apply on the Internet because of its sheer scale. Recently, research into expressing trust information in the digital world has been explored as a complement to security mechanisms. This thesis deals with access control for resources provided over the Internet; online digital content services are an example of such an application. In this work, we have concentrated on the idea of a trust management system, first proposed by Blaze et al. in 1996, and we have proposed a general-purpose, application-independent Dynamic Distributed Trust Model (DDTM). In our DDTM, access rights are directly associated with a trust value. The trust values in this thesis are further classified into direct trust values, indirect trust values and trust authorization levels, and each type of trust value is calculated and expressed as an explicit numerical value. The core of this model is the recommendation-based trust model, organized as a Trust Delegation Tree (TDT), and the authorization delegation realized by delegation certificate chains. Moreover, the DDTM provides a distributed, key-oriented certificate-issuing mechanism with no centralized global authority. A Dynamic Distributed Trust Protocol (DDTP) was developed as a general protocol for establishing and managing trust relationships in a TDT structure. The protocol was verified with the verification tool SPIN and prototyped to simulate communication and behaviour among the certificate issuer nodes on a TDT.
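The abstract does not give the model's formulas, but its core idea, deriving an indirect trust value for an unknown party from a chain of recommendations in a delegation tree, can be sketched as follows. This is a minimal illustration only: the [0, 1] trust scale, the multiplicative aggregation along the chain, and the 0.7 authorization threshold are assumptions for the sketch, not the DDTM's actual definitions.

```python
# Minimal sketch of recommendation-based trust aggregation along a
# delegation chain, in the spirit of a Trust Delegation Tree.
# Assumptions (not from the thesis): trust values lie in [0, 1] and
# indirect trust is the product of direct trust values along the chain.

class TrustNode:
    def __init__(self, name):
        self.name = name
        self.direct_trust = {}  # neighbour name -> direct trust in [0, 1]

    def trust(self, neighbour, value):
        self.direct_trust[neighbour] = value


def indirect_trust(nodes, chain):
    """Aggregate trust along a recommendation chain, e.g. A -> B -> C."""
    value = 1.0
    for issuer, subject in zip(chain, chain[1:]):
        value *= nodes[issuer].direct_trust.get(subject, 0.0)
    return value


nodes = {n: TrustNode(n) for n in "ABC"}
nodes["A"].trust("B", 0.9)   # A directly trusts recommender B
nodes["B"].trust("C", 0.8)   # B recommends C

score = indirect_trust(nodes, ["A", "B", "C"])
print(score)                                  # 0.72
print("grant" if score >= 0.7 else "deny")    # hypothetical authorization level
```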
12

Novel framework to support information security audit in virtual environment

Nagarle Shivashankarappa, A. January 2013
Over the years, the focus of information security has evolved from a technical issue to a business issue. Heightened competition from globalization, compounded by emerging technologies such as cloud computing, has given rise to new threats and vulnerabilities which are not only complex but unpredictable. However, there are enormous opportunities which can bring value to business and enhance stakeholders' wealth. Enterprises in Oman are compelled to embark on the e-Oman strategy, which invariably increases complexity due to the integration of heterogeneous systems and outsourcing with external business partners. This implies a need for a comprehensive model that integrates people, processes and technology and provides enterprise information security focused on organizational transparency and enhanced business value. It was evident from interviews with security practitioners that existing security models and frameworks are inadequate to meet the dynamic nature of the threats and challenges inherent in virtualization technology, which is a catalyst for cloud computing. Hence the intent of this research is to evaluate enterprise information security in Oman and explore the potential of building a balanced model that aligns governance, risk management and compliance, with emphasis on auditing in virtual environments. An integrated enterprise governance, risk and compliance model was developed in which enterprise risk management acts as a platform, both mitigating risk on one hand and serving as a framework for defining cost controls and quantifying revenue opportunities on the other. Further, security standards and frameworks were evaluated and some limitations identified. A framework for implementing IT governance focusing on critical success factors was developed after analysing and mapping the four domains of COBIT against various best practices. Server virtualization using bare-metal architecture was practically tested, providing fault tolerance and automated load balancing with enhanced security. A taxonomy of risks inherent in virtual environments was identified, and an audit process flow was devised that gives auditors insight for assessing the adequacy of controls in a virtual environment. The contribution of this research is a novel framework for a successful audit in a virtual environment, one that has changed some of the security assumptions and audit controls applied in such environments.
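As a rough flavour of the kind of risk taxonomy and control-adequacy check an auditor might run against a virtual environment, consider the toy risk register below. The risk items and the likelihood-times-impact scoring are illustrative assumptions, not the framework developed in the thesis.

```python
# Toy risk register for a virtualized environment. The risk items and
# the likelihood x impact scoring model are illustrative assumptions,
# not the audit framework from the thesis.

risks = [
    # (risk, likelihood 1-5, impact 1-5, control in place?)
    ("VM escape / hypervisor compromise",        2, 5, True),
    ("Inter-VM traffic invisible to network IDS", 4, 3, False),
    ("VM sprawl and unpatched dormant images",    4, 3, False),
    ("Single host failure takes down many VMs",   3, 4, True),
]

for risk, likelihood, impact, controlled in sorted(
        risks, key=lambda r: r[1] * r[2], reverse=True):
    score = likelihood * impact
    status = "control adequate" if controlled else "FINDING: control gap"
    print(f"{score:>2}  {risk:<45} {status}")
```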
13

Assessing program code through static structural similarity

Naude, Kevin Alexander January 2007
Learning to write software requires much practice and frequent assessment. Consequently, the use of computers to assist in the assessment of computer programs has been important in supporting large classes at universities. The main approaches to the problem are dynamic analysis (testing student programs for expected output) and static analysis (direct analysis of the program code). The former is very sensitive to all kinds of errors in student programs, while the latter has traditionally only been used to assess quality, and not correctness. This research focuses on the application of static analysis, particularly structural similarity, to marking student programs. Existing traditional measures of similarity are limited in that they are usually only effective on tree structures, and so do not easily support the dependencies found in program code. Contemporary measures of structural similarity, such as similarity flooding, usually rely on an internal normalisation of scores. The effect is that the scores only have relative meaning and cannot be interpreted in isolation, i.e. they are not meaningful for assessment. The SimRank measure is shown to have the same problem, but not because of normalisation: its scores depend on all possible mappings between the children of the vertices being compared. The main contribution of this research is a novel graph similarity measure, the Weighted Assignment Similarity measure. It is related to SimRank, but derives propagation scores from only the locally optimal mapping between child vertices. The resulting similarity scores may be regarded as the percentage of mutual coverage between graphs. The measure is proven to converge for all directed acyclic graphs, and an efficient implementation is outlined for this case. Attributes on graph vertices and edges are often used to capture domain-specific information which is not structural in nature. It has been suggested that these should influence the similarity propagation, but no clear method for doing this has been reported. The second important contribution of this research is a general method for incorporating these local attribute similarities into the larger similarity propagation method. An example of attributes in program graphs is identifier names. The choice of identifiers in programs is arbitrary, as they are purely symbolic, so any comparison between programs faces the problem that they are unlikely to use the same set of identifiers, and a mapping between the identifier sets is required. The third contribution of this research is a method for applying the structural similarity measure in a two-step process to find an optimal identifier mapping. This approach is both novel and valuable, as it cleverly reuses the similarity measure as an existing resource. In general, programming assignments allow a large variety of solutions. Assessing student programs through structural similarity is only feasible if the diversity in the solution space can be addressed. This study narrows program diversity through a set of semantics-preserving program transformations that convert programs into a normal form. The application of the Weighted Assignment Similarity measure to marking student programs is investigated, and strong correlations are found with the human marker. It is shown that the most accurate assessment requires that programs be compared not only with a set of good solutions, but with a mixed set of programs of varying levels of correctness. This research represents the first documented successful application of structural similarity to the marking of student programs.
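The key idea, scoring a pair of vertices by the locally optimal (maximum-weight) assignment between their children rather than by all possible mappings, can be sketched as follows. This is a simplified illustration: the fixed iteration count, the leaf base case, and normalisation by the larger child count (to get the "mutual coverage" reading) are assumptions, not the thesis's exact measure.

```python
# Sketch of similarity propagation in the spirit of the Weighted
# Assignment Similarity measure: each vertex pair is scored via the
# optimal assignment between their children's similarity scores.
import numpy as np
from scipy.optimize import linear_sum_assignment

def similarity(children_a, children_b, iterations=10):
    """children_x: dict vertex -> list of child vertices (a DAG)."""
    va, vb = list(children_a), list(children_b)
    sim = {(x, y): 1.0 for x in va for y in vb}  # optimistic start

    for _ in range(iterations):
        new_sim = {}
        for x in va:
            for y in vb:
                cx, cy = children_a[x], children_b[y]
                if not cx and not cy:
                    new_sim[(x, y)] = 1.0   # two leaves match fully
                    continue
                if not cx or not cy:
                    new_sim[(x, y)] = 0.0
                    continue
                # Score matrix between children, solved as an assignment
                # problem so each child maps to at most one counterpart.
                m = np.array([[sim[(u, v)] for v in cy] for u in cx])
                rows, cols = linear_sum_assignment(m, maximize=True)
                # Normalise by the larger child count: unmatched children
                # reduce the score ("mutual coverage" interpretation).
                new_sim[(x, y)] = m[rows, cols].sum() / max(len(cx), len(cy))
        sim = new_sim
    return sim

# Two small expression-like DAGs
g1 = {"+": ["a", "b"], "a": [], "b": []}
g2 = {"+": ["a", "b", "c"], "a": [], "b": [], "c": []}
print(similarity(g1, g2)[("+", "+")])  # ~0.667: two of three children covered
```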
14

Effective monitoring of slow suspicious activities on computer networks

Kalutarage, H. K. January 2013
Slow and suspicious activities on modern computer networks are increasingly hard to detect. An attacker may take days, weeks or months to complete an attack life cycle. A particular challenge is to monitor for stealthy attempts deliberately designed to stay beneath detection thresholds. This doctoral research presents a theoretical framework for effective monitoring of such activities. The main contribution of this work is a scalable monitoring scheme, proposed in a Bayesian framework, which allows for detection of multiple attackers by setting a threshold using Grubbs' test. The second contribution is a tracing algorithm for such attacks: network paths from a victim to its immediate visible hops are mapped and profiled in a Bayesian framework, and the highest-scored path is prioritised for monitoring. The third contribution explores an approach to minimising data collection by employing traffic sampling, using stratified sampling with the optimum allocation method. A 10% sampling rate was sufficient to detect the simulated attackers, and some network parameters were found to affect the sampling error. The final contribution is a target-centric monitoring scheme to detect nodes under attack. The target-centric approach is quicker at detecting stealthy attacks and has the potential to detect collusion, as it is completely independent of source information. Experiments are carried out in a simulated environment using the network simulator NS3; anomalous traffic is generated along with normal traffic, within and between networks, using a Poisson arrival model. This work addresses a key problem of network security monitoring: a scalable monitoring scheme for slow and suspicious activities. The state kept per node is a single score, so storage is feasible even for very large networks.
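The thresholding step, picking out anomalously high node scores with Grubbs' test, can be sketched in a few lines. The node scores and significance level below are made up for illustration; the Bayesian score accumulation that produces them in the thesis is not shown.

```python
# Sketch of flagging an anomalous node score with Grubbs' test, as used
# to set the detection threshold over accumulated per-node scores.
import numpy as np
from scipy import stats

def grubbs_outlier(scores, alpha=0.05):
    """Return the index of the most extreme score if it fails Grubbs' test."""
    x = np.asarray(scores, dtype=float)
    n = len(x)
    idx = int(np.argmax(np.abs(x - x.mean())))
    g = abs(x[idx] - x.mean()) / x.std(ddof=1)
    # Critical value for the two-sided test at significance level alpha
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return idx if g > g_crit else None

node_scores = [0.8, 1.1, 0.9, 1.0, 0.7, 4.9, 1.2, 0.95]  # node 5 is suspicious
print(grubbs_outlier(node_scores))  # 5
```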
15

Image understanding for automatic human and machine separation

Romero Macias, Cristina January 2013
The research presented in this thesis aims to extend the capabilities of human interaction proofs in order to improve security in web applications and services. The research focuses on developing a more robust and efficient Completely Automated Public Turing test to tell Computers and Humans Apart (CAPTCHA) to increase the gap between human recognition and machine recognition. Two main novel approaches are presented, each targeting a different area of human and machine recognition: a character recognition test and an image recognition test. Along with the novel approaches, a categorisation of the available CAPTCHA methods is also introduced. The character recognition CAPTCHA is based on the creation of depth perception by using shadows to represent characters. The characters are formed by the imaginary shadows produced by a light source, on the basis of the Gestalt principle that human beings can perceive whole forms instead of just a collection of simple lines and curves. This approach was developed in two stages: firstly, two-dimensional characters, and secondly, three-dimensional character models. The image recognition CAPTCHA is based on the creation of cartoons out of faces. The faces used belong to people in the entertainment business, politicians, and sportsmen. The principal basis of this approach is that face perception is a cognitive process that humans perform easily and with a high rate of success. The process involves the use of face morphing techniques to distort the faces into cartoons, allowing the resulting image to be more robust against machine recognition. Exhaustive tests on both approaches using OCR software, SIFT image recognition, and face recognition software show an improvement in the human recognition rate, whilst preventing robots from breaking through the tests.
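A crude two-dimensional illustration of the shadow idea is to render a word, keep only the offset shadow it would cast, and erase the glyphs themselves, so the viewer must perceptually complete the forms. The toy sketch below uses Pillow; the thesis's actual generator, including the three-dimensional character models, is far more elaborate.

```python
# Toy "characters from shadows" sketch: keep only the shifted shadow of
# the rendered glyphs and drop the glyph pixels themselves, so the whole
# form must be completed perceptually by the viewer.
from PIL import Image, ImageDraw, ImageFont
import numpy as np

def shadow_captcha(word, offset=(4, 4)):
    img = Image.new("L", (220, 70), 0)
    draw = ImageDraw.Draw(img)
    font = ImageFont.load_default()          # stand-in; any TTF would do
    draw.text((20, 20), word, fill=255, font=font)
    glyphs = np.array(img, dtype=bool)

    shadow = np.zeros_like(glyphs)
    dy, dx = offset
    shadow[dy:, dx:] = glyphs[:-dy, :-dx]    # shift the glyph mask
    shadow &= ~glyphs                        # keep only shadow, erase glyphs

    return Image.fromarray((shadow * 255).astype(np.uint8))

shadow_captcha("W7KQ").save("shadow_captcha.png")
```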
16

IP traceback marking scheme based DDoS defense.

January 2005
Ping Yan. Thesis submitted in December 2004. Thesis (M.Phil.)--Chinese University of Hong Kong, 2005. Includes bibliographical references (leaves 93-100). Abstracts in English and Chinese.

Contents:
1. Introduction: The Problem; Research Motivations and Objectives; The Rationale; Thesis Organization
2. Background Study: Distributed Denial of Service Attacks (Taxonomy of DoS and DDoS Attacks); IP Traceback (Assumptions; Problem Model and Performance Metrics); IP Traceback Proposals (Probabilistic Packet Marking (PPM); ICMP Traceback Messaging; Logging; Tracing Hop-by-hop; Controlled Flooding); DDoS Attack Countermeasures (Ingress/Egress Filtering; Route-based Distributed Packet Filtering (DPF); IP Traceback Based Intelligent Packet Filtering; Source-end DDoS Attack Recognition and Defense; Classification of DDoS Defense Methods)
3. Adaptive Packet Marking Scheme: Scheme Overview; Design Motivation; Marking Algorithm Basics; Domain id Marking; Router id Marking; Attack Graph Reconstruction; IP Header Overloading; Experiments on the Packet Marking Scheme (Simulation Set-up; Experimental Results and Analysis)
4. DDoS Defense Schemes: Scheme I, Packet Filtering at Victim-end (Packet Marking Scheme Modification; Packet Filtering Algorithm; Determining the Filtering Probabilities; Suppressing Packets Filtering with did Markings from Nearby Routers); Scheme II, Rate Limiting at the Sources (Algorithm of the Rate-limiting Scheme); Performance Measurements for Scheme I and Scheme II
5. Conclusion: Contributions; Discussion and Future Work
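For background, the basic probabilistic packet marking (PPM) idea that marking-based traceback schemes such as the adaptive scheme of chapter 3 build on can be simulated in a few lines. This is a toy single-path simulation: the explicit distance field written by the marking router is a simplification, and the thesis's two-level domain-id/router-id marking and attack-graph reconstruction are more involved.

```python
# Toy probabilistic packet marking: each router on the path overwrites
# the mark with probability p, so the victim can reconstruct the path
# from enough marked packets (closer routers mark the surviving value
# more often).
import random
from collections import Counter

def send_packet(path, p=0.04):
    mark = None
    for hop, router in enumerate(path):
        if random.random() < p:
            mark = (router, len(path) - hop)   # router id, hops to victim
    return mark

path = ["R1", "R2", "R3", "R4", "R5"]          # attacker -> victim
marks = Counter(m for _ in range(100_000) if (m := send_packet(path)))

# Reconstruct: order marked routers by their recorded distance
for (router, distance), count in sorted(marks.items(),
                                        key=lambda kv: -kv[0][1]):
    print(distance, router, count)
```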
17

Cyber Power and the International System

Lonergan, Shawn William January 2017
This dissertation comprises three separate papers that address how cyber power contributes to national power and the implications for international security posed by cyber operations. The first paper, “Cyber Power and International Stability: Assessing Deterrence and Escalation in Cyberspace,” posits that there are unique attributes that define the cyber domain and that have direct implications for deterrence and escalation dynamics between state actors. The second paper, “Arms Control and Confidence Building Measures for the Cyber Domain,” explores various mechanisms that states have traditionally used to foster stability and prevent inadvertent conflict, and assesses their applicability to controlling cyber operations. Finally, “The Logic of Coercion in Cyberspace” delves into the role of cyber operations as both inadvertent and deliberate signals and assesses their utility as a coercive instrument of statecraft.
18

Parents' use of strategies to monitor children's activities online

Maserumule, Ngwanadira Tebogo January 2017
Thesis (M.Com. (Information Systems))--University of the Witwatersrand, Faculty of Commerce, Law and Management, School of Economic and Business Sciences, 2017 / Although studies have been conducted on the effectiveness of different types of filtering software, limited knowledge is available on parents' use of strategies to monitor their children's activities online. Thus, understanding parents' use of such strategies, and the extent to which parents use content filtering software, will contribute to the body of knowledge. The purpose of this study is to understand parents' use of strategies to monitor children's activities online and the extent to which they use content filtering software in Gauteng Province, South Africa. The study adopted Social Cognitive Theory to develop a conceptual framework and identify existing theoretical concepts; the conceptual framework adapts Bandura's (2001) framework to inform data analysis. Data were collected through semi-structured interviews, and qualitative thematic content analysis was used for data analysis. The results indicated that parents do use various strategies to monitor children's activities online, drawing on knowledge, experience and social support as a rationale for those strategies. The study further revealed a gap between parents, the technology industry and government regarding the use of content filtering software. It therefore recommends that parents, industry and government work together to protect children online through various strategies, and address the concerns regarding the use of content filtering software. Parents need to understand the importance of content filtering software and discuss it with their children, so that they can protect them online without restricting access to relevant information. Keywords: harmful content, blocking, filtering, online content, software, use, non-use, strategies
19

Data privacy: the non-interactive setting

Narayanan, Arvind, 1981- 16 October 2012
The Internet has enabled the collection, aggregation and analysis of personal data on a massive scale. It has also enabled the sharing of collected data in various ways: wholesale outsourcing of data warehousing, partnering with advertisers for targeted advertising, data publishing for exploratory research, etc. This has led to complex privacy questions related to the leakage of sensitive user data and mass harvesting of information by unscrupulous parties. These questions have information-theoretic, sociological and legal aspects and are often poorly understood. There are two fundamental paradigms for how the data is released: in the interactive setting, the data collector holds the data while third parties interact with the data collector to compute some function on the database. In the non-interactive setting, the database is somehow "sanitized" and then published. In this thesis, we conduct a thorough theoretical and empirical investigation of privacy issues involved in non-interactive data release. Both settings have been well analyzed in the academic literature, but the simplicity of the non-interactive paradigm has resulted in its being used almost exclusively in actual data releases. We analyze several common applications including electronic directories, collaborative filtering and recommender systems, and social networks. Our investigation has two main foci. First, we present frameworks for privacy and anonymity in these different settings within which one might define exactly when a privacy breach has occurred. Second, we use these frameworks to experimentally analyze actual large datasets and quantify privacy issues. The picture that has emerged from this research is a bleak one for non-interactivity. While a surprising level of privacy control is possible in a limited number of applications, the general sense is that protecting privacy in the non-interactive setting is not as easy as intuitively assumed in the absence of rigorous privacy definitions. While some applications can be salvaged either by moving to an interactive setting or by other means, in others a rethinking of the trade-offs between utility and privacy that are currently taken for granted appears to be necessary.
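To give a flavour of the experimental side, re-identifying a record in a sparse "sanitized" dataset from auxiliary information can be scored roughly as below. This is an illustrative sketch only: the data, the inverse-popularity weights, and the eccentricity-style threshold are assumptions, not the thesis's algorithms.

```python
# Sketch of similarity-based re-identification against a sparse,
# "sanitized" dataset: score each anonymised record against auxiliary
# information and flag a match only if the best score clearly beats
# the runner-up.

POPULARITY = {"blockbuster": 1000, "cult_film": 12, "obscure_doc": 3}

def match_score(aux, record):
    # Rarer attributes carry more weight; here weight = 1 / popularity.
    return sum(1.0 / POPULARITY[item] for item in aux if item in record)

anon_db = {
    "user_17": {"blockbuster", "cult_film", "obscure_doc"},
    "user_42": {"blockbuster"},
    "user_99": {"blockbuster", "cult_film"},
}
aux = {"cult_film", "obscure_doc"}            # gleaned from a public profile

scores = {u: match_score(aux, r) for u, r in anon_db.items()}
best, runner_up = sorted(scores.values(), reverse=True)[:2]
if best - runner_up > 0.2:                    # hypothetical threshold
    print(max(scores, key=scores.get), "re-identified")  # user_17
```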
20

Delegation of rights using PKI-based components

Cheung, Lai-sze (張麗詩). January 2004
Thesis (M.Phil.)--Computer Science. Published or final version; abstract and table of contents available.
