1

A quantitative measure of the security risk level of enterprise networks

Munir, Rashid, Pagna Disso, Jules F., Awan, Irfan U., Mufti, Muhammad R. January 2013 (has links)
Along with the tremendous expansion of information technology and networking, the number of malicious attacks that disrupt business processes has concurrently increased. Despite such attacks, the aim for network administrators is to enable these systems to continue delivering the services they are intended for. Currently, many research efforts are directed towards securing networks further, whereas little attention has been given to quantifying network security, which involves assessing how vulnerable these systems are to attack. In this paper, a method is devised to quantify the security level of IT networks. This is achieved by electronically scanning the network with a vulnerability scanning tool (Nexpose) to identify the vulnerability level at each node, classified according to the Common Vulnerability Scoring System standards (critical, severe and moderate). A probabilistic approach is then applied to calculate an overall security risk level for each sub-network and for the entire network. It is hoped that these metrics will enable any network administrator to obtain an absolute risk assessment value for the network. The suggested methodology has been applied to the computer network of an existing UK organization with 16 nodes and a switch.
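The kind of probabilistic roll-up described in this abstract can be sketched in a few lines. The sketch below is a minimal illustration only, not the paper's actual formula: the per-class exploitation probabilities are hypothetical, findings are treated as independent, and sub-network and network risk are taken as the probability that at least one node is compromised.

```python
from math import prod

# Hypothetical exploitation probabilities per CVSS-derived class;
# the values actually used in the paper are not reproduced here.
CLASS_PROB = {"critical": 0.9, "severe": 0.5, "moderate": 0.2}

def node_risk(findings):
    """Probability that at least one vulnerability on a node is exploited,
    assuming independent findings. `findings` maps class -> count."""
    p_safe = prod((1 - CLASS_PROB[c]) ** n for c, n in findings.items())
    return 1 - p_safe

def aggregate_risk(node_risks):
    """Risk of a sub-network (or the whole network): the probability that
    at least one of its nodes is compromised."""
    return 1 - prod(1 - p for p in node_risks)

# Toy scan output for a three-node sub-network (counts per severity class).
scan = {
    "web01": {"critical": 1, "severe": 2, "moderate": 4},
    "db01":  {"critical": 0, "severe": 1, "moderate": 2},
    "ws07":  {"critical": 0, "severe": 0, "moderate": 1},
}

risks = {host: node_risk(f) for host, f in scan.items()}
print(risks)
print("sub-network risk:", round(aggregate_risk(risks.values()), 3))
```

Independence is a strong assumption; the point is only to show how per-node counts classified by severity can roll up into a single risk figure for a sub-network or the whole network.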
2

Adapting the Single-Request/Multiple-Response Message Exchange Pattern to Web Services

Ruth, Michael 20 May 2005 (has links)
Single-Request/Multiple-Response (SRMR) is an important message exchange pattern because it can be used to model many real-world problems elegantly. However, SRMR messaging is not directly supported by Web services, and, since it requires callback to function, it is hampered by security schemes common in practice, such as firewalls and proxy servers. In this thesis, a framework is proposed to support SRMR and callback in the context of Web services and the realities of network security. The central component of the proposed solution is a Clearinghouse Web service (CWS), which serves as a communication proxy and correlates responses with requests. Only one CWS is needed per enterprise, regardless of the number of SRMR Web services and clients it handles. Using the framework and the related code-generation utilities, a non-trivial case study, a Purchase Order System, has been implemented.
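The correlation role of the Clearinghouse Web service can be sketched as follows. The class and method names (ClearingHouse, register_request, on_response) are illustrative assumptions rather than the thesis's actual interface; the point is that a single proxy component hands out a correlation ID with each outbound request and routes every asynchronous response back to the matching waiter, so clients never need to accept inbound connections through their firewall.

```python
import uuid
from typing import Any, Callable, Dict

class ClearingHouse:
    """Toy stand-in for the Clearinghouse Web service (CWS): it correlates
    multiple asynchronous responses with the single request that caused them."""

    def __init__(self) -> None:
        # correlation_id -> callback that delivers responses to the requester
        self.pending: Dict[str, Callable[[Any], None]] = {}

    def register_request(self, deliver: Callable[[Any], None]) -> str:
        """Called when a client sends an SRMR request; returns the correlation
        ID that responders must echo back with each response."""
        correlation_id = str(uuid.uuid4())
        self.pending[correlation_id] = deliver
        return correlation_id

    def on_response(self, correlation_id: str, payload: Any) -> None:
        """Called for every incoming response; routes it to the right client."""
        deliver = self.pending.get(correlation_id)
        if deliver is not None:
            deliver(payload)  # multiple responses may arrive per request

    def close_request(self, correlation_id: str) -> None:
        """Stop accepting further responses for this request."""
        self.pending.pop(correlation_id, None)

# Usage: the client only makes outbound calls and registers a delivery callback.
cws = ClearingHouse()
cid = cws.register_request(lambda r: print("quote received:", r))
cws.on_response(cid, {"supplier": "A", "price": 10})
cws.on_response(cid, {"supplier": "B", "price": 9})
cws.close_request(cid)
```

In the thesis, this role is played by one CWS per enterprise, fronting any number of SRMR Web services and their clients.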
3

A quantitative security assessment of modern cyber attacks : a framework for quantifying enterprise security risk level through system's vulnerability analysis by detecting known and unknown threats

Munir, Rashid January 2014 (has links)
The Cisco 2014 Annual Security Report clearly outlines the evolution of the threat landscape and the increase in the number of attacks. In 2012 the UK government recognised the cyber threat as a Tier-1 threat, since about 50 government departments had been either subjected to an attack or directly threatened by one. Cyberspace has become the platform of choice for the business activities of companies, schools, universities, colleges, hospitals and other sectors. One of the major problems identified by the Department of Homeland Security is the lack of clear security metrics. The recent cyber security breach of the US retail giant Target is a typical example that demonstrates the weaknesses of qualitative security, regarded by some security experts as fuzzy security. High, medium or low as measures of security level do not give a quantitative representation of a company's network security posture. In this thesis, a method is developed to quantify the security risk level of known and unknown attacks in an enterprise network, in an effort to solve this problem. The vulnerabilities identified in a case study of a UK-based company are classified by severity risk level using the Common Vulnerability Scoring System (CVSS) and the Open Web Application Security Project (OWASP). Probability theory is applied to known attacks to construct the security metrics, and a detection and prevention method is suggested to protect the company network against unknown attacks. The resulting security metrics are clear, repeatable and scientifically verifiable.
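As a small illustration of the classification step, the sketch below buckets CVSS base scores into severity bands and counts findings per host. The cut-offs shown follow the familiar CVSS v3.x qualitative ratings and are an assumption; the thesis's exact banding (and its use of OWASP categories) is not reproduced in the abstract.

```python
from collections import Counter

def severity(base_score: float) -> str:
    """Map a CVSS base score to a severity band. The bands follow the
    CVSS v3.x qualitative ratings; the thesis's own banding may differ."""
    if base_score >= 9.0:
        return "critical"
    if base_score >= 7.0:
        return "high"
    if base_score >= 4.0:
        return "medium"
    if base_score > 0.0:
        return "low"
    return "none"

# Hypothetical scan output: host -> list of CVSS base scores found on it.
scan_results = {
    "web01": [9.8, 7.5, 5.3],
    "db01":  [8.8, 4.3],
}

for host, scores in scan_results.items():
    counts = Counter(severity(s) for s in scores)
    print(host, dict(counts))
```

Per-band counts like these are the raw input that a probabilistic roll-up, such as the one sketched under the first result above, turns into a single enterprise risk level.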
4

A comprehensive approach to enterprise network security management

Homer, John January 1900 (has links)
Doctor of Philosophy / Department of Computing and Information Sciences / Xinming (Simon) Ou / Enterprise network security management is a vitally important task, more so now than ever before. Networks grow ever larger and more complex, and corporations, universities, government agencies, etc. rely heavily on the availability of these networks. Security in enterprise networks is constantly threatened by thousands of known software vulnerabilities, with thousands more discovered annually in a wide variety of applications. An overwhelming amount of data is relevant to the ongoing protection of an enterprise network. Previous works have addressed the identification of vulnerabilities in a given network and the aggregation of these vulnerabilities into an attack graph, clearly showing how an attacker might gain access to or control over network resources. These works, however, do little to address how to evaluate or properly utilize this information. I have developed a comprehensive approach to enterprise network security management. Compared with previous methods, my approach unifies these concerns into a single goal: provable mitigation of risk within an enterprise network. Attack graph simplification is used to improve user comprehension of the graph data and to enable more efficient use of the data in risk assessment. A sound and effective quantification of risk within the network produces values that can form the basis for the valuation policies needed to apply a SAT solving technique. SAT solving resolves policy conflicts and produces an optimal reconfiguration, based on the provided values, which a knowledgeable human user can verify for accuracy and applicability within the context of the enterprise network. Empirical study shows the effectiveness and efficiency of these approaches and indicates promising directions for improvement to be explored in future work. Overall, this research constitutes an important step toward a more automated security management initiative.
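The reconfiguration step can be illustrated without a real SAT solver. The sketch below is an analogy under stated assumptions, not Homer's method: it brute-forces boolean mitigation choices (a stand-in for the SAT solving described above), rejects combinations that violate a hard policy constraint, and picks the admissible combination with the best value according to hypothetical risk-reduction and cost numbers.

```python
from itertools import product

# Hypothetical mitigations: (risk reduction gained, business cost incurred).
mitigations = {
    "patch_web01":   (0.30, 2.0),
    "firewall_db01": (0.25, 1.0),
    "disable_ftp":   (0.15, 4.0),  # breaks a legacy workflow unless web01 is patched
}

def satisfies_policy(choice: dict) -> bool:
    """Hard constraint (toy example): FTP may only be disabled if web01 is
    also patched, otherwise a required workflow is lost."""
    return not choice["disable_ftp"] or choice["patch_web01"]

best, best_value = None, float("-inf")
# Brute-force enumeration over truth assignments; a SAT/MaxSAT solver
# performs this search symbolically on much larger instances.
for bits in product([False, True], repeat=len(mitigations)):
    choice = dict(zip(mitigations, bits))
    if not satisfies_policy(choice):
        continue
    value = sum(r - 0.05 * c for m, (r, c) in mitigations.items() if choice[m])
    if value > best_value:
        best, best_value = choice, value

print(best, round(best_value, 3))
```

A SAT or MaxSAT solver performs the same search symbolically, which is what makes this style of approach feasible on enterprise-scale attack graphs.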
5

Handling uncertainty in intrusion analysis

Zomlot, Loai M. M. January 1900 (has links)
Doctor of Philosophy / Department of Computing and Information Sciences / Xinming Ou / Intrusion analysis, i.e., the process of combing through Intrusion Detection System (IDS) alerts and audit logs to identify true successful and attempted attacks, remains a difficult problem in practical network security defense. The primary cause of this problem is the high false positive rate of the IDS sensors used to detect malicious activity, which is attributed to an inability to differentiate nearly certain attacks from those that are merely possible. This inefficacy has created high uncertainty in intrusion analysis and consequently causes an overwhelming amount of work for security analysts. As a solution, practitioners typically resort to a specific IDS rule set that precisely captures specific attacks. However, this fails to discern other forms of the targeted attack, because an attack's polymorphism reflects human intelligence. Alternatively, adding generic rules, so that any activity with even a remote indication of an attack triggers an alert, requires the security analyst to separate true alerts from a multitude of false ones, thus perpetuating the original problem. This trade-off is a dilemma that has puzzled the cyber-security community for years. One way out of the dilemma is to reduce uncertainty in intrusion analysis by making the nearly certain alerts prominently discernible. Therefore, I propose alert prioritization, attained by integrating multiple methods. I correlate IDS alerts by building attack scenarios from the ground up. In addition, I use Dempster-Shafer Theory (DST), a non-traditional theory for quantifying uncertainty, and I propose a new method for fusing non-independent alerts in an attack scenario. Finally, I propose using semi-supervised learning to capture an organization's contextual knowledge and thereby improve prioritization. These approaches were evaluated on multiple datasets, and the results strongly indicate that the ranking they provide prioritizes IDS alerts well according to their likelihood of indicating true attacks.
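Dempster's rule of combination, the starting point for the DST-based fusion mentioned above, can be shown on a tiny frame of discernment {attack, benign}. The mass values below are made up for illustration, and the classical rule shown here assumes independent bodies of evidence; the thesis's contribution is precisely a new fusion method for non-independent alerts, which is not reproduced here.

```python
from itertools import product

# Masses are assigned to subsets of the frame {attack, benign};
# frozenset({"attack", "benign"}) represents "don't know" (theta).
THETA = frozenset({"attack", "benign"})

def combine(m1: dict, m2: dict) -> dict:
    """Classical Dempster's rule for two independent bodies of evidence."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    # Normalise by the non-conflicting mass.
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Two IDS alerts, each expressed as a mass function (values sum to 1).
alert1 = {frozenset({"attack"}): 0.6, THETA: 0.4}
alert2 = {frozenset({"attack"}): 0.5, frozenset({"benign"}): 0.2, THETA: 0.3}

fused = combine(alert1, alert2)
for subset, mass in fused.items():
    print(sorted(subset), round(mass, 3))
```

The fused mass assigned to {attack} (about 0.77 in this toy run) is the kind of quantity that can drive the alert prioritization described in the abstract.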
