1

Security Risk Analysis based on Data Criticality

Zhou, Luyuan January 2020 (has links)
Nowadays, security risk assessment has become an integral part of network security, as everyday life has become interconnected with and dependent on computer networks. A network holds various types of data, often with different criticality in terms of the availability, confidentiality, or integrity of the information. The more critical the data, the greater the risk when it is exploited; data criticality therefore has a direct impact on network security risk. The challenge in diminishing security risks in a specific network is how to conduct network security risk analysis based on data criticality. An interesting aspect of this challenge is how to integrate security metrics with threat modeling, and how to identify and combine the various elements that affect network security during risk analysis. To the best of our knowledge, no existing security risk analysis technique based on threat modeling considers the criticality of data. By extending security risk analysis with data criticality, we account for its impact on the network during security risk assessment. To obtain the corresponding security risk value, a method is needed for integrating data criticality into graphical attack models using relevant metrics. In this thesis, an approach for calculating the security risk value that considers data criticality is proposed. Our solution integrates the impact of data criticality by extending the attack graph with data criticality. Vulnerabilities in the network pose potential threats to it. First, the combination of these vulnerabilities and data criticality is identified and precisely described; then the interaction between the vulnerabilities, captured by the attack graph, is taken into account and the final security metric is calculated and analyzed. The new security metric can be used by network security analysts to rank the security levels of objects in the network, allowing them to find objects that need additional attention in their daily protection work. The metric can also help them prioritize which vulnerabilities to fix when the network is under attack. In general, network security analysts can use the value of the security metric to find effective ways to resolve exploits in the network.
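To illustrate the kind of calculation this abstract describes, the following is a minimal sketch (not the thesis's actual implementation) of scoring nodes in a small attack graph by combining an attacker's reach probability with a per-node data-criticality weight. The graph, the exploit probabilities, and the criticality values are all hypothetical.

```python
# Hypothetical attack graph: an edge u -> v means "exploiting u enables attempting v".
edges = {"vuln_web": ["vuln_app"], "vuln_app": ["vuln_db"], "vuln_db": []}
exploit_prob = {"vuln_web": 0.8, "vuln_app": 0.6, "vuln_db": 0.5}
data_criticality = {"vuln_web": 0.1, "vuln_app": 0.4, "vuln_db": 0.9}  # 0 = harmless, 1 = critical

def reach_probability(graph, p_exploit, entry):
    """Propagate the attacker's compromise probability from the entry point."""
    prob = {entry: p_exploit[entry]}
    stack = [entry]
    while stack:
        u = stack.pop()
        for v in graph[u]:
            # The attacker must first reach u, then succeed in exploiting v.
            prob[v] = max(prob.get(v, 0.0), prob[u] * p_exploit[v])
            stack.append(v)
    return prob

prob = reach_probability(edges, exploit_prob, "vuln_web")
risk = {node: prob[node] * data_criticality[node] for node in prob}

# Rank objects so the most exposed critical data gets attention first.
for node, score in sorted(risk.items(), key=lambda kv: -kv[1]):
    print(f"{node}: reach={prob[node]:.2f}  risk={score:.2f}")
```

Under these assumed numbers the database node ends up with the highest risk even though it is the hardest to reach, because its data is the most critical.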
2

A comprehensive approach to enterprise network security management

Homer, John January 1900 (has links)
Doctor of Philosophy / Department of Computing and Information Sciences / Xinming (Simon) Ou / Enterprise network security management is a vitally important task, more so now than ever before. Networks grow ever larger and more complex, and corporations, universities, government agencies, etc. rely heavily on the availability of these networks. Security in enterprise networks is constantly threatened by thousands of known software vulnerabilities, with thousands more discovered annually in a wide variety of applications. An overwhelming amount of data is relevant to the ongoing protection of an enterprise network. Previous works have addressed the identification of vulnerabilities in a given network and the aggregation of these vulnerabilities into an attack graph, clearly showing how an attacker might gain access to or control over network resources. These works, however, do little to address how to evaluate or properly utilize this information. I have developed a comprehensive approach to enterprise network security management. Compared with previous methods, my approach frames these issues as a single, uniform goal: provable mitigation of risk within the enterprise network. Attack graph simplification is used to improve user comprehension of the graph data and to enable more efficient use of the data in risk assessment. A sound and effective quantification of risk within the network produces values that can form the basis for the valuation policies needed to apply a SAT solving technique. SAT solving resolves policy conflicts and produces an optimal reconfiguration, based on the provided values, which can be verified by a knowledgeable human user for accuracy and applicability within the context of the enterprise network. Empirical study shows the effectiveness and efficiency of these approaches, and also indicates promising directions for improvement to be explored in future work. Overall, this research comprises an important step toward a more automated security management initiative.
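A toy sketch of the reconfiguration step described above, with hypothetical mitigations and attack paths. The dissertation encodes this kind of problem for a SAT solver; the sketch below simply enumerates all configurations to find the cheapest one that blocks every attack path, which illustrates the same optimization without a solver dependency.

```python
from itertools import combinations

# Hypothetical mitigations: each blocks some attack paths and carries a business cost.
mitigations = {
    "patch_web_server": {"blocks": {"path_dmz_to_db"}, "cost": 2},
    "disable_smb":      {"blocks": {"path_lan_lateral"}, "cost": 5},
    "segment_dmz":      {"blocks": {"path_dmz_to_db", "path_lan_lateral"}, "cost": 8},
}
attack_paths = {"path_dmz_to_db", "path_lan_lateral"}  # every path must end up blocked

best = None
for r in range(len(mitigations) + 1):
    for combo in combinations(mitigations, r):
        blocked = set().union(*(mitigations[m]["blocks"] for m in combo))
        if attack_paths <= blocked:  # hard constraint: all attack paths blocked
            cost = sum(mitigations[m]["cost"] for m in combo)
            if best is None or cost < best[0]:
                best = (cost, combo)

print("cheapest reconfiguration:", best)
```

Here the combination of the two cheaper mitigations beats the single sweeping one; with more realistic inputs the same structure is what a SAT or MaxSAT encoding would optimize.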
3

Security metric based risk assessment.

Khan, Moazzam 30 April 2013 (has links)
Modern-day computer networks have become very complex, and attackers have benefited from this complexity by finding vulnerabilities and loopholes in the network architecture. In order to identify attacks, all aspects of the network architecture need to be carefully examined, such as packet headers, network scans, application versions, and network anomalies; after this examination, the attributes that have a significant impact on the security posture of the organization need to be highlighted so that resources and efforts can be directed towards them. In this work we look extensively at network traffic from the dormitory network of a large campus and try to identify the attributes that play a significant role in the infection of a machine. Our scheme is to collect as many attributes as possible from the network traffic, apply a heuristic for network infection, and then devise a scheme called decision-centric rank ordering of security metrics, which prioritizes the security metrics so that network administrators can channel their efforts in the right direction. Another aspect of this research is to identify the probability of an attack on a communication infrastructure. A communication infrastructure becomes prone to attack if certain elements exist, such as vulnerabilities in the components of the system, the existence of an attacker, and a motivation to attack. The focus of this study is on vulnerability assessment and security metrics such as user behavior, operating systems, user applications, and software updates. To achieve a quantified value of risk, a set of machines is carefully observed with respect to these security metrics. Statistical analysis is applied to the data collected from compromised machines, and a quantified value of risk is obtained.
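A hedged sketch of what rank ordering of attributes could look like in its simplest form. The attribute names and observations are invented for illustration, and the thesis's statistical analysis is more involved: here each attribute is simply scored by how much the infection rate differs between machines that have the attribute and machines that do not.

```python
machines = [  # hypothetical observations: attribute flags plus an infection label
    {"outdated_os": 1, "p2p_traffic": 1, "av_installed": 0, "infected": 1},
    {"outdated_os": 1, "p2p_traffic": 0, "av_installed": 1, "infected": 1},
    {"outdated_os": 0, "p2p_traffic": 1, "av_installed": 1, "infected": 0},
    {"outdated_os": 0, "p2p_traffic": 0, "av_installed": 1, "infected": 0},
    {"outdated_os": 1, "p2p_traffic": 1, "av_installed": 0, "infected": 1},
]

def infection_rate(rows):
    """Fraction of the given machines that were infected."""
    return sum(r["infected"] for r in rows) / len(rows) if rows else 0.0

attributes = ["outdated_os", "p2p_traffic", "av_installed"]
scores = {}
for attr in attributes:
    with_attr = [m for m in machines if m[attr]]
    without_attr = [m for m in machines if not m[attr]]
    scores[attr] = infection_rate(with_attr) - infection_rate(without_attr)

# Rank attributes so administrators can focus effort on the most influential ones.
for attr, s in sorted(scores.items(), key=lambda kv: -abs(kv[1])):
    print(f"{attr}: association score {s:+.2f}")
```

A strongly positive score marks an attribute associated with infection; a strongly negative one (such as the hypothetical antivirus flag here) is associated with staying clean.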
4

A Framework Based On Continuous Security Monitoring

Erturk, Volkan 01 December 2008 (has links) (PDF)
Continuous security monitoring is the process of following up on IT systems by collecting measurements, reporting, and analyzing the results in order to compare the organization's security level along a continuous time axis and see how organizational security progresses over time. In the related literature, very limited work has been done on continuously monitoring the security of organizations. In this thesis, a continuous security monitoring framework based on security metrics is proposed. Moreover, to decrease the burden of implementation, a software tool called SecMon is introduced. The implementation of the framework in a public organization shows that the proposed system is successful in building an organizational memory and giving security stakeholders insight into the IT security level of the organization.
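As a rough illustration of comparing security metrics along a time axis (not the SecMon tool itself; the metric names and values are hypothetical), the sketch below compares consecutive measurement periods and reports whether each metric improved or worsened.

```python
from datetime import date

# Hypothetical monthly snapshots of two example metrics.
snapshots = [
    (date(2008, 9, 1),  {"patched_hosts_pct": 72.0, "incidents": 14}),
    (date(2008, 10, 1), {"patched_hosts_pct": 81.0, "incidents": 9}),
    (date(2008, 11, 1), {"patched_hosts_pct": 85.5, "incidents": 11}),
]

# Whether a rising value is good news for the metric in question.
higher_is_better = {"patched_hosts_pct": True, "incidents": False}

def trend_report(history):
    """Yield per-metric deltas between consecutive measurement periods."""
    for (_, prev), (day, curr) in zip(history, history[1:]):
        for metric, value in curr.items():
            delta = value - prev[metric]
            if delta == 0:
                label = "unchanged"
            elif (delta > 0) == higher_is_better[metric]:
                label = "improved"
            else:
                label = "worsened"
            yield f"{day}: {metric} {prev[metric]} -> {value} ({label})"

for line in trend_report(snapshots):
    print(line)
```

Accumulating such period-over-period reports is one simple way to build the kind of organizational memory the abstract mentions.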
5

Secure Localization Topology and Methodology for a Dedicated Automated Highway System

Deka, Bhaswati 01 May 2013 (has links)
Localization of nodes is an important aspect of a vehicular ad-hoc network (VANET). Research has been done on various localization methods, some of which are more apt for a specific purpose than others. To begin with, we give an overview of vehicular ad-hoc networks, localization methods, and how they can be classified. The distance bounding and verifiable trilateration methods are explained further, with their corresponding algorithms and the steps used for localization. Distance bounding is a range-based distance estimation algorithm; verifiable trilateration is a popular geometric method of localization. A dedicated automated highway infrastructure can use distance bounding and/or trilateration to localize an automated vehicle on the highway. We describe a highway infrastructure for our analysis and test how well each of the methods performs according to a security measure defined as the spoofing probability. The spoofing probability is, simply put, the probability that a given point on the highway will be successfully spoofed by an attacker located at a random position along the highway. Which quantities the spoofing probability depends on varies with the localization method used. We compare the distance bounding and trilateration methods to a novel method that uses friendly jamming for localization. Friendly jamming works by creating interference around the region whenever communication takes place between a vehicle and a verifier (belonging to the highway infrastructure, which is involved in the localization process using a given algorithm and localization method). In the case of friendly jamming, the spoofing probability depends on the position and velocity of both the attacker and the target vehicle (which the attacker aims to spoof), which makes the spoofing probability much lower for friendly jamming. The distance bounding and trilateration methods, on the other hand, have spoofing probabilities that depend only on the attacker's position. The results are summarized at the end of the last chapter to give an idea of how the three localization methods, i.e. distance bounding, verifiable trilateration, and friendly jamming, compare against each other for a dedicated automated highway infrastructure. We observe that the spoofing probability of the friendly jamming infrastructure is less than 2%, while the spoofing probabilities of distance bounding and trilateration are 25% and 11%, respectively. This means that the friendly jamming method is more secure for the corresponding automated transportation system (ATS) infrastructure than distance bounding and trilateration. However, one drawback of friendly jamming is a high standard deviation, because the range of positions that are most vulnerable is large: even though the spoofing probability is much lower, the friendly jamming method is vulnerable to attack over a large range of distances along the highway. This can be overcome by defining a more robust infrastructure and using the infrastructure's resources judiciously, which is a possible future scope of our research. Infrastructures that use radio resources in a cost-effective manner to reduce the vulnerability of the friendly jamming method are a promising choice for the localization of vehicles on an ATS highway.
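The spoofing-probability comparison can be illustrated with a Monte Carlo estimate. The sketch below uses a deliberately simplified 1-D model with a single verifier, so it is not the thesis's multi-verifier highway infrastructure and will not reproduce the 2%/11%/25% figures; it only shows how such a probability can be estimated by sampling attacker positions and claimed positions. The property modeled is that distance bounding lets an external attacker enlarge, but never shorten, the measured distance.

```python
import random

HIGHWAY_LENGTH = 1000.0  # metres, hypothetical segment covered by one verifier
TRIALS = 100_000

def can_spoof_distance_bounding(attacker_pos, claimed_pos, verifier_pos=0.0):
    """An attacker can only make itself appear farther from the verifier, never closer."""
    return abs(claimed_pos - verifier_pos) >= abs(attacker_pos - verifier_pos)

hits = 0
for _ in range(TRIALS):
    attacker = random.uniform(0.0, HIGHWAY_LENGTH)  # random attacker position
    claimed = random.uniform(0.0, HIGHWAY_LENGTH)   # random point the attacker tries to spoof
    if can_spoof_distance_bounding(attacker, claimed):
        hits += 1

print(f"estimated spoofing probability: {hits / TRIALS:.2%}")
```

Swapping in a predicate for trilateration (claimed position inside the verifier triangle) or friendly jamming (which also constrains the attacker's velocity relative to the target) is how the three methods can be compared within the same sampling loop.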
6

A Security Framework for Logic Locking Through Local and Global Structural Analysis

Taylor, Christopher P. 28 September 2020 (has links)
No description available.
