1

Detecting Networks Employing Algorithmically Generated Domain Names

Ashwath Kumar Krishna Reddy, August 2010
Recent botnets such as Conficker, Kraken, and Torpig have used DNS-based "domain fluxing" for command-and-control, where each bot queries for the existence of a series of domain names and the botnet owner has to register only one such name. In this report, we develop a methodology to detect such "domain fluxes" in DNS traffic by looking for patterns inherent to domain names that are generated algorithmically, in contrast to those generated by humans. In particular, we look at the distribution of alphanumeric characters, as well as bigrams, in all domains that are mapped to the same set of IP addresses. We present and compare the performance of several distance metrics, including KL-distance and edit distance. We train on a good data set of domains obtained via a crawl of domains mapped to the entire IPv4 address space, and we model bad data sets based on behaviors seen so far and expected in the future. We also apply our methodology to packet traces collected at two Tier-1 ISPs and show that we can automatically detect domain fluxing as used by the Conficker botnet with minimal false positives. We are also able to detect new botnets and other malicious networks using our method.
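The core idea in this abstract — comparing character distributions of a suspect domain group against a benign baseline with a KL-distance — can be sketched in a few lines. This is a minimal illustration, not the thesis's implementation: the smoothing, alphabet, and example domain lists are assumptions, and the real method also uses bigrams and edit distance.

```python
import math
from collections import Counter

ALPHANUMERIC = "abcdefghijklmnopqrstuvwxyz0123456789"

def char_distribution(domains):
    """Unigram distribution of alphanumeric characters over a group of
    domain labels, with add-one smoothing so the KL-distance stays finite."""
    counts = Counter(c for d in domains for c in d.lower() if c in ALPHANUMERIC)
    total = sum(counts.values()) + len(ALPHANUMERIC)
    return {c: (counts[c] + 1) / total for c in ALPHANUMERIC}

def kl_divergence(p, q):
    """D(p || q) in bits; p and q are distributions over the same alphabet."""
    return sum(p[c] * math.log2(p[c] / q[c]) for c in ALPHANUMERIC)

# Illustrative data: human-generated names vs. fluxing-style random names.
benign = ["google", "facebook", "wikipedia", "amazon", "youtube", "twitter"]
suspect = ["xqzvkd3f", "p0q9rmx2", "zzkqw8vn", "mq3xvz7t", "kx9qzw4p"]

p_benign = char_distribution(benign)
p_suspect = char_distribution(suspect)
score = kl_divergence(p_suspect, p_benign)
# A higher score means the suspect group's character mix deviates more
# from the benign baseline, flagging it for further inspection.
```

In practice the baseline would be built from the crawl of benign domains described above, and the score thresholded to trade off detection rate against false positives.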
2

Enhancing Performance of Vulnerability-based Intrusion Detection Systems

Farroukh, Amer, 31 December 2010
The accuracy of current intrusion detection systems (IDSes) is hindered by the limited capability of regular expressions (REs) to express the exact vulnerability. Recent advances have proposed vulnerability-based IDSes that parse traffic and retrieve protocol semantics in order to describe the vulnerability. Such a description of attacks is analogous to the subscriptions that specify events of interest in event-processing systems. However, the matching engines of state-of-the-art IDSes lack efficient algorithms that can process many signatures simultaneously. In this work, we place event processing at the core of the IDS and propose novel algorithms to efficiently parse and match vulnerability signatures. We are also among the first to detect complex attacks, such as the Conficker worm, that require correlating multiple protocol data units (MPDUs) while maintaining a small memory footprint. Our approach incurs negligible overhead when processing clean traffic, is resilient to attacks, and is faster than existing systems.
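The parse-then-match idea in this abstract — describing vulnerabilities as predicates over parsed protocol fields rather than regexes over raw bytes, so one parse serves many signatures at once — can be illustrated with a toy sketch. The field names, signature names, and thresholds below are hypothetical, not taken from the thesis.

```python
import re

# Hypothetical vulnerability signatures as predicates on parsed fields.
SIGNATURES = {
    "long-filename-overflow": lambda msg: len(msg.get("filename", "")) > 256,
    "negative-length-field":  lambda msg: msg.get("length", 0) < 0,
    "path-traversal":         lambda msg: ".." in msg.get("filename", ""),
}

def parse_request(raw):
    """Toy parser: extracts key=value fields from a request string.
    A real vulnerability-based IDS recovers full protocol semantics."""
    msg = {}
    for m in re.finditer(r"(\w+)=([^;]*)", raw):
        key, val = m.group(1), m.group(2)
        msg[key] = int(val) if key == "length" else val
    return msg

def match(raw):
    """Parse the message once, then evaluate every signature against the
    parsed fields, instead of running each RE over the raw byte stream."""
    msg = parse_request(raw)
    return [name for name, pred in SIGNATURES.items() if pred(msg)]
```

For example, `match("filename=../../etc/passwd;length=-5")` flags both the negative-length and path-traversal signatures from a single parse, while clean traffic matches nothing.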
4

Analýza síťových útoků pomocí honeypotů / Network Attack Analysis Using Honeypots

Galetka, Josef, January 2010
This text deals with computer network security using honeypot technology, a tool that serves as an intentional trap for attackers. It describes the underlying ideas of the concept, along with its advantages and disadvantages. The main focus is the low-interaction honeypot Honeyd, its functionality, and its possible extensions. The practical part describes the implementation of Honeyd service scripts that simulate the behavior of the Conficker worm. It further describes the creation of an automated script for analyzing and processing the data captured during an actual deployment of Honeyd on the Internet.
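The setup described above — Honeyd service scripts emulating a vulnerable host — is wired together in a Honeyd configuration file. The following is a minimal sketch using Honeyd's standard create/set/add/bind directives; the script path and IP address are placeholders, and the actual Conficker-simulation script from the thesis is not reproduced here.

```
# Template emulating a Windows host (personality names come from the
# nmap fingerprint database shipped with Honeyd).
create conficker-host
set conficker-host personality "Microsoft Windows XP Professional SP1"
set conficker-host default tcp action reset

# Hand connections on the SMB port to a service script, which reads the
# attacker's traffic on stdin and writes responses on stdout.
add conficker-host tcp port 445 "sh /usr/local/honeyd/scripts/conficker.sh"

# Bind the template to an unused IP address on the monitored network.
bind 10.0.0.2 conficker-host
```

Traffic captured by such scripts is what the automated analysis script described in the abstract would then post-process.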
