31 |
Collaborative and decentralized detection and mitigation of network attacks
Guerid, Hachem, 06 December 2014 (has links)
The problem of botnets, networks of infected hosts controlled remotely by attackers, is a major concern because of the number of infected hosts and the associated threats: distributed denial of service (DDoS), spam, and data theft.
State-of-the-art solutions to fight botnets have major limitations in the context of a network operator (scalability, confidentiality, and user privacy). In this thesis, we propose four network-based contributions to the fight against botnets. Each addresses a different and complementary issue in this area: the first contribution traces back the source of denial-of-service attacks that threaten network availability, thereby identifying the infected devices used to perpetrate them. The second detects the communications between infected computers and their command-and-control (C&C) server in a large-scale network, offering the opportunity to block these servers and minimize the risk of future attacks. The third enables collaborative detection of botnets in an inter-domain, inter-operator context, to counter the highly distributed nature of these botnets. Finally, the last contribution mitigates botnets by slowing down the communication between infected hosts and their C&C server, providing a countermeasure against the evasion techniques cybercriminals develop to make their botnets more resilient.
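At its simplest, the slowdown countermeasure can be pictured as rate-limiting suspected C&C flows. The sketch below is a generic token-bucket throttle with invented parameters, not the mechanism actually proposed in the thesis:

```python
class TokenBucket:
    """Toy rate limiter: a tarpit-style throttle for suspected C&C flows.

    A simplified stand-in for the slowdown countermeasure, for
    illustration only; rate and capacity are assumed values."""

    def __init__(self, rate, capacity):
        self.rate = rate          # tokens added per time unit
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.t = 0                # time of the last refill

    def allow(self, now, size=1):
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.t) * self.rate)
        self.t = now
        if self.tokens >= size:
            self.tokens -= size
            return True
        return False

# Three packets arrive every tick, but only one token per tick refills:
tb = TokenBucket(rate=1, capacity=3)
passed = sum(tb.allow(now=t) for t in range(5) for _ in range(3))
print(passed)  # 7 of the 15 packets pass (initial burst of 3, then 1 per tick)
```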
|
32 |
Modern ways to design fully distributed, decentralized and stealthy worms
Szetei, Norbert, January 2013 (has links)
The thesis studies the design of a computer worm meeting several criteria: it should be fully distributed, decentralized, and stealthy. These conditions give the worm anonymity, longevity, and better resistance to detection. After presenting recently used architectures and new technologies, we analyse the known implementations. We propose solutions based on a new design, together with possible ways of improving it. In the next chapter we study biological concepts suitable for a new replication mode, and we implement the key functionality in a high-level programming language. A central design goal was platform independence, so that the worm can spread in almost any computing environment, depending only on the implementation of the required modules.
|
33 |
Integrate Model and Instance Based Machine Learning for Network Intrusion Detection
Lena Ara (5931005), 17 January 2019 (has links)
In computer networks, convenient Internet access facilitates Internet services, but at the same time it also aids the spread of malicious software used for attacks or unauthorized access. This makes intrusion detection an important area to explore. This thesis concentrates on combining model-based and instance-based machine learning to detect intrusions, through a series of algorithms that starts by clustering similar hosts.
Similar hosts are found with supervised machine-learning techniques such as Support Vector Machines, Decision Trees, and K Nearest Neighbors, using our proposed data-fusion algorithm. Maximal cliques from graph theory have been explored to find the clusters, and a recursive procedure is proposed to merge the decision areas of the best features. The idea is to implement a combination of model-based and instance-based machine learning and to analyze how it performs compared to a conventional machine-learning algorithm such as Random Forest. The system has been evaluated on three CTU-13 datasets. The results show that our proposed method gives a better detection rate than traditional methods, which may overfit the data.
Prior work on model merging, instance-based learning, random forests, data mining, and ensemble learning for intrusion detection has been studied and taken as reference.
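The clique-based clustering step can be sketched as follows. The similarity graph is hypothetical, and Bron-Kerbosch is used here as a standard way to enumerate maximal cliques; the thesis does not necessarily use this exact algorithm:

```python
def maximal_cliques(adj):
    """Bron-Kerbosch enumeration of maximal cliques in an undirected graph.

    adj maps each node to the set of its neighbours."""
    cliques = []

    def expand(r, p, x):
        if not p and not x:
            cliques.append(r)  # r cannot be extended: it is maximal
            return
        for v in list(p):
            expand(r | {v}, p & adj[v], x & adj[v])
            p.remove(v)
            x.add(v)

    expand(set(), set(adj), set())
    return cliques

# Hypothetical similarity graph: an edge means two hosts were judged
# similar by the supervised classifiers (SVM / decision tree / k-NN).
edges = [("h1", "h2"), ("h1", "h3"), ("h2", "h3"), ("h3", "h4"), ("h4", "h5")]
adj = {}
for a, b in edges:
    adj.setdefault(a, set()).add(b)
    adj.setdefault(b, set()).add(a)

clusters = sorted(sorted(c) for c in maximal_cliques(adj))
print(clusters)
```

Each maximal clique is a candidate cluster of mutually similar hosts; here the toy graph yields the clusters {h1, h2, h3}, {h3, h4}, and {h4, h5}.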
|
34 |
An implementation of a DNS-based malware detection system
Fors, Markus; Grahn, Christian, January 2010 (has links)
Today’s wide usage of the Internet makes malicious software (malware) and botnets a big problem. While anti-virus software is commonplace today, malware is constantly evolving to remain undetected. Passively monitoring DNS traffic on a network can provide a platform for detecting malware on multiple computers at low cost and low complexity. To explore this avenue for detecting malware we decided it was necessary to design an extensible system where the framework was separate from the actual detection methods. We wanted to divide the system into three parts: one for logging, one for handling detection modules, and one for taking action against suspect traffic. The system we implemented in C collects DNS traffic and processes it with modules that are compiled separately and can be plugged in or out during runtime. Two proof-of-concept modules have been implemented: one based on a blacklist and one based on geolocation of requested servers. The system is complete to the point of being ready for field testing and the implementation of more advanced detection modules.
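A blacklist module of the kind described can be sketched in a few lines. The thesis implementation is in C with runtime-pluggable modules; this Python sketch only illustrates the matching logic, and the domains are assumed placeholders:

```python
# Hypothetical blocklist entries, for illustration only.
BLACKLIST = {"evil.example", "c2.badnet.test"}

def is_blacklisted(qname, blacklist=BLACKLIST):
    """Return True if the queried name or any parent domain is listed."""
    labels = qname.lower().rstrip(".").split(".")
    # Check "www.evil.example", then "evil.example", then "example".
    return any(".".join(labels[i:]) in blacklist for i in range(len(labels)))

print(is_blacklisted("www.evil.example"))  # True: parent domain is listed
print(is_blacklisted("www.example.org"))   # False
```

Checking parent domains as well as the full query name means one blocklist entry covers all of a malicious domain's subdomains.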
|
35 |
A Covert Channel Using 802.11 LANs
Calhoun, Telvis Eugene, 10 April 2009 (has links)
We present a covert side channel that uses the 802.11 MAC rate switching protocol. The covert channel provides a general method to hide communications in an 802.11 LAN. The technique uses a one-time password algorithm to ensure high-entropy randomness of the covert messages. We investigate how the covert side channel affects node throughput in mobile and non-mobile scenarios. We also investigate the covertness of the covert side channel using standardized entropy. The results show that the performance impact is minimal and increases slightly as the covert channel bandwidth increases. We further show that the channel has 100% accuracy with minimal impact on rate switching entropy. Finally, we present two applications for the covert channel: covert authentication and covert WiFi botnets.
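The one-time-password idea can be illustrated with HOTP-driven rate selection: sender and receiver share a secret, so the rate pattern looks random to an observer but is decodable by the peer. This is a simplified sketch with assumed parameters (rate set, bit mapping), not the paper's exact encoding:

```python
import hashlib
import hmac
import struct

RATES = [6, 12, 24, 36, 48, 54]  # assumed 802.11 rate set (Mbit/s)

def hotp(secret: bytes, counter: int) -> int:
    """RFC 4226 HOTP: dynamic truncation of HMAC-SHA1(secret, counter)."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    return struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF

def encode(secret, bits):
    """Map each covert bit to a transmit rate: the HOTP-derived rate
    signals a 1, the next rate in the set signals a 0 (assumed mapping)."""
    frames = []
    for i, b in enumerate(bits):
        ref = hotp(secret, i) % len(RATES)
        frames.append(RATES[ref] if b else RATES[(ref + 1) % len(RATES)])
    return frames

def decode(secret, frames):
    """Recover the bits by regenerating the HOTP reference rates."""
    return [1 if r == RATES[hotp(secret, i) % len(RATES)] else 0
            for i, r in enumerate(frames)]

key = b"shared-secret"
message = [1, 0, 1, 1, 0]
assert decode(key, encode(key, message)) == message
```

Because each step's reference rate depends on a fresh HOTP value, the observed rate sequence has high entropy, which is the covertness property the abstract measures.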
|
36 |
High Orbit Ion Cannon: is it possible to protect against it?
Jonsson, Robin; Blixt, Simon, January 2012 (has links)
No description available.
|
37 |
Effective and scalable botnet detection in network traffic
Zhang, Junjie, 03 July 2012 (has links)
Botnets represent one of the most serious threats to Internet security, since they serve as the platforms responsible for the vast majority of large-scale and coordinated cyber attacks, such as distributed denial of service, spamming, and information theft. Detecting botnets is therefore of great importance, and a number of network-based botnet detection systems have been proposed. However, as botnets perform attacks in an increasingly stealthy way and the volume of network traffic grows rapidly, existing botnet detection systems face significant challenges in effectiveness and scalability.
The objective of this dissertation is to build novel network-based solutions that boost both the effectiveness of existing botnet detection systems, by detecting botnets whose attacks are very hard to observe in network traffic, and their scalability, by adaptively sampling the network packets that are likely to be generated by botnets. Specifically, this dissertation describes three unique contributions.
First, we built a new system to detect drive-by download attacks, which represent one of the most significant and popular methods for botnet infection. The goal of our system is to boost the effectiveness of existing drive-by download detection systems by detecting a large number of drive-by download attacks that are missed by these existing detection efforts.
Second, we built a new system to detect botnets with peer-to-peer (P2P) command-and-control (C&C) structures (i.e., P2P botnets), currently the most robust C&C structures against disruption efforts. Our system aims to boost the effectiveness of existing P2P botnet detection by detecting P2P botnets in two challenging scenarios: i) the botnets perform stealthy attacks that are extremely hard to observe in network traffic; ii) the bot-infected hosts also run legitimate P2P applications (e.g., BitTorrent and Skype).
Finally, we built a novel traffic analysis framework to boost the scalability of existing botnet detection systems. The framework effectively and efficiently identifies the small percentage of hosts that are likely to be bots, and forwards the network traffic associated with these hosts to existing detection systems for fine-grained analysis. It includes a novel botnet-aware, adaptive packet-sampling algorithm and a scalable flow-correlation technique.
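The botnet-aware sampling idea can be sketched as a toy per-host policy; the fan-out signal and the thresholds below are illustrative assumptions, not the dissertation's actual algorithm:

```python
import random

class AdaptiveSampler:
    """Toy botnet-aware sampler.

    Hosts that contact many distinct destinations (a crude bot-likeness
    signal, assumed here) get a higher per-packet sampling probability,
    so their traffic is more likely to reach fine-grained analysis."""

    def __init__(self, base=0.01, boosted=0.5, fanout_threshold=20):
        self.base = base            # sampling probability for ordinary hosts
        self.boosted = boosted      # probability for suspicious hosts
        self.threshold = fanout_threshold
        self.fanout = {}            # src -> set of destinations seen

    def probability(self, src):
        if len(self.fanout.get(src, ())) >= self.threshold:
            return self.boosted
        return self.base

    def observe(self, src, dst):
        """Record the flow and decide whether to sample this packet."""
        self.fanout.setdefault(src, set()).add(dst)
        return random.random() < self.probability(src)

sampler = AdaptiveSampler()
# A host that contacts 30 distinct destinations crosses the threshold:
for i in range(30):
    sampler.observe("scanner", f"10.0.0.{i}")
print(sampler.probability("scanner"), sampler.probability("quiet-host"))
```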
|
38 |
Performance analysis of peer-to-peer botnets using "The Storm Botnet" as an exemplar
Agarwal, Sudhir, 03 May 2010 (has links)
Among malicious code such as computer viruses and worms, botnets have attracted significant attention and have become one of the biggest threats on the Internet. Botnets have evolved to incorporate peer-to-peer communications in order to propagate instructions to large numbers of computers (known as bots) under the botmaster's control. The impact of a botnet lies in the botmaster's ability to execute large-scale attacks while remaining hidden as the true director of the attack. One recently known botnet is the Storm botnet. Storm is based on the Overnet Distributed Hash Table (DHT) protocol, which in turn is based on the Kademlia DHT protocol. Significant research has been done on determining its operational size, behaviour, and mitigation approaches.
In this research, the peer-to-peer behaviour of Storm is studied by simulating its actual packet-level network behaviour. The packet-level simulator is developed with the simulation framework OMNeT++ to determine the impact of design parameters on botnet performance and resilience. Parameters such as botnet size, peer-list size, the number of botmasters, and the key propagation time are explored. Furthermore, two mitigation strategies are considered: a) a random removal (disinfection) strategy, which removes selected bots at random from the botnet; b) a Sybil disruption strategy, which introduces fake bots into the botnet that propagate Sybil values to disrupt the communication channels between the controllers and the compromised machines. The simulation studies demonstrate that Sybil disruption strategies outperform random removal strategies; the results also indicate that random removal strategies are not effective even for small networks. The results are particularly applicable to the Storm botnet, but they also provide insights that apply to peer-to-peer botnets in general.
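A minimal model of the random-removal experiment can be sketched as reachability over randomly drawn peer lists; a Sybil strategy would instead poison those peer lists with fake entries rather than remove bots. The overlay model and the sizes below are assumptions, not the simulator's parameters:

```python
import random

random.seed(7)  # fixed seed so the toy experiment is repeatable

def random_overlay(n, peers_per_bot):
    """Toy P2P overlay: each bot keeps a fixed-size random peer list."""
    ids = list(range(n))
    return {b: random.sample([p for p in ids if p != b], peers_per_bot)
            for b in ids}

def reachable(graph, seed, removed):
    """Bots the botmaster can still reach by following peer lists."""
    seen, stack = set(), [seed]
    while stack:
        b = stack.pop()
        if b in removed or b in seen:
            continue
        seen.add(b)
        stack.extend(graph[b])
    return seen

overlay = random_overlay(200, peers_per_bot=5)
# Randomly disinfect 25% of the bots (the botmaster at id 0 survives):
removed = set(random.sample(range(1, 200), 50))
survivors = reachable(overlay, seed=0, removed=removed)
print(len(survivors))
```

With redundant peer lists, removing a quarter of the bots typically leaves most of the rest reachable, which is the intuition behind random removal being a weak strategy.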
|
39 |
Security inspection of network traffic
Kult, Viktor, January 2017 (has links)
The thesis concerns information security in corporate environments. The literature review draws on articles and books in the field of information security, selected with a focus on security risks, security technologies, and legislative regulation. Attention centres on technology that supports monitoring of communication flows in the data network. An overview of the traffic carried by a data network provides important information for preventing or investigating security incidents; monitoring also serves as a source of information for planning the network infrastructure, since it can reveal faults or insufficient transmission capacity. The practical part is dedicated to implementing a monitoring system in a real corporate network. Part of the work is an analysis of the network structure and the choice of appropriate tools for the implementation; when selecting tools, the scoring method of multi-criteria analysis can be used. Integrating the monitoring system also requires configuring the active network elements. Subsequent analysis of the network traffic yields information about the most active users, the most used applications, and the sources and targets of the transmitted data, which is valuable in case of a network failure or security incident. The conclusion summarizes the results and the workflow.
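The scoring method of multi-criteria analysis mentioned above amounts to a weighted sum over criteria. The criteria, weights, and scores below are invented for illustration; real values would come from the network-analysis step:

```python
# Hypothetical criteria weights (summing to 1) and tool scores (1-5 scale).
weights = {"flow_export": 0.4, "price": 0.3, "ease_of_integration": 0.3}
tools = {
    "tool_a": {"flow_export": 5, "price": 2, "ease_of_integration": 4},
    "tool_b": {"flow_export": 3, "price": 5, "ease_of_integration": 3},
}

def weighted_score(scores, weights):
    """Weighted-sum score of one tool across all criteria."""
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(tools, key=lambda t: weighted_score(tools[t], weights),
                 reverse=True)
print(ranking[0])  # tool_a ranks first with these invented numbers
```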
|
40 |
On P2P Networks and P2P-Based Content Discovery on the Internet
Memon, Ghulam, 17 June 2014 (has links)
The Internet has evolved into a medium centered around content: people watch videos on YouTube, share their pictures via Flickr, and use Facebook to keep in touch with their friends. Yet the only globally deployed service to discover content, the Domain Name System (DNS), does not discover content at all; it merely translates domain names into locations. The lack of persistent naming, in particular, makes content discovery, as opposed to domain discovery, challenging. Content Distribution Networks (CDNs), which augment DNS with location awareness, suffer from the same lack of persistent content names. Recently, several infrastructure-level solutions to this problem have emerged, but their fundamental limitation is that they fail to preserve the autonomy of network participants. Specifically, the storage requirements for resolution within each participant may not be proportional to their capacity. Furthermore, these solutions cannot be incrementally deployed. To the best of our knowledge, content discovery services based on peer-to-peer (P2P) networks are the only ones that support persistent content names. These services also come with the built-in advantages of scalability and deployability. However, P2P networks have been deployed in the real world only recently, and their real-world characteristics are not well understood. It is important to understand these characteristics in order to improve performance and to propose new designs by identifying the weaknesses of existing ones. In this dissertation, we first propose a novel, lightweight technique for capturing P2P traffic. Using our captured data, we characterize several aspects of P2P networks and draw conclusions about their weaknesses. Next, we create a botnet to demonstrate the lethality of these weaknesses.
Finally, we address the weaknesses of P2P systems to design a P2P-based content discovery service, which resolves the drawbacks of existing content discovery systems and can operate at Internet-scale.
This dissertation includes both previously published/unpublished and co-authored material.
|