  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Analytical modelling of scheduling schemes under self-similar network traffic: traffic modelling and performance analysis of centralized and distributed scheduling schemes

Liu, Lei January 2010
High-speed transmission over contemporary communication networks has drawn many research efforts. Traffic scheduling schemes, which play a critical role in managing network transmission, have been extensively studied and widely implemented in practical communication networks. In a sophisticated communication system, a variety of applications co-exist and require differentiated Quality-of-Service (QoS). Innovative scheduling schemes and hybrid scheduling disciplines that integrate multiple traditional scheduling mechanisms have emerged for QoS differentiation. This study aims to develop novel analytical models for scheduling schemes of common interest in communication systems under more realistic network traffic, and to use these models to investigate issues in the design and development of traffic scheduling schemes. It is widely recognized in the open literature that network traffic exhibits a self-similar nature, which has a serious impact on the performance of communication networks and protocols. To study self-similar traffic in depth, real-world traffic datasets are measured and evaluated in this work. The results reveal that self-similar traffic is a ubiquitous phenomenon in high-speed communication networks and highlight the importance of analytical models developed under self-similar traffic. Original analytical models are then developed, in the presence of self-similar traffic, for centralized scheduling schemes including Deficit Round Robin (DRR), the hybrid PQGPS scheme, which integrates the traditional Priority Queueing (PQ) and Generalized Processor Sharing (GPS) schemes, and the Automatic Repeat reQuest (ARQ) error control discipline. Research on Cognitive Radio (CR) techniques in wireless networks has recently become popular. However, most existing analytical models still employ traditional Poisson traffic to examine the performance of systems involving CR. In addition, few studies have been reported on estimating the residual service left by primary users; instead, many existing studies use an ON/OFF source to model the residual service regardless of the primary traffic. In this thesis, PQ theory is adopted to investigate and model the service left by self-similar primary traffic and to derive the queue length distribution of individual secondary users under a distributed spectrum random access protocol.
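As an illustration of one of the centralized disciplines named in this abstract, the sketch below is a minimal Deficit Round Robin scheduler in Python. It is an illustrative simulation of the textbook DRR algorithm, not the analytical model developed in the thesis; the quantum value and flow names in the usage example are arbitrary assumptions.

```python
from collections import deque

class DeficitRoundRobin:
    """Textbook Deficit Round Robin over per-flow FIFO queues of packet sizes."""

    def __init__(self, quantum_bytes):
        self.quantum = quantum_bytes   # credit added to each backlogged flow per round
        self.queues = {}               # flow_id -> deque of packet sizes (bytes)
        self.deficit = {}              # flow_id -> accumulated deficit counter
        self.active = deque()          # round-robin rotation of backlogged flows

    def enqueue(self, flow_id, packet_size):
        queue = self.queues.setdefault(flow_id, deque())
        self.deficit.setdefault(flow_id, 0)
        if not queue:                  # flow was idle: put it back into the rotation
            self.active.append(flow_id)
        queue.append(packet_size)

    def serve_round(self):
        """Serve one DRR round; return the list of (flow_id, packet_size) sent."""
        sent = []
        for _ in range(len(self.active)):
            flow_id = self.active.popleft()
            self.deficit[flow_id] += self.quantum
            queue = self.queues[flow_id]
            while queue and queue[0] <= self.deficit[flow_id]:
                size = queue.popleft()
                self.deficit[flow_id] -= size
                sent.append((flow_id, size))
            if queue:
                self.active.append(flow_id)   # still backlogged: stays in rotation
            else:
                self.deficit[flow_id] = 0     # DRR resets the deficit of an emptied queue
        return sent

# Arbitrary example: a bulk flow with large packets and a voice flow with small ones.
drr = DeficitRoundRobin(quantum_bytes=500)
for size in (1500, 1500, 60):
    drr.enqueue("bulk", size)
drr.enqueue("voice", 200)
print(drr.serve_round())   # first round: only the small voice packet fits within one quantum
```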
2

Detecting ransomware in encrypted network traffic using machine learning

Modi, Jaimin 29 August 2019
Ransomware is a type of malware that has gained immense popularity in recent times due to its money-extortion techniques. It locks users out of their files until the ransom is paid. Existing approaches to ransomware detection predominantly focus on system-level monitoring, for instance by tracking file system characteristics. To date, only a small amount of research has focused on detecting ransomware at the network level, and none of the published proposals have addressed the challenges raised by the fact that an increasing number of ransomware families use encrypted channels, mainly over the HTTPS protocol, to communicate with the command and control (C&C) server. Despite the limited amount of ransomware-specific data available in network traffic, network-level detection represents a valuable extension of system-level detection, as it can provide an early indication of ransomware activity and allow such activity to be disrupted before serious damage takes place. To address this gap, this thesis proposes a new approach for detecting ransomware in encrypted network traffic that leverages network connection and certificate information together with machine learning. We observe that network traffic characteristics can be divided into three categories: connection-based, encryption-based, and certificate-based. Based on these characteristics, we explore a feature model that effectively separates ransomware traffic from normal traffic. We study three classifiers: Random Forest, SVM, and Logistic Regression. Experimental evaluation on a diversified dataset yields a detection rate of 99.9% and a false positive rate of 0% for Random Forest, the best-performing of the three classifiers.
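The classification step described in this abstract can be sketched with a standard library; the snippet below trains a Random Forest on pre-extracted flow features. It is a minimal sketch, not the thesis's implementation: the file name flow_features.csv, the label column is_ransomware, and the hyperparameters are assumptions for illustration.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical CSV: one row per flow with connection/encryption/certificate
# features and a binary "is_ransomware" label produced by prior feature extraction.
flows = pd.read_csv("flow_features.csv")
feature_cols = [c for c in flows.columns if c != "is_ransomware"]

X_train, X_test, y_train, y_test = train_test_split(
    flows[feature_cols], flows["is_ransomware"],
    test_size=0.3, stratify=flows["is_ransomware"], random_state=0,
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)  # illustrative settings
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```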
3

An Investigation of a Multi-Objective Genetic Algorithm applied to Encrypted Traffic Identification

Bacquet, Carlos 10 August 2010
This work explores the use of a Multi-Objective Genetic Algorithm (MOGA) for both feature selection and cluster-count optimization for an unsupervised machine learning technique, K-Means, applied to encrypted traffic identification (SSH). The performance of the proposed model is benchmarked against other unsupervised learning techniques from the literature: basic K-Means, semi-supervised K-Means, DBSCAN, and EM. Results show that the proposed MOGA not only outperforms the other models but also provides a good trade-off in terms of detection rate, false positive rate, and the time to build and run the model. A hierarchical version of the proposed model is also implemented to observe the gains, if any, obtained by increasing cluster purity through a second layer of clusters. Results show that the hierarchical MOGA yields significant gains in the classification performance of the system.
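To make the idea concrete, the sketch below shows one way a candidate solution (a feature mask plus a cluster count k) could be encoded and scored on two competing objectives for K-Means. It is only an illustration under assumed objectives (silhouette score versus number of selected features, on random placeholder data); the thesis optimizes detection-rate and false-positive-rate style objectives, and a real MOGA would evolve a population of such candidates with selection, crossover, and mutation rather than scoring random ones.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))   # placeholder flow-feature matrix (300 flows, 10 features)

def evaluate(mask, k, X):
    """Score one candidate (boolean feature mask, cluster count k); maximize both values."""
    if mask.sum() < 1 or k < 2:
        return (-1.0, -np.inf)
    Xs = X[:, mask]
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(Xs)
    return (silhouette_score(Xs, labels), -mask.sum())   # cohesion up, feature count down

# Random candidates stand in for a GA population; a MOGA would keep the Pareto front
# of non-dominated candidates across generations instead of a single winner.
population = [(rng.random(10) < 0.5, rng.integers(2, 8)) for _ in range(20)]
scored = [(evaluate(m, k, X), m, k) for m, k in population]
best = max(scored, key=lambda s: s[0][0])
print("best silhouette:", round(best[0][0], 3),
      "k =", int(best[2]), "features =", int(best[1].sum()))
```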
4

On enabling dynamically adaptable Internet applications

Bhatti, Saleem Noel January 1998
No description available.
5

Asymptotics of multi-buffered queueing systems with generalised processor sharing

Kotopoulos, Constantinos A. January 2000
No description available.
6

Identifying User Actions from Network Traffic

Rizothanasis, Georgios January 2015
Identification of a user's actions while browsing the Internet is mostly achieved by instrumenting the user's browser or by obtaining server logs. In both cases this requires installing software on multiple clients and/or servers in order to obtain sufficient data. Using network traffic, however, gives access to user-generated traffic from multiple clients to multiple servers. In this project, a proxy server is used to record network traffic, and a user-action identification algorithm is proposed. The proposed algorithm includes various policies for analyzing network traffic in order to identify user actions. The project also presents an evaluation framework for the proposed policies, which reveals the trade-offs among them. Proxy servers are widely deployed by numerous organizations and are often used for web mining, so the user-action recognition developed in this project can serve as a new tool for web traffic evaluation.
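The following is a minimal sketch of one plausible grouping policy of the kind this abstract alludes to: requests recorded at the proxy are attributed to the same user action when they fall within a short idle gap of one another. The record format and the 1.5-second threshold are assumptions for illustration, not the specific policies evaluated in the thesis.

```python
from dataclasses import dataclass

@dataclass
class Request:
    client: str       # client IP as seen by the proxy
    timestamp: float  # seconds since epoch
    url: str

def group_into_actions(requests, idle_gap=1.5):
    """Group one client's requests into candidate user actions by idle time."""
    actions, current = [], []
    for req in sorted(requests, key=lambda r: r.timestamp):
        if current and req.timestamp - current[-1].timestamp > idle_gap:
            actions.append(current)   # gap exceeded: previous action is complete
            current = []
        current.append(req)
    if current:
        actions.append(current)
    return actions

# Toy proxy log for a single client: a page load followed by a later click.
log = [Request("10.0.0.5", 0.0, "/index.html"),
       Request("10.0.0.5", 0.3, "/style.css"),
       Request("10.0.0.5", 5.2, "/next-page.html")]
print([len(a) for a in group_into_actions(log)])   # [2, 1]: two separate user actions
```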
7

Trafgen: An efficient approach to statistically accurate artificial network traffic generation

Helvey, Eric Lee January 1998
No description available.
8

Detecting Hidden Wireless Cameras through Network Traffic Analysis

Cowan, KC Kaye 02 October 2020
Wireless cameras dominate the home surveillance market, providing an additional layer of security for homeowners. Cameras are not limited to private residences; retail stores, public bathrooms, and public beaches are only some of the locations where wireless cameras may be monitoring people's movements. When cameras are deployed into an environment, one would typically expect the user to disclose the presence of the camera as well as its location, which should be outside of a private area. However, adversarial camera users may withhold this information and prevent others from discovering the camera, forcing people to determine on their own whether they are being recorded. To uncover hidden cameras, a wireless camera detection system must be developed that recognizes a camera's network traffic characteristics. We monitor the network traffic within the immediate area using a separately developed packet sniffer, a program that observes and collects information about network packets. We analyze and classify these packets based on how well their patterns and features match those expected of a wireless camera. Using a Support Vector Machine classifier and a secondary level of classification to reduce false positives, we design and implement a system that uncovers the presence of hidden wireless cameras within an area.

Wireless cameras may be found almost anywhere, whether they are used to monitor city traffic and report on travel conditions or to act as home surveillance when residents are away. Regardless of their purpose, wireless cameras may observe people wherever they are, as long as a power source and Wi-Fi connection are available. While most wireless camera users install such devices for peace of mind, some take advantage of cameras to record others without their permission, sometimes in compromising positions or places. Because of this, systems are needed that can detect hidden wireless cameras. We develop a system that monitors network traffic packets, specifically their packet lengths and direction, and determines whether the properties of the packets mimic those of a wireless camera stream. A double-layered classification technique is used to uncover hidden wireless cameras and filter out devices that are not wireless cameras.
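A rough sketch of the first classification layer described here is given below: per-device traffic in an observation window is summarized by packet-length and direction statistics and fed to an SVM. The chosen features, the toy training data, and the window format are illustrative assumptions, and the thesis's secondary classification layer is not reproduced.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def window_features(lengths, directions):
    """Summarize one observation window of packets from a single device.

    lengths: packet sizes in bytes; directions: +1 uplink, -1 downlink.
    """
    lengths = np.asarray(lengths, dtype=float)
    directions = np.asarray(directions, dtype=float)
    up = lengths[directions > 0]
    return np.array([
        lengths.mean(), lengths.std(),        # overall packet-size statistics
        up.sum() / max(lengths.sum(), 1.0),   # uplink byte ratio (cameras stream uplink)
        len(lengths),                         # packet count as a rate proxy for the window
    ])

# Toy training set: one camera-like window (steady uplink stream) and one
# browsing-like window; real labels would come from controlled captures.
X = np.vstack([window_features([900] * 50, [1] * 50),
               window_features([100, 1400] * 10, [1, -1] * 10)])
y = np.array([1, 0])   # 1 = wireless camera, 0 = other device

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X, y)
print(model.predict([window_features([850] * 40, [1] * 40)]))  # expected: camera class (1)
```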
9

Modeling and Estimation Techniques for Wide-Area Network Traffic with Atypical Components

Minton, Carl Edward 30 April 2002
A critical first step toward improving existing wide-area networks and designing future ones is an understanding of the load placed on these networks. Efforts to model traffic are often confounded by atypical traffic: traffic particular to the observation site and not ubiquitously applicable. The causes and characteristics of atypical traffic are explored in this thesis. Atypical traffic is found to interfere with parsimonious analytic traffic models. A detection and modeling technique is presented and studied for atypical traffic characterized by strongly clustered inliers. This technique is found to be effective on both real-world observations and simulated data. Another form of atypical traffic is shown to result in multimodal distributions of connection statistics. Putative methods for bimodal estimation are reviewed and a novel technique, the midpoint-distance profile, is presented. The performance of these estimation techniques is studied via simulation, and the methods are examined in the context of atypical network traffic. The advantages and disadvantages of each method are reported.
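As a point of reference for the kind of bimodal estimation compared in this abstract, the sketch below fits a two-component Gaussian mixture to a synthetic connection statistic and reads off the component means and weights. This is a standard baseline approach, not the midpoint-distance profile introduced in the thesis, and the synthetic data are purely illustrative.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Synthetic stand-in for a bimodal connection statistic (e.g. transfer size on a
# log scale): a typical component plus a smaller, site-specific atypical component.
sample = np.concatenate([rng.normal(3.0, 0.4, 800), rng.normal(7.0, 0.6, 200)])

gmm = GaussianMixture(n_components=2, random_state=0).fit(sample.reshape(-1, 1))
order = np.argsort(gmm.means_.ravel())
print("component means:", np.round(gmm.means_.ravel()[order], 2),
      "weights:", np.round(gmm.weights_[order], 2))
```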
