1

Determine network survivability using heuristic models

Chua, Eng Hong 03 1900 (has links)
Approved for public release; distribution is unlimited. / Contemporary large-scale networked systems have improved the efficiency and effectiveness of our way of life. However, these benefits are accompanied by elevated risks of intrusion and compromise. Incorporating survivability capabilities into systems is one way to mitigate these risks. The Server Agent-based Active network Management (SAAM) project was initiated as part of the Next Generation Internet project to address increasing demand for multimedia Internet services. Its objective is to provide a consistent and dedicated quality of service to users. SAAM monitors network traffic conditions in a region and responds to routing requests from the routers in that region with optimal routes. Mobility has been incorporated into the SAAM server to prevent a single point of failure from bringing down the entire server and its service. With mobility, it is important to select a good SAAM server location from the client's point of view: the server must be placed at a node whose connection to the client is most survivable. To that end, a general metric is defined to measure the connection survivability of each potential server host. Because of the complexity of the network, however, computing this metric exactly is itself very complex. This thesis develops heuristic solutions of polynomial complexity to find the hosting server node, minimizing the time and computing power required. / Defence Science & Technology Agency (Singapore)
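The abstract does not specify the thesis's actual survivability metric or heuristic. As a purely illustrative sketch, one polynomial-time proxy for connection survivability is the number of edge-disjoint paths between a candidate host and the client (more disjoint paths means more link failures can be tolerated), computable with unit-capacity max-flow; the function names and graph encoding below are assumptions, not the thesis's method:

```python
# Illustrative sketch (not the thesis's actual metric): rank candidate
# server hosts by the number of edge-disjoint paths to the client,
# computed with unit-capacity Edmonds-Karp max-flow (polynomial time).
from collections import deque

def edge_disjoint_paths(adj, src, dst):
    """adj: dict node -> list of neighbors (undirected, listed both ways)."""
    # residual[u][v] = remaining capacity on edge u->v (1 per listed edge)
    residual = {u: {v: 1 for v in nbrs} for u, nbrs in adj.items()}
    paths = 0
    while True:
        # BFS for an augmenting path from src to dst in the residual graph.
        parent = {src: None}
        q = deque([src])
        while q and dst not in parent:
            u = q.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if dst not in parent:
            return paths  # no more augmenting paths
        # Push one unit of flow along the path found.
        v = dst
        while parent[v] is not None:
            u = parent[v]
            residual[u][v] -= 1
            residual[v][u] = residual[v].get(u, 0) + 1
            v = u
        paths += 1

def most_survivable_host(adj, client, candidates):
    """Pick the candidate whose connection to the client survives the most
    single-link failures, under the disjoint-path proxy above."""
    return max(candidates, key=lambda h: edge_disjoint_paths(adj, h, client))
```

Each max-flow run costs O(V * E^2) at worst, so scoring all candidates stays polynomial, in the spirit of the heuristics the abstract describes.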
2

A suitable server placement for peer-to-peer live streaming

Yuan, X.Q., Yin, H., Min, Geyong, Liu, X., Hui, W., Zhu, G.X. January 2013 (has links)
No / With the rapid growth in the scale, complexity, and heterogeneity of Peer-to-Peer (P2P) systems, dealing with peers' network-oblivious traffic and self-organization problems has become a great challenge. A potential solution is to deploy servers in appropriate locations. However, due to the unique features and requirements of P2P systems, traditional placement models cannot yield the desired service performance. To fill this gap, we propose an efficient server placement model for P2P live streaming systems. Compared to existing solutions, this model takes the ISP-friendliness problem into account and can reduce cross-network traffic among Internet Service Providers (ISPs). Specifically, we introduce peers' contribution into the proposed model, which makes it more suitable for P2P live streaming systems. Moreover, we deploy servers based on the theoretical solution subject to practical data and apply them to practical live streaming applications. The experimental results show that this new model reduces the amount of cross-network traffic, improves system efficiency, adapts better to the Internet environment, and is more suitable for P2P systems than traditional placement models.
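The abstract gives only the ingredients of the model (cross-ISP traffic, peer contribution), not its formulation. A minimal sketch of how those two ingredients could combine, with all names and the scoring rule assumed for illustration: each peer's server load is its demand discounted by its local upload contribution, and a candidate site is scored by the load that would cross ISP boundaries.

```python
# Illustrative sketch only (not the paper's actual model): score each
# candidate ISP by contribution-discounted cross-ISP server traffic.

def cross_isp_traffic(site_isp, peers):
    """peers: list of dicts with 'isp', 'demand', and 'contribution' in [0, 1]."""
    total = 0.0
    for p in peers:
        # A peer that uploads to its local swarm pulls less from the server.
        server_load = p['demand'] * (1.0 - p['contribution'])
        if p['isp'] != site_isp:
            total += server_load  # this load crosses an ISP boundary
    return total

def place_server(candidate_isps, peers):
    """Pick the candidate ISP minimizing cross-ISP server traffic."""
    return min(candidate_isps, key=lambda isp: cross_isp_traffic(isp, peers))
```

The design point the abstract emphasizes is visible here: ignoring `contribution` (the traditional placement view) can pick a different site than the contribution-aware score, since heavily contributing peers impose little server load wherever they sit.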
3

NetClust: A Framework for Scalable and Pareto-Optimal Media Server Placement

Yin, H., Zhang, X., Zhan, T.Y., Zhang, Y., Min, Geyong, Wu, D.O. January 2013 (has links)
No / Effective media server placement strategies are critical to the quality and cost of multimedia services. Existing studies have primarily focused on optimization-based algorithms that select server locations from a small pool of candidates using complete topological information; these algorithms do not scale, because such a candidate pool may be unavailable and gathering full topological information in large-scale networks is inefficient. To overcome this limitation, a novel scalable framework called NetClust is proposed in this paper. NetClust takes advantage of the latest network coordinate techniques to reduce the workload of obtaining global network information for server placement, and adopts a new K-means-clustering-based algorithm to select server locations and identify the optimal matching between clients and servers. The key contribution of this paper is that the proposed framework simultaneously optimizes the trade-off between service delay and deployment cost under the constraints of the client location distribution and the computing/storage/bandwidth capacity of each server. To evaluate the performance of the proposed framework, a prototype system is developed and deployed on the real-world, large-scale Internet. Experimental results demonstrate that 1) NetClust achieves lower deployment cost and lower delay than the traditional server selection method; and 2) NetClust offers a practical and feasible solution for multimedia service providers.
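The clustering step the abstract names can be sketched in a few lines: clients are embedded as network coordinates, K-means (Lloyd's algorithm) groups nearby clients, and each cluster centroid becomes a candidate server location with clients matched to their cluster's server. This is a generic K-means sketch under assumed 2-D coordinates, not NetClust's actual algorithm or its capacity/cost constraints:

```python
# Generic K-means sketch of the clustering idea (not NetClust itself):
# cluster clients by 2-D network coordinates; one server per centroid.
import random

def kmeans_place_servers(coords, k, iters=20, seed=0):
    """coords: list of (x, y) client network coordinates; returns k server positions."""
    rng = random.Random(seed)
    centroids = rng.sample(coords, k)  # initialize from k distinct clients
    for _ in range(iters):
        # Assign each client to its nearest candidate server location.
        clusters = [[] for _ in range(k)]
        for x, y in coords:
            i = min(range(k),
                    key=lambda j: (x - centroids[j][0]) ** 2 + (y - centroids[j][1]) ** 2)
            clusters[i].append((x, y))
        # Move each candidate server to the centroid of its assigned clients.
        for j, pts in enumerate(clusters):
            if pts:
                centroids[j] = (sum(p[0] for p in pts) / len(pts),
                                sum(p[1] for p in pts) / len(pts))
    return centroids
```

Because the inputs are low-dimensional coordinates rather than a full topology map, the per-iteration cost is linear in the number of clients, which matches the scalability argument the abstract makes.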
