About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Reliable content delivery using persistent data sessions in a highly mobile environment /

Pantoleon, Periklis K. January 2004 (has links) (PDF)
Thesis (M.S. in Computer Science)--Naval Postgraduate School, March 2004. / Thesis advisor(s): Wen Su, John Gibson. Includes bibliographical references (p. 189). Also available online.
2

Reliable content delivery using persistent data sessions in a highly mobile environment

Pantoleon, Periklis K. 03 1900 (has links)
Approved for public release; distribution is unlimited / Special Forces are crucial in specific military operations. They usually operate in hostile territory where communications are difficult to establish and preserve, since the operations are often carried out in a remote environment and the communications need to be highly mobile. The delivery of information about the geographical parameters of the area can be crucial to the completion of their mission. But in that highly mobile environment, the connectivity of the established wireless local area networks (LANs) can be unstable and intermittently unavailable. Existing content transfer protocols are not adaptive to volatile network connectivity. If a physical connection is lost, any information or part of a file already retrieved is discarded, and the same information must be retransmitted after the lost session is reestablished. The intention of this thesis is to develop an application-layer protocol that preserves the already transmitted part of the file so that, when the session is reestablished, the information server can continue sending the rest of the file to the requesting host. Further, if the same content is available from another server through a better route, the new server should be able to continue serving the content, starting from where the session with the previous server ended. / Lieutenant, Hellenic Navy
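The thesis describes a custom application-layer protocol; its wire format is not reproduced here. As a minimal sketch of the core idea only (resume a transfer from the last received byte, even when a different replica takes over), the following uses invented names and in-memory "servers" rather than the thesis's actual design:

```python
# Sketch of resume-from-offset delivery with server failover. The function
# names, chunk size, and in-memory servers are illustrative, not from the
# thesis; a real deployment would carry the offset in the protocol itself.

FILE = b"geodata:" + bytes(range(256)) * 4  # same content replicated on each server

def serve(server: bytes, offset: int, budget: int) -> bytes:
    """Return up to `budget` bytes of the replicated file, starting at `offset`."""
    return server[offset:offset + budget]

def fetch_with_failover(servers, size, chunk=300):
    """Pull the file, continuing from the last received byte even when the
    serving host changes between chunks (simulating a route/server change)."""
    received = b""
    turn = 0
    while len(received) < size:
        src = servers[turn % len(servers)]  # a "better route" may select a new server
        received += serve(src, len(received), chunk)
        turn += 1
    return received
```

Because every request names an explicit offset, no already-delivered bytes are ever retransmitted, which is the property the abstract contrasts against session-oriented protocols that restart from zero.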
3

Using XML/HTTP to store, serve and annotate tactical scenarios for X3D operational visualization and anti-terrorist training

Mnif, Khaled 03 1900 (has links)
Approved for public release, distribution unlimited / Adopting Extensible Markup Language (XML) and the Hypertext Transfer Protocol (HTTP) is a key step in accommodating the evolution of Internet technologies. While HTTP is already a proven standard communication protocol responsible for the rapid expansion of the World Wide Web, XML provides general mechanisms for defining validatable documents and addresses several deficiencies of HTML regarding diverse document structure and content. XML and HTTP together provide many of the essential capabilities associated with database engines. The Modeling, Virtual Environments and Simulation (MOVES) Institute of the Naval Postgraduate School (NPS) is continuing to build a database of 3D tactical scenarios using X3D and VRML tools. The configuration parameters and statistical results of these scenarios are XML documents. For a better understanding and usability of these results by the end users, a Web-based application stores and manipulates these XML documents. This thesis develops a server-side application that can store, serve, and annotate tactical scenarios for X3D operational visualization and anti-terrorist training by using XML and HTTP technologies. The experimental demonstration for this work is the prototypical Anti-Terrorism/Force Protection (AT/FP) simulation model developed by Lieutenant James W. Harney, USN, using Extensible 3D Graphics (X3D)/Virtual Reality Modeling Language (VRML) models. / Captain, Tunisian Army
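The thesis builds a full server-side application; only the store/annotate core is sketched below, using Python's standard-library XML tools. The element and attribute names are invented for illustration and are not the MOVES scenario schema:

```python
# Hedged sketch of an XML scenario store with annotation, in the spirit of
# the thesis's store/serve/annotate workflow. Scenario markup is made up.
import xml.etree.ElementTree as ET

store = {}  # scenario name -> parsed XML root element

def put_scenario(name: str, xml_text: str) -> None:
    """Store a scenario; parsing also rejects malformed XML up front."""
    store[name] = ET.fromstring(xml_text)

def annotate(name: str, note: str) -> str:
    """Append an annotation element to a stored scenario and return the result."""
    root = store[name]
    ET.SubElement(root, "annotation").text = note
    return ET.tostring(root, encoding="unicode")

put_scenario("harbor", "<scenario><asset type='ship'/></scenario>")
result = annotate("harbor", "patrol boat enters at t=120s")
```

In the actual application these operations would sit behind HTTP endpoints (e.g. PUT to store, POST to annotate, GET to serve), which is the pairing of XML and HTTP the abstract describes.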
4

Using XML/HTTP to store, serve and annotate tactical scenarios for X3D operational visualization and anti-terrorist training /

Mnif, Khaled. January 2003 (has links) (PDF)
Thesis (M.S. in Computer Science)--Naval Postgraduate School, March 2003. / Thesis advisor(s): Don Brutzman, Curtis L. Blais. Includes bibliographical references (p. 121-122). Also available online.
5

A new model for the marginal distribution of HTTP request rate

Judge, John Thomas. January 2004 (has links)
Thesis (Ph.D.)--University of Wollongong, 2004. / Typescript. Includes bibliographical references: leaves 106-117.
6

HTTP 1.2 Distributed HTTP for load balancing server systems : a thesis /

O'Daniel, Graham Michael. Haungs, Michael L. January 1900 (has links)
Thesis (M.S.)--California Polytechnic State University, 2010. / Title from PDF title page; viewed on June 21, 2010. Major professor: Michael Haungs, Ph.D. "Presented to the faculty of California Polytechnic State University, San Luis Obispo." "In partial fulfillment of the requirements for the degree [of] Master of Science in Computer Science." "June 2010." Includes bibliographical references (p. 76-77).
7

Network monitoring with focus on HTTP

Schmid, Andreas 01 May 1998 (has links)
Since its introduction in the early 1990s, the rapid growth of World Wide Web (WWW) traffic has raised the question of whether past Local Area Network (LAN) packet traces still reflect the current situation or whether they have become obsolete. For this thesis, several LAN packet traces were obtained by monitoring the LAN of a typical academic environment. The tools for monitoring the network were a stand-alone HP LAN Protocol Analyzer as well as the freeware software tool tcpdump. The main focus was placed on acquiring a low-level overview of the LAN traffic. Thus, it was possible to determine which protocols were mainly used and how the packet sizes were distributed. In particular, this study aimed at establishing the amount of WWW traffic on the LAN and determining the MIME types of this traffic. The results indicate that in a typical academic environment, conventional sources of LAN traffic such as NFS are still predominant, whereas WWW traffic plays a rather marginal role. Furthermore, a large portion of the network packets contains little or no data at all, while another significant portion of the packets has sizes around the Maximum Transfer Unit (MTU). Consequently, research in the networking field has to direct its focus on issues besides the WWW. / Graduation date: 1998
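The study's key distributional finding (many near-empty packets, a second mass near the MTU) amounts to bucketing packet sizes. A minimal tally in that spirit, with made-up sample sizes rather than the thesis traces and thresholds chosen here for illustration:

```python
# Bucket packet sizes the way the abstract discusses: header-only/small
# packets versus packets near the Maximum Transfer Unit (MTU). The 64-byte
# and 90%-of-MTU cutoffs are illustrative assumptions, not from the thesis.
from collections import Counter

MTU = 1500  # typical Ethernet MTU

def size_profile(sizes):
    buckets = Counter()
    for s in sizes:
        if s <= 64:
            buckets["small"] += 1        # little or no payload
        elif s >= 0.9 * MTU:
            buckets["near-MTU"] += 1     # full-sized data packets
        else:
            buckets["mid"] += 1
    return buckets

sample = [60, 60, 64, 400, 1460, 1500, 1500]  # fabricated sample data
profile = size_profile(sample)
```

On a real capture, the sizes would come from per-packet lengths reported by tcpdump rather than a hand-written list.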
8

Program analysis to support quality assurance techniques for web applications

Halfond, William G. J. 20 January 2010 (has links)
As web applications occupy an increasingly important role in the day-to-day lives of millions of people, testing and analysis techniques that ensure that these applications function with a high level of quality are becoming even more essential. However, many software quality assurance techniques are not directly applicable to modern web applications. Certain characteristics, such as the use of HTTP and generated object programs, can make it difficult to identify software abstractions used by traditional quality assurance techniques. More generally, many of these abstractions are implemented differently in web applications, and the lack of techniques to identify them complicates the application of existing quality assurance techniques to web applications. This dissertation describes the development of program analysis techniques for modern web applications and shows that these techniques can be used to improve quality assurance. The first part of the research focuses on the development of a suite of program analysis techniques that identifies useful abstractions in web applications. The second part of the research evaluates whether these program analysis techniques can be used to successfully adapt traditional quality assurance techniques to web applications, improve existing web application quality assurance techniques, and develop new techniques focused on web application-specific issues. The work in quality assurance techniques focuses on improving three different areas: generating test inputs, verifying interface invocations, and detecting vulnerabilities. The evaluations of the resulting techniques show that the use of the program analyses results in significant improvements in existing quality assurance techniques and facilitates the development of new useful techniques.
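One of the three quality-assurance areas named above is verifying interface invocations. The dissertation's actual analyses recover interfaces from code; as a toy sketch of the check itself, the following uses a hand-written interface spec (the paths, parameter names, and report shape are invented for illustration):

```python
# Toy invocation check: flag parameters that are missing or unknown for a
# web-app "interface" (its accepted parameter set). The spec below is a
# stand-in for what the dissertation's program analyses would derive.

interfaces = {
    "/login": {"required": {"user", "password"}, "optional": {"remember"}},
}

def verify_invocation(path, params):
    """Compare supplied request parameters against the interface spec."""
    spec = interfaces[path]
    supplied = set(params)
    missing = spec["required"] - supplied
    unknown = supplied - spec["required"] - spec["optional"]
    return {"missing": sorted(missing), "unknown": sorted(unknown)}

# A request that misspells an optional parameter and omits a required one:
report = verify_invocation("/login", {"user": "kim", "rememberme": "1"})
```

An empty report in both fields would mean the invocation matches the interface; anything else is the kind of mismatch such a verifier would surface.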
9

Enabling energy-awareness for internet video

Ejembi, Oche Omobamibo January 2016 (has links)
Continuous improvements to the state of the art have made it easier to create, send and receive vast quantities of video over the Internet. Catalysed by these developments, video is now the largest and fastest growing type of traffic on modern IP networks. In 2015, video was responsible for 70% of all traffic on the Internet, with a compound annual growth rate of 27%. On the other hand, concerns about the growing energy consumption of ICT in general continue to rise. It is not surprising that there is a significant energy cost associated with these extensive video usage patterns. In this thesis, I examine the energy consumption of typical video configurations during decoding (playback) and encoding through empirical measurements on an experimental test-bed. I then make extrapolations to a global scale to show the opportunity for significant energy savings, achievable by simple modifications to these video configurations. Based on insights gained from these measurements, I propose a novel, energy-aware Quality of Experience (QoE) metric for digital video: the Energy-Video Quality Index (EnVI). Then, I present and evaluate vEQ-benchmark, a benchmarking and measurement tool for the purpose of generating EnVI scores. The tool enables fine-grained resource-usage analyses on video playback systems, and facilitates the creation of statistical models of power usage for these systems. I propose GreenDASH, an energy-aware extension of the existing Dynamic Adaptive Streaming over HTTP (DASH) standard. GreenDASH incorporates relevant energy-usage and video quality information into the existing standard. It could enable dynamic, energy-aware adaptation for video in response to energy usage and user ‘green’ preferences. I also evaluate the subjective perception of such energy-aware, adaptive video streaming by means of a user study featuring 36 participants. I examine how video may be adapted to save energy without a significant impact on the Quality of Experience of these users. In summary, this thesis highlights the significant opportunities for energy savings if Internet users gain an awareness of their energy usage, and presents a technical discussion of how this can be achieved by straightforward extensions to the current state of the art.
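GreenDASH's exact manifest attributes are not given in the abstract, so the sketch below only illustrates the kind of energy-aware adaptation decision described: among DASH-style representations, pick the best quality whose estimated decode power fits a user's "green" budget. The (bitrate, est_watts, quality) tuples are invented sample data:

```python
# Energy-aware representation selection, in the spirit of GreenDASH.
# Power and quality figures are fabricated for illustration; in GreenDASH
# such information would be carried in the (extended) DASH manifest.

def pick_representation(reps, power_budget_w):
    """Among representations whose estimated decode power fits the budget,
    choose the highest-quality one; fall back to the cheapest otherwise."""
    feasible = [r for r in reps if r["est_watts"] <= power_budget_w]
    pool = feasible or [min(reps, key=lambda r: r["est_watts"])]
    return max(pool, key=lambda r: r["quality"])

reps = [
    {"bitrate": 1_000_000,  "est_watts": 2.1, "quality": 60},
    {"bitrate": 4_000_000,  "est_watts": 3.5, "quality": 80},
    {"bitrate": 12_000_000, "est_watts": 6.0, "quality": 95},
]
choice = pick_representation(reps, power_budget_w=4.0)
```

A stricter budget simply pushes the choice down the ladder, which is the quality-for-energy trade-off the user study evaluates.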
10

Get the right price every day

Garcia Sotelo, Gerardo Javier 01 January 2005 (has links)
The purpose of this project is to manage restaurants using a software system called GRIPED (Get the Right Price Every day). The system is designed to cover quality control, food cost control and portion control for better management of a restaurant.
