  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Optimizing bandwidth of tactical communications systems

Cox, Criston W. (has links)
Current tactical networks are oversaturated, often slowing systems to unusable speeds. Using data collected from major exercises and Operation Iraqi Freedom II (OIF II), a model of typical existing tactical network performance is built and analyzed in NETWARS, a DISA-sponsored communication systems modeling and simulation program. Optimization technologies are then introduced, such as network compression, caching, Quality of Service (QoS), and the Space Communications Protocol Standards Transport Protocol (SCPS-TP). The model is then altered to reflect an optimized system, and simulations are run for comparison. Data for the optimized model was obtained by testing commercial optimization products, known as Protocol Enhancement Proxies (PEPs), at the Marine Corps Tactical Systems Support Activity (MCTSSA) testing laboratory.
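The bandwidth gains the thesis attributes to network compression can be illustrated with a minimal sketch (using Python's standard zlib rather than the commercial PEP products tested at MCTSSA; the sample payload is invented):

```python
import zlib

def compression_savings(payload: bytes, level: int = 6) -> float:
    """Fraction of link bandwidth saved by compressing a payload before transmission."""
    compressed = zlib.compress(payload, level)
    return 1.0 - len(compressed) / len(payload)

# Highly redundant traffic (e.g. repeated status reports) compresses well;
# already-compressed or encrypted traffic does not.
redundant = b"UNIT STATUS GREEN; " * 200
print(f"bandwidth saved: {compression_savings(redundant):.0%}")
```

On a saturated tactical link, savings like these translate directly into headroom for additional traffic, which is why compression appears alongside caching and QoS in the optimized model.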
52

Information-theoretics based technoeconomic growth models: simulation and computation of forecasting in telecommunication services

Unknown Date (has links)
This research is concerned with algorithmic representation of technoeconomic growth in modern and next-generation telecommunications, including Internet service. The goal of this study is to support the associated forecasting efforts; the envisioned tasks include: (i) reviewing the technoeconomic considerations prevailing in the telecommunication (telco) service industry and their implications; (ii) studying relevant aspects of the underlying complex-system evolution (akin to biological systems); (iii) pursuant co-evolution modeling of competitive business structures using dichotomous (flip-flop) states as seen in predator evolutions; (iv) conceiving a novel algorithm based on information-theoretic principles for technoeconomic forecasting, built on a modified Fisher-Kaysen model consistent with the proportional-fairness concept of consumers' willingness-to-pay; and (v) evaluating forecast needs for congestion-sensitive traffic encountered on inter-office facilities. Commensurate with the topics indicated above, necessary algorithms, analytical derivations and compatible models are proposed. Relevant computational exercises are performed in MATLAB[TM] using data gathered from the open literature on the service profiles of telecommunication companies (telcos), and ad hoc model verifications are performed on the results. Lastly, discussions and inferences are made, with open questions identified for further research. / by Raef Rashad Yassin. / Thesis (Ph.D.)--Florida Atlantic University, 2012. / Includes bibliography. / Mode of access: World Wide Web. / System requirements: Adobe Reader.
53

An empirical study on Chinese text compression: from character-based to word-based approach.

January 1997 (has links)
by Kwok-Shing Cheng. / Thesis (M.Phil.)--Chinese University of Hong Kong, 1997. / Includes bibliographical references (leaves 114-120). / Abstract --- p.i / Acknowledgement --- p.iii / Chapter 1 --- Introduction --- p.1 / Chapter 1.1 --- Importance of Text Compression --- p.1 / Chapter 1.2 --- Motivation of this Research --- p.2 / Chapter 1.3 --- Characteristics of Chinese --- p.2 / Chapter 1.3.1 --- Huge size of character set --- p.3 / Chapter 1.3.2 --- Lack of word segmentation --- p.3 / Chapter 1.3.3 --- Rich semantics --- p.3 / Chapter 1.4 --- Different Coding Schemes for Chinese --- p.4 / Chapter 1.4.1 --- Big5 Code --- p.4 / Chapter 1.4.2 --- GB (Guo Biao) Code --- p.4 / Chapter 1.4.3 --- HZ (Hanzi) Code --- p.5 / Chapter 1.4.4 --- Unicode Code --- p.5 / Chapter 1.5 --- Modeling and Coding for Chinese Text --- p.6 / Chapter 1.6 --- Static and Adaptive Modeling --- p.6 / Chapter 1.7 --- One-Pass and Two-Pass Modeling --- p.8 / Chapter 1.8 --- Ordering of models --- p.9 / Chapter 1.9 --- Two Sets of Benchmark Files and the Platform --- p.9 / Chapter 1.10 --- Outline of the Thesis --- p.11 / Chapter 2 --- A Survey of Chinese Text Compression --- p.13 / Chapter 2.1 --- Entropy for Chinese Text --- p.14 / Chapter 2.2 --- Weakness of Traditional Compression Algorithms on Chinese Text --- p.15 / Chapter 2.3 --- Statistical Class Algorithms for Compressing Chinese --- p.16 / Chapter 2.3.1 --- Huffman coding scheme --- p.17 / Chapter 2.3.2 --- Arithmetic Coding Scheme --- p.22 / Chapter 2.3.3 --- Restricted Variable Length Coding Scheme --- p.26 / Chapter 2.4 --- Dictionary-based Class Algorithms for Compressing Chinese --- p.27 / Chapter 2.5 --- Experiments and Results --- p.32 / Chapter 2.6 --- Chapter Summary --- p.35 / Chapter 3 --- Indicator Dependent Huffman Coding Scheme --- p.37 / Chapter 3.1 --- Chinese Character Identification Routine --- p.37 / Chapter 3.2 --- Reduction of Header Size --- p.39 / Chapter 3.3 --- Semi-adaptive IDC for Chinese Text --- p.44 / 
Chapter 3.3.1 --- Theoretical Analysis of Partition Technique for Compression --- p.48 / Chapter 3.3.2 --- Experiments and Results of the Semi-adaptive IDC --- p.50 / Chapter 3.4 --- Adaptive IDC for Chinese Text --- p.54 / Chapter 3.4.1 --- Experiments and Results of the Adaptive IDC --- p.57 / Chapter 3.5 --- Chapter Summary --- p.58 / Chapter 4 --- Cascading LZ Algorithms with Huffman Coding Schemes --- p.59 / Chapter 4.1 --- Variations of Huffman Coding Scheme --- p.60 / Chapter 4.1.1 --- Analysis of EPDC and PDC --- p.60 / Chapter 4.1.2 --- "Analysis of PDC, 16Huff and IDC" --- p.65 / Chapter 4.1.3 --- Time and Memory Consumption --- p.71 / Chapter 4.2 --- "Cascading LZSS with PDC, 16Huff and IDC" --- p.73 / Chapter 4.2.1 --- Experimental Results --- p.76 / Chapter 4.3 --- "Cascading LZW with PDC, 16Huff and IDC" --- p.79 / Chapter 4.3.1 --- Experimental Results --- p.82 / Chapter 4.4 --- Chapter Summary --- p.84 / Chapter 5 --- Applying Compression Algorithms to Word-segmented Chinese Text --- p.85 / Chapter 5.1 --- Background of word-based compression algorithms --- p.86 / Chapter 5.2 --- Terminology and Benchmark Files for Word Segmentation Model --- p.88 / Chapter 5.3 --- Word Segmentation Model --- p.88 / Chapter 5.4 --- Chinese Entropy from Byte to Word --- p.91 / Chapter 5.5 --- The Generalized Compression and Decompression Model for Word-segmented Chinese text --- p.92 / Chapter 5.6 --- Applying Huffman Coding Scheme to Word-segmented Chinese text --- p.94 / Chapter 5.7 --- Applying WLZSSHUF to Word-segmented Chinese text --- p.97 / Chapter 5.8 --- Applying WLZWHUF to Word-segmented Chinese text --- p.102 / Chapter 5.9 --- Match Ratio and Compression Ratio --- p.105 / Chapter 5.10 --- Chapter Summary --- p.108 / Chapter 6 --- Concluding Remarks --- p.110 / Chapter 6.1 --- Conclusions --- p.110 / Chapter 6.2 --- Contributions --- p.111 / Chapter 6.3 --- Future Directions --- p.112 / Chapter 6.3.1 --- Integrate Decremental Coding Scheme with IDC --- 
p.112 / Chapter 6.3.2 --- Re-order the Character Sequences in the Sliding Window of LZSS --- p.113 / Chapter 6.3.3 --- Multiple Huffman Trees for Word-based Compression --- p.113 / Bibliography --- p.114
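The character-to-word shift this thesis studies can be illustrated with a toy zero-order entropy comparison. This sketch uses an English sample in place of segmented Chinese text and is not the thesis's algorithm: a word model costs more bits per symbol but encodes far fewer symbols.

```python
import math
from collections import Counter

def entropy_bits_per_symbol(symbols) -> float:
    """Shannon entropy (bits/symbol) of a zero-order model over the symbols."""
    counts = Counter(symbols)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

text = "the cat sat on the mat the cat sat"
chars = list(text)    # character-based model: many cheap symbols
words = text.split()  # word-based model: few expensive symbols (pre-segmented)

h_char = entropy_bits_per_symbol(chars)
h_word = entropy_bits_per_symbol(words)
# Total coding cost is bits/symbol times number of symbols.
print(f"char model: {h_char:.2f} bits x {len(chars)} symbols")
print(f"word model: {h_word:.2f} bits x {len(words)} symbols")
```

For Chinese the word model additionally requires the segmentation step the thesis describes, since the text lacks explicit word boundaries.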
54

LDGM codes for wireless and quantum systems

Lou, Hanqing. January 2006 (has links)
Thesis (Ph.D.)--University of Delaware, 2006. / Principal faculty advisor: Javier Garcia-Frias, Dept. of Electrical and Computer Engineering. Includes bibliographical references.
55

Performance comparison between three different bit allocation algorithms inside a critically decimated cascading filter bank

Weaver, Michael B. January 2009 (has links)
Thesis (M.S.)--State University of New York at Binghamton, Thomas J. Watson School of Engineering and Applied Science, Department of Electrical and Computer Engineering, 2009. / Includes bibliographical references.
56

A compression engine for multidimensional array database systems /

Dehmel, Andreas. January 2002 (has links)
Dissertation (Dr. rer. nat.)--Institut für Informatik der Technischen Universität München, 2001. / Includes bibliographical references.
57

XCQ : a framework for XML compression and querying /

Lam, Wai-Yeung. January 2003 (has links)
Thesis (M.Phil.)--Hong Kong University of Science and Technology, 2003. / Includes bibliographical references (leaves 142-147). Also available in electronic version. Access restricted to campus users.
58

Testing for delay defects utilizing test data compression techniques

Putman, Richard Dean, 1970- 29 August 2008 (has links)
As technology shrinks, new types of defects are being discovered and new fault models are being created for them. Transition delay and path delay fault models are two such models, but they still fall short in that they are unable to achieve high test coverage of smaller delay defects; these defects can cause functional failures and also indicate potential reliability issues. The first part of this dissertation addresses these problems by presenting an enhanced timing-based delay fault testing technique that incorporates standard delay ATPG along with timing information gathered from standard static timing analysis. Using delay fault patterns typically increases test data volume by 3-5X compared to stuck-at patterns. Combined with the increase in test data volume associated with the growth in gate count that typically accompanies the miniaturization of technology, this adds up to a very large increase in test data volume that directly affects test time and thus manufacturing cost. The second part of this dissertation presents a technique for improving test compression and reducing test data volume by using multiple expansion ratios, determining the configuration of the scan chains for each expansion ratio with a dependency analysis procedure that accounts for structural dependencies as well as free-variable dependencies to improve the probability of detecting faults. Finally, this dissertation addresses the problem of unknown values (X's) in the output response data corrupting the data and degrading the performance of the output response compactor, and thus the overall amount of test compression. Four techniques are presented that focus on handling response data with large percentages of X's. 
The first uses an X-canceling MISR architecture based on deterministically observing scan cells; the second is a hybrid approach that combines a simple X-masking scheme with the X-canceling MISR for further gains in test compression. The third and fourth techniques revolve around reiterative LFSR X-masking, which takes advantage of LFSR-encoded masks that can be reused for multiple scan slices in novel ways. / text
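A rough sketch of the LFSR pattern expansion that underlies both LFSR-based test compression and the LFSR-encoded masks mentioned above: the tester ships only a small seed, and an on-chip LFSR regenerates a much larger stream of scan-slice bits. The register width, tap mask and seed here are illustrative, not taken from the dissertation.

```python
def lfsr_expand(seed: int, taps: int, width: int, n_slices: int) -> list:
    """Expand a seed into n_slices scan slices of `width` bits each,
    using a Fibonacci-style LFSR: output the LSB, then shift in the
    parity of the tapped bits at the top."""
    state = seed
    slices = []
    for _ in range(n_slices):
        bits = []
        for _ in range(width):
            bits.append(state & 1)
            feedback = bin(state & taps).count("1") & 1  # parity of tapped bits
            state = (state >> 1) | (feedback << (width - 1))
        slices.append(bits)
    return slices

# A single 16-bit seed expands into 8 slices x 16 bits = 128 pattern bits.
slices = lfsr_expand(seed=0xACE1, taps=0xB400, width=16, n_slices=8)
print(len(slices), len(slices[0]))  # prints: 8 16
```

The compression comes from the ratio of expanded bits to seed bits; the dissertation's contribution lies in choosing scan-chain configurations and reusable masks so that this expansion still detects faults despite X values, which this sketch does not model.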
59

Compressed-domain processing of MPEG audio signals

Lanciani, Christopher A. (has links)
No description available.
60

Video analysis and compression for surveillance applications

Savadatti-Kamath, Sanmati S. 17 November 2008 (has links)
With technological advances, digital video and imaging are becoming more and more relevant. Medical, remote-learning, surveillance, conferencing and home-monitoring systems are just a few applications of these technologies. Along with compression, there is now a need for analysis and extraction of data. In the days of film and early digital cameras, the processing and manipulation of data from such cameras was transparent to the end user. That transparency has been decreasing, and the industry is moving towards `smart users' - people who will be enabled to program and manipulate their video and imaging systems. Smart cameras can currently zoom, refocus and adjust lighting by sourcing current from the camera itself to the headlight. Such cameras are used in industry for inspection, quality control and even counting objects in jewelry stores and museums, but could eventually allow user-defined programmability. However, none of this will happen without interactive software as well as hardware capabilities that allow programmability. In this research, compression, expansion and detail extraction from videos in the surveillance arena are addressed. A video codec is defined that can embed contextual details of a video stream depending on user-defined requirements, creating a video summary. This codec also carries out motion-based segmentation that helps in object detection. Once an object is segmented, it is matched against a database using its shape and color information. If the object is not a good match, the user can either add it to the database or consider it an anomaly. RGB vector angle information is used to generate object descriptors for matching objects to the database. This descriptor implicitly incorporates shape and color information while keeping the size of the database manageable. 
Color images of objects that are considered `safe' are taken from various angles and distances (with the same background as that covered by the camera in question), and their RGB vector angle based descriptors constitute the information contained in the database. This research is a first step towards building a compression and detection system for specific surveillance applications. While the user has to build and maintain a database, there are no restrictions on image size, zoom or angle requirements, reducing the burden on the end user in creating such a database. This also allows the use of different types of cameras and does not require extensive up-front planning of camera locations.
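The RGB vector angle measure can be sketched per pixel pair. This is an illustrative sketch of the general idea, not the thesis's exact descriptor, which aggregates such angles over a segmented object: the angle between two RGB vectors is invariant to uniform intensity scaling, so it responds to color change rather than brightness change.

```python
import math

def rgb_vector_angle(p: tuple, q: tuple) -> float:
    """Angle in radians between two pixels' RGB vectors.
    Parallel vectors (same hue, different brightness) give ~0."""
    dot = sum(a * b for a, b in zip(p, q))
    norm = math.sqrt(sum(a * a for a in p)) * math.sqrt(sum(b * b for b in q))
    if norm == 0:
        return 0.0  # treat black pixels as having no direction
    return math.acos(max(-1.0, min(1.0, dot / norm)))

# Same hue at half brightness -> angle near 0 (brightness change ignored)
print(rgb_vector_angle((200, 100, 50), (100, 50, 25)))
# Different hues -> large angle (a chromatic edge)
print(rgb_vector_angle((200, 0, 0), (0, 0, 200)))
```

This brightness invariance is one plausible reason such descriptors tolerate the varying distances and lighting under which the `safe'-object images are captured.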
