
Optimal erasure protection assignment for scalably compressed data over packet-based networks

This research is concerned with the reliable delivery of scalably compressed data over lossy communication channels. Recent work has proposed several strategies for assigning optimal code redundancies to the elements of scalable data, which form a linear dependency structure, under the assumption that all source elements are encoded onto a common group of network packets. When the data is large and the network packets are small, such schemes require very long channel codes with high computational complexity. Moreover, in networks with high loss rates, small packets are preferable to long ones.

The first contribution of this thesis is a strategy for optimally assigning elements of the scalable data to clusters of packets, subject to constraints on packet size and code complexity. Given a packet-cluster arrangement, the scheme then assigns optimal code redundancies to the source elements, subject to a constraint on total transmission length. Experimental results show that the proposed strategy can outperform previous code-assignment schemes under these constraints, particularly at high channel loss rates.

The second contribution modifies these schemes to accommodate more complex dependency structures. Source elements are allocated to clusters of packets according to their dependency structure, subject to constraints on packet size and channel codeword length. Given a packet-cluster arrangement, the proposed schemes then assign optimal code redundancies to the source elements, subject to a constraint on total transmission length. Experimental results demonstrate the benefit of correctly modelling the dependency structure.

The last contribution is a scheme for optimizing the protection of scalable data when limited retransmission is possible. Previous work assumed that retransmission is not possible; for most real-time or interactive applications, however, retransmission of lost data may be possible up to some limit. The present work restricts its attention to streaming sources (e.g., video) in which each source element can be transmitted in one or both of two time slots. An optimization algorithm determines the transmission schedule and level of protection for each source element, using information about the success of earlier transmissions. Experimental results confirm the benefit of limited retransmission.
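To make the kind of assignment problem described above concrete, the sketch below illustrates the classical unequal-erasure-protection setting the abstract builds on: each layer of a linearly dependent scalable stream is striped across n packets and protected with f_i parity symbols of a systematic (n, n - f_i) erasure code, so layer i is recoverable whenever at most f_i packets are lost. The i.i.d. packet-loss model, the utility values, and the brute-force search are illustrative assumptions only, not the thesis's actual algorithm, which additionally optimizes under packet-size and codeword-length constraints.

```python
from itertools import combinations_with_replacement
from math import comb

def p_decodable(f, n, p):
    """Probability that at most f of n packets are lost (i.i.d. loss
    probability p), i.e. that a codeword with f parity symbols spread
    over n packets is recoverable."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(f + 1))

def best_assignment(utilities, n, p, budget):
    """Exhaustively search non-increasing parity assignments
    f_1 >= f_2 >= ... >= f_L (the natural shape for a linear dependency
    chain: earlier layers get at least as much protection) whose total
    parity does not exceed `budget`, maximizing expected utility."""
    L = len(utilities)
    best_eu, best_f = -1.0, None
    # combinations_with_replacement yields each multiset once; sorting
    # descending turns it into a non-increasing assignment.
    for f in combinations_with_replacement(range(n), L):
        f = tuple(sorted(f, reverse=True))
        if sum(f) > budget:
            continue
        # With non-increasing redundancy, layer i is decodable (and all
        # layers it depends on are too) iff at most f_i packets are lost.
        eu = sum(u * p_decodable(fi, n, p) for u, fi in zip(utilities, f))
        if eu > best_eu:
            best_eu, best_f = eu, f
    return best_f, best_eu
```

For example, `best_assignment([3.0, 2.0, 1.0], n=4, p=0.1, budget=3)` returns the redundancy profile that maximizes expected decoded utility for a three-layer stream over four packets. The exhaustive search is exponential in the number of layers; the thesis's interest in long codes and many source elements is precisely why its schemes need more efficient, constrained optimization.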

Identifier oai:union.ndltd.org:ADTP/242292
Date January 2004
Creators Thie, Johnson, Electrical Engineering & Telecommunications, Faculty of Engineering, UNSW
Publisher Awarded by: University of New South Wales, Electrical Engineering and Telecommunications
Source Sets Australasian Digital Theses Program
Language English
Detected Language English
Rights Copyright Johnson Thie, http://unsworks.unsw.edu.au/copyright
