Voice over Internet Protocol (VoIP) is on its way to surpassing toll quality. Although VoIP shares its transmission channel with other traffic, today's Internet offers wider bandwidth than the legacy Digital Loop Carrier, and voice can be digitized at rates higher than the traditional 8 kb/s, say 16 kb/s. Thus, VoIP need not be limited to toll quality. However, VoIP quality can degrade as a result of unpredictable traffic congestion and network imperfections, which manifest as delay jitter and packet loss. To address these challenges, service providers continue to optimize routing and add bandwidth, among other measures, while developers at the user's end compress voice packets and adapt the playout delay to network conditions.
While VoIP planning and off-line quality monitoring rely on overall quality measures such as the mean opinion score (MOS) or the R-factor, real-time quality supervision typically uses network-condition factors only. A control mechanism based on network quality can adjust channel parameters, for example by changing the codec and its settings or by changing the playout delay, to minimize the loss of voice quality.
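To illustrate how the two overall measures relate, the ITU-T G.107 E-model maps an R-factor to an estimated MOS. The sketch below uses the standard published mapping; the function name is illustrative and not taken from the thesis.

```python
# Mapping from E-model R-factor to an estimated MOS, following the standard
# ITU-T G.107 relationship. Shown only to illustrate how the two overall
# quality measures relate; not a method specific to this research.

def r_to_mos(r):
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1.0 + 0.035 * r + 7.0e-6 * r * (r - 60.0) * (100.0 - r)

# e.g. r_to_mos(93.2) is about 4.4, the ceiling for narrowband toll quality
```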
As bandwidth plays a prominent role in IP traffic congestion, compressing the packet header is one way to minimize congestion. Replacing a complete packet header with a smaller one significantly reduces per-packet overhead. For instance, once a compression context has been established, the compressed header no longer carries the RTP header and can save 16 bytes per packet. The primary question, however, is how to compute delay jitter without time stamping. This research provides a delay jitter calculation for VoIP packets that does not rely on timestamps.
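As a point of reference, one way to estimate jitter without the RTP timestamp is to compare packet interarrival gaps against the nominal packetization interval. The sketch below assumes a fixed 20 ms sender spacing and RFC 3550-style smoothing; it is an assumption-based illustration, not necessarily the calculation developed in the thesis.

```python
# Illustrative sketch: estimating delay jitter when the RTP timestamp is
# unavailable (e.g., after header compression). Assumes a fixed 20 ms
# packetization interval; NOT necessarily the method developed in the thesis.

PACKET_INTERVAL_MS = 20.0   # assumed sender spacing between voice packets

def update_jitter(jitter_ms, prev_arrival_ms, arrival_ms):
    """Exponentially smoothed jitter from arrival spacing alone.

    The deviation of the observed interarrival gap from the nominal
    packetization interval stands in for the transit-time difference
    used by the RFC 3550 jitter estimator.
    """
    deviation = abs((arrival_ms - prev_arrival_ms) - PACKET_INTERVAL_MS)
    return jitter_ms + (deviation - jitter_ms) / 16.0  # RFC 3550-style smoothing
```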
Compressing the payload, that is, using high-compression codecs, is another major way to prevent quality degradation when bandwidth is limited. This research addresses the choice among the many available codecs and the tradeoff between codec quality and packet loss under limited bandwidth, with a summary of codec quality evaluations and a bandwidth planning calculation.
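The thesis's exact planning calculation is not reproduced here; the sketch below shows the standard per-call bandwidth arithmetic from codec bit rate, packetization interval, and header overhead, under assumed header sizes.

```python
# Illustrative per-call bandwidth arithmetic under assumed parameters;
# the exact bandwidth planning calculation in the thesis is not reproduced.

HEADER_BYTES_FULL = 40   # IPv4 (20) + UDP (8) + RTP (12)
HEADER_BYTES_CRTP = 4    # assumed compressed-header size with an established context

def call_bandwidth_kbps(codec_kbps, frame_ms, header_bytes):
    """Bandwidth of one voice stream including packet headers (no layer-2 overhead)."""
    payload_bytes = codec_kbps * 1000 / 8 * (frame_ms / 1000)   # bytes per packet
    packets_per_s = 1000 / frame_ms
    return (payload_bytes + header_bytes) * 8 * packets_per_s / 1000

# e.g. with 20 ms packetization, per direction:
#   G.711 (64 kb/s) + full headers       -> 80.0 kb/s
#   G.729 (8 kb/s)  + full headers       -> 24.0 kb/s
#   G.729 (8 kb/s)  + compressed headers ->  9.6 kb/s
```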
Although the E-model and its R-factor have been proposed by the International Telecommunication Union (ITU) for VoIP quality measurement, their many network and codec parameters restrict them to offline quality control. Since tapping live traffic to monitor quality in real time is generally impractical, only packet loss and delay jitter matter at the client side. This research therefore investigates adaptive playout delay based on jitter prediction in greater depth and recommends it as the end-user solution for quality improvement. An adaptive playout delay scheme based on a Markov model has also been developed in detail and tested on a real VoIP network. This development closes the gap between research and engineering, so the Markov model can be evaluated and implemented.
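For context, a widely used baseline for adaptive playout delay smooths the measured network delay and its variation with exponential averaging and schedules playout a few deviations beyond the mean. The sketch below shows that textbook baseline only; it is not the Markov-model scheme developed in the thesis, and the constants are common illustrative choices.

```python
# Classic adaptive playout baseline (exponential averaging plus a jitter
# margin), shown for context only; the thesis develops a Markov-model-based
# scheme that is not reproduced here.

ALPHA = 0.998   # smoothing factor; a common textbook choice
K = 4.0         # safety margin in units of estimated delay variation

class PlayoutEstimator:
    def __init__(self):
        self.mean_delay = 0.0   # smoothed one-way network delay estimate (ms)
        self.variation = 0.0    # smoothed delay variation estimate (ms)

    def update(self, measured_delay_ms):
        self.mean_delay = ALPHA * self.mean_delay + (1 - ALPHA) * measured_delay_ms
        self.variation = ALPHA * self.variation + (1 - ALPHA) * abs(
            measured_delay_ms - self.mean_delay)

    def playout_delay(self):
        """Playout deadline offset applied at the start of each talkspurt."""
        return self.mean_delay + K * self.variation
```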
Identifer | oai:union.ndltd.org:USF/oai:scholarcommons.usf.edu:etd-7856 |
Date | 04 April 2017 |
Creators | Le, An Thanh |
Publisher | Scholar Commons |
Source Sets | University of South Florida |
Detected Language | English |
Type | text |
Format | application/pdf |
Source | Graduate Theses and Dissertations |
Rights | default |