Layer-aligned multipriority rateless codes for layered video streaming

Hsu-Feng Hsiao, Yong Jhih Ciou

Research output: Contribution to journal › Article › peer-review

9 Scopus citations


There exists a multitude of techniques, including automatic repeat request and error-correction codes, for minimizing data corruption when transmitting over error-prone networks. Multimedia streaming can usually tolerate a certain level of data loss but imposes strict limits on latency. To achieve acceptable transmission reliability with low latency, the channel-coding approach is usually more appealing, at the cost of additional bandwidth. In this paper, an N-cycle layer-aligned overlapping structure, well suited to layered data, is proposed. Based on this structure, layer-aligned multipriority rateless codes were developed, with favorable probabilities controlling the protection strength for each layer of the streaming data. The major contribution of this paper is an analytical model that predicts the decoding-failure probability of each video layer, which is shown to achieve accurate estimation. In addition, a prediction model estimating the expected number of decompressible video frames was developed for use with the proposed codes when streaming scalable video. By maximizing the expected number of decompressible video frames, the protection strength of the proposed codes can then be determined. Simulation results show that the developed codes are well suited to streaming layered video, which is difficult to handle with traditional rateless codes, whether with or without unequal error protection.
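The paper's exact code construction is not reproduced here, but the core idea of multipriority (unequal-error-protection) rateless coding can be illustrated with a generic LT-style encoder in which each encoded symbol XORs source symbols drawn with per-layer selection probabilities. This is a hypothetical sketch for intuition only: the function name `encode_symbol`, the two-layer setup, and the degree distribution are illustrative assumptions, not the layer-aligned structure defined in the paper.

```python
import random

def encode_symbol(layers, layer_probs, degree_dist, rng):
    """Generate one rateless encoded symbol with layer-biased selection.

    layers:      list of lists of source symbols (ints); layer 0 is the base layer.
    layer_probs: relative selection weight per layer (higher = stronger protection).
    degree_dist: list of (degree, probability) pairs summing to 1.
    rng:         a random.Random instance.
    Returns (neighbors, value): the chosen (layer, index) pairs and their XOR.
    """
    # Sample a degree from the degree distribution by inverse CDF.
    r = rng.random()
    cum = 0.0
    degree = degree_dist[-1][0]
    for d, p in degree_dist:
        cum += p
        if r < cum:
            degree = d
            break
    # Pick `degree` distinct source symbols, biased toward high-priority layers.
    chosen = set()
    while len(chosen) < degree:
        layer = rng.choices(range(len(layers)), weights=layer_probs)[0]
        idx = rng.randrange(len(layers[layer]))
        chosen.add((layer, idx))
    # The encoded symbol is the XOR of the chosen source symbols.
    value = 0
    for layer, idx in chosen:
        value ^= layers[layer][idx]
    return sorted(chosen), value

# Example: the base layer (weight 0.7) is sampled more often than the
# enhancement layer (weight 0.3), so base-layer symbols appear in more
# encoded symbols and decode with higher probability under loss.
rng = random.Random(0)
layers = [[1, 2, 3, 4], [5, 6, 7, 8]]
neighbors, value = encode_symbol(layers, [0.7, 0.3],
                                 [(1, 0.2), (2, 0.5), (3, 0.3)], rng)
```

A decoder would peel these symbols belief-propagation style, as in standard LT codes; tilting `layer_probs` toward the base layer is what gives the unequal protection.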

Original language: English
Article number: 6727474
Pages (from-to): 1395-1404
Number of pages: 10
Journal: IEEE Transactions on Circuits and Systems for Video Technology
Issue number: 8
State: Published - 1 Jan 2014


  • Rateless codes
  • scalable video coding
  • unequal error protection
  • video streaming

