TY - GEN
T1 - Non-repetitive encoding with increased degree-1 encoding symbols for LT codes
AU - Yen, Kuo-Kuang
AU - Liao, Yen-Chin
AU - Chen, Chih-Lung
AU - Chang, Hsie-Chia
PY - 2012
Y1 - 2012
AB - For LT codes with the Robust Soliton distribution, the ripple size is relatively small at the beginning of the BP decoding process; consequently, most decoding terminations occur due to a lack of ripple at this early stage. In this study, we aim to reduce early decoding termination and thereby lower the symbol loss probability. First, given k input symbols, the degree-1 proportion is increased to enlarge the average ripple size within the range 0 ≤ n ≤ k/2, where n is the number of decoded input symbols. Second, we propose a Non-Repetitive (NR) encoding scheme that avoids generating repeated degree-1 encoding symbols: an NR encoder forces the first k degree-1 encoding symbols to connect to distinct input symbols. Simulation results show that NR encoding outperforms LT encoding in terms of symbol loss probability. Moreover, fewer encoding symbols are needed to achieve a high successful decoding probability when our scheme is applied. With k = 2000, NR encoding reaches a successful decoding probability of 99.6% at an overhead of 0.2, whereas LT encoding requires an overhead of 0.32 to reach the same probability.
KW - BP decoding
KW - LT code
KW - Non-Repetitive encoding
KW - degree
UR - http://www.scopus.com/inward/record.url?scp=84874146078&partnerID=8YFLogxK
U2 - 10.1109/APCCAS.2012.6419120
DO - 10.1109/APCCAS.2012.6419120
M3 - Conference contribution
AN - SCOPUS:84874146078
SN - 9781457717291
T3 - IEEE Asia-Pacific Conference on Circuits and Systems, Proceedings, APCCAS
SP - 655
EP - 658
BT - 2012 IEEE Asia Pacific Conference on Circuits and Systems, APCCAS 2012
T2 - 2012 IEEE Asia Pacific Conference on Circuits and Systems, APCCAS 2012
Y2 - 2 December 2012 through 5 December 2012
ER -