Further Exploration of Convolutional Encoders for Unequal Error Protection and New UEP Convolutional Codes

Hung Hua Tang, Chung-Hsuan Wang, Mao Chao Lin

Research output: Contribution to journal › Article › peer-review


Abstract

In this paper, the unequal error protection (UEP) capability of convolutional encoders, characterized by the separation vector, is studied from an algebraic viewpoint. For every convolutional code, a simple procedure is presented for constructing a generator matrix that is basic and has the largest separation vector. Such a generator matrix is desirable, since the corresponding encoder not only achieves UEP optimality but also avoids undesired catastrophic error propagation. Canonical generator matrices, which are both basic and reduced, are even more preferable for encoding, since they attain the lowest complexity for Viterbi decoding. However, directly transforming a UEP-optimal generator matrix into a canonical generator matrix may incur an unexpected loss in the separation vector. We therefore propose a specific type of transformation matrix that reduces the external degrees of the generator matrices, from which canonical generator matrices with mitigated degradation of the separation vector can be constructed. Finally, new UEP convolutional codes that achieve the maximum free distances for the given code parameters are provided.
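For context, the following is a brief sketch of the separation vector, stated as the standard component-wise minimum-weight definition used in the UEP literature (assuming binary, finite-weight polynomial input sequences; the exact formulation in the paper may differ in detail):

% Sketch: separation vector of a k x n generator matrix G(D),
% following the usual UEP definition; s_i(G) lower-bounds the
% protection of the i-th information stream.
\[
  s_i\bigl(G(D)\bigr)
  \;=\;
  \min\Bigl\{\, w_H\bigl(\mathbf{u}(D)\,G(D)\bigr)
     \;:\; \mathbf{u}(D)\in\mathbb{F}_2^{k}[D],\; u_i(D)\neq 0 \,\Bigr\},
  \qquad i = 1,\dots,k,
\]
% with s(G) = (s_1(G), ..., s_k(G)); a UEP-optimal generator matrix
% is one whose separation vector is component-wise largest among all
% generator matrices of the same code.

Under this reading, a larger s_i guarantees stronger error protection for the i-th input stream, which is why preserving the separation vector under transformations to canonical form matters.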

Original language: English
Article number: 7511703
Pages (from-to): 4857-4866
Number of pages: 10
Journal: IEEE Transactions on Information Theory
Volume: 62
Issue number: 9
DOIs
State: Published - 1 Sep 2016

Keywords

  • Basic generator matrix
  • canonical generator matrix
  • optimal generator matrix
  • separation vector
  • unequal error protection

