Paper Title

Transmission of Bernoulli Sources Using Convolutional LDGM Codes

Authors

Yixin Wang, Tingting Zhu, Xiao Ma

Abstract

We propose in this paper to exploit convolutional low-density generator matrix (LDGM) codes for the transmission of Bernoulli sources over binary-input output-symmetric (BIOS) channels. To this end, we present a new framework for proving coding theorems for linear codes, which unifies the channel coding theorem, the source coding theorem, and the joint source-channel coding (JSCC) theorem. In the presented framework, the systematic bits and the corresponding parity-check bits play different roles. Precisely, the noisy systematic bits are used to limit the list size of typical codewords, while the noisy parity-check bits are used to select the maximum-likelihood codeword from the list. This new framework for linear codes allows the systematic bits and the parity-check bits to be transmitted in different ways and over different channels. With this framework, we prove that Bernoulli generator matrix codes (BGMCs) are capacity-achieving over BIOS channels, entropy-achieving for Bernoulli sources, and also system-capacity-achieving for JSCC applications. A lower bound on the bit-error rate (BER) is derived for linear codes, which can be used to predict error floors and hence serves as a simple tool for designing JSCC systems. Numerical results show that convolutional LDGM codes perform well in the waterfall region and match well with the derived error floors, which can be lowered, if required, by simply increasing the encoding memory.
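The two-stage decoding idea in the abstract (noisy systematic bits limit the list of typical candidates; noisy parity-check bits select the maximum-likelihood codeword from the list) can be sketched on a toy scale. The sketch below uses a small *block* systematic LDGM code over a binary symmetric channel, not the paper's convolutional construction; the code sizes, the generator sparsity of 0.3, and the Hamming-distance-2 list threshold are illustrative assumptions, not values from the paper.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

# Toy systematic LDGM code: codeword = [u | u @ G mod 2], with G a sparse
# random binary matrix. Sizes are hypothetical and kept tiny so that the
# list decoder below can enumerate all 2^k candidates.
k, m = 8, 8                                   # info bits, parity bits
G = (rng.random((k, m)) < 0.3).astype(int)    # sparse generator part

def encode(u):
    """Systematic encoding: keep u, append parity bits u @ G (mod 2)."""
    return np.concatenate([u, (u @ G) % 2])

def bsc(x, p):
    """Binary symmetric channel with crossover probability p."""
    return (x + (rng.random(x.shape) < p).astype(int)) % 2

def decode(y_sys, y_par):
    """Two-stage decoding mirroring the framework in the abstract:
    1) the noisy systematic bits limit the list to candidates within a
       small Hamming distance of y_sys (a crude 'typicality' constraint);
    2) the noisy parity-check bits pick the ML candidate from the list
       (for a BSC with p < 1/2, ML = minimum total Hamming distance)."""
    best, best_metric = None, float("-inf")
    for cand in product([0, 1], repeat=k):
        u = np.array(cand)
        d_sys = int(np.sum(u != y_sys))
        if d_sys > 2:                         # list-limiting step
            continue
        d_par = int(np.sum((u @ G) % 2 != y_par))
        metric = -(d_sys + d_par)             # ML selection step
        if metric > best_metric:
            best, best_metric = u, metric
    return best

u = rng.integers(0, 2, k)
c = encode(u)
y = bsc(c, 0.05)
u_hat = decode(y[:k], y[k:])
```

On a noiseless channel the decoder recovers `u` exactly; with noise, recovery depends on the error pattern, which is where the BER analysis in the paper comes in.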
