
Computer Engineering ›› 2022, Vol. 48 ›› Issue (1): 197-203. doi: 10.19678/j.issn.1000-3428.0060159

• Mobile Internet and Communication Technology •

A Recurrent Neural Network Based BP Decoding Algorithm for Polar Codes

HE Yanqi1,2, PENG Daqin1,2, ZHAO Xuezhi1,2

  1. School of Communication and Information Engineering, Chongqing University of Posts and Telecommunications, Chongqing 400065, China;
    2. Institute of Electronic Information and Network Engineering, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
  • Received: 2020-12-01  Revised: 2021-01-05  Published: 2021-01-11
  • About the authors: HE Yanqi (born 1996), male, M.S. candidate; his main research interests include polar code decoding and SCMA multi-user detection. PENG Daqin (corresponding author) is a professor-level senior engineer. ZHAO Xuezhi is an M.S. candidate.
  • Supported by:
    National Natural Science Foundation of China, "Research on Millimeter-Wave Reconfigurable Switched-Beam Antennas Based on SIW Circularly Polarized Shaped Arrays" (E020B2018023).

Abstract: As one of the most commonly used soft-decision output decoding algorithms for polar codes, the Belief Propagation (BP) algorithm offers parallel transmission and high throughput, but it converges slowly and has high computational complexity. This paper proposes an Offset Min-Sum (OMS) approximate BP decoding algorithm based on a Recurrent Neural Network (RNN). The algorithm replaces multiplication operations with the offset min-sum approximation, modifies the message update strategy in the iterative process, and uses an improved RNN architecture to share parameters across iterations. Simulation results show that, compared with the traditional BP decoding algorithm, the proposed algorithm improves Bit Error Rate (BER) performance while reducing addition operations by about 75% and converging much faster; compared with the BP decoding algorithm based on a Deep Neural Network (DNN), it replaces multiplications with additions and saves about 80% of the storage overhead without a significant loss in BER performance.
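To make the substitution described above concrete, the following minimal sketch (in Python/NumPy; it is not taken from the paper) contrasts the exact check-node update of standard BP with an offset min-sum rule, and illustrates the RNN-style parameter sharing in which one trainable offset is reused in every iteration, whereas a DNN-style unrolled decoder would learn a separate set per iteration. The function names, the offset value beta = 0.25 and the iteration count T = 30 are illustrative assumptions.

    import numpy as np

    def box_plus_exact(a, b):
        # Exact check-node (box-plus) update used in standard polar BP:
        # f(a, b) = 2 * atanh(tanh(a/2) * tanh(b/2)); needs multiplications and tanh.
        return 2.0 * np.arctanh(np.tanh(a / 2.0) * np.tanh(b / 2.0))

    def box_plus_oms(a, b, beta=0.25):
        # Offset Min-Sum (OMS) approximation: only a sign, a minimum and one
        # subtraction; the offset beta corrects the overestimate of plain min-sum.
        return np.sign(a) * np.sign(b) * np.maximum(np.minimum(np.abs(a), np.abs(b)) - beta, 0.0)

    # Example log-likelihood ratios entering one processing element.
    a, b = 1.8, -0.9
    print(box_plus_exact(a, b))   # about -0.62
    print(box_plus_oms(a, b))     # -0.65 (plain min-sum would give -0.90)

    # Parameter sharing: an RNN-style decoder reuses one offset for all T
    # iterations, while a DNN-style unrolled decoder stores a separate one per
    # iteration; this sharing is the kind of saving the abstract refers to.
    T = 30                               # hypothetical number of BP iterations
    beta_rnn = np.array([0.25])          # single shared offset (RNN style)
    beta_dnn = np.full(T, 0.25)          # T separate offsets (DNN style)

Because the OMS rule avoids both the multiplications and the hyperbolic-tangent evaluations of the exact update, each processing element reduces to comparisons, a sign flip and a single subtraction, which is consistent with the complexity reduction claimed above.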

Key words: polar codes, Belief Propagation (BP), Recurrent Neural Network (RNN), Offset Min-Sum (OMS), computational complexity
