TY - GEN
T1 - Code Rate Optimization via Neural Polar Decoders
AU - Aharoni, Ziv
AU - Huleihel, Bashar
AU - Pfister, Henry D.
AU - Permuter, Haim H.
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024/1/1
Y1 - 2024/1/1
N2 - In this work, we explore the enhancement of polar codes for channels with memory, focusing on achieving low decoding complexity while optimizing the input distribution for maximum transmission rate. Polar codes are known for their efficient decoding, with a complexity of O(N log N) for memoryless channels and O(|S|³ N log N) for finite-state channels (FSCs), where |S| is the size of the state space. A notable recent advancement is the integration of neural networks (NNs) to create a neural polar decoder (NPD), which learns from data without knowledge of the channel model and thereby bypasses the cubic complexity growth in the channel state size. In this paper, we propose a framework that optimizes the input distribution of polar codes so as to maximize the mutual information of the effective bit channels. The framework is tested on both memoryless channels and FSCs, including the additive white Gaussian noise (AWGN) channel and the Ising channel, yielding promising results. The key contribution of this paper is demonstrating the feasibility of simultaneously selecting an optimal input distribution and constructing a practical decoder for various channel types, even in the absence of a channel model. This approach paves the way for new advances in data-driven communication theory, especially for channels with memory.
AB - In this work, we explore the enhancement of polar codes for channels with memory, focusing on achieving low decoding complexity while optimizing the input distribution for maximum transmission rate. Polar codes are known for their efficient decoding, with a complexity of O(N log N) for memoryless channels and O(|S|³ N log N) for finite-state channels (FSCs), where |S| is the size of the state space. A notable recent advancement is the integration of neural networks (NNs) to create a neural polar decoder (NPD), which learns from data without knowledge of the channel model and thereby bypasses the cubic complexity growth in the channel state size. In this paper, we propose a framework that optimizes the input distribution of polar codes so as to maximize the mutual information of the effective bit channels. The framework is tested on both memoryless channels and FSCs, including the additive white Gaussian noise (AWGN) channel and the Ising channel, yielding promising results. The key contribution of this paper is demonstrating the feasibility of simultaneously selecting an optimal input distribution and constructing a practical decoder for various channel types, even in the absence of a channel model. This approach paves the way for new advances in data-driven communication theory, especially for channels with memory.
KW - Channel capacity
KW - channels with memory
KW - data-driven
KW - polar codes
UR - http://www.scopus.com/inward/record.url?scp=85202898683&partnerID=8YFLogxK
U2 - 10.1109/ISIT57864.2024.10619429
DO - 10.1109/ISIT57864.2024.10619429
M3 - Conference contribution
AN - SCOPUS:85202898683
T3 - IEEE International Symposium on Information Theory - Proceedings
SP - 2424
EP - 2429
BT - 2024 IEEE International Symposium on Information Theory, ISIT 2024 - Proceedings
PB - Institute of Electrical and Electronics Engineers
T2 - 2024 IEEE International Symposium on Information Theory, ISIT 2024
Y2 - 7 July 2024 through 12 July 2024
ER -