Coarsely Quantized Decoding and Construction of Polar Codes Using the Information Bottleneck Method
The information bottleneck method is a generic clustering framework from the field of machine learning which allows compressing an observed quantity while retaining as much of the mutual information it shares with the quantity of primary relevance as possible. The framework was recently used to design message-passing decoders for low-density parity-check codes in which all the arithmetic operations on log-likelihood ratios are replaced by table lookups of unsigned integers. This paper presents, in detail, the application of the information bottleneck method to polar codes, where the framework is used to compress the virtual bit channels defined in the code structure, and shows that the benefits are twofold. On the one hand, the compression restricts the output alphabet of the bit channels to a manageable size, which facilitates computing the capacities of the bit channels in order to identify the ones with larger capacities. On the other hand, the intermediate steps of the compression process can be used to replace the log-likelihood ratio computations in the decoder with table lookups of unsigned integers. Hence, a single procedure produces a polar encoder as well as its tailored, quantized decoder. Moreover, a technique called <i>message alignment</i> is used to reduce the space complexity of the quantized decoder obtained using the information bottleneck framework.
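As a toy illustration of the compression idea described in the abstract (not the paper's actual algorithm), the sketch below greedily merges the outputs of a binary-input channel into a small number of cluster indices while sacrificing as little mutual information with the input as possible. All function names and the adjacent-merge heuristic are illustrative assumptions.

```python
import numpy as np

def mutual_info(p_xy):
    """I(X;T) in bits for a joint pmf (rows: channel input x, cols: cluster t)."""
    px = p_xy.sum(axis=1, keepdims=True)
    pt = p_xy.sum(axis=0, keepdims=True)
    m = p_xy > 0
    return float(np.sum(p_xy[m] * np.log2(p_xy[m] / (px * pt)[m])))

def greedy_ib_quantizer(p_y_given_x, num_clusters):
    """Compress the channel output alphabet to `num_clusters` lookup indices
    by repeatedly merging the adjacent (LLR-ordered) pair of clusters whose
    merge loses the least mutual information with the input."""
    p_xy = 0.5 * p_y_given_x                    # uniform binary input assumed
    llr = np.log(p_xy[0]) - np.log(p_xy[1])     # order outputs by reliability
    groups = [[int(j)] for j in np.argsort(llr)]

    def joint(gs):                              # joint pmf of x and clusters
        return np.stack([p_xy[:, g].sum(axis=1) for g in gs], axis=1)

    while len(groups) > num_clusters:
        base = mutual_info(joint(groups))
        losses = [base - mutual_info(joint(groups[:j]
                                           + [groups[j] + groups[j + 1]]
                                           + groups[j + 2:]))
                  for j in range(len(groups) - 1)]
        j = int(np.argmin(losses))
        groups[j:j + 2] = [groups[j] + groups[j + 1]]

    labels = np.empty(p_y_given_x.shape[1], dtype=int)
    for t, g in enumerate(groups):
        labels[g] = t                           # lookup table: output -> cluster
    return labels, joint(groups)

# Toy channel: binary input, 8-level quantized Gaussian-like output.
y = np.linspace(-2.0, 2.0, 8)
p0 = np.exp(-(y - 1.0) ** 2); p1 = np.exp(-(y + 1.0) ** 2)
p_y_given_x = np.stack([p0 / p0.sum(), p1 / p1.sum()])
labels, p_xt = greedy_ib_quantizer(p_y_given_x, 4)
```

The returned `labels` array is exactly the kind of unsigned-integer lookup table the abstract refers to: a decoder can store it once and replace log-likelihood ratio arithmetic with indexing.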
Main Authors: | Syed Aizaz Ali Shah, Maximilian Stark, Gerhard Bauch |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2019-09-01 |
Series: | Algorithms |
Subjects: | information bottleneck method; polar codes; quantized decoding; code construction |
Online Access: | https://www.mdpi.com/1999-4893/12/9/192 |
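The virtual bit channels mentioned in the abstract arise from the polar transform, which combines two uses of a channel W into a degraded channel W⁻ and an upgraded channel W⁺ whose output alphabets grow quadratically at every step; this growth is exactly why the paper compresses them. A minimal self-contained sketch of one polarization step (illustrative, not the paper's code):

```python
import numpy as np

def capacity(p_y_given_x):
    """Symmetric capacity I(X;Y) in bits, assuming a uniform binary input."""
    p_xy = 0.5 * p_y_given_x
    px = p_xy.sum(axis=1, keepdims=True)
    py = p_xy.sum(axis=0, keepdims=True)
    m = p_xy > 0
    return float(np.sum(p_xy[m] * np.log2(p_xy[m] / (px * py)[m])))

def polar_transform(w):
    """One polarization step on w[x, y] = p(y | x) for binary x.
    Returns (w_minus, w_plus); their output alphabets have sizes
    |Y|^2 and 2*|Y|^2, which is why repeated steps need quantization."""
    n = w.shape[1]
    w_minus = np.zeros((2, n * n))       # output (y1, y2), input u1
    w_plus = np.zeros((2, 2 * n * n))    # output (u1, y1, y2), input u2
    for u1 in range(2):
        for u2 in range(2):
            for y1 in range(n):
                for y2 in range(n):
                    p = w[u1 ^ u2, y1] * w[u2, y2]
                    w_minus[u1, y1 * n + y2] += 0.5 * p
                    w_plus[u2, (u1 * n + y1) * n + y2] += 0.5 * p
    return w_minus, w_plus

# Toy channel: a binary symmetric channel with crossover probability 0.1.
w = np.array([[0.9, 0.1],
              [0.1, 0.9]])
w_minus, w_plus = polar_transform(w)
```

Capacity is conserved, I(W⁻) + I(W⁺) = 2·I(W), while the two bit channels polarize toward being worse and better than W; computing these capacities on the compressed alphabets is the first benefit the abstract describes.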
id |
doaj-029ca354749c4318892f3cf1c60577d3 |
---|---|
record_format |
Article |
spelling |
Algorithms (MDPI AG), Vol. 12, No. 9, Article 192, published 2019-09-01; ISSN 1999-4893; DOI 10.3390/a12090192. Authors: Syed Aizaz Ali Shah, Maximilian Stark, Gerhard Bauch (all at the Institute of Communications, Hamburg University of Technology, 21073 Hamburg, Germany). Full text: https://www.mdpi.com/1999-4893/12/9/192 |
collection |
DOAJ |
author |
Syed Aizaz Ali Shah; Maximilian Stark; Gerhard Bauch |
title |
Coarsely Quantized Decoding and Construction of Polar Codes Using the Information Bottleneck Method |
issn |
1999-4893 |
description |
The information bottleneck method is a generic clustering framework from the field of machine learning which allows compressing an observed quantity while retaining as much of the mutual information it shares with the quantity of primary relevance as possible. The framework was recently used to design message-passing decoders for low-density parity-check codes in which all the arithmetic operations on log-likelihood ratios are replaced by table lookups of unsigned integers. This paper presents, in detail, the application of the information bottleneck method to polar codes, where the framework is used to compress the virtual bit channels defined in the code structure, and shows that the benefits are twofold. On the one hand, the compression restricts the output alphabet of the bit channels to a manageable size, which facilitates computing the capacities of the bit channels in order to identify the ones with larger capacities. On the other hand, the intermediate steps of the compression process can be used to replace the log-likelihood ratio computations in the decoder with table lookups of unsigned integers. Hence, a single procedure produces a polar encoder as well as its tailored, quantized decoder. Moreover, a technique called <i>message alignment</i> is used to reduce the space complexity of the quantized decoder obtained using the information bottleneck framework. |
topic |
information bottleneck method; polar codes; quantized decoding; code construction |