Automatic Modulation Classification Based on Hierarchical Recurrent Neural Networks With Grouped Auxiliary Memory
As a valuable topic in wireless communication systems, automatic modulation classification has been studied for many years. In recent years, recurrent neural networks (RNNs), such as long short-term memory (LSTM), have been used in this area and have achieved good results. However, these models often suffer from the vanishing gradient problem when the temporal depth and spatial depth increases, which diminishes the ability to latch long-term memories. In this paper, we propose a new hierarchical RNN architecture with grouped auxiliary memory to better capture long-term dependencies. The proposed model is compared with LSTM and gated recurrent unit (GRU) on the RadioML 2016.10a dataset, which is widely used as a benchmark in modulation classification. The results show that the proposed network yields a higher average classification accuracy under varying signal-to-noise ratio (SNR) conditions ranging from 0 dB to 20 dB, even with much fewer parameters. The performance superiority is also confirmed using a dataset with variable lengths of signals.
Main Authors: | Ke Zang, Zhenguo Ma |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE 2020-01-01 |
Series: | IEEE Access |
Subjects: | Automatic modulation classification (AMC), recurrent neural networks (RNNs), hierarchical recurrent structure, long-term memory |
Online Access: | https://ieeexplore.ieee.org/document/9265251/ |
id | doaj-8aa308d30133449686f3cda521a9c18d |
---|---|
record_format | Article |
spelling | doaj-8aa308d30133449686f3cda521a9c18d (updated 2021-03-30T03:41:35Z). English. IEEE. IEEE Access, ISSN 2169-3536, vol. 8, pp. 213052-213061, 2020-01-01. DOI 10.1109/ACCESS.2020.3039543, IEEE document 9265251. "Automatic Modulation Classification Based on Hierarchical Recurrent Neural Networks With Grouped Auxiliary Memory." Ke Zang (https://orcid.org/0000-0003-0214-5417) and Zhenguo Ma (https://orcid.org/0000-0002-7283-2637), both: College of Biomedical Engineering and Instrument Science, Yuquan Campus, Zhejiang University, Hangzhou, China. Online access: https://ieeexplore.ieee.org/document/9265251/. Topics: automatic modulation classification (AMC), recurrent neural networks (RNNs), hierarchical recurrent structure, long-term memory. |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Ke Zang, Zhenguo Ma |
author_sort |
Ke Zang |
title |
Automatic Modulation Classification Based on Hierarchical Recurrent Neural Networks With Grouped Auxiliary Memory |
publisher |
IEEE |
series |
IEEE Access |
issn |
2169-3536 |
publishDate |
2020-01-01 |
description |
As a valuable topic in wireless communication systems, automatic modulation classification has been studied for many years. In recent years, recurrent neural networks (RNNs), such as long short-term memory (LSTM), have been used in this area and have achieved good results. However, these models often suffer from the vanishing gradient problem when the temporal depth and spatial depth increases, which diminishes the ability to latch long-term memories. In this paper, we propose a new hierarchical RNN architecture with grouped auxiliary memory to better capture long-term dependencies. The proposed model is compared with LSTM and gated recurrent unit (GRU) on the RadioML 2016.10a dataset, which is widely used as a benchmark in modulation classification. The results show that the proposed network yields a higher average classification accuracy under varying signal-to-noise ratio (SNR) conditions ranging from 0 dB to 20 dB, even with much fewer parameters. The performance superiority is also confirmed using a dataset with variable lengths of signals. |
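The description above attributes the weakness of the LSTM/GRU baselines to the vanishing gradient problem at large temporal depth. A minimal numpy sketch of that effect (the 8-dimensional state and the spectral norm of 0.9 are arbitrary illustrative choices, not figures from the paper):

```python
import numpy as np

# Backpropagation through T time steps multiplies the gradient by the
# recurrent Jacobian at each step; if its largest singular value is
# below 1, the gradient norm shrinks exponentially ("vanishes").
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(8, 8)))  # random orthogonal matrix
W = 0.9 * Q                                   # all singular values = 0.9
grad = np.ones(8)
norms = []
for _ in range(100):
    grad = W.T @ grad
    norms.append(float(np.linalg.norm(grad)))
print(norms[0], norms[-1])  # the norm decays by a factor of 0.9 per step
```

Because `Q` is orthogonal, the decay here is exactly 0.9 per step; with a general recurrent weight matrix the decay is irregular but the exponential trend is the same, which is what motivates architectures (LSTM gating, and hierarchies such as the one proposed in this paper) that provide shorter gradient paths.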
topic |
Automatic modulation classification (AMC), recurrent neural networks (RNNs), hierarchical recurrent structure, long-term memory |
url |
https://ieeexplore.ieee.org/document/9265251/ |
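The abstract claims the proposed network outperforms LSTM and GRU with much fewer parameters. As background, the standard per-layer parameter counts of the two baselines (LSTM has four gate blocks, GRU three) can be sketched as follows; the input size of 2 (one I/Q pair per time step) and hidden size of 128 are illustrative assumptions using the single-bias-per-gate convention, not figures from the paper:

```python
def lstm_params(n_in, n_hidden):
    # 4 gate blocks (input, forget, cell, output), each with input
    # weights, recurrent weights, and one bias vector
    return 4 * (n_in * n_hidden + n_hidden * n_hidden + n_hidden)

def gru_params(n_in, n_hidden):
    # 3 blocks (reset, update, candidate), same weight layout
    return 3 * (n_in * n_hidden + n_hidden * n_hidden + n_hidden)

print(lstm_params(2, 128))  # 67072
print(gru_params(2, 128))   # 50304
```

The dominant term in both counts is the recurrent block, proportional to the square of the hidden size, which is why a GRU is roughly 3/4 the size of an LSTM with the same hidden width.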