Automatic Generation and Evaluation of Chinese Classical Poetry with Attention-Based Deep Neural Network


Bibliographic Details
Main Authors: Lee, H.J. (Author), Zhao, J. (Author)
Format: Article
Language: English
Published: MDPI 2022
Description
Summary: The computer generation of poetry has been studied for more than a decade, yet generating poetry at a human level remains a great challenge. We present a novel Transformer-XL based classical Chinese poetry model that employs a multi-head self-attention mechanism to capture the deeper, multiple relationships among Chinese characters. Furthermore, we utilize the segment-level recurrence mechanism to learn longer-term dependencies and overcome the context-fragmentation problem. To automatically assess the quality of the generated poems, we also built a novel automatic evaluation model that contains a BERT-based module for checking the fluency of sentences and a tone-checker module for evaluating the tone pattern of poems. The poems generated with our model obtained an average score of 9.7 for fluency and 10.0 for tone pattern. Moreover, visualizing the attention mechanism showed that our model learned the tone-pattern rules. All experimental results demonstrate that our poetry generation model can generate high-quality poems. © 2022 by the authors. Licensee MDPI, Basel, Switzerland.
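The segment-level recurrence the abstract mentions can be illustrated with a minimal sketch: attention for the current segment is computed over keys and values extended with a cache from the previous segment, so a query can attend past the segment boundary. This is a pure-Python, single-head toy (the function name and shapes are illustrative, not from the paper; the real Transformer-XL additionally uses multiple heads, relative positional encodings, and a stop-gradient on the memory):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention_with_memory(queries, keys, values, memory_k, memory_v):
    """Scaled dot-product attention where keys/values are prefixed with a
    cached memory from the previous segment (Transformer-XL-style
    segment-level recurrence). All arguments are lists of vectors
    (lists of floats); returns one output vector per query."""
    ext_k = memory_k + keys   # memory tokens come before the current segment
    ext_v = memory_v + values
    d = len(queries[0])       # model dimension, used for score scaling
    out = []
    for q in queries:
        # similarity of the query to every key, including cached ones
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in ext_k]
        w = softmax(scores)
        # weighted sum of the extended values
        out.append([sum(wi * v[j] for wi, v in zip(w, ext_v))
                    for j in range(len(ext_v[0]))])
    return out
```

Because `ext_k` spans both the cached and the current segment, a character at the start of a new segment still attends to the tail of the previous one, which is how the model avoids the context fragmentation described above.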
ISSN: 2076-3417
DOI: 10.3390/app12136497