Leveraging Pre-Trained Language Model for Summary Generation on Short Text

Bidirectional Encoder Representations from Transformers (BERT) represents the latest generation of pre-trained language models, which have achieved satisfactory results in text summarization tasks. However, it has not achieved good results in generating Chinese short-text summaries. In this w...


Bibliographic Details
Main Authors: Shuai Zhao, Fucheng You, Zeng Yuan Liu
Format: Article
Language: English
Published: IEEE 2020-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/9298823/