Using Attentive to Improve Recursive LSTM End-to-End Chinese Discourse Parsing

Bibliographic Details
Main Authors: Yu-Jen Wang, 王育任
Other Authors: Chia-Hui Chang
Format: Others
Language: zh-TW
Published: 2019
Online Access: http://ndltd.ncl.edu.tw/handle/4zurd5
Description
Summary: Master's thesis === National Central University === Department of Computer Science and Information Engineering === Academic Year 107 === A discourse parser helps us understand the relationships and connections between sentences from different perspectives, but the tree-structured data it requires still depends on manual annotation, which keeps this technology from being directly applied in practice. To date, there have been many studies on automatically constructing the complete tree structure by computer. As deep learning has progressed rapidly in recent years, the methods for building discourse parsers have also shifted from traditional SVM and CRF approaches to recursive neural networks. In the Chinese Discourse Treebank (CDTB), the parsing problem can be divided into four main subtasks: elementary discourse unit (EDU) segmentation, tree structure construction, center labeling, and sense labeling. In this thesis, we apply several state-of-the-art deep learning techniques, such as attentive recursive neural networks, self-attention, and BERT, to improve performance. In the end, we succeeded in increasing the F1 score of each task by more than 10%, reaching the best performance known to us so far.
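The self-attention mechanism mentioned in the abstract can be illustrated with a minimal sketch. This is not the thesis's actual implementation: the function names, the use of NumPy, and the single-head scaled dot-product formulation are illustrative assumptions; the EDU embeddings here are random placeholders.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(H):
    """Single-head scaled dot-product self-attention over a matrix of
    EDU representations H with shape (n_edus, dim). Each output row is a
    weighted mixture of all rows of H, letting every EDU attend to the
    others before tree construction or labeling."""
    d = H.shape[-1]
    scores = H @ H.T / np.sqrt(d)      # (n, n) pairwise similarities
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ H                  # context-enriched representations

# Example: 4 hypothetical EDUs with 8-dimensional embeddings.
rng = np.random.default_rng(0)
H = rng.standard_normal((4, 8))
out = self_attention(H)
print(out.shape)  # (4, 8)
```

In the thesis setting, such attention-weighted representations would feed the downstream subtasks (tree construction, center labeling, sense labeling) instead of the raw EDU encodings; real systems typically use learned query/key/value projections rather than attending on H directly.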