BERT for Question Generation

Bibliographic Details
Main Authors: Ying-Hong Chan, 詹英鴻
Other Authors: Yao-Chung Fan
Format: Others
Language: en_US
Published: 2019
Online Access: http://ndltd.ncl.edu.tw/cgi-bin/gs32/gsweb.cgi/login?o=dnclcdr&s=id=%22107NCHU5394056%22.&searchmode=basic
id ndltd-TW-107NCHU5394056
record_format oai_dc
spelling ndltd-TW-107NCHU5394056 2019-11-30T06:09:40Z http://ndltd.ncl.edu.tw/cgi-bin/gs32/gsweb.cgi/login?o=dnclcdr&s=id=%22107NCHU5394056%22.&searchmode=basic BERT for Question Generation 基於BERT深度學習模型之問答語句自動生成技術 Ying-Hong Chan 詹英鴻 Master's, National Chung Hsing University, Department of Computer Science and Engineering, 107. In this study, we investigate the use of the pre-trained BERT language model to tackle question generation tasks. We introduce three neural architectures built on top of BERT for question generation. The first is a straightforward application of BERT, which reveals the shortcomings of using BERT directly for text generation. The second remedies the first by restructuring the architecture in a sequential manner so that it takes information from previously decoded results. In addition, we propose a third model that further improves performance through a different formulation of the BERT input representation. Our models are trained and evaluated on the recent question-answering dataset SQuAD. Experimental results show that our best model yields state-of-the-art performance, advancing the BLEU-4 score of the existing best models from 16.85 to 22.17. Yao-Chung Fan 范耀中 2019 degree thesis ; thesis 30 en_US
collection NDLTD
language en_US
format Others
sources NDLTD
description Master's === National Chung Hsing University === Department of Computer Science and Engineering === 107 === In this study, we investigate the use of the pre-trained BERT language model to tackle question generation tasks. We introduce three neural architectures built on top of BERT for question generation. The first is a straightforward application of BERT, which reveals the shortcomings of using BERT directly for text generation. The second remedies the first by restructuring the architecture in a sequential manner so that it takes information from previously decoded results. In addition, we propose a third model that further improves performance through a different formulation of the BERT input representation. Our models are trained and evaluated on the recent question-answering dataset SQuAD. Experimental results show that our best model yields state-of-the-art performance, advancing the BLEU-4 score of the existing best models from 16.85 to 22.17.
author2 Yao-Chung Fan
author_facet Yao-Chung Fan
Ying-Hong Chan
詹英鴻
author Ying-Hong Chan
詹英鴻
spellingShingle Ying-Hong Chan
詹英鴻
BERT for Question Generation
author_sort Ying-Hong Chan
title BERT for Question Generation
title_short BERT for Question Generation
title_full BERT for Question Generation
title_fullStr BERT for Question Generation
title_full_unstemmed BERT for Question Generation
title_sort bert for question generation
publishDate 2019
url http://ndltd.ncl.edu.tw/cgi-bin/gs32/gsweb.cgi/login?o=dnclcdr&s=id=%22107NCHU5394056%22.&searchmode=basic
work_keys_str_mv AT yinghongchan bertforquestiongeneration
AT zhānyīnghóng bertforquestiongeneration
AT yinghongchan jīyúbertshēndùxuéxímóxíngzhīwèndáyǔjùzìdòngshēngchéngjìshù
AT zhānyīnghóng jīyúbertshēndùxuéxímóxíngzhīwèndáyǔjùzìdòngshēngchéngjìshù
_version_ 1719300466608701440
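
The description above outlines the second architecture only at a high level: the model is applied sequentially so that each step can take information from previously decoded results. For illustration only, the sketch below shows one hypothetical way such iterative decoding with a pre-trained masked language model could look; it is not the thesis's actual method. The Hugging Face transformers library, the bert-base-uncased checkpoint, the context [SEP] partial-question [MASK] input layout, greedy argmax decoding, and the [SEP] stopping rule are all assumptions, and an off-the-shelf checkpoint would first need fine-tuning on SQuAD-style (context, question) pairs to produce sensible questions.

# Purely illustrative sketch (not the thesis's implementation): generate a
# question token by token, feeding the previously decoded tokens back into
# BERT at every step and reading the prediction at a trailing [MASK].
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def generate_question(context, max_len=20):
    decoded = []  # question tokens produced so far
    for _ in range(max_len):
        # Assumed input layout: context [SEP] partial question [MASK]; the
        # prediction at the [MASK] position becomes the next question token.
        text = context + " [SEP] " + " ".join(decoded) + " [MASK]"
        inputs = tokenizer(text, return_tensors="pt")
        with torch.no_grad():
            logits = model(**inputs).logits
        # Locate the [MASK] position and take the most likely token (greedy).
        mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero()[-1].item()
        next_id = int(logits[0, mask_pos].argmax())
        next_token = tokenizer.convert_ids_to_tokens(next_id)
        if next_token == "[SEP]":  # assumed end-of-question signal
            break
        decoded.append(next_token)
    return tokenizer.convert_tokens_to_string(decoded)

print(generate_question("BERT is a pre-trained language model released in 2018."))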