Advancing natural language processing (NLP) applications of morphologically rich languages with bidirectional encoder representations from transformers (BERT): an empirical case study for Turkish

Language model pre-training architectures have been shown to be useful for learning language representations. Bidirectional encoder representations from transformers (BERT), a recent deep bidirectional self-attention representation learned from unlabelled text, has achieved remarkable results in many natural...
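
For readers who want to experiment with the kind of model the abstract describes, the following is a minimal sketch (not taken from the article) of loading a pretrained Turkish BERT with the HuggingFace transformers library; the BERTurk checkpoint ID used here is an assumption and may differ from the exact model evaluated in the paper.

```python
# Minimal sketch: obtain contextual representations from a pretrained
# Turkish BERT. The checkpoint "dbmdz/bert-base-turkish-cased" (BERTurk)
# is an assumption, not necessarily the model used in the article.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("dbmdz/bert-base-turkish-cased")
model = AutoModel.from_pretrained("dbmdz/bert-base-turkish-cased")

# Encode a Turkish sentence and run it through the encoder.
inputs = tokenizer("Doğal dil işleme", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence length, hidden size)
```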


Bibliographic Details
Main Authors: Akın Özçift, Kamil Akarsu, Fatma Yumuk, Cevhernur Söylemez
Format: Article
Language: English
Published: Taylor & Francis Group 2021-04-01
Series: Automatika
Online Access: http://dx.doi.org/10.1080/00051144.2021.1922150