CentralBankRoBERTa: A fine-tuned large language model for central bank communications


Bibliographic Details
Published in: Journal of Finance and Data Science
Main Authors: Moritz Pfeifer, Vincent P. Marohl
Format: Article
Language: English
Published: KeAi Communications Co., Ltd., 2023-11-01
Online Access: http://www.sciencedirect.com/science/article/pii/S2405918823000302
Description
Summary: Central bank communications are an important tool for guiding the economy and fulfilling monetary policy goals. Natural language processing (NLP) algorithms have long been used to analyze central bank communications, but the traditional bag-of-words methods ignore context and cannot distinguish whom these sentiments address. Recent research has introduced deep-learning-based NLP algorithms, also known as large language models (LLMs), which take context into account. This study applies LLMs to central bank communications and constructs CentralBankRoBERTa, a state-of-the-art economic agent classifier that distinguishes five basic macroeconomic agents, paired with a binary sentiment classifier that identifies the emotional content of sentences in central bank communications. The absence of large language models in the central bank communications literature may be attributed to a lack of appropriately labeled datasets. To address this gap, we introduce our model, CentralBankRoBERTa, offering an easy-to-use and standardized tool for scholars of central bank communications.
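The context limitation the abstract attributes to bag-of-words methods can be seen in a toy sketch (the `bag_of_words` helper below is illustrative only, not code from the paper): two sentences with opposite economic meaning produce identical word counts once word order is discarded, so any classifier built on those counts cannot tell them apart.

```python
from collections import Counter
import re

def bag_of_words(sentence):
    # Lowercase and tokenize on alphabetic runs; word order is discarded,
    # which is exactly what a bag-of-words representation does.
    return Counter(re.findall(r"[a-z]+", sentence.lower()))

s1 = "Inflation is slowing, not growth."
s2 = "Growth is slowing, not inflation."

# Opposite economic meaning, identical bag-of-words representation:
print(bag_of_words(s1) == bag_of_words(s2))  # True
```

A context-aware model such as a fine-tuned RoBERTa encodes the whole token sequence, so the two sentences receive different representations and can be classified differently.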
ISSN:2405-9188