The Power of Selecting Key Blocks with Local Pre-ranking for Long Document Information Retrieval

Transformer-based models, particularly pre-trained language models such as BERT, have demonstrated tremendous effectiveness on a wide range of natural language processing and information retrieval tasks. However, due to the quadratic complexity of the self-attention mechanism, such models have difficu...

Bibliographic Details
Main Authors: Chagnon, J. (Author), Cinar, Y.G. (Author), Gaussier, E. (Author), Li, M. (Author), Popa, D.N. (Author)
Format: Article
Language: English
Published: Association for Computing Machinery 2023