Boosting Arabic Named-Entity Recognition With Multi-Attention Layer

Sequence labeling models with recurrent neural network variants, such as long short-term memory (LSTM) and gated recurrent unit (GRU), show promising performance on several natural language processing (NLP) problems, including named-entity recognition (NER). Most existing models utilize word embeddings...
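The abstract describes the standard recipe: a word-embedding lookup feeds a recurrent encoder (LSTM or GRU) that scores a tag for each token. Below is a minimal sketch of such a BiLSTM sequence tagger, assuming PyTorch; the class name, layer sizes, and tag-set size are illustrative assumptions, and the paper's multi-attention layer is not reproduced here.

# A minimal sketch (not the paper's implementation) of a BiLSTM
# sequence tagger for NER, assuming PyTorch and toy sizes.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size, tagset_size, embed_dim=100, hidden_dim=128):
        super().__init__()
        # Word-embedding lookup, as mentioned in the abstract.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional LSTM reads the sentence in both directions;
        # nn.GRU is a drop-in alternative recurrent variant.
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Per-token projection to tag scores (e.g. B-PER, I-LOC, O).
        self.out = nn.Linear(2 * hidden_dim, tagset_size)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        h, _ = self.lstm(self.embed(token_ids))  # (batch, seq_len, 2*hidden)
        return self.out(h)                       # (batch, seq_len, tagset)

# Usage: tag scores for a toy batch of two 5-token sentences.
model = BiLSTMTagger(vocab_size=1000, tagset_size=9)
scores = model(torch.randint(0, 1000, (2, 5)))
print(scores.shape)  # torch.Size([2, 5, 9])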


Bibliographic Details
Main Authors: Mohammed Nadher Abdo Ali, Guanzheng Tan, Aamir Hussain
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Subjects: NLP
Online Access: https://ieeexplore.ieee.org/document/8685084/