Robust Representation Learning via Sparse Attention Mechanism for Similarity Models

Attention-based models are widely used for time series data. However, due to the quadratic complexity of attention with respect to input sequence length, the application of Transformers is limited by high resource demands. Moreover, their modifications for industrial time series need to be robust to m...


Bibliographic Details
Published in: IEEE Access
Main Authors: Alina Ermilova, Nikita Baramiia, Valerii Kornilov, Sergey Petrakov, Alexey Zaytsev
Format: Article
Language: English
Published: IEEE 2024-01-01
Online Access: https://ieeexplore.ieee.org/document/10570432/