STIGANet: Integrating DGCNs and attention mechanisms for real-time 3D pose estimation in sports

Bibliographic Details
Published in: Alexandria Engineering Journal
Main Authors: Qi Liu, Zhenzhou Wang, Han Zhang, Changqing Miao
Format: Article
Language: English
Published: Elsevier, 2025-05-01
Online Access: http://www.sciencedirect.com/science/article/pii/S1110016825002352
Description
Summary: In modern sports training and competitions, precise action analysis and feedback are essential for optimizing athletes' performance. Traditional methods, however, are time-consuming, labor-intensive, and prone to subjective judgment, leading to inconsistencies and inaccuracies. Existing AI-based approaches struggle with high-speed movements, complex backgrounds, and real-time processing. To address these limitations, we propose the Spatio-Temporal Interweaved Graph and Attention Network (STIGANet) for accurate 3D human pose estimation. STIGANet combines Dynamic Graph Convolutional Networks (DGCN), a Spatio-Temporal Cross-Attention Mechanism (STCA), Spatio-Temporal Interweaved Attention (STIA), and a Deformable Transformer Encoder, enabling effective capture and fusion of spatial and temporal features in human actions. The model improves pose estimation accuracy and robustness in dynamic, real-time sports environments. On the Human3.6M and MPI-INF-3DHP datasets, STIGANet achieves superior performance with MPJPEs of 38.2 mm and 45.3 mm, respectively, outperforming existing methods. These findings highlight the model's potential for real-time sports action analysis. Overall, this work enhances sports action analysis by combining graph convolutional networks with attention mechanisms, offering a robust framework for real-time insights during sports training and rehabilitation.
ISSN: 1110-0168
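The reported figures of 38.2 mm and 45.3 mm refer to the mean per-joint position error (MPJPE), the standard evaluation metric for 3D pose estimation: the Euclidean distance between each predicted joint and its ground-truth position, averaged over all joints. A minimal sketch of how this metric is conventionally computed (the function name and toy data below are illustrative, not from the paper):

```python
import numpy as np

def mpjpe(pred, gt):
    """Mean per-joint position error: average Euclidean distance
    between predicted and ground-truth joints, in the input units."""
    return float(np.mean(np.linalg.norm(pred - gt, axis=-1)))

# Toy example: a 17-joint skeleton with 3D coordinates in metres.
gt = np.zeros((17, 3))
pred = np.full((17, 3), 0.01)  # every joint offset by 10 mm per axis

print(mpjpe(pred, gt))  # 0.01 * sqrt(3) ≈ 0.0173 m, i.e. ~17.3 mm
```

In benchmark reporting the average is additionally taken over all frames and test sequences, and results are quoted in millimetres, as in the abstract above.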