Interactive two-stream graph neural network for skeleton-based action recognition
17 June 2021
Dun Yang, Qing Zhou, Ju Wen
Abstract

Action recognition has wide applications in fields such as human–computer interaction, virtual reality, and robotics. Since human actions can be represented as a sequence of skeleton graphs, approaches based on graph neural networks (GNNs) have attracted considerable attention in action recognition research. Recent studies have demonstrated the effectiveness of two-stream GNNs, in which discriminative features for action recognition are extracted from both the joint stream and the bone stream; each stream is generated by GNNs that pass messages along fixed connections between vertices. However, existing two-stream approaches have two limitations: no interaction is allowed between the two streams, and temporary contacts between joints or bones cannot be modeled. To address these issues, we propose the interactive two-stream graph neural network, which employs a joint–bone communication block to facilitate interaction between the joint stream and the bone stream. Furthermore, an adaptive strategy is introduced to enable dynamic connections between vertices. Extensive experiments on three large-scale datasets demonstrate the effectiveness of the proposed method.
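The two ideas the abstract names — augmenting the fixed skeleton adjacency so vertex connections can adapt, and letting the joint and bone streams communicate — can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: the function names, the additive adaptive matrix `B_learned`, the parent-difference definition of bone features, and the additive stream fusion are all assumptions for illustration.

```python
import numpy as np

def normalize(A):
    # Symmetric normalization D^{-1/2} (A + I) D^{-1/2},
    # a common choice for graph-convolution adjacencies.
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def adaptive_gcn_layer(X, A_fixed, B_learned, W):
    """One graph-convolution step with dynamic connections.
    X: (V, C) vertex features; A_fixed: (V, V) skeleton edges;
    B_learned: (V, V) trainable offsets (assumed form of the
    paper's adaptive strategy); W: (C, C_out) weights."""
    A = normalize(A_fixed) + B_learned   # fixed + adaptive connectivity
    return np.maximum(A @ X @ W, 0.0)    # linear map then ReLU

# Toy 3-joint chain skeleton: 0 - 1 - 2
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
X_joint = rng.standard_normal((3, 4))    # joint-stream features
X_bone = X_joint - X_joint[[1, 1, 1]]    # bones as joint differences (sketch)
B = np.zeros((3, 3))                     # would be learned during training
W = rng.standard_normal((4, 8))

H_joint = adaptive_gcn_layer(X_joint, A, B, W)
H_bone = adaptive_gcn_layer(X_bone, A, B, W)
# A joint–bone communication block would exchange information between
# the streams; additive fusion is the simplest stand-in:
H_fused = H_joint + H_bone
```

In a trainable version, `B_learned` would be updated by backpropagation (or computed from the input features), which is what allows connections absent from the anatomical skeleton, such as temporary hand-to-hand contacts, to carry messages.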

© 2021 SPIE and IS&T 1017-9909/2021/$28.00
Dun Yang, Qing Zhou, and Ju Wen "Interactive two-stream graph neural network for skeleton-based action recognition," Journal of Electronic Imaging 30(3), 033025 (17 June 2021). https://doi.org/10.1117/1.JEI.30.3.033025
Received: 2 February 2021; Accepted: 19 April 2021; Published: 17 June 2021
CITATIONS
Cited by 1 scholarly publication.
KEYWORDS: Bone, Neural networks, Data modeling, RGB color model, Convolution, 3D modeling, Video