Research Area:  Machine Learning
We propose a Dynamic Graph-Based Spatial-Temporal Attention (DG-STA) method for hand gesture recognition. The key idea is to first construct a fully connected graph from a hand skeleton, where the node features and edges are then automatically learned via a self-attention mechanism that operates in both the spatial and temporal domains. We further propose to leverage the spatial-temporal cues of joint positions to guarantee robust recognition in challenging conditions. In addition, a novel spatial-temporal mask is applied to reduce the computational cost by 99%. We carry out extensive experiments on standard benchmarks (DHG-14/28 and SHREC'17) and demonstrate the superior performance of our method compared with state-of-the-art methods.
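The core mechanism described above (self-attention over skeleton joints, restricted by a spatial-temporal mask) can be illustrated with a minimal sketch. This is an assumption-laden toy version, not the paper's implementation: the feature values, the frame/joint counts, and the helper names (`masked_self_attention`, `frame_of`) are all hypothetical, and only the spatial half of the mask is shown (each node attends solely to joints within its own frame, which is how masking shrinks the attention computation).

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def masked_self_attention(feats, mask):
    """Scaled dot-product self-attention over node features.

    feats: list of N feature vectors (one per skeleton node).
    mask[i][j]: True if node i is allowed to attend to node j.
    Returns a new list of N attended feature vectors.
    """
    n = len(feats)
    d = len(feats[0])
    out = []
    for i in range(n):
        scores, idx = [], []
        for j in range(n):
            if mask[i][j]:  # masked-out pairs are skipped entirely
                idx.append(j)
                scores.append(
                    sum(a * b for a, b in zip(feats[i], feats[j])) / math.sqrt(d)
                )
        w = softmax(scores)
        # Output is a convex combination of the attended nodes' features.
        out.append(
            [sum(w[k] * feats[idx[k]][t] for k in range(len(idx))) for t in range(d)]
        )
    return out

# Toy spatial mask for T frames of J joints each (hypothetical sizes,
# not the paper's skeleton): a node attends only within its own frame.
T, J = 2, 3

def frame_of(i):
    return i // J

N = T * J
spatial_mask = [[frame_of(i) == frame_of(j) for j in range(N)] for i in range(N)]

# Toy per-node features (2-dimensional, values in [0, 1]).
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0],
         [0.5, 0.5], [0.0, 0.0], [1.0, 0.5]]
out = masked_self_attention(feats, spatial_mask)
```

Because the mask zeroes out cross-frame (and, in the temporal counterpart, cross-joint) pairs before any score is computed, the number of attention pairs drops from (TJ)² to roughly TJ² + T²J, which is the source of the cost reduction claimed in the abstract.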
Keywords:  
Dynamic Graphs
Hand Gesture Recognition
Spatial-Temporal Attention
Machine Learning
Author(s) Name:   Yuxiao Chen, Long Zhao, Xi Peng, Jianbo Yuan, Dimitris N. Metaxas
Journal name:  arXiv preprint (subject: Computer Vision and Pattern Recognition, cs.CV)
Conference name:  
Publisher name:  arXiv
DOI:  10.48550/arXiv.1907.08871
Volume Information:  
Paper Link:   https://arxiv.org/abs/1907.08871