Spatial Attention-Based 3D Graph Convolutional Neural Network for Sign Language Recognition - 2022

Research Area:  Machine Learning

Abstract:

Sign language is the main channel through which hearing-impaired people communicate with others. It is a visual language that conveys highly structured manual and non-manual components, making it difficult for hearing people to master. Sign language recognition aims to ease this difficulty and bridge the communication gap between hearing-impaired people and others. This study presents an efficient architecture for sign language recognition based on a graph convolutional neural network (GCN). The architecture consists of a few separable 3D GCN layers enhanced by a spatial attention mechanism. The limited number of layers allows it to avoid the over-smoothing problem common in deep graph neural networks, while the attention mechanism enriches the spatial context representation of the gestures. The proposed architecture is evaluated on several datasets and achieves outstanding results.
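
To illustrate the idea of a graph-convolution layer gated by spatial attention, here is a minimal PyTorch sketch. It is not the authors' implementation: the layer sizes, the learnable adjacency, and the attention form are assumptions for demonstration, and the temporal (separable 3D) part of the architecture described in the abstract is omitted.

# Illustrative sketch only (assumed design, not the paper's code): one graph
# convolution whose joint features are re-weighted by a learned spatial
# attention map before aggregation over the skeleton graph.
import torch
import torch.nn as nn

class SpatialAttentionGCNLayer(nn.Module):
    def __init__(self, in_channels: int, out_channels: int, num_nodes: int):
        super().__init__()
        self.theta = nn.Linear(in_channels, out_channels)   # per-node feature transform
        self.attention = nn.Sequential(                      # one attention score per joint
            nn.Linear(in_channels, 1),
            nn.Softmax(dim=1),                                # normalize scores over the joints
        )
        # Learnable adjacency, initialized to self-connections; in practice this
        # would encode the hand/body skeleton topology (an assumption here).
        self.adj = nn.Parameter(torch.eye(num_nodes))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes, in_channels) -- joint features for one frame
        attn = self.attention(x)                              # (batch, num_nodes, 1)
        x = x * attn * x.size(1)                              # emphasize informative joints, keep scale
        x = torch.einsum("vw,bwc->bvc", self.adj, x)          # aggregate over graph neighbors
        return torch.relu(self.theta(x))

if __name__ == "__main__":
    layer = SpatialAttentionGCNLayer(in_channels=3, out_channels=64, num_nodes=25)
    frame = torch.randn(8, 25, 3)                             # 8 clips, 25 joints, xyz coordinates
    print(layer(frame).shape)                                 # torch.Size([8, 25, 64])

In a full spatio-temporal model, a stack of such layers would be interleaved with temporal convolutions over the frame axis; keeping the stack shallow, as the abstract notes, helps avoid over-smoothing of node features.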

Keywords:  
Sign language recognition
Graph convolutional neural network (GCN)
Attention
Deep learning

Author(s) Name:  Muneer Al-Hammadi, Mohamed A. Bencherif, Mansour Alsulaiman, Ghulam Muhammad, Mohamed Amine Mekhtiche, Wadood Abdul, Yousef A. Alohali, Tareq S. Alrayes, Hassan Mathkour, Mohammed Faisal, Mohammed Algabri, Hamdi Altaheri, Taha Alfakih and Hamid Ghaleb

Journal name:  Sensors

Conference name:  

Publisher name:  MDPI

DOI:  10.3390/s22124558

Volume Information:  Volume 22, Issue 12