Real-time human posture recognition using an adaptive hybrid classifier - 2020

Research Area:  Machine Learning


This study proposed and evaluated a reliable adaptive hybrid classifier (hAHC), which combines a posture-based adaptive signal segmentation algorithm with a multi-layer perceptron (MLP) classifier and a plurality-voting approach. The hAHC model was evaluated within a real-time posture recognition framework that sought to identify five behaviours (sitting, walking, standing, running, and lying) in simulated crowd-security scenarios. It was compared with a single MLP classifier (sMLP) and a static hybrid classifier (hSHC) on three metrics (classification precision, recall, and F1-score), using a real-time dataset collected from unfamiliar subjects. Experimental results showed that the hAHC model improved classification accuracy and robustness slightly over the hSHC and significantly over the sMLP (hAHC 82%; hSHC 79%; sMLP 71%). Additionally, the hAHC approach displayed real-time results as animated figures in an adaptive window, in contrast to the hSHC, which used a fixed-size sliding temporal window that, as the results demonstrated, was less suitable for presenting real-time results. The main research contributions of this study are an efficient software-only sensor calibration algorithm that improves accelerometer precision, together with a posture-based adaptive signal segmentation algorithm that cooperates with the adaptive hybrid classifier to improve the performance of real-time posture recognition.
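The plurality-voting step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the MLP emits one class label per sample within each adaptively segmented window, and that the window-level decision is simply the most frequent label. The function name, tie-breaking rule, and example data are all illustrative assumptions.

```python
from collections import Counter

# The five target behaviours named in the study.
POSTURES = ["sitting", "walking", "standing", "running", "lying"]

def plurality_vote(window_predictions):
    """Return the label predicted most often within one segmented window.

    window_predictions: per-sample class labels emitted by the MLP for
    the samples falling inside one adaptive segment. Ties are broken by
    first occurrence, an illustrative choice not specified in the paper.
    """
    counts = Counter(window_predictions)
    return counts.most_common(1)[0][0]

# Example: a segment where most per-sample predictions say "walking".
segment = ["walking", "walking", "standing", "walking", "running"]
print(plurality_vote(segment))  # walking
```

Voting over a whole segment rather than trusting each per-sample prediction is what lets the hybrid classifier smooth out transient misclassifications within a posture episode.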

Author(s) Name:  Shumei Zhang & Victor Callaghan

Journal name:  International Journal of Machine Learning and Cybernetics

Publisher name:  Springer

DOI:  10.1007/s13042-020-01182-8

Volume Information:  Volume 12, pages 489–499 (2021)