Research Area:  Machine Learning
Deep neural networks have achieved great success in almost every field of artificial intelligence. However, their hierarchical structure gives rise to several persistent weaknesses, particularly when large-scale parallelism, faster learning, better performance, and high reliability are required. Inspired by the parallel, large-scale information-processing structures of the human brain, a shallow broad neural network model is proposed, built on a specially designed multi-order Descartes expansion operation. This Descartes expansion acts as an efficient feature-extraction method for the network: it improves the separability of the original patterns by transforming the raw data into a high-dimensional feature space, the multi-order Descartes expansion space. As a result, a single-layer perceptron is able to accomplish the classification task. The multi-order Descartes expansion neural network (MODENN) is thus created by combining the multi-order Descartes expansion operation with a single-layer perceptron, and its capacity is proved equivalent to that of traditional multi-layer perceptrons and deep neural networks. Three kinds of experiments were conducted; the results show that the proposed MODENN model has great potential in many respects, including implementability, parallelizability, performance, robustness, and interpretability, indicating that MODENN would be an excellent alternative to mainstream neural networks.
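The idea of the abstract can be illustrated with a small sketch. Assuming (this is an interpretation, not the paper's exact operator) that a multi-order Descartes expansion maps an input vector to all component products up to a given order, a single-layer perceptron on the expanded features can separate patterns that are not linearly separable in the raw space, such as XOR:

```python
import itertools
import numpy as np

def descartes_expand(x, order=2):
    """Hypothetical sketch of a multi-order Descartes expansion:
    concatenates all products of up to `order` input components
    (an assumed interpretation of the paper's operation)."""
    feats = [np.asarray(x, dtype=float)]
    for k in range(2, order + 1):
        feats.append(np.array([
            np.prod(combo) for combo in
            itertools.combinations_with_replacement(x, k)
        ]))
    return np.concatenate(feats)

# XOR is not linearly separable in the raw 2-D space, but a
# second-order expansion (x1, x2, x1^2, x1*x2, x2^2) makes it so:
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])
Z = np.stack([descartes_expand(x, order=2) for x in X])

# A single-layer perceptron with fixed weights w.z = x1 + x2 - 2*x1*x2
# thresholded at 0.5 reproduces XOR exactly.
w = np.array([1.0, 1.0, 0.0, -2.0, 0.0])
pred = (Z @ w > 0.5).astype(int)  # → array([0, 1, 1, 0])
```

The weights here are hand-picked for illustration; in the paper's setting they would be learned by the single-layer perceptron.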
Keywords:  
Deep neural network
Artificial intelligence
Hierarchical structure
High reliability
Multi-order Descartes
Implementability
Parallelizability
Performance
Robustness
Interpretability
Author(s) Name:  Haifeng Li, Cong Xu, Lin Ma, Hongjian Bo, David Zhang
Journal name:  IEEE Transactions on Pattern Analysis and Machine Intelligence
Conference name:  
Publisher name:  IEEE
DOI:  10.1109/TPAMI.2021.3125690
Volume Information:  Volume 44
Paper Link:   https://ieeexplore.ieee.org/abstract/document/9606548