Research Topics for Neural Architecture Search

PhD Research Topics for Neural Architecture Search

Deep learning has achieved notable breakthroughs in many fields owing to its powerful automatic representation capability. The neural architecture plays a crucial role in how features are represented and underpins deep learning's extraordinary progress and wide range of applications. Neural architecture search emerged because manually designing neural architectures is time-consuming and error-prone.

Neural Architecture Search (NAS) is the process of automating architecture engineering. The main goal of NAS is to find a neural architecture that achieves the best performance on a given task with limited computing resources and minimal human intervention. In essence, NAS applies a search strategy to discover the best-performing architecture among all possible architectures. Early NAS approaches searched a global, discrete space from scratch. Current NAS methods are typically described by three components: the search space, the search strategy, and the performance estimation strategy. The search space defines the set of operations and how they can be composed to form neural network architectures. The search strategy defines the approach used to explore the search space. The performance estimation strategy evaluates how well a candidate architecture is likely to perform, ideally without fully training every design.
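The loop below is a minimal sketch of how these three components interact, using plain random search as the search strategy. The search space dictionary, the sample_architecture and estimate_performance helpers, and the placeholder score are illustrative assumptions, not any particular published NAS method.

```python
import random

# Search space (assumed for illustration): each architecture is
# described by a small set of design choices.
SEARCH_SPACE = {
    "num_layers": [2, 4, 6, 8],
    "hidden_units": [64, 128, 256],
    "activation": ["relu", "tanh", "gelu"],
}

def sample_architecture():
    """Search strategy (here: random search) draws one candidate."""
    return {name: random.choice(options) for name, options in SEARCH_SPACE.items()}

def estimate_performance(architecture):
    """Performance estimation strategy: score a candidate architecture.

    In practice this would build, train, and validate a network, or use a
    cheaper proxy (few epochs, weight sharing). A random score stands in
    for that step in this sketch.
    """
    return random.random()  # placeholder validation score

def neural_architecture_search(num_trials=20):
    """Explore the search space and keep the best-scoring candidate."""
    best_arch, best_score = None, float("-inf")
    for _ in range(num_trials):
        candidate = sample_architecture()
        score = estimate_performance(candidate)
        if score > best_score:
            best_arch, best_score = candidate, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = neural_architecture_search()
    print("Best architecture found:", arch, "score:", round(score, 3))
```

Real NAS systems replace the random sampler with reinforcement learning, evolutionary, or gradient-based strategies, and replace the placeholder score with trained or proxy-trained validation accuracy.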

NAS has quickly found applications in object detection, semantic segmentation, image classification, adversarial learning, speech recognition, architecture scaling, multi-objective optimization, platform-aware search, and data augmentation. NAS is also a sub-field of AutoML and has significant overlap with hyperparameter optimization and meta-learning. Future directions for NAS include NAS for generative adversarial networks and sensor fusion, NAS methods for multi-task and multi-objective problems, NAS with unlabelled datasets, self-supervised learning for NAS, and Unsupervised Neural Architecture Search (UnNAS).