Domain-agnostic single-image super-resolution via a meta-transfer neural architecture search - 2023


Research Area:  Machine Learning

Abstract:

Fueled by the powerful learning ability of deep networks, generalized models have been proposed that use external datasets for single-image super-resolution tasks. However, a model trained only with external data may have difficulty super-resolving images in a domain that differs from the training data. To address this drawback, several internal learning approaches have been proposed that learn the weights of the network in accordance with the test image. Despite these attempts to adapt to specific images using internal learning, they suffer from poor performance due to a lack of flexibility, which comes from using a fixed architecture regardless of the image domain. We thus propose a novel training process that includes both external and internal learning. Our internal learning process finds a suitable network architecture and trains the weights for each unseen test image. The overall training process allows the network to obtain knowledge from external data and internal learning in a balanced manner. In blind and non-blind experiments, our proposed method outperforms state-of-the-art super-resolution algorithms in various image domains with different kernels. Our proposed approach obtains impressive results in terms of expressing detailed texture and accurate color in images from various domains.
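The internal learning step described in the abstract can be illustrated with a minimal, self-contained sketch. This is not the paper's actual meta-transfer NAS method; it is a toy ZSSR-style illustration in NumPy, where training pairs are built by further downscaling the test LR image itself and a single per-image weight is fitted by gradient descent. The function names, the average-pooling degradation, and the one-parameter "network" are all illustrative assumptions.

```python
import numpy as np

def downscale(img, factor=2):
    """Average-pool downscaling (an assumed degradation kernel)."""
    h, w = img.shape
    return img[:h - h % factor, :w - w % factor].reshape(
        h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def upscale_nn(img, factor=2):
    """Nearest-neighbour upscaling used as the model input."""
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

def internal_learning(lr_img, steps=200, step_size=0.1, factor=2):
    """Fit a per-image scalar weight w so that
    upscale(downscale(lr)) * w approximates lr (a toy 'network')."""
    son = downscale(lr_img, factor)            # LR-of-LR training input
    x = upscale_nn(son, factor)                # model input
    y = lr_img[:x.shape[0], :x.shape[1]]       # training target
    w = 0.0
    for _ in range(steps):
        pred = x * w
        grad = 2.0 * ((pred - y) * x).mean()   # d/dw of the MSE loss
        w -= step_size * grad
    return w

def super_resolve(lr_img, factor=2):
    """Apply the per-image weight learned from the test image alone."""
    w = internal_learning(lr_img, factor=factor)
    return upscale_nn(lr_img, factor) * w

rng = np.random.default_rng(0)
lr = rng.random((16, 16))
sr = super_resolve(lr)
print(sr.shape)  # (32, 32)
```

In the paper's full method, this per-image adaptation would additionally search over network architectures (not just weights) and start from weights meta-trained on external data, so that a few internal steps suffice.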

Keywords:  
Network architecture
Neural architecture search
Domain-agnostic
Super-resolution
Meta-transfer

Author(s) Name:  Bokyeung Lee, Kyungdeuk Ko, Jonghwan Hong

Journal name:  Neurocomputing

Conference name:  

Publisher name:  Elsevier

DOI:  10.1016/j.neucom.2022.12.050

Volume Information:  Volume 524