Research Paper on AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation - 2024

Research Area:  Machine Learning

Abstract:

Due to privacy or patent concerns, a growing number of large models are released without granting access to their training data, making the transfer of their knowledge inefficient and problematic. In response, Data-Free Knowledge Distillation (DFKD) methods have emerged as direct solutions. However, models derived from DFKD suffer significant performance degradation when adopted for real-world applications, due to the discrepancy between the teacher's training data and real-world scenarios (the student domain). The degradation stems from the portions of the teacher's knowledge that are not applicable to the student domain: they are specific to the teacher domain and would undermine the student's performance. Hence, selectively transferring the teacher's appropriate knowledge becomes the primary challenge in DFKD. In this work, we propose a simple but effective method, AuG-KD. It uses an uncertainty-guided, sample-specific anchor to align student-domain data with the teacher domain, and leverages a generative method to progressively trade off the learning process between OOD knowledge distillation and domain-specific information learning via mixup learning. Extensive experiments on 3 datasets and 8 settings demonstrate the stability and superiority of our approach.
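
The abstract combines two ideas: mixing student-domain samples with teacher-domain anchors, and distilling the teacher's predictions on the mixed inputs. The sketch below illustrates only that mixup-plus-distillation step under stated assumptions; it is not the authors' implementation. The anchor generation and uncertainty guidance described in the paper are not reproduced here (anchors are taken as given inputs), and the `student`/`teacher` callables and `mixup_kd_loss` name are hypothetical.

import torch
import torch.nn.functional as F

def mixup_kd_loss(student, teacher, x_student, x_anchor, lam, T=4.0):
    """Sketch of a mixup-style distillation step (assumed interface).

    x_student : batch of student-domain inputs
    x_anchor  : same-shape batch of anchor inputs aligned with the
                teacher domain (in the paper these come from an
                uncertainty-guided, sample-specific anchor mechanism;
                here they are simply passed in)
    lam       : mixup coefficient in [0, 1]; scheduling lam over training
                trades off OOD distillation vs. domain-specific learning
    T         : softmax temperature for distillation
    """
    # Mix student-domain samples with teacher-domain anchors.
    x_mix = lam * x_anchor + (1.0 - lam) * x_student

    # Teacher provides soft targets; no gradients flow through it.
    with torch.no_grad():
        teacher_logits = teacher(x_mix)

    student_logits = student(x_mix)

    # Standard temperature-scaled KD loss (Hinton et al., 2015).
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

The "progressive trade-off" in the abstract suggests scheduling lam over training, e.g. starting near 1 (anchor-dominated, teacher-domain distillation) and decaying toward 0 (student-domain learning); a simple linear decay such as lam = max(0.0, 1.0 - epoch / total_epochs) is one plausible choice, not necessarily the schedule used in the paper.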

Keywords:  

Author(s) Name:  Zihao Tang, Zheqi Lv, Shengyu Zhang, Yifan Zhou, Xinyu Duan, Fei Wu, Kun Kuang

Journal name:  Machine Learning

Conference name:  

Publisher name:  arXiv

DOI:  10.48550/arXiv.2403.07030

Volume Information:  Volume 35 (2024)