Research Area:  Machine Learning
Training a deep convolutional neural network (CNN) from scratch is difficult because it requires a large amount of labeled training data and a great deal of expertise to ensure proper convergence. A promising alternative is to fine-tune a CNN that has been pre-trained using, for instance, a large set of labeled natural images. However, the substantial differences between natural and medical images may advise against such knowledge transfer. In this paper, we seek to answer the following central question in the context of medical image analysis: Can the use of pre-trained deep CNNs with sufficient fine-tuning eliminate the need for training a deep CNN from scratch? To address this question, we considered four distinct medical imaging applications in three specialties (radiology, cardiology, and gastroenterology) involving classification, detection, and segmentation from three different imaging modalities, and investigated how the performance of deep CNNs trained from scratch compared with the pre-trained CNNs fine-tuned in a layer-wise manner. Our experiments consistently demonstrated that 1) the use of a pre-trained CNN with adequate fine-tuning outperformed or, in the worst case, performed as well as a CNN trained from scratch; 2) fine-tuned CNNs were more robust to the size of training sets than CNNs trained from scratch; 3) neither shallow tuning nor deep tuning was the optimal choice for a particular application; and 4) our layer-wise fine-tuning scheme could offer a practical way to reach the best performance for the application at hand based on the amount of available data.
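The layer-wise fine-tuning scheme the abstract describes can be sketched as freezing the earliest layers of a pre-trained network and updating only the last k layers, sweeping k from shallow tuning (classifier layers only) to deep tuning (all layers). This is a minimal illustrative sketch; the layer names and helper below are hypothetical, not taken from the paper.

```python
# Illustrative sketch of layer-wise fine-tuning: freeze all but the
# last `tune_last_k` layers of a pre-trained stack. Names (Layer,
# finetune_plan, the AlexNet-like layer list) are assumptions for the example.
from dataclasses import dataclass


@dataclass
class Layer:
    name: str
    trainable: bool


def finetune_plan(layer_names, tune_last_k):
    """Mark the last `tune_last_k` layers trainable; freeze the rest."""
    cutoff = len(layer_names) - tune_last_k
    return [Layer(name, trainable=(i >= cutoff))
            for i, name in enumerate(layer_names)]


# An AlexNet-like stack of learned layers (hypothetical example).
stack = ["conv1", "conv2", "conv3", "conv4", "conv5", "fc6", "fc7"]

# "Shallow" tuning updates only the final fully connected layers...
shallow = finetune_plan(stack, tune_last_k=2)
# ...while "deep" tuning updates every layer, approaching training from scratch.
deep = finetune_plan(stack, tune_last_k=len(stack))

print([l.name for l in shallow if l.trainable])  # ['fc6', 'fc7']
```

In practice one would train a model at each depth k and pick the depth that performs best on held-out data, which is how the paper's scheme adapts to the amount of available training data.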
Keywords:  
Author(s) Name:  Nima Tajbakhsh; Jae Y. Shin; Suryakanth R. Gurudu; R. Todd Hurst; Christopher B. Kendall; Michael B. Gotway; Jianming Liang
Journal name:  IEEE Transactions on Medical Imaging
Conference name:  
Publisher name:  IEEE
DOI:  10.1109/TMI.2016.2535302
Volume Information:  Volume: 35, Issue: 5, May 2016, Page(s): 1299 - 1312
Paper Link:  https://ieeexplore.ieee.org/abstract/document/7426826