TY - JOUR
T1 - A Comprehensive Survey on Transfer Learning
AU - Zhuang, Fuzhen
AU - Qi, Zhiyuan
AU - Duan, Keyu
AU - Xi, Dongbo
AU - Zhu, Yongchun
AU - Zhu, Hengshu
AU - Xiong, Hui
AU - He, Qing
N1 - Funding Information:
Manuscript received March 14, 2020; revised June 11, 2020; accepted June 19, 2020. Date of publication July 7, 2020; date of current version December 21, 2020. This work was supported in part by the National Key Research and Development Program of China under Grant 2018YFB1004300; in part by the National Natural Science Foundation of China under Grant U1836206, Grant U1811461, Grant 61773361, and Grant 61836013; and in part by the Project of Youth Innovation Promotion Association CAS under Grant 2017146. (Fuzhen Zhuang and Zhiyuan Qi contributed equally to this work.) (Corresponding authors: Fuzhen Zhuang; Zhiyuan Qi.) Fuzhen Zhuang, Zhiyuan Qi, Keyu Duan, Dongbo Xi, Yongchun Zhu, and Qing He are with the Key Laboratory of Intelligent Information Processing, Institute of Computing Technology, Chinese Academy of Sciences (CAS), Beijing 100190, China, and also with the University of Chinese Academy of Sciences, Beijing 100049, China (e-mail: zhuangfuzhen@ict.ac.cn; qizhyuan@gmail.com). Hengshu Zhu is with Baidu Inc., Beijing 100085, China. Hui Xiong is with Rutgers, The State University of New Jersey, Newark, NJ 08854 USA.
Publisher Copyright:
© 2020 IEEE.
PY - 2021/1
Y1 - 2021/1
N2 - Transfer learning aims to improve the performance of target learners on target domains by transferring the knowledge contained in different but related source domains. In this way, the dependence on large amounts of target-domain data can be reduced when constructing target learners. Owing to its wide application prospects, transfer learning has become a popular and promising area in machine learning. Although there are already some valuable and impressive surveys on transfer learning, they introduce approaches in a relatively isolated way and do not cover the most recent advances. Given the rapid expansion of the transfer learning area, it is both necessary and challenging to review the relevant studies comprehensively. This survey attempts to connect and systematize the existing transfer learning research studies, as well as to summarize and interpret the mechanisms and strategies of transfer learning in a comprehensive way, which may help readers gain a better understanding of the current research status and ideas. Unlike previous surveys, this survey article reviews more than 40 representative transfer learning approaches, especially homogeneous transfer learning approaches, from the perspectives of data and model. The applications of transfer learning are also briefly introduced. To demonstrate the performance of different transfer learning models, over 20 representative models are evaluated on three data sets, namely, Amazon Reviews, Reuters-21578, and Office-31, and the experimental results demonstrate the importance of selecting appropriate transfer learning models for different applications in practice.
AB - Transfer learning aims to improve the performance of target learners on target domains by transferring the knowledge contained in different but related source domains. In this way, the dependence on large amounts of target-domain data can be reduced when constructing target learners. Owing to its wide application prospects, transfer learning has become a popular and promising area in machine learning. Although there are already some valuable and impressive surveys on transfer learning, they introduce approaches in a relatively isolated way and do not cover the most recent advances. Given the rapid expansion of the transfer learning area, it is both necessary and challenging to review the relevant studies comprehensively. This survey attempts to connect and systematize the existing transfer learning research studies, as well as to summarize and interpret the mechanisms and strategies of transfer learning in a comprehensive way, which may help readers gain a better understanding of the current research status and ideas. Unlike previous surveys, this survey article reviews more than 40 representative transfer learning approaches, especially homogeneous transfer learning approaches, from the perspectives of data and model. The applications of transfer learning are also briefly introduced. To demonstrate the performance of different transfer learning models, over 20 representative models are evaluated on three data sets, namely, Amazon Reviews, Reuters-21578, and Office-31, and the experimental results demonstrate the importance of selecting appropriate transfer learning models for different applications in practice.
KW - Domain adaptation
KW - interpretation
KW - machine learning
KW - transfer learning
UR - http://www.scopus.com/inward/record.url?scp=85088803089&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85088803089&partnerID=8YFLogxK
U2 - 10.1109/JPROC.2020.3004555
DO - 10.1109/JPROC.2020.3004555
M3 - Review article
AN - SCOPUS:85088803089
SN - 0018-9219
VL - 109
SP - 43
EP - 76
JO - Proceedings of the IEEE
JF - Proceedings of the IEEE
IS - 1
M1 - 9134370
ER -