Transferability-Guided Cross-Domain Cross-Task Transfer Learning
We propose two novel transferability metrics, Fast Optimal Transport-based Conditional Entropy (F-OTCE) and Joint Correspondence OTCE (JC-OTCE), to evaluate how much a source model (task) can benefit the learning of a target task and to learn more generalizable representations for cross-domain cross-task transfer learning. Unlike the original OTCE metric proposed in our previous work, which requires evaluating empirical transferability on auxiliary tasks, our metrics are auxiliary-free and can therefore be computed much more efficiently.

Comparison of the OTCE metric (top) with our proposed F-OTCE (middle) and JC-OTCE (bottom) metrics.
Specifically, F-OTCE estimates transferability by first solving an optimal transport (OT) problem between the source and target feature distributions, and then using the optimal coupling to compute the negative conditional entropy (NCE) between source and target labels. It can also serve as an objective function to enhance downstream transfer learning tasks, including model finetuning and domain generalization (DG).
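The two-step recipe above can be sketched in plain numpy. This is a minimal illustration, not the paper's implementation: the function names (`sinkhorn`, `f_otce`), the uniform marginals, the entropic regularization, and the squared-Euclidean feature cost are all simplifying assumptions made here for clarity.

```python
import numpy as np

def sinkhorn(C, reg=0.1, n_iter=200):
    """Entropic OT with uniform marginals; returns the coupling matrix."""
    n, m = C.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    K = np.exp(-C / reg)
    v = np.ones(m)
    for _ in range(n_iter):
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]

def f_otce(Xs, ys, Xt, yt, reg=0.1):
    """F-OTCE sketch: OT coupling over features, then NCE over labels."""
    # pairwise squared Euclidean cost between feature embeddings,
    # normalized for numerical stability of the Sinkhorn kernel
    C = ((Xs[:, None, :] - Xt[None, :, :]) ** 2).sum(-1)
    C = C / C.max()
    P = sinkhorn(C, reg)
    # joint source/target label distribution induced by the coupling
    ks, kt = int(ys.max()) + 1, int(yt.max()) + 1
    J = np.zeros((ks, kt))
    for i, j in np.ndindex(P.shape):
        J[ys[i], yt[j]] += P[i, j]
    # negative conditional entropy -H(Yt | Ys); higher => more transferable
    Py_s = np.broadcast_to(J.sum(axis=1)[:, None], J.shape)
    mask = J > 0
    return float(np.sum(J[mask] * np.log(J[mask] / Py_s[mask])))
```

Since the score is a negative entropy, it is at most zero; tasks whose labels are more predictable from the coupled source labels score closer to zero.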

The OTCE-based finetuning method.
Meanwhile, JC-OTCE improves the transferability estimation accuracy of F-OTCE by additionally including label distances in the OT problem, at the cost of extra computation.
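A rough sketch of how such a joint cost could be assembled is below. Note the hedges: the paper defines the label distance between source and target classes via optimal transport between label-conditional feature distributions, whereas this illustration substitutes distances between class-mean embeddings as a cheap proxy, and the function name `jc_otce_cost` and the weight `lam` are hypothetical.

```python
import numpy as np

def jc_otce_cost(Xs, ys, Xt, yt, lam=1.0):
    """Joint cost sketch: feature distance plus a label-distance term.

    Assumes integer labels 0..K-1, each class present in the data.
    Class-mean embeddings stand in for the label-conditional
    distributions used in the actual JC-OTCE formulation.
    """
    # pairwise squared Euclidean feature cost, normalized
    C_feat = ((Xs[:, None, :] - Xt[None, :, :]) ** 2).sum(-1)
    C_feat = C_feat / C_feat.max()
    # class-conditional mean embeddings for each task
    mu_s = np.stack([Xs[ys == c].mean(0) for c in range(int(ys.max()) + 1)])
    mu_t = np.stack([Xt[yt == c].mean(0) for c in range(int(yt.max()) + 1)])
    # distance between source class a and target class b, normalized
    D = ((mu_s[:, None, :] - mu_t[None, :, :]) ** 2).sum(-1)
    D = D / D.max()
    # lift class-level distances back to sample pairs and combine
    return C_feat + lam * D[ys][:, yt]
```

The resulting matrix can replace the feature-only cost in the OT solve, so that the coupling respects both where samples sit in feature space and how their classes relate.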
Extensive experiments demonstrate that F-OTCE and JC-OTCE outperform state-of-the-art auxiliary-free metrics by 21.1% and 25.8%, respectively, in correlation coefficient with the ground-truth transfer accuracy. By eliminating the training cost of auxiliary tasks, the two metrics reduce the total computation time of the previous method from 43 min to 9.32 s and 10.78 s, respectively, for a pair of tasks. When applied to model finetuning and DG, F-OTCE yields significant improvements in transfer accuracy in few-shot classification experiments, with gains of up to 4.41% and 2.34%, respectively.
Publication
Yang Tan, Yang Li*, Shao-Lun Huang, and Xiao-Ping Zhang, Transferability-Guided Cross-Domain Cross-Task Transfer Learning, in IEEE Transactions on Neural Networks and Learning Systems, 2024 (Accepted)
@article{tan2024transferability,
  title={Transferability-Guided Cross-Domain Cross-Task Transfer Learning},
  author={Tan, Yang and Li, Yang and Huang, Shao-Lun and Zhang, Xiao-Ping},
  journal={IEEE Transactions on Neural Networks and Learning Systems},
  year={2024},
  doi={10.1109/TNNLS.2024.3358094}
}