OTCE: A Transferability Metric for Cross-Domain Cross-Task Representations

Transfer learning across heterogeneous data distributions (a.k.a. domains) and distinct tasks is a more general and challenging problem than conventional transfer learning, where either the domains or the tasks are assumed to be the same. While neural network based feature transfer is widely used in transfer learning applications, finding the optimal transfer strategy still requires time-consuming experiments and domain knowledge. We propose a transferability metric called Optimal Transport based Conditional Entropy (OTCE) to analytically predict the transfer performance of supervised classification tasks in such cross-domain, cross-task feature transfer settings. Our OTCE score characterizes transferability as a combination of domain difference and task difference, and explicitly evaluates both from data in a unified framework. Specifically, we use optimal transport to estimate the domain difference and the optimal coupling between the source and target distributions, which is then used to derive the conditional entropy of the target task (task difference). Experiments on the largest cross-domain dataset, DomainNet, and on Office31 demonstrate that OTCE achieves an average 21% gain in correlation with the ground-truth transfer accuracy compared to state-of-the-art methods. We also investigate two applications of the OTCE score: source model selection and multi-source feature fusion.
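The two ingredients described above can be sketched in code. The following is a minimal, hedged illustration (not the authors' implementation): a plain NumPy Sinkhorn solver estimates the optimal coupling between source and target features, the transport cost under that coupling serves as the domain difference, and the coupling induces a joint label distribution from which the conditional entropy of the target labels given the source labels (task difference) is computed. The cost normalization, regularization strength, and iteration count are illustrative assumptions; the paper combines the two quantities with learned coefficients to form the final OTCE score.

```python
import numpy as np

def sinkhorn(a, b, C, reg=0.1, n_iter=200):
    """Entropic-regularized optimal transport.
    a, b: marginal weights; C: cost matrix. Returns the coupling matrix pi."""
    K = np.exp(-C / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

def otce_components(Xs, Ys, Xt, Yt, reg=0.1):
    """Sketch of the two OTCE ingredients:
    - domain difference: OT cost between source/target features,
    - task difference: conditional entropy H(Yt | Ys) under the coupling."""
    m, n = len(Xs), len(Xt)
    # Squared Euclidean cost between source and target features,
    # normalized to [0, 1] for numerical stability (an assumption here).
    C = ((Xs[:, None, :] - Xt[None, :, :]) ** 2).sum(-1)
    C = C / C.max()
    pi = sinkhorn(np.full(m, 1.0 / m), np.full(n, 1.0 / n), C, reg)
    domain_diff = float((pi * C).sum())  # Wasserstein-style transport cost

    # Joint label distribution P(ys, yt) induced by the coupling.
    ks, kt = int(Ys.max()) + 1, int(Yt.max()) + 1
    P = np.zeros((ks, kt))
    np.add.at(P, (Ys[:, None], Yt[None, :]), pi)
    P /= P.sum()
    Py_s = P.sum(axis=1, keepdims=True)  # marginal P(ys)

    # Conditional entropy H(Yt | Ys) = -sum_{ys,yt} P log(P / P(ys)).
    nz = P > 0
    task_diff = float(-(P[nz] * np.log((P / Py_s)[nz])).sum())
    return domain_diff, task_diff
```

A lower transport cost means the feature distributions are closer, and a lower conditional entropy means source labels are more predictive of target labels, so both components should correlate with higher transfer accuracy.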

Publication

Yang Tan, Yang Li*, and Shao-Lun Huang. OTCE: A Transferability Metric for Cross-Domain Cross-Task Representations. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR'21), 2021 (Accepted) pdf ppt
• Supplementary materials: pdf

Bibtex
@INPROCEEDINGS{tan-li-huang21,
  author={Y. {Tan} and Y. {Li} and S. {Huang}},
  booktitle={2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  title={OTCE: A Transferability Metric for Cross-Domain Cross-Task Representations},
  year={2021},
  month={June}
}