Tsallis Entropy for Deep Transfer Learning and Domain Adaptation

Publish year: 1400 SH (2021–22 CE)
Document type: Conference paper
Language: English

This paper (11 pages, PDF format) is available for download.


National scientific document ID: CSCG04_101

Indexing date: 23 Esfand 1400 (March 2022)

Abstract:

In this research, we propose Tsallis entropy for deep transfer learning as an efficient and flexible way to construct a regularized classifier. We address the learning-bias problem of the Convolutional Neural Network (CNN) model, using statistical learning for unsupervised domain adaptation. First, Tsallis entropy is applied on the source domain to reduce the loss. Then, cosine similarity with a K-Nearest-Neighbors (KNN) classifier is used to regularize the CNN classifier by alleviating the error discrepancy between the two. A non-extensive Tsallis entropy function based on the KNN classifier serves as self-regularization to reduce the learning bias. Moreover, the marginal and conditional distributions are aligned simultaneously by Joint Distribution Adaptation (JDA). Finally, experiments are run on the regularized CNN with both the traditional cross-entropy loss and the proposed Tsallis entropy loss. The results show that regularized deep transfer learning based on Tsallis entropy is effective and robust for domain adaptation problems, and it incurs less loss than state-of-the-art domain adaptation methods when detecting unreliable samples.
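The abstract does not give the authors' exact loss formulation, so the following is only a minimal illustrative sketch of the standard non-extensive Tsallis entropy and a q-logarithm classification loss, which recovers the usual cross-entropy as q → 1. The function names, the default q, and the smoothing constant are assumptions for illustration, not the paper's implementation:

```python
import numpy as np

def tsallis_entropy(p, q=1.5):
    """Non-extensive Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1).

    Reduces to the Shannon entropy -sum_i p_i log p_i as q -> 1.
    """
    p = np.asarray(p, dtype=float)
    if abs(q - 1.0) < 1e-8:
        return float(-np.sum(p * np.log(p + 1e-12)))
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def tsallis_loss(probs, label, q=1.5):
    """q-logarithm loss -ln_q(p_y) on the true-class probability,
    with ln_q(x) = (x^(1-q) - 1) / (1 - q); equals -log(p_y) as q -> 1.
    """
    p = float(probs[label])
    if abs(q - 1.0) < 1e-8:
        return -np.log(p + 1e-12)
    return -(p ** (1.0 - q) - 1.0) / (1.0 - q)
```

As a quick sanity check, for a uniform two-class distribution with q = 2 the entropy is (1 − 2·0.25)/1 = 0.5, and the loss with q = 2 simplifies to 1/p − 1.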

Authors

Zahra Ramezani

Department of Statistics, University of Mazandaran, Babolsar, Iran

Ahmad Pourdarvish

Department of Statistics, University of Mazandaran, Babolsar, Iran