Robust Local Preserving and Global Aligning Network for Adversarial Domain Adaptation

Wenwen Qiang, Jiangmeng Li, Changwen Zheng, Bing Su, Hui Xiong

Research output: Contribution to journal › Article › peer-review

1 Scopus citation


Unsupervised domain adaptation (UDA) requires source-domain samples with clean ground-truth labels during training. Accurately labeling a large number of source-domain samples is time-consuming and laborious, so an alternative is to train on samples with noisy labels. However, training with noisy labels can greatly degrade UDA performance. In this paper, we address the problem of learning UDA models with access only to noisy labels and propose a novel method called the robust local preserving and global aligning network (RLPGA). RLPGA improves robustness to label noise in two ways. One is learning a classifier with a robust information-theoretic loss function. The other is constructing two adjacency weight matrices and two negative weight matrices through the proposed local preserving module to preserve the local topological structure of the input data. We conduct a theoretical analysis of the robustness of the proposed RLPGA and prove that the robust information-theoretic loss and the local preserving module help reduce the empirical risk on the target domain. A series of empirical studies demonstrates the effectiveness of the proposed RLPGA.
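As a concrete illustration of the local preserving idea in the abstract, the sketch below builds a k-nearest-neighbor adjacency weight matrix (Gaussian-weighted neighbors) and a complementary negative weight matrix over non-neighbors for a batch of features. The function name, the k-NN/Gaussian-kernel choice, and all parameters are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def local_weight_matrices(X, k=5, sigma=1.0):
    """Generic sketch of local-topology weight matrices (not the exact
    RLPGA formulation): W holds Gaussian-weighted k-NN adjacencies,
    N marks non-neighbor pairs that could be pushed apart."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    np.fill_diagonal(d2, np.inf)  # exclude self from the neighbor search
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[:k]  # indices of the k nearest neighbors of sample i
        W[i, nbrs] = np.exp(-d2[i, nbrs] / (2 * sigma ** 2))
    W = np.maximum(W, W.T)  # symmetrize the adjacency weights
    N = (W == 0).astype(float)  # non-neighbor pairs receive negative weight 1
    np.fill_diagonal(N, 0.0)  # a sample is never its own negative
    return W, N
```

A loss built on such matrices would pull representations of W-connected pairs together and push N-connected pairs apart, preserving the input data's local neighborhood structure in the learned feature space.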

Original language: English (US)
Pages (from-to): 3014-3029
Number of pages: 16
Journal: IEEE Transactions on Knowledge and Data Engineering
Issue number: 3
State: Published - Mar 1 2023
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Computational Theory and Mathematics


Keywords

  • Wasserstein distance
  • adversarial learning
  • noisy label
  • representation learning
  • unsupervised domain adaptation

