Domain invariance and discrimination of learned features are two crucial factors affecting the performance of unsupervised domain adaptation (UDA) person re-identification (Re-ID). Person attributes (such as “backpack”, “boots”, and “handbag”) that remain unchanged across multiple domains have been used as mid-level visual-semantic information in UDA person Re-ID. Two main challenges affect the learning of domain-invariant features (DIF): the misalignment of attribute-related regions across multiple images, and the domain shift between the source and target domains.
To address these two challenges, this article proposes to exploit the stability of person attributes, and the complementarity between person attributes and the corresponding low-level visual features, to guide the learning of discriminative DIF. Specifically, the proposed solution comprises the generation of latent attribute-correlated visual features (GLAVF), DIF learning under the guidance of person attributes, and the alignment of person attributes with the corresponding local regions of pedestrian images. Because of the gap between person attributes and visual features, GLAVF first converts person attributes into latent attribute-correlated visual features (LAVF) that carry no domain-specific information; the LAVF then substitute for person attributes in guiding the learning of DIF. To enhance the discrimination of the learned features, the proposed solution mainly explores the alignment between person attributes and the corresponding local regions, and the alignment of the same person attributes across multiple pedestrian images. A fully connected layer achieves both types of alignment in the proposed framework, which reduces the adverse impact of interference information and ensures semantic consistency between person attributes and the corresponding local regions across multiple pedestrian images. Comparative experiments on four existing datasets confirm the effectiveness of the proposed solution.
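The core idea of converting person attributes into latent attribute-correlated visual features can be sketched as follows. This is a minimal illustrative example, not the article's implementation: the attribute dimensionality, feature dimensionality, and the projection matrix `W` (which stands in for the trained GLAVF mapping) are all hypothetical placeholders.

```python
import numpy as np

# Illustrative sketch: a binary person-attribute vector (e.g. backpack, boots,
# handbag, ...) is projected by a learned matrix W into a latent
# attribute-correlated visual feature (LAVF) living in the same space as the
# low-level visual features, so it can guide domain-invariant feature learning.

rng = np.random.default_rng(0)

NUM_ATTRS, FEAT_DIM = 8, 16                      # hypothetical sizes
W = rng.standard_normal((NUM_ATTRS, FEAT_DIM))   # placeholder for the trained GLAVF projection

def to_lavf(attrs):
    """Map a binary attribute vector to an L2-normalised latent visual feature."""
    f = attrs @ W
    return f / (np.linalg.norm(f) + 1e-12)

# The same attribute annotation observed in two different domains maps to the
# same LAVF, because the mapping depends only on the attributes and therefore
# carries no domain-specific information.
attrs = np.array([1, 0, 1, 0, 1, 0, 0, 1], dtype=float)
f_source = to_lavf(attrs)   # pedestrian image from the source domain
f_target = to_lavf(attrs)   # pedestrian image from the target domain
print(np.allclose(f_source, f_target))  # identical guidance across domains
```

In training, such domain-free LAVF would serve as stable anchors that the visual features of both source- and target-domain images are pulled toward, which is the role the article assigns to LAVF as substitutes for raw person attributes.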