Camera Style Transfer for Person Re-identification
Affiliation: 1. School of Communication and Information Engineering, Shanghai University, Shanghai 200072, China; 2. Institute of Smart City, Shanghai University, Shanghai 200072, China

CLC Number: TP391.41; TP332

Abstract:

Style variation among different cameras is an important challenge in person re-identification. To smooth camera style disparities and enrich the diversity of pedestrian samples, this paper explicitly learns camera-invariant features through a style transfer approach. Specifically, a cycle-consistent adversarial network (CycleGAN) is used to generate, for each pedestrian image, transformed images in the styles of the other cameras; these generated images, together with the original samples, form the augmented training set. In addition, an attention mechanism reweights the feature channels to extract more discriminative pedestrian appearance features, and a multi-task loss supervises the training of the re-identification network. Experimental results show that the method achieves mAP/top-1 of 86.5%/95.1% on Market1501 and 77.1%/87.2% on DukeMTMC-reID, outperforming existing algorithms. As a data augmentation approach, camera style transfer effectively expands the dataset and reduces manual labeling cost while improving identification accuracy in multi-camera scenarios.
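
To make the abstract's components concrete, the sketch below shows, in PyTorch, one plausible form of the channel-attention reweighting and the multi-task supervision it describes. This is an illustrative assumption based on common re-identification practice (squeeze-and-excitation style attention, identity classification plus triplet loss), not the authors' implementation; all class, function, and parameter names here are hypothetical.

    # Minimal PyTorch sketch of the described components (illustrative, not the authors' code).
    import torch
    import torch.nn as nn

    class ChannelAttention(nn.Module):
        """Reweight feature channels with a squeeze-and-excitation style gate."""
        def __init__(self, channels, reduction=16):
            super().__init__()
            self.pool = nn.AdaptiveAvgPool2d(1)          # global descriptor per channel
            self.fc = nn.Sequential(
                nn.Linear(channels, channels // reduction),
                nn.ReLU(inplace=True),
                nn.Linear(channels // reduction, channels),
                nn.Sigmoid(),
            )

        def forward(self, x):                            # x: (B, C, H, W)
            w = self.fc(self.pool(x).flatten(1))         # per-channel weights in (0, 1)
            return x * w.unsqueeze(-1).unsqueeze(-1)     # reweighted feature map

    # Multi-task supervision: identity classification plus a triplet metric loss,
    # trained on the union of original images and CycleGAN style-transferred images.
    id_loss = nn.CrossEntropyLoss(label_smoothing=0.1)   # label smoothing is a common choice for generated samples
    metric_loss = nn.TripletMarginLoss(margin=0.3)

    def multi_task_loss(logits, labels, anchor, positive, negative, lam=1.0):
        # lam balances the two tasks; its value is an assumption here.
        return id_loss(logits, labels) + lam * metric_loss(anchor, positive, negative)

In this reading, the augmented training set is simply the original images plus their CycleGAN style-transferred counterparts, passed through the same attention-equipped backbone and supervised by the combined loss above.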

History
  • Online: April 17, 2024