Generative adversarial network sample generation for hyperspectral image classification
Affiliation:

1. College of Automation & Electric Engineering, Qingdao University of Science & Technology, Qingdao 266061, China; 2. College of Information Science & Engineering, Ocean University of China, 266101, China

CLC Number: TP751


    Abstract:

    Hyperspectral images contain rich geographical location information and spectral information, and hyperspectral image classification is a basic and important research direction in the field of remote sensing. However, the insufficient number of labelled hyperspectral samples remains the main problem restricting further improvement of classification accuracy. In a generative adversarial network, the generator and the discriminator continually learn against each other; in the ideal final state, the discriminator can no longer distinguish the pseudo-samples produced by the generator, which then yields pseudo data samples very similar to real ones. This paper uses a generative adversarial network to generate new pseudo-samples from a small number of original samples, so as to address the difficulty of sample acquisition and the shortage of samples. In the experiments, 200 and 400 sample points were selected from two hyperspectral image data sets, and new pseudo-samples were generated by the generative adversarial network for classification training. Compared with SVM, 3D-CNN and other classification methods under insufficient samples, the overall average classification accuracy is significantly improved. The experimental results show that the classification performance of the proposed method is better than that of the other classification methods.
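
    The abstract describes the standard adversarial training scheme: the generator learns to produce pseudo spectra while the discriminator learns to tell them from real ones. A minimal sketch of that loop for 1-D spectral vectors is shown below; it assumes PyTorch, a hypothetical tensor `real_spectra` of labelled pixel spectra, and illustrative network sizes and optimiser settings that are not taken from the paper.

```python
# Minimal GAN sketch for generating pseudo hyperspectral samples.
# Assumptions (not from the paper): PyTorch, 1-D spectra with `n_bands`
# channels, and a tensor `real_spectra` of shape (n_samples, n_bands).
import torch
import torch.nn as nn

n_bands, latent_dim = 200, 64

generator = nn.Sequential(
    nn.Linear(latent_dim, 128), nn.ReLU(),
    nn.Linear(128, n_bands), nn.Tanh(),        # pseudo spectrum scaled to [-1, 1]
)
discriminator = nn.Sequential(
    nn.Linear(n_bands, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1), nn.Sigmoid(),           # probability that the input is real
)

bce = nn.BCELoss()
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

real_spectra = torch.rand(200, n_bands) * 2 - 1   # placeholder for the real samples

for epoch in range(1000):
    # Discriminator step: push real samples toward label 1, fakes toward 0.
    z = torch.randn(real_spectra.size(0), latent_dim)
    fake = generator(z).detach()
    loss_d = bce(discriminator(real_spectra), torch.ones(len(real_spectra), 1)) + \
             bce(discriminator(fake), torch.zeros(len(fake), 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator step: try to make the discriminator label fakes as real.
    z = torch.randn(real_spectra.size(0), latent_dim)
    loss_g = bce(discriminator(generator(z)), torch.ones(len(real_spectra), 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

# Pseudo-samples that can be added to a classifier's training set.
pseudo_samples = generator(torch.randn(400, latent_dim)).detach()
```

    This sketch only illustrates the adversarial loop; the paper's full pipeline additionally trains a classifier (compared against SVM and 3D-CNN) on the real samples augmented with the generated pseudo-samples.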

History
  • Online: June 14, 2024