UST-Net: A U-Shaped Transformer Network Using Shifted Windows for Hyperspectral Unmixing

Zhiru Yang, Mingming Xu, Shanwei Liu, Hui Sheng, Jianhua Wan

Published 2023 in IEEE Transactions on Geoscience and Remote Sensing

ABSTRACT

Autoencoders (AEs) are commonly used to learn low-dimensional data representations and to reconstruct data, which makes them well suited to hyperspectral unmixing (HU). However, AE networks trained pixel by pixel, as well as those employing localized convolutional filters, disregard the global material distribution and long-range dependencies, losing spatial feature information essential to the unmixing process. To overcome this limitation, we propose a deep neural network model named the U-shaped transformer network using shifted windows (UST-Net). UST-Net prioritizes the more discriminative and significant spatial information in the scene through multihead self-attention blocks based on shifted windows. Unlike patch-based unmixing networks, UST-Net operates on the complete image, eliminating inconsistencies across patch boundaries. Moreover, downsampling and upsampling stages extract hyperspectral image (HSI) feature maps at different scales, producing a context-rich and spatially accurate abundance map without losing local detail. Experimental results on one synthetic dataset and three real datasets demonstrate that UST-Net significantly outperforms both traditional methods and several other advanced neural network methods. Our code is publicly available at https://github.com/UPCGIT/UST-Net.
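The shifted-window self-attention the abstract refers to restricts attention to local windows, then cyclically shifts the feature map so the next block mixes information across the previous window borders. Below is a minimal NumPy sketch of that partitioning and shifting step only; the function names and the toy 8×8 map are illustrative, not the authors' implementation (which builds full attention blocks on top of this).

```python
import numpy as np

def window_partition(x, win):
    """Split an (H, W, C) feature map into non-overlapping win x win windows."""
    H, W, C = x.shape
    x = x.reshape(H // win, win, W // win, win, C)
    # -> (num_windows, win, win, C)
    return x.transpose(0, 2, 1, 3, 4).reshape(-1, win, win, C)

def shifted_window_partition(x, win):
    """Cyclically shift the map by win//2 before partitioning, so windows in
    the next attention block straddle the previous windows' borders."""
    shifted = np.roll(x, shift=(-(win // 2), -(win // 2)), axis=(0, 1))
    return window_partition(shifted, win)

# Toy 8x8 feature map with 4 channels; 4x4 windows -> 4 windows of 16 tokens.
x = np.arange(8 * 8 * 4, dtype=float).reshape(8, 8, 4)
regular = window_partition(x, 4)
shifted = shifted_window_partition(x, 4)
print(regular.shape, shifted.shape)  # both (4, 4, 4, 4)
```

Attention computed inside each window costs far less than global attention over the whole image, while the alternating shift restores cross-window information flow, which is what lets the network run on the complete image rather than on patches.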

PUBLICATION RECORD


  • Fields of study

    Computer Science, Environmental Science


