Blending Ultra Spectral Images of Multi-Source Remote Sensors

Vishal Siddartha Chilkuri, D. Bharathi, R. Karthi

Published 2022 in 2022 International Conference on Electrical, Computer, Communications and Mechatronics Engineering (ICECCME)

ABSTRACT

Images and techniques from remote sensing are effective tools for studying the Earth's surface. Data quality is essential for improving remote sensing applications and producing crisp, noise-free images. Because acquisition methods differ across sensors, obtaining a noise-free data set is quite challenging in most cases, so image (or information) fusion is crucial in remote sensing applications. Spatiotemporal fusion (STF) produces images with both high temporal and high spatial resolution by integrating temporally dense coarse-resolution images with temporally sparse fine-resolution images. This paper focuses on implementing spatiotemporal fusion of multi-source remote sensing images, specifically images from the Landsat and Sentinel sensors, using the ESTARFM fusion method. In total, two experiments were conducted with Landsat 7, Landsat 8, and Sentinel 2 data. In experiment 1, Landsat 7 and Sentinel 2 images were taken as the fine- and coarse-resolution images respectively; in experiment 2, Landsat 8 and Sentinel 2 filled those roles. The metrics indicate that, when the STF method is applied, the similarity between the fused image and the original image at the prediction time is higher in experiment 2 than in the corresponding results of experiment 1.
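The core idea behind ESTARFM-style fusion described above can be sketched in a few lines: given fine/coarse image pairs at two base dates and a coarse image at the prediction date, the coarse-resolution change is scaled by a fine-to-coarse conversion coefficient and blended with temporal weights. The function below is a minimal per-pixel sketch of that idea only, not the full ESTARFM algorithm (which additionally uses a moving window, similar-pixel search, and spectral/distance weighting); all names and the global regression simplification are assumptions for illustration.

```python
import numpy as np

def estarfm_simplified(f1, c1, f2, c2, cp, eps=1e-6):
    """Simplified ESTARFM-style prediction of the fine image at the
    prediction date. f1/c1 and f2/c2 are fine/coarse pairs at the two
    base dates; cp is the coarse image at the prediction date.
    (Hypothetical sketch: omits the similar-pixel search and
    window-based weighting of the full algorithm.)"""
    # Conversion coefficient: least-squares slope relating the fine-
    # and coarse-resolution change between the two base dates.
    df = (f2 - f1).ravel()
    dc = (c2 - c1).ravel()
    v = np.dot(df, dc) / (np.dot(dc, dc) + eps)
    # Predict from each base date using the coarse-resolution change.
    pred1 = f1 + v * (cp - c1)
    pred2 = f2 + v * (cp - c2)
    # Temporal weights: favor the base date whose coarse image is
    # closer to the prediction-date coarse image.
    w1 = 1.0 / (np.abs(c1 - cp).mean() + eps)
    w2 = 1.0 / (np.abs(c2 - cp).mean() + eps)
    return (w1 * pred1 + w2 * pred2) / (w1 + w2)
```

As a sanity check, if the fine and coarse images are identical at both base dates, the conversion coefficient is close to 1 and the prediction reduces to the coarse image at the prediction date.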

PUBLICATION RECORD

  • Publication year

    2022

  • Venue

    2022 International Conference on Electrical, Computer, Communications and Mechatronics Engineering (ICECCME)

  • Publication date

    2022-11-16

  • Fields of study

    Not labeled


  • Source metadata

    Semantic Scholar
