Due to their high temporal revisit rates, medium/low spatial resolution remote sensing images are the main data source for change detection (CD). Notably, they contain a large number of mixed pixels, which makes it challenging to adequately capture details in the resulting thematic map. Spectral unmixing (SU) is a potential solution to this problem, as it decomposes mixed pixels into a set of land cover fractions. However, errors accumulate in the fractional difference images, which leads to poor change detection results. Moreover, the spectral variability of the endmembers and the heterogeneity of the land cover materials cannot be fully accounted for in the traditional framework. To address these problems, a novel change detection approach with image stacking and dividing based on spectral unmixing while considering the variability of endmembers (CD_SDSUVE) is proposed in this paper. Firstly, the remote sensing images acquired at different times are stacked into a unified framework. After that, several patch images are produced by dividing the stacked image so that the similar endmembers of each land cover can be completely extracted and compared. Finally, multiple endmember spectral mixture analysis (MESMA) is performed, and the abundance images are combined to produce the entire change detection thematic map. The proposed algorithm was implemented and compared to four relevant state-of-the-art methods on three experimental datasets, and the results confirmed that it effectively improves accuracy. On the simulated data, the overall accuracy (OA) and Kappa coefficient were 99.61% and 0.99, respectively. On the two real datasets, the maximum OA values were 93.26% and 80.85%, up to 14.88% and 13.42% higher than the worst results. The Kappa coefficient values were consistent with the OA.
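The stacking and dividing steps described above can be sketched as follows. This is a minimal illustration only, not the authors' implementation: it assumes two co-registered image cubes stored as NumPy arrays of shape (height, width, bands), and the function name, patch size, and handling of image borders are hypothetical choices for demonstration.

```python
import numpy as np

def stack_and_divide(img_t1, img_t2, patch_size):
    """Stack two co-registered multi-temporal image cubes along the band
    axis, then split the stacked cube into non-overlapping spatial patches.

    A simplifying assumption: any border rows/columns that do not fill a
    complete patch are discarded.
    """
    # Stack along the spectral axis: result has B1 + B2 bands per pixel,
    # so endmembers from both dates are extracted within one framework.
    stacked = np.concatenate([img_t1, img_t2], axis=2)

    height, width, _ = stacked.shape
    patches = []
    # Divide the stacked cube into patch_size x patch_size spatial blocks;
    # unmixing (e.g., MESMA) would then be run per patch so that locally
    # similar endmembers can be extracted and compared.
    for row in range(0, height - patch_size + 1, patch_size):
        for col in range(0, width - patch_size + 1, patch_size):
            patches.append(stacked[row:row + patch_size,
                                   col:col + patch_size, :])
    return stacked, patches

# Example: two 8x8 images with 4 bands each, divided into 4x4 patches.
t1 = np.random.rand(8, 8, 4)
t2 = np.random.rand(8, 8, 4)
stacked, patches = stack_and_divide(t1, t2, patch_size=4)
```

Under these assumptions, the stacked cube has 8 bands per pixel and the 8x8 scene yields four 4x4 patches, each carrying the spectra of both dates for per-patch unmixing.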
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.