Article

Multiscale Geometric Analysis Fusion-Based Unsupervised Change Detection in Remote Sensing Images via FLICM Model

Liangliang Li 1, Hongbing Ma 1,* and Zhenhong Jia 2
1 Department of Electronic Engineering, Tsinghua University, Beijing 100084, China
2 College of Information Science and Engineering, Xinjiang University, Urumqi 830046, China
* Author to whom correspondence should be addressed.
Submission received: 17 December 2021 / Revised: 25 January 2022 / Accepted: 26 January 2022 / Published: 18 February 2022
(This article belongs to the Special Issue Advances in Image Fusion)

Abstract

Remote sensing image change detection is widely used in land use monitoring and natural disaster detection. To improve the accuracy of change detection, a robust method based on nonsubsampled contourlet transform (NSCT) fusion and the fuzzy local information C-means (FLICM) clustering model is introduced in this paper. First, the log-ratio and mean-ratio operators are each used to generate a difference image (DI); then, the NSCT fusion model fuses the two difference images into one new DI. The fused DI not only reflects the real change trend but also suppresses the background. FLICM is then applied to the new DI to obtain the final change detection map. Four groups of homogeneous remote sensing images are selected for simulation experiments, and the experimental results demonstrate that the proposed homogeneous change detection method outperforms other state-of-the-art algorithms.

1. Introduction

Remote sensing images are used ever more widely in current research, with applications including image fusion [1,2,3,4,5,6], image classification [7,8,9,10,11], change detection [12,13,14,15,16,17], etc. In particular, remote sensing image change detection identifies the changed regions between images of the same scene acquired at two different times, and it plays a significant role in observing land use change, flood disasters, earthquakes, and fires.
Many remote sensing image change detection methods have been proposed, and they can be divided into two categories: supervised and unsupervised algorithms [18,19]. Because the classifier in a supervised change detection method usually needs to be trained with labeled data, which is time-consuming and costly to acquire, supervised methods are often impractical. In contrast, unsupervised methods do not need labeled reference images for training; since the multi-temporal remote sensing images we obtain generally lack reference images, this setting matches practical applications. Remote sensing image change detection mainly contains three steps: preprocessing (e.g., geometric registration or denoising); difference image generation; and analysis of the difference image to obtain the change detection map.
Thresholding-based, segmentation-based, and clustering-based methods are widely used in unsupervised change detection [15]. Among the thresholding-based methods, the Kittler–Illingworth minimum-error thresholding method [20], the Otsu method [21], and the likelihood ratio method [22] are representative. Gong et al. [23] introduced a synthetic aperture radar (SAR) image change detection method based on a neighborhood-based ratio (NR) operator and the generalization of Kittler and Illingworth thresholding (GKIT) model. Xu et al. [24] proposed a SAR image change detection method using a modified neighborhood-based operator and an iterative Otsu model. Geetha et al. [25] proposed multi-temporal SAR image change detection using a Laplacian pyramid and the Otsu model. Among the segmentation-based methods, Celik et al. [26] proposed a remote sensing image change detection method based on the undecimated discrete wavelet transform and the Chan–Vese segmentation model. Clustering-based methods are the most popular in image change detection; e.g., Celik [27] introduced a remote sensing image change detection method using principal component analysis and K-means clustering (PCAKM). Li et al. [28] proposed unsupervised SAR change detection using Gabor wavelets and fuzzy C-means clustering. Chen et al. [29] introduced the nonsubsampled contourlet transform–hidden Markov tree (NSCT-HMT) model and fuzzy local information C-means (FLICM) clustering into remote sensing image change detection. The aforementioned methods have made notable achievements in the field of remote sensing change detection.
In recent studies, deep learning methods have been successfully applied to remote sensing change detection. These methods include the principal component analysis network (PCANet) [30], a channel weighting-based deep cascade network [31], convolutional-wavelet neural networks [32], a multiscale capsule network [33], transferred deep learning [34], deep pyramid feature learning networks [35], an attention-based deeply supervised network [36], etc. Because deep learning methods are trained on labeled samples, the accuracy of their final change detection results is relatively high.
In this paper, we present a novel remote sensing image change detection method based on multiscale geometric analysis fusion and the FLICM model. Simulation experiments on four groups of remote sensing images verify the practicability and effectiveness of the proposed algorithm.

2. Methodology

This section introduces the proposed remote sensing image change detection method; we assume that the multi-temporal remote sensing images are registered. The main steps are: computing difference images (DIs) with the log-ratio (LR) and mean-ratio (MR) operators; generating the fused difference image by NSCT fusion; and computing the final change detection map with the FLICM model. The structure of the proposed algorithm is shown in Figure 1.
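As a reading aid, the overall pipeline can be summarized in a few lines of Python. This is a minimal sketch, assuming the helper functions (compute_lr, compute_mr, nsct_style_fuse, flicm) sketched in the following subsections; these names are ours, not from the paper.

```python
def change_detection(x, y):
    """x, y: co-registered multi-temporal images as 2-D float arrays."""
    di_lr = compute_lr(x, y)                   # log-ratio difference image, Eq. (1)
    di_mr = compute_mr(x, y)                   # mean-ratio difference image, Eq. (2)
    di_fused = nsct_style_fuse(di_lr, di_mr)   # multiscale fusion of the two DIs
    return flicm(di_fused, n_clusters=2)       # binary map: changed vs. unchanged
```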

2.1. Multiscale Geometric Analysis

Multiscale geometric analysis includes the ridgelet, curvelet, contourlet, and shearlet transforms, etc. [37]. These transforms have been widely used in image processing, such as image denoising and image fusion. The nonsubsampled contourlet transform (NSCT) is an improved version of the contourlet transform [38]: it is a shift-invariant, multiscale, and multidirectional transformation. NSCT is constructed from a nonsubsampled pyramid (NSP) and a nonsubsampled directional filter bank (NSDFB). First, the NSP decomposes the input image into high-pass and low-pass parts; then, the NSDFB decomposes the high-frequency sub-band into multiple directional sub-bands, and the low-frequency part is decomposed recursively in the same way. Liu et al. [39] introduced image fusion based on NSCT and a sparse representation model.
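For intuition, the pyramid stage can be approximated in a few lines. The sketch below is a one-level, à trous-style undecimated decomposition standing in for the NSP; it is an illustration only, as a full NSCT would additionally pass the high-pass band through the NSDFB, which is omitted here, and the Gaussian filter with sigma = 2 is our assumption.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def undecimated_pyramid(img, sigma=2.0):
    """One-level shift-invariant decomposition: a simplified stand-in for the NSP."""
    img = img.astype(np.float64)
    low = gaussian_filter(img, sigma)   # low-pass approximation (no downsampling)
    high = img - low                    # full-size high-pass residual
    return low, high
```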

2.2. Difference Image Generation

In remote sensing image change detection, difference image (DI) generation is an important step. Given registered and corrected remote sensing images X and Y, the difference images computed by the log-ratio operator (LR) [40] and mean-ratio operator (MR) [41] are defined as follows:
$$\mathrm{LR} = \left|\log\frac{Y}{X}\right| = \left|\log Y - \log X\right|, \qquad (1)$$
$$\mathrm{MR} = 1 - \min\left(\frac{\mu_1}{\mu_2}, \frac{\mu_2}{\mu_1}\right), \qquad (2)$$
where $\mu_1$ and $\mu_2$ denote the local mean values of the remote sensing images X and Y, respectively.
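A direct implementation of Equations (1) and (2) might look as follows; the 3 × 3 local mean window and the small epsilon guarding against log(0) and division by zero are our assumptions, not specified by the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter

EPS = 1e-10  # numerical guard, not part of the operator definitions

def compute_lr(x, y):
    """Log-ratio difference image, Eq. (1)."""
    return np.abs(np.log(y + EPS) - np.log(x + EPS))

def compute_mr(x, y, win=3):
    """Mean-ratio difference image, Eq. (2); mu1, mu2 are local means of X and Y."""
    mu1 = uniform_filter(x.astype(np.float64), win)
    mu2 = uniform_filter(y.astype(np.float64), win)
    return 1.0 - np.minimum(mu1 / (mu2 + EPS), mu2 / (mu1 + EPS))
```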
The background information generated by the log-ratio image is relatively flat, and the change area information reflected by the mean-ratio image is relatively consistent with the real change trend of the remote sensing image. Therefore, the log-ratio image and mean-ratio image can be integrated into one new difference image with complementary information. Compared with single difference image computed by the log-ratio or mean-ratio operator, the fused difference image can not only reflect the real change trend but also suppress the background.
In order to extract more useful information, we integrate the two difference images through the NSCT. The main steps of the NSCT-based fusion are as follows.
Step 1: The LR and MR images are each decomposed by NSCT into low-frequency (LF) and high-frequency (HF) components, denoted $DI_{LF}^{LR}$, $DI_{LF}^{MR}$ and $DI_{HF}^{LR}$, $DI_{HF}^{MR}$.
Step 2: Fuse the low- and high-frequency components using the average rule and Gaussian weighted local area energy rule, respectively.
$$DI_{LF}^{fuse} = \left(DI_{LF}^{LR} + DI_{LF}^{MR}\right)/2, \qquad (3)$$
$$DI_{HF}^{fuse}(i,j) = \begin{cases} DI_{HF}^{LR}(i,j), & E_{HF}^{LR}(i,j) \ge E_{HF}^{MR}(i,j) \\ DI_{HF}^{MR}(i,j), & E_{HF}^{LR}(i,j) < E_{HF}^{MR}(i,j) \end{cases} \qquad (4)$$
where $E_{HF}(i,j)$ denotes the Gaussian weighted local area energy coefficient, computed by
$$E_{HF}(i,j) = \sum_{h=-p}^{p}\sum_{t=-p}^{p} g(h,t)\left[DI_{HF}(i+h,\, j+t)\right]^2, \qquad (5)$$
where $g(h,t)$ is an element of the rotationally symmetric Gaussian low-pass filter $g$ of size $(2p+1)\times(2p+1)$ with standard deviation $\sigma = 1$.
Step 3: The fused difference image $DI^{final}$ is obtained by applying the inverse NSCT to the fused low-frequency $DI_{LF}^{fuse}$ and high-frequency $DI_{HF}^{fuse}$ components.
In this work, the NSCT decomposition level is one, yielding one low-frequency sub-band and two high-frequency sub-bands. This keeps the running time of the algorithm low while achieving a good fusion effect. Subsequently, the fused difference image is analyzed by the FLICM model.
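Under the one-level additive decomposition sketched in Section 2.1 (a simplification: the real NSCT has directional high-frequency sub-bands and its own inverse transform), the fusion rules of Equations (3)-(5) can be written as follows.

```python
import numpy as np
from scipy.ndimage import correlate

def gaussian_kernel(p=1, sigma=1.0):
    """(2p+1) x (2p+1) rotationally symmetric Gaussian low-pass filter g."""
    ax = np.arange(-p, p + 1, dtype=np.float64)
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return g / g.sum()

def local_energy(hf, p=1, sigma=1.0):
    """Gaussian-weighted local area energy E_HF, Eq. (5)."""
    return correlate(hf ** 2, gaussian_kernel(p, sigma), mode="nearest")

def nsct_style_fuse(di_lr, di_mr, sigma=2.0):
    """Averaging for the low-frequency parts (Eq. (3)) and maximum local
    energy selection for the high-frequency parts (Eq. (4))."""
    lf_lr, hf_lr = undecimated_pyramid(di_lr, sigma)
    lf_mr, hf_mr = undecimated_pyramid(di_mr, sigma)
    lf_fused = (lf_lr + lf_mr) / 2.0
    pick_lr = local_energy(hf_lr) >= local_energy(hf_mr)
    hf_fused = np.where(pick_lr, hf_lr, hf_mr)
    return lf_fused + hf_fused  # inverse of the additive decomposition
```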

2.3. FLICM Model

In the fuzzy local information C-means (FLICM) clustering model, the fuzzy factor $G_{ki}$ is defined as follows [42]:
$$G_{ki} = \sum_{j \in N_i,\, j \ne i} \frac{1}{d_{ij}+1}\left(1-u_{kj}\right)^m \left\|x_j - v_k\right\|^2, \qquad (6)$$
where the ith pixel is the center of the local window $N_i$, the jth pixel ranges over the neighboring pixels falling into the window around the ith pixel, and $d_{ij}$ is the spatial Euclidean distance between pixels i and j. $v_k$ is the prototype of the center of cluster k, and $u_{kj}$ is the fuzzy membership of gray value j with respect to the kth cluster. $\|x_j - v_k\|^2$ is the Euclidean distance between object $x_j$ and cluster center $v_k$.
According to the previously defined fuzzy factor $G_{ki}$, the objective function of the FLICM model is calculated by
$$J_m = \sum_{i=1}^{N}\sum_{k=1}^{c}\left[u_{ki}^m \left\|x_i - v_k\right\|^2 + G_{ki}\right], \qquad (7)$$
where $v_k$ and $u_{ki}$ have the same meaning as in Equation (6). N and c are the numbers of data items and clusters, respectively. $\|x_i - v_k\|^2$ is the Euclidean distance between object $x_i$ and cluster center $v_k$. The membership $u_{ki}$ and cluster center $v_k$ are updated as follows:
$$u_{ki} = \frac{1}{\sum_{j=1}^{c}\left(\dfrac{\|x_i - v_k\|^2 + G_{ki}}{\|x_i - v_j\|^2 + G_{ji}}\right)^{1/(m-1)}}, \qquad (8)$$
$$v_k = \frac{\sum_{i=1}^{N} u_{ki}^m x_i}{\sum_{i=1}^{N} u_{ki}^m}. \qquad (9)$$
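A compact sketch of the FLICM iteration over a 2-D image, directly following Equations (6), (8), and (9), is given below. Details not fixed by the paper are our assumptions: random membership initialization, a fixed iteration budget instead of a convergence test, and wrap-around borders (via np.roll) for brevity.

```python
import numpy as np

def flicm(img, n_clusters=2, m=2.0, n_iter=50, seed=0):
    """FLICM sketch for a 2-D image with a 3x3 neighborhood."""
    h, w = img.shape
    x = img.astype(np.float64)
    rng = np.random.default_rng(seed)
    u = rng.random((n_clusters, h, w))
    u /= u.sum(axis=0)                     # memberships sum to 1 per pixel

    # Eight neighbor offsets of the 3x3 window and their weights 1/(d_ij + 1).
    offsets = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0)]
    weights = [1.0 / (np.hypot(dy, dx) + 1.0) for dy, dx in offsets]

    for _ in range(n_iter):
        um = u ** m
        v = (um * x).sum(axis=(1, 2)) / um.sum(axis=(1, 2))  # centers, Eq. (9)
        dist2 = (x[None] - v[:, None, None]) ** 2            # ||x_i - v_k||^2

        # Fuzzy factor G_ki, Eq. (6): sum over neighbors j of
        # (1 - u_kj)^m * ||x_j - v_k||^2 / (d_ij + 1).
        a = (1.0 - u) ** m * dist2
        g = sum(wt * np.roll(a, (-dy, -dx), axis=(1, 2))
                for wt, (dy, dx) in zip(weights, offsets))

        d = dist2 + g                      # D_ki = ||x_i - v_k||^2 + G_ki
        u = 1.0 / ((d[:, None] / d[None, :]) ** (1.0 / (m - 1.0))).sum(axis=1)  # Eq. (8)

    labels = u.argmax(axis=0)              # hard assignment per pixel
    return labels == np.argmax(v)          # cluster with larger center = "changed"
```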

3. Experimental Results and Discussion

In this section, two groups of SAR images and two groups of optical images are used in the simulations. To evaluate the detection accuracy of the proposed algorithm more precisely, both subjective and objective evaluations are adopted. Several state-of-the-art change detection methods are compared: PCAKM [27], Gabor wavelet and two-level clustering (GaborTLC) [28], logarithmic mean-based thresholding (LMT) [43], PCANet [30], neighborhood-based ratio and extreme learning machine (NRELM) [44], neighborhood-based ratio and collaborative representation (NRCR) [45], and convolutional-wavelet neural networks (CWNN) [32]. Meanwhile, the false negatives (FN) [32], false positives (FP) [32], overall error (OE) [32], percentage correct classification (PCC) [32], kappa coefficient (KC) [32,46,47], and F1-score (F1) [18] are used as objective evaluation metrics. Figure 2, Figure 3, Figure 4 and Figure 5 show the remote sensing images used in the simulations.
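For reference, the six metrics can be computed from a binary change map and the ground truth as below; the kappa formula follows the standard chance-corrected definition, which we assume matches [32,46,47].

```python
import numpy as np

def change_metrics(pred, ref):
    """FN, FP, OE, PCC, KC, and F1 for a binary change map vs. the reference."""
    pred, ref = np.asarray(pred, bool), np.asarray(ref, bool)
    tp = np.sum(pred & ref)        # changed pixels correctly detected
    tn = np.sum(~pred & ~ref)      # unchanged pixels correctly rejected
    fp = np.sum(pred & ~ref)       # false alarms
    fn = np.sum(~pred & ref)       # missed changes
    n = pred.size
    pcc = (tp + tn) / n
    pe = ((tp + fp) * (tp + fn) + (tn + fn) * (tn + fp)) / n ** 2  # chance agreement
    return dict(FN=int(fn), FP=int(fp), OE=int(fp + fn),
                PCC=100 * pcc,
                KC=100 * (pcc - pe) / (1 - pe),
                F1=100 * 2 * tp / (2 * tp + fp + fn))
```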

3.1. Experimental Data

The first data set used in the experiment is the Ottawa data set, with a size of 290 × 350 pixels. The original images were acquired in May and August 1997, respectively, and are shown in Figure 2a,b. The corresponding ground-truth image is depicted in Figure 2c.
The second is the Wenchuan data set, with a size of 442 × 301 pixels, obtained by ESA/ASAR on 3 March 2008 and 16 June 2008, respectively, and shown in Figure 3a,b. The corresponding reference image is shown in Figure 3c.
The third is the Mexico data set of optical images, with a size of 512 × 512 pixels, captured in April 2000 and May 2005, respectively. The two original images and the reference image are depicted in Figure 4.
The fourth is the Yambulla data set, consisting of two optical images of 500 × 500 pixels (shown in Figure 5) acquired on 1 October 2015 and 6 February 2016 over the area of the Yambulla State Forest (Australia). More details of the data sets are given in Table 1.

3.2. Analysis of the Difference Image

In this subsection, we discuss the difference images generated by different methods and the change detection results produced by the FLICM model. Figure 6 shows the difference images computed by the log-ratio operator (LR), mean-ratio operator (MR), and NSCT fusion, respectively.
The performance of the difference images (DIs) computed by the LR, MR, and NSCT fusion models is evaluated with empirical receiver operating characteristic (ROC) curves (shown in Figure 7), which plot the true positive rate (TPR) against the false positive rate (FPR). Moreover, two quantitative criteria derived from the ROC curve are calculated: the area under the curve (AUC) [48] and the diagonal distance (Ddist) [48]; the corresponding values are shown in Table 2. For both metrics, the larger the value, the better the detection. From Table 2, we can see that the NSCT fusion model performs better than the LR and MR operators.
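The two curve-derived criteria can be computed as follows. The AUC is standard; for Ddist we assume the maximum Euclidean distance from the ROC curve to the point (FPR, TPR) = (1, 0), which equals sqrt(2) ≈ 1.414 for a perfect detector and is consistent with the values in Table 2, though [48] should be consulted for the exact definition.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

def roc_quality(di, ref):
    """AUC and diagonal distance (Ddist) of a difference image vs. the reference."""
    fpr, tpr, _ = roc_curve(np.asarray(ref, int).ravel(),
                            np.asarray(di, float).ravel())
    ddist = np.max(np.hypot(1.0 - fpr, tpr))  # assumed Ddist definition (see lead-in)
    return auc(fpr, tpr), ddist
```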
Figure 8 shows the change detection results of applying the FLICM model to the difference images on the Ottawa data set, and the corresponding metrics are shown in Table 3. Figure 8a has a high missed-alarm rate; in other words, its FN value is too large. Figure 8b has a high false detection rate, and its FP is large. Figure 8c is the best change detection result, with the highest PCC, KC, and F1 values; at the same time, it yields balanced FN and FP values and has the lowest OE value. This also shows that the fused difference image computed by the proposed method outperforms the single LR and MR images.

3.3. Experimental Comparison

The change detection results generated by the proposed algorithm and by the seven comparative approaches are depicted in Figure 9, Figure 10, Figure 11 and Figure 12 and in Table 4, Table 5, Table 6 and Table 7.
Figure 9 shows the change maps on the Ottawa data set. From the results, it can be seen that the LMT method performs worst and has the highest FN value. The PCAKM and GaborTLC methods have high missed detection, losing some detail information. The NRCR method has more false detections, exhibiting many isolated spots and the highest FP value. The PCANet and NRELM algorithms give similar performance, but both still have some missed detections with high FN values. The visual performance of the CWNN technique is better than that of the six previously mentioned algorithms, though it has some false detections with a high FP value. The proposed change detection model achieves the best performance compared to the other state-of-the-art approaches, and its change map is closest to the reference image. Table 4 gives the FN, FP, OE, PCC, KC, and F1 values of the different change detection algorithms on the Ottawa data set. The proposed method achieves the best OE, PCC, KC, and F1 values, consistent with the visual results of the experiment.
Figure 10 shows the change detection results on the Wenchuan data set, and the corresponding quantitative evaluation is given in Table 5. It can be observed that the PCAKM, GaborTLC, LMT, PCANet, NRELM, NRCR, and CWNN methods all suffer from high missed detection with high FN values; the CWNN model has the highest FN. Compared to the other approaches, the change detection result obtained by the proposed method is the best: balanced FN and FP values are produced, and it matches the reference image best. From Table 5, we can conclude that the FN, OE, PCC, KC, and F1 values achieved by the proposed technique are the best, while the best FP value is obtained by the CWNN model. KC is a comprehensive evaluation metric, and the KC value of the proposed method is 8.24%, 11.24%, 15.40%, 3.47%, 5.99%, 9.49%, and 16.71% higher than those of PCAKM, GaborTLC, LMT, PCANet, NRELM, NRCR, and CWNN, respectively.
Figure 11 and Table 6 give the results on the Mexico data set. It can be seen that the PCAKM, GaborTLC, LMT, and PCANet approaches have high missed detection with correspondingly high FN values; the FN value of GaborTLC is the highest. The NRELM, NRCR, and CWNN techniques perform better than the aforementioned four methods. The result generated by the proposed technique has the best visual quality among the state-of-the-art methods. From Table 6, the FN, OE, PCC, KC, and F1 values of our method are the best, while the best FP value is achieved by the GaborTLC method. The KC value of the proposed algorithm is 4.69%, 12.07%, 5.27%, 3.03%, 0.37%, 1.08%, and 2.57% higher than those of PCAKM, GaborTLC, LMT, PCANet, NRELM, NRCR, and CWNN, respectively. The qualitative and quantitative evaluations of this group of experiments are consistent.
Figure 12 depicts the change maps on the Yambulla data set. The GaborTLC, LMT, NRELM, and NRCR techniques suppress noise and reduce the false detection rate, but they suffer from high missed detection. The PCAKM, PCANet, and CWNN methods perform better, but their missed detection rates are still high. Compared with the other seven algorithms, the change map generated by our method is the best and has the lowest missed detection rate. From Table 7, the FN, OE, PCC, KC, and F1 values achieved by the proposed method are the best. The KC value of the proposed method is 2.58%, 10.61%, 6.54%, 5.17%, 14.07%, 11.28%, and 1.86% higher than those of PCAKM, GaborTLC, LMT, PCANet, NRELM, NRCR, and CWNN, respectively. The qualitative and quantitative evaluations of this group of data are consistent, which demonstrates the superiority of our algorithm.
To verify the effectiveness and superiority of the proposed algorithm more rigorously, we average the experimental results over the four groups of remote sensing images, as shown in Table 8. The distribution of each metric across the data sets and comparison algorithms is shown in Figure 13, with the average values given in the legend. From Table 8, the FN, OE, PCC, KC, and F1 scores of the proposed method are the best, which objectively demonstrates the effectiveness of the proposed algorithm.

4. Conclusions

In this paper, a novel remote sensing image change detection method based on NSCT fusion and the FLICM model has been proposed. The background information in the log-ratio image is relatively flat, while the change area information reflected by the mean-ratio image is relatively consistent with the real change trend of the remote sensing image. Therefore, the log-ratio and mean-ratio images can be integrated into one new difference image with complementary information. Based on this analysis, the difference images generated by the log-ratio and mean-ratio operators are fused with the NSCT model to obtain the fused difference image. Then, the FLICM model is used to generate the final change detection map. We carried out simulation experiments on four groups of remote sensing images; the results verify the effectiveness of our algorithm through qualitative and quantitative comparisons with other algorithms. Our method can be applied effectively to land cover, flood, earthquake, and forest fire monitoring. In our experiments, we only evaluated change detection on homogeneous remote sensing images. In future work, we will extend the proposed algorithm to change detection in heterogeneous remote sensing images.

Author Contributions

The experimental measurements and data collection were carried out by L.L. and H.M. The manuscript was written by L.L. with the assistance of H.M. and Z.J. All authors reviewed the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Shanghai Aerospace Science and Technology Innovation Fund under Grant No. SAST2019-048; the Cross-Media Intelligent Technology Project of Beijing National Research Center for Information Science and Technology (BNRist) under Grant No. BNR2019TD01022.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

NSCT      Nonsubsampled contourlet transform
FLICM     Fuzzy local information C-means clustering
DI        Difference image
SAR       Synthetic aperture radar
NR        Neighborhood-based ratio
GKIT      Generalization of Kittler and Illingworth thresholding
PCAKM     Principal component analysis and K-means clustering
NSCT-HMT  Nonsubsampled contourlet transform-hidden Markov tree
PCANet    Principal component analysis network
LR        Log-ratio
MR        Mean-ratio
NSP       Nonsubsampled pyramid
NSDFB     Nonsubsampled directional filter bank
LF        Low-frequency
HF        High-frequency
GaborTLC  Gabor wavelet and two-level clustering
LMT       Logarithmic mean-based thresholding
NRELM     Neighborhood-based ratio and extreme learning machine
NRCR      Neighborhood-based ratio and collaborative representation
CWNN      Convolutional-wavelet neural networks
FN        False negative
FP        False positive
OE        Overall error
PCC       Percentage correct classification
KC        Kappa coefficient
F1        F1-score
ROC       Receiver operating characteristics
TPR       True positive rate
FPR       False positive rate
AUC       Area under the curve
Ddist     Diagonal distance

References

1. Xu, H.; Ma, J.; Shao, Z. SDPNet: A deep network for pan-sharpening with enhanced information representation. IEEE Trans. Geosci. Remote Sens. 2021, 59, 4120–4134.
2. Zhang, H.; Ma, J. GTP-PNet: A residual learning network based on gradient transformation prior for pansharpening. ISPRS J. Photogramm. Remote Sens. 2021, 172, 223–239.
3. Xu, H.; Le, Z.; Huang, J.; Ma, J. A cross-direction and progressive network for pan-sharpening. Remote Sens. 2021, 13, 3045.
4. Ma, J.; Yu, W.; Chen, C. Pan-GAN: An unsupervised pan-sharpening method for remote sensing image fusion. Inf. Fusion 2020, 62, 110–120.
5. Tian, X.; Chen, Y.; Yang, C. A variational pansharpening method based on gradient sparse representation. IEEE Signal Process. Lett. 2020, 27, 1180–1184.
6. Liu, Y.; Chen, X.; Wang, Z. Deep learning for pixel-level image fusion: Recent advances and future prospects. Inf. Fusion 2018, 42, 158–173.
7. Li, H.; Zhang, Y.; Ma, Y. Pairwise elastic net representation-based classification for hyperspectral image classification. Entropy 2021, 23, 956.
8. Mei, X.; Pan, E.; Ma, Y. Spectral-spatial attention networks for hyperspectral image classification. Remote Sens. 2019, 11, 963.
9. Zhang, Y.; Ma, Y.; Dai, X. Locality-constrained sparse representation for hyperspectral image classification. Inf. Sci. 2021, 546, 858–870.
10. Jiang, J.; Ma, J.; Liu, X. Multilayer spectral-spatial graphs for label noisy robust hyperspectral image classification. IEEE Trans. Neural Netw. Learn. Syst. 2020, 99, 1–14.
11. Jiang, J.; Ma, J.; Wang, Z. Hyperspectral image classification in the presence of noisy labels. IEEE Trans. Geosci. Remote Sens. 2019, 57, 851–865.
12. Ghaderpour, E.; Vujadinovic, T. Change detection within remotely sensed satellite image time series via spectral analysis. Remote Sens. 2020, 12, 4001.
13. Panuju, D.; Paull, D.; Griffin, A. Change detection techniques based on multispectral images for investigating land cover dynamics. Remote Sens. 2020, 12, 1781.
14. Li, H.; Yang, G.; Yang, W. Deep nonsmooth nonnegative matrix factorization network with semi-supervised learning for SAR image change detection. ISPRS J. Photogramm. Remote Sens. 2020, 160, 167–179.
15. Yang, G.; Li, H.; Wang, W. Unsupervised change detection based on a unified framework for weighted collaborative representation with RDDL and fuzzy clustering. IEEE Trans. Geosci. Remote Sens. 2019, 57, 8890–8903.
16. Shao, P.; Shi, W.; Liu, Z. Unsupervised change detection using fuzzy topology-based majority voting. Remote Sens. 2021, 13, 3171.
17. Xu, Q.; Chen, K.; Zhou, G. Change capsule network for optical remote sensing image change detection. Remote Sens. 2021, 13, 2646.
18. Xu, J.; Luo, C.; Chen, X. Remote sensing change detection based on multidirectional adaptive feature fusion and perceptual similarity. Remote Sens. 2021, 13, 3053.
19. He, Y.; Jia, Z.; Yang, J. Multispectral image change detection based on single-band slow feature analysis. Remote Sens. 2021, 13, 2969.
20. Moser, G.; Serpico, S. Generalized minimum-error thresholding for unsupervised change detection from SAR amplitude imagery. IEEE Trans. Geosci. Remote Sens. 2006, 44, 2972–2982.
21. Huo, J.; Mu, L. Fast change detection method for remote sensing image based on method of connected area labeling and spectral clustering algorithm. J. Appl. Remote Sens. 2021, 15, 016506.
22. Xiong, B.; Chen, J.; Kuang, G. A change detection measure based on a likelihood ratio and statistical properties of SAR intensity images. Remote Sens. Lett. 2012, 3, 267–275.
23. Gong, M.; Yu, C.; Wu, Q. A neighborhood-based ratio approach for change detection in SAR images. IEEE Geosci. Remote Sens. Lett. 2012, 9, 307–311.
24. Xu, S.; Liao, Y.; Yan, X. Change detection in SAR images based on iterative Otsu. Eur. J. Remote Sens. 2020, 53, 331–339.
25. Geetha, R.; Kalaivani, S. Laplacian pyramid-based change detection in multitemporal SAR images. Eur. J. Remote Sens. 2019, 52, 463–483.
26. Celik, T.; Ma, K. Multitemporal image change detection using undecimated discrete wavelet transform and active contours. IEEE Trans. Geosci. Remote Sens. 2011, 49, 706–716.
27. Celik, T. Unsupervised change detection in satellite images using principal component analysis and k-means clustering. IEEE Geosci. Remote Sens. Lett. 2009, 6, 772–776.
28. Li, H.; Celik, T.; Longbotham, N. Gabor feature based unsupervised change detection of multitemporal SAR images based on two-level clustering. IEEE Geosci. Remote Sens. Lett. 2015, 12, 2458–2462.
29. Chen, P.; Zhang, Y.; Jia, Z. Remote sensing image change detection based on NSCT-HMT model and its application. Sensors 2017, 17, 1295.
30. Gao, F.; Dong, J.; Li, B. Automatic change detection in synthetic aperture radar images based on PCANet. IEEE Geosci. Remote Sens. Lett. 2016, 13, 1792–1796.
31. Gao, Y.; Gao, F.; Dong, J. Change detection from synthetic aperture radar images based on channel weighting-based deep cascade network. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 4517–4529.
32. Gao, F.; Wang, X.; Gao, Y. Sea ice change detection in SAR images based on convolutional-wavelet neural networks. IEEE Geosci. Remote Sens. Lett. 2019, 16, 1240–1244.
33. Gao, Y.; Gao, F.; Dong, J. SAR image change detection based on multiscale capsule network. IEEE Geosci. Remote Sens. Lett. 2021, 18, 484–488.
34. Gao, Y.; Gao, F.; Dong, J. Transferred deep learning for sea ice change detection from synthetic-aperture radar images. IEEE Geosci. Remote Sens. Lett. 2019, 16, 1655–1659.
35. Yang, M.; Jiao, L.; Liu, F.; Hou, B.; Yang, S.; Jian, M. DPFL-Nets: Deep pyramid feature learning networks for multiscale change detection. IEEE Trans. Neural Netw. Learn. Syst. 2021, 1–15.
36. Wang, D.; Chen, X.; Jiang, M. ADS-Net: An attention-based deeply supervised network for remote sensing image change detection. Int. J. Appl. Earth Obs. Geoinf. 2021, 101, 102348.
37. Li, L.; Si, L.; Wang, L. A novel approach for multi-focus image fusion based on SF-PAPCNN and ISML in NSST domain. Multimed. Tools Appl. 2020, 79, 24303–24328.
38. Li, L.; Ma, H. Pulse coupled neural network-based multimodal medical image fusion via guided filtering and WSEML in NSCT domain. Entropy 2021, 23, 591.
39. Liu, Y.; Liu, S.; Wang, Z. A general framework for image fusion based on multi-scale transform and sparse representation. Inf. Fusion 2015, 24, 147–164.
40. Kalaiselvi, S.G. α-cut induced fuzzy deep neural network for change detection of SAR images. Appl. Soft Comput. 2020, 95, 106510.
41. Lou, X.; Jia, Z.; Yang, J. Change detection in SAR images based on the ROF model semi-implicit denoising method. Sensors 2019, 19, 1179.
42. Krinidis, S.; Chatzis, V. A robust fuzzy local information C-means clustering algorithm. IEEE Trans. Image Process. 2010, 19, 1328–1337.
43. Sumaiya, M.; Kumari, R. Logarithmic mean-based thresholding for SAR image change detection. IEEE Geosci. Remote Sens. Lett. 2016, 13, 1726–1728.
44. Gao, F.; Dong, J.; Li, B. Change detection from synthetic aperture radar images based on neighborhood-based ratio and extreme learning machine. J. Appl. Remote Sens. 2016, 10, 046019.
45. Gao, Y.; Gao, F.; Dong, J. Sea ice change detection in SAR images based on collaborative representation. In Proceedings of the IGARSS 2018—2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 7320–7323.
46. Wang, T.; Kazak, J.; Han, Q. A framework for path-dependent industrial land transition analysis using vector data. Eur. Plan. Stud. 2019, 27, 1391–1412.
47. Kaliraj, S.; Chandrasekar, N.; Ramachandran, K. Coastal landuse and land cover change and transformations of Kanyakumari coast, India using remote sensing and GIS. Egypt. J. Remote Sens. Space Sci. 2017, 20, 169–185.
48. Sun, Y.; Lei, L.; Li, X. Nonlocal patch similarity based heterogeneous remote sensing change detection. Pattern Recognit. 2021, 109, 107598.
Figure 1. The structure of the proposed remote sensing image change detection algorithm.
Figure 2. Ottawa data set. (a) Image acquired in May 1997; (b) image acquired in August 1997; (c) reference.
Figure 3. Wenchuan data set. (a) Image acquired on 3 March 2008; (b) image acquired on 16 June 2008; (c) reference.
Figure 4. Mexico data set. (a) Image acquired in April 2000; (b) image acquired in May 2005; (c) reference.
Figure 5. Yambulla data set. (a) Image acquired on 1 October 2015; (b) image acquired on 6 February 2016; (c) reference.
Figure 6. The difference images with different methods. (a) Log-ratio operator; (b) mean-ratio operator; (c) NSCT fusion.
Figure 7. The ROC curves of the DIs generated by different operators. (a) Ottawa; (b) Wenchuan; (c) Mexico; (d) Yambulla.
Figure 8. The change detection results with FLICM model. (a) LR_FLICM; (b) MR_FLICM; (c) NSCT_FLICM.
Figure 9. The results of different methods on Ottawa data set. (a) PCAKM; (b) GaborTLC; (c) LMT; (d) PCANet; (e) NRELM; (f) NRCR; (g) CWNN; (h) proposed method; (i) reference.
Figure 10. The results of different methods on Wenchuan data set. (a) PCAKM; (b) GaborTLC; (c) LMT; (d) PCANet; (e) NRELM; (f) NRCR; (g) CWNN; (h) proposed method; (i) reference.
Figure 11. The results of different methods on Mexico data set. (a) PCAKM; (b) GaborTLC; (c) LMT; (d) PCANet; (e) NRELM; (f) NRCR; (g) CWNN; (h) proposed method; (i) reference.
Figure 12. The results of different methods on Yambulla data set. (a) PCAKM; (b) GaborTLC; (c) LMT; (d) PCANet; (e) NRELM; (f) NRCR; (g) CWNN; (h) proposed method; (i) reference.
Figure 13. Objective performance of the methods on different data sets. (a) FN; (b) FP; (c) OE; (d) PCC; (e) KC; (f) F1.
Table 1. The description of the four data sets used in the experiment.

Scenario (Data Set)   Location              Dates                              Event        Size        Satellite    Sensor Type
1 (Ottawa)            Ottawa, Canada        May 1997 / August 1997             Flood        290 × 350   Radarsat-1   SAR
2 (Wenchuan)          Wenchuan, China       3 March 2008 / 16 June 2008        Earthquake   442 × 301   Radarsat-2   SAR
3 (Mexico)            Mexico                April 2000 / May 2005              Fire         512 × 512   Landsat-7    Optical
4 (Yambulla)          Yambulla, Australia   1 October 2015 / 6 February 2016   Bushfire     500 × 500   Landsat-8    Optical
Table 2. The quantitative criteria AUC and Ddist of different operators on the remote sensing image data sets.

Methods   Ottawa (AUC / Ddist)   Wenchuan (AUC / Ddist)   Mexico (AUC / Ddist)   Yambulla (AUC / Ddist)
LR        0.9573 / 1.2829        0.9618 / 1.2701          0.9877 / 1.3467        0.9954 / 1.3815
MR        0.9969 / 1.3828        0.9665 / 1.2953          0.9937 / 1.3689        0.9987 / 1.3980
NSCT      0.9980 / 1.3857        0.9729 / 1.3063          0.9938 / 1.3681        0.9990 / 1.3986
Table 3. The objective evaluations of change detection on Ottawa in Figure 8.

Method       FN     FP    OE     PCC (%)   KC (%)   F1 (%)
LR_FLICM     2588   224   2812   97.23     88.93    90.54
MR_FLICM     340    896   1236   98.78     95.49    96.21
NSCT_FLICM   658    366   1024   98.99     96.18    96.78
Table 4. The objective evaluations of change detection on Ottawa in Figure 9.

Method     FN     FP     OE     PCC (%)   KC (%)   F1 (%)
PCAKM      1901   582    2483   97.55     90.49    91.93
GaborTLC   2531   253    2784   97.26     89.07    90.66
LMT        5266   23     5289   94.79     77.43    80.31
PCANet     1011   839    1850   98.18     93.12    94.21
NRELM      1157   578    1735   98.29     93.48    94.50
NRCR       739    1900   2639   97.40     90.51    92.07
CWNN       399    1208   1607   98.42     94.17    95.12
Proposed   658    366    1024   98.99     96.18    96.78
Table 5. The objective evaluations of change detection on Wenchuan in Figure 10.

Method     FN     FP     OE      PCC (%)   KC (%)   F1 (%)
PCAKM      7111   939    8050    93.95     76.27    79.73
GaborTLC   8155   688    8843    93.35     73.27    76.98
LMT        9333   635    9968    92.51     69.11    73.19
PCANet     5284   1437   6721    94.95     81.04    84.01
NRELM      6492   873    7365    94.46     78.52    81.71
NRCR       7638   713    8351    93.72     75.02    78.56
CWNN       9720   578    10298   92.26     67.80    71.97
Proposed   3612   2117   5729    95.69     84.51    87.09
Table 6. The objective evaluations of change detection on Mexico in Figure 11.

Method     FN     FP     OE     PCC (%)   KC (%)   F1 (%)
PCAKM      5543   759    6302   97.60     85.11    86.42
GaborTLC   8515   296    8811   96.64     77.73    79.49
LMT        5855   640    6495   97.52     84.53    85.87
PCANet     4946   713    5659   97.84     86.77    87.95
NRELM      3702   943    4645   98.23     89.43    90.41
NRCR       3734   1252   4986   98.10     88.72    89.76
CWNN       4491   1053   5544   97.89     87.23    88.39
Proposed   3316   1223   4539   98.27     89.80    90.75
Table 7. The objective evaluations of change detection on Yambulla in Figure 12.

Method     FN     FP    OE     PCC (%)   KC (%)   F1 (%)
PCAKM      2956   116   3072   98.77     92.86    93.54
GaborTLC   6105   34    6139   97.54     84.83    86.15
LMT        4571   60    4631   98.15     88.90    89.91
PCANet     3979   134   4113   98.35     90.27    91.17
NRELM      7325   33    7358   97.06     81.37    82.93
NRCR       6348   31    6379   97.45     84.16    85.53
CWNN       2629   153   2782   98.89     93.58    94.20
Proposed   1782   227   2009   99.20     95.44    95.89
Table 8. The average objective evaluations of change detection on the four data sets.

Method     FN     FP    OE     PCC (%)   KC (%)   F1 (%)
PCAKM      4378   599   4977   96.97     86.18    87.91
GaborTLC   6327   318   6644   96.20     81.22    83.32
LMT        6256   340   6596   95.74     79.99    82.32
PCANet     3805   781   4586   97.33     87.80    89.33
NRELM      4669   607   5276   97.01     85.70    87.39
NRCR       4615   974   5589   96.67     84.60    86.48
CWNN       4310   748   5058   96.86     85.69    87.42
Proposed   2342   983   3325   98.04     91.48    92.63