Article

A Novel Coarse-to-Fine Scheme for Remote Sensing Image Registration Based on SIFT and Phase Correlation

1. Faculty of Electrical Engineering, Zhejiang University, No. 38, West Lake District, Hangzhou 310000, China
2. School of Computer Science and Technology, Hangzhou Dianzi University, No. 1 Street, Baiyang Street, Hangzhou Economic and Technological Development Zone, Hangzhou 310018, China
* Author to whom correspondence should be addressed.
Remote Sens. 2019, 11(15), 1833; https://0-doi-org.brum.beds.ac.uk/10.3390/rs11151833
Submission received: 26 June 2019 / Revised: 25 July 2019 / Accepted: 30 July 2019 / Published: 6 August 2019

Abstract: Automatic image registration is widely used in remote sensing applications. However, feature-based registration methods are sometimes inaccurate and unstable for images with large scale, grayscale, and texture differences. In this manuscript, a coarse-to-fine registration scheme is proposed that combines the advantages of feature-based registration and phase correlation-based registration. The scheme consists of four steps. First, a feature-based method is adopted for coarse registration, together with a geometrical outlier removal method that exploits the geometric similarities of inliers to improve the accuracy of the coarse registration. Then, the sensed image is modified using the coarse registration result under an affine deformation model. After that, the modified sensed image is registered to the reference image by extended phase correlation. Lastly, the final registration result is calculated by fusing the coarse registration and the fine registration. The proposed method preserves both the high universality of feature-based registration and the high accuracy of extended phase correlation-based registration. Experimental results on several different remote sensing images, drawn from several published image registration papers, demonstrate the high robustness and accuracy of the proposed method. The evaluation uses root mean square error (RMSE), Laplace mean square error (LMSE), and red–green image registration results.

Graphical Abstract

1. Introduction

Image registration is the process of geometrically aligning two or more images with overlapping scenes taken at different times, from different viewpoints, or by different sensors [1]. It has been widely applied in many fields, such as biomedical image analysis, remote sensing, computer vision, and pattern matching [2,3]. It is significant to study image registration in all these disciplines to develop robust, accurate, and computationally efficient algorithms to overlay the relevant images [4].
Traditionally, image registration methods fall into several families, including feature-based and frequency-domain approaches [5,6]. These techniques can meet the accuracy requirements for many remote sensing images. However, problems such as inefficiency, inaccuracy, and instability persist in the automatic registration of multi-sensor images [7]. Geometric and radiometric differences between images can easily decrease the accuracy of image registration. Therefore, further studies are required to overcome these difficulties and improve efficiency, accuracy, and robustness.
The main purpose of this paper is to present a coarse-to-fine image registration method for images with large resolution, geometric, and radiometric differences. The proposed method is divided into four steps. In Step 1, we coarsely match the reference image and the sensed image through the scale-invariant feature transform (SIFT) [8] and a geometrical outlier removal method (GSM) [9]. In this step, the coarse registration parameters are estimated through the classic least-squares method [10]. However, small registration errors remain due to the large resolution, geometric, and radiometric differences in many remote sensing images. The phase correlation method is effective and robust for the fine registration of such small errors [11,12]. In Step 2, we modify the sensed image using the coarse registration parameters, so that the modified sensed image and the reference image have the same size and scene, which makes them suitable for phase correlation. In Step 3, phase correlation combined with the log-polar transform is used to register the modified sensed image and the reference image. In Step 4, we combine the coarse and fine registration results under a transformation model and obtain the final registration result. The main contributions of this manuscript are as follows:
1. Registration accuracy and robustness are improved through the combination of a feature-based image registration method and a phase correlation-based image registration method.
2. Modifying the sensed image through the coarse registration parameters solves the problem of the strict requirements on the sizes of the reference image and the sensed image registered by phase correlation.
3. The proposed fusion of the coarse registration parameters and the fine registration parameters in Step 4 makes the final registration results more accurate.
The new method comprehensively utilizes feature-based and phase correlation-based methods to overcome the inaccuracy or instability of a single method; namely, the proposed method combines the robustness of the feature-based method with the accuracy of the phase correlation-based method. Experiments on several different remote sensing images, which come from several published image registration papers [13,14], demonstrate that the proposed coarse-to-fine method is highly accurate and robust for remote sensing images under different conditions.
The remainder of this manuscript is organized as follows. Section 2 introduces the related work of the proposed method. In Section 3, the proposed method is described in detail. Section 4 illustrates the experimental images, results and analysis of the proposed method and other popular image registration methods. In Section 5, experiments are discussed to verify the accuracy and robustness of the proposed method. Finally, conclusions are presented in Section 6.

2. Related Work

In this section, we introduce the background of the proposed method. Firstly, the scale-invariant feature transform (SIFT) [8] algorithm is described, because this classical algorithm forms the main part of the coarse registration. Then, we introduce recent studies on feature-based methods. Finally, different kinds of phase correlation-based methods are introduced.

2.1. Scale-Invariant Feature Transform (SIFT)

Scale-invariant feature transform (SIFT) was proposed by Lowe [8] in 2004. SIFT can be roughly divided into two main parts: feature detection and feature description. The purpose of feature detection is to find feature points that are robust to rotation, scale, and illumination changes. After the feature points are found, a descriptor is extracted for every feature point, and matching is achieved by calculating the distance between descriptors.
Feature detection includes two steps: extremum extraction in scale space and accurate localization of feature points. Firstly, an image pyramid is constructed by convolving the image with Gaussian kernels of different scales. Then, the difference of Gaussians (DoG) is used to approximate the Gaussian scale space. Extreme points detected in the DoG are taken as candidate feature points, and unstable extrema are removed through contrast and curvature tests.
Once the feature points are located, descriptors are designed according to the local gradient distribution around them. The main orientation is determined by the dominant gradient direction of the neighborhood around each feature point, which guarantees the rotation invariance of the descriptor.
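The detection stage described above can be illustrated with a short sketch. The following NumPy/SciPy snippet builds a difference-of-Gaussians stack; the parameter values (base scale 1.6, scale step 2^(1/3)) follow common SIFT conventions and are illustrative choices of ours, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_stack(img, sigma0=1.6, k=2 ** (1 / 3), levels=5):
    """Difference-of-Gaussians stack: D_i = G(sigma0*k^(i+1))*I - G(sigma0*k^i)*I."""
    img = np.asarray(img, dtype=float)
    blurred = [gaussian_filter(img, sigma0 * k ** i) for i in range(levels)]
    # adjacent Gaussian levels are subtracted to approximate the scale-space Laplacian
    return [b2 - b1 for b1, b2 in zip(blurred, blurred[1:])]
```

A blob-like structure produces a strong DoG response at its center, which is what the extremum search exploits; a full detector would additionally scan each DoG level for local extrema across space and scale.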

2.2. Feature-Based Image Registration Methods

Spatial-domain image registration methods fall into two main categories: intensity-based and feature-based methods [15]. Intensity-based methods operate directly on image intensity values, for example normalized cross-correlation (NCC) and mutual information (MI) [12]. Chen et al. proposed a novel similarity metric combining rotationally invariant regional mutual information and SIFT (RIRMI) [16], which considers both the original gray information and spatial information. Feature-based methods achieve image registration by detecting robust features in the images and establishing a mapping relationship, combined with a geometric transformation model. Many feature-based methods have been proposed, among which the scale-invariant feature transform [8] is a popular and efficient method for registering remote sensing images. Following SIFT, many feature-based image registration methods have been proposed by improving the standard SIFT approach. PCA-SIFT was proposed by Ke et al. to construct a new descriptor by reducing the 128-dimensional descriptor to 32 or 20 dimensions using principal component analysis (PCA) [17]. The smaller descriptor dimension reduces computation time, but also reduces the distinctiveness of the features. Gradient location and orientation histogram (GLOH) is another efficient feature descriptor, extending SIFT by changing the location grid and using PCA to reduce the descriptor size [18]. This method generates highly distinctive and robust descriptors, but increases the computational complexity. Speeded-up robust features (SURF) uses integral images for convolution, a Hessian matrix-based measure for the detector, and a distribution-based descriptor to achieve faster detection, description, and matching [19].
Binary robust invariant scalable keypoints (BRISK) introduces a novel scale-space FAST-based detector, combined with the assembly of a bit-string descriptor from intensity comparisons retrieved by dedicated sampling of each keypoint neighborhood [20]. Sedaghat et al. proposed uniform robust SIFT (UR-SIFT) and a cross-matching technique to register remote sensing images [13], based on which a modified uniform robust scale-invariant feature transform (URSIFT) was proposed to refine the initial matches through a bivariate histogram and random sample consensus [21]. Spatial orientation feature matching (SOFM) extracts similarity feature values by relying on the distinct path between two specific interest points and following the alternation of the signal while traversing the path. This method is robust to scale, translation, rotation, intensity noise, and occlusion because the similarity values are deformation invariant [22]. Kahaki proposed an image registration method based on dissimilarity values, which measures the dissimilarity of features along a path based on eigenvector properties and is efficient and robust to local variations of the image information [23]. Besides, contour-based methods are also useful feature-based methods; among them, the mean projection transform (MPT) is a robust corner classifier and detector formed by parabolic fit approximation [24].
The detectors of feature-based methods are constructed in a multi-scale Gaussian space, which makes them robust to scale transformations. The orientation of each descriptor is determined by the dominant gradient direction of the neighborhood around the feature point, so feature-based descriptors are also robust to rotation.

2.3. Phase Correlation-Based Image Registration Methods

Frequency-based methods match images based on a translation model or a similarity model using information and operations in the frequency domain [25]. These methods are highly efficient and robust to frequency-dependent noise and intensity contrast [11,12]. Phase correlation is a famous representative of frequency-based methods, first proposed by Kuglin [26] to estimate the translation between images. Then, Chen et al. [27] improved the initial phase correlation with Fourier–Mellin transforms, which can evaluate the rotation and translation between two images. In addition, the log-polar coordinate transformation was introduced by Reddy and Chatterji [28] to evaluate the scale, rotation, and translation between two images. Because the log-polar coordinate transformation of phase correlation is bound to introduce additional noise and reduce robustness, many studies have tried to avoid the log-polar coordinate transformation and the effects of noise, aiming at high accuracy and robustness of translation estimation. Hoge [29] proposed a subspace identification extension to the phase correlation method, combining the singular value decomposition (SVD) [30] with the initial phase correlation to filter noise. The SVD method is based on complex matrix operations and is robust to shifts of large magnitude between images, but is sometimes slow. The robust 2D fitting method [31] offers a simpler and faster solution through least-squares fitting. Tong et al. [32] integrated the advantages of Hoge's method and the random sample consensus (RANSAC) algorithm, improving the robustness to aliasing and noise, as well as the performance on practical remote sensing images. Dong et al. [33] used low-rank matrix factorization with a mixture of Gaussians to filter noise in the normalized cross-correlation matrix Q, leading to high accuracy and robustness to aliasing, noise, and gray differences.
In summary, phase correlation-based image registration is highly accurate for the translation of images, but does not work well under large scale and rotation differences.

3. Materials and Methods

In this section, the proposed method is described in detail. Firstly, the registration deformation model used in the proposed method is given with expressions. In addition, we explain how the coarse registration result and the fine registration result are fused. After that, every step of the entire algorithm flow is described in detail.

3.1. Algorithm Description

The proposed coarse-to-fine image registration method aims to improve the accuracy and robustness of image registration. As previously mentioned, the proposed method consists of four steps: the coarse registration, modifying the sensed image, the fine registration and fusion of the two registration results, as depicted in Figure 1.
1. In the coarse registration step, feature points of the reference image and the sensed image are detected and described by the scale-invariant feature transform (SIFT) algorithm. Mismatches are bound to decrease the accuracy of the registration result, so we adopt the novel outlier removal method to remove outliers, improving the accuracy of the coarse registration.
2. Due to the strict requirements on the sizes of the reference image and the sensed image registered by phase correlation, the fine registration cannot work directly. In this case, it is essential to modify the sensed image. From Step 1, we obtain the coarse registration result; therefore, we can modify the sensed image through the coarse transformation parameters.
3. The border of an image is seen as a sharp discontinuity because of the periodicity of the Fourier transform, resulting in spurious high frequencies in the frequency domain [29]. A flat-top function, a 2D improved window function [34], is used to weaken the interference of the border. Frequency masking is also used to filter out the frequencies contaminated by aliasing [35]. Then, the transformation parameters are calculated by the phase correlation introduced previously.
4. The final transformation parameters are evaluated by combining the coarse registration and the fine registration results, using the fusion model described in Section 3.5.

3.2. The Adopted Deformation Model

The affine deformation model is used to describe the relationship between the reference image and the sensed image:
$$\begin{pmatrix} x_{ref} \\ y_{ref} \end{pmatrix} = \begin{pmatrix} a\cos\theta & -a\sin\theta \\ a\sin\theta & a\cos\theta \end{pmatrix} \begin{pmatrix} x_{sen} \\ y_{sen} \end{pmatrix} + \begin{pmatrix} t_x \\ t_y \end{pmatrix} \qquad (1)$$
where $(x_{ref}, y_{ref})$ are the coordinates of a feature point extracted by scale-invariant feature transform (SIFT) and outlier removal in the reference image, and $(x_{sen}, y_{sen})$ are the corresponding coordinates in the sensed image. $a$ and $\theta$ are the scale factor and rotation angle, respectively, and $t_x$ and $t_y$ are the translations in the horizontal and vertical directions.
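The parameters of Equation (1) can be recovered from matched point pairs with the classic least-squares method mentioned in Section 1. The sketch below (NumPy; the function name is ours) solves linearly for $(a\cos\theta, a\sin\theta, t_x, t_y)$ and then recovers $a$ and $\theta$:

```python
import numpy as np

def estimate_similarity(pts_sen, pts_ref):
    """Least-squares fit of the model in Equation (1):
    solve for u = [a*cos(theta), a*sin(theta), t_x, t_y]."""
    pts_sen = np.asarray(pts_sen, dtype=float)
    pts_ref = np.asarray(pts_ref, dtype=float)
    n = len(pts_sen)
    A = np.zeros((2 * n, 4))
    # x_ref = a*cos(theta)*x - a*sin(theta)*y + t_x
    A[0::2] = np.c_[pts_sen[:, 0], -pts_sen[:, 1], np.ones(n), np.zeros(n)]
    # y_ref = a*sin(theta)*x + a*cos(theta)*y + t_y
    A[1::2] = np.c_[pts_sen[:, 1], pts_sen[:, 0], np.zeros(n), np.ones(n)]
    b = pts_ref.reshape(-1)
    u, *_ = np.linalg.lstsq(A, b, rcond=None)
    a = np.hypot(u[0], u[1])          # scale
    theta = np.arctan2(u[1], u[0])    # rotation angle
    return a, theta, u[2], u[3]
```

With two or more non-degenerate correspondences the system is (over)determined, and the least-squares solution averages out small matching noise.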

3.3. Outlier Removal

The reliability of feature matching is significant for the accuracy of image registration. Many outlier removal methods have been proposed utilizing different properties of feature points [9,36,37,38]. Among these methods, the novel geometrical outlier removal method (GSM) [9] is robust to scale transformation because it exploits the scale factor between the images. Once the scale transformation is obtained accurately in the coarse registration, phase correlation can calculate the remaining deformation easily. The specific process of the novel outlier removal method is as follows:
Suppose $S^{ref} = \{s_k^{ref}\}_{k=1}^{N} = \{(x_k^{ref}, y_k^{ref})\}_{k=1}^{N}$ is the initial point set in the reference image, and $S^{sen} = \{s_k^{sen}\}_{k=1}^{N} = \{(x_k^{sen}, y_k^{sen})\}_{k=1}^{N}$ is the corresponding point set in the sensed image, where $i, j \in \{k \mid k = 1, 2, \dots, N\}$ and $N$ is the number of initial matching feature points. Then, $D$ is defined as the distance between two points:
$$D(s_i^{ref}, s_j^{ref}) = \left\| s_i^{ref} - s_j^{ref} \right\| = \sqrt{(x_i^{ref} - x_j^{ref})^2 + (y_i^{ref} - y_j^{ref})^2} \qquad (2)$$
Combining Equations (1) and (2) [9], we can obtain:
$$D(s_i^{ref}, s_j^{ref}) = a \left\| s_i^{sen} - s_j^{sen} \right\| = a\, D(s_i^{sen}, s_j^{sen}) \qquad (3)$$
Therefore, the scale factor is the ratio of $D(s_i^{ref}, s_j^{ref})$ to $D(s_i^{sen}, s_j^{sen})$. Since the scale factor of a pair of remote sensing images is fixed, we can construct an objective function to select inliers from the initial points:
$$S_{in} = \arg\min_{S_{in}} F(S_{in}; I, \lambda) \qquad (4)$$
The objective function F is defined as
$$F(S_{in}; I, \lambda) = \sum_{i \in S_{in}} \sum_{j \in S_{in}} \sum_{k \in S_{in}} \left| \frac{D(s_i^{ref}, s_j^{ref})}{D(s_i^{sen}, s_j^{sen})} - \frac{D(s_i^{ref}, s_k^{ref})}{D(s_i^{sen}, s_k^{sen})} \right| + \lambda \left( N - |S_{in}| \right) \qquad (5)$$
In the objective function, $S_{in}$ represents the unknown inlier set, $I$ represents the initial matching feature point set, and $\lambda$ is a parameter that controls the tradeoff between the inlier set and the outlier set.
Since the inlier set is unknown, suppose $|S_{in}| = \sum_{i=1}^{N} p_i$, where $p_i = 1$ if the $i$-th point belongs to the inlier set and $p_i = 0$ otherwise. Then, we can rewrite Equation (5) [9]:
$$F(S_{in}; I, \lambda) = \sum_{i=1}^{N} p_i \left( \sum_{j=1}^{N} \sum_{k=1}^{N} \left| \frac{D(s_i^{ref}, s_j^{ref})}{D(s_i^{sen}, s_j^{sen})} - \frac{D(s_i^{ref}, s_k^{ref})}{D(s_i^{sen}, s_k^{sen})} \right| \right) + \lambda \left( N - \sum_{i=1}^{N} p_i \right) \qquad (6)$$
$$F(S_{in}; I, \lambda) = \sum_{i=1}^{N} p_i (f_i - \lambda) + \lambda N \qquad (7)$$
$$f_i = \sum_{j=1}^{N} \sum_{k=1}^{N} \left| \frac{D(s_i^{ref}, s_j^{ref})}{D(s_i^{sen}, s_j^{sen})} - \frac{D(s_i^{ref}, s_k^{ref})}{D(s_i^{sen}, s_k^{sen})} \right| \qquad (8)$$
To minimize the objective function, we put every point whose cost value $f_i$ is smaller than $\lambda$ into the inlier set. The optimal solution of the objective function in Equation (5) is then determined by the following criterion, with $\lambda = N/100$ according to the work in [9]:
$$p_i = \begin{cases} 1, & f_i \le \lambda \\ 0, & f_i > \lambda \end{cases} \qquad i = 1, 2, \dots, N \qquad (9)$$
The optimal inlier set $S_{in}$ is:
$$S_{in} = \{ i \mid p_i = 1,\; i = 1, 2, \dots, N \} \qquad (10)$$
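The selection procedure above can be sketched in a few lines of NumPy. The function name is ours, and the sketch follows the cost $f_i$ and threshold $\lambda = N/100$ described in the text (with the degenerate self-distance excluded from each point's cost); it is an illustrative reading of the method, not the authors' code.

```python
import numpy as np

def gsm_select(pts_ref, pts_sen, lam=None):
    """Sketch of GSM inlier selection: cost f_i and criterion p_i."""
    pts_ref = np.asarray(pts_ref, dtype=float)
    pts_sen = np.asarray(pts_sen, dtype=float)
    n = len(pts_ref)
    lam = n / 100.0 if lam is None else lam            # lambda = N/100 following [9]
    # pairwise distances within each image
    d_ref = np.linalg.norm(pts_ref[:, None] - pts_ref[None, :], axis=-1)
    d_sen = np.linalg.norm(pts_sen[:, None] - pts_sen[None, :], axis=-1)
    # distance ratios D_ref / D_sen; for true inliers every ratio equals the scale a
    ratio = np.divide(d_ref, d_sen, out=np.zeros_like(d_ref), where=d_sen > 0)
    f = np.empty(n)
    for i in range(n):
        r = np.delete(ratio[i], i)                     # ratios of point i to all other points
        f[i] = np.abs(r[:, None] - r[None, :]).sum()   # cost f_i: spread of the ratios
    p = f <= lam                                       # inlier criterion
    return p, f
```

For a perfect match set every ratio equals the fixed scale factor, so all costs are zero; a corrupted correspondence disturbs its own ratios and receives a strictly positive cost.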

3.4. Phase Correlation with Log-Polar Coordinate Transformation

There still exist small scale and rotation transformations between the coarse-corrected image and the reference image. Therefore, phase correlation with the log-polar coordinate transformation is suitable for the fine registration, because it is robust to scale and rotation transformations [28]. Its principle is briefly introduced as follows.
Firstly, the initial phase correlation, which handles translation only, is described. Suppose $f_1$ and $f_2$ are two images that differ only by a shift $(x_0, y_0)$, i.e.,
$$f_2(x, y) = f_1(x + x_0, y + y_0) \qquad (11)$$
According to the Fourier shift property, the corresponding Fourier transforms $F_1$ and $F_2$ satisfy:
$$F_2(u, v) = F_1(u, v) \exp\{ i (u x_0 + v y_0) \} \qquad (12)$$
Then, we can obtain the normalized cross-power spectrum by:
$$Q(u, v) = \frac{F_2(u, v) F_1^{*}(u, v)}{\left| F_2(u, v) F_1^{*}(u, v) \right|} = \exp\{ i (u x_0 + v y_0) \} \qquad (13)$$
where $*$ denotes the complex conjugate. The magnitude of $Q(u, v)$ is normalized at all frequencies, so the phase correlation-based method is robust to gray variation [39].
Applying the inverse Fourier transform (FT) to Equation (13) [28], we can derive the following expression:
$$\mathcal{F}^{-1}(Q(u, v)) = \mathcal{F}^{-1}(\exp\{ i (u x_0 + v y_0) \}) = \delta(x + x_0, y + y_0) \qquad (14)$$
where $\mathcal{F}^{-1}$ denotes the inverse Fourier transform. We can then evaluate the pixel translation from the peak coordinates of Equation (14), and the sub-pixel translation by fitting the neighborhood around the peak coordinates.
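The translation-only case above can be sketched with NumPy's FFT. This minimal version (function name ours) recovers cyclic shifts, i.e., `np.roll` amounts; a practical implementation would additionally window the images and fit the peak neighborhood for sub-pixel accuracy, as discussed in Section 3.1.

```python
import numpy as np

def phase_correlation_shift(f1, f2):
    """Recover the cyclic shift between two images from the peak of the
    inverse FT of the normalized cross-power spectrum Q (Equation (13))."""
    F1 = np.fft.fft2(f1)
    F2 = np.fft.fft2(f2)
    Q = F2 * np.conj(F1)
    Q /= np.abs(Q) + 1e-12                   # keep phase only (magnitude normalized)
    corr = np.fft.ifft2(Q).real              # ideally a delta function
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # peaks past half the image size correspond to negative shifts (FFT wrap-around)
    return tuple(p if p < s // 2 else p - s for p, s in zip(peak, corr.shape))
```

Because the spectrum magnitude is normalized away, the correlation peak stays sharp even under global gray-level differences between the two images.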
If scale and rotation transformations also exist, the relationship between the sensed image and the reference image can be expressed as follows.
$$f_2(x, y) = f_1\left( a^{-1}(x \cos\theta_0 + y \sin\theta_0) + x_0,\; a^{-1}(-x \sin\theta_0 + y \cos\theta_0) + y_0 \right) \qquad (15)$$
where $a$ is the scale factor, $\theta_0$ is the rotation angle, and $(x_0, y_0)$ is the translation. Applying the FT to Equation (15) [28], the following expression can be derived:
$$F_2(u, v) = F_1\left( a^{-1}(u \cos\theta_0 + v \sin\theta_0),\; a^{-1}(-u \sin\theta_0 + v \cos\theta_0) \right) \exp\{ i (u x_0 + v y_0) \} \qquad (16)$$
To separate and evaluate $a$ and $\theta_0$, we consider the magnitudes $M_1$ and $M_2$ of $F_1$ and $F_2$ from Equation (16). Then, we can obtain:
$$M_2(u, v) = M_1\left( a^{-1}(u \cos\theta_0 + v \sin\theta_0),\; a^{-1}(-u \sin\theta_0 + v \cos\theta_0) \right) \qquad (17)$$
Transforming Equation (17) [28] into log-polar coordinates, the scale factor and rotation angle are converted into translations:
$$M_2(\log\rho, \theta) = M_1(\log\rho - \log a,\; \theta - \theta_0) \qquad (18)$$
In this way, $a$ and $\theta_0$ can be evaluated in the same way as the translation.
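The log-polar resampling that enables this can be sketched with SciPy's `map_coordinates` (function name and grid sizes are ours). After this mapping, a scaling of the spectrum becomes a shift along the $\log\rho$ axis and a rotation becomes a shift along the $\theta$ axis, so the translation-only phase correlation applies.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def to_log_polar(mag, n_rho=64, n_theta=64):
    """Resample a centred magnitude spectrum onto a (log rho, theta) grid."""
    h, w = mag.shape
    cy, cx = h / 2.0, w / 2.0
    # logarithmically spaced radii from 1 to the largest inscribed radius
    rho = np.exp(np.linspace(0.0, np.log(min(cy, cx)), n_rho))
    # theta over [0, pi): the magnitude spectrum of a real image is symmetric
    theta = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    r, t = np.meshgrid(rho, theta, indexing="ij")
    coords = np.array([cy + r * np.sin(t), cx + r * np.cos(t)])
    return map_coordinates(mag, coords, order=1)   # bilinear interpolation
```

A quick sanity check: a radially symmetric spectrum must map to rows that are (nearly) constant along the $\theta$ axis, since each row samples a single radius.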

3.5. Fusion of Two Registration Results

From the positions of the matched feature points, we can evaluate the affine transformation parameters of the coarse registration. From phase correlation with the log-polar coordinate transformation, the scale factor, rotation angle, and translation between the modified sensed image and the reference image can be evaluated. Then, the final affine transformation parameters can be derived as follows.
Suppose $(x_{ref}, y_{ref})$ are the coordinates in the reference image and $(x_{sen1}, y_{sen1})$ are the corresponding coordinates in the sensed image. The relationship between the reference image and the sensed image can be represented as:
$$\begin{pmatrix} x_{ref} \\ y_{ref} \end{pmatrix} = \begin{pmatrix} a_1 \cos\theta_1 & -a_1 \sin\theta_1 \\ a_1 \sin\theta_1 & a_1 \cos\theta_1 \end{pmatrix} \begin{pmatrix} x_{sen1} \\ y_{sen1} \end{pmatrix} + \begin{pmatrix} t_{x1} \\ t_{y1} \end{pmatrix} + \begin{pmatrix} error_{x1} \\ error_{y1} \end{pmatrix} \qquad (19)$$
where $(a_1, \theta_1, t_{x1}, t_{y1})$ are the affine transformation parameters evaluated by the coarse registration, and $(error_{x1}, error_{y1})$ is the coarse registration error. Supposing that $(x_{sen2}, y_{sen2})$ are the coordinates in the modified sensed image, we can express the relationship among the sensed image, the modified sensed image, and the reference image as:
$$\begin{pmatrix} x_{sen2} \\ y_{sen2} \end{pmatrix} = \begin{pmatrix} a_1 \cos\theta_1 & -a_1 \sin\theta_1 \\ a_1 \sin\theta_1 & a_1 \cos\theta_1 \end{pmatrix} \begin{pmatrix} x_{sen1} \\ y_{sen1} \end{pmatrix} + \begin{pmatrix} t_{x1} \\ t_{y1} \end{pmatrix} \qquad (20)$$
$$\begin{pmatrix} x_{ref} \\ y_{ref} \end{pmatrix} = \begin{pmatrix} a_2 \cos\theta_2 & -a_2 \sin\theta_2 \\ a_2 \sin\theta_2 & a_2 \cos\theta_2 \end{pmatrix} \begin{pmatrix} x_{sen2} \\ y_{sen2} \end{pmatrix} + \begin{pmatrix} t_{x2} \\ t_{y2} \end{pmatrix} + \begin{pmatrix} error_{x2} \\ error_{y2} \end{pmatrix} \qquad (21)$$
where $(a_2, \theta_2, t_{x2}, t_{y2})$ are the affine transformation parameters evaluated by the fine registration, and $(error_{x2}, error_{y2})$ is the fine registration error. Combining Equations (20) and (21), we can fuse the two registration results:
$$\begin{pmatrix} x_{ref} \\ y_{ref} \end{pmatrix} = a_1 a_2 \begin{pmatrix} \cos\theta_1 \cos\theta_2 - \sin\theta_1 \sin\theta_2 & -(\sin\theta_1 \cos\theta_2 + \cos\theta_1 \sin\theta_2) \\ \sin\theta_1 \cos\theta_2 + \cos\theta_1 \sin\theta_2 & \cos\theta_1 \cos\theta_2 - \sin\theta_1 \sin\theta_2 \end{pmatrix} \begin{pmatrix} x_{sen1} \\ y_{sen1} \end{pmatrix} + \begin{pmatrix} a_2 \cos\theta_2\, t_{x1} - a_2 \sin\theta_2\, t_{y1} + t_{x2} \\ a_2 \sin\theta_2\, t_{x1} + a_2 \cos\theta_2\, t_{y1} + t_{y2} \end{pmatrix} + \begin{pmatrix} error_{x2} \\ error_{y2} \end{pmatrix} \qquad (22)$$
As introduced previously, the advantage of the phase correlation-based method is its high accuracy; that is, $(error_{x2}, error_{y2})$ is much smaller than $(error_{x1}, error_{y1})$. From Equations (19) and (22), it is easy to see that the registration accuracy is improved by the fine registration.
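The fusion of Equation (22) is simply the composition of two similarity transforms: scales multiply, angles add, and the fine transform acts on the coarse translation. A minimal numeric sketch (function names ours):

```python
import numpy as np

def similarity(a, theta, t):
    """2x2 matrix and translation vector for a similarity transform."""
    c, s = np.cos(theta), np.sin(theta)
    return a * np.array([[c, -s], [s, c]]), np.asarray(t, dtype=float)

def fuse(a1, th1, t1, a2, th2, t2):
    """Compose coarse (a1, th1, t1) and fine (a2, th2, t2) parameters
    into the final transform, as in Equation (22)."""
    R2, tt2 = similarity(a2, th2, t2)
    a = a1 * a2                                  # scales multiply
    th = th1 + th2                               # rotation angles add
    t = R2 @ np.asarray(t1, dtype=float) + tt2   # fine transform applied to coarse translation
    return a, th, t
```

Applying the fused transform in one step must give the same result as applying the coarse transform and then the fine transform, which is easy to verify numerically.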

4. Experimental Results and Analyses

In this section, experiments on several different remote sensing images are described to evaluate the precision and robustness of the proposed method. A computer with a 2-GHz CPU (Intel Core 5) and 8 GB of RAM was used. To evaluate the performance of the proposed method, we compared it with four powerful registration methods: SIFT [8] with RANSAC (SIFT-RANSAC), SIFT with the GSM outlier removal strategy [9] (SIFT-GSM), RIRMI [16], and the combination of SIFT, RANSAC, and PC [28] (SIFT-RANSAC-PC). GSM stands for the novel outlier removal method [9] introduced in Section 3.3, and PC stands for the extended phase correlation introduced in Section 3.4. RIRMI is a similarity metric combining rotationally invariant regional mutual information and SIFT [16], which considers both original gray information and spatial information. Firstly, the dataset and evaluation criteria are introduced. Then, the experimental results (comparing the proposed method with the other methods) are presented. Finally, we analyze the results of the different methods.

4.1. Dataset

We selected six pairs of remote sensing images with differences in grayscale, resolution, and scene: one pair of simulated remote sensing images and five pairs of real remote sensing images. These images, which come from several published image registration papers [13,14], are displayed in Figure 2. Image Pair 1 includes two simulated hyper-spectral remote sensing images with a 20° rotation and four-times scaling, covering a city area in Cali, Colombia, and corresponds to Figure 2a,b. Image Pair 2 has a significant grayscale difference between the sensed image (band 3 from Landsat TM) and the reference image (band 1 from SPOT 4), mainly covering a mountain area in Hangzhou, and corresponds to Figure 2d,c. Image Pair 3 has a significant spatial resolution difference (about 1.875 times) between the sensed image (band 55 from EO-1) and the reference image (from Google Earth), covering a plain area in Suzhou, and corresponds to Figure 2f,e. Image Pair 4 has significant spatial resolution (about three times) and texture differences between the sensed image (band 3 from Landsat ETM+) and the reference image (band 1 from SPOT 5), covering a plain area in Halifax, and corresponds to Figure 2h,g. Image Pair 5 has significant noise and scale differences between the sensed image (SPOT 4) and the reference image (IRS-1C), covering a farmland area in Tehran, Iran, and corresponds to Figure 2j,i. Image Pair 6 has significant spatial resolution (about two times) and grayscale differences between the sensed image (IRS-1C) and the reference image (SPOT 4), covering a field and town area in Tehran, Iran. Table 1 lists the details of the image pairs.

4.2. Evaluation Criteria

To evaluate the accuracy and efficiency of the proposed method, we compared its image registration results with SIFT [8] with RANSAC, SIFT with the GSM outlier removal strategy [9], RIRMI [16], and the combination of SIFT and PC [28] without GSM, in terms of root mean square error (RMSE) and Laplace mean square error (LMSE) [40]. The coarse registrations were evaluated by SIFT with $d_{ratio} = 0.8$ for the remote sensing images. The threshold used by RANSAC to select inliers was 1 pixel, which proved the most reasonable value over many experiments.
The reliable reference geometric transformation parameters were calculated by manually selecting tie-points via ENVI and manual registration. Correct matches are feature points with a spatial distance of less than 1 pixel between the reference image and the sensed image after correction with the manual registration results. Here, RMSE and LMSE were used as measurements of the position error of image registration. The calculation formulas are as follows:
$$RMSE = \sqrt{\frac{1}{N} \sum_{i=1}^{N} \left[ (x_i^1 - x_i^2)^2 + (y_i^1 - y_i^2)^2 \right]} \qquad (23)$$
$$LMSE = \frac{\sum_{i=1}^{m} \sum_{j=1}^{n} \left[ \omega(x_{i,j}) - \omega(x'_{i,j}) \right]^2}{\sum_{i=1}^{m} \sum_{j=1}^{n} \omega(x_{i,j})^2} \qquad (24)$$
$$\omega(x_{i,j}) = x_{i+1,j} + x_{i-1,j} + x_{i,j+1} + x_{i,j-1} - 4 x_{i,j} \qquad (25)$$
In Equation (23), $N$ points $(x, y)$ are taken in the reference image; the points $(x^1, y^1)$ are obtained through the affine transformation model using the registration parameters calculated by the algorithm, and $(x^2, y^2)$ are obtained through the true affine transformation model as determined by manual selection. In Equations (24) and (25), $x_{i,j}$ is the grayscale value at point $(i, j)$ in the reference image, and $x'_{i,j}$ is the corresponding value in the registered image.
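Both criteria are straightforward to compute; the sketch below (NumPy, our function names) mirrors Equations (23)–(25), with $\omega$ implemented as the discrete Laplacian on the image interior.

```python
import numpy as np

def rmse(pts1, pts2):
    """Root mean square position error between matched point sets (Equation (23))."""
    d = np.asarray(pts1, dtype=float) - np.asarray(pts2, dtype=float)
    return np.sqrt((d ** 2).sum(axis=1).mean())

def laplacian(img):
    """omega(x_ij) = x_{i+1,j} + x_{i-1,j} + x_{i,j+1} + x_{i,j-1} - 4*x_{i,j} (Equation (25))."""
    img = np.asarray(img, dtype=float)
    w = np.zeros_like(img)
    w[1:-1, 1:-1] = (img[2:, 1:-1] + img[:-2, 1:-1]
                     + img[1:-1, 2:] + img[1:-1, :-2]
                     - 4 * img[1:-1, 1:-1])
    return w

def lmse(ref, reg):
    """Laplace mean square error between reference and registered images (Equation (24))."""
    wr, wg = laplacian(ref), laplacian(reg)
    return ((wr - wg) ** 2).sum() / (wr ** 2).sum()
```

A perfectly registered image gives LMSE = 0, and identical point sets give RMSE = 0; any residual misalignment increases both values.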

4.3. Experimental Results and Analysis

Due to the strict requirements of the phase correlation method, the sensed image must be modified using the coarse registration. The modified sensed images are shown in Figure 3. Comparing these images with the reference images shown in Figure 2, we can easily see that the overlapping part of the sensed image and the reference image is extracted in the modified sensed image. In addition, the modified sensed image is adjusted to the same size as the reference image. Therefore, the modification step is essential and efficient.
The ground truth and the deformation parameters evaluated by the proposed method, SIFT-RANSAC, SIFT-GSM, SIFT-RANSAC-PC, and RIRMI are summarized in Table 2 for Image Pairs 1–5. It is easy to see that the RMSE and LMSE of the proposed method are always smaller than those of the other four methods, regardless of the registration difficulty of the reference image and the sensed image, which strongly demonstrates the high accuracy and robustness of the proposed method. Besides, the RMSE of SIFT-RANSAC-PC is lower than that of SIFT-RANSAC, which shows that the combination of SIFT and phase correlation is a wise idea.
RANSAC is an outlier removal method that assumes the number of inliers in the initial point set is much larger than the number of outliers. When large differences between the reference image and the sensed image lead to more outliers in the initial point set extracted by SIFT, the coarse registration results of SIFT-RANSAC-PC will be poor. GSM uses geometric similarity to eliminate outliers and is therefore not affected by the number of outliers. Both SIFT-RANSAC-PC and the proposed method combine SIFT and PC, yet the RMSE and LMSE of the proposed method are smaller than those of SIFT-RANSAC-PC in Table 2.
Due to the incorporation of phase correlation-based registration, the computational cost of the proposed method is inevitably somewhat higher. From Table 2, the processing time of the proposed method is about 0.2 s longer than that of the SIFT-based methods, and much shorter than that of RIRMI, which shows that the proposed method spends only a little more time (about 0.2 s) to improve the accuracy and robustness. The added computational cost is very small, and the improvement is significant.
The image registration results of Image Pairs 1–6 registered by the proposed method are shown in Figure 4. Observing the roads, rivers, etc., in the image registration results, it can be seen visually that the registration error of the proposed method is small for these image pairs.
For Image Pair 4, the RMSE of the proposed method is about 2 pixels due to the large scale difference (a factor of three), but the registration result in Figure 4h is visually good. Local checkerboard mosaicked images of Image Pair 4 are displayed in Figure 5: there is no misalignment at the road junction for the proposed method, whereas a small misalignment appears there for the other four methods. This shows that the proposed method improves the registration of this image pair and is robust to scale difference.
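A checkerboard mosaic like the one in Figure 5 can be produced by interleaving tiles from the reference image and the warped sensed image; misalignment then shows up as broken edges at tile borders. The tile size below is an arbitrary choice:

```python
import numpy as np

def checkerboard_mosaic(ref, warped, tile=64):
    """Interleave square tiles from two registered grayscale images of equal
    shape; residual misregistration appears as discontinuities at tile edges."""
    h, w = ref.shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = ((yy // tile + xx // tile) % 2).astype(bool)
    out = ref.copy()
    out[mask] = warped[mask]
    return out
```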

5. Discussion

In the experiments, we selected six pairs of remote sensing images. The RMSE and LMSE of all five methods are low for Image Pairs 1, 2 and 6, because Image Pair 1 is simulated and the scale and grayscale differences of Image Pairs 2 and 6 are small. In these cases, the feature-based methods can register the reference and sensed images with small error, so the accuracy improvement of the proposed method is not very significant. For Image Pair 3, the textures of the reference and sensed images differ greatly, resulting in large RMSE and LMSE and poor precision for the feature-based methods; feature-based methods are thus not suitable for large texture differences. Phase correlation is insensitive to texture, which leads to the obvious accuracy improvement of the proposed method and SIFT-RANSAC-PC. The grayscale difference in Image Pair 4 is large, which hampers feature extraction and description, so the registration result of SIFT-RANSAC for this pair is poor; the significant improvement of the proposed method demonstrates that its fine registration overcomes the interference of large grayscale differences. The reference image of Image Pair 5 contains obvious noise, and the RMSE and LMSE of the proposed method are much lower than those of the other four methods, which shows the robustness of the proposed coarse-to-fine scheme. Experimental results under these different conditions confirm the accuracy and robustness of the proposed method.
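To illustrate why phase correlation is insensitive to texture and grayscale differences (it keeps only the phase of the cross-spectrum and discards magnitude), the following sketch estimates an integer translation from the peak of the normalized cross-power spectrum. It is a simplified stand-in, not the authors' extended method, which additionally recovers rotation and scale via a log-polar transform and refines to subpixel accuracy:

```python
import numpy as np

def phase_correlation(ref, mov):
    """Return the integer shift (dy, dx) such that rolling `mov` by (dy, dx)
    aligns it with `ref`, found via the normalized cross-power spectrum."""
    F1, F2 = np.fft.fft2(ref), np.fft.fft2(mov)
    R = F1 * np.conj(F2)
    R /= np.abs(R) + 1e-12          # keep phase only: robust to gain/texture
    corr = np.abs(np.fft.ifft2(R))  # near-delta peak at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    if dy > h // 2:                 # map peaks past midpoint to negative shifts
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

Because the magnitude spectrum is normalized away, a global grayscale change of either image leaves the peak location unchanged, which matches the behavior observed for Image Pairs 3 and 4.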

6. Conclusions

Many image registration methods based on features or phase correlation have been proposed in recent years. Feature-based methods handle scale and rotation differences well, but they are sensitive to grayscale differences, texture differences and noise, and their accuracy is clearly lower than that of phase correlation-based methods. We therefore propose a novel coarse-to-fine image registration method that combines SIFT and phase correlation, adding an efficient outlier removal strategy (GSM) after SIFT for higher accuracy. First, SIFT is adopted for coarse registration because it handles scale and rotation differences well; GSM is used to remove outliers, and the registration parameters are estimated under an affine deformation model. Then, the sensed image is modified using the coarse registration parameters to meet the requirements of the phase correlation-based method. After that, fine registration parameters are estimated through phase correlation with a log-polar coordinate transformation. Lastly, the final registration parameters are obtained by fusing the coarse and fine registration parameters.
Several remote sensing image pairs with different resolutions, grayscales, textures and scenes were used to test the efficiency and robustness of the proposed method, and its registration results were compared with those of several related image registration methods. RMSE, LMSE and the image registration results demonstrate its high accuracy and robustness. The proposed method also has shortcomings: when the scale difference or noise between the reference and sensed images is so severe that the coarse registration fails, the fine registration cannot work because the sensed image cannot be modified. In the future, we will investigate more general and robust feature-based registration methods to guarantee the coarse registration.

Author Contributions

All authors made significant contributions to this work. H.Y. designed the research and analyzed the results. X.L. assisted in the prepared work and validation work. L.Z. and S.C. provided advice for the preparation and revision of the manuscript.

Acknowledgments

This work was funded by the National Nature Science Foundation of China (No. 61671408), and the Joint Fund Project of Chinese Ministry of Education (No. 6141A02022350 and No. 6141A02022362), which is gratefully acknowledged.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Jiang, J.; Shi, X. A Robust Point-Matching Algorithm Based on Integrated Spatial Structure Constraint for Remote Sensing Image Registration. IEEE Geosci. Remote Sens. Lett. 2016, 13, 1716–1720. [Google Scholar] [CrossRef]
  2. Gholipour, A.; Kehtarnavaz, N.; Briggs, R.; Devous, M.; Gopinath, K. Brain Functional Localization: A Survey of Image Registration Techniques. IEEE Trans. Med Imaging 2007, 26, 427–451. [Google Scholar] [CrossRef] [PubMed]
  3. Zhao, M.; Wu, Y.; Pan, S.; Zhou, F.; An, B.; Kaup, A. Automatic Registration of Images with Inconsistent Content Through Line-Support Region Segmentation and Geometrical Outlier Removal. IEEE Trans. Image Process. 2018, 27, 2731–2746. [Google Scholar] [CrossRef] [PubMed]
  4. Murphy, J.M.; Le Moigne, J.; Harding, D.J. Automatic Image Registration of Multimodal Remotely Sensed Data with Global Shearlet Features. IEEE Trans. Geosci. Remote Sens. 2016, 54, 1685–1704. [Google Scholar] [CrossRef] [PubMed]
  5. Sharma, K.; Goyal, A. Classification based survey of image registration methods. In Proceedings of the 2013 Fourth International Conference on Computing, Communications and Networking Technologies (ICCCNT), Tiruchengode, India, 4–6 July 2013; pp. 1–7. [Google Scholar] [CrossRef]
  6. Yang, Z.; Yang, Y.; Yang, K.; Wei, Z. Non-Rigid Image Registration with Dynamic Gaussian Component Density and Space Curvature Preservation. IEEE Trans. Image Process. 2019, 28, 2584–2598. [Google Scholar] [CrossRef] [PubMed]
  7. Li, K.; Zhang, Y.; Zhang, Z.; Lai, G. A Coarse-to-Fine Registration Strategy for Multi-Sensor Images with Large Resolution Differences. Remote Sens. 2019, 11, 470. [Google Scholar] [CrossRef]
  8. Lowe, D.G. Distinctive Image Features from Scale-Invariant Keypoints. Int. J. Comput. Vis. 2004, 60, 91–110. [Google Scholar] [CrossRef]
  9. Yang, H.; Li, X.; Chen, S. A novel image registration method based on geometrical outlier removal. In Proceedings of the Algorithms, Technologies, and Applications for Multispectral and Hyperspectral Imagery XXV, International Society for Optics and Photonics, Baltimore, MD, USA, 14 May 2019; Volume 10986, p. 109861N. [Google Scholar]
  10. Gruen, A. Adaptive least squares correlation: A powerful image matching technique. S. Afr. J. Photogramm. Remote Sens. Cartogr. 1985, 14, 175–187. [Google Scholar]
  11. Brown, L.G. A Survey of Image Registration Techniques. ACM Comput. Surv. 1992, 24, 325–376. [Google Scholar] [CrossRef]
  12. Zitová, B.; Flusser, J. Image registration methods: A survey. Image Vis. Comput. 2003, 21, 977–1000. [Google Scholar] [CrossRef]
  13. Sedaghat, A.; Mokhtarzade, M.; Ebadi, H. Uniform Robust Scale-Invariant Feature Matching for Optical Remote Sensing Images. IEEE Trans. Geosci. Remote Sens. 2011, 49, 4516–4527. [Google Scholar] [CrossRef]
  14. Sedaghat, A.; Ebadi, H. Remote sensing image matching based on adaptive binning SIFT descriptor. IEEE Trans. Geosci. Remote Sens. 2015, 53, 5283–5293. [Google Scholar] [CrossRef]
  15. Gong, M.; Zhao, S.; Jiao, L.; Tian, D.; Wang, S. A Novel Coarse-to-Fine Scheme for Automatic Image Registration Based on SIFT and Mutual Information. IEEE Trans. Geosci. Remote Sens. 2014, 52, 4328–4338. [Google Scholar] [CrossRef]
  16. Chen, S.; Li, X.; Zhao, L.; Yang, H. Medium-low resolution multisource remote sensing image registration based on SIFT and robust regional mutual information. Int. J. Remote Sens. 2018, 39, 3215–3242. [Google Scholar] [CrossRef]
  17. Yan, K.; Sukthankar, R. PCA-SIFT: A more distinctive representation for local image descriptors. In Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, CVPR 2004, Washington, DC, USA, 27 June–2 July 2004; Volume 2, pp. 506–513. [Google Scholar] [CrossRef]
  18. Mikolajczyk, K.; Schmid, C. A performance evaluation of local descriptors. IEEE Trans. Pattern Anal. Mach. Intell. 2005, 27, 1615–1630. [Google Scholar] [CrossRef] [PubMed]
  19. Bay, H.; Ess, A.; Tuytelaars, T.; Gool, L.V. Speeded-Up Robust Features (SURF). Comput. Vis. Image Underst. 2008, 110, 346–359. [Google Scholar] [CrossRef]
  20. Leutenegger, S.; Chli, M.; Siegwart, R.Y. BRISK: Binary Robust invariant scalable keypoints. In Proceedings of the International Conference on Computer Vision, Barcelona, Spain, 6–13 November 2011. [Google Scholar]
  21. Paul, S.; Pati, U.C. Remote Sensing Optical Image Registration Using Modified Uniform Robust SIFT. IEEE Geosci. Remote Sens. Lett. 2016, 13, 1300–1304. [Google Scholar] [CrossRef]
  22. Kahaki, S.M.M.; Nordin, M.J.; Ashtari, A.H.; Zahra, S.J. Deformation invariant image matching based on dissimilarity of spatial features. Neurocomputing 2016, 175, 1009–1018. [Google Scholar] [CrossRef]
  23. Kahaki, S.M.M.; Nordin, M.J.; Ashtari, A.H.; Zahra, S.J. Invariant feature matching for image registration application based on new dissimilarity of spatial features. PLoS ONE 2016, 11, e0149710. [Google Scholar]
  24. Kahaki, S.; Nordin, M.; Ashtari, A. Contour-based corner detection and classification by using mean projection transform. Sensors 2014, 14, 4126–4143. [Google Scholar] [CrossRef]
  25. Ye, Z.; Tong, X.; Zheng, S.; Guo, C.; Gao, S.; Liu, S.; Xu, X.; Jin, Y.; Xie, H.; Liu, S.; et al. Illumination-Robust Subpixel Fourier-Based Image Correlation Methods Based on Phase Congruency. IEEE Trans. Geosci. Remote Sens. 2019, 57, 1995–2008. [Google Scholar] [CrossRef]
  26. Kuglin, C. The phase correlation image alignment method. In Proceedings of the International Conference on Cybernetics and Society, San Francisco, CA, USA, 23–25 September 1975; pp. 163–165. [Google Scholar]
  27. Chen, Q.S.; Defrise, M.; Deconinck, F. Symmetric phase-only matched filtering of Fourier-Mellin transforms for image registration and recognition. IEEE Trans. Pattern Anal. Mach. Intell. 1994, 16, 1156–1168. [Google Scholar] [CrossRef]
  28. Reddy, B.S.; Chatterji, B.N. An FFT-based technique for translation, rotation, and scale-invariant image registration. IEEE Trans. Image Process. 1996, 5, 1266–1271. [Google Scholar] [CrossRef] [PubMed]
  29. Hoge, W.S. A subspace identification extension to the phase correlation method [MRI application]. IEEE Trans. Med. Imaging 2003, 22, 277–280. [Google Scholar] [CrossRef] [PubMed]
  30. Horn, R.A.; Johnson, C.R.; Elsner, L. Topics in Matrix Analysis; Cambridge University Press: Cambridge, UK, 1994. [Google Scholar]
  31. Liu, J.G.; Yan, H. Robust phase correlation methods for sub-pixel feature matching. In Proceedings of the 1st Annual Conference on Systems Engineering for Autonomous Systems, Defence Technology Centre, Anaheim, CA, USA, 18–21 September 2006; p. A13. [Google Scholar]
  32. Tong, X.; Ye, Z.; Xu, Y.; Liu, S.; Li, L.; Xie, H.; Li, T. A novel subpixel phase correlation method using singular value decomposition and unified random sample consensus. IEEE Trans. Geosci. Remote Sens. 2015, 53, 4143–4156. [Google Scholar] [CrossRef]
  33. Dong, Y.; Long, T.; Jiao, W.; He, G.; Zhang, Z. A novel image registration method based on phase correlation using low-rank matrix factorization with mixture of Gaussian. IEEE Trans. Geosci. Remote Sens. 2017, 56, 446–460. [Google Scholar] [CrossRef]
  34. Ge, P.; Lan, C.; Wang, H. An improvement of image registration based on phase correlation. Opt.-Int. J. Light Electron Opt. 2014, 125, 6709–6712. [Google Scholar] [CrossRef]
  35. Leprince, S.; Barbot, S.; Ayoub, F.; Avouac, J.P. Automatic and precise orthorectification, coregistration, and subpixel correlation of satellite images, application to ground deformation measurements. IEEE Trans. Geosci. Remote Sens. 2007, 45, 1529–1558. [Google Scholar] [CrossRef]
  36. Ma, J.; Zhou, H.; Ji, Z.; Yuan, G.; Jiang, J.; Tian, J. Robust Feature Matching for Remote Sensing Image Registration via Locally Linear Transforming. IEEE Trans. Geosci. Remote Sens. 2015, 53, 6469–6481. [Google Scholar] [CrossRef]
  37. Ming, Z.; An, B.; Wu, Y.; Luong, H.V.; Kaup, A. RFVTM: A Recovery and Filtering Vertex Trichotomy Matching for Remote Sensing Image Registration. IEEE Trans. Geosci. Remote Sens. 2017, 55, 1–17. [Google Scholar]
  38. Ma, J.; Jiang, J.; Zhou, H.; Ji, Z.; Guo, X. Guided Locality Preserving Feature Matching for Remote Sensing Image Registration. IEEE Trans. Geosci. Remote Sens. 2018, 56, 4435–4447. [Google Scholar] [CrossRef]
  39. Wan, X.; Liu, J.G.; Yan, H. The illumination robustness of phase correlation for image alignment. IEEE Trans. Geosci. Remote Sens. 2015, 53, 5746–5759. [Google Scholar] [CrossRef]
  40. Kahaki, S.M.; Arshad, H.; Nordin, M.J.; Ismail, W. Geometric feature descriptor and dissimilarity-based registration of remotely sensed imagery. PLoS ONE 2018, 13, e0200676. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Overall workflow of the proposed method.
Figure 2. Six pairs of original images: (a,b) Image Pair 1; (c,d) Image Pair 2; (e,f) Image Pair 3; (g,h) Image Pair 4; (i,j) Image Pair 5; and (k,l) Image Pair 6.
Figure 3. Six modified sensed images: (a–f) the modified sensed images of Image Pairs 1–6, respectively.
Figure 4. Image registration results for Image Pairs 1–6, respectively: (a,c,e,g,i,k) the overlapped images before registration; and (b,d,f,h,j,l) the difference images after registration.
Figure 5. Checkerboard mosaicked images of Image Pair 4 for: (a) SIFT-RANSAC; (b) SIFT-GSM; (c) SIFT-RANSAC-PC; (d) RIRMI; and (e) the proposed method.
Table 1. Specifications of image data for experiments.
| Data Type | No. | Satellite | Spectral Mode | Image Size | Pixel Size (m/pixel) | Bits per Pixel | Acquisition Date |
|---|---|---|---|---|---|---|---|
| Simulated image | 1 |  | Hyperspectral-Band: 45 | 500 × 325 | 5 | 32 | 2005 |
|  |  |  | Hyperspectral-Band: 45 | 2324 × 1906 | 20 | 32 | 2005 |
| Real image | 2 | SPOT 4 | Multispectral-Band: 1 | 611 × 1235 | 30 | 8 | 2001 |
|  |  | Landsat TM | Multispectral-Band: 3 | 648 × 1230 | 30 | 8 | 2004 |
|  | 3 | EO-1 | Hyperspectral-Band: 55 | 239 × 256 | 7.5 | 16 | 2014 |
|  |  | Google |  | 543 × 508 | 4 | 16 | 2017 |
|  | 4 | SPOT-5 | Panchromatic | 1311 × 1215 | 10 | 8 | 2006 |
|  |  | Landsat ETM+ | Multispectral-Band: 3 | 440 × 410 | 30 | 8 | 1999 |
|  | 5 | IRS-1C | Panchromatic | 1346 × 1135 | 5 | 8 | 1998 |
|  |  | SPOT 4 | Panchromatic | 700 × 590 | 10 | 8 | 1996 |
|  | 6 | IRS-1C | Panchromatic | 1122 × 1032 | 5 | 6 | 1998 |
|  |  | SPOT 4 | Panchromatic | 568 × 522 | 10 | 8 | 1999 |
Table 2. Comparison of the deformation parameters evaluated by the proposed method and by four other methods against the ground truth.
| Image Pair | Method | a11 | a12 | a21 | a22 | tx | ty | RMSE | LMSE | Time (s) |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | Ground truth | 0.235 | 0.086 | −0.086 | 0.235 | −160.525 | 59.117 |  |  |  |
|  | SIFT-RANSAC | 0.235 | 0.086 | −0.086 | 0.235 | −160.755 | 58.299 | 0.615 | 0.0236 | 5.97 |
|  | SIFT-GSM | 0.235 | 0.086 | −0.086 | 0.235 | −160.0848 | 58.197 | 0.561 | 0.0466 | 6.03 |
|  | SIFT-RANSAC-PC | 0.235 | 0.086 | −0.086 | 0.235 | −160.077 | 58.444 | 0.313 | 0.0196 | 6.3 |
|  | RIRMI | 0.235 | 0.086 | −0.084 | 0.235 | −160.739 | 58.378 | 0.392 | 0.0361 | 32.11 |
|  | Proposed method | 0.235 | 0.086 | −0.086 | 0.235 | −160.199 | 59.3 | 0.261 | 0.0187 | 6.31 |
| 2 | Ground truth | 0.996 | −0.074 | 0.0758 | 0.998 | 52.27 | −42.875 |  |  |  |
|  | SIFT-RANSAC | 0.993 | −0.0786 | 0.0786 | 0.994 | 52.624 | −43.023 | 1.325 | 0.0077 | 6.25 |
|  | SIFT-GSM | 0.995 | −0.075 | 0.075 | 0.995 | 52.544 | −43.263 | 0.906 | 0.0059 | 6.32 |
|  | SIFT-RANSAC-PC | 0.993 | −0.0727 | 0.0785 | 0.996 | 52.271 | −42.613 | 0.713 | 0.0044 | 6.38 |
|  | RIRMI | 0.997 | −0.0735 | 0.0742 | 0.997 | 52.053 | −41.267 | 0.644 | 0.0035 | 33.25 |
|  | Proposed method | 0.995 | −0.0731 | 0.0775 | 0.997 | 52.252 | −42.977 | 0.317 | 0.0027 | 6.41 |
| 3 | Ground truth | 1.819 | −0.438 | 0.442 | 1.792 | 46.664 | 56.888 |  |  |  |
|  | SIFT-RANSAC | 1.811 | −0.442 | 0.434 | 1.783 | 47.352 | 55.334 | 4.601 | 0.3986 | 3.27 |
|  | SIFT-GSM | 1.812 | −0.444 | 0.442 | 1.774 | 47.864 | 56.455 | 4.372 | 0.3835 | 3.16 |
|  | SIFT-RANSAC-PC | 1.811 | −0.43 | 0.43 | 1.811 | 46.596 | 54.317 | 2.64 | 0.4199 | 3.41 |
|  | RIRMI | 1.814 | −0.431 | 0.44 | 1.77 | 45.69 | 55.889 | 3.48 | 0.3788 | 25.78 |
|  | Proposed method | 1.814 | −0.434 | 0.434 | 1.814 | 47.21 | 56.347 | 2.187 | 0.3762 | 3.33 |
| 4 | Ground truth | 2.999 | −0.0009 | 0.0084 | 3.014 | −6.559 | −6.778 |  |  |  |
|  | SIFT-RANSAC | 3 | −0.001 | −0.002 | 2.996 | −3.64 | 1.409 | 3.23 | 0.2004 | 5.77 |
|  | SIFT-GSM | 3 | 0.0003 | 0.001 | 2.997 | −3.848 | 0.218 | 2.959 | 0.1838 | 5.81 |
|  | SIFT-RANSAC-PC | 3 | 0.004 | −0.004 | 3 | −5.707 | 0.415 | 2.453 | 0.1784 | 5.92 |
|  | RIRMI | 3 | −0.003 | −0.004 | 3 | −3.412 | −0.356 | 2.56 | 0.1931 | 29.58 |
|  | Proposed method | 3 | 0.0001 | −0.0001 | 3 | −4.927 | −0.809 | 2.116 | 0.1686 | 5.97 |
| 5 | Ground truth | 1.998 | −0.104 | −0.015 | 2.004 | 18.428 | −13.749 |  |  |  |
|  | SIFT-RANSAC | 1.976 | 0.0517 | 0.0522 | 1.976 | 17.39 | −13.797 | 3.513 | 0.2657 | 6.31 |
|  | SIFT-GSM | 1.993 | −0.039 | −0.011 | 1.993 | 18.498 | −14.125 | 2.97 | 0.3276 | 6.38 |
|  | SIFT-RANSAC-PC | 1.979 | −0.0896 | −0.00215 | 2.017 | 18.191 | −13.996 | 2.6 | 0.3196 | 6.52 |
|  | RIRMI | 1.978 | 0.0898 | −0.0167 | 2.006 | 18.586 | −13.244 | 2.003 | 0.2577 | 30.17 |
|  | Proposed method | 1.993 | −0.108 | −0.012 | 2.004 | 18.735 | −13.184 | 1.217 | 0.2166 | 6.55 |
| 6 | Ground truth | 0.502 | 0.027 | 0.0015 | 0.5011 | −10.543 | 1.5112 |  |  |  |
|  | SIFT-RANSAC | 0.496 | 0.0036 | −0.0064 | 0.489 | −9.684 | 4.978 | 1.531 | 0.1699 | 5.87 |
|  | SIFT-GSM | 0.5132 | 0.0095 | 0.0036 | 0.5144 | −10.356 | 3.6516 | 1.245 | 0.2477 | 5.91 |
|  | SIFT-RANSAC-PC | 0.502 | 0.0254 | 0.00067 | 0.497 | −10.830 | 4.027 | 1.0574 | 0.2359 | 6.03 |
|  | RIRMI | 0.503 | 0.00063 | 0.00063 | 0.503 | −11.237 | 3.946 | 0.876 | 0.1277 | 28.66 |
|  | Proposed method | 0.501 | 0.0256 | 0.00064 | 0.501 | −10.585 | 2.302 | 0.376 | 0.1306 | 6.12 |
