Article

A Cost Function for the Uncertainty of Matching Point Distribution on Image Registration

Yuxia Bian, Meizhen Wang, Yongbin Chu, Zhihong Liu, Jun Chen, Zhiye Xia and Shuhong Fang
1 College of Resources and Environment, Chengdu University of Information Technology, Chengdu 610000, China
2 Ministry of Education Key Laboratory of Virtual Geographic Environment, Nanjing Normal University, Nanjing 210000, China
* Author to whom correspondence should be addressed.
ISPRS Int. J. Geo-Inf. 2021, 10(7), 438; https://0-doi-org.brum.beds.ac.uk/10.3390/ijgi10070438
Submission received: 29 April 2021 / Revised: 9 June 2021 / Accepted: 21 June 2021 / Published: 25 June 2021
(This article belongs to the Special Issue Deep Learning and Computer Vision for GeoInformation Sciences)

Abstract

Computing the homography matrix from known matching points is a key step in computer vision for image registration. In practice, the number, accuracy, and distribution of the known matching points affect the uncertainty of the homography matrix. This study focuses on the effect of matching point distribution on image registration. First, the horizontal dilution of precision (HDOP) is derived to measure the influence of the distribution of known points on the positioning accuracy of a fixed point on the image. A quantization function, the average of the HDOP* values of the center points of the overlapping region, is then constructed to measure the uncertainty of the matching point distribution. Finally, experiments in the field of image registration are performed to verify the proposed function: we test the consistency between the proposed function and the average of the symmetric transfer errors. The results show that the proposed function is appropriate for measuring the uncertainty of matching point distribution on image registration.

1. Introduction

Matching points are the direct data sources for the homography matrix, fundamental matrix, camera parameters, and 3D point clouds. Thus, their uncertainty has a direct influence on the quality of image registration, image mosaics, image fusion, and image-based reconstruction. Evaluating the uncertainty of matching points is therefore an essential research topic in geographic information science.
In the early literature, Brand [1] and Sankowski [2] used statistical methods to calculate the error between image points and ground points. Weng [3], Kanazawa [4], and Brooks [5] used a covariance matrix to describe the accuracy of feature points, and Haralick [6], Steele [7], and Di Leo [8] used the covariance propagation law to calculate the uncertainty of feature points. Matching points are formed from feature points of the same scene in two images. Fitzpatrick [9] pointed out that the most important error measure for matching points is the target registration error, that is, the distance after registration between corresponding points not used to calculate the registration transform. Fathy [10] also studied the accuracy of different error criteria for matching points. The above literature establishes the criteria for measuring the uncertainty of matching points.
The uncertainty of matching points depends on numerous factors [11], including their number, accuracy, and distribution. Some scholars have paid close attention to feature matching algorithms to improve the accuracy of matching points. For example, Gui [12] presented a point-pattern matching method using SURF and shape context, Tong [13] and Zhao [14] improved feature point detection and matching in a binocular vision system, and Hu [15] studied a robust image feature point matching algorithm based on structural distance. Moreover, Mai [16] investigated the impact of the selection error and distribution of fiducial points on the accuracy of matching between 3D images. Other scholars focused on the distribution of matching points. Fitzpatrick [9] studied the distribution of the target registration error in rigid-body point-based registration, and the triangle constraint [17,18,19,20,21] was found to be reliable for image matching. Tan [17,18], Guo [19], and Seo [22] indicated that a more precise result can be obtained with evenly distributed matching points. However, in the previous literature, quantitative methods for describing the distribution of matching points have not received adequate attention.
The accuracy of image registration depends significantly on the uncertainty of the extracted matching points. Our existing work [23] introduced the horizontal dilution of precision (HDOP) to measure the location error on the image and preliminarily described the construction of a measure for the uncertainty of matching point distribution in 3D reconstruction. The present work focuses on the detailed derivation showing that HDOP can measure the error of image points, describes the design process of the cost function ($\overline{HDOP^*}$), and uses the established criterion for the uncertainty of matching points (symmetric transfer errors) to test the validity of the proposed function in the field of image registration.
The remainder of this paper is organized as follows. Section 2 introduces the derivation of HDOP on the image and the design process of the cost function for the uncertainty of matching point distribution. Section 3 presents experiments that verify the rationality of the proposed function. Section 4 provides conclusions. Finally, Section 5 describes the related patent.

2. Methods

As shown in Figure 1, image 1 and image 2 are a pair of stereo images. First, a certain number of known points (dark spots) are extracted from the two images, and the homography matrix H is computed from these known points. Here, H represents the projective transformation between image 1 and image 2. Using H, we can find all the corresponding fixed points (red spots) in the overlapping region. When the corresponding fixed points are superimposed, image registration is achieved.
The above process shows that the quality of image registration is mainly determined by the accuracy, number, and distribution of the known matching points. This study focuses on measuring the uncertainty of the matching point distribution in the overlapping region of two images.
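To make the registration pipeline above concrete, the following is a minimal sketch (not the authors' code) of estimating H from known matching points and superimposing the warped images with OpenCV; the parameter choices (RANSAC, a 3-pixel reprojection threshold, a 50/50 blend) are illustrative assumptions.

```python
# Minimal registration sketch using OpenCV; assumes img1 and img2 have the
# same type and that at least 4 point correspondences are provided.
import cv2
import numpy as np

def register_pair(img1, img2, pts1, pts2):
    """pts1, pts2: (n, 2) arrays of corresponding pixel coordinates (n >= 4)."""
    # Homography mapping image 1 onto image 2, robust to a few bad matches.
    H, inlier_mask = cv2.findHomography(np.float32(pts1), np.float32(pts2),
                                        cv2.RANSAC, ransacReprojThreshold=3.0)
    h, w = img2.shape[:2]
    warped = cv2.warpPerspective(img1, H, (w, h))          # image 1 in image 2's frame
    overlay = cv2.addWeighted(warped, 0.5, img2, 0.5, 0.0) # superimpose for inspection
    return H, overlay
```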

2.1. Derivation of HDOP

Suppose that the true coordinate of the fixed point is (X, Y) and the coordinate of the i-th known point is (X_i, Y_i). The distance between the i-th known point and the fixed point is then

P_i(X, Y) = \sqrt{(X - X_i)^2 + (Y - Y_i)^2}    (1)
However, it is inevitable that a location error occurs between the measured and true coordinates of the fixed point. Suppose the measured coordinate of the fixed point is (\hat{X}, \hat{Y}). The distance between the i-th known point and the measured point then becomes

\hat{P}_i(\hat{X}, \hat{Y}) = \sqrt{(\hat{X} - X_i)^2 + (\hat{Y} - Y_i)^2}    (2)
Relative to the distance between the known point and the fixed point, the location error is small. Therefore, Formula (1) can be expanded to its first-order Taylor series terms at the measured location (\hat{X}, \hat{Y}):

P_i = \hat{P}_i(\hat{X}, \hat{Y}) + \frac{\partial P_i}{\partial X}\Big|_{(\hat{X}, \hat{Y})} (X - \hat{X}) + \frac{\partial P_i}{\partial Y}\Big|_{(\hat{X}, \hat{Y})} (Y - \hat{Y})    (3)

where

\frac{\partial P_i}{\partial X}\Big|_{(\hat{X}, \hat{Y})} = \frac{X_i - \hat{X}}{\sqrt{(\hat{X} - X_i)^2 + (\hat{Y} - Y_i)^2}}, \qquad \frac{\partial P_i}{\partial Y}\Big|_{(\hat{X}, \hat{Y})} = \frac{Y_i - \hat{Y}}{\sqrt{(\hat{X} - X_i)^2 + (\hat{Y} - Y_i)^2}}
Suppose that the distance errors between the fixed point and the known points are dP = (dP_1, dP_2, \ldots, dP_i, \ldots, dP_n)^T, where dP_i = P_i - \hat{P}_i, and that the location error is dL = (dX, dY)^T, where dX = X - \hat{X} and dY = Y - \hat{Y}. Formula (3) can then be rewritten as

dP = \begin{pmatrix} dP_1 \\ dP_2 \\ \vdots \\ dP_n \end{pmatrix} = \begin{pmatrix} P_1 - \hat{P}_1 \\ P_2 - \hat{P}_2 \\ \vdots \\ P_n - \hat{P}_n \end{pmatrix} = \begin{pmatrix} \partial P_1/\partial X & \partial P_1/\partial Y \\ \partial P_2/\partial X & \partial P_2/\partial Y \\ \vdots & \vdots \\ \partial P_n/\partial X & \partial P_n/\partial Y \end{pmatrix}_{(\hat{X}, \hat{Y})} \begin{pmatrix} X - \hat{X} \\ Y - \hat{Y} \end{pmatrix} = A\, dL    (4)
where

A = \begin{pmatrix} \partial P_1/\partial X & \partial P_1/\partial Y \\ \partial P_2/\partial X & \partial P_2/\partial Y \\ \vdots & \vdots \\ \partial P_n/\partial X & \partial P_n/\partial Y \end{pmatrix}_{n \times 2}, \quad \text{evaluated at } (\hat{X}, \hat{Y})    (5)
If more than two known points are given, the system dP = A\, dL derived from Formula (4) is over-determined, and the location error dL can be estimated by the least squares method:

dL = (A^T A)^{-1} A^T dP    (6)
According to the definition of covariance, we know that

\mathrm{cov}(dL) = E[dL\, dL^T] = (A^T A)^{-1} A^T E[dP\, dP^T] A (A^T A)^{-1}    (7)
Assume that the distance error \sigma_P has the same value in the X and Y directions for the same stereo pair; then

E[dP\, dP^T] = I \sigma_P^2    (8)
Moreover,

\mathrm{cov}(dL) = (A^T A)^{-1} A^T I \sigma_P^2 A (A^T A)^{-1} = \sigma_P^2 (A^T A)^{-1}    (9)
Here, \mathrm{cov}(dL) reflects the relationship between the location error and the distance error. When the distance error \sigma_P is fixed, the location error depends only on (A^T A)^{-1}. Hence, \mathrm{Tr}((A^T A)^{-1}) is used to measure the location error on the image, where \mathrm{Tr}(\cdot) denotes the trace, i.e., the sum of the diagonal elements of the matrix [24].
In the field of satellite navigation and geomatics engineering, the dilution of precision (DOP) represents the influence of the relative geometric relationship between the user and the positioning constellation on the location error. HDOP, a type of DOP, expresses the precision of the plane (horizontal) position based on latitude and longitude coordinates. The structure of HDOP is consistent with the location error derived above; thus, HDOP can be used to describe the location error on the image:
HDOP = \sqrt{\mathrm{Tr}\left((A^T A)^{-1}\right)}    (10)
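As an illustration of Formulas (1)–(10), the following sketch (ours, following the derivation above) builds the n × 2 design matrix A from the unit direction components between a fixed point and the known points, and returns HDOP as the square root of the trace of (A^T A)^{-1}; it assumes the known points are not degenerate (e.g., not all collinear through the fixed point), so that A^T A is invertible.

```python
# HDOP of a fixed image point with respect to a set of known points (Formula (10)).
import numpy as np

def hdop(fixed_xy, known_xy):
    known_xy = np.asarray(known_xy, dtype=float)           # (n, 2) known points
    diff = known_xy - np.asarray(fixed_xy, dtype=float)    # (X_i - X^, Y_i - Y^) per row
    dist = np.linalg.norm(diff, axis=1)                    # distances P^_i
    A = diff / dist[:, None]                               # rows are unit direction vectors
    return float(np.sqrt(np.trace(np.linalg.inv(A.T @ A))))
```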

2.2. Design Process

Formulas (4)–(10) illustrate that HDOP is related to both the number and the positions of the known points. This study mainly investigates the effect of the distribution of the known points on the location error of fixed points, so the effect of the number of matching points needs to be eliminated.
With Formulas (3) and (5), we know that
A^T A = \begin{pmatrix} \sum_{i=1}^{n} \left(\dfrac{\partial P_i}{\partial X}\right)^2 & \sum_{i=1}^{n} \dfrac{\partial P_i}{\partial X}\dfrac{\partial P_i}{\partial Y} \\ \sum_{i=1}^{n} \dfrac{\partial P_i}{\partial Y}\dfrac{\partial P_i}{\partial X} & \sum_{i=1}^{n} \left(\dfrac{\partial P_i}{\partial Y}\right)^2 \end{pmatrix}_{2 \times 2}    (11)
Here, \mathrm{tr}(A^T A) = n, because each row of A is a unit vector. Suppose that \lambda_1 and \lambda_2 are the eigenvalues of A^T A; then \lambda_1 + \lambda_2 = n. Gerschgorin's disk theorem [25] in matrix theory shows that the ranges of the first and second eigenvalues of A^T A are the same. Therefore,
HDOP = \sqrt{\mathrm{tr}\left((A^T A)^{-1}\right)} = \sqrt{\mathrm{tr}\left(\mathrm{diag}\left(\frac{1}{\lambda_1}, \frac{1}{\lambda_2}\right)\right)} = \sqrt{\frac{1}{\lambda_1} + \frac{1}{\lambda_2}} \ge \sqrt{2\left(\frac{1}{\lambda_1}\cdot\frac{1}{\lambda_2}\right)^{1/2}} \ge \frac{2}{\sqrt{n}}    (12)
HDOP is then divided by its minimum value 2/√n (i.e., multiplied by √n/2) to remove the effect of the number of known points. Meanwhile, a normalization function is used to transform HDOP × √n/2 into the range 0–1. Specifically, subtracting 1 from HDOP × √n/2 shifts its range from [1, +∞) to [0, +∞). Next, the arctangent function is applied, giving a range between 0 and π/2; the result is then multiplied by 2 and divided by π. In this way, HDOP × √n/2 is converted to a value between 0 and 1, and we use HDOP* to denote the converted function:
HDOP^* = 2 \arctan\left(HDOP \times \frac{\sqrt{n}}{2} - 1\right) / \pi    (13)
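A direct transcription of Formula (13) might look like the following sketch; n is the number of known points used to compute HDOP.

```python
# Normalize HDOP into [0, 1): remove the point-count effect and apply the
# arctangent mapping of Formula (13).
import numpy as np

def hdop_star(hdop_value, n_points):
    return 2.0 * np.arctan(hdop_value * np.sqrt(n_points) / 2.0 - 1.0) / np.pi
```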
Previous work on feature matching in stereo images encouraged uniform spatial distribution [17,18,19,22]: matching points that are evenly distributed in the overlapping region yield a more precise homography matrix. Therefore, uniformity is an important parameter for measuring the distribution of matching points.
In the field of satellite navigation and geomatics engineering, when the user is at the center of a uniform polyhedron formed by multiple visible satellites, the DOP is the smallest and the positioning accuracy is the highest [26,27]. Similarly, when the fixed points on the images are at the center of evenly distributed known points, the HDOP is the smallest and their location error is the smallest. In this study, the center points of the overlapping region are considered as the fixed points, and their HDOP* values are chosen to measure the uniformity of the matching point distribution.
In addition, the known matching points consist of corresponding feature points on the left and right images, but their pixel coordinates on the respective images differ. Hence, the HDOP* values calculated using Formulas (1)–(13) on the left and right images are different. This study selects $\overline{HDOP^*}$, the average of the HDOP* values on both images, as the final result. Figure 2 is the flow chart of the proposed method.
Here, $\overline{HDOP^*}$ has a range of [0, 1]. When the $\overline{HDOP^*}$ calculated from known matching points with a certain distribution is close to 0, the distribution of these matching points tends to be more even in the overlapping region of the stereo images, and they are better suited for image registration.
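Putting the pieces together, a sketch of the flow in Figure 2 could look as follows, reusing the hdop and hdop_star helpers sketched above; it assumes the center of the overlapping region has already been located on each image (e.g., via the homography), which is not shown here.

```python
# overline{HDOP*}: average the normalized HDOP of the overlap-center point
# over the left and right images.
def hdop_star_bar(center_left, match_pts_left, center_right, match_pts_right):
    n = len(match_pts_left)                 # number of matching point pairs
    left = hdop_star(hdop(center_left, match_pts_left), n)
    right = hdop_star(hdop(center_right, match_pts_right), n)
    return 0.5 * (left + right)
```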

3. Experiment

This work verifies the rationality of the proposed method in the field of image registration. Stereo pairs, each comprising a left and a right image, were selected from simulated and real scenes. The average of the symmetric transfer errors of matching points in the images was used as the parameter to evaluate the quality of image registration.

3.1. Simulation Scene

This experiment simulated a scene of a plane calibration board. Photos were taken with a Huawei Honor 30S mobile phone; the photo size was 3456 × 4608 pixels with a focal length of 6 mm. The stereo pair consisted of the two photos in Figure 3a,b, with an overlap of about 50%. A total of 112 pairs of matching points were extracted by hand, and their coordinates are shown in Figure 3a,b. The overlapping region and its corresponding center points can then be calculated from these matching points. Figure 3c is the image registered with the 112 pairs of matching points. The black shadow is the overlapping region of the left and right images, and the white gap in the black shadow is the part where the two images do not overlap completely.

3.1.1. Data Source

Two experiments were designed in the simulation scenario. Experiment I included two tests: Test1-1 concerned different numbers of matching points in the same distributed region, and Test1-2 involved matching points with the same distribution uniformity but different distribution locations. Experiment II included three tests with the same number but different distributions of matching points, designed to investigate the relationship between the quality of image registration and $\overline{HDOP^*}$: Test2-1 involved matching points spreading from the central area to the entire overlapping region, Test2-2 concerned matching points spreading from the image corner to the entire overlapping region, and Test2-3 concerned linearly arranged matching points spreading to the entire overlapping region. These matching points were extracted by uniform sampling and by controlling the pixel coordinates on the images. Their distributions are shown in Figure 4.

3.1.2. Result Evaluation

In image registration, errors occur in both images, so it is appropriate to estimate the errors in both images. Therefore, the symmetric transfer error between the two images, which is the sum of the transfer errors in the first and second images, is chosen to measure the deviation of matching points.
The transfer error is the Euclidean image distance in the second image between the measured point x' and the point Hx to which the corresponding point x in the first image is mapped [11]. Therefore, the symmetric transfer error can be described as
\sum_i \left[ d\left(x_i, H^{-1} x_i'\right)^2 + d\left(x_i', H x_i\right)^2 \right]    (14)
Here, x_i and x_i' are a pair of matching points in the two images, where i indexes the known matching points, and H and H^{-1} are the matrices of the forward and backward transformation, respectively. In addition, we use the average of the symmetric transfer errors of all control points in the overlapping region as the measure of image registration quality. In Figure 3, the 112 pairs of matching points are taken as these control points. We use the matching points in Figure 4 to obtain the matrix H and then use Formula (14) to compute the average of the symmetric transfer errors of the control points.
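For reference, a sketch of Formula (14) averaged over the control points might be implemented as follows; it treats H as mapping points of the first image into the second, and the squared-distance convention follows [11].

```python
# Average symmetric transfer error of control points under a homography H.
import numpy as np

def avg_symmetric_transfer_error(H, pts1, pts2):
    """pts1, pts2: (n, 2) control point coordinates on the first/second image."""
    def apply_h(M, pts):
        pts = np.asarray(pts, dtype=float)
        homog = np.hstack([pts, np.ones((len(pts), 1))])    # to homogeneous coordinates
        mapped = (M @ homog.T).T
        return mapped[:, :2] / mapped[:, 2:3]                # back to pixel coordinates
    fwd = np.sum((apply_h(H, pts1) - np.asarray(pts2, float)) ** 2, axis=1)                   # d(x', Hx)^2
    bwd = np.sum((apply_h(np.linalg.inv(H), pts2) - np.asarray(pts1, float)) ** 2, axis=1)    # d(x, H^-1 x')^2
    return float(np.mean(fwd + bwd))
```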
The proposed method can be used to calculate the $\overline{HDOP^*}$ value of matching points with different numbers and different distributed regions. Specifically, the overlapping region (red rectangles in Figure 4) is estimated first, and its center points are then computed. The HDOP* values on both images are calculated using Formulas (1)–(13), and $\overline{HDOP^*}$ is then determined. The calculated results are shown in Table 1 and Table 2.
(1) Correctness of the proposed method
The purpose of Table 1 is to illustrate the correctness of the proposed method. The method uses HDOP* to describe the relationship between point error and point distribution, eliminates the influence of the number of matching points, and chooses the center points' HDOP* of the overlapping region to measure the distribution of matching points. Experiment I in Table 1 therefore has two purposes: one is to show that the number of matching points has little effect on $\overline{HDOP^*}$, and the other is to show that the $\overline{HDOP^*}$ of the center points of the overlapping region can reflect the influence of the same distribution uniformity at different distribution locations on image registration.
In Test1-1 of Table 1, as the number of matching points decreases, the error increases. We can say that the number of matching points influences the average of the symmetric transfer errors, but the effect is small. Similarly, the trend of $\overline{HDOP^*}$ is consistent with the average of the symmetric transfer errors and changes little overall. Therefore, the proposed method eliminates the influence of the number of matching points.
In Test1-2, the deviation degree is the ratio of the distance between the center of the region where the matching points are located and the center of the overlapping region to half the diagonal of the overlapping region [28]. As the deviation degree increases, both the average of the symmetric transfer errors and $\overline{HDOP^*}$ increase; $\overline{HDOP^*}$ and the deviation degree are consistent with the average of the symmetric transfer errors. Hence, $\overline{HDOP^*}$ can reflect the effect of the deviation degree of the matching point distribution, and it is appropriate to use the $\overline{HDOP^*}$ of the center points of the overlapping region to measure the impact of the matching point distribution on image registration.
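A possible implementation of the deviation degree, under the assumption that the "region where the matching points are located" is their bounding rectangle and that the overlapping region is given by its two opposite corners, is sketched below; these geometric choices are our reading of [28], not stated explicitly in the paper.

```python
# Deviation degree: offset of the matched region's center from the overlap
# center, normalized by half the diagonal of the overlapping region.
import numpy as np

def deviation_degree(match_pts, overlap_min_xy, overlap_max_xy):
    match_pts = np.asarray(match_pts, dtype=float)
    region_center = 0.5 * (match_pts.min(axis=0) + match_pts.max(axis=0))
    o_min = np.asarray(overlap_min_xy, dtype=float)
    o_max = np.asarray(overlap_max_xy, dtype=float)
    overlap_center = 0.5 * (o_min + o_max)
    half_diagonal = 0.5 * np.linalg.norm(o_max - o_min)
    return float(np.linalg.norm(region_center - overlap_center) / half_diagonal)
```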
(2) Rationality of the proposed method
On the basis of Experiment I, we know that the number of matching points, taken by itself, has little effect. Thus, this experiment randomly selected about 15 points and designed three tests to verify the rationality of the proposed method.
Distribution uniformity (DU) [28] is the ratio of the total length of the minimum spanning tree of all points to the square root of the number of points, and it can be used to measure the uniformity of points on the image plane. As DU increases, the distribution of points becomes more uniform and the average of the symmetric transfer errors should be smaller, but there is an exception in Test2-1 of Table 2.
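Assuming DU is the total minimum-spanning-tree length over the points divided by the square root of the point count, as described above, it could be computed with SciPy as in the following sketch (points are assumed distinct so that all pairwise distances are positive).

```python
# Distribution uniformity (DU): MST length of the point set / sqrt(number of points).
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

def distribution_uniformity(pts):
    pts = np.asarray(pts, dtype=float)        # (n, 2) point coordinates
    dists = squareform(pdist(pts))            # dense pairwise distance matrix
    mst = minimum_spanning_tree(dists)        # sparse matrix of MST edge lengths
    return float(mst.sum() / np.sqrt(len(pts)))
```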
However, in Table 2, the trend of $\overline{HDOP^*}$ proposed in this paper is consistent with both the uniformity of the matching point distribution and the average of the symmetric transfer errors. When the $\overline{HDOP^*}$ values in Figure 4(a4,a5) are close to 1, the average of the symmetric transfer errors is relatively large; when $\overline{HDOP^*}$ is close to 0, the average of the symmetric transfer errors is small.
Compared with DU, $\overline{HDOP^*}$ has a fixed range of [0, 1] for measuring the uniformity of the matching point distribution, and it is more suitable for measuring the influence of the matching point distribution on the average of the symmetric transfer errors.

3.2. Real Scene

In this experiment, the stereo pair (Tsinghua School) published by the Institute of Automation of the Chinese Academy of Sciences was selected for testing. Matching points were extracted using SURF and a nearest neighbor search algorithm. A total of 642 pairs of matching points are shown in Figure 5a,b with '+' symbols. In addition, 30 pairs of control points were extracted by hand, and their coordinates are shown in Figure 5a,b with 'o' symbols. Figure 5c is the registered image obtained by overlapping the above matching points.
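The matching step described above could be reproduced roughly as in the sketch below; it assumes opencv-contrib-python built with the non-free SURF module, and it adds a Lowe-style ratio test to prune ambiguous nearest-neighbour matches (the ratio test is our addition, not stated in the paper).

```python
# SURF detection + nearest-neighbour descriptor matching with OpenCV.
import cv2
import numpy as np

def surf_matches(img_left, img_right):
    gray_l = cv2.cvtColor(img_left, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(img_right, cv2.COLOR_BGR2GRAY)
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp1, des1 = surf.detectAndCompute(gray_l, None)
    kp2, des2 = surf.detectAndCompute(gray_r, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)                  # nearest-neighbour search
    knn = matcher.knnMatch(des1, des2, k=2)
    good = [m for m, n in knn if m.distance < 0.7 * n.distance]
    pts_left = np.float32([kp1[m.queryIdx].pt for m in good])
    pts_right = np.float32([kp2[m.trainIdx].pt for m in good])
    return pts_left, pts_right
```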

3.2.1. Data Source

On the basis of the simulation scene experiment, we know that the number of matching points has only a small effect. Thus, this experiment randomly selected about 150 points and designed the following experimental data to verify the rationality of the proposed method.

3.2.2. Result Evaluation

In this experiment, we chose the average of the symmetric transfer errors of the control points in Figure 5 to measure the quality of image registration. We used the matching points in Figure 6 and the flow chart in Figure 2 to calculate the $\overline{HDOP^*}$ value, and then used Formula (14) to compute the average of the symmetric transfer errors. The calculated results are shown in Table 3.
Table 3 shows that the matching points in Figure 6(a1,b2,c2) provide the best conditions for image registration. We can therefore say that matching points distributed around the center of the overlapping region, and evenly distributed matching points, perform image registration better.
From Table 2 and Table 3, we find that the average of the symmetric transfer errors is not the smallest when the matching points are evenly distributed throughout the entire overlapping region. Matching points distributed on the edge of the overlapping region may reduce the accuracy of image registration. Therefore, it is recommended that the matching points be uniformly distributed inside the overlapping region, excluding its marginal area.

4. Conclusions

This study mainly accomplished two things: one was to construct the cost function to measure the uncertainty of matching point distribution, and the other was to test the validity of the proposed method in the field of image registration. Specifically, the proposed method was designed as follows:
(1) The study derived HDOP to represent the influence of the known points on the position error of a fixed point on the image.
(2) HDOP* = 2 × arctan(HDOP × √n/2 − 1)/π is used to measure the uncertainty of the known point distribution and has a range of [0, 1]. Here, multiplying HDOP by √n/2 removes the effect of the number of known points.
(3) The average $\overline{HDOP^*}$ of the center points of the overlapping region was chosen to measure the uncertainty of the matching point distribution.
We used two groups of experimental data to test the validity of the proposed function in the field of image registration. The proposed function is consistent with the average of the symmetric transfer errors in image registration and can be used to measure the uncertainty of the matching point distribution. When $\overline{HDOP^*}$ is close to 0, the distributed matching points are well suited for image registration; when it is close to 1, the average of the symmetric transfer errors may be large, and the matching points may need to be re-extracted for image registration.

5. Patents

There is a Chinese patent resulting from the work reported in this manuscript. The patent title is “A quantitative method for calculating the reliability of the distribution of matching points”, and its number is ZL201910311174.2.

Author Contributions

Conceptualization, Yuxia Bian; methodology, Yuxia Bian and Shuhong Fang; software, Yongbin Chu; validation, Meizhen Wang and Zhihong Liu; formal analysis, Meizhen Wang and Yongbin Chu; investigation, Yuxia Bian; resources, Zhiye Xia; writing—original draft preparation, Yuxia Bian; writing—review and editing, Yuxia Bian, Meizhen Wang and Jun Chen; supervision, Zhihong Liu; project administration, Yongbin Chu; funding acquisition, Yuxia Bian, Yongbin Chu, Zhihong Liu, Jun Chen and Zhiye Xia. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Technology Innovation Research and Development Project of Chengdu Science and Technology Bureau, grant number 2019-YF05-02641-SN; the Key Research and Development Projects of Sichuan Science and Technology, grant numbers 2020YFG0146, 2020YFG0144 and 2019YFS0472; the National Natural Science Foundation of China, grant numbers 41771535 and 41601422; and the School Undergraduate Teaching Project, grant number BKJX2020030.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

In this study, the experimental scene was a plane calibration board, and its photos were taken with a mobile phone, so the experimental data are relatively easy to obtain. Therefore, no data were created in this study.

Acknowledgments

Thanks to the technical help given by Yansong Duan of Wuhan University, China.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Brand, P.; Mohr, R. Accuracy in image measure. In Proceedings of the SPIE Conference on Videometrics, Boston, MA, USA, 6 October 1994. [Google Scholar] [CrossRef]
  2. Sankowski, W.; Włodarczyk, M.; Kacperski, D.; Grabowski, K. Estimation of measurement uncertainty in stereo vision system. Image Vis. Comput. 2017, 61, 70–81. [Google Scholar] [CrossRef]
  3. Weng, J.; Huang, T.; Ahuja, N. Motion and structure from two perspective views: Algorithms, error analysis, and error estimation. IEEE Trans. Pattern Anal. Mach. Intell. 1989, 11, 451–476. [Google Scholar] [CrossRef] [Green Version]
  4. Kanazawa, Y.; Kanatani, K. Do we really have to consider covariance matrices for image feature points? Electron. Commun. Jpn. 2002, 86, 1–10. [Google Scholar] [CrossRef]
  5. Brooks, M.J.; Chojnacki, W.; Gawley, D.; Hengel, A.V.D. What value covariance information in estimating vision parameters? In Proceedings of the IEEE International Conference on Computer Vision 2001, Vancouver, BC, Canada, 7–14 July 2001. [Google Scholar] [CrossRef] [Green Version]
  6. Haralick, R.M. Propagating Covariance in Computer Vision. Int. J. Pattern Recognit. Artif. Intell. 1996, 10. [Google Scholar] [CrossRef] [Green Version]
  7. Steele, R.M.; Jaynes, C. Feature Uncertainty Arising from Covariant Image Noise. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Diego, CA, USA, 20–26 June 2005. [Google Scholar] [CrossRef]
  8. Di Leo, G.; Liguori, C.; Paolillo, A. Covariance Propagation for the Uncertainty Estimation in Stereo Vision. IEEE Trans. Instrum. Meas. 2011, 60, 1664–1673. [Google Scholar] [CrossRef]
  9. Fitzpatrick, J.; West, J. The distribution of target registration error in rigid-body point-based registration. IEEE Trans. Med Imaging 2001, 20, 917–927. [Google Scholar] [CrossRef] [PubMed]
  10. Fathy, M.E.; Hussein, A.S.; Tolba, M.F. Fundamental matrix estimation: A study of error criteria. Pattern Recognit. Lett. 2011, 32, 383–391. [Google Scholar] [CrossRef] [Green Version]
  11. Hartley, R.; Zisserman, A.; Faugeras, O. Multiple View Geometry in Computer Vision, 2nd ed.; Cambridge University Press: Cambridge, UK, 2003; pp. 93–139. [Google Scholar]
  12. Gui, Y.; Su, A.; Du, J. Point-pattern matching method using SURF and Shape Context. Int. J. Light Electron. Optics 2013, 124, 1869–1873. [Google Scholar] [CrossRef]
  13. Tong, G.; Wang, C.; Wang, P. Study on Improving Image Feature Points Detection and Matching Accuracy in Binocular Vision System. In Proceedings of the International Industrial Informatics and Computer Engineering Conference, Xi’an, China, 10–11 January 2015. [Google Scholar] [CrossRef] [Green Version]
  14. Zhao, Y.; Su, J. Local sharpness distribution–based feature points matching algorithm. J. Electron. Imaging 2014, 23, 13011. [Google Scholar] [CrossRef]
  15. Hu, M.; Liu, Y.; Fan, Y. Robust Image Feature Point Matching Based on Structural Distance. In Proceedings of the Chinese Conference on Image and Graphics Technologies, Berlin, Germany, 10–12 March 2015. [Google Scholar] [CrossRef]
  16. Mai, H.Y.; Lee, D.-H. Impact of Matching Point Selections on Image Registration Accuracy between Optical Scan and Computed Tomography. BioMed Res. Int. 2020, 2020, 1–7. [Google Scholar] [CrossRef] [PubMed]
  17. Tan, X.; Sun, C.; Sirault, X.; Furbank, R.; Pham, T.D. Feature Correspondence with Even Distribution. In Proceedings of the International Conference on Digital Image Computing Techniques and Applications, Perth, Australia, 14 July 2012. [Google Scholar] [CrossRef]
  18. Tan, X.; Sun, C.; Sirault, X.; Furbank, R.; Pham, T.D. Feature matching in stereo images encouraging uniform spatial distribution. Pattern Recognit. 2015, 48, 2530–2542. [Google Scholar] [CrossRef]
  19. Guo, X.; Cao, X. Good match exploration using triangle constraint. Pattern Recognit. Lett. 2012, 33, 872–881. [Google Scholar] [CrossRef]
  20. Zhu, Q.; Wu, B.; Xu, Z.-X.; Qing, Z. Seed Point Selection Method for Triangle Constrained Image Matching Propagation. IEEE Geosci. Remote Sens. Lett. 2006, 3, 207–211. [Google Scholar] [CrossRef]
  21. Mahmood, M.T.; Lee, I.H. A Reliable Distribution Quality Measure in Image Registration. IEEE Access 2019, 7, 119367–119374. [Google Scholar] [CrossRef]
  22. Seo, J.-K.; Hong, H.-K.; Jho, C.-W.; Choi, M.-H. Two quantitative measures of inlier distributions for precise fundamental matrix estimation. Pattern Recognit. Lett. 2004, 25, 733–741. [Google Scholar] [CrossRef]
  23. Bian, Y.; Liu, X.; Wang, M.; Liu, H.; Fang, S.; Yu, L. Quantification Method for the Uncertainty of Matching Point Distribution on 3D Reconstruction. Int. J. Geo-Inf. 2020, 9, 187. [Google Scholar] [CrossRef] [Green Version]
  24. Zhang, X.-D. Matrix Analysis and Applications, 2nd ed.; Tsinghua University Press: Beijing, China, 2013; pp. 49–51. [Google Scholar]
  25. Bu, C.J.; Luo, Y.S. Matrix Theory; Harbin Engineer University Press: Harbin, China, 2003; pp. 164–174. [Google Scholar]
  26. Sheng, H.; Yang, J.S.; Zeng, F.L. The Minimum Value of GDOP in Pseudo-range Positioning. Fire Control Command Control 2009, 34, 22–24. [Google Scholar] [CrossRef]
  27. Li, J.; Li, Z.; Zhou, W.; Si, S. Study on the Minimum of GDOP in Satellite Navigation and its Applications. Acta Geod. Cartogr. Sin. 2011, 40, 85–88. [Google Scholar]
  28. Qu, Z.G. Image Structure Description and Matching based on Graph Theory. Ph.D. Thesis, National University of Defense Technology, Changsha, China, 2013. [Google Scholar]
Figure 1. Image registration. The dark spots at the same position on image 1 and image 2 are a pair of known matching points. We can use them to calculate the homography matrix H and then use H to gain image 3.
Figure 2. The flow chart of the proposed method.
Figure 3. Primary data of the simulation scene. (a) Left image and the feature points. (b) Right image and the feature points. The symbols + with the same number in (a,b) are a pair of known matching points. The yellow dots are the center points of the overlapping region of the left and right images. (c) Registered image.
Figure 4. Data source of the simulation scene. The red rectangle indicates the overlapping region, and the yellow rectangle indicates the region where the known matching points are located. There are 112 pairs in (a1), 56 pairs in (b1), 28 pairs in (c1), and 14 pairs in (d1). There are about 16 pairs of matching points in the center of the overlapping region in (a2), in the center-right of the overlapping region in (b2), and gathered toward the upper-right corner of the overlapping region in (c2) and (d2). There are about 14 pairs of known matching points extending from the center to the entire overlapping region in (a3–d3), extending from the upper-right corner to the entire overlapping region in (a4–d4), and extending linearly to the entire overlapping region in (a5–d5).
Figure 5. Primary data of the real scene (Robot Vision Group (ia.ac.cn)). (a) Left image. (b) Right image. (c) Registered image. The ‘+’ and ‘o’ points with the same number on the left and right images represent a pair of matching points and a pair of control points, respectively.
Figure 6. Data source of the real scene. There are about 150 pairs of known matching points extending from the center to the upper-right corner of the overlapping region in (a1–c1), and extending from the upper-right corner to the entire overlapping region in (a2–c2).
Table 1. Results calculated by the matching points in Experiment I.

Test1-1
  Matching points                         Figure 4(a1)   Figure 4(b1)   Figure 4(c1)   Figure 4(d1)
  Number                                  112            56             28             14
  Average of symmetric transfer errors    1.7936         1.8933         2.0839         3.1605
  $\overline{HDOP^*}$                     0.3061         0.3062         0.3173         0.3303

Test1-2
  Matching points                         Figure 4(a2)   Figure 4(b2)   Figure 4(c2)   Figure 4(d2)
  Deviation degree                        0              0.2536         0.4475         0.7452
  Average of symmetric transfer errors    1.1905         2.4518         3.4666         8.8725
  $\overline{HDOP^*}$                     0.2523         0.2960         0.5897         0.7857
Table 2. Results calculated by the matching points in Experiment II.

Test2-1
  Matching points                         Figure 4(a3)   Figure 4(b3)   Figure 4(c3)   Figure 4(d3)
  $\overline{HDOP^*}$                     0.2521         0.2526         0.2936         0.3254
  Distribution uniformity                 598            801            1088           1404
  Average of symmetric transfer errors    1.4545         1.4886         1.7494         2.7627

Test2-2
  Matching points                         Figure 4(a4)   Figure 4(b4)   Figure 4(c4)   Figure 4(d4)
  $\overline{HDOP^*}$                     0.8415         0.4217         0.2940         0.3045
  Distribution uniformity                 543            781            1110           1309
  Average of symmetric transfer errors    37.1641        9.4210         1.4338         2.8030

Test2-3
  Matching points                         Figure 4(a5)   Figure 4(b5)   Figure 4(c5)   Figure 4(d5)
  $\overline{HDOP^*}$                     0.8429         0.5222         0.3336         0.2878
  Distribution uniformity                 731            1135           1311           1343
  Average of symmetric transfer errors    259.2704       12.5142        2.4113         1.9046
Table 3. Results calculated by the matching points in Figure 6.

Test-1
  Matching points                         Figure 6(a1)   Figure 6(b1)   Figure 6(c1)
  $\overline{HDOP^*}$                     0.2531         0.5749         0.6258
  Average of symmetric transfer errors    6.5353         8.4638         10.4455

Test-2
  Matching points                         Figure 6(a2)   Figure 6(b2)   Figure 6(c2)
  $\overline{HDOP^*}$                     0.6258         0.2758         0.2789
  Average of symmetric transfer errors    10.4455        7.1007         7.6965