
Geometrical Consistency Voting Strategy for
Outlier Detection in Image Matching
Luping Lu, Yong Zhang, and Pengjie Tao
False matches in tie-point image matching are common. This
paper introduces a straightforward and effective preprocessing
method to reject false matches from initial matches. The
method is based on the idea of Hough transform using only
two geometrical consistency parameters, namely, the scale
parameter and the rotation parameter between two images. A
weighted voting strategy is employed, which further improves
the robustness of the algorithm. The method can handle a
high proportion of outliers and produces robust matches with
low complexity. No assumptions with regard to the relative
pose between two images are necessary, and large perspective
deformation can be handled as well. Experiments with ground
reference data show that the algorithm works effectively even
when the ratio of inliers is below 10 percent. In these data, the
ratio of inliers can be improved from 5 percent to 40 percent
on average.
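The two-parameter voting idea can be sketched as follows. This is a minimal illustration, not the authors' implementation: the bin widths, the uniform vote weight, and the keypoint format (each feature carrying a scale and an orientation, as SIFT-style detectors provide) are assumptions.

```python
import math
from collections import defaultdict

def vote_filter(matches, rot_bin_deg=15.0, log_scale_bin=0.25):
    """Keep matches whose scale ratio and rotation difference fall in the
    dominant Hough bin. Each match is ((scale1, angle1_deg), (scale2, angle2_deg))
    taken from the two matched keypoints."""
    n_rot_bins = int(round(360.0 / rot_bin_deg))
    votes = defaultdict(float)
    keys = []
    for (s1, a1), (s2, a2) in matches:
        ls = math.log(s2 / s1)            # log of scale ratio between images
        da = (a2 - a1) % 360.0            # rotation difference, wrapped to [0, 360)
        key = (round(ls / log_scale_bin),
               round(da / rot_bin_deg) % n_rot_bins)  # wrap 360 deg back to bin 0
        keys.append(key)
        votes[key] += 1.0  # a weighted scheme could add descriptor similarity here
    best = max(votes, key=votes.get)      # dominant (scale, rotation) bin
    return [m for m, k in zip(matches, keys) if k == best]
```

Because correct matches share one global scale ratio and rotation, they concentrate in a single bin, while outliers scatter; keeping the dominant bin therefore raises the inlier ratio even when outliers dominate the input.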
For tie-point image matching tasks (Han et al., 2012; Wu,
2011), feature extraction (Schmid et al., 2000; Tuytelaars
and Mikolajczyk, 2008) and feature description (Gauglitz,
2011; Gil et al., 2010; Mikolajczyk and Schmid, 2005)
are only part of the work involved in achieving success-
ful matching. The remaining part is an effective strategy to
identify correct corresponding points from initial candidate
matches. The initial candidate matches are usually composed
of features from one image and their nearest neighbors from
another image. Using epipolar geometry (Hartley and Zisser-
man, 2003) directly to select correct matches from the initial
candidate matches is known to frequently end in failure, even
with the aid of random sample consensus (RANSAC) (Fischler
and Bolles, 1981) or least median of squares (LMedS) (Meer
et al., 1991; Zhang, 1997) when the ratio of inliers is low. The
major causes of failures are as follows: (a) When using epipo-
lar geometry constraint as the transformation model between
images, at least five to eight correct matches are required
to generate a correct hypothesis. This condition is difficult to
satisfy when the ratio of inliers is low; and (b) When using a simpler
transformation model, such as similarity transformation, the
threshold for deciding whether a match is an inlier or not is
difficult to determine.
LMedS does not require any tuning variable because it
attempts to minimize the median squared error. However, its
performance degrades when the inlier ratio is below 0.5,
because the median residual is then determined by the outliers.
RANSAC-like algorithms usually work effectively when the
ratio of inliers is higher than 30 percent (Choi et al., 1997;
Chum and Matas, 2008; Meer, Mintz, Rosenfeld and Kim,
1991). However, in difficult image matching cases, the propor-
tion of correct matches among initial candidate matches can
be less than 10 percent. We observed the poor performance of
RANSAC-like algorithms when outliers are dominant. Thus, a
preprocessing step is necessary to filter the initial candidate
matches, remove most of the incorrect matches first, and
then identify the final correct matches by using the epipolar
geometry constraint. In this paper, a preprocessing method
is proposed to filter the initial matches. We call this part of
the algorithm the candidate match filter.
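The effect of a low inlier ratio on hypothesize-and-verify schemes can be quantified with the standard RANSAC iteration formula N = log(1 − p) / log(1 − w^s), where w is the inlier ratio, s the minimal sample size, and p the desired confidence of drawing at least one all-inlier sample. This is a textbook calculation, not taken from this paper:

```python
import math

def ransac_iterations(inlier_ratio, sample_size, confidence=0.99):
    """Number of random samples needed so that, with probability
    `confidence`, at least one sample contains only inliers."""
    return math.ceil(math.log(1.0 - confidence) /
                     math.log(1.0 - inlier_ratio ** sample_size))

# With 30 percent inliers and a 7-point sample the count is still
# manageable, but at 10 or 5 percent inliers it explodes:
for w in (0.3, 0.1, 0.05):
    print(w, ransac_iterations(w, 7))
```

The steep growth at small w is why a preprocessing filter that lifts the inlier ratio before the epipolar-geometry stage pays off.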
The ratio of correct matches among all initial candidate
matches is mainly affected by the following three factors: (1)
the quality of feature extraction and description algorithms;
(2) the size of overlapping area between images; and (3) the
search space for finding matches. A coarse-to-fine strategy (Sun
et al., 2014; Zhang et al., 2014) is commonly used to ensure
tie-point matching quality, considering its capacity to limit
the search space. With the effects of 3D viewpoint change and
scene occlusion, candidate matches are commonly obtained
according to the local similarity of features. A major limita-
tion of pure local feature matching lies in its neglect of the
spatial relations of local features, which have been proven
useful in improving search accuracy (Hu et al., 2015).
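As an illustration of how a coarse-to-fine scheme limits the search space, the sketch below keeps only the full-resolution features near the location predicted by a match found at a downsampled level. The function name, the pyramid scale factor, and the fixed window radius are assumptions for the example.

```python
def candidates_in_window(coarse_xy, fine_points, scale=4, radius=12):
    """Coarse-to-fine search-space restriction: a match found at a
    downsampled level predicts a location at full resolution; only
    fine-level features inside a small window around that prediction
    are kept as candidate matches."""
    px, py = coarse_xy[0] * scale, coarse_xy[1] * scale
    return [(x, y) for (x, y) in fine_points
            if abs(x - px) <= radius and abs(y - py) <= radius]
```

Restricting candidates to such a window both reduces matching cost and raises the proportion of correct matches among the candidates.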
A good candidate match filter should have the following
three properties: (1) excellent anti-noise property, because
initial candidate matches may be dominated by incorrect
matches; (2) excellent ability to preserve correct matches. Cor-
rect matches should not be negatively affected when remov-
ing incorrect matches; and (3) prior information with regard
to the transformation parameters of images should not be
required, considering its unavailability during the first stage
of tie-point image matching.
Two commonly used candidate match filters are as follows:
(a) enforcing the symmetric constraint, in which a match is
accepted as a potential match only if the two features are
each other's nearest neighbors; and (b) setting a threshold on
the distance ratio between the nearest and the second-nearest
neighbor (Lowe, 2004).
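Both filters can be expressed compactly. The sketch below uses brute-force descriptor distances and hypothetical input arrays; it applies the mutual-nearest-neighbor (symmetric) check and Lowe's ratio test:

```python
import numpy as np

def filter_matches(desc1, desc2, ratio=0.8):
    """Return index pairs (i, j) that pass both the symmetric
    (mutual nearest neighbor) constraint and the distance-ratio test."""
    # pairwise Euclidean distances between all descriptors
    d = np.linalg.norm(desc1[:, None, :] - desc2[None, :, :], axis=2)
    nn12 = d.argmin(axis=1)   # nearest neighbor in image 2 for each feature in image 1
    nn21 = d.argmin(axis=0)   # nearest neighbor in image 1 for each feature in image 2
    matches = []
    for i, j in enumerate(nn12):
        if nn21[j] != i:                    # symmetric constraint
            continue
        row = np.sort(d[i])
        if len(row) > 1 and row[0] > ratio * row[1]:  # Lowe's ratio test
            continue
        matches.append((i, int(j)))
    return matches
```

The ratio test discards ambiguous features whose nearest and second-nearest neighbors are nearly equidistant, which is exactly the case the symmetric check alone cannot detect.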
Many incorrect matches can be removed by using these
methods. However, neither of them considers multiple near-
est neighbors because they use the first nearest neighbor only.
To determine the significance of considering multiple nearest
neighbors for each feature point, we conducted image match-
ing experiments using aerial images with ground references
(one such dataset is presented in Plate 1a); that is, the orienta-
tion parameters of the images and the elevation of the scene
are known, thereby allowing us to determine the correctness of
a match by comparing the projection coordinate and the
image coordinate of a match point. In our tests, a match is
considered correct if the Euclidean distance between
Luping Lu is with the Collaborative Innovation Center of
Geospatial Technology, Wuhan University and the Electronic
Information School, Wuhan University, No. 129 Luoyu Road,
Wuhan, 430079, P.R. China.
Yong Zhang and Pengjie Tao are with the School of Remote
Sensing and Information Engineering, Wuhan University, No.
129 Luoyu Road, Wuhan, 430079, P. R. China.
Photogrammetric Engineering & Remote Sensing
Vol. 82, No. 7, July 2016, pp. 559–570.
© 2016 American Society for Photogrammetry
and Remote Sensing
doi: 10.14358/PERS.82.7.559