PHOTOGRAMMETRIC ENGINEERING & REMOTE SENSING
November 2014
A Robust Image Matching Method
Based on Optimized BaySAC
Zhizhong Kang, Fengman Jia, and Liqiang Zhang
Abstract
This paper proposes a robust image-matching method that integrates SIFT with an optimized Bayes SAmpling Consensus (BaySAC). Because point correspondences are likely contaminated by outliers, we present a novel robust estimation method involving an efficient BaySAC for eliminating falsely accepted correspondences. The key points of the proposed hypothesis-testing algorithm are determining and updating the prior probabilities of pseudo-correspondences. First, we propose a strategy for determining prior probabilities from the statistical characteristics of a deterministic mathematical model for hypothesis testing. Second, the inlier-probability update is simplified based on a memorable form of Bayes' Theorem. The proposed approach is validated on a variety of image pairs. The results indicate that, compared with RANdom SAmple Consensus (RANSAC) and the original BaySAC, the proposed optimized BaySAC consumes less computation time and obtains higher matching accuracy when the hypothesis set is contaminated with more outliers.
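The exact simplified update is given in the body of the paper; as a hedged sketch of the general idea, the Bayes update used in the original BaySAC (Botterill et al.) revises the inlier priors of a hypothesis set after that hypothesis is rejected:

```python
import numpy as np

def update_priors(p, H):
    # Bayes update after a hypothesis drawn from index set H is rejected:
    #   P(i inlier | H contains an outlier)
    #     = p_i * (1 - prod_{j in H, j != i} p_j) / (1 - prod_{j in H} p_j)
    # Points outside H keep their prior. (Illustrative formula from the
    # original BaySAC, not necessarily the simplified form proposed here.)
    p_new = p.copy()
    q = np.prod(p[H])                  # P(every point in H is an inlier)
    for i in H:
        q_rest = q / p[i]              # product over H excluding i
        p_new[i] = p[i] * (1.0 - q_rest) / (1.0 - q)
    return p_new

priors = np.array([0.5, 0.5, 0.5, 0.9])
updated = update_priors(priors, [0, 1, 2])   # hypothesis from points 0-2 failed
```

After a failure, every point that participated in the rejected hypothesis becomes less likely to be an inlier, so subsequent hypothesis sets are drawn from increasingly trustworthy points.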
Introduction
Image matching refers to the process of identifying point correspondences between two or more images of the same scene or project. This process is an important topic in digital photogrammetry and computer vision, with applications including image registration (e.g., Han et al., 2012; Wu et al., 2012; Wang et al., 2013), object recognition (Yang and Cohen, 1999; Wakahara et al., 2001; Lin et al., 2012; Wang et al., 2014), object tracking (e.g., Donoser and Bischof, 2006; Goedeme et al., 2004), three-dimensional visualization (Zhang et al., 2012), and camera calibration. Because external environmental noise introduces many challenges to image matching, a robust, accurate, and fast image-matching method is needed.
The relationship between two or more images of the same scene taken from different viewpoints can be described by epipolar geometry. The fundamental matrix (FM) is the representation of the epipolar geometry in computer vision, and calculating it with high accuracy is a necessary step in many computer vision applications.
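Given enough correspondences, the fundamental matrix can be estimated linearly. As a concrete illustration (the classical normalized eight-point algorithm, not the method proposed in this paper), the sketch below estimates F from synthetic correspondences generated by a pure-translation camera pair:

```python
import numpy as np

def normalize(pts):
    # Hartley normalization: center the points and scale the mean distance
    # to sqrt(2), which conditions the linear system solved below.
    c = pts.mean(axis=0)
    d = np.linalg.norm(pts - c, axis=1).mean()
    s = np.sqrt(2.0) / d
    T = np.array([[s, 0.0, -s * c[0]],
                  [0.0, s, -s * c[1]],
                  [0.0, 0.0, 1.0]])
    ph = np.column_stack([pts, np.ones(len(pts))])
    return (T @ ph.T).T, T

def eight_point(x1, x2):
    # Estimate F such that [x2, 1]^T F [x1, 1] = 0 for all correspondences.
    p1, T1 = normalize(x1)
    p2, T2 = normalize(x2)
    A = np.column_stack([
        p2[:, 0] * p1[:, 0], p2[:, 0] * p1[:, 1], p2[:, 0],
        p2[:, 1] * p1[:, 0], p2[:, 1] * p1[:, 1], p2[:, 1],
        p1[:, 0], p1[:, 1], np.ones(len(p1))])
    F = np.linalg.svd(A)[2][-1].reshape(3, 3)   # least-squares null vector
    U, S, Vt = np.linalg.svd(F)                 # enforce the rank-2 constraint
    F = U @ np.diag([S[0], S[1], 0.0]) @ Vt
    return T2.T @ F @ T1                        # undo the normalization

# Synthetic check: random 3D points seen by two pinhole cameras related
# by a pure translation along the x axis.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, (20, 3)) + [0.0, 0.0, 5.0]
x1 = X[:, :2] / X[:, 2:]
X2 = X + [1.0, 0.0, 0.0]
x2 = X2[:, :2] / X2[:, 2:]
F = eight_point(x1, x2)
h1 = np.column_stack([x1, np.ones(20)])
h2 = np.column_stack([x2, np.ones(20)])
residual = np.abs(np.einsum('ij,jk,ik->i', h2, F, h1)).max()
```

On noise-free correspondences the epipolar residuals are near machine precision; with real, outlier-contaminated matches this linear estimate degrades, which is what motivates the robust methods discussed below.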
The fundamental matrix can be computed from the pixel coordinates of corresponding points in uncalibrated images. The first step in approximating the fundamental matrix is therefore to select corresponding points, which is also the key part of image matching. Image-matching methods can generally be divided into two categories: intensity-based and feature-based. Intensity-based methods can provide sub-pixel or better accuracy; however, they are sensitive to intensity changes and geometric deformations. Therefore, stable and automatic feature-based algorithms are widely used in image-matching applications. Many well-known feature-based methods have been applied successfully, including Harris, SIFT, Principal Components Analysis SIFT (PCA-SIFT), and Speeded Up Robust Features (SURF). The Harris algorithm (Harris and Stephens, 1988) is based on the eigenvalues of the second moment matrix; however, Harris corners are not scale-invariant. The SIFT algorithm (Lowe, 1999), first introduced by David Lowe, offers scale invariance and high speed. Ke and Sukthankar (2004) proposed an improved SIFT algorithm called PCA-SIFT, which applies Principal Components Analysis (PCA) to the normalized gradient patch instead of using SIFT's smoothed weighted histograms; their paper revealed that PCA-SIFT could produce increased accuracy and faster matching.
SURF was proposed by Bay et al. (2006). For scale-space extrema detection, SURF filters integral images with a box filter as an approximation of second-order Gaussian partial derivatives. SURF was demonstrated to approximate or even outperform previously proposed schemes with respect to repeatability, distinctiveness, and robustness, yet it can be computed much faster. Juan and Gwun (2009) presented a comparative study of SIFT, SURF, and PCA-SIFT; it revealed that, although less computationally efficient, SIFT performed better than SURF and PCA-SIFT under rotation, scale change, blur, and affine transformations. Researchers have proposed many other distribution-based descriptors for image matching that use histograms to represent different features (Mikolajczyk and Schmid, 2005; Morel and Yu, 2009; Guo et al., 2010). Exploiting the robustness of color invariance against varying imaging conditions, some algorithms with color-based descriptors have been presented (Diplaros et al., 2006; Abdel-Hakim and Farag, 2006; Stokman and Gevers, 2007; Verma et al., 2011). Filter-based descriptors were proposed by Moreno et al. (2009) and Gómez and Romero (2011). Geometric constraints, such as distance invariance (Kang et al., 2009), distribution patterns (Di et al., 2011), and triangulation (Wu et al., 2011), have also been applied in SIFT-based image matching.
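Whichever descriptor is used, putative correspondences are typically formed by nearest-neighbor search in descriptor space, filtered by Lowe's ratio test: the best match is kept only if it is clearly closer than the second best. A minimal sketch on synthetic descriptors (the 128-dimensional size mirrors SIFT; the data here are fabricated for illustration):

```python
import numpy as np

def ratio_test_match(desc1, desc2, ratio=0.8):
    # Lowe's ratio test: accept the nearest neighbor j of descriptor i only
    # if it is clearly closer than the second-nearest neighbor, which
    # suppresses ambiguous matches that are likely outliers.
    matches = []
    for i, d in enumerate(desc1):
        dists = np.linalg.norm(desc2 - d, axis=1)
        j, k = np.argsort(dists)[:2]
        if dists[j] < ratio * dists[k]:
            matches.append((i, int(j)))
    return matches

# Synthetic 128-D descriptors: desc2 repeats desc1 with mild noise,
# so every descriptor has exactly one unambiguous counterpart.
rng = np.random.default_rng(1)
desc1 = rng.normal(size=(10, 128))
desc2 = desc1 + 0.01 * rng.normal(size=(10, 128))
matches = ratio_test_match(desc1, desc2)
```

Even with the ratio test, some accepted matches are wrong, which is why the geometric verification step described next is needed.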
After correspondences are selected, many methods can be used to estimate the fundamental matrix. These methods fall roughly into three classes: linear methods, iterative methods, and robust methods. Linear methods can largely reduce computing time but are sensitive to noise and outliers.
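Robust methods fit model hypotheses to minimal random samples and keep the one with the largest consensus set. The loop is illustrated below with a simple 2D line model rather than the fundamental matrix (all names and parameters are ours, chosen for the sketch):

```python
import numpy as np

def ransac_line(pts, iters=200, thresh=0.05, rng=None):
    # RANSAC for y = a*x + b: draw minimal samples of 2 points, score each
    # hypothesis by its inlier count, then refit on the best consensus set.
    if rng is None:
        rng = np.random.default_rng(0)
    best = np.zeros(len(pts), dtype=bool)
    for _ in range(iters):
        i, j = rng.choice(len(pts), size=2, replace=False)
        (xa, ya), (xb, yb) = pts[i], pts[j]
        if xa == xb:
            continue                     # degenerate minimal sample
        a = (yb - ya) / (xb - xa)
        b = ya - a * xa
        inliers = np.abs(pts[:, 1] - (a * pts[:, 0] + b)) < thresh
        if inliers.sum() > best.sum():
            best = inliers
    a, b = np.polyfit(pts[best, 0], pts[best, 1], 1)   # least-squares refit
    return a, b, best

# 70 points near y = 2x + 1 plus 30 gross outliers.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 100)
y = 2.0 * x + 1.0 + 0.01 * rng.normal(size=100)
pts = np.column_stack([x, y])
pts[:30, 1] += rng.uniform(1.0, 5.0, 30)   # corrupt 30% of the points
a, b, inliers = ransac_line(pts)
```

RANSAC samples uniformly, so its cost grows quickly with the outlier ratio; BaySAC instead samples the points currently most likely to be inliers, which is the efficiency the proposed method builds on.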
Zhizhong Kang and Fengman Jia are with the Department of Remote Sensing and Geo-Information Engineering, School of Land Science and Technology, China University of Geosciences, Xueyuan Road 29, Haidian District, Beijing 100083, China.

Liqiang Zhang is with the Research Center for Remote Sensing and GIS, School of Geography, Beijing Normal University, 100875 Beijing, China.
Photogrammetric Engineering & Remote Sensing
Vol. 80, No. 11, November 2014, pp. 1041–1052.
0099-1112/14/8011–1041
© 2014 American Society for Photogrammetry
and Remote Sensing
doi: 10.14358/PERS.80.11.1041