
RPC-Based Coregistration of VHR Imagery
for Urban Change Detection
Shabnam Jabari and Yun Zhang
Abstract
In urban change detection, coregistration between bi-temporal Very High Resolution (VHR) images taken from different viewing angles, especially from high off-nadir angles, is very challenging. The relief displacements of elevated objects in such images usually lead to significant misregistration that negatively affects the accuracy of change detection. This paper presents a novel solution, called Patch-Wise CoRegistration (PWCR), that can overcome the misregistration problem caused by viewing angle differences and accordingly improve the accuracy of urban change detection. The PWCR method utilizes a Digital Surface Model (DSM) and the Rational Polynomial Coefficients (RPCs) of the images to find corresponding points in a bi-temporal image set. The corresponding points are then used to generate corresponding patches in the image set. To demonstrate that the PWCR method can overcome the misregistration problem and help achieve accurate change detection, two change detection criteria are tested and incorporated into a change detection framework. Experiments on four bi-temporal image sets acquired by the Ikonos, GeoEye-1, and WorldView-2 satellites from different viewing angles show that the PWCR method achieves highly accurate image patch coregistration (up to 80 percent higher than traditional coregistration for elevated objects), so that the change detection framework can produce accurate urban change detection results (over 90 percent accuracy).
Introduction
Urban change detection matters to a large number of organizations, such as municipalities and local governments, for a wide range of applications including map updating and hazard assessment. VHR satellite images have been increasingly used for urban change detection because they can provide adequate detail of urban environments (Armenakis et al., 2010). Changes in urban areas can be detected by comparing corresponding pixels/objects in bi/multi-temporal satellite images, provided the spatial relation between the corresponding pixels/objects is known. Hence, change detection is normally conducted in the following three steps:
1. establishing a spatial relation between the bi-temporal images (coregistration),
2. specifying the element of change detection: object (through segmentation) or pixel,
3. indicating the change: analyzing the spectral or spatial features of the objects or pixels to find changes.
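Step 1 is where the PWCR method departs from traditional coregistration: it maps ground points into each image through the sensor's RPCs. As background, the standard RPC00B ground-to-image model can be sketched as follows. This is a minimal illustration of the rational-polynomial camera model itself, not the authors' implementation; the dictionary keys mirror the common RPC00B field names and are assumptions of this sketch.

```python
def rpc00b_terms(P, L, H):
    """The 20 cubic polynomial terms, in standard RPC00B coefficient order.
    P, L, H are the normalized latitude, longitude, and height."""
    return [1.0, L, P, H, L*P, L*H, P*H, L**2, P**2, H**2,
            P*L*H, L**3, L*P**2, L*H**2, L**2*P, P**3, P*H**2,
            L**2*H, P**2*H, H**3]

def ground_to_image(lat, lon, height, rpc):
    """Project a ground point (lat, lon, height) to image (row, col)
    using the RPC00B rational-polynomial model."""
    # Normalize the ground coordinates with the RPC offsets/scales.
    P = (lat - rpc["LAT_OFF"]) / rpc["LAT_SCALE"]
    L = (lon - rpc["LONG_OFF"]) / rpc["LONG_SCALE"]
    H = (height - rpc["HEIGHT_OFF"]) / rpc["HEIGHT_SCALE"]
    t = rpc00b_terms(P, L, H)
    dot = lambda coeffs: sum(a * v for a, v in zip(coeffs, t))
    # Each image coordinate is a ratio of two cubic polynomials.
    line_n = dot(rpc["LINE_NUM_COEFF"]) / dot(rpc["LINE_DEN_COEFF"])
    samp_n = dot(rpc["SAMP_NUM_COEFF"]) / dot(rpc["SAMP_DEN_COEFF"])
    # Denormalize back to pixel units.
    row = line_n * rpc["LINE_SCALE"] + rpc["LINE_OFF"]
    col = samp_n * rpc["SAMP_SCALE"] + rpc["SAMP_OFF"]
    return row, col
```

Projecting the same DSM cell through the RPCs of both acquisitions yields a pair of corresponding image points, which is the geometric core of patch-wise coregistration.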
Among these steps, the role of coregistration is crucial, since any error in coregistration, i.e., misregistration, directly affects the accuracy of change detection. Misregistration may cause two types of errors in change detection: errors of omission, in which a changed object is classified as unchanged, and errors of commission, in which an unchanged object is classified as changed (Sundaresan et al., 2007). In both cases, the change detection accuracy is decreased.
The coregistration task is even more challenging in urban VHR imagery acquired with high off-nadir viewing angles due to severe relief displacements (Chen et al., 2021). Because of this effect, the tops of elevated objects, e.g., buildings in urban areas, lean away from their footprints, exposing parts of their side exteriors, i.e., building façades, and blocking lower objects such as roads. The latter effect is called occlusion. This leaning can occur in different directions depending on the image viewing angles, so that the coregistration of corresponding pixels/objects in bi-temporal images becomes extremely difficult.
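The magnitude of this displacement is easy to quantify: to first order, a vertical structure of height h imaged at off-nadir angle θ is shifted on the ground by roughly h·tan(θ). A back-of-envelope illustration (the building height, angle, and ground sample distance below are assumed example values, not figures from this paper):

```python
import math

def relief_displacement_px(height_m, off_nadir_deg, gsd_m):
    """First-order relief displacement of a vertical structure:
    ground shift ~ height * tan(off-nadir angle), expressed in pixels."""
    shift_m = height_m * math.tan(math.radians(off_nadir_deg))
    return shift_m / gsd_m

# A 30 m building at a 30-degree off-nadir angle in 0.5 m GSD imagery
# leans by roughly 30 * tan(30 deg) ~ 17.3 m, i.e., about 35 pixels.
```

A shift of tens of pixels, in a different direction in each acquisition, is far beyond what a global polynomial warp can absorb, which is why pixel-to-pixel comparison fails for elevated objects.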
Although object-based methods can partially compensate for misregistration errors, because they use parameters such as the mean value of a group of pixels to represent the related object (Blaschke, 2003), the extensive misregistration caused by different viewing angles cannot be compensated for by such methods. As a result, errors such as sliver polygons produced by misregistration are unavoidable in change detection.
To avoid misregistration errors, our literature review indicates, the majority of studies in urban change detection have used close-to-nadir imagery (Zhou et al., 2008; Al-Khudhairy et al., 2005; Bouziani et al., 2010; Im and Jensen, 2005; Gueguen et al., 2011) or off-nadir images of flat areas (Doxani et al., 2012; Niemeyer et al., 2008; Im and Jensen, 2005; Im et al., 2007). Other researchers used ortho-rectified images to attenuate the relief displacement distortions (Niemeyer et al., 2008; Doxani et al., 2012; Im et al., 2008). Another group of studies utilized bi-temporal DSMs (Choi et al., 2009) or bi-temporal DSMs together with images (Tian et al., 2014; Jung, 2004). In these studies, two DSMs, produced by either lidar or two sets of stereo images, are required, and the change detection algorithm is guided by elevation differences.
However, almost all VHR images are taken from different viewing angles because of the sensors' agility, which allows them to quickly capture ground areas of interest within a 45° off-nadir angle. Few of these images are suitable for producing stereo-derived bi-temporal DSMs. Therefore, the abovementioned methods cannot use most of the existing VHR images for change detection.
To utilize off-nadir images for change detection, the studies led by Pollard et al. (2010) used a voxel-based approach. In this approach, each image is considered an instance of a volumetric space with intensities following Bayesian probability theory. The intensities are updated every time
Shabnam Jabari is with the Department of Geodesy and Geomatics Engineering, University of New Brunswick, 15 Dineen Dr., Room E13, P.O. Box 4400, Fredericton, New Brunswick, Canada E3B 5A3.
Yun Zhang is with the Department of Geodesy and Geomatics
Engineering, University of New Brunswick, Guest Professor
of Peking University, and Visiting Professor of MIT (2015), 15
Dineen Dr., Fredericton, NB, Canada E3B 5A3.
Photogrammetric Engineering & Remote Sensing
Vol. 82, No. 7, July 2016, pp. 521–534.
0099-1112/16/521–534
© 2016 American Society for Photogrammetry
and Remote Sensing
doi: 10.14358/PERS.82.7.521