back-projection of the DSM into image spaces, the hypothetical matching points, here referred to as corresponding points, are detected (Figure 2b). For this projection, the Rational Function Model (Equations 1) is used (Grodecki, 2001):
\[
\tilde{x} = \frac{P_1(X, Y, Z)}{P_2(X, Y, Z)}, \qquad
\tilde{y} = \frac{P_3(X, Y, Z)}{P_4(X, Y, Z)},
\]
\[
P(X, Y, Z) = \sum_{a=0}^{m} \sum_{b=0}^{m} \sum_{c=0}^{m} A_{abc}\, X^{a} Y^{b} Z^{c},
\qquad 0 \le a + b + c \le m
\tag{1}
\]
where x̃ and ỹ are normalized image coordinates, and X, Y, and Z are normalized ground coordinates; m is generally set to 3. In this study, since bi-temporal images are used, the above equations are re-written for the image i (i = 1 or 2) and ground point j as:
\[
\begin{bmatrix} \tilde{x} \\ \tilde{y} \end{bmatrix}_{ij} = G_i (X, Y, Z)_j
\tag{2}
\]
where G_i(X, Y, Z)_j is the transformation based on the RFM equations.
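As an illustration of how the operator G_i in Equation 2 can be evaluated, the following is a minimal Python sketch of an RFM back-projection. It assumes the RPCs are available as a dictionary holding the usual offset/scale terms and four 20-term cubic coefficient vectors; the key names and the monomial ordering are assumptions made here for illustration, not a specific vendor convention.

```python
import numpy as np

def rfm_project(X, Y, Z, rpc):
    """Back-project a ground point (X = longitude, Y = latitude, Z = height)
    into image space with the RFM of Equation 1 (hypothetical rpc layout)."""
    # Normalize the ground coordinates
    Xn = (X - rpc["lon_off"]) / rpc["lon_scale"]
    Yn = (Y - rpc["lat_off"]) / rpc["lat_scale"]
    Zn = (Z - rpc["h_off"]) / rpc["h_scale"]

    def poly(c):
        # One cubic polynomial P(X, Y, Z) with 20 terms (m = 3);
        # the term order here is illustrative only.
        t = np.array([1.0, Xn, Yn, Zn, Xn*Yn, Xn*Zn, Yn*Zn, Xn**2, Yn**2, Zn**2,
                      Xn*Yn*Zn, Xn**3, Xn*Yn**2, Xn*Zn**2, Xn**2*Yn, Yn**3,
                      Yn*Zn**2, Xn**2*Zn, Yn**2*Zn, Zn**3])
        return float(np.dot(c, t))

    # Normalized image coordinates: ratios of cubic polynomials (Equation 1)
    x_n = poly(rpc["num_x"]) / poly(rpc["den_x"])
    y_n = poly(rpc["num_y"]) / poly(rpc["den_y"])

    # De-normalize to pixel coordinates
    return x_n * rpc["x_scale"] + rpc["x_off"], y_n * rpc["y_scale"] + rpc["y_off"]
```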
However, RPCs provided by imaging vendors have inherent uncertainties due to small attitude or ephemeris errors, which manifest themselves as biases in the image coordinate space (Fraser and Hanley, 2005). Fraser (2003) demonstrated that the bias vectors, which result from a direct comparison between the back-projected GCPs in the image space, using RPCs, and the corresponding image points, are fairly invariant within an image and can be modeled using an affine transformation, regardless of the type of terrain. In their experiments, the standard error of the biases was around half a pixel. Applying the affine bias compensation to the image points, the adjusted RFM equation can be written as:
\[
\begin{bmatrix} \hat{x} \\ \hat{y} \end{bmatrix}_{ij} = T_i\, G_i (X, Y, Z)_j
\tag{3}
\]
where x̂ and ŷ are bias-compensated image coordinates, and T_i is a 2D affine transformation given in:
\[
\begin{bmatrix} \hat{x} \\ \hat{y} \\ 1 \end{bmatrix}_{ij}
= T_i \begin{bmatrix} \tilde{x} \\ \tilde{y} \\ 1 \end{bmatrix}_{ij}
= \begin{bmatrix} m_{11} & m_{12} & m_{13} \\ m_{21} & m_{22} & m_{23} \\ 0 & 0 & 1 \end{bmatrix}
  \begin{bmatrix} \tilde{x} \\ \tilde{y} \\ 1 \end{bmatrix}_{ij}
\tag{4}
\]
where m_kl, k = 1:2, l = 1:3, are the unknown coefficients of the affine transformation, for which at least three control points are required. Therefore, using the adjusted RFM equations, a back-projection from ground space to image space can be generated with an accuracy better than one pixel. Figure 3 depicts how uncorrected RPCs and an affine transformation in each image are used to relate the corresponding image points.
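A minimal sketch of how the six unknown coefficients of T_i could be estimated by least squares is given below, assuming each control point provides an uncorrected back-projected coordinate pair (x̃, ỹ) and a measured image coordinate pair (x̂, ŷ); the function and array names are hypothetical.

```python
import numpy as np

def estimate_affine_bias(xy_rfm, xy_measured):
    """Least-squares estimate of the 2D affine bias compensation T_i (Equation 4).

    xy_rfm      : (n, 2) uncorrected RPC back-projections (x~, y~) of the control points
    xy_measured : (n, 2) corresponding measured image coordinates (x^, y^)
    Requires n >= 3 control points for the six unknowns m_11..m_23.
    """
    n = xy_rfm.shape[0]
    # Homogeneous design matrix [x~, y~, 1]
    A = np.hstack([xy_rfm, np.ones((n, 1))])
    # Solve for the two unknown rows of T_i independently
    row1, *_ = np.linalg.lstsq(A, xy_measured[:, 0], rcond=None)
    row2, *_ = np.linalg.lstsq(A, xy_measured[:, 1], rcond=None)
    return np.vstack([row1, row2, [0.0, 0.0, 1.0]])

def apply_bias_compensation(T, xy_rfm):
    """Apply T_i to uncorrected image coordinates (Equation 3)."""
    pts_h = np.hstack([xy_rfm, np.ones((xy_rfm.shape[0], 1))])
    return (T @ pts_h.T).T[:, :2]
```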
The process of finding the corresponding image points is done indirectly using the DSM as an indicator. By finding the corresponding points, a look-up table (LUT) is generated. In each row of the LUT, the DSM pixel ground coordinates and the corresponding image points are given. Therefore, for each DSM pixel a record [x_1, y_1, x_2, y_2, X, Y, Z, S_k]_j, j ∈ {1, 2, …, N}, is generated, where (x_1, y_1) are base image coordinates, (x_2, y_2) are target image coordinates, (X, Y, Z) are ground coordinates from the DSM, S_k is the patch ID from the base image segmentation, and N is the total number of pixels in the DSM.
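The LUT construction could then be sketched as follows, reusing the hypothetical rfm_project() and bias-compensation helpers above. The array layout mirrors the [x_1, y_1, x_2, y_2, X, Y, Z, S_k] record described in the text, but all function and variable names are assumptions for illustration.

```python
import numpy as np

def build_lut(dsm_points, patch_ids, rpc1, T1, rpc2, T2):
    """Build the per-pixel correspondence look-up table (LUT).

    dsm_points : (N, 3) array of DSM ground coordinates (X, Y, Z)
    patch_ids  : (N,) array of patch IDs S_k from the base image segmentation
    rpc1, rpc2 : RPC dictionaries of the base and target images
    T1, T2     : 3x3 affine bias-compensation matrices (Equation 4)
    """
    rows = []
    for (X, Y, Z), s_k in zip(dsm_points, patch_ids):
        # Uncorrected back-projection into each image (operators G_1 and G_2)
        xy1 = np.array([rfm_project(X, Y, Z, rpc1)])
        xy2 = np.array([rfm_project(X, Y, Z, rpc2)])
        # Affine bias compensation (T_1 and T_2)
        x1, y1 = apply_bias_compensation(T1, xy1)[0]
        x2, y2 = apply_bias_compensation(T2, xy2)[0]
        rows.append([x1, y1, x2, y2, X, Y, Z, s_k])
    # One LUT row per DSM pixel: [x1, y1, x2, y2, X, Y, Z, S_k]
    return np.array(rows)
```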
Step 2: Occlusion and Façade Removal
In the second step, false change detection results, caused by relief displacement and the resulting occlusion, must be prevented. This effect is schematically depicted in Figure 4. In this figure, AE is the hypothetical curve mapping the point A in the object space to the point E in the base image space (AE is represented by a curve since in satellite imagery the collinearity equations are replaced by the RFM; therefore, AE is not a straight line). Although all of the points A, B,
Figure 3. Indirect matching of corresponding points using back-projection of DSM pixels into image spaces: (a) A schematic representation of back-projection of a ground pixel, A, to the image spaces using RFM (G1 and G2 operators). Using the uncorrected RPCs, the ground point A is projected to ã′ and ã″ in the bi-temporal images. Afterwards, by performing an affine transformation (T1 and T2), ã′ and ã″ are transferred to their correct places a′ and a″, respectively, resulting in the indirect matching of points a′ and a″; and (b) The same fact is shown with a real example. Some sample pixels (the corner pixels of a building) in the DSM are projected to the bi-temporal images using uncorrected RPCs. With T1 and T2 affine transformations in the image spaces, the associated image points are transferred to their correct positions.