PHOTOGRAMMETRIC ENGINEERING & REMOTE SENSING
November 2016
Automated Relative Orientation of UAV-Based
Imagery in the Presence of Prior Information
for the Flight Trajectory
Fangning He and Ayman Habib
Abstract
UAV-based 3D reconstruction has been used in various applications. However, mitigating the impact of outliers in automatically matched points remains a challenging task. Assuming the availability of prior information regarding the UAV trajectory, this paper presents two approaches for reliable estimation of Relative Orientation Parameters (ROPs) in the presence of a high percentage of matching outliers. The first approach, which assumes that the UAV platform is moving at a constant flying height while maintaining the camera in a nadir-looking orientation, provides a two-point closed-form solution. The second approach starts from prior information regarding the flight trajectory to define a linearized model, which is augmented with a built-in outlier removal procedure, to estimate a refined set of ROPs. Experimental results from real datasets demonstrate the feasibility of the proposed approaches in providing reliable ROPs from UAV-based imagery in the presence of a high percentage of matching outliers (up to 90 percent).
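The geometric intuition behind the first approach can be previewed as follows: a nadir-looking camera at a constant flying height relates overlapping images, to first order, by a 2D rigid motion — a rotation (kappa) about the vertical axis plus a horizontal shift — so three parameters suffice, and two point correspondences fix them. The sketch below is an illustrative simplification under that assumption, not the authors' actual closed-form derivation:

```python
import numpy as np

def two_point_rigid(p, q):
    """Estimate the rotation angle kappa and translation t of the 2D
    rigid motion mapping points p to points q, from exactly two
    correspondences. p, q: 2x2 arrays, one point per row."""
    # The direction between the two points in each image fixes kappa.
    dp = p[1] - p[0]
    dq = q[1] - q[0]
    kappa = np.arctan2(dq[1], dq[0]) - np.arctan2(dp[1], dp[0])
    c, s = np.cos(kappa), np.sin(kappa)
    R = np.array([[c, -s], [s, c]])
    # The translation then follows from the (rotated) point midpoints.
    t = q.mean(axis=0) - R @ p.mean(axis=0)
    return kappa, t

# Two correspondences: a 90-degree rotation plus a shift of (2, 1).
p = np.array([[0.0, 0.0], [1.0, 0.0]])
q = np.array([[2.0, 1.0], [2.0, 2.0]])
kappa, t = two_point_rigid(p, q)
```

With noisy matches the two-point solution is what a RANSAC loop would draw minimal samples for, which is how such a model tolerates the high outlier percentages quoted above.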
Introduction
Accurate 3D modeling has become a key prerequisite for several applications, such as urban planning, archaeological documentation, environmental monitoring, disaster aftermath assessment, change detection, precision agriculture, and military applications. 3D reconstruction/representation of objects can be achieved through either active or passive remote sensing systems. Due to financial and technical constraints, passive sensor systems, which commonly use digital line/frame imaging sensors, are still an optimum option for various 3D reconstruction applications (Remondino and El-Hakim, 2006). Within the photogrammetric research community, automation of image-based 3D reconstruction has been investigated for decades. In order to derive high-quality 3D reconstruction, conventional photogrammetric mapping requires knowledge of the Interior Orientation Parameters (IOPs) of the utilized cameras, the Exterior Orientation Parameters (EOPs) of the involved images, and corresponding points/features within overlapping images. The IOPs can be derived from a camera calibration process (Fraser, 1997; Habib and Morgan, 2003). The EOPs of the involved imagery can be derived through either an indirect or a direct geo-referencing process (Cramer et al., 2000; Skaloud, 2002). For indirect geo-referencing, the image EOPs are indirectly established using tie and control points. However, the identification of reliable tie points and the set-up of control points are time-consuming and costly activities. Although direct geo-referencing simplifies the derivation of the EOPs for each exposure station, a significant initial investment for the acquisition of a high-end GNSS/INS Position and Orientation System (POS) is required, especially when seeking a high level of reconstruction accuracy. For 3D reconstruction, whether it is based on indirect or direct georeferencing, we need to automatically identify conjugate points in overlapping images (this is commonly known as the matching problem). Image matching can be a challenging task when dealing with imagery that has poor and/or repetitive texture. Therefore, one can argue that the adoption of conventional photogrammetric mapping techniques for image-based 3D reconstruction, especially for some emerging applications such as precision agriculture, can be limited.
Large-area 3D reconstruction has been traditionally established using manned-airborne data acquisition platforms. Unmanned Aerial Vehicles (UAVs) have recently emerged as a promising geospatial data acquisition system. This promise is mainly attributed to recent advances in low-cost direct georeferencing systems as well as imaging sensors operating at different portions of the electromagnetic spectrum. Compared to manned-airborne systems, the advantages of UAVs include their low cost, ease of storage and deployment, ability to fly lower and collect high-resolution data with consumer-grade cameras, and filling an important gap between wheel-based and manned-airborne platforms. To date, several research efforts have been geared towards the use of UAVs for small-area mapping applications (He et al., 2015; He and Habib, 2014; Lari et al., 2015). Structure from Motion (SfM), which was initiated by the computer vision research community, has been widely adopted for UAV-based 3D reconstruction. Similar to the procedure that has been adopted by the photogrammetric community for decades, SfM is implemented in three steps to simultaneously estimate the EOPs of the involved images and derive the 3D coordinates of matched features within the overlap area (Hartley and Zisserman, 2003; Huang and Netravali, 1994). In the first step, the Relative Orientation Parameters (ROPs) relating stereo-images are initially estimated using automatically identified conjugate point and/or line features. Then, a local reference coordinate system is established to define an arbitrary datum for deriving the image EOPs as well as the 3D coordinates of matched points. Finally, a bundle adjustment procedure is implemented to refine the EOPs and object coordinates derived in the second step. The bundle adjustment procedure can achieve the best 3D reconstruction accuracy provided that sufficiently accurate approximations of the unknowns and correct feature correspondences are utilized (Stewenius et al., 2006). In this regard, one should note that accurate estimation of the ROPs, which define the position and orientation of one image relative to another, is a prerequisite for any 3D reconstruction using SfM (Horn, 1990a).
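The first SfM step above — recovering the ROPs from matched points — is commonly realized through the essential matrix, which encodes the relative rotation and baseline direction between two calibrated images. A minimal sketch of the standard SVD-based decomposition (following Hartley and Zisserman, 2003, not the methods proposed in this paper) that yields the four candidate rotation/baseline pairs:

```python
import numpy as np

def decompose_essential(E):
    """Return the four candidate (R, t) relative-orientation pairs
    encoded by an essential matrix E; t is the baseline direction,
    recoverable only up to sign and scale."""
    U, _, Vt = np.linalg.svd(E)
    # Enforce proper rotations (determinant +1) on the SVD factors.
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0., -1., 0.],
                  [1.,  0., 0.],
                  [0.,  0., 1.]])
    t = U[:, 2]  # null direction of E^T: the baseline, up to sign
    return [(U @ W @ Vt,  t), (U @ W @ Vt,  -t),
            (U @ W.T @ Vt, t), (U @ W.T @ Vt, -t)]

def skew(v):
    """Skew-symmetric cross-product matrix [v]_x."""
    return np.array([[0., -v[2],  v[1]],
                     [v[2],  0., -v[0]],
                     [-v[1], v[0],  0.]])

# Ground truth: a 5-degree rotation about the vertical axis plus a
# unit baseline along X, combined as E = [t]_x R.
k = np.deg2rad(5.0)
R_true = np.array([[np.cos(k), -np.sin(k), 0.],
                   [np.sin(k),  np.cos(k), 0.],
                   [0.,         0.,        1.]])
t_true = np.array([1., 0., 0.])
candidates = decompose_essential(skew(t_true) @ R_true)
```

Exactly one of the four candidates places reconstructed points in front of both cameras (the cheirality check), which is how the correct ROP set is selected in practice.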
In the past few decades, recovery of the ROPs has been in-
Lyles School of Civil Engineering, Purdue University, West Lafayette, IN 47906.
Photogrammetric Engineering & Remote Sensing
Vol. 82, No. 11, November 2016, pp. 879–891.
0099-1112/16/879–891
© 2016 American Society for Photogrammetry
and Remote Sensing
doi: 10.14358/PERS.82.11.879