Image-Based Mobile Mapping
for 3D Urban Data Capture
Stefan Cavegn and Norbert Haala
Abstract
Ongoing innovations in dense multi-view stereo image matching now allow for 3D data collection using image sequences captured from mobile mapping platforms, even in complex and densely built-up areas. However, the extraction of dense and precise 3D point clouds from such street-level imagery requires high-quality georeferencing as a first processing step. While standard direct georeferencing solves this task in open areas, poor GNSS coverage in densely built-up areas and urban canyons frequently prevents sufficient accuracy and reliability. We therefore use bundle block adjustment, which additionally integrates tie and control point information, for precise georeferencing of our multi-camera mobile mapping system. This subsequently allows the adaptation of a state-of-the-art dense image matching pipeline to provide a suitable 3D representation of the captured urban structures. In addition to presenting the different processing steps, this paper also provides an evaluation of the achieved image-based 3D capture in a dense urban environment.
Introduction
For a considerable period, 3D data capture from mobile mapping systems was completely (Puente et al., 2013) or primarily (Kersten et al., 2009) based on lidar sensors. Meanwhile, advances in Visual Odometry (Scaramuzza and Fraundorfer, 2011; Fraundorfer and Scaramuzza, 2012), Simultaneous Localization and Mapping (Cadena et al., 2016), Structure-from-Motion (Schönberger and Frahm, 2016), and Dense Image Matching (Remondino et al., 2014) also enable the use of camera-based systems for highly efficient and accurate 3D mapping, even in complex urban environments (Pollefeys et al., 2008; Gallup, 2011). In this paper, we present the adaptation of the stereo matching software SURE (Rothermel et al., 2012) for the evaluation of image sequences collected from a multi-camera mobile mapping system. The use of multiple cameras potentially provides large redundancy during photogrammetric processing, which, for example, has proved very beneficial for tasks like the extraction of Digital Surface Models (DSM) from airborne imagery (Haala, 2014).
However, dense matching of street-level imagery captured from mobile mapping systems is frequently more challenging than airborne data collection, which can usually be handled by state-of-the-art software tools. One reason is the large variation of object distances from the terrestrial viewpoints. This can result in considerable variations of image scale, which must be accounted for during several matching steps. Problems also arise from multiple occlusions due to the complex 3D geometry, especially in dense urban areas. In contrast to the rather simple 2.5D processing during DSM generation from airborne views, this geometry also requires steps like filtering and data fusion to be implemented in true 3D space. Furthermore, more complex data structures like (meshed) 3D points have to be generated to encode the complex geometry of such built-up environments; these are useful for subsequent tasks like the automatic reconstruction of building façades (Tutzauer and Haala, 2015).
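To make the contrast with 2.5D DSM filtering concrete, the following minimal Python sketch pools 3D points back-projected from several per-view depth maps into a voxel grid and keeps only voxels supported by multiple views. This is an illustrative simplification, not the fusion implemented in SURE; the function name, voxel size, and the min_views threshold are our assumptions.

```python
import numpy as np

def fuse_points_3d(point_sets, voxel_size=0.05, min_views=2):
    """Illustrative true-3D fusion: pool 3D points from several
    depth maps into a voxel grid and keep only voxels observed
    from at least `min_views` viewpoints, suppressing matching
    outliers and occlusion artifacts."""
    voxels = {}  # voxel index -> (set of view ids, list of points)
    for view_id, pts in enumerate(point_sets):
        for p in np.asarray(pts, dtype=float):
            key = tuple(np.floor(p / voxel_size).astype(int))
            views, acc = voxels.setdefault(key, (set(), []))
            views.add(view_id)
            acc.append(p)
    # average the points in each sufficiently supported voxel
    return np.array([np.mean(acc, axis=0)
                     for views, acc in voxels.values()
                     if len(views) >= min_views])
```

In contrast to a 2.5D DSM, where fusion reduces to per-cell statistics over heights, the voxel keys here index full 3D space, so vertical structures such as façades are retained.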
The mobile mapping system used for our investigations alternatively aims at the collection of so-called geospatial 3D image spaces (Nebiker et al., 2015). These are georeferenced RGB-D images to be used for tasks like 3D monoplotting, where a user can accurately measure 3D coordinates at features of interest simply by clicking on a location within the 2D imagery. Linking each pixel to a corresponding 3D coordinate of course presumes both image georeferencing and matching at the pixel level.
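The geometry behind such a monoplotting measurement is simply a back-projection of the clicked pixel along its viewing ray. Below is a minimal sketch, assuming a pinhole camera with calibration matrix K and exterior orientation (R, t) such that x_cam = R X_world + t; these symbols and the function name are ours for illustration, not part of the system described here.

```python
import numpy as np

def monoplot(u, v, depth, K, R, t):
    """Back-project a clicked pixel (u, v) with known depth (from the
    RGB-D image) to a 3D world coordinate."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # viewing ray in camera frame
    x_cam = ray_cam * (depth / ray_cam[2])              # scale ray so its Z equals depth
    return R.T @ (x_cam - t)                            # camera -> world coordinates
```

With georeferenced camera poses, the returned coordinate lies directly in the mapping frame; the achievable accuracy then hinges on both the pose quality and the pixel-level depth discussed next.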
Our mobile mapping system, which is described in more detail in the next section, features a ground sampling distance (GSD) of 1 cm at a distance of 23 m from the system. This reduces to 2 to 6 mm for the more typical measurement range of 4 to 14 m.
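For a pinhole camera, the GSD scales linearly with object distance: GSD = d / f_px, with d the object distance and f_px the focal length in pixels. The short sketch below approximately reproduces the figures quoted above; the focal length of 2300 px is our assumption for illustration, not the actual calibration of the system.

```python
def gsd(distance_m, focal_length_px):
    """Ground sampling distance of a pinhole camera: one pixel
    covers distance / focal_length meters at the object."""
    return distance_m / focal_length_px

# An assumed focal length of ~2300 px roughly matches the text:
# ~1 cm GSD at 23 m and about 2-6 mm over the 4-14 m range.
f_px = 2300.0
for d in (4.0, 14.0, 23.0):
    print(f"{d:4.0f} m -> GSD {gsd(d, f_px) * 1000:.1f} mm")
```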
Direct georeferencing, as usually applied by mobile mapping systems, allows for image registration at the centimeter level in open areas with good GNSS conditions. As an example, in such an environment Burkhard et al. (2012) obtained absolute 3D point measurement accuracies of 4 to 5 cm on average for their stereovision mobile mapping system. However, our applications aim at image-based mobile mapping for 3D urban data capture. Therefore, our test scenarios presented in the following section mainly include densely built-up urban areas, where multipath effects and signal shading by trees and buildings aggravate this process. Thus, as discussed in the Image Orientation by Image-Based Georeferencing Section, image-based georeferencing is required, which improves the results from direct georeferencing by a supplementary bundle block adjustment using additional tie and control point observations. These results then allow for a high-quality alignment of the respective image sequences as a prerequisite for the multi-view stereo matching presented in the Dense Multi-View Stereo Matching Section. As demonstrated by our investigations, the accuracy, reliability, and completeness of products like 3D point clouds benefit considerably from the redundancy available during image-based mobile mapping for urban data capture.
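Conceptually, such a bundle block adjustment minimizes a weighted residual vector in which tie points contribute reprojection residuals and ground control points additionally constrain object coordinates. The following sketch only illustrates this structure under assumed data layouts and standard deviations; it is not the adjustment software used in this work.

```python
import numpy as np

def project(X, K, R, t):
    """Pinhole projection of world point X into a camera with
    calibration K and exterior orientation (R, t)."""
    x = K @ (R @ X + t)
    return x[:2] / x[2]

def residuals(tie_obs, gcp_obs, points, cams, sigma_img=0.5, sigma_gcp=0.01):
    """Weighted residuals of a bundle block adjustment: image
    observations of tie/control points plus direct coordinate
    observations for ground control points."""
    r = []
    for cam_id, pt_id, uv in tie_obs:        # (camera, point, measured pixel)
        K, R, t = cams[cam_id]
        r.extend((project(points[pt_id], K, R, t) - uv) / sigma_img)
    for pt_id, X_gcp in gcp_obs:             # surveyed control coordinates
        r.extend((points[pt_id] - X_gcp) / sigma_gcp)
    return np.array(r)
```

In practice, this residual vector would be minimized over all camera poses and object points with a sparse least-squares solver, with GNSS/INS-derived poses entering as further weighted observations.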
Mobile Mapping Platform and Test Scenario
All data used for the investigations presented in this paper was captured by the multi-sensor stereovision mobile mapping system.
Stefan Cavegn is with the Institute of Geomatics Engineering, FHNW University of Applied Sciences and Arts Northwestern Switzerland, Gruendenstrasse 40, 4132 Muttenz, Switzerland; and the Institute for Photogrammetry, University of Stuttgart, Geschwister-Scholl-Strasse 24, 70174 Stuttgart, Germany.
Norbert Haala is with the Institute for Photogrammetry, University of Stuttgart, Geschwister-Scholl-Strasse 24, 70174 Stuttgart, Germany.
Photogrammetric Engineering & Remote Sensing
Vol. 82, No. 12, December 2016, pp. 925–933.
0099-1112/16/925–933
© 2016 American Society for Photogrammetry and Remote Sensing
doi: 10.14358/PERS.82.12.925