
Improved Urban Scene Classification
Using Full-Waveform Lidar
M. Azadbakht, C. S. Fraser, and K. Khoshelham
Abstract
Full-waveform lidar data provide supplementary radiometric information as well as more accurate geometric target information when compared to discrete-return systems. In this research, a wide range of classes in an urban scene is considered, including trees, medium vegetation, low vegetation (grass), water bodies, pitched roofs, flat roofs, asphalt, vehicles, power lines, walls (fences) and concrete. In order to tackle the challenge of
distinguishing geometrically similar classes and enhancing
the separability of other targets, a new set of features based on
deconvolved waveforms is introduced. The positive effect of
the proposed feature dataset on classification accuracy in individual classes is shown using two ensemble classifiers (random forests and RUSBoost). Performance of the classifiers is
improved by integration with sampling techniques, especially
for the under-represented classes. The final output of the proposed method is a highly detailed land cover map of the urban scene, which affords good separability between critical classes.
Introduction
Full-waveform lidar systems provide an almost continuous representation of the received signal (Mallet and Bretar, 2009) in the form of amplitudes of returned pulses along the laser beam. These can be advantageous in characterizing targets. The recorded signal contains information regarding the geometry and radiometric characteristics of targets, but it is affected by various additional parameters, including flying height and the laser energy. Postprocessing of waveforms allows retrieval of point clouds with higher accuracy than those produced by discrete lidar systems. Also, valuable radiometric attributes of targets, such as the cross-section and its derivatives, can be recovered. The retrieved range-corrected target cross-section signal is independent of the laser instrument characteristics, and it is thus more representative of the targets. Moreover, the signal shape itself contains information that is of interest in applications such as forestry (Muss et al., 2013) and urban environments.
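As an illustration of the radiometric attributes referred to above, the sketch below computes a range-corrected backscatter cross-section for a single echo using the common full-waveform calibration model, in which the cross-section is proportional to echo amplitude, echo width and the fourth power of range. The calibration constant c_cal and the sample values are placeholders, not values from this paper.

    import numpy as np

    def backscatter_cross_section(amplitude, echo_width, target_range, c_cal=1.0):
        """Range-corrected backscatter cross-section of one echo.
        c_cal is a sensor- and flight-dependent calibration constant
        (placeholder value, not taken from this paper)."""
        return c_cal * target_range**4 * amplitude * echo_width

    # Hypothetical echo from a decomposed waveform: amplitude in digital
    # numbers, echo width in ns, range in metres.
    sigma = backscatter_cross_section(amplitude=87.0, echo_width=4.1, target_range=512.0)
    print(f"cross-section (relative units): {sigma:.3e}")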
Detailed spatial information of urban environments is advantageous in a variety of applications, including change detection, mapping specific targets, urban planning and management, and disaster management (e.g., flood modeling). Accurate mapping of these classes (e.g., power lines or roof types) is a time-consuming task without utilizing lidar data. Land cover classification using lidar, along with the application of different methodologies over diverse classes of targets, has already been reported by many researchers (Buján et al., 2012; Chehata et al., 2009; Mallet et al., 2011; Rodriguez-Galiano et al., 2012). A thorough review of proposed lidar-based classification methods in urban environments has been reported by Yan et al. (2015), and the relevance of specific features from full-waveform lidar data for classification in these environments has also been reported (Alexander et al., 2010; Mallet et al., 2011; Niemeyer et al., 2011). However, thus far, only a limited number of classes have been considered. For example, the three major classes of buildings, vegetation, and ground were investigated by Mallet et al. (2011) and Niemeyer et al. (2011). Alexander et al. (2010) reported classification using decision trees and waveform data over six targets in an urban scene, where a small number of samples was used to assess the classification result. A combined feature dataset of lidar and a satellite image was utilized by Guo et al. (2011), where per-class performance of the under-represented classes was not satisfactory.
Different target surfaces occur with different abundances in complex urban environments, which causes data imbalance, in that only small numbers of training samples are available for specific classes. This is of significant importance when training a classifier for a large number of classes, and it should be addressed in order to attain more acceptable results. In this study, two state-of-the-art classifiers, random forests and RUSBoost, have been selected for the investigation of a very detailed classification problem with 11 classes, in which the training dataset is heavily imbalanced towards a few of the classes. The feature dataset comprises geometric, radiometric and pulse-shape attributes extracted from waveforms. In order to balance the training samples, different portions of the majority classes are considered in the case of classification using random forests, while a random undersampling strategy is implemented in RUSBoost. The relevance of the waveform features is shown by analyzing their impact on the classification accuracy.
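As a minimal sketch of how such an imbalanced, 11-class problem could be set up with off-the-shelf tools (assuming scikit-learn and imbalanced-learn; the class re-weighting used here for the random forest is a simple stand-in for the paper's strategy of training on different portions of the majority classes), consider:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from imblearn.ensemble import RUSBoostClassifier  # boosting with random undersampling

    # Placeholder feature matrix and labels standing in for the per-point
    # geometric, radiometric and pulse-shape features and the 11 urban classes.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 20))
    y = rng.integers(0, 11, size=5000)

    # Random forest with per-class re-weighting (stand-in for majority-class sampling).
    rf = RandomForestClassifier(n_estimators=500,
                                class_weight="balanced_subsample",
                                n_jobs=-1, random_state=0).fit(X, y)

    # RUSBoost: each boosting round randomly undersamples the majority classes,
    # so the under-represented classes carry more weight in the ensemble.
    rus = RUSBoostClassifier(n_estimators=200, learning_rate=0.5,
                             random_state=0).fit(X, y)

    print(rf.score(X, y), rus.score(X, y))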
Methodology
In this study, features extracted from waveform lidar are considered in order to classify a dense urban environment. These features range from geometric to radiometric, and extend to additional features extracted from the waveform itself. Retrieval of the target response (cross-section) is based upon a robust deconvolution method developed in previous work (Azadbakht et al., 2016). Two ensemble classifiers are then applied to examine the role of both waveform features and the size of the training dataset in the identification of various target classes.
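Purely for illustration, the sketch below assembles one per-echo feature vector from attributes of the kind used in full-waveform classification studies (height above ground, echo-number ratio, local planarity, amplitude, cross-section, echo width, rise time); the specific feature names and values are assumptions, not the exact feature set of this paper.

    import numpy as np

    def build_feature_vector(echo):
        """Combine geometric, radiometric and pulse-shape attributes of one
        echo into a feature vector (illustrative feature set only)."""
        return np.array([
            echo["height_above_ground"],               # geometric
            echo["echo_number"] / echo["num_echoes"],  # position within the return train
            echo["planarity"],                         # from a local neighbourhood fit
            echo["amplitude"],                         # radiometric
            echo["backscatter_cross_section"],
            echo["echo_width"],                        # pulse shape
            echo["rise_time"],
        ])

    example_echo = {
        "height_above_ground": 6.2, "echo_number": 1, "num_echoes": 3,
        "planarity": 0.12, "amplitude": 87.0,
        "backscatter_cross_section": 0.35, "echo_width": 4.1, "rise_time": 2.3,
    }
    print(build_feature_vector(example_echo))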
Waveform Processing
In order to retrieve the target response from waveforms, a robust deconvolution method based upon sparsity-based regularization (Azadbakht et al., 2016; Azadbakht et al., 2014) is employed.
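The robust formulation itself is described in the cited work; purely as a hedged sketch of the underlying idea, the example below performs a generic L1-regularized (sparsity-promoting) deconvolution of a recorded waveform with an assumed emitted system pulse using iterative soft-thresholding, with lam and n_iter as placeholder parameters.

    import numpy as np
    from scipy.linalg import convolution_matrix

    def soft_threshold(v, t):
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def sparse_deconvolve(waveform, system_response, lam=0.05, n_iter=300):
        """Generic ISTA solver for 0.5*||waveform - h*x||^2 + lam*||x||_1,
        where h is the emitted system pulse and x the sparse target response
        (illustrative only, not the paper's robust formulation)."""
        n = len(waveform)
        A = convolution_matrix(system_response, n, mode="same")  # forward model h * x
        step = 1.0 / np.linalg.norm(A, 2) ** 2                   # 1 / Lipschitz constant
        x = np.zeros(n)
        for _ in range(n_iter):
            grad = A.T @ (A @ x - waveform)
            x = soft_threshold(x - step * grad, lam * step)
        return x

    # Synthetic example: two overlapping echoes blurred by a Gaussian system pulse.
    t = np.arange(-8, 9)
    h = np.exp(-0.5 * (t / 2.5) ** 2)
    x_true = np.zeros(120)
    x_true[[40, 48]] = [1.0, 0.6]
    g = np.convolve(h, x_true, mode="same") + 0.01 * np.random.default_rng(1).normal(size=120)
    x_hat = sparse_deconvolve(g, h, lam=0.1)
    print(np.flatnonzero(x_hat > 0.2))  # approximate echo sample positions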
M. Azadbakht and C. S. Fraser are with the Cooperative Research Centre for Spatial Information, VIC 3053, Australia; and the Department of Infrastructure Engineering, University of Melbourne, VIC 3010, Australia (m.azadbakht@student.unimelb.edu.au).
K. Khoshelham is with the Department of Infrastructure Engineering, University of Melbourne, VIC 3010, Australia.
Photogrammetric Engineering & Remote Sensing
Vol. 82, No. 12, December 2016, pp. 973–980.
0099-1112/16/973–980
© 2016 American Society for Photogrammetry
and Remote Sensing
doi: 10.14358/PERS.82.12.973