
for selected Hyperion bands (Figure 2). The Hyperion sensor records data in 242 channels with 30 m spatial resolution and ~10 nm spectral resolution over the 355 to 2577 nm range. Only 198 of the 242 channels are calibrated (7 to 57 and 77 to 224) (USGS, 2011b; Beck, 2003). Furthermore, we removed channels affected by water absorption and selected 141 of the 242 channels for further processing. Overall, the channels removed were: 1-8, 58-81, 98-101, 119-134, 164-187, and 218-242 (Deshpande et al., 2013). DNs in channels 1 to 70 were divided by 40, and DNs in channels 71 to 242 were divided by 80, to convert them to radiance (W/m² sr µm) (Beck, 2003).
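For illustration, the two scale factors can be applied as below; this is a minimal sketch (the cube layout, array names, and function name are ours, not from the paper), assuming the image is held as a NumPy array of shape (bands, rows, cols):

import numpy as np

def hyperion_dn_to_radiance(dn_cube, band_numbers):
    # Convert Hyperion DNs to radiance in W/m² sr µm (Beck, 2003):
    # channels 1-70 (VNIR) are divided by 40, channels 71-242 (SWIR) by 80.
    scale = np.where(np.asarray(band_numbers) <= 70, 40.0, 80.0)
    return dn_cube.astype(np.float64) / scale[:, None, None]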
After converting the DNs to radiance values, we removed the additive component of atmospheric effects using an extension of the improved dark object technique (Chavez, 1988) (Figure 2). We selected a very clear atmospheric model, that is, path radiance is inversely proportional to the fourth power of the wavelength, and chose channel 52 (874.53 nm) for selecting the initial path radiance value. The acquired scene contains large water bodies that should indicate zero radiance in channel 52; hence, in this case the dark bodies were water bodies. We selected 0.5 W/m² sr µm as the starting path radiance value to avoid overcorrection of the channels beyond the visible range. Correction factors for all wavelengths were calculated for “very clear” atmospheric conditions. The initial path radiance value is then multiplied by the correction factor for each channel to arrive at the radiance value that needs to be subtracted from that channel (Deshpande et al., 2013).
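Under this model, the correction factor for a channel at wavelength λ relative to the reference channel at λ0 = 874.53 nm is (λ0/λ)^4. A minimal sketch of the subtraction step follows (our reading of the relative-scattering scaling; the function name and the clamp at zero are ours):

import numpy as np

def dark_object_subtract(radiance_cube, wavelengths_nm, l0=0.5, lambda0_nm=874.53):
    # Path radiance per channel under a "very clear" atmosphere: it scales
    # with wavelength**-4 from the starting value l0 (W/m² sr µm)
    # estimated over dark water in channel 52 (874.53 nm).
    wl = np.asarray(wavelengths_nm, dtype=np.float64)
    path = l0 * (lambda0_nm / wl) ** 4
    # Subtract the additive component, clamping negative results to zero.
    return np.clip(radiance_cube - path[:, None, None], 0.0, None)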
Next, we used FAR and IAR to calculate reflectance values for each pixel in the image (Figure 2). Log Residuals (Green et al., 1985) is one more image-based method that corrects atmospheric effects in a similar manner. However, the method does not create relative reflectance signatures that resemble laboratory spectra and hence was not considered for comparison.
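For reference, a minimal sketch of the two image-based normalizations, assuming IAR divides each pixel spectrum by the scene-mean spectrum and FAR divides by the mean spectrum of a bright, spectrally flat reference area (our reading of the method names; the exact definitions follow Deshpande et al., 2013):

import numpy as np

def iar_reflectance(radiance_cube):
    # IAR: each pixel spectrum divided by the scene-mean spectrum.
    mean_spectrum = radiance_cube.reshape(radiance_cube.shape[0], -1).mean(axis=1)
    return radiance_cube / mean_spectrum[:, None, None]

def far_reflectance(radiance_cube, field_mask):
    # Flat-field style normalization: each pixel spectrum divided by the
    # mean spectrum of a reference area (field_mask: boolean rows × cols).
    field_spectrum = radiance_cube[:, field_mask].mean(axis=1)
    return radiance_cube / field_spectrum[:, None, None]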
In addition to IAR and FAR, we calculated reflectance using physics-based methods, namely the EO-1 equation (USGS, 2011) and 6SV (Vermote et al., 2006), to compare their discrimination of VIS classes with the image-based methods (Figure 2). The images used in the current experiments were from the month of April, which provides a relatively clear atmosphere compared with other months; hence, it is safe to use Equation 3. We also accounted for the additive component of atmospheric effects using the dark object technique, as explained previously. Next, the following 6SV parameters were used for calculating the reflectance calibration coefficients of each Hyperion band: aerosol profile, “Urban”; atmospheric profile, “UserWaterAndOzone” (2.240 g/cm² water vapor and 0.276 cm-atm ozone, respectively); and target_custom_altitude, 0.559 (AERONET, 2015).
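The quoted parameter names (“UserWaterAndOzone”, “target_custom_altitude”) correspond to the Py6S Python interface to 6SV; below is a sketch of how such a run could be configured (the wavelength and the input radiance are illustrative placeholders, not values from the paper):

from Py6S import SixS, AeroProfile, AtmosProfile, AtmosCorr, Wavelength

s = SixS()
s.aero_profile = AeroProfile.PredefinedType(AeroProfile.Urban)
# Column water vapour (g/cm²) and ozone (cm-atm) from the text above
s.atmos_profile = AtmosProfile.UserWaterAndOzone(2.240, 0.276)
s.altitudes.set_target_custom_altitude(0.559)  # target altitude in km
# Request the Lambertian atmospheric-correction coefficients
s.atmos_corr = AtmosCorr.AtmosCorrLambertianFromRadiance(40.0)  # placeholder radiance
s.wavelength = Wavelength(0.87453)  # e.g., Hyperion channel 52, in µm
s.run()
print(s.outputs.coef_xa, s.outputs.coef_xb, s.outputs.coef_xc)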
Next, we used the Spectral Angle Mapper (SAM) (Kruse et al., 1993) to detect VIS classes in the acquired images (Figure 2). The algorithm calculates the spectral angle between each reference spectrum and each pixel in the image. The label of the reference having the minimum spectral angle with the pixel is assigned to the pixel (Algorithm 1).
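A minimal sketch of this decision rule (our own NumPy rendering of Algorithm 1; names are ours):

import numpy as np

def spectral_angle(p, r):
    # Spectral angle (radians) between pixel spectrum p and reference r.
    cos_t = np.dot(p, r) / (np.linalg.norm(p) * np.linalg.norm(r))
    return np.arccos(np.clip(cos_t, -1.0, 1.0))

def sam_classify(pixels, references):
    # pixels: (np, nm) array; references: (nr, nm) array.
    # Each pixel receives the index of the minimum-angle reference.
    angles = np.array([[spectral_angle(p, r) for r in references]
                       for p in pixels])
    return angles.argmin(axis=1)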
Finally, we measured the accuracy of the results for all the experiments using a standard confusion matrix, and further calculated Producer’s and User’s accuracy. Producer’s accuracy indicates the fraction of reference pixels that are correctly identified by the classifier (omission errors), and User’s accuracy indicates the fraction of pixels assigned to a particular class that are true class pixels (commission errors) (Congalton, 1991). For example, in Experiment 1, reference signatures for VIS classes are extracted from GGPA, REPG, and PLMU (Tables 1 and 2), and accuracy is calculated using the TRUN, RENS, and PLCI regions (Tables 1 and 3), respectively. The overall accuracy of classification for Experiment 1 is 94 percent (IAR), with Producer’s and User’s accuracy of 100 percent for the Vegetation class for both IAR and FAR, User’s and Producer’s accuracy of 100 percent for the Impervious Surface class for both IAR and FAR, and so on (Tables 3 and 4).
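The two measures can be computed from the confusion matrix as sketched below (assuming rows hold reference counts and columns hold classified counts; this convention is sometimes transposed):

import numpy as np

def accuracy_measures(confusion):
    # confusion[i, j]: pixels of reference class i assigned to class j.
    cm = np.asarray(confusion, dtype=np.float64)
    correct = np.diag(cm)
    producers = correct / cm.sum(axis=1)  # 1 - omission error
    users = correct / cm.sum(axis=0)      # 1 - commission error
    overall = correct.sum() / cm.sum()
    return producers, users, overall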
Experimental Set Up for Classification Experiments
We performed various calibration and classification experiments, as described in the overall procedure, to compare intra-class / inter-class confusion, if any, using SAM. All reference signatures were taken from pure pixels within the Hyperion image (USGS, 2013a) and were verified by field inspection and/or from high-resolution imagery. A few mixture signatures for thematic classes were also extracted intentionally. As a strategy, reference signatures were extracted from small regions and tested on a large area. We selected test regions that are separated from the reference pixels by a large distance margin. All the experiments were performed on the 22 April image (USGS, 2013a). The detailed experimental set up is explained as follows:
Experiments 1 and 2 (Baseline)
We extracted reference signatures from the most representative materials of VIS classes. For example, the vegetation signature is taken from trees, soil signatures from bare soil areas with little or no grass cover, and impervious surface signatures from dense residential areas with concrete roofs. In Experiment 2, we changed the reference signature of impervious surface to concrete pavement and observed whether all impervious surface targets were detected.
Experiments 3, 4, and 5:
In Experiment 3, we extracted reference signatures of the subclasses of impervious surfaces: concrete and industrial roofs. We then further investigated (in Experiments 4 and 5) whether different income zones can be detected spectrally by extracting reference mixture signatures from the respective zones.
Experiments 6 and 7:
Similar to the experiments for impervious surfaces, we successively extracted reference signatures of various subclasses of soil and performed VIS classification.
Experiment 8:
We divided the vegetation reference signatures into two groups and performed VIS classification.
Below, we provide a description of the regions and their codes used to extract the reference signatures and to calculate the accuracy of classification. All the locations are in and around Pune City, India:
Trees
Urban tree cover (TRLP, TRUN); farms, green grass, etc., and other miscellaneous low-lying vegetation (GGRF, GGHB, GGHI).
Algorithm 1. Pixel Classification Using Spectral Angle Mapper
Input: lP = list of pixels in the image (np × nm), where np = number of pixels and nm = number of dimensions; lR = list of reference pixels (nr × nm), where nr = number of references
Output: lC = list of class labels for the corresponding pixels in the input list

Create empty lC (np × 1)
SAM(lP, lR)
{
  FOR every p_i in lP
    FOR every r_j in lR
      compute SAM(p_i, r_j)
    Class(p_i) = Class(MIN of {SAM(p_i, r_j) for all j})
    SET lC_i = Class(p_i)
  RETURN lC
}