
Image Analysis for ArcGIS

Geographic Imaging by ERDAS


December 2010
Copyright 2010 ERDAS, Inc.
All rights reserved.
Printed in the United States of America.
The information contained in this document is the exclusive property of ERDAS, Inc. This work is protected under United
States copyright law and other international copyright treaties and conventions. No part of this work may be reproduced
or transmitted in any form or by any means, electronic or mechanical, including photocopying and recording, or by any
information storage or retrieval system, except as expressly permitted in writing by ERDAS, Inc. All requests should be
sent to the attention of:
Manager, Technical Documentation
ERDAS, Inc.
5051 Peachtree Corners Circle
Suite 100
Norcross, GA 30092-2500 USA.
The information contained in this document is subject to change without notice.
Government Reserved Rights. MrSID technology incorporated in the Software was developed in part through a project
at the Los Alamos National Laboratory, funded by the U.S. Government, managed under contract by the University of
California (University), and is under exclusive commercial license to LizardTech, Inc. It is used under license from
LizardTech. MrSID is protected by U.S. Patent No. 5,710,835. Foreign patents pending. The U.S. Government and the
University have reserved rights in MrSID technology, including without limitation: (a) The U.S. Government has a non-
exclusive, nontransferable, irrevocable, paid-up license to practice or have practiced throughout the world, for or on
behalf of the United States, inventions covered by U.S. Patent No. 5,710,835 and has other rights under 35 U.S.C.
200-212 and applicable implementing regulations; (b) If LizardTech's rights in the MrSID Technology terminate during
the term of this Agreement, you may continue to use the Software. Any provisions of this license which could reasonably
be deemed to do so would then protect the University and/or the U.S. Government; and (c) The University has no
obligation to furnish any know-how, technical assistance, or technical data to users of MrSID software and makes no
warranty or representation as to the validity of U.S. Patent 5,710,835 nor that the MrSID Software will not infringe any
patent or other proprietary right. For further information about these provisions, contact LizardTech, 1008 Western Ave.,
Suite 200, Seattle, WA 98104.
ERDAS, ERDAS IMAGINE, Stereo Analyst, IMAGINE Essentials, IMAGINE Advantage, IMAGINE Professional,
IMAGINE VirtualGIS, Mapcomposer, Viewfinder, and Imagizer are registered trademarks of ERDAS, Inc.
Other companies and products mentioned herein are trademarks or registered trademarks of their respective owners.
Table of Contents
Table of Contents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . v
List of Figures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .xi
List of Tables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xiii
Foreword . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Introducing Image Analysis for ArcGIS . . . . . . . . . . . . . . . . . . . . . . . . . 3
Performing Tasks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
Updating Databases . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
Categorizing Land Cover and Characterizing Sites . . . . . . . . . . . . . . . . . . . . . . . . . . 5
Identifying and Summarizing Natural Hazard Damage . . . . . . . . . . . . . . . . . . . . . . . 6
Identifying and Monitoring Urban Growth and Changes . . . . . . . . . . . . . . . . . . . . . . 7
Extracting Features Automatically . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
Assessing Vegetation Stress . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Learning More . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Finding Answers to Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Getting Help on Your Computer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Contacting ERDAS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Contacting ESRI . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
ERDAS Education Solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
ESRI Education Solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Quick-Start Tutorial . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Exercise 1: Getting Started . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Start Image Analysis for ArcGIS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Add the Image Analysis for ArcGIS Extension . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Add Toolbars . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
Exercise 2: Using Histogram Stretch . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Add a Theme of Moscow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Apply a Histogram Equalization in the View . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
Apply a Histogram Equalization to modify a file . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
Apply an Invert Stretch to the Moscow Image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
Exercise 3: Identifying Similar Areas . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Add and Draw a Theme Depicting an Oil Spill . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Create a Shapefile . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
Draw the Polygon with the Seed Tool . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
Exercise 4: Finding Changed Areas . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
Find Changed Areas . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
Add and Draw the Images of Atlanta . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
Compute the Difference Due to Development . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
Clear the View . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Use Thematic Change . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Add the Images of an Area Damaged by Hurricane . . . . . . . . . . . . . . . . . . . . . . . . 28
Create Three Classes of Land Cover . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
Assign Class Names and Colors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
Categorize and Name the Areas . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
Recode Class Names and Colors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
Use Thematic Change to see Land Cover Changes . . . . . . . . . . . . . . . . . . . . . . . . 34
Add a Feature Theme that Shows the Property Boundary . . . . . . . . . . . . . . . . . . . 35
Make the Property Transparent . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
Summarize the Area . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
Exercise 5: Mosaicking Images . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
Add and Draw the Images . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
Zoom In to See Image Details . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
Use Mosaic to Join the Images . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
Create Custom Cutlines . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
Exercise 6: Orthorectifying Images . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
Add Raster and Feature Datasets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
Select the Coordinate System for the Image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
Orthorectify your Image using GeoCorrection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
Place Fiducials . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Place Links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
What's Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
Applying Data Tools . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
Seed Tool . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
Controlling the Seed Tool . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
Preparing to Use the Seed Tool . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
Changing the Seed Radius . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
Image Info. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
NoData Value . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
Using the Image Info Dialog Box . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
Options Dialog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
General Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
Extent Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
Cell Size Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
Preferences Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
Raster Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
Using the Options Dialog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
Geoprocessing Tools . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
Specifying Geoprocessing Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
Updating Existing Geoprocessing Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
Using Data Preparation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
Create New Image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
Creating a New Image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
Subset Image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
Subsetting an Image Spectrally . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
Subsetting an Image Spatially . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
Mosaic Images . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
Mosaicking Images . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
Reproject Image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
Reprojecting an Image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
Performing Spatial Enhancement . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
Applying Convolution Filtering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
Convolution Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
Convolution Formula . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
Zero Sum Kernels . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
High-Frequency Kernels . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
Low-Frequency Kernels . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
Applying Convolution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
Non-Directional Edge . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
Applying Non-Directional Edge . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
Focal Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
Applying Focal Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
Resolution Merge . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
Brovey Transform . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
Applying Resolution Merge . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
Using Radiometric Enhancement . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
Contrast Stretch . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
Linear and Nonlinear . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
Linear Contrast Stretch . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
Nonlinear Contrast Stretch . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
Piecewise Linear Contrast Stretch . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
Contrast Stretch on the Display . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
Varying the Contrast Stretch . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
Applying a LUT Stretch . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100
Histogram Equalization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
Effect on Contrast . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104
Performing Histogram Equalization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104
Histogram Matching . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
Performing Histogram Matching . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 106
Brightness Inversion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
Applying Brightness Inversion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
Applying Spectral Enhancement . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
RGB to IHS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
Converting RGB to IHS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 114
IHS to RGB . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
Converting IHS to RGB . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116
Vegetative Indices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117
Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117
Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117
Image Algebra . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 118
Applying Vegetative Indices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119
Color IR to Natural Color . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120
Changing Color IR to Natural Color . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
Performing GIS Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
Information Versus Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124
Neighborhood Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 125
Performing Neighborhood Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 126
Thematic Change . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
Performing Thematic Change . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 128
Summarize Areas . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 129
Applying Summarize Areas . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 130
Recode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 131
Recoding by Class Name . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 133
Recoding by Symbology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 135
Recoding with Previously Grouped Image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137
Using Utilities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 139
Image Difference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 140
Using Image Difference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142
Layer Stack. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 143
Using Layer Stack . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 144
Rescale Image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
Using Rescale Image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 146
Understanding Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147
The Classification Process. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148
Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148
Unsupervised Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148
Supervised Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148
Signatures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149
Decision Rule . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149
Parametric Decision Rule . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149
Nonparametric Decision Rule . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149
Classification Tips . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150
Classification Scheme . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150
Supervised versus Unsupervised Classification . . . . . . . . . . . . . . . . . . . . . . . . . . 151
Classifying Enhanced Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 151
Limiting Dimensions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 151
Unsupervised Classification. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 152
Clusters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 152
ISODATA Clustering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 152
Initial Cluster Means . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 153
Pixel Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 153
Percentage Unchanged . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 154
Performing Unsupervised Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 155
Supervised Classification. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 155
Performing Supervised Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 156
Classification Decision Rules. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 157
Parametric Rules . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 157
Nonparametric Rule . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 157
Minimum Distance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 157
Maximum Likelihood . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158
Mahalanobis Distance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 159
Parallelepiped . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 160
Using Conversion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 161
Converting Raster to Features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 162
Performing Raster to Features Conversion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 164
Performing Features to Raster Conversion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 166
Applying GeoCorrection Tools . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167
Disadvantages of Rectification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 169
Georeferencing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 169
Georeferencing Only . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 169
Ground Control Points . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 169
Entering GCP Coordinates . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 170
Tolerance of RMSE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 170
Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 170
Thematic Files . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171
Orthorectification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171
GeoCorrection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171
The GeoCorrection Properties Dialog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171
General Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171
Links Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 172
Elevation Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 174
SPOT. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 176
Panchromatic . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 176
XS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 177
Stereoscopic Pairs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 177
SPOT 4 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 178
The Spot Properties Dialog Box . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 178
Polynomial Transformation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 179
Transformation Matrix . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 180
Linear Transformations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 180
Nonlinear Transformations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 181
High-Order Polynomials . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 182
Effects of Order . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 182
Minimum Number of GCPs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 187
The Polynomial Properties Dialog Box . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 188
Rubber Sheeting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 189
Triangulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 189
Triangle-Based Rectification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 189
Linear Transformation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 189
Nonlinear Transformation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 190
Checkpoint Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 190
Camera . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 191
The Camera Properties Dialog Box . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 191
Orientation Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 191
Camera Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 192
Fiducials Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 193
IKONOS, QuickBird, and RPC Properties . . . . . . . . . . . . . . . . . . . . . . . . 195
IKONOS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 195
QuickBird . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
RPC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
IKONOS, QuickBird, and RPC Parameters Tab . . . . . . . . . . . . . . . . . . . . . . . . . . 197
IKONOS, QuickBird, and RPC Chipping Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 198
Scale and Offset . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 199
Arbitrary Affine . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200
Landsat . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200
Landsat 1-5 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200
MSS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 201
TM . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 201
Band Combinations for Displaying TM Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 203
Landsat 7 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 204
Landsat 7 Data Types . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 204
Landsat 7 Specifications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 204
The Landsat Properties Dialog Box . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 205
Parameters Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 205
Glossary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 207
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 235
List of Figures
Figure 1: Airphoto with Shapefile of Streets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
Figure 2: Classified Image for Radio Towers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
Figure 3: Before and After Hurricane Hugo, and Shapefile . . . . . . . . . . . . . . . . . . . 6
Figure 4: Urban Areas Represented in Red . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
Figure 5: Oil Spill and Polygon Grown in Spill . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
Figure 6: Crop Stress Shown through Vegetative Indices . . . . . . . . . . . . . . . . . . . . 9
Figure 7: Seed Tool Properties Dialog Box . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
Figure 8: General Tab. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
Figure 9: Extent Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
Figure 10: Cell Size Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
Figure 11: Options Preferences Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
Figure 12: Raster Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
Figure 13: Extent Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
Figure 14: Amazon TM Image before Spectral Subsetting . . . . . . . . . . . . . . . . . . 75
Figure 15: Amazon TM Image after Spectral Subsetting . . . . . . . . . . . . . . . . . . . . 75
Figure 16: Pentagon Image before Spatial Subsetting . . . . . . . . . . . . . . . . . . . . . 76
Figure 17: Pentagon Subset Image after Analysis Extent . . . . . . . . . . . . . . . . . . . 76
Figure 18: Convolution with High-Pass Filtering . . . . . . . . . . . . . . . . . . . . . . . . . . 88
Figure 19: Image of Seattle before Non-Directional Edge . . . . . . . . . . . . . . . . . . . 90
Figure 20: Image of Seattle after Non-Directional Edge . . . . . . . . . . . . . . . . . . . . 90
Figure 21: Image before Focal Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
Figure 22: Image after Focal Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
Figure 23: High-Resolution, Multi-spectral, and Resolution Merge Images . . . . . . 96
Figure 24: Contrast Stretch . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100
Figure 25: Histogram Matching . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 106
Figure 26: Image before Brightness Inversion . . . . . . . . . . . . . . . . . . . . . . . . . . 108
Figure 27: Image after Brightness Inversion. . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
Figure 28: Variance of Intensity and Hue in RGB to IHS . . . . . . . . . . . . . . . . . . . 112
Figure 29: Infrared Image of a Golf Course . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120
Figure 30: Natural Colors after Color IR to Natural Color . . . . . . . . . . . . . . . . . . 120
Figure 31: Input Image from 1973 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
Figure 32: Input Image from 1994 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
Figure 33: Thematic Change Result . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 128
Figure 34: Image Showing Changes between 1973 and 1994 . . . . . . . . . . . . . . . 129
Figure 35: Thematic Image before Recode by Class Name . . . . . . . . . . . . . . . . . 132
Figure 36: Thematic Image after Recode by Class Name . . . . . . . . . . . . . . . . . . 132
Figure 37: Soil Data Image before Recode . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138
Figure 38: Soil Data Image after Recode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138
Figure 39: Image Difference File . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 141
Figure 40: Highlight Change File . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 141
Figure 41: Stacked Image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 143
Figure 42: Raster Image before Conversion . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
Figure 43: Raster Image after Conversion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
Figure 44: Elevation Source File Settings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 174
Figure 45: Elevation Source Constant Settings . . . . . . . . . . . . . . . . . . . . . . . . . 174
Figure 46: SPOT Panchromatic versus SPOT XS . . . . . . . . . . . . . . . . . . . . . . . . 177
Figure 47: Orientation Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 191
Figure 48: Camera Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 192
Figure 49: Fiducials Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 193
Figure 50: IKONOS, QuickBird, and RPC Parameters Tab . . . . . . . . . . . . . . . . 197
Figure 51: Chipping Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 198
Figure 52: Chipping Tab using Scale and Offset . . . . . . . . . . . . . . . . . . . . . . . . 199
Figure 53: Chipping Tab using Arbitrary Affine . . . . . . . . . . . . . . . . . . . . . . . . . . 200
Figure 54: Landsat MSS versus Landsat TM . . . . . . . . . . . . . . . . . . . . . . . . . . . 203
Figure 55: Landsat Properties Parameters Tab . . . . . . . . . . . . . . . . . . . . . . . . 205
List of Tables
Table 1: Bilinear Interpolation Resampling. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
Table 2: Nearest Neighbor Resampling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
Table 3: Cubic Convolution Resampling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
Table 4: Data Type Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
Table 5: IR and R Bands of Common Sensors . . . . . . . . . . . . . . . . . . . . . . . . . . 118
Table 6: SPOT XS Bands and Wavelengths . . . . . . . . . . . . . . . . . . . . . . . . . . . . 177
Table 7: SPOT 4 Bands and Wavelengths . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 178
Table 8: Number of GCPs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 187
Table 9: IKONOS Bands and Wavelengths . . . . . . . . . . . . . . . . . . . . . . . . . . . . 195
Table 10: QuickBird Bands and Wavelengths . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
Table 11: TM Bands and Wavelengths. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 202
Table 12: Landsat 7 Characteristics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 204
Foreword
An image of the Earth's surface is a wealth of information. Images
capture a permanent record of buildings, roads, rivers, trees, schools,
mountains, and other features located on the Earth's surface. But
images go beyond simply recording features. Images serve the following
purposes:
• Record relationships and processes as they occur in the real world.
• Are snapshots of geography, but they are also snapshots of reality.
• Chronicle our Earth and everything associated with it; they record a
specific place at a specific point in time. They are snapshots of our
changing cities, rivers, and mountains.
• Are snapshots of life on Earth.
The data in a geographic information system (GIS) needs to reflect
reality, and snapshots of reality need to be incorporated and accurately
transformed into instantaneously ready, easy-to-use information. From
snapshots to digital reality, images are pivotal in creating and
maintaining the information infrastructure used by today's society.
Geographic information systems today are carefully created with
features, attributed behavior, analyzed relationships, and modeled
processes.
There are five essential questions that any GIS needs to answer:
where, what, when, why, and how. Uncovering why, when, and how is
done within the GIS; images let you extract the where and what.
Precisely where is that building? What is that parcel of land used for?
What type of tree is that? The new extensions developed by ERDAS
use imagery that lets you accurately address the questions where and
what, so you can then derive answers for the other three.
But our Earth is changing! Urban growth, suburban sprawl, industrial
usage and natural phenomena continually alter our geography. As our
geography changes, so does the information we need to understand it.
Because an image is a permanent record of features, behavior,
relationships, and processes captured at a specific moment in time,
using a series of images of the same area taken over time lets you more
accurately model and analyze the relationships and processes that are
important to our Earth.
The extensions by ERDAS are technological breakthroughs that let you
transform a snapshot of geography into information that digitally
represents reality in the context of a GIS. Image Analysis for
ArcGIS and Stereo Analyst for ArcGIS are tools built on top of a GIS
to maintain that GIS with up-to-date information. The extensions
provided by ERDAS reliably transform imagery directly into your GIS for
analyzing, mapping, visualizing, and understanding our world.
On behalf of the Image Analysis for ArcGIS and Stereo Analyst for
ArcGIS product teams, we wish you all the best in working with these
products and hope you are successful in your GIS and mapping
endeavors.
Introducing Image Analysis for ArcGIS
Image Analysis for ArcGIS is primarily designed for natural resource
and infrastructure management. This extension is useful in the fields of
forestry, agriculture, environmental assessment, engineering, and
infrastructure projects such as facility siting and corridor monitoring,
and general geographic database update and maintenance.
Today, imagery of the Earth's surface is an integral part of desktop
mapping and GIS, and it's more important than ever to provide realistic
backdrops to geographic databases and to quickly update details
involving street or land use data.
IN THIS CHAPTER
Performing Tasks
Learning More
Performing Tasks
Image Analysis for ArcGIS lets you perform many tasks, including:
Updating Databases
Categorizing Land Cover and Characterizing Sites
Identifying and Summarizing Natural Hazard Damage
Identifying and Monitoring Urban Growth and Changes
Extracting Features Automatically
Assessing Vegetation Stress
Updating Databases
There are many types of imagery to choose from in a wide range of
scales, spatial and spectral resolutions, and map accuracies. Aerial
photography is often the choice for map updating because of its high
precision. With Image Analysis for ArcGIS you can use imagery to
identify changes and make revisions and corrections to your
geographic database.
Figure 1: Airphoto with Shapefile of Streets
Categorizing Land Cover and Characterizing Sites
Transmission towers for radio-based telecommunications must all be
visible from each other, be within a certain range of elevations, and
avoid fragile areas like wetlands. With Image Analysis for ArcGIS, you
can categorize images into land cover classes to help identify suitable
locations. You can use imagery and analysis techniques to identify
wetlands and other environmentally sensitive areas.
The classification features let you divide an image into many different
classes, and then highlight them. In this case the areas not suitable for
tower placement are highlighted, and the towers are sited
appropriately.
Figure 2: Classified Image for Radio Towers
Identifying and Summarizing Natural Hazard Damage
When viewing a forest hit by a hurricane, you can use the mapping tools
of Image Analysis for ArcGIS to show where the damage occurred. With
other tools, you can show the condition of the vegetation, how much
stress it suffers, and how much damage it sustained in the hurricane.
Below, Landsat images taken before Hurricane Hugo in 1987 and after
Hurricane Hugo in 1989, in conjunction with a shapefile that identifies
the forest boundary, are used for comparison. Within the shapefile, you
can see detailed tree stand inventory and management information.
Figure 3: Before and After Hurricane Hugo, and Shapefile
Identifying and Monitoring Urban Growth and Changes
Cities grow over time, and images give a good sense of how they grow
and how to preserve remaining land by managing that growth. You can
use Image Analysis for ArcGIS to reveal patterns of urban growth over
time.
Here, Landsat data spanning 21 years was analyzed for urban growth.
The yellow urban areas from 1994 represent how much the city has
grown beyond the red urban areas from 1973. The final view shows the
differences in extent of urban land use and land cover between 1973
and 1994. Those differences are represented as classes.
Figure 4: Urban Areas Represented in Red
Extracting Features Automatically
Suppose you are responsible for mapping the extent of an oil spill as
part of a rapid response effort. You can use synthetic aperture radar
(SAR) data and Image Analysis for ArcGIS tools to identify and map the
extent of such environmental hazards.
The following images show an oil spill off the northern coast of Spain.
The first image shows the spill, and the second image shows a polygon
grown in the oil spill using the Seed tool. The second image gives you
an example of how you can isolate the exact extent of a particular
pattern using Image Analysis for ArcGIS.
Figure 5: Oil Spill and Polygon Grown in Spill
Assessing Vegetation Stress
Crops experience different stresses throughout the growing season.
You can use multispectral imagery and analysis tools to identify and
monitor a crop's health.
In these images, the vegetative indices function is used to evaluate crop
stress. The stressed areas are then automatically digitized and saved
as a shapefile. You can use this type of information to help identify
sources if there is variability in growth patterns, and then quickly update
crop management plans.
Figure 6: Crop Stress Shown through Vegetative Indices
Learning More
If you are just learning about GIS, you may want to read the following
books about ArcCatalog and ArcMap: Using ArcCatalog and Using
ArcMap. Knowing about these applications can make your use of Image
Analysis for ArcGIS much easier.
See the Quick-Start Tutorial on page 11 if you are ready to learn
how Image Analysis for ArcGIS works. In this
tutorial, you learn how to adjust the appearance of an image, how to
identify similar areas of an image, how to align an image to a feature
theme, how to find areas of change, and how to mosaic images. The tutorial is written
so that you can do the exercises using your computer and the example
data supplied with Image Analysis for ArcGIS. If you'd rather, you can
just read the tutorial to learn about the functionality of Image Analysis
for ArcGIS.
Finding Answers to Questions
This book describes the typical workflow involved in creating and
updating GIS data for mapping projects. The chapters are set up so that
you first learn the theory behind certain applications, then you are
introduced to the typical workflow you would apply to get the results you
want. A glossary is provided to help you understand any terms you
might not have seen before.
Getting Help on Your Computer
You can get a lot of information about the features of Image Analysis for
ArcGIS by accessing the online help. To browse the online help
contents, select Image Analysis desktop Help from the Image Analysis
dropdown list. From this point you can use the table of contents, index,
or search feature to locate the information you need. If you need online
help for ArcGIS, click Help on the ArcMap toolbar and select ArcGIS
Desktop Help.
Contacting ERDAS
You can contact ERDAS for technical support, if needed, at 770-776-
3650. Customers outside the United States should contact their local
distributor. Visit ERDAS on the Web at www.erdas.com.
Contacting ESRI
If you need to contact ESRI for technical support regarding ArcGIS,
refer to Getting technical support in the Help system's Getting More
Help section. The telephone number for technical support is 909-793-
3774. You can also visit ESRI on the Web at www.esri.com.
ERDAS Education Solutions
ERDAS offers instructor-based training for Image Analysis for ArcGIS.
For more information, go to the training Web site at www.erdas.com.
You can follow the training link to training centers, course schedules,
and course registration.
ESRI Education Solutions
ESRI provides educational opportunities related to GIS, GIS
applications, and technology. You can choose among instructor-led
courses, Web-based courses, and self-study workbooks to find
educational solutions that fit your learning style and budget. For more
information, visit the Web site www.esri.com/education.
Quick-Start Tutorial
Now that you know a little about the Image Analysis for ArcGIS
extension and its potential applications, the following exercises give you
hands-on experience in using many of the extension's tools.
In Image Analysis for ArcGIS, you can quickly identify areas with similar
characteristics. This is useful in cases such as environmental disasters,
burn areas, or oil spills. Once an area is defined, it can also be quickly
saved into a shapefile, eliminating the need for manual digitizing.
This tutorial shows you how to use some of the Image Analysis for
ArcGIS tools and gives you a good introduction to using Image Analysis
for ArcGIS for your own GIS needs.
IN THIS CHAPTER
Exercise 1: Getting Started
Exercise 2: Using Histogram Stretch
Exercise 3: Identifying Similar Areas
Exercise 4: Finding Changed Areas
Exercise 5: Mosaicking Images
Exercise 6: Orthorectifying Images
What's Next?
Exercise 1: Getting Started
In this exercise, you learn how to start Image Analysis for ArcGIS and
activate the toolbar associated with it. You gain access to all the
important Image Analysis for ArcGIS features through its toolbar and
menu list. After completing this exercise, you'll be able to locate any
Image Analysis for ArcGIS tool you need for preparation, enhancement,
analysis, or geocorrection.
This exercise assumes you have successfully installed Image Analysis
for ArcGIS on your computer. You must use a single or dual monitor
workstation that is configured for use with ArcMap and Image Analysis
for ArcGIS.
If you have not installed Image Analysis for ArcGIS, refer to the
installation guide packaged on the CD and install it now.
Start Image Analysis for ArcGIS
To start Image Analysis for ArcGIS, follow this step:
1. Click the Start button on your desktop, point to All Programs, point to
ArcGIS, and then click ArcMap to start the application.
Add the Image Analysis for ArcGIS Extension
To add the Image Analysis for ArcGIS extension, follow these steps:
1. When the ArcMap dialog opens, keep the option to create a new empty
map and then click OK to open ArcMap.
2. Select Extensions from the Tools menu to open the Extensions dialog.
3. Check the Image Analysis check box to add the extension to ArcMap.
Once the Image Analysis check box is enabled, the extension is
activated.
4. Click Close.
Add Toolbars
The Image Analysis toolbar is your gateway to many of the tools and
features you can use with the extension. Use it to choose different
analysis types, select a geocorrection type, or set links in an image,
among other things.
To add the Image Analysis toolbar, follow this step:
1. Click the Customize menu, point to Toolbars, and then select Image
Analysis to add the Image Analysis toolbar.
Exercise 2: Using Histogram Stretch
Image data, displayed without any contrast manipulation, might appear
either too light or too dark, making it difficult to begin your analysis.
Image Analysis for ArcGIS lets you display the same data in many
different ways. For example, changing the distribution of pixels lets you
change the brightness and contrast of the image. This is called
histogram stretch, which lets you manipulate the display of data to
make your image easier to visually interpret and evaluate.
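
Under the hood, a stretch is simply a remapping of stored pixel values to display values. Image Analysis for ArcGIS applies stretches through its dialogs, so the following Python sketch is purely a conceptual illustration, not the product's implementation; the function name and the 2nd/98th percentile cutoffs are assumptions of this example.

```python
import numpy as np

def linear_stretch(band, low_pct=2.0, high_pct=98.0):
    """Linearly remap values so the chosen percentile range spans 0-255."""
    low, high = np.percentile(band, [low_pct, high_pct])
    scaled = (band.astype(np.float64) - low) / (high - low)
    return (np.clip(scaled, 0.0, 1.0) * 255).astype(np.uint8)

# A synthetic low-contrast 8-bit band: its values crowd a narrow range.
band = np.random.randint(90, 140, size=(100, 100), dtype=np.uint8)
stretched = linear_stretch(band)
print(band.min(), band.max())            # narrow input range, e.g. 90 139
print(stretched.min(), stretched.max())  # spread out toward 0 and 255
```

After the remap, small differences in the original data occupy a much wider range of display values, which is why a stretched image is easier to interpret visually.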
Add a Theme of Moscow
To add an Image Analysis for ArcGIS theme of Moscow, follow these
steps:
1. Open a new view. If you are starting this exercise immediately after
Exercise 1, you should have a new, empty view ready.
2. Click the Add Data button to open the Add Data dialog.
3. Select moscow_spot.tif. The path to the example data directory is:
Program Files\ArcTutor\ImageAnalysis
4. Click Add to display the image in the view.
The moscow_spot.tif image displays in the view.
Apply a Histogram Equalization in the View
A standard deviation stretch is the default histogram stretch applied to images
by ArcGIS. You can apply histogram equalization to redistribute the
data so that each display value has roughly the same number of data
points. You can find more information about histogram equalization in
Using Radiometric Enhancement on page 97.
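
The equalization itself reduces to a lookup table built from the cumulative histogram, so that each output level receives roughly the same number of pixels. The tool computes this for you; the sketch below is only a conceptual illustration for 8-bit data, and its names and 256-bin default are assumptions of this example (256 bins matches the Number of Bins default you see later for 8-bit imagery).

```python
import numpy as np

def histogram_equalize(band, bins=256):
    """Build a lookup table from the cumulative histogram and apply it."""
    hist, _ = np.histogram(band, bins=bins, range=(0, bins))
    cdf = hist.cumsum().astype(np.float64)
    cdf /= cdf[-1]                               # normalize to 0..1
    lut = np.round(cdf * (bins - 1)).astype(np.uint8)
    return lut[band]                             # apply the LUT by indexing

band = np.random.randint(0, 256, size=(100, 100), dtype=np.uint8)
equalized = histogram_equalize(band)
```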
To apply a histogram equalization in the view, follow these steps:
1. Select moscow_spot.tif in the ArcMap table of contents.
2. Right-click and select Properties to open the Layer Properties dialog.
3. Click the Symbology tab.
4. Select RGB Composite in the Show box.
5. Check the order in the Band column and click the dropdown arrows to
change any of the bands.
Note: You can also change the order of the bands by clicking the color
bar next to each band in the ArcMap table of contents. If you want
bands to display in a certain order for each image that you draw in the
view, click Tools, point to Options, select Raster, and then
change the default RGB band combinations.
6. Select Histogram Equalize from the Type dropdown list as the stretch
type.
7. Click Apply and OK to close the Layer Properties dialog.
Apply a Histogram Equalization to modify a file
You can apply the changes you made to a copy of the file permanently
using Image Analysis for ArcGIS.
To apply a histogram equalization to modify a file, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Radiometric
Enhancement, and then select Histogram Equalization to open the
Histogram Equalization dialog.
2. Select moscow_spot.tif from the Input Image dropdown list.
3. Leave the number in the Number of Bins field at 256 for this exercise. It
defaults to 256 because the image is 8 bits. In the future, you can
change this number to suit your needs.
4. Click the browse button for the Output Image field and navigate to the
directory where you want your output images stored.
5. Type a name for your image, and then click Save.
The path displays in the Output Image field.
Note: You can access the Options dialog from the Image Analysis
dropdown list, click the General tab, and then type the working directory
you want to use. This step saves you time by automatically bringing up
your working directory whenever you click the browse button to store an
output image.
6. Click OK to close the Histogram Equalization dialog. The equalized
image displays in the ArcMap table of contents and in your view.
This is the histogram equalized image of Moscow.
Apply an Invert Stretch to the Moscow Image
In this example, you apply an invert stretch to the image to redisplay it
with its brightness values reversed. Areas that originally appeared
bright are now dark, and dark areas are bright.
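
For 8-bit data, the inversion is simple arithmetic: every display value v becomes 255 - v. A two-line numpy illustration, assuming unsigned 8-bit input (the Brightness Inversion chapter later describes the related permanent operations):

```python
import numpy as np

band = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)
inverted = 255 - band   # bright pixels become dark, dark pixels become bright
```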
To apply an invert stretch to the Moscow image, follow these steps:
1. Right-click the file that you equalized in the ArcMap table of contents
and select Properties to go to the Symbology tab on the Layer
Properties dialog.
2. Click the Histograms button in the Stretch box to view the histograms.
3. Click OK to return to the Symbology tab.
4. Check the Invert check box.
5. Click Apply and OK to display the inverted image.
This is an inverted image of moscow_spot.tif.
You can apply different types of stretches to your image to emphasize
different parts of the data. Depending on the original distribution of the
data, one stretch might make the image display better than another.
Image Analysis for ArcGIS lets you apply a LUT stretch, histogram
equalization, or brightness inversion permanently to a copy of your files.
It also provides a function that matches the histogram of one image to
that of a reference image.
The Layer Properties Symbology tab can be a learning tool to see the
effect of stretches on the input and output histograms. You learn more
about these stretches in Using Radiometric Enhancement on page
97.
Exercise 3: Identifying Similar Areas
With Image Analysis for ArcGIS, you can quickly identify areas with
similar characteristics in images. This is useful for identifying
environmental disasters or burn areas.
Once an area is defined, you can save it into a shapefile. This lets you
avoid the need for manual digitizing. To define the area, use the Seed
tool to point to an area of interest such as a dark area on an image
depicting an oil spill. The Seed tool returns a graphic polygon outlining
areas with similar characteristics.
Add and Draw a Theme Depicting an Oil Spill
To add and draw an Image Analysis for ArcGIS theme depicting an oil
spill, follow these steps:
1. If you are starting immediately after the previous exercise, clear your
view by clicking the New Map File button on the ArcMap toolbar. You
do not need to save the changes. If you are beginning here, start
ArcMap and load the Image Analysis for ArcGIS extension.
2. Click the Add Data button to open the Add Data dialog.
3. Select radar_oilspill.img, and then click Add to display the image in
your view.
This is a radar image showing an oil spill off the northern coast of Spain.
Create a Shapefile

In this exercise, you use the Seed tool. The Seed tool grows a polygon
graphic in the image that encompasses all similar and contiguous
areas. However, you must first create a shapefile in ArcCatalog and
start editing to activate the Seed tool. After going through these steps,
click inside the area you want to highlight, in this case an oil spill, and
create a polygon. The polygon lets you see how much area the oil spill
covers.
To create a shapefile using the Seed tool, follow these steps:
1. Click the Zoom In button on the Tools toolbar, and then drag a rectangle
around the black area in the image to see the spill more clearly.
2. Click the ArcCatalog button. You can store the shapefile you create in
the example data directory or navigate to a different directory.
3. Select the directory in the ArcCatalog table of contents, click File, point
to New, and then select Shapefile to open the Create New Shapefile
dialog.
4. Type oilspill as the name for the new shapefile in the Name field.
5. Select Polygon from the Feature Type dropdown list.
6. Check the Show Details check box.
7. Click the Edit button to open the Spatial Reference Properties dialog.
8. Click the Import button to open the Browse for Dataset dialog, which
contains the example data directory.
9. Select radar_oilspill.img, and then click Add to return to the Spatial
Reference Properties dialog.
10. Click Apply and OK to return to the Create New Shapefile dialog.
11. Click OK to return to the ArcCatalog window.
12. Select the oilspill shapefile, and then drag it into the table of contents in
the ArcMap window.
13. Close ArcCatalog.
Draw the Polygon with
the Seed Tool
To draw the polygon with the Seed tool, follow these steps:
1. Select Seed Tool Properties from the Image Analysis dropdown list to
open the Seed Tool Properties dialog.
2. Type a seed radius of 10 in the Seed Radius field.
Note: The seed radius is the number of pixels surrounding the
target pixel. The range of values of those surrounding pixels is
considered when the Seed tool grows the polygon.
3. Uncheck the Include Island Polygons check box.
4. Click OK to close the Seed Tool Properties dialog.
5. Click the Editor toolbar button on the ArcMap toolbar to display the
Editor toolbar.
6. Select Start Editing from the Editor dropdown list.
7. Click the Seed Tool button, and then click a point in the center of the oil
spill.
Note: The Seed tool takes a few moments to produce the polygon.
This is a polygon of an oil spill grown by the Seed tool.
Note: If you don't automatically see the formed polygon in the image,
click the refresh button at the bottom of the ArcMap window.
You can see how the tool identifies the extent of the spill. An emergency
team could be informed of the extent of this disaster to effectively plan
a cleanup of the oil.
Exercise 4:
Finding Changed
Areas
The Image Analysis for ArcGIS extension lets you see changes over
time. You can perform this type of analysis on either continuous data
using image difference or thematic data using thematic change.
Find Changed Areas

In the following example, you work with two continuous data images of
the north metropolitan Atlanta, Georgia area: one from 1987 and one
from 1992. Continuous data images are those obtained from remote
sensors like Landsat and SPOT. This kind of data measures reflectance
characteristics of the Earth's surface, analogous to exposed film
capturing an image. You can use image difference to identify areas that
have been cleared of vegetation to construct a large regional shopping
mall.
Add and Draw the Images
of Atlanta
To add and draw the images of Atlanta, follow these steps:
1. If you are starting immediately after the previous exercise, clear your
view by clicking the New Map File button on the ArcMap toolbar. You
do not need to save the changes. If you are beginning here, start
ArcMap and load the Image Analysis for ArcGIS extension.
2. Click the Add Data button to open the Add Data dialog.
3. Hold down the Ctrl key, and then select both atl_spotp_87.img and
atl_spotp_92.img.
4. Click Add to display the images in your view.
With images active in the view, you can calculate the difference
between them.
Compute the Difference
Due to Development
In this exercise, you learn how to use image difference, which is useful
for analyzing images of the same area to identify any changes in land
cover features over time. Image difference performs a subtraction of
one theme from another. This change is highlighted in green and red
masks that depict increasing and decreasing values.
To compute the difference due to development, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Utilities, and then
select Image Difference to open the Image Difference dialog.
2. Click the browse button for the Before Theme field and navigate to the
file named atl_spotp_87.img.
3. Select Layer_1 from the Before Layer dropdown list.
4. Click the browse button for the After Image field and navigate to the file
named atl_spotp_92.img.
5. Select Layer_1 from the After Layer dropdown list.
6. Click the As Percent button in the Highlight Changes box.
7. Type 15 in the Increases More Than field.
8. Type 15 in the Decreases More Than field.
9. Navigate to the directory where you want your difference image file
stored, type the name of the file, and then click Save.
10. Navigate to the directory where you want your highlight image file
stored, type the name of the file, and then click Save.
11. Click OK to close the Image Difference dialog.
The highlight image and difference image files display in the ArcMap
table of contents and in the view.
The image difference image shows the results of the subtraction of the
before theme from the after theme.
12. Uncheck the Difference Image check box in the ArcMap table of
contents to disable it and display the highlight image.
The image difference function calculates the difference in pixel values
using the 15 percent parameter you set. It finds areas with at least a
15 percent increase in pixel values (designating clearing) and highlights
them in green. Image difference also finds areas that have decreased by
at least 15 percent (designating an area that has increased vegetation,
or an area that was once dry but is now wet) and highlights them in red.
These changes display in the highlight change file.
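Conceptually, the computation pairs a per-pixel subtraction with the
percent thresholds you typed. A minimal numpy sketch under that
assumption (illustration only, not the tool's implementation):

    import numpy as np

    def image_difference(before, after, pct=15.0):
        before = before.astype(np.float64)
        after = after.astype(np.float64)
        diff = after - before
        # Percent thresholds are taken relative to the before-image values.
        threshold = np.abs(before) * (pct / 100.0)
        increases = diff > threshold      # candidates for the green mask
        decreases = diff < -threshold     # candidates for the red mask
        return diff, increases, decreases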
Clear the View

You can now clear the view and either go to the next portion of this
exercise, Use Thematic Change, or end the session by closing ArcMap.
If you want to shut down ArcMap with Image Analysis for ArcGIS, click
the File menu and select Exit. Click No when asked to save changes.
Use Thematic Change

Image Analysis for ArcGIS provides the thematic change feature to
make comparisons between thematic data images. Thematic change
creates a theme that shows all possible combinations of change to
display an area's land cover class change over time.

Thematic change is similar to image difference in that it computes
changes between the same area at different points in time. However,
thematic change can only be used with thematic data (data that is
classified into distinct categories). An example of thematic data is a
vegetation class map.
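Conceptually, thematic change enumerates every (before, after) class
pair. A minimal numpy sketch of one possible encoding; the class codes
and output format here are assumptions for illustration, not the tool's
actual output:

    import numpy as np

    def thematic_change(before, after, n_classes):
        # Each (before, after) class pair maps to one unique output code.
        return before.astype(np.int32) * n_classes + after.astype(np.int32)

    before = np.array([[1, 1], [0, 2]])   # 0=Water, 1=Forest, 2=Bare Soil
    after = np.array([[2, 1], [0, 2]])
    print(thematic_change(before, after, 3))   # code 5 = was Forest, now Bare Soil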
The next example uses two images of an area near Hagan Landing,
South Carolina. The images were taken in 1987 and 1989, before and
after Hurricane Hugo. Suppose you are the forest manager for a paper
company that owns a parcel of land in the hurricane's path. With Image
Analysis for ArcGIS, you can see exactly how much of your forested
land was destroyed by the storm.
Add the Images of an
Area Damaged by
Hurricane
To add the images of an area damaged by Hurricane Hugo, follow these
steps:
1. If you are starting immediately after the previous exercise, clear your
view by clicking the New Map File button on your ArcMap toolbar. You
do not need to save the changes. If you are beginning here, start
ArcMap and load the Image Analysis for ArcGIS extension.
2. Click the Add Data button to open the Add Data dialog.
3. Hold down the Ctrl key, and select both tm_oct87.img and
tm_oct89.img.
4. Click Add to display the images in your view.
This image shows an area damaged by Hurricane Hugo.
Create Three Classes of
Land Cover
Before you calculate thematic change, you must first categorize the
before and after themes. You can access the categorize function
through unsupervised classification, which is an option available on the
Image Analysis dropdown list. You use the thematic themes created
from those classifications to complete the thematic change calculation.
To create three classes of land cover, follow these steps:
1. Select tm_oct87.img from the Layers dropdown list on the Image
Analysis toolbar.
2. Click the Image Analysis dropdown arrow, point to Classification, and
then select Unsupervised/Categorize to open the Unsupervised
Classification dialog.
3. Click the browse button for the Input Image field and navigate to the
directory with the tm_oct87.img file.
4. Type 3 in the Desired Number of Classes field.
5. Navigate to the directory where you want to store the output image, type
the file name (for this example, use unsupervised_class_87), and
then click Save.
6. Click OK to close the Unsupervised Classification dialog.
Note: Using unsupervised classification to categorize continuous
images into thematic classes is useful when you are unfamiliar with the
data that makes up your image. When you designate the number of
classes you want the data divided into, Image Analysis for ArcGIS
performs a calculation assigning pixels to classes depending on their
values. By using unsupervised classification, you are better able to
quantify areas of different land cover in your image. You can then
assign the classes names like water, forest, and bare soil. (A sketch
of the underlying clustering idea follows this procedure.)
7. Uncheck the tm_oct87.img check box in the ArcMap table of contents
so the original theme is not drawn in the view. This step also makes the
remaining themes draw faster.
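Unsupervised classification is essentially a clustering problem: pixels
are grouped by value similarity. A minimal single-band k-means sketch,
for illustration only; the tool's actual clustering algorithm may differ:

    import numpy as np

    def kmeans_classify(band, n_classes, n_iter=20):
        values = band.astype(np.float64).ravel()
        # Start the class means spread evenly across the data range.
        means = np.linspace(values.min(), values.max(), n_classes)
        for _ in range(n_iter):
            # Assign every pixel to its nearest class mean, then update means.
            labels = np.abs(values[:, None] - means[None, :]).argmin(axis=1)
            for k in range(n_classes):
                if np.any(labels == k):
                    means[k] = values[labels == k].mean()
        return labels.reshape(band.shape)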
Assign Class Names and
Colors
To give the classes names and assign colors to represent them, follow
these steps:
1. Double-click unsupervised_class_87.img in the ArcMap table of
contents to open the Layer Properties dialog.
2. Click the Symbology tab.
3. Verify that Class_names is selected in the Value Field field.
4. Select Class 001 in the Label column, and then type Water.
5. Double-click the color bar in the Symbol column for Class 001, and
select blue from the color palette.
6. Select Class 002 in the Label column, and then type Forest.
7. Double-click the color bar in the Symbol column for Class 002, and
select green from the color palette.
8. Select Class 003 in the Label column, and then type Bare Soil.
9. Double-click the color bar in the Symbol column for Class 003, and
select a tan or light brown color from the color palette.
10. Click Apply and OK to close the Layer Properties dialog.
Categorize and Name the
Areas
To categorize and name the areas in the post-hurricane image, follow
these steps:
1. Follow the steps provided for the theme tm_oct87.img in Create Three
Classes of Land Cover and Assign Class Names and Colors to categorize
the classes of the tm_oct89.img theme.
2. Uncheck the tm_oct89.img check box in the ArcMap table of contents
so that it does not draw in the view.
Recode Class Names and
Colors
After modifying the class names and colors using the Unique Values
page on the Symbology tab on the Layer Properties dialog, you can
permanently save these changes. Using recode with the From View
option, the class names and colors are saved to a thematic image file.
To recode class names and colors permanently to a file, follow these
steps:
1. Click the Image Analysis dropdown arrow, point to GIS Analysis, and
then select Recode to open the Recode dialog.
2. Click the browse button for the Input Image field and select one of the
classified images.
3. Select From View from the Map Pixel Value through Field dropdown
list.
4. Type the name of the output image in the Output Image field, or click
the browse button, navigate to your working directory, name the output
image, and then click Save.
5. Click OK to close the Recode dialog.
Now use the same steps to perform a recode on the other classified
image of the Hugo area so that both images have your class names and
colors permanently saved.
Use Thematic Change to
see Land Cover Changes
To use thematic change to see how land cover changed because of
Hugo, follow these steps:
1. Make sure the check boxes for both images you recoded are checked
in the ArcMap table of contents so they are active in the view.
2. Click the Image Analysis dropdown arrow, point to GIS Analysis, and
then select Thematic Change to open the Thematic Change dialog.
3. Select the 87 recoded classification image from the Before Theme
dropdown list.
4. Select the 89 recoded classification image from the After Theme
dropdown list.
5. Navigate to the directory where you want to store the Output Image,
type the file name, and then click Save.
6. Click OK to close the Thematic Change dialog.
7. Check the Thematic Change check box in the ArcMap table of
contents to draw it in the view.
8. Double-click the Thematic Change title to open the Layer Properties
dialog.
9. Double-click the color bar in the ArcMap table of contents for Was:
Forest, is now: Bare Soil.
10. Click the color red in the color palette, and then click OK. If you don't
want to choose red, you can use any color you like.
You can see the amount of destruction in red. The red shows what was
forest and is now bare soil.
Add a Feature Theme that
Shows the Property
Boundary
Using thematic change, the overall damage caused by the hurricane is
clear. Next, you see how much damage actually occurred on the paper
company's land.
To add a feature theme that shows the property boundary, follow these
steps:
1. Click the Add Data button to open the Add Data dialog.
2. Select property.shp, and then click Add.
This figure displays a thematic change image with the property
shapefile.
Make the Property
Transparent
To make the property transparent, follow these steps:
1. Double-click the property theme in the ArcMap table of contents to open
the Layer Properties dialog.
2. Click the Symbology tab, and then click the color symbol to open the
Symbol Selector dialog.
3. Click the Hollow symbol in the box on the left side of the Symbol
Selector dialog.
4. Type 3 in the Outline Width field.
5. Select a color that easily stands out to show your property line from the
Outline Color dropdown list.
6. Click OK to close the Symbol Selector dialog.
7. Click Apply and OK when you return to the Symbology tab to close the
Layer Properties dialog.
The yellow outline clearly shows the devastation within the paper
company's property boundaries.
Summarize the Area

Next, you use the summarize areas function to give area calculations of
loss inside the polygon you created.
To summarize the area, follow these steps:
1. Click the Image Analysis dropdown arrow, point to GIS Analysis, and
then select Summarize Areas to open the Summarize Areas dialog.
2. Select a file from the Zone Theme dropdown list, or navigate to the
directory where it is stored.
3. Select an attribute from the Zone Attribute dropdown list.
4. Select a file from the Class Theme dropdown list, or navigate to the
directory where it is stored.
5. Click the browse button for the Summarize Results Table field to specify
a name for the new summarize areas table that is created.
6. Click OK to summarize areas.
7. View the different results of the summarized areas in the Summarize
Areas Results dialog that displays.
8. Click Close to close the Summarize Areas Results dialog, or click
Export to Table to export the information to a text file.
When the process completes, the resulting table is added to ArcMap.
Click the Source tab in the ArcMap table of contents to see the new
table.
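Summarize areas behaves like a zonal cross-tabulation: for each zone,
count the pixels of each class and convert counts to area. A minimal
numpy sketch under that assumption (not the tool's implementation):

    import numpy as np

    def summarize_areas(zones, classes, cell_area):
        # For each zone, tally class pixel counts and convert to area.
        results = {}
        for zone_id in np.unique(zones):
            in_zone = classes[zones == zone_id]
            class_ids, counts = np.unique(in_zone, return_counts=True)
            results[zone_id] = dict(zip(class_ids.tolist(),
                                        (counts * cell_area).tolist()))
        return results

    zones = np.array([[1, 1], [1, 2]])
    classes = np.array([[0, 1], [1, 1]])
    print(summarize_areas(zones, classes, 900.0))   # 30 m pixels -> 900 sq m each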
Exercise 5:
Mosaicking
Images
Image Analysis for ArcGIS lets you mosaic multiple images. When you
mosaic images, you join them together to form one image that covers
the entire area. In the following exercise, you mosaic two air photos with
the same resolution.
Add and Draw the Images

To add and draw the images you want to mosaic, follow these steps:
1. Clear your view by clicking the New Map File button on the ArcMap
toolbar if you are starting immediately after the previous exercise. You
do not need to save the changes. If you are beginning here, start
ArcMap and load the Image Analysis for ArcGIS extension with a new
map.
2. Click the Add Data button to open the Add Data dialog.
3. Hold down the Ctrl key and select both Airphoto1.img and
Airphoto2.img.
4. Click Add to display the images in your view.
5. Click Airphoto1.img and drag it so that it is at the top of the ArcMap
table of contents.
The Mosaic tool joins the two air photos as they display in the view.
Whichever image is on top is also on top in the mosaicked image.
Zoom In to See Image
Details
To zoom in to see image details, follow these steps:
1. Right-click Airphoto1.img in the ArcMap table of contents and select
Zoom to Raster Resolution.
The two images display at a 1:1 resolution. You can now use the Pan
tool to see how they overlap.
2. Click the Pan button on the Tools toolbar, and then maneuver the images
in the view.
This illustration shows where the two images overlap.
3. Click the Full Extent button so that both images display in their entirety
in the view.
Use Mosaic to Join the
Images
To use Mosaic to join the two images, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Data Preparation,
and then select Mosaic Images to open the Mosaic Images dialog.
2. Select Use Order Displayed from the Handle Image Overlaps By
dropdown list.
3. Check the Automatically Crop Images By check box if you want to
automatically crop your images, and then type the percentage by which
to crop the images in the Percent field.
4. Click the Brightness/Contrast button as the Color Balance By setting.
5. Navigate to the directory where you want to save your files, type the file
name, and then click Save.
6. Click OK to return to the Mosaic Images dialog, and then click OK to
close.
The Mosaic function joins the two images as they display in the view.
Airphoto1 is mosaicked over Airphoto2.
Create Custom Cutlines

It is possible to create custom cutlines for your images. To do so, you
must subset the images prior to mosaicking.
Note: This exercise is optional.
To create custom cutlines, follow these steps:
1. Add the image you want to subset to your view.
2. Add an existing shapefile, or create a new polygon shapefile and
digitize the desired extent of your output file.
3. Click the Image Analysis dropdown arrow, point to Data Preparation,
and then select Subset Image to open the Subset Image dialog.
4. Click the browse button for the Input Image field and navigate to where
your polygon shapefile is stored.
5. Type the name of the output image in the Output Image field, or
navigate to the directory where you want it stored.
6. Click OK to subset the image.
7. Repeat for all of the images you want to create cutlines for.
8. Follow the steps in Use Mosaic to Join the Images to mosaic the
images.
Exercise 6:
Orthorectifying
Images
The Image Analysis for ArcGIS extension has a feature called
GeoCorrection properties. The function of this feature is to rectify
images. One of the tools that makes up GeoCorrection properties is the
camera model.
In this exercise you orthorectify images using the camera model in
GeoCorrection properties.
Add Raster and Feature
Datasets
To add raster and feature datasets, follow these steps:
1. Clear your view by clicking the New Map File button on the ArcMap
toolbar if you are starting immediately after the previous exercise. You
do not need to save the changes. If you are beginning here, start
ArcMap and load the Image Analysis for ArcGIS extension with a new
map.
2. Click the Add Data button to open the Add Data dialog.
3. Hold down the Ctrl key and select both ps_napp.img and
ps_streets.shp.
4. Click Add to display the images in your view.
5. Right-click ps_napp.img in the ArcMap table of contents and select
Zoom to Layer.
The image is drawn in the view. You can see the fiducial markings
around the edges and at the top.
Select the Coordinate
System for the Image
This procedure defines the coordinate system for the data frame in
Image Analysis for ArcGIS.
To select the coordinate system for the image, follow these steps:
1. Right-click the image in your view and select Data Frame Properties
to open the Data Frame Properties dialog.
2. Click the Coordinate System tab.
3. Select Predefined in the Select a Coordinate System box.
4. Click Projected Coordinate Systems, and then click Utm.
5. Click NAD 1927, and then click NAD 1927 UTM Zone 11N.
6. Click Apply and OK to close the Data Frame Properties dialog.
Orthorectify your Image
using GeoCorrection
To orthorectify your image using GeoCorrection properties, follow these
steps:
1. Select Camera from the Model Types dropdown list on the Image
Analysis toolbar.
2. Click the GeoCorrection Properties button to open the Camera
Properties dialog.
3. Click the Elevation tab.
4. Click File to use as the elevation source.
5. Navigate to the ArcGIS ArcTutor directory, and select ps_dem.img as
the elevation file.
6. Select Meters from the Elevation Units dropdown list.
7. Check the Account for Earth's Curvature check box.
8. Click the Camera tab.
9. Select Default Wild from the Camera Name dropdown list.
10. Type -0.004 in the X field and 0.000 in the Y field in the Principal Point
box.
11. Type 152.804 in the Focal Length field.
12. Type 4 in the Number of Fiducials field, and then press the Tab key to
update the fiducials table below this field.
13. Type the following coordinates in the table cells. Press the Tab key to
move from cell to cell.

Fiducial    X            Y
1           -106.000     106.000
2           105.999      105.9942
3           105.998      -105.999
4           -106.008     -105.999
14. Type the camera name in the Camera Name field.
15. Click Save to navigate to a directory and save the camera information
with the camera name.
16. Click Save to return to the Camera tab.
17. Click Apply and move to the next section.
Place Fiducials

To place fiducials, follow these steps:
1. Click the Fiducials tab.
2. Make sure the first fiducial orientation is selected.
3. Click the Fiducial Measurement button.
Image Analysis for ArcGIS takes you to the approximate location of the
first fiducial placement, and your cursor becomes a crosshair.
4. Click the Fixed Zoom In button on the Tools toolbar.
5. Zoom in until you see the fiducial, and then click the crosshair.
Image Analysis for ArcGIS takes you to each of the four points where
you can click the crosshair in the fiducial marker.
6. Right-click the image in the ArcMap table of contents and select Zoom
to Layer after you finish placing fiducials.
You see that both the image and the shapefile display in the view.
7. View the root mean square error (RMSE) on the Fiducials tab by
reopening the Camera Properties dialog. The RMSE should be less
than 1.
8. Click OK to close the Camera Properties dialog.
After placing fiducials, both the image and the shapefile are shown in
the view for rectification.
Place Links

To place links, follow these steps:
1. Click the Add Links button.
2. Look closely at the image and shapefile in the view and, using the
next image as a guide, line up where to place the first link. Follow the
markers in the next image to place the first three links. You must click
the crosshair on the point in the image first, and then click the
corresponding location in the shapefile.
Your first link should look approximately like this:
3. Place links 2 and 3.
After placing the third link, your image should look something like this:
4. Zoom to the lower-left corner of the image, and place a link according
to the previous image.
Your image should warp and become aligned with the streets shapefile.
You can use the Zoom tool to draw a rectangle around the aligned area
and zoom in to see it more clearly.
Now take a look at the Total RMSE field on the Links tab on the Camera
Properties dialog. Your RMSE should be less than 1. If the error is
higher than 1, you might need to redo the point selection. Remove the
point first by clicking it, and then pressing the Delete key.
5. Select Save As from the Image Analysis dropdown list to save the
image.
What's Next?

This tutorial introduced you to some features and basic functions of
Image Analysis for ArcGIS. The following chapters go into greater detail
about the different tools and elements of Image Analysis for ArcGIS,
and include instructions on how to use them to your advantage.
Applying Data Tools
There are three options in the Image Analysis dropdown list. All three
aid you in manipulating, analyzing, and altering your data so you can
produce results that are easier to interpret. The options are as follows:

Seed Tool Properties: Automatically generates feature layer polygons
of similar spectral value.

Image Info: Gives you the ability to apply a NoData value and
recalculate statistics.

Options: Lets you change extent, cell size, preferences, and more.
IN THIS CHAPTER
Seed Tool
Image Info
Options Dialog
Geoprocessing Tools
Seed Tool

The main function of the Seed tool is to automatically generate feature
layer polygons of similar spectral value. Before using the Seed tool, you
must first create the shapefile for the image you are using in
ArcCatalog. To do so, open ArcCatalog, create a new shapefile in the
directory you want to use, name it, select Polygon as the shapefile type,
and then select Start Editing from the Editor dropdown list.
After clicking the Seed Tool button on the Image Analysis toolbar, you
can either click in an image on a single point, or you can drag a
rectangle in a portion of the image that interests you. You can
experiment with which method works best with your data. When you
finish growing the polygon, select Stop Editing from the Editor
dropdown list.
Any bands used in growing the polygon are controlled by the current
visible bands set in the Layer Properties dialog. If you display only one
band, such as the red band when interested in vegetation analysis, the
Seed tool only looks at the statistics of that band to create the polygon.
If you display all the bands (red, green, and blue), the Seed tool
evaluates the statistics in each band of data before creating the
polygon.
When a polygon is created using the Seed tool, it is added to the
shapefile. Like other ArcGIS graphics, you can change the appearance
of the polygon produced by the Seed tool using the graphics tools.
Controlling the Seed Tool

You use the Seed tool by clicking it on the Image Analysis toolbar, and
then clicking an image after generating a shapefile. The defaults usually
produce a good result. However, if you want more control over the
parameters of the Seed tool, you can use the Seed Tool Properties
dialog by selecting it from the Image Analysis dropdown list.
Figure 7: Seed Tool Properties Dialog Box
Seed Radius
When you use the simple click method, the Seed tool is controlled by
the seed radius. You can change the number of pixels of the seed
radius using the Seed Radius field in the Seed Tool Properties dialog.
The Image Analysis for ArcGIS default seed radius is 5 pixels.
The seed radius determines how selective the Seed tool is when
selecting contiguous pixels. A larger seed radius includes more pixels
to calculate the range of pixel values used to grow the polygon, and
typically produces a larger polygon. A smaller seed radius uses fewer
pixels to determine the range. Setting the seed radius to 0.5 or less
restricts the polygon to growing over pixels with the same value as the
pixel you click in the image. This is useful for thematic images in which
a contiguous area has a single pixel value instead of a range of values
like continuous data.
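The growth behavior can be pictured as a flood fill constrained by the
value range found in the radius window. The following numpy sketch
illustrates that idea only; the Seed tool's actual statistics and growth
rules are not documented here:

    from collections import deque
    import numpy as np

    def seed_grow(band, row, col, radius=5):
        # The window around the seed defines the allowed value range.
        window = band[max(0, row - radius):row + radius + 1,
                      max(0, col - radius):col + radius + 1]
        lo, hi = window.min(), window.max()
        mask = np.zeros(band.shape, dtype=bool)
        queue = deque([(row, col)])
        # Flood fill: collect contiguous pixels whose values fall in range.
        while queue:
            r, c = queue.popleft()
            if (0 <= r < band.shape[0] and 0 <= c < band.shape[1]
                    and not mask[r, c] and lo <= band[r, c] <= hi):
                mask[r, c] = True
                queue.extend([(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)])
        return mask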
Island Polygons
The other option in the Seed Tool Properties dialog is Include Island
Polygons. Leave this option checked if you want to include small, non-
contiguous polygons. You can turn it off for single feature mapping
where you want to see a more refined boundary.
Preparing to Use the
Seed Tool
To activate the Seed tool and generate a polygon in your image, follow
these steps:
1. Click the ArcCatalog button on the Standard toolbar to open
ArcCatalog, and then make sure your working directory displays in the
window.
2. Click File, point to New, and then select Shapefile to open the Create
New Shapefile dialog.
3. Type a name for the new shapefile in the Name field.
4. Select Polygon from the Feature Type dropdown list.
5. Check the Show Details check box.
6. Click Edit to open the Spatial Reference Properties dialog.
7. Click either the Select, Import, or New button and enter the coordinate
system for the new shapefile to use. Clicking Import lets you import the
coordinates of the image you are creating the shapefile for.
8. Click Apply and OK to close the Spatial Reference Properties dialog.
9. Click OK to close the Create New Shapefile dialog.
10. Drag the new shapefile from the ArcCatalog table of contents into the
ArcMap table of contents.
11. Close ArcCatalog.
12. Select Start Editing from the Editor dropdown list on the Editor toolbar.
Changing the Seed
Radius
To change the seed radius and include island polygons, follow these
steps:
1. Select Seed Tool Properties from the Image Analysis dropdown list to
open the Seed Tool Properties dialog.
2. Type a new value in the Seed Radius field.
3. Check the Include Island Polygons check box if you want to activate
this option.
4. Click OK to close the Seed Tool Properties dialog, and then grow the
polygon over the image with the Seed tool.
5. Select Stop Editing from the Editor dropdown list on the Editor toolbar.
Image Info

When analyzing images, you often have pixel values you need to alter
or manipulate to perceive different parts of the image better. The Image
Info feature of Image Analysis for ArcGIS lets you choose a NoData
value for your image so that a pixel value that is unimportant in your
image can be designated as such and is excluded from processes like
statistics calculations.
You can find the Image Info dialog on the Image Analysis dropdown list.
When you open this dialog, the images in your view display in the Layer
Selection dropdown list. You can use the Image Info dialog as follows:
Type a value in the NoData Value field to set the NoData pixels in
your image.
Apply NoData to a single layer of your image instead of the entire
image. When you apply NoData to a single layer, it is important that
you click the Apply button in the Image Info dialog before moving to
the next layer. For more information, see the NoData Value
section that follows.
Recalculate statistics (Recalc Stats) for single bands by clicking the
Current Band button in the Statistics box. Please remember that if
you click Recalc Stats while Current Band is selected, Image Info
only recalculates the statistics for that band. If you want to set
NoData for a single band, but recalculate statistics for all bands,
click the All Bands button after specifying NoData in the single
bands, and then recalculate for all.
You can close and refresh the ArcMap display to see the NoData
value applied by clicking the refresh button at the bottom of the
ArcMap window.
Manually override the setting in the Representation Type box.
Image Analysis for ArcGIS automatically chooses continuous or
thematic depending on the type of image in your view.
NoData Value

The NoData Value section of the Image Info dialog lets you label certain
areas of your image as NoData when the pixel values in that area are
not important to your statistics or image. To do so, assign a value that
no other pixel in the image has to the pixels you want to classify as
NoData. Using 0 is not always recommended because 0 can be a valid
value in your image. Look at the Minimum and Maximum values in the
Statistics box and choose a NoData value that falls outside this
minimum-to-maximum range, so it cannot collide with a valid pixel value.
You can type N/A or leave the area blank so that you have no NoData
assigned if you don't want to use this option.
Sometimes the pixel value you choose as NoData is already in use,
causing NoData to match some other part of your image. This problem
becomes evident when the image displays in the view and there are
black spots or triangles where it should be clear, or clear spots where it
should be black.
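The effect of NoData on statistics can be pictured as masking those
pixels out before computing. A minimal numpy sketch (illustration only,
not the Image Info implementation):

    import numpy as np

    def band_stats(band, nodata=None):
        # Drop NoData pixels before computing, so they cannot skew statistics.
        values = band.astype(np.float64).ravel()
        if nodata is not None:
            values = values[values != nodata]
        return {"min": values.min(), "max": values.max(),
                "mean": values.mean(), "std": values.std()}

    band = np.array([[0, 10], [20, 0]])
    print(band_stats(band, nodata=0))   # statistics over 10 and 20 only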
Using the Image Info
Dialog Box
To use the Image Info dialog, follow these steps:
1. Display your image in the view.
2. Select Image Info from the Image Analysis dropdown list to open the
Image Info dialog.
3. Click the Layer Selection dropdown arrow to make sure the correct
image is displayed.
4. Click either the All Bands or Current Band button.
5. Click the Statistics dropdown arrow to make sure the layer you want to
recalculate is selected if you clicked the Current Band button.
6. Type a value in the NoData Value field, or type N/A if you don't want to
assign a pixel to the NoData value.
7. Make sure the correct representation type is chosen for your image.
8. Click the Recalc Stats button to recalculate the statistics using the
NoData value.
9. Click Apply and OK to close the Image Info dialog.
10. Click the refresh button at the bottom of the ArcMap window to refresh
the display.
Options Dialog

You can access the Options dialog using the Image Analysis dropdown
list. This dialog lets you set an analysis mask as well as the extent, cell
size, preferences, and raster for a single operation or future operations.
These options default to process appropriate options, but you can
change them if necessary. You can use the Options dialog with any
Image Analysis feature, but it is particularly useful with the Data
Preparation features that are covered in Using Data Preparation on
page 71.
Note: You can specify any of the settings in the Options dialog from the
Environment Settings dialog, which can be displayed by clicking the
Environments button at the bottom of any process dialog. You can also
access this dialog by clicking Tools > Options > Geoprocessing tab, and
then clicking the Environments button.
The Options dialog has five tabs: General Tab, Extent Tab, Cell Size
Tab, Preferences Tab, and Raster Tab.
General Tab

On the General tab, your output directory displays and the analysis
mask defaults to none. However, if you click the Analysis Mask
dropdown arrow, you can set it to any file.
Figure 8: General Tab
You can store your output images and shapefiles in one working
directory by navigating to that directory or typing the directory name in
the Working Directory field. This allows your working directory to
automatically come up every time you click the browse button for an
output image. The Analysis Coordinate System option lets you choose
which coordinate system to save the image with: the one for the input,
the one for the active data frame, or the one you specify.
Extent Tab

The Extent tab lets you control how much of a theme you want to use
during processing. You can do this by setting the analysis extent.
Figure 9: Extent Tab
All of the settings on the Extent tab become active when you select any
of the following extents from the Analysis Extent dropdown list:
Same as Display: Refers to the area currently displayed in the view. If
the view is zoomed in on a portion of a theme, the functions only operate
on that portion of the theme.

Same as Layer: Lets you set the extent of processing to the same extent
of another layer in your table of contents. You can also click the browse
button to select a dataset to use as the analysis extent. If you click this
button, you can navigate to the directory where your data is stored and
select a file.

As Specified Below: Lets you fill in the information for the extent.
When you select an extent that activates the rest of the Extent tab, the
fields are Top, Right, Bottom, and Left. If you are familiar with the data
and want to enter exact coordinates, you can do so in these fields.
Same as Display and As Specified Below also activate the Snap Extent
To dropdown list, allowing you to select an image to snap the analysis
mask to.
The other settings on the Analysis Extent dropdown list are:

Intersection of Inputs: The default extent for all functions except
Mosaic. When this extent is set, Image Analysis for ArcGIS performs
functions on the area of overlap common to the input images. Portions of
the images outside the area of overlap are discounted from analysis.

Union of Inputs: The default setting of analysis extent for mosaicking.
When this extent is set, Image Analysis for ArcGIS uses the union of
every input theme. It is recommended that you keep this default setting
when mosaicking images. If you change it, you must check the Use Extent
from Analysis Options check box in the Mosaic Images dialog.

Use Function Defaults: The extent that lets you use the general
processing defaults for the specific function or any settings you set in
the Environment Settings dialog.
Cell Size Tab

The third tab on the Options dialog is Cell Size. This is for the output
cell size of images you produce using Image Analysis for ArcGIS.
Figure 10: Cell Size Tab
The first field on the Cell Size tab is the Analysis Cell Size dropdown list.
The options in this list are as follows:
Maximum of Inputs: Yields an output that has the maximum resolution of
the input files. For example, if you use Image Difference on a 10 m image
and a 20 m image, the output is a 20 m image.

Minimum of Inputs: Produces an output that has the minimum resolution of
the input files. For example, if you use Image Difference on a 10 m image
and a 20 m image, the output is a 10 m image.

As Specified Below: Lets you enter the cell size you want, and Image
Analysis for ArcGIS adjusts the output accordingly.

Same as Layer: Indicates a layer in the view, and the Cell Size field
reflects the current cell size of that layer.
The cell size displays in either meters or feet. You can change the cell
size by selecting Data Frame Properties from the View menu in ArcMap
to open the Data Frame Properties dialog. Click the General tab, and
then select either Meters or Feet from the Map dropdown list in the Units
box.
You should not manually update the Number of Rows and Number of
Columns fields on the Cell Size tab because they automatically update
as analysis properties are changed.
Preferences Tab

It is recommended that you leave the preference to the default of
Bilinear Interpolation. However, you can change it to Nearest Neighbor
or Cubic Convolution if your data requires it.
Figure 11: Options Preferences Tab
The Resample Using settings on the Preferences tab are defined as
follows:

Bilinear Interpolation: A resampling method that uses the data file
values of four pixels in a 2 x 2 window to calculate an output data file
value. It does this by computing a weighted average of the input data
file values with a bilinear function (see the sketch after this list).

Nearest Neighbor: A resampling method in which the output data file
value is equal to the input pixel that has coordinates closest to the
retransformed coordinates of the output pixel.

Cubic Convolution: A resampling method that uses the data file values of
16 pixels in a 4 x 4 window to calculate an output data file value with a
cubic function.
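For instance, bilinear interpolation at a fractional pixel location can
be written out directly. A minimal numpy sketch (illustration only):

    import numpy as np

    def bilinear_sample(band, x, y):
        # Weighted average of the 2 x 2 neighbors around the fractional point.
        x0, y0 = int(np.floor(x)), int(np.floor(y))
        dx, dy = x - x0, y - y0
        return ((1 - dx) * (1 - dy) * band[y0, x0] +
                dx * (1 - dy) * band[y0, x0 + 1] +
                (1 - dx) * dy * band[y0 + 1, x0] +
                dx * dy * band[y0 + 1, x0 + 1])

    band = np.array([[0.0, 10.0], [20.0, 30.0]])
    print(bilinear_sample(band, 0.5, 0.5))   # 15.0, the mean of all four pixels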
This table lists the advantages and disadvantages of bilinear
interpolation resampling.

Table 1: Bilinear Interpolation Resampling

Advantages:
- Results in output images that are smoother, without the stair-stepped
effect that is possible with Nearest Neighbor.
- More spatially accurate than Nearest Neighbor.
- Often used when changing the cell size of the data, such as in SPOT/TM
merges within the 2 x 2 resampling matrix limit.

Disadvantages:
- Has the effect of a low-frequency convolution because pixels are
averaged. Also, edges are smoothed, and some extremes of the data file
values are lost.
This table lists the advantages and disadvantages of nearest neighbor
resampling.

Table 2: Nearest Neighbor Resampling

Advantages:
- Transfers original data values without averaging them as the other
methods do; therefore, the extremes and subtleties of the data values are
not lost. This is an important consideration when discriminating between
vegetation types, locating an edge associated with a lineament, or
determining different levels of turbidity or temperatures in a lake
(Jensen 1996).
- Suitable for use before classification.
- Easiest of the three methods to compute, and the fastest to use.
- Appropriate for thematic files, which can have data file values based
on a qualitative (nominal or ordinal) or quantitative (interval or ratio)
system. The averaging performed with bilinear interpolation and cubic
convolution is not suited to a qualitative class value system.

Disadvantages:
- Usually results in a stair-stepped effect around diagonal lines and
curves when used to resample from a larger to a smaller grid size.
- Can drop data values, while duplicating other values.
- Can result in breaks or gaps in a network of linear data when used on
linear thematic data (for example, roads or streams).
This table lists the advantages and disadvantages of cubic convolution
resampling.

Table 3: Cubic Convolution Resampling

Advantages:
- Uses 4 x 4 resampling. In most cases, the mean and standard deviation
of the output pixels match the mean and standard deviation of the input
pixels more closely than any other resampling method.
- Can both sharpen the image and smooth out noise (Atkinson 1985), due to
the effect of the cubic curve weighting. The actual effects depend on the
data you use.
- Recommended for use when you are dramatically changing the cell size of
the data, such as in TM/aerial photo merges (that is, matches the 4 x 4
window more closely than the 2 x 2 window).

Disadvantages:
- Can result in altered data values.
- Slowest resampling method because it is the most computationally
intensive.

Raster Tab

The last tab on the Options dialog is Raster.

Figure 12: Raster Tab
There are several raster formats with differences between the support
offered by ESRI and by ERDAS. These differences can be attributed to
a variant of that format, how data is stored, or the amount of data that
can be read to improve the display accuracy of that file. In some cases,
ERDAS lets you use ERDAS libraries to support a certain format. This
ensures you the same level of format support as in the past.
Note: It is recommended that you leave the supported formatting
options enabled because disabling a format results in that format's
handling being delivered by ESRI.
The following formats are supported:
SOCET SET
IMAGINE Image
TIFF
NITF
Using the Options Dialog

The following steps take you through the settings you can change in the
Options dialog. You can display the Options dialog by selecting Options
from the Image Analysis dropdown list.
Using the General Tab
To specify settings on the General tab, follow these steps:
1. Click the General tab on the Options dialog.
2. Click the browse button for the Working Directory field and navigate to
your working directory.
3. Select an analysis mask from the Analysis Mask dropdown list, or
navigate to a directory and select one.
4. Select a setting from the Analysis Coordinate System dropdown list.
Note: If you select As Specified below, the button below it becomes
active. Click the button to open the Spatial Reference Properties
dialog, specify your coordinate system settings, and then click OK.
5. Click the Extent tab to change analysis extents, or click Apply and OK
to close the Options dialog.
Using the Extent Tab
To specify settings on the Extent tab, follow these steps:
1. Select an extent from the Analysis Extent dropdown list, or navigate to
a directory and select a dataset for the extent.
2. Type coordinates in the Top, Left, Right, and Bottom fields if you
selected As Specified Below from the Analysis Extent dropdown list.
3. Select an image from the Snap Extent To dropdown list, or navigate to
the directory where it is stored if you activated this option by selecting
As Specified below or Same as Display from the Analysis Extent
dropdown list.
4. Click the Cell Size tab to change cell sizes, or click Apply and OK to
close the Options dialog.
Using the Cell Size Tab
To specify settings on the Cell Size tab, follow these steps:
1. Select the output cell size from the Analysis Cell Size dropdown list, or
navigate to a directory and select a file on which to base the output cell
size.
2. Type a cell size in the Cell Size field if you selected As Specified Below
from the Analysis Cell Size dropdown list.
3. Change the number in the Number of Rows field if necessary.
4. Change the number in the Number of Columns field if necessary.
5. Click the Preferences tab to change preferences, or click Apply and
OK to close the Options dialog.
Using the Preferences Tab
The Preferences tab on the Options dialog has only one option that lets
you resample using either Nearest Neighbor, Bilinear Interpolation, or
Cubic Convolution using the Resample Using dropdown list. Bilinear
Interpolation is the default, and it is recommended that you leave the
preference to this default except for thematic data processing, in which
case you should change it to Nearest Neighbor.
Using the Raster Tab
The Raster tab on the Options dialog has several raster formatting
options that are already enabled. It is recommended that you leave the
supported formatting options enabled because disabling a format
results in that format's handling being delivered by ESRI.
Geoprocessing Tools

In Image Analysis for ArcGIS, all functions, with the exception of Single
Frame GeoCorrection and the Seed tool, are provided as geoprocessing
tools. This lets you take full advantage of the ArcGIS geoprocessing
environment. Benefits include:
Ability to do image processing in ArcCatalog or ArcToolbox in a
scripting environment.
Ability to chain Image Analysis processes into a workflow or
analysis model.
Opportunity to integrate with other geoprocessing tools or scripts
such as those provided by ESRI and third-party developers.
Tools that respect the settings made in the ArcGIS geoprocessing
environment. Examples of the environment settings are current
workspace, cell size, analysis extent, output coordinate system, and
resampling method.
If you are new to geoprocessing, we suggest that you read the ArcGIS
Desktop online help topics for geoprocessing.
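As a sketch of what scripted use might look like, the snippet below
imports a toolbox, sets environment settings, and runs a tool. The
toolbox path, alias, and tool name are hypothetical placeholders, not the
actual Image Analysis for ArcGIS names; check your installation for the
real ones.

    import arcpy

    # Hypothetical toolbox path and alias; substitute your installed toolbox.
    arcpy.ImportToolbox(r"C:\Tools\ImageAnalysis.tbx", "imageanalysis")

    # Environment settings that geoprocessing tools respect.
    arcpy.env.workspace = r"C:\workspace"
    arcpy.env.cellSize = "MAXOF"    # output cell size keyword
    arcpy.env.extent = "MINOF"      # intersection of inputs

    # A hypothetical tool call; check your toolbox for real tool names.
    arcpy.imageanalysis.SubsetImage("tm_image.img", "2,3,4", "subset.img")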
Specifying
Geoprocessing Options
In earlier versions of Image Analysis for ArcGIS, the Image Analysis
Options dialog saved the analysis settings to the map document (.mxd).
Now, the Options dialog provides a more convenient alternative to
using the application-wide geoprocessing Environment Settings dialog.
It lets you view and alter the environment settings that are most relevant
to Image Analysis for ArcGIS functions without having to navigate all of
the geoprocessing environment settings presented by the Environment
Settings dialog.
You can use either Image Analysis for ArcGIS dialog (the Options
dialog or the Environment Settings dialog) as a valid method for
specifying options for processing.
Updating Existing
Geoprocessing Models
The Image Analysis for ArcGIS geoprocessing functions were rebuilt
during the time frame of the current version of ArcGIS. It is
recommended that you update existing models with the new
geoprocessing functions. To do this, open the model containing
functions from the earlier version and replace each with the current
version.
Using Data Preparation
It is sometimes necessary to prepare your data before working with
images. However, you must understand how to prepare your data
before manipulating and analyzing it. You have several options for
preparing data in Image Analysis for ArcGIS.
IN THIS CHAPTER
Create New Image
Subset Image
Mosaic Images
Reproject Image
Create New Image

The create new image function makes it easy to create a new image file.
It lets you define the size and content of the file. It also lets you specify
whether the new image type is thematic or continuous:
Thematic Data: Represented in raster layers that contain qualitative and
categorical information about an area. Thematic layers lend themselves to
applications in which categories or themes are used. They represent data
measured on a nominal or ordinal scale, such as soils, land use, land
cover, and roads. Thematic data is also known as discrete data.

Continuous Data: Represented in raster layers that contain quantitative
(measuring a characteristic on an interval or ratio scale) and related,
continuous values. Continuous raster layers can be multiband or single
band, such as Landsat, SPOT, digitized (scanned) aerial photographs,
DEM, slope, and temperature.
The table below summarizes the values appropriate for the various data
types.
The create new image function also provides these options:

Number of Rows and Columns: Lets you specify the number of rows and
columns (the default is 512), and the data type.

Data Type: Determines the type of numbers and the range of values that
you can store in a raster layer.

Number of Layers: Lets you select how many layers to create in the new
file.
Table 4: Data Type Values

Data Type          Minimum Value    Maximum Value
Unsigned 1 bit     0                1
Unsigned 2 bit     0                3
Unsigned 4 bit     0                15
Unsigned 8 bit     0                255
Signed 8 bit       -128             127
Unsigned 16 bit    0                65,535
Signed 16 bit      -32,768          32,767
Unsigned 32 bit    0                4 billion
Signed 32 bit      -2 billion       2 billion
Float Single       (floating point) (floating point)
Initial Value: Lets you specify the value given to every cell in the new
file.
Creating a New Image

To create a new image, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Data Preparation,
and then select Create New Image to open the Create New Image
dialog.
2. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
3. Click either the Thematic or Continuous button as the output image
type.
4. Type the number of columns and rows, if different from the default
number of 512, in the Columns and Rows fields.
5. Select a data type from the Data Type dropdown list.
6. Type the number of layers that you want in the Number of Layers field.
7. Type the initial pixel value in the Initial Value field.
8. Click OK to create the image and close the Create New Image dialog.
Subset Image

The subset image function lets you copy a portion (a subset) of an input
data file into an output data file. This may be necessary if you have an
image file that is much larger than the area you need to study.
Subsetting an image has the advantage of eliminating extraneous data
and speeding up processing by reducing the file size. This is important
when dealing with multiband data.
You can use subset image to subset an image either spatially or
spectrally:

Spatially: Subset an image spatially by setting an analysis mask using
the Subset Image dialog, or set an extent using the Options dialog or the
Environment Settings dialog. Spatial subsets are particularly useful if
you have a large image and you only want to subset part of it for
analysis.

Spectrally: Subset an image spectrally by using the subset image function
to work on multiband continuous data to remove specific bands. For
example, if you are working with a thematic mapper (TM) image that has
seven bands of data, you can make a subset of bands 2, 3, and 4, and
discard the rest. Use the Subset Image dialog to enter the band numbers
to extract from the image (see the sketch after this list).
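In array terms, a spectral subset is simple band selection. A minimal
numpy sketch (illustration only, not the tool's implementation):

    import numpy as np

    def spectral_subset(image, band_numbers):
        # Convert 1-based band numbers to 0-based indexes and slice them out.
        indices = [b - 1 for b in band_numbers]
        return image[indices, :, :]

    tm = np.zeros((7, 512, 512), dtype=np.uint8)   # stand-in for 7-band TM data
    subset = spectral_subset(tm, [2, 3, 4])
    print(subset.shape)   # (3, 512, 512)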
If you want to specify a particular area to subset, click the Zoom In tool
and draw a rectangle over the area. Next, display the Options dialog
and select Same As Display from the Analysis Extent dropdown list on
the Extent tab.
Figure 13: Extent Tab
The following are illustrations of a TM image of the Amazon as it
undergoes a spectral subset.
Figure 14: Amazon TM Image before Spectral Subsetting
Figure 15: Amazon TM Image after Spectral Subsetting
The illustrations that follow reflect images using the Spatial Subsetting
option.
Figure 16: Pentagon Image before Spatial Subsetting
The rectangle is defined by Top, Left, Bottom, and Right coordinates.
Top and Bottom coordinates are measured as the locations on the Y-
axis and the Left and Right coordinates are measured on the X-axis.
You can then save the subset image and work from there on your
analysis.
Figure 17: Pentagon Subset Image after Analysis Extent
Subsetting an Image
Spectrally
To subset an image spectrally, follow these steps:
1. Click the Add Data button and add the image you want to subset to
the view.
2. Click the Image Analysis dropdown arrow, point to Data Preparation,
and then select Subset Image to open the Subset Image dialog.
3. Click the browse button for the Input Image field and navigate to the
directory where your image is located.
4. Type the number of bands you want present in your output in the Select
Desired Band Numbers field, using a comma for separation.
5. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
6. Click OK to close the Subset Image dialog.
Subsetting an Image
Spatially
To subset an image spatially, follow these steps:
1. Click the Add Data button to add your image to the view.
2. Click the Zoom In tool, and draw a rectangle over the area you want to
subset.
3. Click the Image Analysis dropdown arrow, point to Data Preparation,
and then select Subset Image to open the Subset Image dialog.
4. Click the Environments button to open the Environment Settings
dialog.
5. Click the General Settings arrow.
6. Select Same as Display from the Extent dropdown list.
7. Click OK to return to the Subset Image dialog.
8. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
9. Click OK to close the Subset Image dialog.
Note: There are other methods of subsetting. For more information, see
the Subset Image topic in the Image Analysis for ArcGIS online help.
Mosaic Images

Mosaicking is the process of joining georeferenced images together to
form a larger image. The input images must all contain map and
projection information, although they need not be in the same projection
or have the same cell sizes. Calibrated input images are also
supported. You can mosaic single or multiband continuous data, or
thematic data.
If you have data in the view, it loads into the Mosaic Images dialog in
that order, giving you a final image that is identical to the initial view.
You can reorder the data in the Mosaic Images dialog.
It is important that the images you mosaic contain the same number of
bands. You cannot mosaic a seven-band TM image with a six-band TM
image. You can, however, use subset image to remove the extra band
and then mosaic.
You can mosaic images with different cell sizes or resolutions. The
output cell size defaults to the maximum cell size. For example, if you
mosaic two images, one with a 4 m resolution and one with a 5 m
resolution, the output mosaicked image has a 5 m resolution. You can
set the cell size to whatever cell size you like using the Options dialog
or the Environment Settings dialog. However, new data cannot be
created to compensate if you specify a finer resolution than your
input; the output retains the coarseness of the original file.
The Extent tab on the Options dialog defaults to Union of Inputs for
mosaicking images. If, for some reason, you want to use a different
extent, you can change it in the Options dialog and check the Use
Extent from Analysis Options check box in the Mosaic Images dialog. It
is recommended that you leave it at the default of Union of Inputs.
For mosaicking images, you should resample using the Nearest
Neighbor option on the Preferences tab. This ensures that the
mosaicked pixels do not differ in their appearance from the original
image. Other resampling methods use averages to compute pixel
values and can produce an edge effect.
With the Mosaic tool you are given a choice of how to handle image
overlaps by using the Order Displayed, Maximum Value, Minimum
Value, or Average Value settings:
Order Displayed Replaces each pixel in the overlap area with the
pixel value of the image that is on top in the view.
Maximum Value Replaces each pixel in the overlap area with the
greater value of corresponding pixels in the overlapping images.
Minimum Value Replaces each pixel in the overlap area with the
lesser value of the corresponding pixels in the overlapping images.
Average Value Replaces each pixel in the overlap area with the
average of the values of the corresponding pixels in the overlapping
images.
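These four overlap rules are simple per-pixel operations. The following is a minimal sketch in Python with NumPy (an illustration only, not the Image Analysis implementation), assuming two single-band arrays already aligned to the same grid, with NaN marking cells outside an image's footprint:

    import numpy as np

    def resolve_overlap(img_top, img_bottom, method="Order Displayed"):
        # Both inputs are float arrays on the same grid; NaN means "no data".
        stack = np.stack([img_top, img_bottom])
        if method == "Order Displayed":
            # The image on top in the view wins wherever it has data.
            return np.where(np.isnan(img_top), img_bottom, img_top)
        if method == "Maximum Value":
            return np.nanmax(stack, axis=0)
        if method == "Minimum Value":
            return np.nanmin(stack, axis=0)
        if method == "Average Value":
            return np.nanmean(stack, axis=0)
        raise ValueError(method)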
The color balancing settings let you choose between balancing by
brightness/contrast, histogram matching, or no color balancing. Select
Histogram Matching to adjust the input images to have histograms
similar to the top image in the view. Select None if you don't want
the pixel values adjusted.
Mosaicking Images To mosaic an image, follow these steps:
1. Add the images you want to mosaic to your view.
2. Click the Image Analysis dropdown arrow, point to Data Preparation,
and then select Mosaic Images to open the Mosaic Images dialog.
3. Arrange the images in the order that you want them in the mosaic using
the arrows to the right of the box below the Input Images field.
4. Select the method you want to use from the Handle Image Overlaps By
dropdown list.
5. Check the Automatically Crop Images By check box to automatically
crop images, and then type the percent by which to crop the images in
the Percent field.
6. Click a button for the color balancing in the Color Balance By box.
7. Check the Use Extent from Analysis Options check box if you want
to use the extent you set in the Options dialog.
8. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
9. Click OK to mosaic the images and close the Mosaic Images dialog.
For more information on mosaicking images, see Quick-Start
Tutorial on page 11.
Reproject Image Use Reproject Image to reproject raster image data from one map
projection to another. The new projection is specified in the Reproject
Image dialog.
Optionally, you can specify an output coordinate system using the
Spatial Reference Properties dialog. You can display this dialog by
clicking the button adjacent to the Output Coordinate System field.
However, if you do not want to specify an output coordinate system,
leave the field blank.
Reprojecting an Image To reproject an image, follow these steps:
1. Click the Add Data button and add the image you want to reproject
to the view.
2. Click the Image Analysis dropdown arrow, point to Data Preparation,
and then select Reproject Image to open the Reproject Image dialog.
3. Select the file you want to use from the Input Image dropdown list, or
click the browse button and navigate to the directory where it is stored.
4. Click the button for the Output Coordinate System field to open the
Spatial Reference Properties dialog.
5. Specify your coordinate system settings, and then click OK to return to
the Reproject Image dialog.
6. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
7. Click OK to close the Reproject Image dialog.
Performing Spatial Enhancement
Spatial enhancement is a function that enhances an image using the
values of individual and surrounding pixels. Spatial enhancement deals
largely with spatial frequency, which is the difference between the
highest and lowest values of a contiguous set of pixels. Jensen (1986)
defines spatial frequency as the number of changes in brightness
value per unit distance for any part of an image.
There are three types of spatial frequency:
Zero Spatial Frequency A flat image in which every pixel has the
same value.
Low Spatial Frequency An image consisting of a smoothly
varying gray scale.
High Spatial Frequency An image consisting of drastically
changing pixel values such as a checkerboard of black-and-white
pixels.
Spatial enhancement has functions for convolution filtering, non-
directional edge, focal analysis, and resolution merge to enhance your
images.
This chapter focuses on the explanation of these features as well as
how to apply them to your data. It is organized according to the order in
which the spatial enhancement tools appear. You can skip ahead if the
information you seek is about one of the tools near the end of the list.
IN THIS CHAPTER
Convolution
Non-Directional Edge
Focal Analysis
Resolution Merge
Convolution Convolution filtering is the process of averaging small sets of pixels
across an image. Convolution filtering is used to change the spatial
frequency characteristics of an image (Jensen 1996). The word filtering
is a broad term that refers to the altering of spatial or spectral features
for image enhancement (Jensen 1996). Convolution filtering is one
method of spatial filtering. Some texts use the terms synonymously.
A convolution kernel is a matrix of numbers used to average the value
of each pixel with the values of surrounding pixels. The numbers in the
matrix weigh this average toward particular pixels. These numbers are
often called coefficients because they are used as such in the
mathematical equations.
Applying Convolution Filtering Apply convolution filtering by clicking
the Image Analysis dropdown arrow and selecting Convolution from the
Spatial Enhancement menu.
Convolution Example You can understand how one pixel is convolved by imagining that the
convolution kernel is overlaid on the data file values of the image (in one
band) so that the pixel being convolved is in the center of the window.
Data:

2 8 6 6 6
2 8 6 6 6
2 2 8 6 6
2 2 2 8 6
2 2 2 2 8

Kernel:

-1 -1 -1
-1 16 -1
-1 -1 -1
Compute the output value for this pixel by multiplying each value in the
convolution kernel by the image pixel value that corresponds to it.
These products are summed, and the total is divided by the sum of the
values in the kernel, as shown in this equation:

integer [((-1 × 8) + (-1 × 6) + (-1 × 6) +
(-1 × 2) + (16 × 8) + (-1 × 6) +
(-1 × 2) + (-1 × 2) + (-1 × 8)) /
(-1 + -1 + -1 + -1 + 16 + -1 + -1 + -1 + -1)]
= int [(128 - 40) / (16 - 8)]
= int (88 / 8) = int (11) = 11

When the 2 x 2 set of pixels near the center of this 5 x 5 image is
convolved, the output values are:

    1   2   3   4   5
1   -   -   -   -   -
2   -  11   5   -   -
3   -   0  11   -   -
4   -   -   -   -   -
5   -   -   -   -   -

The kernel used in this example is a high-frequency kernel. The
relatively lower values become lower, and the higher values become
higher, thus increasing the spatial frequency of the image.

Convolution Formula The following formula is used to derive an output
data file value for the pixel being convolved (in the center):

V = [ Σ(i=1 to q) Σ(j=1 to q) (f_ij × d_ij) ] / F

Where:
f_ij = The coefficient of a convolution kernel at position i,j (in the kernel)
d_ij = The data value of the pixel that corresponds to f_ij
q = The dimension of the kernel, assuming a square kernel (if q = 3, the kernel is 3 x 3)
F = Either the sum of the coefficients of the kernel, or 1 if the sum of coefficients is 0
V = The output pixel value
Source: Modified from Jensen 1996; Schowengerdt 1983
The sum of the coefficients (F) is used as the denominator of the
equation above so that the output values are in relatively the same
range as the input values. Because F cannot equal 0 (division by 0 is
not defined), F is set to 1 if the sum is 0.
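To make the arithmetic concrete, here is a minimal Python/NumPy sketch of convolving a single pixel (an illustration, not the Image Analysis implementation); it reproduces the worked example above:

    import numpy as np

    def convolve_pixel(data, kernel, row, col):
        # Multiply the kernel by the window centered on (row, col),
        # sum the products, and divide by F (sum of coefficients, or 1).
        half = kernel.shape[0] // 2
        window = data[row - half:row + half + 1, col - half:col + half + 1]
        F = kernel.sum()
        if F == 0:              # zero sum kernel: no division is performed
            F = 1
        return int((kernel * window).sum() / F)

    data = np.array([[2, 8, 6, 6, 6],
                     [2, 8, 6, 6, 6],
                     [2, 2, 8, 6, 6],
                     [2, 2, 2, 8, 6],
                     [2, 2, 2, 2, 8]])
    kernel = np.array([[-1, -1, -1],
                       [-1, 16, -1],
                       [-1, -1, -1]])
    print(convolve_pixel(data, kernel, 2, 2))   # 11, as in the example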
Zero Sum Kernels Zero sum kernels are kernels in which the sum of all coefficients in the
kernel equals 0. When a zero sum kernel is used, the sum of the
coefficients is not used in the convolution equation, as shown above. In
this case, no division is performed (F = 1), because division by 0 is not
defined.
This generally causes the output values to be:
Zero in areas where all input values are equal (no edges)
Low in areas of low spatial frequency
Extreme in areas of high spatial frequency (high values become
much higher, low values become much lower)
Therefore, a zero sum kernel is an edge detector, which usually
smooths out or zeros out areas of low spatial frequency and creates a
sharp contrast where spatial frequency is high, which is at the edges
between homogeneous (homogeneity is low spatial frequency) groups
of pixels. The resulting image often consists of only edges and zeros.
Zero sum kernels can be biased to detect edges in a particular direction.
For example, this 3 x 3 kernel is biased to the south (Jensen 1996):

-1 -1 -1
 1 -2  1
 1  1  1

High-Frequency Kernels A high-frequency kernel, or high-pass kernel,
such as the one below, has the effect of increasing spatial frequency.

-1 -1 -1
-1 16 -1
-1 -1 -1

High-frequency kernels serve as edge enhancers because they bring
out the edges between homogeneous groups of pixels. Unlike edge
detectors (such as zero sum kernels), they highlight edges and do not
necessarily eliminate other features.
When a high-frequency kernel is used on a set of pixels in which a
relatively low value is surrounded by higher values, like this...

BEFORE              AFTER
204  200  197       -    -    -
201  106  209       -   10    -
198  200  210       -    -    -

...the low value gets lower. Inversely, when the high-frequency kernel is
used on a set of pixels in which a relatively high value is surrounded by
lower values...

BEFORE              AFTER
64   60   57        -    -    -
61   125  69        -  188    -
58   60   70        -    -    -

...the high value becomes higher. In either case, spatial frequency is
increased by this kernel.

Low-Frequency Kernels Below is an example of a low-frequency kernel,
or low-pass kernel, which decreases spatial frequency:

1 1 1
1 1 1
1 1 1

This kernel averages the values of the pixels, causing them to be more
homogeneous. The resulting image looks either more smooth or more
blurred.

Figure 18: Convolution with High-Pass Filtering
Applying Convolution To apply convolution, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Spatial
Enhancement, and then select Convolution to open the Convolution
dialog.
2. Select a file from the Input Image dropdown list, or navigate to the
directory where the file is stored.
3. Select a kernel to use from the Kernel dropdown list.
4. Click either the Reflection or Background Fill button to specify the
way to handle edges in the image. See Using Convolution for more
information.
5. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
6. Click OK to close the Convolution dialog.
Using Convolution
Convolution lets you perform image-enhancement operations such as
averaging and high-pass or low-pass filtering.
Reflection fills in the area beyond the edge of the image with a reflection
of the values at the edge. Background Fill uses zeros to fill in the kernel
area beyond the edge of the image.
Each data file value of the new output file is calculated by centering the
kernel over a pixel and multiplying the original values of the center pixel
and the appropriate surrounding pixels by the corresponding
coefficients from the matrix. These numbers are summed and then
divided by the sum of the coefficients to ensure the output values are
within the general range of the input values. If the sum is zero, the
division is not performed.
Non-Directional
Edge
The non-directional edge function averages the results of two
orthogonal first derivative edge detectors, using the Sobel and Prewitt
filters. The filters are based on a calculation of the 1st derivative, or
slope, in both the X and Y directions. Both use orthogonal kernels
convolved separately with the original image and then combined.
The non-directional edge function is based on the Sobel zero sum
convolution kernel. Most of the standard image processing filters are
implemented as a single-pass moving window (kernel) convolution.
Examples include low-pass, edge-enhance, edge-detection, and
summary filters.
For this model, a Sobel filter is used. To convert this model to the
Prewitt filter calculation, change the kernels according to the example
below.
Sobel =

-1  0  1        1  2  1
-2  0  2        0  0  0
-1  0  1       -1 -2 -1
Vertical       Horizontal

Prewitt =

-1  0  1        1  1  1
-1  0  1        0  0  0
-1  0  1       -1 -1 -1
Vertical       Horizontal
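As a sketch of the general approach (not the Image Analysis implementation), the image can be convolved with each kernel of the chosen pair and the two gradients combined; the root sum of squares used below is one common combination:

    import numpy as np
    from scipy.ndimage import convolve

    sobel_vertical = np.array([[-1, 0, 1],
                               [-2, 0, 2],
                               [-1, 0, 1]])
    sobel_horizontal = np.array([[ 1,  2,  1],
                                 [ 0,  0,  0],
                                 [-1, -2, -1]])

    def non_directional_edge(band):
        # mode="reflect" mirrors the Reflection edge-handling option;
        # mode="constant", cval=0 would correspond to Background Fill.
        gx = convolve(band.astype(float), sobel_vertical, mode="reflect")
        gy = convolve(band.astype(float), sobel_horizontal, mode="reflect")
        return np.sqrt(gx ** 2 + gy ** 2)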
Figure 19: Image of Seattle before Non-Directional Edge
Figure 20: Image of Seattle after Non-Directional Edge
Applying Non-Directional
Edge
To apply non-directional edge, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Spatial
Enhancement, and then select Non-Directional Edge to open the
Non-Directional Edge dialog.
2. Select a file from the Input Image dropdown list, or navigate to the
directory where the file is stored.
3. Click either the Sobel or Prewitt button to specify the filter to use.
4. Click either the Reflection or Background Fill button to specify the
way to handle edges in the image. For more information, see Using
Non-Directional Edge.
5. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
6. Click OK to close the Non-Directional Edge dialog.
Using Non-Directional Edge
In step 4 in the previous section, Reflection fills in the area beyond the
edge of the image with a reflection of the values at the edge.
Background Fill uses zeros to fill in the kernel area beyond the edge of
the image.
Focal Analysis The focal analysis function lets you perform one of several types of
analysis on class values in an image file using a process similar to
convolution filtering.
This model (Median Filter) is useful for reducing noise such as random
spikes in data sets, dead sensor striping, and other impulse
imperfections in any type of image. It is also useful for enhancing
thematic images.
Focal analysis evaluates the region surrounding the pixel of interest
(center pixel). The operations you can perform on the pixel of interest
include:
Standard Deviation (measure of texture)
Sum
Mean (good for despeckling radar data)
Median (despeckle radar)
Min
Max
These functions let you select the size of the surrounding region to
evaluate by selecting the window size.
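As an illustration of the Median focal function (a sketch, not the Image Analysis implementation), SciPy's median_filter performs the same kind of moving-window operation:

    from scipy.ndimage import median_filter

    def despeckle(band, window=3):
        # Replace each pixel with the median of its window x window
        # neighborhood; effective against random spikes and striping.
        return median_filter(band, size=window)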
Figure 21: Image before Focal Analysis
Figure 22: Image after Focal Analysis
Applying Focal Analysis To apply focal analysis, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Spatial
Enhancement, and then select Focal to open the Focal Analysis
dialog.
2. Select a file from the Input Image dropdown list, or navigate to the
directory where the file is stored.
3. Select the function to use from the Focal Function dropdown list.
4. Select a shape from the Neighborhood Shape dropdown list.
5. Select a matrix size from the Neighborhood Definition - Matrix Size
dropdown list.
6. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
7. Click OK to close the Focal Analysis dialog.
Using Focal Analysis
Focal analysis is similar to convolution in the process it uses. With focal
analysis, you can perform several different types of analysis on the pixel
values in an image file.
Resolution Merge The resolution of a specific sensor can refer to radiometric, spatial,
spectral, or temporal resolution. This function merges imagery of
differing spatial resolutions.
Landsat TM sensors have seven bands with a spatial resolution of 28.5
m. SPOT panchromatic has one broad band with a very good spatial
resolution of 10 m. Combining these two images to yield a seven-band
data set with 10 m resolution provides the best characteristics of both
sensors.
A number of models have been suggested to achieve this image merge:
Welch and Ehlers (1987) use forward-reverse RGB to IHS
transforms, replacing I (from transformed TM data) with the SPOT
panchromatic image. However, this technique is limited to three
bands (RGB).
Chavez (1991), among others, uses the forward-reverse principal
components transforms with the SPOT image, replacing PC-1.
In the above two techniques, it is assumed that the intensity component
(PC-1 or I) is spectrally equivalent to the SPOT panchromatic image,
and that all the spectral information is contained in the other PCs or in
H and S. Because SPOT data does not cover the full spectral range that
TM data does, this assumption does not strictly hold. It is unacceptable
to resample the thermal band (TM6) based on the visible (SPOT
panchromatic) image.
Another technique (Schowengerdt 1980) additively combines a high-
frequency image derived from the high-spatial resolution data (that is,
SPOT panchromatic) with the high-spectral resolution Landsat TM
image.
Brovey Transform The resolution merge function uses the Brovey Transform method of
resampling low spatial resolution data to a higher spatial resolution
while retaining spectral information.
In the Brovey Transform, three bands are used according to the
following formula:
DNB1_new = [DNB1 / (DNB1 + DNB2 + DNB3)] × [DN high res. image]
DNB2_new = [DNB2 / (DNB1 + DNB2 + DNB3)] × [DN high res. image]
DNB3_new = [DNB3 / (DNB1 + DNB2 + DNB3)] × [DN high res. image]
Where:
B = band
The Brovey Transform was developed to visually increase contrast in
the low and high ends of an image's histogram (that is, to provide
contrast in shadows, water, and high-reflectance areas such as urban
features). Brovey Transform is good for producing RGB images with a
higher degree of contrast in the low and high ends of the image
histogram and for producing visually appealing images.
Because the Brovey Transform is intended to produce RGB images,
you should merge only three bands at a time from the input
multispectral scene, such as bands 3, 2, 1 from a SPOT or Landsat TM
image or 4, 3, 2 from a Landsat TM image. You should display the
resulting merged image with bands 1, 2, 3 to RGB.
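A minimal sketch of the Brovey formula in Python with NumPy (illustrative only), assuming the three multispectral bands have already been resampled to the grid of the high-resolution band:

    import numpy as np

    def brovey_merge(b1, b2, b3, pan):
        # Rescale each band by the band sum, then multiply by the
        # co-registered high-resolution DN values.
        total = (b1 + b2 + b3).astype(float)
        total = np.where(total == 0, 1, total)   # avoid division by zero
        return b1 / total * pan, b2 / total * pan, b3 / total * pan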
Applying Resolution
Merge
To apply resolution merge, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Spatial
Enhancement, and then select Resolution Merge to open the
Resolution Merge dialog.
2. Select a file from the High Resolution Image dropdown list, or navigate
to the directory where the file is stored.
3. Select a file from the Multi-Spectral Image dropdown list, or navigate to
the directory where the file is stored.
4. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
5. Click OK to close the Resolution Merge dialog.
Using Resolution Merge
Use resolution merge to integrate imagery of different spatial
resolutions (pixel size). The following images display the inputs and
result of the resolution merge function. The first image is a high-
resolution image, the second image is a multi-spectral image, and the
bottom image is a resolution merge.
Figure 23: High-Resolution, Multi-spectral, and Resolution Merge
Images
Using Radiometric Enhancement
Radiometric enhancement deals with the individual values of pixels in
an image. It differs from spatial enhancement, which takes into account
the values of neighboring pixels.
Radiometric enhancement contains functions to enhance your image
by using the values of individual pixels in each band. Depending on the
points and the bands in which they appear, radiometric enhancements
that are applied to one band might not be appropriate for other bands.
Therefore, the radiometric enhancement of a multiband image can
usually be considered as a series of independent, single-band
enhancements (Faust 1989).
IN THIS CHAPTER
LUT Stretch
Histogram Equalization
Histogram Matching
Brightness Inversion
LUT Stretch LUT stretch creates an output image that contains the data values as
modified by a lookup table (LUT). The output is three bands.
Contrast Stretch When radiometric enhancements are performed on the display device,
the transformation of data file values into brightness values is illustrated
by the graph of a lookup table.
Contrast stretching involves taking a narrow input range and stretching
the output brightness values for those same pixels over a wider range.
This process is done using the Layer Properties dialog in ArcGIS.
Linear and Nonlinear The terms linear and nonlinear, when describing types of spectral
enhancement, refer to the function applied to data to perform the
enhancement. A piecewise linear stretch uses a polyline function to
increase contrast to varying degrees over different ranges of the data.
Linear Contrast Stretch A linear contrast stretch is a simple way to improve the visible contrast
of an image. It is often necessary to contrast stretch raw image data so
you can see it on the display device.
In most raw data, the data file values fall within a narrow range, usually
a range much narrower than the display device is capable of displaying.
That range can be expanded to utilize the total range of the display
device (usually 0 to 255).
Nonlinear Contrast
Stretch
A nonlinear spectral enhancement gradually increases or decreases
contrast over a range, instead of applying the same amount of contrast
(slope) across the entire image. Usually, nonlinear enhancements bring
out the contrast in one range while decreasing the contrast in other
ranges.
Piecewise Linear
Contrast Stretch
A piecewise linear contrast stretch allows for the enhancement of a
specific portion of data by dividing the lookup table into three sections:
low, middle, and high. It lets you create a number of straight-line
segments that can simulate a curve. You can enhance the contrast or
brightness of any section in a single color gun at a time. This technique
is very useful for enhancing image areas in shadow or other areas of
low contrast.
A piecewise linear contrast stretch normally follows two rules:
1. The data values are continuous; there can be no break in the values
between high, middle, and low. Range specifications adjust in relation
to any changes to maintain the data value range.
2. The data values specified can go only in an upward, increasing
direction.
The contrast value for each range represents a percentage of the
available output range that particular range occupies. Because rules 1
and 2 above are enforced, the contrast and brightness values may
affect the contrast and brightness of other ranges as they are changed.
For example, if the contrast of the low range increases, it forces the
contrast of the middle range to decrease.
Contrast Stretch on the
Display
Usually, a contrast stretch is performed on the display device only, so
that the data file values are not changed. Lookup tables are created that
convert the range of data file values to the maximum range of the
display device. You can then edit and save the contrast stretch values
and lookup tables as part of the raster data image file. These values are
loaded into view as the default display values the next time the image
is displayed.
The statistics in the image file contain the mean, standard deviation,
and other statistics on each band of data. The mean and standard
deviation are used to determine the range of data file values to translate
into brightness values or new data file values. You can specify the
number of standard deviations from the mean to use in the contrast
stretch. Usually the data file values that are two standard deviations
above and below the mean are used. If the data has a normal
distribution, this range represents approximately 95 percent of the data.
The mean and standard deviation are used instead of the minimum and
maximum data file values because the minimum and maximum data file
values do not usually represent most of the data. A notable exception
occurs when the feature being sought is in shadow. The shadow pixels
are usually at the low extreme of the data file values, outside the range
of two standard deviations from the mean.
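A minimal sketch of such a stretch (illustrative only, not the Image Analysis implementation), mapping the range mean ± 2 standard deviations to 0 to 255:

    import numpy as np

    def linear_stretch(band, n_std=2):
        # Values within n_std standard deviations of the mean are scaled
        # to 0-255; values outside that range clip to the limits.
        lo = band.mean() - n_std * band.std()
        hi = band.mean() + n_std * band.std()
        scaled = (band - lo) / (hi - lo) * 255.0
        return np.clip(scaled, 0, 255).astype(np.uint8)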
Varying the Contrast
Stretch
There are variations of the contrast stretch you can use to change the
contrast of values over a specific range, or by a specific amount. By
manipulating the lookup tables as in the following illustration, you can
bring out the maximum contrast in the features of an image.
This figure shows how the contrast stretch manipulates the histogram
of the data, increasing contrast in some areas and decreasing it in
others. This is also a good example of a piecewise linear contrast
stretch, which is created by adding breakpoints to the histogram.
Figure 24: Contrast Stretch
Applying a LUT Stretch To apply a LUT stretch, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Radiometric
Enhancement, and then select LUT Stretch to open the LUT Stretch
dialog.
2. Select the file you want to use from the Input Image dropdown list, or
navigate to the directory where it is stored.
3. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
4. Set the output type to TIFF.
5. Click OK to close the LUT Stretch dialog.
Using LUT Stretch
LUT stretch provides a means of producing an output image that has
the stretch built into the pixel values to use with packages that have no
stretching capabilities.
Histogram
Equalization
Histogram equalization is a nonlinear stretch that redistributes pixel
values so that there is approximately the same number of pixels with
each value in a range. The result approximates a flat histogram.
Therefore, contrast is increased at the peaks of the histogram and
lessened at the tails.
Histogram equalization can also separate pixels into distinct groups if
there are few output values over a wide range. This can have the visual
effect of a crude classification.
[Figure: Original histogram and the histogram after equalization.
Pixels at the peak are spread apart, so contrast is gained; pixels at
the tails are grouped together, so contrast is lost.]
When performing a histogram equalization, the pixel values of an image
(either data file values or brightness values) are reassigned to a certain
number of bins, which are numbered sets of pixels. The pixels are then
given new values based on the bins to which they are assigned.
The total number of pixels is divided by the number of bins, equaling the
number of pixels per bin, as shown in the following equation:

A = T / N

Where:
N = Number of bins
T = The total number of pixels in the image
A = The equalized number of pixels per bin

The pixels of each input value are assigned to bins, so that the number
of pixels in each bin is as close to A as possible. Consider the following
histogram:

[Histogram: data file values 0 through 9 on the X-axis, number of
pixels on the Y-axis, with counts 5, 5, 10, 15, 60, 60, 40, 30, 10, 5]

There are 240 pixels represented by this histogram. To equalize it to 10
bins, there would be:

240 pixels / 10 bins = 24 pixels per bin = A
The following equation is used to assign pixels to bins:

B_i = int [ ( Σ(k=1 to i-1) H_k + H_i / 2 ) / A ]

Where:
A = Equalized number of pixels per bin (see above)
H_i = The number of pixels with the value i (histogram)
int = Integer function (truncating real numbers to integer)
B_i = Bin number for pixels with value i

Source: Modified from Gonzalez and Wintz 1977

The 10 bins are rescaled to the range 0 to M. In this example, M = 9
because the input values ranged from 0 to 9, so that the equalized
histogram can be compared to the original. The output histogram of this
equalized image looks like the following illustration:

[Output histogram: output data file values 0 through 9 on the X-axis,
number of pixels on the Y-axis, with counts 20, 15, 60, 0, 0, 60, 0, 40,
30, and 15; the numbers inside the bars are the input data file values;
A = 24]

Effect on Contrast By comparing the original histogram of the example
data with the last one, you can see that the enhanced image gains
contrast in the peaks of the original histogram. For example, the input
range of 3 to 7 is stretched to the range 1 to 8. However, data values at
the tails of the original histogram are grouped together. Input values 0
through 2 all have the output value of 0. So, contrast among the tail
pixels, which usually make up the darkest and brightest regions of the
input image, is lost.
The resulting histogram is not exactly flat because pixels rarely group
together into bins with an equal number of pixels. Sets of pixels with the
same value are never split up to form equal bins.
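The bin assignment can be reproduced in a few lines of NumPy; this sketch (an illustration, not the Image Analysis implementation) recomputes the bins for the 240-pixel example:

    import numpy as np

    counts = np.array([5, 5, 10, 15, 60, 60, 40, 30, 10, 5])  # H_i, values 0-9
    A = counts.sum() / 10                     # T / N = 24 pixels per bin

    cum_before = np.cumsum(counts) - counts   # sum of H_k for k < i
    bins = ((cum_before + counts / 2) / A).astype(int)
    print(bins)   # [0 0 0 1 2 5 7 8 9 9]: values 0-2 merge, 3-7 spread to 1-8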
Performing Histogram
Equalization
To perform histogram equalization, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Radiometric
Enhancement, and then select Histogram Equalization to open the
Histogram Equalization dialog.
2. Select the file you want to use from the Input Image dropdown list, or
navigate to the directory where it is stored.
3. Type the number of bins in the Number of Bins field.
4. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
5. Click OK to close the Histogram Equalization dialog.
Using Histogram Equalization
The histogram equalization process works by redistributing pixel values
so that there are approximately the same number of pixels with each
value within a range.
Histogram equalization can also separate pixels into distinct groups if
there are few output values over a wide range. This process can have
the effect of a crude classification.
Histogram
Matching
Histogram matching is the process of determining a lookup table that
converts the histogram of one image so that it resembles the histogram
of another. Histogram matching is useful for matching data of the same
or adjacent scenes that were collected on separate days, or are slightly
different because of sun angle or atmospheric effects. This is especially
useful for mosaicking or change detection.
The two input images should have similar characteristics to achieve
good results with histogram matching:
The general shape of the histogram curves should be similar.
Relative dark and light features in the image should be the same.
For some applications, the spatial resolution of the data should be
the same.
The relative distributions of land covers should be about the same,
even when matching scenes that are not of the same area.
When matching the histograms, a lookup table is mathematically
derived, which serves as a function for converting one histogram to the
other as illustrated here.
Figure 25: Histogram Matching
Source histogram (A), mapped through the lookup table (B),
approximates model histogram (C). [Each panel plots frequency
against input data file values from 0 to 255: (A) + (B) = (C)]

Performing Histogram Matching To perform histogram matching, follow these steps:

1. Click the Image Analysis dropdown arrow, point to Radiometric
Enhancement, and then select Histogram Match to open the
Histogram Match dialog.
2. Select the file you want to use from the Input Image dropdown list, or
navigate to the directory where it is stored.
3. Select the file you want to use from the Match Image dropdown list, or
navigate to the directory where it is stored.
4. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
5. Click OK to close the Histogram Match dialog.
Using Histogram Matching
Histogram matching mathematically determines a lookup table that
converts the histogram of one image to resemble the histogram of
another, and is particularly useful for mosaicking images or change
detection.
Perform histogram matching when using matching data of the same or
adjacent scenes that were gathered on different days and have
differences due to the angle of the sun or atmospheric effects.
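A minimal sketch of the general technique (illustrative only, not the Image Analysis implementation): build the cumulative histograms of both images and, for each source value, look up the model value with the nearest cumulative frequency:

    import numpy as np

    def match_histogram(source, model):
        src_vals, src_counts = np.unique(source, return_counts=True)
        mod_vals, mod_counts = np.unique(model, return_counts=True)
        src_cdf = np.cumsum(src_counts) / source.size
        mod_cdf = np.cumsum(mod_counts) / model.size
        # The lookup table maps each source value to a model value
        # whose cumulative frequency is approximately the same.
        lut = np.interp(src_cdf, mod_cdf, mod_vals)
        return lut[np.searchsorted(src_vals, source)]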
Brightness
Inversion
The brightness inversion function produces images that have the
opposite contrast of the original image. Dark detail becomes light, and
light detail becomes dark.
Inverse emphasizes detail that might be lost in the darkness of low DN
pixels. This function applies the following algorithm:
DN_out = 1.0 if 0.0 < DN_in < 0.1
DN_out = 0.1 / DN_in if 0.1 ≤ DN_in < 1

Figure 26: Image before Brightness Inversion
Figure 27: Image after Brightness Inversion
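A minimal sketch of the inversion algorithm above (illustrative only), assuming input DN values already scaled to the 0 to 1 range:

    import numpy as np

    def brightness_inversion(band):
        dn = band.astype(float)
        # Values below 0.1 saturate at 1.0; the rest invert as 0.1 / DN.
        return np.where(dn < 0.1, 1.0, 0.1 / np.maximum(dn, 0.1))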
Applying Brightness
Inversion
To apply brightness inversion, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Radiometric
Enhancement, and then select Brightness Inversion to open the
Brightness Inversion dialog.
2. Select the file you want to use from the Input Image dropdown list, or
navigate to the directory where it is stored.
3. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
4. Click OK to close the Brightness Inversion dialog.
Applying Spectral Enhancement
Spectral enhancement enhances images by transforming the values of
each pixel on a multiband basis. All of the techniques in this chapter
require more than one band of data. You can use them to:
Extract new bands of data that are more interpretable to the eye
Apply mathematical transforms and algorithms
Display a wider variety of information in the three available color
guns
You can use the features of spectral enhancement to study patterns
that can occur with deforestation or crop rotation and to see images in
a more natural state. You can also use spectral enhancement to view
images in different ways, such as changing the bands in an image from
red, green, and blue (RGB) to intensity, hue, and saturation (IHS).
IN THIS CHAPTER
RGB to IHS
IHS to RGB
Vegetative Indices
Color IR to Natural Color
RGB to IHS The color monitors used for image display on image processing
systems have three color guns that correspond to RGB, the additive
primary colors. When displaying three bands of a multiband data set,
the viewed image is considered to be in RGB space.
However, it is possible to define an alternate color space that uses IHS
as the three positioned parameters (in lieu of RGB). This system is
advantageous in that it presents colors more closely as perceived by
the human eye:
Intensity Refers to the overall brightness of the scene (like PC-1)
and varies from 0 (black) to 1 (white).
Saturation Represents the purity of color and also varies linearly
from 0 to 1.
Hue Represents the color or dominant wavelength of the pixel. It
varies from 0 at the red midpoint through green and blue back to the
red midpoint at 360. It is a circular dimension. In the following
image, 0 to 255 is the selected range; it can be defined as any data
range. However, hue must vary from 0 to 360 to define the entire
sphere (Buchanan 1979).
Figure 28: Variance of Intensity and Hue in RGB to IHS
[Axes: Intensity, Hue, Saturation]
The following algorithm was used in the Image Analysis for ArcGIS
RGB to IHS transform (Conrac 1980):

r = (M - R) / (M - m)
g = (M - G) / (M - m)
b = (M - B) / (M - m)

Where:
R, G, B = Are each in the range of 0 to 1.0
r, g, b = Are each in the range of 0 to 1.0
M = Largest value, R, G, or B
m = Least value, R, G, or B

At least one of the r, g, or b values is 0, corresponding to the color with
the largest value, and at least one of the r, g, or b values is 1,
corresponding to the color with the least value.

The equation for calculating intensity in the range of 0 to 1.0 is:

I = (M + m) / 2

The equations for calculating saturation in the range of 0 to 1.0 are:

If M = m, S = 0
If I ≤ 0.5, S = (M - m) / (M + m)
If I > 0.5, S = (M - m) / (2 - M - m)
The equations for calculating hue in the range of 0 to 360 are:
If M = m, H = 0
If R = M, H = 60 (2 + b - g)
If G = M, H = 60 (4 + r - b)
If B = M, H = 60 (6 + g - r)
Where:
R, G, B = Are each in the range of 0 to 1.0
r, g, b = The normalized values calculated above
M = Largest value, R, G, or B
m = Least value, R, G, or B
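These equations translate directly into code. The following per-pixel sketch (an illustration, not the Image Analysis implementation) transforms one R, G, B triplet:

    def rgb_to_ihs(R, G, B):
        # R, G, B are each in the range 0 to 1.
        M, m = max(R, G, B), min(R, G, B)
        I = (M + m) / 2
        if M == m:                      # gray pixel: hue and saturation are 0
            return I, 0.0, 0.0
        S = (M - m) / (M + m) if I <= 0.5 else (M - m) / (2 - M - m)
        r = (M - R) / (M - m)           # normalized distances from the maximum
        g = (M - G) / (M - m)
        b = (M - B) / (M - m)
        if R == M:
            H = 60 * (2 + b - g)
        elif G == M:
            H = 60 * (4 + r - b)
        else:
            H = 60 * (6 + g - r)
        return I, H, S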
Converting RGB to IHS To convert RGB to IHS, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Spectral
Enhancement, and then select RGB to IHS to open the RGB to IHS
dialog.
2. Type the name of the input image in the Input Image field, or navigate
to the directory where it is stored.
3. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
4. Click OK to close the RGB to IHS dialog.
Using RGB to IHS
Using RGB to IHS applies an algorithm that transforms RGB values to
IHS values.
IHS to RGB IHS to RGB is intended as a complement to the standard
RGB to IHS transform. Depending on the dynamic range of the DN
values of the input image, it is possible that I or S or both occupy only
a part of the 0 to 1 range. In this model, a min-max stretch is applied to
either intensity (I), saturation (S), or both, so that they more fully utilize
the 0 to 1 value range. The values for hue (H), a circular dimension, are
0 to 360. After stretching, the full IHS image is retransformed back to
the original RGB space. Because the parameter hue is not modified,
and hue largely defines what we perceive as color, the resulting image
looks very much like the input image.
It is not essential that the input parameters (IHS) to this transform are
derived from an RGB to IHS transform. You can define I or S as other
parameters, set hue at 0 to 360, and then transform to RGB space. This
is a method of color coding other data sets.
In another approach (Daily 1983), H and I are replaced by low- and
high-frequency radar imagery. You can also replace I with radar
intensity before the IHS to RGB transform (Holcomb 1993). Chavez
evaluates the use of the IHS to RGB transform to resolution merge
Landsat TM with SPOT panchromatic imagery (Chavez 1991).
The algorithm used by Image Analysis for ArcGIS for the IHS to RGB
function is (Conrac 1980):
Given: H in the range of 0 to 360; I and S in the range of 0 to 1.0

If I ≤ 0.5, M = I (1 + S)
If I > 0.5, M = I + S - (I × S)
m = 2I - M

The equations for calculating R in the range of 0 to 1.0 are:

If H < 60, R = m + (M - m)(H / 60)
If 60 ≤ H < 180, R = M
If 180 ≤ H < 240, R = m + (M - m)((240 - H) / 60)
If 240 ≤ H ≤ 360, R = m
The equations for calculating G in the range of 0 to 1.0 are:

If H < 120, G = m
If 120 ≤ H < 180, G = m + (M - m)((H - 120) / 60)
If 180 ≤ H < 300, G = M
If 300 ≤ H ≤ 360, G = m + (M - m)((360 - H) / 60)

The equations for calculating B in the range of 0 to 1.0 are:

If H < 60, B = M
If 60 ≤ H < 120, B = m + (M - m)((120 - H) / 60)
If 120 ≤ H < 240, B = m
If 240 ≤ H < 300, B = m + (M - m)((H - 240) / 60)
If 300 ≤ H ≤ 360, B = M
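A per-pixel sketch of these equations (illustrative only; the optional min-max stretch of I and S is omitted):

    def ihs_to_rgb(I, H, S):
        # H is in the range 0 to 360; I and S are in the range 0 to 1.
        M = I * (1 + S) if I <= 0.5 else I + S - I * S
        m = 2 * I - M

        def ramp(x):                    # linear segment between m and M
            return m + (M - m) * (x / 60)

        if H < 60:    R = ramp(H)
        elif H < 180: R = M
        elif H < 240: R = ramp(240 - H)
        else:         R = m

        if H < 120:   G = m
        elif H < 180: G = ramp(H - 120)
        elif H < 300: G = M
        else:         G = ramp(360 - H)

        if H < 60:    B = M
        elif H < 120: B = ramp(120 - H)
        elif H < 240: B = m
        elif H < 300: B = ramp(H - 240)
        else:         B = M
        return R, G, B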
Converting IHS to RGB To convert IHS to RGB, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Spectral
Enhancement, and then select IHS to RGB to open the IHS to RGB
dialog.
2. Click the browse button for the Input Image field and navigate to the
directory where the input image is stored.
3. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
4. Click OK to close the IHS to RGB dialog.
Using IHS to RGB
Using IHS to RGB applies an algorithm that transforms IHS values to
RGB values.
Vegetative Indices Mapping vegetation is a common application of remotely sensed
imagery. To help you find vegetation quickly and easily, Image Analysis
for ArcGIS includes a vegetative indices feature.
Indices are used to create output images by mathematically combining
the DN values of different bands. These may be simplistic:

Band X - Band Y

Or more complex:

(Band X - Band Y) / (Band X + Band Y)

In many instances, the indices are ratios of band DN values:

Band X / Band Y
These ratio images are derived from the absorption/reflection spectra of
the material of interest. The absorption is based on the molecular bonds
in the (surface) material. Thus, the ratio often gives information on the
chemical composition of the target.
Applications The main applications of vegetative analysis are as follows:
Indices are used extensively in vegetation analysis to bring out
small differences between various vegetation classes. Often,
judiciously chosen indices can highlight and enhance differences
that you cannot observe in the display of the original color bands.
Indices are also used to minimize shadow effects in satellite and
aircraft multispectral images. You can generate black-and-white
images of individual indices, or a color combination of three ratios.
Examples The following are examples of indices that are preprogrammed in
Image Analysis for ArcGIS:
IR/R (infrared/red)
SQRT (IR/R)
Vegetation Index = IR - R
Normalized Difference Vegetation Index (NDVI) = (IR - R) / (IR + R)
Transformed NDVI (TNDVI) = SQRT [(IR - R) / (IR + R) + 0.5]
Source: Modified from Sabins 1987; Jensen 1996; Tucker 1979
The following table shows the infrared (IR) and red (R) band for some
common sensors (Tucker 1979, Jensen 1996):

Table 5: IR and R Bands of Common Sensors

Sensor         IR Band   R Band
Landsat MSS    4         2
SPOT XS        3         2
Landsat TM     4         3
NOAA AVHRR     2         1
Image Algebra Image algebra is a general term used to describe operations that
combine the pixels of two or more raster layers in mathematical
combinations. For example, the calculation:
(infrared band) - (red band)
DN_ir - DN_red
yields a simple, yet very useful, measure of the presence of vegetation.
Band ratios are also commonly used. These are derived from the
absorption spectra of the material of interest. The numerator is a
baseline of background absorption, and the denominator is an
absorption peak. NDVI is a combination of addition, subtraction, and
division.
NDVI = (IR - R) / (IR + R)
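A minimal NumPy sketch of NDVI (illustrative only); per Table 5, for Landsat TM the IR input would be band 4 and the R input band 3:

    import numpy as np

    def ndvi(ir, red):
        ir, red = ir.astype(float), red.astype(float)
        denom = ir + red
        denom = np.where(denom == 0, 1, denom)   # avoid division by zero
        return (ir - red) / denom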
Applying Vegetative
Indices
To apply vegetative indices, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Spectral
Enhancement, and then select Vegetative Indices to open the
Vegetative Indices dialog.
2. Click the browse button for the Input Image field and navigate to the
directory where the input image is stored.
3. Select an appropriate layer from the Near Infrared Band dropdown list.
4. Select an appropriate layer from the Visible Red Band dropdown list.
5. Select an appropriate index from the Desired Index dropdown list.
6. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
7. Click OK to close the Vegetative Indices dialog.
Color IR to Natural
Color
This function lets you simulate natural colors from the bands of data
from an infrared image so that the output is a fair approximation of a
natural color image. You cannot apply this feature to images having
only one band of data (for example, grayscale images). It's for use when
you have data only from the near infrared, visible red, and visible green
segments of the spectrum.
When an image is displayed in natural color, the bands are arranged to
approximate the most natural representation of the image in the real
world. Vegetation becomes green in color, and water becomes dark in
color. Certain bands of data are assigned to the red, green, and blue
color guns of your computer monitor to create natural color.
Figure 29: Infrared Image of a Golf Course
Figure 30: Natural Colors after Color IR to Natural Color
Changing Color IR to
Natural Color
To change Color IR to Natural Color, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Spectral
Enhancement, and then select Color IR to Natural Color to open the
Color IR to Natural Color dialog.
2. Click the browse button for the Input Image field and navigate to the
directory where the input image is stored.
3. Select the appropriate band from the Near Infrared Band dropdown list.
4. Select the appropriate band from the Visible Red Band dropdown list.
5. Select the appropriate band from the Visible Green Band dropdown list.
6. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
7. Click OK to close the Color IR to Natural Color dialog.
2
4
3
7
6
5
Performing GIS Analysis
A geographic information system (GIS) is a unique system designed to
enter, store, retrieve, manipulate, and analyze layers of geographic
data to produce interpretable information. A GIS should also create
reports and maps (Marble 1990). The GIS database might include
images, maps, statistical data, or any other data needed in a study. This
chapter is about using the different analysis functions in Image Analysis
for ArcGIS to better use the images, data, maps, and so on located in a
GIS.
The central purpose of a GIS is to turn geographic data into useful
information, the answers to real-life questions, such as:
How to redraw political districts in a growing metropolitan area?
How to monitor the influence of global climatic changes on the
Earth's resources?
What areas to protect to ensure the survival of endangered
species?
Although the term GIS is commonly used to describe software
packages, a true GIS includes knowledgeable staff, a training program,
budgets, marketing, hardware, data, and software (Walker and Miller
1990). You can use GIS technology in almost any geography-related
discipline, from landscape architecture to natural resource
management to transportation routing.
IN THIS CHAPTER
Information Versus Data
Neighborhood Analysis
Thematic Change
Summarize Areas
Recode
Information Versus
Data
Information, as opposed to data, is independently meaningful. It is
relevant to a particular problem or question:
Data The land cover at coordinate N875250, E757261 has a data
file value of 8.
Information Land cover with a value of 8 is on slopes too steep
for development.
You can enter data into a GIS and produce information. The information
you want to derive determines the type of data you must enter. For
example, if you are looking for a suitable refuge for bald eagles, zip
code data is probably not needed, while land cover data might be
useful.
For this reason, the first step in any GIS project is usually an
assessment of the scope and goals of the study. Once the project is
defined, you can begin the process of building the database. Although
software and data are commercially available, you must create a
custom database for the particular project and study area. Design the
database to meet the needs and objectives of the organization.
A major step in successful GIS implementation is analysis. In the
analysis phase, data layers are combined and manipulated to create
new layers and to extract meaningful information from them.
Once the database (layers and attribute data) is assembled, the layers
are analyzed and new information is extracted. Some information is
extracted by looking at the layers and visually comparing them to other
layers. However, you can retrieve new information by combining and
comparing layers.
Neighborhood
Analysis
Neighborhood analysis applies to any image processing technique that
takes surrounding pixels into consideration, such as convolution
filtering and scanning. This is similar to the convolution filtering
performed on continuous data. You can perform several types of
analyses, such as boundary, density, mean, and sum.
With a process similar to the convolution filtering of continuous raster
layers, you can also filter thematic raster layers. The GIS filtering
process is sometimes referred to as scanning, but is not to be confused
with data capture via a digital camera. Neighborhood analysis is based
on local or neighborhood characteristics of the data (Star and Estes
1990).
Every pixel is analyzed spatially, according to the pixels that surround
it. The number and the location of the surrounding pixels are
determined by a scanning window, which is defined by you. These
operations are known as focal operations.
Neighborhood analysis creates a new thematic layer. There are several
types of analyses that you can perform on each window of pixels, as
described below:
Density Produces the number of pixels that have the same class
value as the center (analyzed) pixel. It also measures homogeneity
(sameness) based upon the analyzed pixel. This is often useful in
assessing vegetation crown closure.
Diversity Produces the number of class values that are in the
window. Diversity is also a measure of heterogeneity (difference).
Majority Produces the class value that represents the majority of
the class values in the window. This option operates like a low-
frequency filter to clean up a salt-and-pepper layer.
Maximum Produces the greatest class value in the window. You
can use this to emphasize classes with the higher class values or to
eliminate linear features or boundaries.
Minimum Produces the least or smallest class value in the
window. You can use this option to emphasize classes with low-
class values.
Minority Produces the least common of the class values in the
window. You can use this option to identify the least common
classes or to highlight disconnected linear features.
Rank Produces the number of pixels in the scan window whose
value is less than the center pixel.
Sum Totals the class values. In a file where class values are
ranked, totaling lets you further rank pixels based on their proximity
to high-ranking pixels.
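As one illustration of these focal operations (a sketch, not the Image Analysis implementation), the Diversity function can be expressed with SciPy's generic_filter:

    import numpy as np
    from scipy.ndimage import generic_filter

    def diversity(class_image, size=3):
        # Count the distinct class values in each size x size window.
        return generic_filter(class_image, lambda w: len(np.unique(w)), size=size)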
Performing
Neighborhood Analysis
To perform neighborhood analysis, follow these steps:
1. Click the Image Analysis dropdown arrow, point to GIS Analysis, and
then select Neighborhood to open the Neighborhood Analysis dialog.
2. Click the browse button for the Input Image field and navigate to the
directory where the input image is stored.
3. Select the function you want to use from the Neighborhood Function
dropdown list.
4. Select Rectangle from the Neighborhood Shape dropdown list.
5. Select the size you want to use from the Neighborhood Definition -
Matrix Size dropdown list.
6. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
7. Click OK to close the Neighborhood Analysis dialog.
Thematic Change Thematic change identifies areas that have undergone change over
time. Typically, you use thematic change after performing a
classification of your data. By using the categorizations of Before
Theme and After Theme in the Thematic Change dialog, you can
quantify both the amount and the type of changes that take place over
time. Image Analysis for ArcGIS produces a thematic image that has all
the possible combinations of change.
Thematic change creates an output image from two input classified
raster files. The class values of the two input files are organized into a
matrix. The first input file specifies the columns of the matrix, and the
second one specifies the rows. Zero or background values are not
treated special in any way. The number of classes in the output file is
the product of the number of classes from the two input files. The
classes should be the same in each image and in the same order.
Figure 31: Input Image from 1973
Figure 32: Input Image from 1994
Figure 33: Thematic Change Result
Performing Thematic
Change
To perform thematic change, follow these steps:
1. Click the Image Analysis dropdown arrow, point to GIS Analysis, and
then select Thematic Change to open the Thematic Change dialog.
2. Click the browse button for the Before Theme field and navigate to the
directory where the before theme image is stored.
3. Click the browse button for the After Theme field and navigate to the
directory where the after theme image is stored.
4. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
5. Click OK to close the Thematic Change dialog.
Note: You must use the output from this exercise in the next exercise,
Summarize Areas on page 130. Please keep the image displayed in
your view if you are going on to that exercise.
The following illustration is an example of the previous image after
undergoing thematic change. In the table of contents, you see the
combination of classes from the before and after images.
Figure 34: Image Showing Changes between 1973 and 1994
Summarize Areas Image Analysis for ArcGIS also provides summarize areas as a method
of assessing change in thematic data. Once you complete the thematic
change analysis, you can use summarize areas to limit the analysis to
only a portion of the image and derive quantitative information about
that area.
Summarize areas works by using a feature theme to compile
information about that area in tabular format. Summarize areas
produces cross-tabulation statistics that compare class value areas
between two thematic files using a single thematic change image, and
includes a number of points in common, number of acres (or hectares
or square miles) in common, and percentages.
An example of using summarize areas is to assist a regional planning
office in preparing a study of urban change for certain counties within
the jurisdiction or even within one county or city. A file containing a
polygonal boundary of the area to inventory is analyzed against a file for
the same geographical area containing the land cover categories. The
summary report indicates the amount of urban change in that particular
area of a larger thematic change.
Applying Summarize
Areas
This exercise uses the output of the thematic change exercise in
Thematic Change on page 128 as the input. Display the thematic
change image in your view before starting this exercise.
To apply summarize areas, follow these steps:
1. Click the Image Analysis dropdown arrow, point to GIS Analysis, and
then select Summarize Areas to open the Summarize Areas dialog.
2. Select the vector theme you want to use from the Zone Theme
dropdown list, or navigate to the directory where it is stored.
3. Select the attribute you want to summarize from the Zone Attribute
dropdown list.
4. Select the class theme from the Class Theme dropdown list, or navigate
to the directory where it is stored. This is the thematic theme you
generated in Thematic Change on page 128.
5. Click the browse button for the Summarize Results Table field to specify
a name for the new summarize areas table that is created.
6. Click OK to close the Summarize Areas dialog.
When the process completes, the resulting table is added to ArcMap.
7. Click the Source tab in the ArcMap table of contents to see the new
table.
Recode

Recoding involves assigning new values to one or more classes of an existing file. Recoding is used to:
• Reduce the number of classes
• Combine classes
• Assign different class values to existing classes
• Write class name and color changes to the Attribute table
When an ordinal, ratio, or interval class numbering system is used, you
can use recoding to assign classes to appropriate values. Recoding is
often performed to make later steps easier. For example, in creating a
model that produces good, better, and best areas, it might be beneficial
to recode the input layers so all of the best classes have the highest
class values.
You can also use recode to save any changes made to the color
scheme or class names of a classified image to the Attribute table for
later use. Just saving an image does not record these changes.
Recoding an image involves two major steps:
1. First, you arrange the discrete classes into common groups.
2. Second, you perform the actual recoding process, which rewrites the Attribute table using the information from your grouping process.
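Under the hood, the recode step amounts to a lookup-table operation on class values. A minimal sketch (illustrative NumPy with a made-up grouping, not the actual recode code):

    import numpy as np

    classified = np.array([[1, 2, 3],
                           [4, 5, 1]])  # original class values

    # Lookup table: index = old class value, entry = new class value.
    # Here classes 1-3 collapse into group 1 and classes 4-5 into group 2.
    lut = np.array([0, 1, 1, 1, 2, 2])

    recoded = lut[classified]
    print(recoded)  # [[1 1 1], [2 2 1]]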
The three recoding methods described in the exercises that follow this
section are more accurately described as three methods of grouping
the classified image to get it ready for the recode process. These
methods are:
• Recoding by class name
• Recoding by symbology
• Recoding a previously grouped image
The following is a thematic image of South Carolina soil types before
recode by class name.
Figure 35: Thematic Image before Recode by Class Name
The following is a thematic image of South Carolina soil types after the
recode. The changed and grouped class names are listed in the table
of contents.
Figure 36: Thematic Image after Recode by Class Name
Recoding by Class Name

You must first group the classified image in the ArcMap table of contents and then perform the recode.
To recode by class name, follow these steps:
1. Click the Add Data button and add a classified image to your view.
2. Identify the classes you want to group in the table of contents.
3. Right-click the image name and select Properties.
The Layer Properties dialog opens.
4. Click the Symbology tab.
5. Click the Unique Values category.
6. Rename the classes in the Label column in the table that appears in the
middle of the tab so that all of the classes you group together have the
same class name.
7. Double-click the color bar in the Symbol column for each class and
change the colors to reflect the color scheme you want.
The image in your view should now look the way you want.
Note: The Attribute table is not updated with the new class names.
8. Click Apply, and then click OK to apply your changes and close the
Layer Properties dialog.
9. Click the Image Analysis dropdown arrow, point to GIS Analysis, and
then select Recode.
The Recode dialog opens.
10. Type the name of the output image in the Output Image field, or click
the browse button and navigate to the directory where you want the
output image stored and type a name.
11. Click OK to close the Recode dialog.
Recoding by Symbology

This process shows you how to recode by symbology. You see similarities with recoding by class name, but be aware of some different procedures.

To recode by symbology, follow these steps:
1. Click the Add Data button and display a classified image in your
view.
2. Identify the classes you want to group together.
3. Double-click the image name in the table of contents.
The Layer Properties dialog opens.
4. Click the Symbology tab.
5. Click Unique Values in the Show list on the left side of the tab to
expand the category.
6. Press the Ctrl key while clicking the first set of classes you want to
group together.
7. Right-click the selected classes and select Group Values.
8. Click in the Label column and type a new name for the class.
9. Follow steps 5 - 8 to group the rest of your classes.
10. Click Apply and OK to close the Layer Properties dialog.
11. Click the Image Analysis dropdown arrow, point to GIS Analysis, and
then select Recode to open the Recode dialog.
12. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
13. Click OK to close the Recode dialog.
Recoding with Previously Grouped Image

You may need to open an image that has been classified and grouped in another program such as ERDAS IMAGINE. These images might have more than one valid attribute column that you can use to perform the recode.
To recode using a previously grouped image, follow these steps:
1. Click the Add Data button and add the grouped image to your view.
2. Click the Image Analysis dropdown arrow, point to GIS Analysis, and
then select Recode to open the Recode dialog.
3. Click the browse button for the Input Image field to navigate to the
directory where your image is located and select the image.
4. From the Map Pixel Value through Field dropdown list, select the attribute category (it must be an integer field) that you want to use to recode the image.
5. Type the name of the output file in the Output Image field, or click the
browse button and navigate to the directory where you want the output
image stored and type a name.
6. Click OK to close the Recode dialog.
The following images depict soil data that was previously grouped in
ERDAS IMAGINE.
Figure 37: Soil Data Image before Recode
Figure 38: Soil Data Image after Recode
Using Utilities
At the core of Image Analysis for ArcGIS is its ability to interpret and
manipulate your data. The Utilities section of Image Analysis for ArcGIS
provides a number of features for you to use in this capacity. These
features let you alter your images to see differences, set new
parameters, create images, or change the data type of your image.
IN THIS CHAPTER
Image Difference
Layer Stack
Rescale Image
Image Difference

Image difference gives you the ability to conveniently perform change
detection on aspects of an area by comparing two images of the same
place from different times.
The Image Difference tool is particularly useful in plotting environmental
changes such as urban sprawl and deforestation or the destruction
caused by a wildfire or tree disease. It is also a handy tool to use in
determining crop rotation or to identify a new neighborhood that needs
to be added to an existing GIS.
With image difference, you can highlight specific areas of change. The
following describes two images generated from image-to-image
comparison. One is a grayscale continuous image, and the other is a
five-class thematic image.
• Image Difference Image: This is the first image. It is a grayscale
image composed of single-band continuous data. This image is
created by subtracting the before image from the after image.
Because image difference calculates change in brightness values
over time, the difference image reflects that change using a
grayscale image. Brighter areas have increased in reflectance. This
might mean an area cleared of forest. Dark areas have decreased
in reflectance. This might mean an area with more vegetation, or an
area that was dry and is now wet.
• Highlight Change Image: This is the second image. It is a
thematic image that divides the changes into five categories:
Decreased, Some Decrease, Unchanged, Some Increase, and
Increased. Decreased represents areas of negative (darker)
change greater than the threshold for change and is red in color.
Increased shows areas of positive (brighter) change greater than
the threshold and is green in color. Other areas of positive and
negative change less than the thresholds and areas of no change
are transparent. You can change the colors for your application.
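In spirit, the two outputs come from a subtraction followed by a threshold test. The sketch below (hypothetical NumPy data and an assumed As Value threshold of 30; the tool's own defaults may differ) produces both a difference image and a five-category highlight image:

    import numpy as np

    before = np.array([[100, 120], [90, 200]], dtype=float)
    after = np.array([[150, 118], [40, 205]], dtype=float)

    # Image Difference: after minus before; brighter = increased reflectance.
    diff = after - before

    # Highlight Change: bin the difference into the five categories.
    thresh = 30
    highlight = np.select(
        [diff <= -thresh, diff < 0, diff == 0, diff <= thresh],
        ["Decreased", "Some Decrease", "Unchanged", "Some Increase"],
        default="Increased",
    )
    print(diff, highlight, sep="\n")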
Figure 39: Image Difference File
Figure 40: Highlight Change File
Using Image Difference

To use image difference, follow these steps:
1. Add the two files you want to compare to your view.
2. Click the Image Analysis dropdown arrow, point to Utilities, and then
select Image Difference to open the Image Difference dialog.
3. Click the browse button for the Before Theme field and navigate to the
directory where the file is stored.
4. Select a layer from the Before Layer dropdown list.
5. Select the file you want to use from the After Theme dropdown list, or
navigate to the directory where it is stored.
6. Click the browse button for the After Layer field and navigate to the
directory where the layer you want to use is stored.
7. Click either the As Percent or As Value button in the Highlight
Changes box.
8. Type values in the Increases More Than and Decreases More Than
fields.
9. Click a color bar and select the color you want to represent the
increases and decreases.
10. Click the browse button for the Image Difference File field and navigate
to the directory where you want the output image stored.
11. Click the browse button for the Highlight Change File field and navigate
to the directory where you want it stored.
12. Click OK to close the Image Difference dialog.
Layer Stack

Layer stack lets you stack layers from different images in any order to
form a single theme. It is useful for combining different types of imagery
for analysis such as multispectral and radar data. For example, if you
stack three single-band grayscale images, you finish with one three-
band image. In general, stacking images is most useful for combining
grayscale single-band images into multiband images.
There are several applications of layer stack, such as change
visualization, combining and viewing multiple resolution data, and
viewing disparate data types. It is particularly useful if you receive a
multispectral dataset with each of the individual bands in separate files.
You can also use layer stack to analyze datasets taken during different
seasons when different sets show different stages for vegetation in an
area.
An example of a multispectral dataset with individual bands in separate
files is Landsat TM data. Layer stack quickly consolidates the bands of
data into one file.
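As a mental model, stacking three single-band grayscale arrays into one three-band image is just a stack along a new band axis. A sketch with hypothetical NumPy arrays (illustration only, not the tool's code):

    import numpy as np

    # Three hypothetical single-band grayscale images of the same size.
    band_1 = np.zeros((100, 100), dtype=np.uint8)
    band_2 = np.full((100, 100), 128, dtype=np.uint8)
    band_3 = np.full((100, 100), 255, dtype=np.uint8)

    # Stack them in the desired order to form one three-band image.
    stacked = np.stack([band_1, band_2, band_3], axis=0)
    print(stacked.shape)  # (3, 100, 100): bands, rows, columns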
The following image is an example of a layer stack output. The files used are from the Amazon; the red and blue bands are from one image, while the green band is from the other. Bands 1 and 3 are taken from the Amazon LBAND image, and the remaining layers are taken from Amazon TM.
Figure 41: Stacked Image
Using Layer Stack

To use layer stack, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Utilities, and then
select Layer Stack to open the Layer Stack dialog.
2. Click the browse button for the Input Images field and navigate to a
directory where the input images are stored.
3. Check the check boxes for the layers of the input images you want to
use.
4. Click the up and down arrows to the right of the Selected Layers box if
you want to reorder the layers.
5. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
6. Click OK to close the Layer Stack dialog.
Rescale Image

The Rescale Image utility lets you rescale the pixel values of an input image into a new user-specified range for the output image, and it lets you change the data type of the output image. For example, you can rescale a 16-bit image with an original min-max of 0-65535 into an 8-bit image of 0-255, and vice versa. You can access the Rescale tool by selecting Utilities/Rescale Image from the Image Analysis dropdown list.
During conversion, the Rescale tool recomputes statistics for the input
image and scales the minimum and maximum values obtained into the
specified output values. A user-specified NoData value is also
assigned. If a NoData value is not required, N/A is specified for the
NoData value in the user interface (this is the same approach defined
for the ImageInfo tool).
To use the Rescale tool, select an input image, and then select the output data type. This sets the New Min and New Max values to the minimum and maximum values appropriate for the selected data type. Define the output file name and click OK.
Supported output data types are:
• Unsigned 1 bit
• Unsigned 2 bit
• Unsigned 4 bit
• Unsigned 8 bit
• Signed 8 bit
• Unsigned 16 bit
• Signed 16 bit
• Unsigned 32 bit
• Signed 32 bit
Optionally, you can select a NoData value. However, if you do not want to set a NoData value, leave it as N/A or blank, in which case any previously defined NoData value is not transferred to the output image.
A typical use for the Rescale tool is to rescale an unsigned 8-bit dataset
with valid values ranging from 0-255 into a range of 1-255, and at the
same time, set the NoData value to 0. This provides a means of
allocating a NoData value that does not interfere with valid data values.
One disadvantage of this technique for setting NoData is that pixel
values are altered, which might not be good for certain applications
such as classification.
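The underlying arithmetic is a linear min-max stretch. The sketch below (hypothetical NumPy input, mirroring the 1-255-with-NoData-0 example above) shows the computation, including why valid pixel values get nudged:

    import numpy as np

    img = np.array([[0, 10, 255],
                    [128, 64, 255]], dtype=np.uint8)

    old_min, old_max = int(img.min()), int(img.max())
    new_min, new_max = 1, 255  # reserve 0 as the NoData value

    # Linearly map old_min..old_max into new_min..new_max.
    scaled = (img.astype(float) - old_min) / (old_max - old_min)
    out = np.round(scaled * (new_max - new_min) + new_min).astype(np.uint8)
    nodata = 0  # 0 no longer collides with valid data values
    print(out)  # note that valid values are altered, e.g. 0 -> 1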
Using Rescale Image

To use rescale image, follow these steps:
1. Add the image you want to rescale to your view.
2. Click the Image Analysis dropdown arrow, point to Utilities, and then
select Rescale Image to open the Rescale Image dialog.
3. Select an input image from the Input Image dropdown list, or navigate
to the directory where it is stored.
4. Select an output type from the Output Type dropdown list.
5. Type an output minimum and maximum in the Output Min and Output
Max fields.
6. Type a value in the NoData Value field if you want to specify a NoData
value.
7. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
8. Click OK to close the Rescale Image dialog.
Understanding Classification
Multispectral classification is the process of sorting pixels into a finite
number of individual classes, or categories of data, based on their data
file values. If a pixel satisfies a certain set of criteria, the pixel is
assigned to the class that corresponds to those criteria.
Depending on the type of information you want to extract from the
original data, classes can be associated with known features on the
ground or represent areas that look different to the computer. An
example of a classified image is a land cover map that shows
vegetation, bare land, pasture, urban, and so on.
This chapter covers the two ways to classify pixels into different
categories:
• Unsupervised Classification
• Supervised Classification
Supervised classification is more closely controlled by you than
unsupervised classification.
IN THIS CHAPTER
The Classification Process
Classification Tips
Unsupervised Classification
Supervised Classification
Classification Decision Rules
The Classification Process

Pattern recognition is the science, and art, of finding meaningful patterns in data, which you can extract through classification. By
spatially and spectrally enhancing an image, pattern recognition is
performed with the human eye; the human brain automatically sorts
certain textures and colors into categories.
In a computer system, spectral pattern recognition is more scientific.
Statistics are derived from the spectral characteristics of all pixels in an
image. However, in supervised classification, the statistics are derived
from the training samples, and not the entire image. After the statistics
are derived, pixels are sorted based on mathematical criteria. The
classification process breaks down into two parts: training and
classifying (using a decision rule).
Training

First, the computer system must be trained to recognize patterns in the
data. Training is the process of defining the criteria by which these
patterns are recognized (Hord 1982). You can perform training with
either a supervised or an unsupervised method, as explained in the
sections that follow.
Unsupervised Training

Unsupervised training is more computer-automated. It lets you specify
some parameters that the computer uses to uncover statistical patterns
that are inherent in the data. These patterns do not necessarily
correspond to directly meaningful characteristics of the scene, such as
contiguous, easily recognized areas of a particular soil type or land use.
They are basically clusters of pixels with similar spectral characteristics.
In some cases, it might be more important to identify groups of pixels
with similar spectral characteristics than it is to sort pixels into
recognizable categories.
Unsupervised training is dependent upon the data itself for the definition
of classes. This method is usually used when less is known about the
data before classification. It is then the analyst's responsibility, after
classification, to attach meaning to the resulting classes (Jensen 1996).
Unsupervised classification is useful only if you can appropriately
interpret the classes.
Supervised Training

Supervised training is closely controlled by the analyst. In this process, you select pixels that represent patterns or land cover features that you
recognize, or that you can identify with help from other sources, such as
aerial photos, ground truth data, or maps. Knowledge of the data, and
of the classes you want, is required before classification.
By identifying patterns and building a signature file, you can instruct the
computer system to identify pixels with similar characteristics. If the
classification is accurate, the resulting classes represent the categories
in the data that you originally identified.
Signatures

The result of training is a set of signatures that defines a training sample
or cluster. Each signature corresponds to a class, and is used with a
decision rule (explained below) to assign the pixels in the image file to
a class. Signatures contain both parametric class definitions (mean and
covariance) and non-parametric class definitions (parallelepiped
boundaries that are the per band minima and maxima).
A parametric signature is based on statistical parameters (for example,
mean and covariance matrix) of the pixels that are in the training sample
or cluster. Supervised and unsupervised training can generate
parametric signatures. You can use a set of parametric signatures to
train a statistically based classifier (that is, maximum likelihood) to
define the classes.
Decision Rule

After the signatures are defined, the pixels of the image are sorted into
classes based on the signatures by use of a classification decision rule.
The decision rule is a mathematical algorithm that, using data contained
in the signature, performs the actual sorting of pixels into distinct class
values.
Parametric Decision Rule

A parametric decision rule is trained by the parametric signatures.
These signatures are defined by the mean vector and covariance matrix
for the data file values of the pixels in the signatures. When a parametric
decision rule is used, every pixel is assigned to a class since the
parametric decision space is continuous (Kloer 1994). There are three
parametric decision rules offered:
• Minimum Distance
• Mahalanobis Distance
• Maximum Likelihood
Nonparametric Decision Rule

When a nonparametric rule is set, the pixel is tested against all
signatures with nonparametric definitions. This rule results in the
following conditions:
• If the nonparametric test results in one unique class, the pixel is assigned to that class.
• If the nonparametric test results in zero classes (for example, the pixel lies outside all the nonparametric decision boundaries), the pixel is assigned to a class called Unclassified.
Parallelepiped is the only nonparametric decision rule in Image
Analysis for ArcGIS.
Classification Tips

This section contains general classification information that can help you in processing data.
Classification Scheme

Usually, classification is performed with a set of target classes in mind.
Such a set is called a classification scheme (or classification system).
The purpose of such a scheme is to provide a framework for organizing
and categorizing the information that can be extracted from the data
(Jensen 1983). The proper classification scheme includes classes that
are both important to the study and discernible from the data on hand.
Most schemes have a hierarchical structure, which can describe a
study area in several levels of detail.
A number of classification schemes have been developed by specialists
who inventoried a geographic region. Some references for
professionally developed schemes are listed below:
Anderson, J. R., et al. 1976. A Land Use and Land Cover
Classification System for Use with Remote Sensor Data. U.S.
Geological Survey Professional Paper 964.
Cowardin, Lewis M., et al. 1979. Classification of Wetlands and
Deepwater Habitats of the United States. Washington, D.C.: U.S. Fish
and Wildlife Service.
Florida Topographic Bureau, Thematic Mapping Section. 1985.
Florida Land Use, Cover and Forms Classification System. Florida
Department of Transportation, Procedure No. 550-010-001-a.
Michigan Land Use Classification and Reference Committee. 1975.
Michigan Land Cover/Use Classification System. Lansing, Michigan:
State of Michigan Office of Land Use.
Other states or government agencies may also have specialized land
use or cover studies.
It is recommended that you begin the classification process by defining
a classification scheme for the application using previously developed
schemes like those above as a general framework.
Supervised versus Unsupervised Classification

In supervised training, it is important to have in mind the set of classes you want, and then to create the appropriate signatures from the data. You
must also have some way of recognizing pixels that represent the
classes you want to extract.
Supervised classification is usually appropriate when you are
identifying relatively few classes, when there are selected training sites
you can verify with ground truth data, or when you can identify distinct,
homogeneous regions that represent each class. In Image Analysis for
ArcGIS, choose supervised classification if you need to correctly
classify small areas with actual representation.
If you want the classes determined by spectral distinctions inherent in
the data so that you can define the classes later, the application is better
suited to unsupervised classification. This lets you define many classes
easily, and identify classes that are not in contiguous, easily recognized
regions.
If you have areas that have a value of zero, and you do not classify them
as NoData (see Applying Data Tools on page 51), they are
assigned to the first class when performing unsupervised classification.
You can assign a specific class by taking a training sample when
performing supervised classification.
Classifying Enhanced Data

For many specialized applications, classifying data that has been spectrally merged or enhanced (with principal components, image algebra, or other transformations) can produce very specific and meaningful results. However, without understanding the data and the enhancements used, it is recommended that you classify only the original, remotely sensed data.
Limiting Dimensions

Although Image Analysis for ArcGIS lets you use an unlimited number
of layers of data for one classification, it is usually wise to reduce the
dimensionality of the data as much as possible. Often, certain layers of
data are redundant or extraneous to the task at hand. Unnecessary
data takes up valuable disk space, and causes the computer system to
perform more arduous calculations, which slows down processing.
Unsupervised Classification
Unsupervised classification requires only minimal initial input from you.
However, you must interpret the classes that are created by the
unsupervised classification algorithm. Unsupervised classification is also
called clustering, because it is based on the natural groupings of pixels
in image data when they are plotted in feature space.
If you need to classify small areas with small representation, use
supervised classification. Due to the skip factor of 8 used by the
unsupervised classification signature collection, small areas such as
wetlands, small urban areas, or grasses can be wrongly classified on
rural data sets.
Clusters

Clusters are defined with a clustering algorithm, which often uses all or
many of the pixels in the input data file for its analysis. The clustering
algorithm has no regard for the contiguity of the pixels that define each
cluster.
The Iterative Self-Organizing Data Analysis Technique (ISODATA)
(Tou and Gonzalez 1974) clustering method uses spectral distance as
in the sequential method. But it iteratively classifies the pixels, redefines
the criteria for each class, and classifies again, so that the spectral
distance patterns in the data gradually emerge.
ISODATA Clustering

ISODATA is iterative in that it repeatedly performs an entire
classification (producing a thematic raster layer) and recalculates
statistics. Self-organizing refers to the way in which it locates clusters
with minimum user input.
The ISODATA method uses minimum spectral distance to assign a
cluster for each candidate pixel. The process begins with a specified
number of arbitrary cluster means or the means of existing signatures.
Then, it processes repetitively, so that those means shift to the means
of the clusters in the data.
Because the ISODATA method is iterative, it is not biased to the top of
the data file, as are the one-pass clustering algorithms.
Initial Cluster Means

On the first iteration of the ISODATA algorithm, you can arbitrarily
determine the means of N clusters. After each iteration, a new mean for
each cluster is calculated based on the actual spectral locations of the
pixels in the cluster instead of the initial arbitrary calculation. These new
means are used for defining clusters in the next iteration. The process
continues until there is little change between iterations (Swain 1973).
The initial cluster means are distributed in feature space along a vector that runs between the point at spectral coordinates (μ1 − σ1, μ2 − σ2, μ3 − σ3, ... μn − σn) and the point at (μ1 + σ1, μ2 + σ2, μ3 + σ3, ... μn + σn), where μi and σi are the mean and standard deviation of band i. Such a vector in two dimensions is illustrated below. The initial cluster means are evenly distributed between (μA − σA, μB − σB) and (μA + σA, μB + σB).
Pixel Analysis

Pixels are analyzed beginning with the upper-left corner of the image
and going left to right, block-by-block.
The spectral distance between the candidate pixel and each cluster
mean is calculated. The pixel is assigned to the cluster whose mean is
the closest. The ISODATA function creates an output image file with a
thematic raster layer as a result of the clustering. At the end of each
iteration, an image file shows the assignments of the pixels to the
clusters.
ISODATA Arbitrary Clusters: five arbitrary cluster means in two-dimensional spectral space, plotted as Band A data file values against Band B data file values.
Considering the regular, arbitrary assignment of the initial cluster
means, the first iteration of the ISODATA algorithm always gives results
similar to those in this illustration.
For the second iteration, the means of all clusters are recalculated, causing them to shift in feature space. The entire process is repeated: each candidate pixel is compared to the new cluster means and assigned to the closest cluster mean.
Percentage Unchanged

After each iteration, the normalized percentage of pixels whose
assignments are unchanged since the last iteration displays in the
dialog. When this number reaches T (the convergence threshold), the
program terminates.
(Illustrations: the five clusters assigned after the first iteration, and the shifted cluster means after later iterations, plotted as Band A data file values against Band B data file values.)
Performing Unsupervised Classification

To perform unsupervised classification, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Classification, and
then select Unsupervised/Categorize to open the Unsupervised
Classification dialog.
2. Click the browse button for the Input Image field and navigate to the
directory where the input file is stored.
3. Type the number of classes you want in the Desired Number of Classes
field.
4. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
5. Click OK to close the Unsupervised Classification dialog.
Supervised Classification

Supervised classification requires a priori (already known) information about the data, such as:
• What type of classes need to be extracted? Soil type? Land use? Vegetation?
• What classes are most likely to be present in the data? That is, which types of land cover, soil, or vegetation (or whatever) are represented by the data?
In supervised training, you rely on your own pattern recognition skills
and a priori knowledge of the data to help the system determine the
statistical criteria (signatures) for data classification. You should know
some information, either spatial or spectral, about the pixels that you want to classify in order to select reliable samples.
The location of a specific characteristic, such as a land cover type, may
be known through ground truthing. Ground truthing refers to the
acquisition of knowledge about the study area from field work, analysis
of aerial photography, personal experience, and so on. Ground truth
data is considered to be the most accurate (true) data available about
the area of study. It should be collected at the same time as the
remotely sensed data, so that the data corresponds as much as
possible (Star and Estes 1990). However, some ground data may not
be very accurate due to a number of errors and inaccuracies.
Performing Supervised Classification

To perform supervised classification, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Classification, and
then select Supervised to open the Supervised Classification dialog.
2. Click the browse button for the Input Image field and navigate to the
directory where the file is stored.
3. Click the browse button for the Signature Features field and navigate to
the directory where the file is stored.
4. Select the field that contains the class names from the Class Name
Field dropdown list.
5. Click either the All Features or Selected Features button to specify
which features to use during classification.
6. Select the rule you want to use from the Classification Rule dropdown
list.
Note: For more information about each option, see Classification
Decision Rules on page 157.
7. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
8. Click OK to close the Supervised Classification dialog.
Classification Decision Rules

Once you create and evaluate a set of reliable signatures, the next step
is to perform a classification of the data. Each pixel is analyzed
independently. The measurement vector for each pixel is compared to
each signature according to a decision rule or algorithm. Pixels that
pass the criteria that are established by the decision rule are then
assigned to the class for that signature. Image Analysis for ArcGIS lets
you classify the data parametrically with statistical representation.
Parametric Rules

Image Analysis for ArcGIS provides these commonly used decision rules for parametric signatures:
• Minimum Distance
• Mahalanobis Distance
• Maximum Likelihood (with Bayesian variation)
Nonparametric Rule

Image Analysis for ArcGIS provides only one decision rule for nonparametric signatures:
• Parallelepiped
Minimum Distance

The minimum distance decision rule (also called spectral distance)
calculates the spectral distance between the measurement vector for
the candidate pixel and the mean vector for each signature.
In this illustration, spectral distance is illustrated by the lines from the
candidate pixel to the means of the three signatures. The candidate
pixel is assigned to the class with the closest mean.
(The illustration plots the candidate pixel and the means of three signatures, (μA1, μB1), (μA2, μB2), and (μA3, μB3), against Band A and Band B data file values.)
The equation for classifying by spectral distance is based on the equation for Euclidean distance:

SD_{xyc} = \sqrt{ \sum_{i=1}^{n} ( \mu_{ci} - X_{xyi} )^2 }

Where:
n = number of bands (dimensions)
i = a particular band
c = a particular class
X_{xyi} = data file value of pixel x,y in band i
\mu_{ci} = mean of data file values in band i for the sample for class c
SD_{xyc} = spectral distance from pixel x,y to the mean of class c

Source: Swain and Davis 1978

When spectral distance is computed for all possible values of c (all possible classes), the class of the candidate pixel is assigned to the class for which SD is the lowest.
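Expressed in code, the rule computes SD for every class and takes the smallest. A sketch with made-up two-band signature means (NumPy, illustration only):

    import numpy as np

    # Class means per band: 3 signatures x 2 bands (hypothetical values).
    class_means = np.array([[40.0, 90.0],
                            [120.0, 60.0],
                            [200.0, 180.0]])
    pixel = np.array([115.0, 70.0])  # measurement vector of candidate pixel

    # Euclidean spectral distance from the pixel to each class mean.
    sd = np.sqrt(((class_means - pixel) ** 2).sum(axis=1))
    assigned = int(sd.argmin())
    print(sd, "-> class", assigned)  # class 1 is closest here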
Maximum Likelihood

The Maximum Likelihood decision rule is based on the probability that
a pixel belongs to a particular class. The basic equation assumes that
these probabilities are equal for all classes, and that the input bands
have normal distributions.
Note: The Maximum Likelihood algorithm assumes that the histograms
of the bands of data have normal distributions. If this is not the case, you
might have better results with the minimum distance decision rule.
The equation for the Maximum Likelihood/Bayesian classifier is as follows:

D = \ln(a_c) - 0.5 \ln( |Cov_c| ) - 0.5 (X - M_c)^T Cov_c^{-1} (X - M_c)

Where:
D = weighted distance (likelihood)
c = a particular class
X = the measurement vector of the candidate pixel
M_c = the mean vector of the sample of class c
a_c = percent probability that any candidate pixel is a member of class c (defaults to 1.0, or is entered from a priori data)
Cov_c = the covariance matrix of the pixels in the sample of class c
|Cov_c| = determinant of Cov_c (matrix algebra)
Cov_c^{-1} = inverse of Cov_c (matrix algebra)
ln = natural logarithm function
T = transposition function (matrix algebra)
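Evaluating the discriminant is routine matrix arithmetic. A sketch for one class (hypothetical two-band statistics; in practice D is computed per class, and the pixel goes to the class whose likelihood D is largest):

    import numpy as np

    X = np.array([110.0, 75.0])   # candidate pixel measurement vector
    M = np.array([120.0, 60.0])   # class mean vector
    cov = np.array([[25.0, 5.0],  # class covariance matrix
                    [5.0, 16.0]])
    a_c = 1.0                     # a priori probability (defaults to 1.0)

    d = X - M
    # D = ln(a_c) - 0.5 ln|Cov| - 0.5 (X - M)^T Cov^-1 (X - M)
    D = (np.log(a_c)
         - 0.5 * np.log(np.linalg.det(cov))
         - 0.5 * d @ np.linalg.inv(cov) @ d)
    print(D)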
Mahalanobis Distance

Mahalanobis distance is similar to minimum distance, except that the covariance matrix is used in the equation. Variance and covariance are figured in so that clusters that are highly varied lead to similarly varied classes, and vice versa. For example, when classifying urban areas (typically a class whose pixels vary widely), correctly classified pixels may be farther from the mean than those of a class for water, which is usually not a highly varied class (Swain and Davis 1978).
Note: The Mahalanobis distance algorithm assumes that the
histograms of the bands have normal distributions. If this is not the
case, you might have better results with the parallelepiped or minimum
distance decision rule, or by performing a first-pass parallelepiped
classification.
The equation for the Mahalanobis distance classifier is as follows:

D = (X - M_c)^T Cov_c^{-1} (X - M_c)

Where:
D = Mahalanobis distance
c = a particular class
X = the measurement vector of the candidate pixel
M_c = the mean vector of the signature of class c
Cov_c = the covariance matrix of the pixels in the signature of class c
Cov_c^{-1} = inverse of Cov_c
T = transposition function

The pixel is assigned to the class, c, for which D is the lowest.
Parallelepiped

Image Analysis for ArcGIS provides the Parallelepiped decision rule as its nonparametric decision rule. In the Parallelepiped decision rule, the data file values of the candidate pixel are compared to upper and lower limits, which are the minimum and maximum data file values of each band in the signature.

There are high and low limits for every signature in every band. When a pixel's data file values are between the limits for every band in a signature, the pixel is assigned to that signature's class. If a pixel falls into more than one class, the first class is the one assigned. If a pixel falls into no class boundaries, it is labeled unclassified.
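The box test is easy to express directly. A sketch with hypothetical per-band limits for two signatures (NumPy, illustration only):

    import numpy as np

    # Per-band minima and maxima for each signature: shape (classes, bands).
    low = np.array([[10.0, 40.0], [60.0, 20.0]])
    high = np.array([[50.0, 90.0], [140.0, 55.0]])
    pixel = np.array([70.0, 45.0])

    inside = np.all((pixel >= low) & (pixel <= high), axis=1)
    # First matching class wins; no match means unclassified.
    assigned = int(inside.argmax()) if inside.any() else None
    print(inside, "->", assigned)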
Using Conversion
The conversion feature gives you the ability to convert shapefiles to
raster images and raster images to shapefiles. This tool is very helpful
when you need to isolate or highlight certain parts of a raster image or
when you have a shapefile and you need to view it as a raster image.
Possible applications include viewing deforestation patterns, urban
sprawl, and shore erosion.
The Image Info tool that is discussed in Applying Data Tools on
page 51 is also an important part of raster/feature conversion. The
ability to assign certain pixel values as NoData is very helpful when
converting images.
IN THIS CHAPTER
Conversion
Converting Raster to Features
Converting Features to Raster
Conversion

Always be aware of how the raster dataset represents features when
converting points, polygons, or polylines to a raster, and vice versa.
There is a trade-off when working with a cell-based system. Although
points don't have area, cells do, even if points are represented by a
single cell. The smaller the cell size, the smaller the area, and thus a
closer representation of the point feature.
Points with area have an accuracy of plus or minus half the cell size. For
many users, having all data types in the same format and being able to
use them interchangeably in the same language is more important than
a loss of accuracy.
Linear data is represented by a polyline that is also comprised of cells.
Therefore, it has area although, by definition, lines do not. Because of
this, the accuracy of representation varies according to the scale of the
data and the resolution of the raster dataset.
With polygonal or areal data, problems can occur from trying to
represent smooth polygon boundaries with square cells. The accuracy
of the representation is dependent upon the scale of the data and the
size of the cell. The finer the cell resolution and the greater the number
of cells that represent small areas, the more accurate the
representation.
Converting Raster to Features

During a conversion of a raster representing polygonal features to a
polygonal vector file or feature dataset, the polygons are built from
groups of contiguous cells having the same cell values. Arcs are
created from cell borders in the raster. Contiguous cells with the same
value are grouped together to form polygons. Cells that are NoData in
the input raster do not become features in the output polygon feature.
When a raster that represents linear features is converted to a polyline
vector file or feature dataset, a polyline is created from each cell in the
input raster, passing through the center of each cell. Cells that are
NoData in the input raster do not become features in the output polyline
feature.
When you convert a raster representing point features to point vector
file or feature dataset, a point is created in the output for each cell of the
input raster. Each point is positioned at the center of the cell it
represents. NoData cells are not transformed into points.
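The center-of-cell placement follows directly from the raster's georeferencing. A sketch of the arithmetic (hypothetical upper-left origin and cell size):

    # Hypothetical georeferencing: upper-left corner and cell size in map units.
    ulx, uly = 500000.0, 4200000.0
    cell = 30.0

    def cell_center(row, col):
        """Map coordinates of the center of the cell at (row, col)."""
        x = ulx + (col + 0.5) * cell
        y = uly - (row + 0.5) * cell  # y decreases as rows go down
        return x, y

    print(cell_center(0, 0))  # (500015.0, 4199985.0)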
When you choose Convert Raster to Features, the Raster to Features dialog lets you choose a field from the image to use in the conversion. You are also given the choice of an output geometry type, so you can specify whether the feature is a point, a polygon, or a polyline according to the field and data you're using. You can specify Generalize Lines to smooth out ragged or sharp edges in the new feature file. Note that regardless of which field you select, the category is not populated in the Attribute table after conversion.
The images below show a raster image before conversion and a raster
image after conversion to a shapefile using Value as the field.
Figure 42: Raster Image before Conversion
Figure 43: Raster Image after Conversion
Performing Raster to Features Conversion

To perform raster to features conversion, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Convert, and then
select Convert Raster to Features to open the Raster to Features
dialog.
2. Click the browse button for the Input Raster field and navigate to the
directory where the file is stored.
3. Select a field to use from the Field dropdown list.
4. Select Point, Polygon, or Polyline from the Output Geometry Type
dropdown list.
5. Check the Generalize Lines check box if you want to smooth out sharp
edges in the image.
6. Type the file name of the shapefile in the Output Features field, or
navigate to the directory where you want it stored.
7. Click OK to close the Raster to Features dialog.
Converting Features to Raster

You can convert polygons, polylines, or points from any source file to a
raster. You can convert features using both string and numeric fields.
Each unique string in a string field is assigned a unique value to the
output raster. A field is added to the table of the output raster to hold the
original string value from the features.
When you convert points, cells are given the value of the points found
within each cell. Cells that do not contain a point are given the value of
NoData. You can specify a cell size to use in the Features to Raster
dialog. Specify a cell size based on these factors:
• The resolution of the input data
• The output resolution needed to perform your analysis
• The need to maintain a rapid processing speed
Polylines are features that, at certain resolutions, appear as lines
representing streams or roads. When you convert polylines, cells are
given the value of the line that intersects each cell. Cells that are not
intersected by a line are given the value NoData. If more than one line
is found in a cell, the cell is given the value of the first line encountered
while processing. Using a smaller cell size during conversion alleviates
this.
Polygons are used for buildings, forests, fields, and many other features
that are best represented by a series of connected cells. When you
convert polygons, the cells are given the value of the polygon found at
the center of each cell.
Performing Features to Raster Conversion
To perform features to raster conversion, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Convert, and then
select Convert Features to Raster to open the Features to Raster
dialog.
2. Click the browse button for the Input features field and navigate to the
directory where the file is stored.
3. Select a field to use from the Field dropdown list.
4. Type an output cell size in the Output Cell Size field.
5. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
6. Click OK to close the Features to Raster dialog.
Applying GeoCorrection Tools
The tools and methods in this chapter describe the process of
geometrically correcting the distortions in images caused by sensors
and the curvature of the Earth. Even images of seemingly flat areas are
distorted, but you can correct, or rectify, these images so they are
represented on a planar surface, conform to other images, and have the
integrity of a map.
The terms geocorrection and rectification are used synonymously when
discussing geometric correction. Rectification is the process of
transforming data from one grid system into another grid system using
a geometric transformation. Because the pixels of a new grid may not
align with the pixels of the original grid, you must resample the pixels.
Resampling is the process of extrapolating data values for the pixels on
the new grid from the values of the source pixels.
Orthorectification is a form of rectification that corrects for terrain
displacement and is used if there is a DEM of the study area. It is based
on collinearity equations, which is derived by using 3D Ground GCPs.
In relatively flat areas, orthorectification is not necessary, but is
recommended in mountainous areas (or on aerial photographs of
buildings) where a high degree of accuracy is required.
IN THIS CHAPTER
Rectification
GeoCorrection
SPOT
Polynomial Transformation
Rubber Sheeting
Camera
IKONOS, QuickBird, and RPC Properties
Landsat
Rectification

Rectification is necessary in cases where you must change the pixel grid of an image to fit a map projection system or a reference image. There are several reasons for rectifying image data:
• Comparing pixels scene-to-scene in applications such as change detection or thermal inertia mapping (day and night comparison)
• Developing GIS databases for GIS modeling
• Identifying training samples according to map coordinates prior to classification
• Creating accurate, scaled photomaps
• Overlaying an image with vector data
• Comparing images that are originally at different scales
• Extracting accurate distance and area measurements
• Mosaicking images
• Performing any other analyses requiring precise geographic locations
Before rectifying the data, consider the primary use of the database when selecting the optimum map projection and appropriate coordinate system. If you are doing a government project, the projection
is often predetermined. A commonly used projection in the United
States government is state plane. Use an equal area projection for
thematic or distribution maps and conformal or equal area projections
for presentation maps.
Consider the following before selecting a map projection:
• How large or small an area is mapped? Different projections are intended for different size areas.
• Where on the globe is the study area? Polar regions and equatorial regions require different projections for maximum accuracy.
• What is the extent of the study area? Circular, north-south, east-west, and oblique areas may all require different projection systems (ESRI 1992).
Disadvantages of Rectification

During rectification, you must resample the data file values of rectified
pixels to fit into a new grid of pixel rows and columns. Although some of
the algorithms for calculating these values are highly reliable, you can
lose some spectral integrity of the data during rectification. If map
coordinates or map units are not needed in the application, it might be
wiser not to rectify the image. An unrectified image is more spectrally
correct than a rectified image.
Georeferencing

Georeferencing refers to the process of assigning map coordinates to image data. The image data might already be projected onto the plane, but not yet referenced to the proper coordinate system. Rectification, by definition,
involves georeferencing, because all map projection systems are
associated with map coordinates. Image-to-image registration involves
georeferencing only if the reference image is already georeferenced.
Georeferencing, by itself, involves changing only the map coordinate
information in the image file. The grid of the image does not change.
Geocoded data are images that are rectified to a particular map
projection and pixel size, usually with radiometric corrections applied. It
is possible to purchase image data that is already geocoded. You
should rectify geocoded data only if it must conform to a different
projection system or be registered to other rectified data.
Georeferencing Only

Rectification is unnecessary if there is no distortion in the image. For
example, if an image file is produced by scanning or digitizing a paper
map that is in the projection system you want, the image is already
planar and does not require rectification unless there is some skew or
rotation of the image. Scanning or digitizing produces images that are
planar, but do not contain any map coordinate information. You can
georeference these images, which is a much simpler process than
rectification. In many cases, you need only to update the image header
with new map coordinate information. This involves redefining:
• The map coordinate of the upper-left corner of the image
• The cell size (the area represented by each pixel)
This information is usually the same for each layer of an image file,
although it can be different. For example, the cell size of band 6 of
Landsat TM data is different from the cell size of the other bands.
Ground Control Points

GCPs are specific pixels in an image for which the output map
coordinates (or other output coordinates) are known. GCPs consist of
two X,Y pairs of coordinates:
• Source Coordinates: Usually data file coordinates in the image you are rectifying
• Reference Coordinates: The coordinates of the map or reference image to which the source image is being registered
The term map coordinates is sometimes used loosely to apply to
reference coordinates and rectified coordinates. These coordinates are
not limited to map coordinates. For example, map coordinates are
unnecessary in image-to-image registration.
Entering GCP Coordinates

Accurate GCPs are essential for an accurate rectification. From the
GCPs, the rectified coordinates for all other points in the image are
extrapolated. Select many GCPs throughout the scene. The more
dispersed the GCPs are, the more reliable the rectification. GCPs for
large-scale imagery might include the intersection of two roads, airport
runways, utility corridors, towers, or buildings. For small-scale imagery,
you can use larger features such as urban areas or geologic features.
Do not use landmarks that can vary (edges of lakes, other water bodies,
vegetation, and so on).
You can enter the source and reference coordinates of the GCPs in the
following ways:
• Enter coordinates known a priori at the keyboard.
• Use your mouse to select a pixel from an image in the view. With both the source and destination views open, enter source coordinates and reference coordinates for image-to-image registration.
• Use a digitizing tablet to register an image to a hardcopy map.
Tolerance of RMSE

Acceptable RMSE is determined by the end use of the database, the
type of data used, and the accuracy of the GCPs and ancillary data
used. For example, GCPs acquired from GPS should have an accuracy
of about 10 m, but GCPs from 1:24,000-scale maps should have an
accuracy of about 20 m.
It is important to remember that RMSE is reported in pixels. Therefore,
if you are rectifying Landsat TM data and want the rectification accurate
to within 30 meters, the RMSE should not exceed 1.00. Acceptable
accuracy depends on the image area and the project.
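RMSE itself is just the root of the mean squared GCP residual, reported in pixels. A sketch (hypothetical residuals, NumPy):

    import numpy as np

    # Hypothetical x and y residuals, in pixels, for four control points.
    residuals = np.array([[0.4, -0.3], [-0.6, 0.2], [0.1, 0.5], [-0.2, -0.4]])

    rmse = np.sqrt((residuals ** 2).sum(axis=1).mean())
    print(round(float(rmse), 3))  # about 0.53 pixels here

    # For 30 m Landsat TM pixels, "accurate to within 30 m" means this
    # value should not exceed 1.00 (one pixel).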
Classification

Some analysts recommend classification before rectification because
the classification is then based on the original data values. Another
benefit is that a thematic file has only one band to rectify instead of the
multiple bands of a continuous file. On the other hand, it might benefit
you to rectify the data first, especially when using GPS data for the
GCPs. Because this data is very accurate, the classification might be
more accurate if the new coordinates help to locate better training
samples.
Thematic Files

Nearest neighbor is the only appropriate resampling method for
thematic files, which is a drawback in some applications. The available
resampling methods are discussed in detail in Options Dialog on page
62.
Orthorectification

Orthorectification is a form of rectification that corrects for terrain displacement and is used if there is a DEM of the study area. It is based on collinearity equations, which are derived by using 3D GCPs. In relatively flat areas, orthorectification is not necessary, but in mountainous areas (or on aerial photographs of buildings), where a high degree of accuracy is required, orthorectification is recommended.
GeoCorrection

You can perform georeferencing, the process of assigning map coordinates to image data using various sensor models, with the GeoCorrection properties dialogs and the Add Links tool.
The GeoCorrection Properties Dialog

The individual GeoCorrection tools have their own dialog. It displays
whenever you select a model type for an image on the Image Analysis
toolbar and click the GeoCorrection Properties button. Some of the
dialogs for these tools contain tabs pertaining to that specific tool, but
they all have several tabs in common. Every GeoCorrection tool dialog
has a General tab and a Links tab, and all but Polynomial properties and
Rubber Sheeting properties have an Elevation tab.
General Tab

The General tab has the following settings:
• Link Coloring: Lets you set a threshold and select or change link colors.
• Displayed Units: Lets you view the horizontal and vertical units if they are known. Often only one is known, so it might display Meters for vertical units and Unknown for horizontal units. Display units do not have an effect on the original data in latitude/longitude format. The image in the view does not show the changes either.
To specify settings on the General tab, follow these steps:
1. Type a number in the Threshold field, and then click the Within
Threshold and Over Threshold color bar arrows to change the link
colors.
2. View the measurement of the vertical units in the Displayed Units box.
Links Tab

The Links tab (this display is also called a CellArray) displays
information about the links in your image, including reference points
and RMSE. If you added links to your image, they are listed on this tab.
The program is interactive between the image and the Links tab, so
when you add links in an image or between two images, information is
automatically updated in the CellArray. You can edit and delete
information displayed in the CellArray. For example, if you want to
experiment with coordinates other than the ones you've been given,
you can plug your own coordinates into the CellArray.
Note: Before adding links or editing the links table, you must select the
coordinate system in which you want to store the link coordinates.
To select a coordinate system, follow these steps:
1. Right-click in the view and select Data Frame Properties to open the
Data Frame Properties dialog.
2. Click the Coordinate System tab.
3. Select the appropriate predefined coordinate system in the Select a
Coordinate System box if your link coordinates are predefined.
Note: Expand the Layers folder and select that layer if you want to use
the coordinate system from a specific layer.
4. Click OK to close the Data Frame Properties dialog.
Before proceeding, perform a few additional checks by following these steps:
1. Make sure that the correct layer displays in the Layers dropdown list on
the Image Analysis toolbar.
2. Select your model type from the Model Types dropdown list.
3. Click the Add Links button to set your new links.
To proof and edit the coordinates of the links as you enter them, follow
these steps:
1. Click the GeoCorrection Properties button.
2. Click the Links tab.
The coordinates display in the CellArray on this tab.
3. Click in a cell and edit the contents.
4. Click the Export Links to Shapefile button and save the new shapefile.
Elevation Tab

The Elevation tab is in all GeoCorrection Model properties dialogs
except for Polynomial and Rubber Sheeting. The default settings on the
Elevation tab let you choose a file to use as an elevation source.
Figure 44: Elevation Source File Settings
If you do not have an elevation file, click the Constant button to change
the settings in the Elevation Source box and specify the elevation value
and units. Use the constant value that is the average ground elevation
for the entire scene.
Figure 45: Elevation Source Constant Settings
Note: You can also check the Account for Earth's Curvature check box if you want to use this option as part of the elevation.
The following steps take you through the Elevation tab. The first set of
instructions uses File as the elevation source. The second set uses
Constant as the elevation source.
To use a file value as the elevation source, follow these steps:
1. Click the File button in the Elevation Source box.
2. Type the name of the file in the Elevation File field, or navigate to the
directory where it is stored.
3. Select Feet or Meters from the Elevation Units dropdown list.
4. Check the Account for Earth's Curvature check box.
5. Click Apply to set the elevation source.
6. Click OK to close the dialog.
To use a constant value as the elevation source, follow these steps:
1. Click the Constant button in the Elevation Source box.
2. Type the elevation value in the Elevation Value field.
3. Select Feet or Meters from the Elevation Units dropdown list.
4. Check the Account for Earth's Curvature check box.
5. Click Apply to set the elevation source.
6. Click OK to close the dialog.
SPOT The first SPOT satellite, developed by the French Centre National
d'Etudes Spatiales (CNES), was launched in early 1986. The second
SPOT satellite was launched in 1990, and the third was launched in
1993. The sensors operate in two modes, multispectral and
panchromatic. SPOT is commonly referred to as a pushbroom scanner,
which means that all scanning parts are fixed, and scanning is
accomplished by the forward motion of the scanner. SPOT pushes
3000/6000 sensors along its orbit. This is different from Landsat, which
scans using 16 detectors perpendicular to its orbit.
The SPOT satellite can observe the same area on the globe once every
26 days. The SPOT scanner normally produces nadir views, but it does
have off-nadir viewing capability. Off-nadir refers to any point that is not
directly beneath the detectors, but off to an angle. Using this off-nadir
capability, you can view one area on the Earth as often as every three
days.
Off-nadir viewing is programmable from the ground control station. It is
useful for collecting data in a region not directly in the path of the
scanner, or where timeliness of data acquisition is crucial in the event
of a natural or man-made disaster. It is also very useful in collecting
stereo data from which you can extract elevation data.
The width of the swath observed varies between 60 km for nadir viewing
and 80 km for off-nadir viewing at a height of 832 km (Jensen 1996).
Panchromatic SPOT Panchromatic (meaning sensitive to all visible colors) has
10 × 10 m spatial resolution, contains one band (0.51 to 0.73 μm), and is
similar to a black-and-white photograph. It has a radiometric resolution
of 8 bits (Jensen 1996).
XS SPOT XS, or multispectral, has 20 × 20 m spatial resolution, 8-bit
radiometric resolution, and contains 3 bands (Jensen 1996).
Table 6: SPOT XS Bands and Wavelengths

Band              Wavelength (Microns)  Comments
1, Green          0.50 to 0.59 μm       This band corresponds to the green reflectance of healthy vegetation.
2, Red            0.61 to 0.68 μm       This band is useful for discriminating between plant species. It is also useful for soil boundary and geological boundary delineations.
3, Reflective IR  0.79 to 0.89 μm       This band is especially responsive to the amount of vegetation biomass present in a scene. It is useful for crop identification and emphasizes soil/crop and land/water contrasts.

Figure 46: SPOT Panchromatic versus SPOT XS
[Figure: Panchromatic, 1 band, 1 pixel = 10 m x 10 m; XS, 3 bands, 1 pixel = 20 m x 20 m; radiometric resolution 0-255]

Stereoscopic Pairs The panchromatic scanner can make two observations on successive
days, so that the two images are acquired at angles on either side of the
vertical, resulting in stereoscopic imagery. Stereoscopic imagery is also
achieved by using one vertical scene and one off-nadir scene. This type
of imagery can be used to produce a single image, or topographic and
planimetric maps (Jensen 1996).
Topographic maps indicate elevation. Planimetric maps correctly
represent horizontal distances between objects (Star and Estes 1990).
SPOT 4 The SPOT 4 satellite was launched in 1998. SPOT 4 carries High
Resolution Visible Infrared (HR VIR) instruments that obtain information
in the visible and near-infrared spectral bands. It orbits the Earth at 822
km above the Equator and has two sensors on board: a multispectral
sensor, and a panchromatic sensor. The multispectral scanner has a
pixel size of 20 × 20 m, and a swath width of 60 km. The panchromatic
scanner has a pixel size of 10 × 10 m, and a swath width of 60 km.
Table 7: SPOT 4 Bands and Wavelengths

Band          Wavelength
1, Green      0.50 to 0.59 μm
2, Red        0.61 to 0.68 μm
3, (near-IR)  0.78 to 0.89 μm
4, (mid-IR)   1.58 to 1.75 μm
Panchromatic  0.61 to 0.68 μm

The Spot Properties Dialog Box In addition to the General, Links, and Elevation tabs, the Spot
Properties dialog also contains a Parameters tab. Most of the
GeoCorrection Properties dialogs contain a Parameters tab, but each
one offers different options.
To specify settings on the Parameters tab on the Spot Properties
dialog, follow these steps:
1. Select Spot from the Model Types dropdown list on the Image Analysis
toolbar.
2. Click the GeoCorrection Properties button to open the Spot Properties
dialog.
3. Click the Parameters tab.
4. Click the XS/XI or Pan button to specify the sensor type.
5. Type the number of iterations in the Number of Iterations field.
6. Type a number for the incidence angle in the Incidence Angle field.
7. Type the pixel value, which forms the background, in the Value field.
8. Type the number of the layer in the Layer field.
9. Click OK to close the Spot Properties dialog.
Polynomial
Transformation
Polynomial equations are used to convert source file coordinates to
rectified map coordinates. Depending on the distortion in the imagery,
complex polynomial equations might be required to express the needed
transformation. The degree of complexity of the polynomial is
expressed as the order of the polynomial. The order of transformation
is the order of the polynomial used in the transformation. Image
Analysis for ArcGIS allows 1st through nth order transformations.
Usually, 1st or 2nd order transformations are used.
Transformation Matrix A transformation matrix is computed from the GCPs. The matrix
consists of coefficients that are used in polynomial equations to convert
the coordinates. The size of the matrix depends on the order of
transformation. The goal in calculating the coefficients of the
transformation matrix is to derive the polynomial equations for which
there is the least possible amount of error when used to transform the
reference coordinates of the GCPs into the source coordinates. It is not
always possible to derive coefficients that produce no error. For
example, in the figure below, GCPs are plotted on a graph and
compared to the curve that is expressed by a polynomial.
Every GCP influences the coefficients, even if there isn't a perfect fit of
each GCP to the polynomial that the coefficients represent. The
distance between the GCP reference coordinate and the curve is called
RMSE, which is discussed later in Camera on page 177.
Linear Transformations A 1st order transformation is a linear transformation. It can change:
Location in X or Y
Scale in X or Y
Skew in X or Y
Rotation
You can use 1st order transformations to project raw imagery to a
planar map projection, convert a planar map projection to another
planar map projection, and rectify relatively small image areas. You can
perform simple linear transformations to an image in a view or to the
transformation matrix itself. Linear transformations may be required
before collecting GCPs for the displayed image. You can reorient
skewed Landsat TM data, rotate scanned quad sheets according to the
angle of declination stated in the legend, and rotate descending data so
that north is up.
[Figure: GCPs plotted against the curve of a polynomial; the horizontal axis is the source X coordinate and the vertical axis is the reference X coordinate]
You can also use a 1st order transformation for data already projected
onto a plane. For example, SPOT and Landsat Level 1B data is already
transformed to a plane, but might not be rectified to the map projection
you want. When doing this type of rectification, it is not advisable to
increase the order of transformation if a high RMSE occurs first.
Examine other factors, such as the GCP source and distribution, and
then look for systematic errors.
The transformation matrix for a 1st order transformation consists of six
coefficients, three for each coordinate (X and Y):

$\begin{matrix} a_0 & a_1 & a_2 \\ b_0 & b_1 & b_2 \end{matrix}$

Coefficients are used in a 1st order polynomial as follows:

$x_0 = a_0 + a_1 x + a_2 y$
$y_0 = b_0 + b_1 x + b_2 y$

Where:
x and y are source coordinates (input)
$x_0$ and $y_0$ are rectified coordinates (output)
The coefficients of the transformation matrix are as above.

Nonlinear Transformations You can use 2nd order transformations to convert Lat/Lon data to a
planar projection, for data covering a large area (to account for the
Earth's curvature), and with distorted data (for example, due to camera
lens distortion). Use 3rd order transformations with distorted aerial
photographs, on scans of warped maps, and with radar imagery. You
can use 4th order transformations on very distorted aerial photographs.
The transformation matrix for a transformation of order t contains this
number of coefficients:

$2 \sum_{i=0}^{t+1} i$

It is multiplied by two for the two sets of coefficients, one set for X and
one for Y.
An easier way to arrive at the same number is:

$(t+1) \times (t+2)$

Clearly, the size of the transformation matrix increases with the order of
the transformation.
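The following minimal Python sketch applies a 1st order transformation to one source coordinate; the coefficient values are hypothetical stand-ins for values computed from GCPs:

    # Apply a 1st order (affine) polynomial transformation to a source
    # coordinate; a and b each hold the three coefficients for X and Y.
    def first_order_transform(x, y, a, b):
        x0 = a[0] + a[1] * x + a[2] * y
        y0 = b[0] + b[1] * x + b[2] * y
        return x0, y0

    # Hypothetical coefficients: a shift of (100, 50) plus a uniform scale of 2.
    print(first_order_transform(10, 20, a=(100, 2, 0), b=(50, 0, 2)))
    # (120, 90)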
High-Order Polynomials The polynomial equations for a transformation of order t take this form:

$x_0 = \sum_{i=0}^{t} \sum_{j=0}^{i} a_k \cdot x^{i-j} \cdot y^{j}$

$y_0 = \sum_{i=0}^{t} \sum_{j=0}^{i} b_k \cdot x^{i-j} \cdot y^{j}$

Where:
t is the order of the polynomial
a and b are coefficients
The subscript k in a and b is determined by:

$k = \frac{i \cdot (i+1)}{2} + j$
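To make the term ordering concrete, this minimal Python sketch enumerates the polynomial terms and the subscript k of each coefficient for a 2nd order transformation, using the k formula reconstructed above:

    # Enumerate the terms a_k * x^(i-j) * y^j of an order-t polynomial.
    t = 2
    for i in range(t + 1):
        for j in range(i + 1):
            k = i * (i + 1) // 2 + j
            print(f"a{k} * x^{i - j} * y^{j}")
    # Order 2 yields six coefficients (a0 through a5).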
Effects of Order The computation and output of a higher-order polynomial equation are more
complex than those of a lower-order polynomial equation. Therefore, you should
use higher-order polynomials to perform more complicated image
rectifications. It is helpful to see the output of various orders of
polynomials to understand the effects of different orders of
transformation in image rectification.
The following example uses only one coordinate (X) instead of the two
(X,Y) used in the polynomials for rectification. This lets you draw two-
dimensional graphs that illustrate the way higher orders of
transformation affect the output image. Because only the X coordinate
is used in these examples, the number of GCPs used is less than the
number required to perform the different orders of transformation.
Coefficients like those in this example are generally calculated by the
least squares regression method. Suppose there are GCPs with the
following X coordinates:

Source X Coordinate (Input)  Reference X Coordinate (Output)
1                            17
2                            9
3                            1

These GCPs allow a 1st order transformation of the X coordinates,
which is satisfied by this equation (the coefficients are in parentheses):

$x_r = (25) + (-8)x_i$

Where:
$x_r$ = reference X coordinate
$x_i$ = source X coordinate

This equation takes on the same format as the equation of a line
(y = mx + b). In mathematical terms, a 1st order polynomial is linear.
Therefore, a 1st order transformation is also known as a linear
transformation.
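As a quick check of this worked example, a least squares fit in Python recovers the same coefficients from the three GCPs:

    import numpy as np

    # Fit a 1st order polynomial to the GCPs (1, 17), (2, 9), (3, 1).
    coeffs = np.polyfit([1, 2, 3], [17, 9, 1], deg=1)
    print(coeffs)  # approximately [-8. 25.], that is, x_r = 25 - 8 * x_i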
This equation is graphed below:

[Figure: the line $x_r = (25) + (-8)x_i$ plotted with the source X coordinate on the horizontal axis and the reference X coordinate on the vertical axis]
However, what if the second GCP were changed as follows?

Source X Coordinate (Input)  Reference X Coordinate (Output)
1                            17
2                            7
3                            1

These points are plotted against each other below:

[Figure: the three GCPs plotted with the source X coordinate on the horizontal axis and the reference X coordinate on the vertical axis]

A line cannot connect these points, which illustrates why they cannot be
expressed by a 1st order polynomial. In this case, a 2nd order
polynomial equation expresses these points:

$x_r = (31) + (-16)x_i + (2)x_i^2$

Polynomials of the 2nd order or higher are nonlinear. The graph of this
curve is drawn below:

[Figure: the curve $x_r = (31) + (-16)x_i + (2)x_i^2$ passing through the three GCPs]
What if you added one more GCP to the list?

Source X Coordinate (Input)  Reference X Coordinate (Output)
1                            17
2                            7
3                            1
4                            5

[Figure: the 2nd order curve from the previous example with the new GCP (4, 5) falling off the curve]

As illustrated in the graph above, this fourth GCP does not fit on the
curve of the 2nd order polynomial equation. You must increase the
order of the transformation to the 3rd order to ensure that all the GCPs
fit. The equation and graph below are possible results:

$x_r = (25) + (-5)x_i + (-4)x_i^2 + (1)x_i^3$

[Figure: the 3rd order curve passing through all four GCPs]
The figure above illustrates a 3rd order transformation. However, this
equation may be unnecessarily complex. Performing a coordinate
transformation with this equation can cause unwanted distortions in the
output image for the sake of a perfect fit for all GCPs. In this example,
a 3rd order transformation probably is too high, because the output
pixels in the X direction are arranged in a different order than the input
pixels in the X direction.
$x_0(1) = 17$, $x_0(2) = 7$, $x_0(3) = 1$, $x_0(4) = 5$, so that:

$x_0(1) > x_0(2) > x_0(4) > x_0(3)$
$17 > 7 > 5 > 1$

[Figure: input image X coordinates 1, 2, 3, 4 are mapped to output X coordinates 17, 7, 1, and 5, so the pixels in the output image fall in the order 3, 4, 2, 1]

In this case, a higher order of transformation probably does not produce
the results you want.
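The 2nd and 3rd order examples above can be verified the same way; note how the fitted 3rd order curve reproduces the output values 17, 7, 1, and 5, whose sorted order rearranges the pixels:

    import numpy as np

    # 2nd order fit to (1, 17), (2, 7), (3, 1): x_r = 31 - 16x + 2x^2
    print(np.polyfit([1, 2, 3], [17, 7, 1], deg=2))      # [2. -16. 31.]

    # 3rd order fit to the four GCPs: x_r = 25 - 5x - 4x^2 + x^3
    c = np.polyfit([1, 2, 3, 4], [17, 7, 1, 5], deg=3)
    print(np.round(c))                                   # [1. -4. -5. 25.]
    print(np.polyval(c, [1, 2, 3, 4]))                   # [17. 7. 1. 5.]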
Minimum Number of
GCPs
You can use higher orders of transformation to correct more
complicated types of distortion. However, more GCPs are needed to
use a higher order of transformation. For example, three points define
a plane. Therefore, to perform a 1st order transformation, which is
expressed by the equation of a plane, at least three GCPs are needed.
Similarly, the equation in a 2nd order transformation is the equation of
a paraboloid. Six points are required to define a paraboloid. Therefore,
at least six GCPs are required to perform a 2nd order transformation.
The minimum number of points required to perform a transformation of
order t equals:

$\frac{(t+1)(t+2)}{2}$
Use more than the minimum number of GCPs whenever possible.
Although it is possible to get a perfect fit, it is rare, no matter how many
GCPs are used.
For 1st through 10th order transformations, the minimum number of
GCPs required to perform a transformation is listed in this table:
Table 8: Number of GCPs

Order of Transformation  Minimum GCPs Required
1                        3
2                        6
3                        10
4                        15
5                        21
6                        28
7                        36
8                        45
9                        55
10                       66
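A one-line computation reproduces this table; for example, in Python:

    # Minimum GCPs for a transformation of order t: (t + 1)(t + 2) / 2
    for t in range(1, 11):
        print(t, (t + 1) * (t + 2) // 2)
    # Prints 1 3, 2 6, 3 10, ... 10 66, matching Table 8.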
The Polynomial
Properties Dialog Box
The Polynomial Properties dialog has a Parameters tab in addition to
the General and Links tabs. It does not have an Elevation tab. The
General tab and the Links tab are the same as the ones featured at the
beginning of this chapter.
The Parameters tab contains a CellArray that shows the transformation
coefficients table. The table is populated when the model is solved.
To specify settings on the Polynomial Properties Parameters tab, follow
these steps:
1. Type the number for the polynomial order in the Polynomial Order field.
2. Click OK.
Rubber Sheeting Triangle-based finite element analysis is a powerful tool for solving
complicated computation problems by dividing them into small,
simpler pieces. It is widely used as a local interpolation technique in
geographic applications. For image rectification, known control points
can be triangulated into many triangles. Each triangle has three control
points as its vertices. You can then use the polynomial transformation
to establish mathematical relationships between source and destination
systems for each triangle.
Because the transformation passes through each control point and is
not uniform across the image, finite element analysis is also called rubber
sheeting. You can also call it triangle-based rectification because
the transformation and resampling for image rectification are performed
on a triangle-by-triangle basis. Use this triangle-based technique when
other rectification methods such as polynomial transformation and
photogrammetric modeling cannot produce acceptable results.
Triangulation It is necessary to triangulate the control points into a mesh of triangles
to perform the triangle-based rectification. Watson (1994) summarily
listed four kinds of triangulation, including the Arbitrary, Optimal,
Greedy, and Delaunay triangulation. Of the four kinds, the Delaunay
triangulation is most widely used and is adopted because of the smaller
angle variations of the resulting triangles.
You can construct the Delaunay triangulation by the empty circumcircle
criterion. The circumcircle formed from three points of any triangle does
not have any other points inside. The triangles defined this way are the
most equiangular possible.
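Outside Image Analysis for ArcGIS, the same construction is available in common libraries; the following minimal Python sketch (the control point coordinates are hypothetical) builds a Delaunay triangulation of five points:

    import numpy as np
    from scipy.spatial import Delaunay

    # Triangulate hypothetical control points by the Delaunay criterion.
    points = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0.4, 0.6]])
    tri = Delaunay(points)
    print(tri.simplices)  # each row holds the indices of one triangle's vertices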
Triangle-Based
Rectification
Once the triangle mesh is generated and the spatial order of the control
points is available, you can perform the geometric rectification on a
triangle-by-triangle basis. This triangle-based method is appealing
because it breaks the region into smaller subsets. If the geometric
problem of the region is very complicated, the geometry of each subset
is much simpler and can be modeled through a simple transformation.
For each triangle, you can use the polynomials as the general
transformation form between source and destination systems.
Linear Transformation The easiest and fastest transformation is the linear transformation with
the 1st order polynomials:

$x_o = a_0 + a_1 x + a_2 y$
$y_o = b_0 + b_1 x + b_2 y$

Additional information is not necessary because there are three known
conditions in each triangle and three unknown coefficients for each
polynomial.
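Because each triangle supplies exactly three conditions, the three coefficients of each polynomial can be found by solving a 3 × 3 linear system. A minimal Python sketch with hypothetical vertex coordinates:

    import numpy as np

    # Triangle vertices in the source system and their destination x values.
    src = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
    xo = np.array([5.0, 25.0, 15.0])

    # Rows of A are [1, x, y]; solve A @ [a0, a1, a2] = xo.
    A = np.column_stack([np.ones(3), src[:, 0], src[:, 1]])
    print(np.linalg.solve(A, xo))  # [5. 2. 1.]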
Nonlinear
Transformation
Although linear transformation is easy and fast, it has one disadvantage
in that the transitions between triangles are not always smooth. This
phenomenon is obvious when shaded relief or contour lines are derived
from the DEM that is generated by the linear rubber sheeting. It is
caused by incorporating the slope change of the control data at the
triangle edges and vertices. In order to distribute the slope change
smoothly across triangles, the nonlinear transformation with polynomial
order larger than 1 is used by considering the gradient information.
The fifth order or quintic polynomial transformation is chosen as the
nonlinear rubber sheeting technique in the example that follows. It is a
smooth function. The transformation function and its first order partial
derivative are continuous. It is not difficult to construct (Akima 1978).
The formulation is as follows:

$x_0 = \sum_{i=0}^{5} \sum_{j=0}^{i} a_k \cdot x^{i-j} \cdot y^{j}$

$y_0 = \sum_{i=0}^{5} \sum_{j=0}^{i} b_k \cdot x^{i-j} \cdot y^{j}$
The 5th order has 21 coefficients for each polynomial to be determined.
For solving these unknowns, 21 conditions are available. For each
vertex of the triangle, one point value is given, and two 1st order and
three 2nd order partial derivatives are easily derived by establishing a
2nd order polynomial using vertices in the neighborhood of the vertex.
Then, 18 conditions in total are ready for use. You can obtain three
more conditions by assuming that the normal partial derivative on each
edge of the triangle is a cubic polynomial. This means that the sum of
the polynomial items beyond the 3rd order in the normal partial
derivative has a value of zero.
Checkpoint Analysis It should be emphasized that the independent checkpoint analysis is
critical for determining the accuracy of rubber sheeting modeling. For
an exact modeling method like rubber sheeting, the GCPs, which are
used in the modeling process, retain little geometric residual error.
The accuracy assessment using independent checkpoints is
recommended to evaluate the geometric transformation between
source and destination coordinate systems.
Camera The camera model is derived by space resection based on collinearity
equations and is used for rectifying any image that uses a camera as its
sensor.
The Camera Properties
Dialog Box
In addition to the General, Links, and Elevation tabs, the Camera
Properties dialog has tabs for Orientation, Camera, and Fiducials.
Orientation Tab The Orientation tab lets you choose rotation angles and center
positions for the camera.
Figure 47: Orientation Tab
The rotation angle lets you customize the Omega, Phi, and Kappa
rotation angles of the image to determine the viewing direction of the
camera. You can choose from the following options:
Unknown: The rotation angle is unknown.
Estimated: The rotation angle is estimated.
Fixed: The rotation angle is defined.
Omega: The Omega rotation angle is roll (around the x-axis of the ground
system).
Phi: The Phi rotation angle is pitch (around the y-axis, after the Omega
rotation).
Kappa: The Kappa rotation angle is yaw (around the z-axis, rotated by
Omega and Phi).
The perspective center position is given in meters and lets you enter the
perspective center for ground coordinates. You can choose from the
following options:
Unknown: Select when the ground coordinate is unknown.
Estimated: Select when estimating the ground coordinate.
Fixed: Select when the ground coordinate is defined.
X: Enter the X coordinate of the perspective center.
Y: Enter the Y coordinate of the perspective center.
Z: Enter the Z coordinate of the perspective center.
Note: If you fill in all the degrees and meters for the rotation angle and
the perspective center position, you do not need the three links normally
required for the Camera model. If you fill in this information, do not
check the Account for Earth's curvature check box on the Elevation tab.
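For reference, the three angles compose into a single rotation matrix. The sketch below uses one common photogrammetric convention (Omega, then Phi, then Kappa); the exact convention used by the software is not stated here, so treat this as illustrative:

    import numpy as np

    def rotation_matrix(omega, phi, kappa):
        # Roll about x, pitch about y, yaw about z; angles in radians.
        co, so = np.cos(omega), np.sin(omega)
        cp, sp = np.cos(phi), np.sin(phi)
        ck, sk = np.cos(kappa), np.sin(kappa)
        Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
        return Rz @ Ry @ Rx  # kappa applied last, as assumed here

    print(rotation_matrix(0.0, 0.0, np.pi / 2).round(3))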
Camera Tab The next to last tab on the Camera Properties dialog is also called
Camera. This is where you specify the camera name, the number of
fiducials, the principal point, and the focal length for the camera that
was used to capture your image.
Figure 48: Camera Tab
You can click Load or Save to open or save a file with camera
information in it.
Fiducials Tab The last tab on the Camera Properties dialog is the Fiducials tab.
Fiducials are used to compute the transformation from data file to image
coordinates.
Figure 49: Fiducials Tab
Applying GeoCorrection Tools 179
Fiducial orientation defines the relationship between the image/photo-
coordinate system of a frame and the actual image orientation as it
displays in a view. The image/photo-coordinate system is defined by
the camera calibration information. The orientation of the image is
largely dependent on the way the photograph is scanned during the
digitization stage.
The fiducials for your image are fixed on the frame and are visible in the
exposure. The fiducial information you enter on the Camera tab
displays in a CellArray on the Fiducials tab after you click the Apply
button in the Camera Properties dialog.
Compare the axis of the photo-coordinate system (defined in the
calibration report) with the orientation of the image to select the
appropriate fiducial orientation. Based on the relationship between the
photo-coordinate system and the image, you can select the appropriate
fiducial orientation. Do not use more than eight fiducials in an image.
The fiducial orientations are used under the following circumstances:
Fiducial One: The marker is to the left of the image.
Fiducial Two: The marker is at the top of the image.
Fiducial Three: The marker is to the right of the image.
Fiducial Four: The marker is at the bottom of the image.
Fiducial Measurement: The software zooms in on the first
fiducial you need to collect.
Note: Selecting an inappropriate fiducial orientation results in large
RMSEs during the measurement of fiducial marks for interior orientation
and errors during the automatic tie point collection. Ensure that the
appropriate fiducial orientation is used as a function of the image/photo-
coordinate system.
IKONOS,
QuickBird, and
RPC Properties
IKONOS, QuickBird, and Rational Polynomial Coefficients (RPC)
properties are sometimes referred to together as the Rational Function
models. They are virtually the same except for the files they use, and
their dialogs in GeoCorrection properties are identical as well. The
differences are:
IKONOS files are images captured by the IKONOS satellite.
QuickBird files are images captured by the QuickBird satellite.
RPC Properties uses National Imagery Transmission Format
Standard (NITF) data.
Note: It is very important that you click the Add Links button before
clicking the GeoCorrection Properties button to open one of these
properties dialogs.
IKONOS IKONOS images are produced by the IKONOS satellite, which was
launched in September of 1999 by the Athena II rocket.
The resolution of the panchromatic sensor is 1 m. The resolution of the
multispectral scanner is 4 m. The swath width is 13 km at nadir. The
accuracy without ground control is 12 m horizontally, and 10 m
vertically; with ground control it is 2 m horizontally, and 3 m vertically.
IKONOS orbits at an altitude of 423 miles, or 681 kilometers. The revisit
time is 2.9 days at 1 m resolution, and 1.5 days at 1.5 m resolution.
The multispectral bands are as follows:
Table 9: IKONOS Bands and Wavelengths

Band          Wavelength (Microns)
1, Blue       0.45 to 0.52 μm
2, Green      0.52 to 0.60 μm
3, Red        0.63 to 0.69 μm
4, NIR        0.76 to 0.90 μm
Panchromatic  0.45 to 0.90 μm

The IKONOS Properties dialog lets you rectify IKONOS images from
the satellite. Like the other properties dialogs in GeoCorrection
Properties, IKONOS has General, Links, and Elevation tabs as well as
Parameters and Chipping tabs.
The RPC file is generated by the data provider based on the position of
the satellite at the time of image capture. You can further refine the
RPCs by using GCPs. Locate this file in the same directory as the
image you intend to use in the GeoCorrection process.
QuickBird QuickBird properties let you rectify images captured by the QuickBird
satellite. Like IKONOS, QuickBird requires the use of an RPC file to
describe the relationship between the image and the Earths surface at
the time of image capture.
The QuickBird satellite was launched in October 2001. Its orbit has an
altitude of 450 kilometers, a 93.5 minute orbit time, and a 10:30 A.M.
equator crossing time. The inclination is 97.2 degrees sun-
synchronous, and the nominal swath width is 16.5 kilometers at nadir.
The sensor has both panchromatic and multispectral capabilities. The
dynamic range is 11 bits per pixel for both panchromatic and
multispectral. The panchromatic bandwidth is 450-900 nanometers.
The multispectral bands are as follows:
Table 10: QuickBird Bands and Wavelengths

Band      Wavelength (Microns)
1, Blue   0.45 to 0.52 μm
2, Green  0.52 to 0.60 μm
3, Red    0.63 to 0.69 μm
4, NIR    0.76 to 0.90 μm

Just like IKONOS, QuickBird has a Parameters tab as well as a
Chipping tab on its properties dialog. The same information applies to
both tabs as is discussed in IKONOS, QuickBird, and RPC Properties
on page 181.
RPC RPC properties let you specify the associated RPC file to use in
geocorrection. RPC properties in Image Analysis for ArcGIS let you
work with NITF data.
NITF data is designed to pack numerous image compositions with
complete annotation, text attachments, and imagery-associated
metadata.
The RPC file associated with the image contains rational function
polynomial coefficients generated by the data provider based on the
position of the satellite at the time of image capture. You can further
refine these RPCs by using GCPs. Locate this file in the same directory
as the images you intend to use in orthorectification.
Just like IKONOS and QuickBird, the RPC Properties dialog contains
the Parameters and Chipping tabs. These work the same way in all
three model properties.
IKONOS, QuickBird, and
RPC Parameters Tab
The Parameters tab on the IKONOS, QuickBird, and RPC Properties
dialogs calls for an RPC file and the elevation range. Be sure to enter
the RPC file information first.
Figure 50: IKONOS, QuickBird, and RPC Parameters Tab
There is also a check box for Refinement with Polynomial Order. This
is provided so you can apply polynomial corrections to the original
rational function model. This setting corrects the remaining error and
refines the mathematical solution. Check the Refinement with
Polynomial Order check box to enable the refinement process, and then
specify the order by clicking the arrows.
The 0 order results in a simple shift to both image X and Y coordinates.
The 1st order is an affine transformation. The 2nd order results in a 2nd
order transformation, and the 3rd order in a 3rd order transformation.
Usually, a 0 or 1st order is sufficient to reduce error not addressed by
the rational function model (RPC file).
The fields in the Elevation Range box are automatically populated by
the RPC file.
IKONOS, QuickBird, and
RPC Chipping Tab
The chipping process allows the use of RPCs for an image chip
rather than the full, original image from which the chip was derived. This
is made possible by specifying an affine relationship (pixel) between the
chip and the full, original image. The Chipping tab is the same for
IKONOS, QuickBird, and RPC Properties.
Figure 51: Chipping Tab
You are given the choice of Scale and Offset or Arbitrary Affine as your
chipping parameters on the Chipping tab. The tab changes depending
on which chipping parameter you select from the Specify Chipping
Parameters As dropdown list, as described in Scale and Offset on
page 185 and Arbitrary Affine on page 186.
Full Row Count and Full Column Count fields are located at the bottom
of the Chipping tab. If the chip header contains the appropriate data, the
Full Row Count value is the row count of the full, original image. If the
header count is absent, this value corresponds to the row count of the
chip.
Scale and Offset Scale and Offset is the simpler of the two chipping parameters. The
formulas for calculating the affine using scale and offset are listed in a
box on the Chipping tab.
Figure 52: Chipping Tab using Scale and Offset
X and Y correspond to the pixel coordinates for the full, original image.
The options are:
Row Offset: This value corresponds to value f, an offset value. In
the absence of header data, this value defaults to 0.
Row Scale: This value corresponds to value e, a scale factor that
is also used in rotation. In the absence of header data, this value
defaults to 1.
Column Offset: This value corresponds to value c, an offset value.
In the absence of header data, this value defaults to 0.
Column Scale: This value corresponds to value a, a scale factor
that is also used in rotation. In the absence of header data, this
value defaults to 1.
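A minimal Python sketch of the chip-to-image mapping under these parameters; the formulas X = a·x′ + c and Y = e·y′ + f are assumed from the variable descriptions above, and the offsets in the example are hypothetical:

    # Map chip pixel coordinates (x', y') to full-image coordinates (X, Y).
    def chip_to_full(x_chip, y_chip, col_scale=1.0, col_offset=0.0,
                     row_scale=1.0, row_offset=0.0):
        X = col_scale * x_chip + col_offset   # a * x' + c
        Y = row_scale * y_chip + row_offset   # e * y' + f
        return X, Y

    # A chip whose upper-left corner sits at full-image column 2048, row 1024:
    print(chip_to_full(100, 200, col_offset=2048.0, row_offset=1024.0))
    # (2148.0, 1224.0)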
Arbitrary Affine The Arbitrary Affine formulas display in the box on the Chipping tab
when you select that option from the Specify chipping parameters as
dropdown list.
In the formulas, x′ (x prime) and y′ (y prime) correspond to the pixel
coordinates in the chip with which you are currently working. Values for
the variables are either obtained from the header data of the chip, or
they default to the predetermined values described above.
The following is an example of the Arbitrary Affine settings on the
Chipping tab.
Figure 53: Chipping Tab using Arbitrary Affine
Landsat The Landsat dialog is used for orthorectification of any Landsat image
that uses TM or MSS as its sensor. The model is derived by space
resection based on collinearity equations. The elevation information is
required in the model for removing relief displacement.
Landsat 1-5 In 1972, the National Aeronautics and Space Administration (NASA)
initiated the first civilian program specializing in the acquisition of
remotely sensed digital satellite data. The first system was called Earth
Resources Technology Satellites (ERTS), and later renamed Landsat.
There have been several Landsat satellites launched since 1972.
Landsats 1, 2, and 3 are no longer operating, but Landsats 4 and 5 are
still in orbit gathering data. Landsats 1, 2, and 3 gathered Multispectral
Scanner (MSS) data and Landsats 4 and 5 collect MSS and TM data.
MSS The MSS from Landsats 4 and 5 has a swath width of approximately
185 × 170 km from a height of approximately 900 km for Landsats 1, 2,
and 3, and 705 km for Landsats 4 and 5. MSS data is widely used for
general geologic studies as well as vegetation inventories.
The spatial resolution of MSS data is 56 × 79 m, with a 79 × 79 m IFOV
(instantaneous field of view). A typical scene contains approximately
2340 rows and 3240 columns. The radiometric resolution is 6-bit, but it
is stored as 8-bit (Lillesand and Kiefer 1987).
Detectors record electromagnetic radiation (EMR) in four bands:
Bands 1 and 2: These bands are in the visible portion of the spectrum and are
useful in detecting cultural features, such as roads. These bands
also show detail in water.
Bands 3 and 4: These bands are in the near-infrared portion of the spectrum
and are useful in land/water and vegetation discrimination.
Bands 4, 3, and 2: Create a false color composite. False color
composites appear similar to an infrared photograph where objects
do not have the same colors or contrasts as they would naturally.
For example, in an infrared image, vegetation appears red, water
appears navy or black, and so on.
Bands 5, 4, and 2: Create a pseudo color composite. (A thematic
image is also a pseudo color image.) In pseudo color, the colors do
not reflect the features in natural colors. For example, roads might
be red, water yellow, and vegetation blue.
You can use different color schemes to enhance the features under
study. These are by no means all of the useful combinations of these
bands. The particular applications determine the bands to use.
TM The TM scanner is a multispectral scanning system much like MSS,
except the TM sensor records reflected/emitted electromagnetic energy
from the visible, reflective-infrared, middle-infrared, and thermal-
infrared regions of the spectrum. TM has higher spatial, spectral, and
radiometric resolution than MSS.
TM has a swath width of approximately 185 km from a height of
approximately 705 km. It is useful for vegetation type and health
determination, soil moisture, snow and cloud differentiation, rock type
discrimination, and so on.
The spatial resolution of TM is 28.5 × 28.5 m for all bands except the
thermal (band 6), which has a spatial resolution of 120 × 120 m. The
larger pixel size of this band is necessary for adequate signal strength.
However, the thermal band is resampled to 28.5 × 28.5 m to match the
other bands. The radiometric resolution is 8-bit, meaning that each pixel
has a possible range of data values from 0 to 255.
Detectors record EMR in seven bands:
Bands 1, 2, and 3: These bands are in the visible portion of the spectrum and
are used in detecting cultural features such as roads. These bands
also show detail in water.
Bands 4, 5, and 7: These bands are in the reflective-infrared portion of the
spectrum and are used in land/water discrimination.
Band 6: This band is in the thermal portion of the spectrum and is used for
thermal mapping (Jensen 1996; Lillesand and Kiefer 1987).
Table 11: TM Bands and Wavelengths

Band      Wavelength (Microns)  Comments
1, Blue   0.45 to 0.52 μm       Differentiates between soil and vegetation, forest type mapping, and detecting cultural features for mapping coastal water areas.
2, Green  0.52 to 0.60 μm       Corresponds to the green reflectance of healthy vegetation. Also useful for cultural feature identification.
3, Red    0.63 to 0.69 μm       Differentiates between many plant species. It is also useful for determining soil boundary and geological boundary delineations as well as cultural features.
4, NIR    0.76 to 0.90 μm       Indicates the amount of vegetation biomass present in a scene. It is useful for crop identification and emphasizes soil/crop and land/water contrasts.
5, MIR    1.55 to 1.75 μm       Detects the amount of water in plants, which is useful in crop drought studies and in plant health analyses. This is also one of the few bands that can discriminate between clouds, snow, and ice.
6, TIR    10.40 to 12.50 μm     Detects vegetation and crop stress, heat intensity, insecticide applications, and thermal pollution. It is also used to locate geothermal activity.
7, MIR    2.08 to 2.35 μm       Discriminates between geologic rock type and soil boundaries, as well as soil and vegetation moisture content.
Figure 54: Landsat MSS versus Landsat TM
[Figure: Landsat MSS, 1 pixel = 57 m x 79 m, radiometric resolution 0-127; Landsat TM, 1 pixel = 30 m x 30 m, 7 bands, radiometric resolution 0-255]
Band Combinations for
Displaying TM Data
You can display different combinations of TM bands to create different
composite effects. The order of the bands corresponds to the RGB
color guns of the monitor. The following combinations are commonly
used to display images:
Bands 3, 2, 1: Create a true color composite. True color means
that objects look as they would to the naked eye, similar to a color
photograph.
Bands 4, 3, 2: Create a false color composite. False color
composites appear similar to an infrared photograph where objects
do not have the same colors or contrasts as they would naturally.
For example, in an infrared image, vegetation appears red, water
appears navy or black, and so on.
Bands 5, 4, 2: Create a pseudo color composite. (A thematic
image is also a pseudo color image.) In pseudo color, the colors do
not reflect the features in natural colors. For example, roads may be
red, water yellow, and vegetation blue.
You can use different color schemes to bring out or enhance the
features under study. These are by no means all of the useful
combinations of the seven bands. The application determines the
bands to use.
Landsat 7 The Landsat 7 satellite, launched in 1999, uses Enhanced Thematic
Mapper Plus (ETM+) to observe the Earth. The capabilities new to
Landsat 7 include the following:
15 m spatial resolution panchromatic band
5% radiometric calibration with full aperture
60 m spatial resolution thermal IR channel
The primary receiving station for Landsat 7 data is located in Sioux
Falls, South Dakota at the USGS EROS Data Center (EDC). ETM+
data is transmitted using X-band direct downlink at a rate of 150 Mbps.
Landsat 7 is capable of capturing scenes without cloud obstruction, and
the receiving stations can obtain this data in real time using the X-band.
Stations located around the globe, however, only receive data for the
portion of the ETM+ ground track where the receiving station can see
the satellite.
Landsat 7 Data Types One type of data available from Landsat 7 is browse data. Browse data
is a lower resolution image for determining image location, quality and
information content. Another type of data is metadata, which is
descriptive information on the image. This information is available via
the internet within 24 hours of being received by the primary ground
station. Moreover, EDC processes the data to Level 0r. This data is
corrected for scan direction and band alignment errors only. Level 1G
data, which is corrected, is also available.
Landsat 7 Specifications Information about the spectral range and ground resolution of the bands
of the Landsat 7 satellite is provided in the following table:
Table 12: Landsat 7 Characteristics

Band Number       Wavelength (Microns)  Resolution (m)
1                 0.45 to 0.52 μm       30
2                 0.52 to 0.60 μm       30
3                 0.63 to 0.69 μm       30
4                 0.76 to 0.90 μm       30
5                 1.55 to 1.75 μm       30
6                 10.4 to 12.5 μm       60
7                 2.08 to 2.35 μm       30
Panchromatic (8)  0.50 to 0.90 μm       15
Landsat 7 has a swath width of 185 kilometers. The repeat coverage
interval is 16 days, or 233 orbits. The satellite orbits the Earth at 705
kilometers.
The Landsat Properties
Dialog Box
The Landsat Properties dialog in GeoCorrection Properties has the
General, Links, and Elevation tabs already discussed in this chapter. It
also has a Parameters tab as discussed in the next section.
Parameters Tab The Landsat Properties Parameters tab groups the settings for
specifying the type of sensor used to capture your data, the Scene
Coverage (if you choose Quarter Scene you also choose the quadrant),
the Number of Iterations, and the Background.
The Parameters tab is shown below.
Figure 55: Landsat Properties Parameters Tab
Glossary
abstract symbol
An annotation symbol that has a geometric shape, such as a
circle, square, or triangle. These symbols often represent
amounts that vary from place to place, such as population
density, yearly rainfall, and so on.
accuracy assessment
The comparison of a classification to geographical data that is
assumed to be true. Usually, the assumed true data is derived
from ground truthing.
American Standard Code for Information Interchange
A basis of character sets...to convey some control codes,
space, numbers, most basic punctuation, and unaccented
letters a-z and A-Z.
analysis mask
An option that uses a raster dataset in which all cells of interest
have a value and all other cells have no data. Analysis mask
lets you perform analysis on a selected set of cells.
ancillary data
The data, other than remotely sensed data, that is used to aid
in the classification process.
annotation
The explanatory material accompanying an image or a map.
Annotation can consist of lines, text, polygons, ellipses,
rectangles, legends, scale bars, and any symbol that denotes
geographical features.
AOI
See area of interest.
a priori
Already or previously known.
area
A measurement of a surface.
area of interest
A point, line, or polygon that is selected as a training sample or
as the image area to use in an operation.
ASCII
See American Standard Code for Information Interchange.
aspect
The orientation, or the direction that a surface faces, with
respect to the directions of the compass: north, south, east,
west.
attribute
The tabular information associated with a raster or vector layer.
average
The statistical mean; the sum of a set of values divided by the
number of values in the set.
band
A set of data file values for a specific portion of the
electromagnetic spectrum of reflected light or emitted heat (red,
green, blue, near-infrared, infrared, thermal, and so on) or
some other user-defined information created by combining or
enhancing the original bands, or creating new bands from other
sources. Sometimes called channel.
bilinear interpolation
Uses the data file values of four pixels in a 2 × 2 window to
calculate an output value with a bilinear function.
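A minimal Python sketch of the calculation, where fx and fy are the fractional offsets of the output location within the 2 × 2 window:

    # Bilinear interpolation of four data file values.
    def bilinear(window, fx, fy):
        (v00, v01), (v10, v11) = window
        top = v00 + fx * (v01 - v00)
        bottom = v10 + fx * (v11 - v10)
        return top + fy * (bottom - top)

    print(bilinear([[10.0, 20.0], [30.0, 40.0]], 0.5, 0.5))  # 25.0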
bin function
A mathematical function that establishes the relationship
between data file values and rows in a descriptor table.
bins
Ordered sets of pixels. Pixels are sorted into a specified
number of bins. The pixels are then given new values based
upon the bins to which they are assigned.
border
On a map, a line that usually encloses the entire map, not just
the image area as does a neatline.
boundary
A neighborhood analysis technique that is used to detect
boundaries between thematic classes.
brightness value
The quantity of a primary color (red, green, blue) for a pixel on
the display device. Also called intensity value, function memory
value, pixel value, display value, and screen value.
buffer zone
A specific area around a feature that is isolated for or from
further analysis. For example, buffer zones are often generated
around streams in site assessment studies so that further
analyses exclude these areas that are often unsuitable for
development.
Cartesian
A coordinate system in which data is organized on a grid and
points on the grid are referenced by their X,Y coordinates.
camera properties
Camera properties are for the orthorectification of any image
that uses a camera for its sensor. The model is derived by
space resection based on collinearity equations. The elevation
information is required in the model for removing relief
displacement.
categorize
The process of choosing distinct classes to divide your image
into.
cell
1. A 1° × 1° area of coverage. Digital terrain elevation data
(DTED) is distributed in cells. 2. A pixel; grid cell.
cell size
The area that one pixel represents, measured in map units. For
example, one cell in the image may represent an area 30 × 30
on the ground. Sometimes called the pixel size.
checkpoint analysis
The act of using checkpoints to independently verify the degree
of accuracy of a triangulation.
circumcircle
A triangle's circumscribed circle; the circle that passes through
each of the triangle's three vertices.
class
A set of pixels in a GIS file that represents areas that share
some condition. Classes are usually formed through
classification of a continuous raster layer.
class value
A data file value of a thematic file that identifies a pixel as
belonging to a particular class.
classification
The process of assigning the pixels of a continuous raster
image to discrete categories.
classification accuracy table
For accuracy assessment, a list of known values of reference
pixels, supported by some ground truth or other a priori
knowledge of the true class, and a list of the classified values of
the same pixels, from a classified file to be tested.
classification scheme (or classification system)
A set of target classes. The purpose of such a scheme is to
provide a framework for organizing and categorizing the
information that is extracted from the data.
clustering
Unsupervised training; the process of generating signatures
based on the natural groupings of pixels in image data when
they are plotted in spectral space.
clusters
The natural groupings of pixels when plotted in spectral space.
coefficient
One number in a matrix, or a constant in a polynomial
expression.
collinearity
A nonlinear mathematical model that photogrammetric
triangulation is based upon. Collinearity equations describe the
relationship among image coordinates, ground coordinates,
and orientation parameters.
contiguity analysis
A study of the ways in which pixels of a class are grouped
together spatially. Groups of contiguous pixels in the same
class, called raster regions, or clumps, can be identified by their
sizes and manipulated.
continuous
A term used to describe raster data layers that contain
quantitative and related values. See also continuous data.
continuous data
A type of raster data that is quantitative (measuring a
characteristic) and has related, continuous values, such as
remotely sensed images (Landsat, SPOT, and so on).
contrast stretch
The process of reassigning a range of values to another range,
usually according to a linear function. Contrast stretching is
often used in displaying continuous raster layers because the
range of data file values is usually much narrower than the
range of brightness values on the display device.
convolution filtering
The process of averaging small sets of pixels across an image.
Used to change the spatial frequency characteristics of an
image.
convolution kernel
A matrix of numbers that is used to average the value of each
pixel with the values of surrounding pixels in a particular way.
The numbers in the matrix serve to weight this average towards
particular pixels.
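A minimal Python sketch of convolution filtering with an equal-weight smoothing kernel:

    import numpy as np
    from scipy.ndimage import convolve

    # Average each pixel with its neighbors in a 3 x 3 window.
    kernel = np.ones((3, 3)) / 9.0
    image = np.arange(25, dtype=float).reshape(5, 5)
    print(convolve(image, kernel, mode="nearest"))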
coordinate system
A method of expressing location. In two-dimensional
coordinate systems, locations are expressed by a column and
row, also called X and Y.
correlation threshold
A value used in rectification to determine whether to accept or
discard GCPs. The threshold is an absolute value threshold
ranging from 0.000 to 1.000.
correlation windows
Windows that consist of a local neighborhood of pixels.
corresponding GCPs
The GCPs that are located in the same geographic location as
the selected GCPs, but are selected in different files.
covariance
Measures the tendencies of data file values for the same pixel,
but in different bands, to vary with each other in relation to the
means of their respective bands. These bands must be linear.
Covariance is defined as the average product of the differences
between the data file values in each band and the mean of
each band.
covariance matrix
A square matrix that contains all of the variances and
covariances within the bands in a data file.
cubic convolution
Uses the data file values of sixteen pixels in a 4 × 4 window to
calculate an output value with a cubic function.
data
1. In the context of remote sensing, a computer file containing
numbers that represent a remotely sensed image, and can be
processed to display that image. 2. A collection of numbers,
strings, or facts that requires some processing before it is
meaningful.
database
A relational data structure usually used to store tabular
information. Examples of popular databases include SYBASE,
dBASE, Oracle, and INFO.
data file
A computer file that contains numbers that represent an image.
data file value
Each number in an image file. Also called file value, image file
value, DN, brightness value, pixel.
decision rule
An equation or algorithm that is used to classify image data
after signatures are created. The decision rule is used to
process the data file values based on the signature statistics.
DEM
See digital elevation model.
density
A neighborhood analysis technique that displays the number of
pixels that have the same value as the analyzed pixel in a user-
specified window.
digital elevation model
Continuous raster layers in which data file values represent
elevation. DEMs are available from the USGS at 1:24,000 and
1:250,000 scale, and can be produced with terrain analysis
programs.
digital terrain model
A discrete expression of topography in a data array, consisting
of a group of planimetric coordinates (X,Y) and the elevations
of the ground points and breaklines.
dimensionality
In classification, dimensionality refers to the number of layers
being classified. For example, a data file with three layers is
said to be three-dimensional.
divergence
A statistical measure of distance between two or more
signatures. Divergence can be calculated for any combination
of bands used in the classification; bands that diminish the
results of the classification can be ruled out.
diversity
A neighborhood analysis technique that displays the number of
different values in a user-specified window.
DTM
See digital terrain model.
edge detector
A convolution kernel, which is usually a zero sum kernel, that
smooths out or zeros out areas of low spatial frequency and
creates a sharp contrast where spatial frequency is high. High
spatial frequency is at the edges between homogeneous
groups of pixels.
edge enhancer
A high-frequency convolution kernel that brings out the edges
between homogeneous groups of pixels. Unlike an edge
detector, it only highlights edges; it does not eliminate other
features.
enhancement
The process of making an image more interpretable for a
particular application. Enhancement can make important
features of raw, remotely sensed data more interpretable to the
human eye.
extension
The three letters after the period in a file name that usually
identify the type of file.
extent
1. The image area to display in a view. 2. The area of the
Earths surface to map.
feature collection
The process of identifying, delineating, and labeling various
types of natural and human-made phenomena from remotely
sensed images.
feature extraction
The process of studying and locating areas and objects on the
ground and deriving useful information from images.
feature space
An abstract space that is defined by spectral units (such as an
amount of electromagnetic radiation).
fiducial center
The center of an aerial photo.
fiducials
Four or eight reference markers fixed on the frame of an aerial
metric camera and visible in each exposure that are used to
compute the transformation from data file to image coordinates.
file coordinates
The location of a pixel within the file in X,Y coordinates. The
upper-left file coordinate is usually 0,0.
filtering
The removal of spatial or spectral features for data
enhancement. Convolution filtering is one method of spatial
filtering. Some texts use the terms filtering and spatial filtering
synonymously.
focal
The process of performing one of several analyses on data
values in an image file, using a process similar to convolution
filtering.
GCP
See ground control point.
GCP matching
For image-to-image rectification, a GCP selected in one image
is precisely matched to its counterpart in the other image using
the spectral characteristics of the data and the transformation
matrix.
geocorrection
The process of rectifying remotely sensed data that has
distortions due to a sensor or the curvature of the Earth.
geographic information system
A unique system designed for a particular application that
stores, enhances, combines, and analyzes layers of
geographic data to produce interpretable information. A GIS
might include computer images, hardcopy maps, statistical
data, and any other data needed for a study, as well as
computer software and human knowledge. GISs are used for
solving complex geographic planning and management
problems.
georeferencing
The process of assigning map coordinates to image data and
resampling the pixels of the image to conform to the map
projection grid.
GIS
See geographic information system.
ground control point
Specific pixel in image data for which the output map
coordinates (or other output coordinates) are known. GCPs are
used for computing a transformation matrix, for use in rectifying
an image.
high-frequency kernel
A convolution kernel that increases the spatial frequency of an
image. Also called a high-pass kernel.
histogram
A graph of data distribution, or a chart of the number of pixels
that have each possible data file value. For a single band of
data, the horizontal axis of a histogram graph is the range of all
possible data file values. The vertical axis is a measure of
pixels that have each data value.
histogram equalization
The process of redistributing pixel values so that there are
approximately the same number of pixels with each value
within a range. The result is a nearly flat histogram.
histogram matching
The process of determining a lookup table that converts the
histogram of one band of an image or one color gun to
resemble another histogram.
hue
A component of intensity, hue, saturation that is representative
of the color or dominant wavelength of the pixel. It varies from 0
to 360. Blue = 0 (and 360), magenta = 60, red = 120, yellow =
180, green = 240, and cyan = 300.
IKONOS properties
Use the IKONOS Properties dialog to perform orthorectification
on images gathered with the IKONOS satellite. The IKONOS
satellite orbits at an altitude of 423 miles, or 681 kilometers.
The revisit time is 2.9 days at 1 m resolution, and 1.5 days at
1.5 m resolution.
image data
Digital representations of the Earth that can be used in
computer image processing and GIS analyses.
image file
A file containing raster image data.
image matching
The automatic acquisition of corresponding image points on the
overlapping area of two images.
image processing
The manipulation of digital image data, including (but not
limited to) enhancement, classification, and rectification
operations.
indices
The process used to create output images by mathematically
combining the DN values of different bands.
infrared
Infrared portion of the electromagnetic spectrum.
IR
See infrared.
island polygons
When using the Seed tool, island polygons represent areas in
the polygon that have different characteristics from the areas in
the larger polygon. You have the option of using the island
polygons feature or turning it off when using the Seed tool.
ISODATA
See Iterative Self-Organizing Data Analysis Technique.
Iterative Self-Organizing Data Analysis Technique
A method of clustering that uses spectral distance as in the
sequential method, but iteratively classifies the pixels,
redefines the criteria for each class, and classifies again so that
the spectral distance patterns in the data gradually emerge.
Landsat
A series of Earth-orbiting satellites that gather MSS and TM
imagery operated by EOSAT.
layer
1. A band or channel of data. 2. A single band or set of three
bands displayed using the red, green, and blue color guns. 3. A
component of a GIS database that contains all of the data for
one theme. A layer consists of a thematic image file, and may
also include attributes.
linear
A description of a function that can be graphed as a straight
line or a series of lines. Linear equations (transformations) are
generally expressed in the form of the equation of a line or
plane. Also called 1st order.
linear contrast stretch
An enhancement technique that produces new values at
regular intervals.
linear transformation
A 1st order rectification. A linear transformation can change
location in X or Y, scale in X or Y, skew in X or Y, and rotation.
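In general form, a 1st order transformation maps source
coordinates (x, y) to rectified coordinates (x', y') as

    x' = a0 + a1*x + a2*y
    y' = b0 + b1*x + b2*y

where the six coefficients, computed from GCPs, encode the
shifts, scales, skews, and rotation.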
lookup table
An ordered set of numbers that is used to perform a function on
a set of input values. Lookup tables translate data file values
into brightness values to display or print an image.
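To illustrate the concept only (Image Analysis for ArcGIS
applies lookup tables internally), here is a minimal NumPy
sketch using hypothetical values; applying a lookup table
amounts to a simple array indexing operation:

    import numpy as np

    # A hypothetical 8-bit band; every possible data file value
    # (0-255) has one entry in the lookup table.
    band = np.array([[0, 100], [200, 255]], dtype=np.uint8)
    lut = np.arange(256, dtype=np.uint8)[::-1]  # example LUT: invert brightness

    # Translate data file values into brightness values.
    display = lut[band]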
low-frequency kernel
A convolution kernel that decreases spatial frequency. Also
called low-pass kernel.
LUT
See lookup table.
majority
A neighborhood analysis technique that displays the most
common value of the data file values in a user-specified
window.
map projection
A method of representing the three-dimensional spherical
surface of a planet on a two-dimensional map surface. All map
projections involve the transfer of latitude and longitude onto an
easily flattened surface.
maximum
A neighborhood analysis technique that displays the greatest
value of the data file values in a user-specified window.
maximum likelihood
A classification decision rule based on the probability that a
pixel belongs to a particular class. The basic equation assumes
that these probabilities are equal for all classes, and that the
input bands have normal distributions.
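A commonly used form of this rule (assuming normal
distributions) assigns a pixel with measurement vector X to the
class c that maximizes the weighted distance

    D = ln(a_c) - 0.5 * ln|Cov_c| - 0.5 * (X - M_c)' * inv(Cov_c) * (X - M_c)

where M_c and Cov_c are the mean vector and covariance matrix
of class c, and a_c is the prior probability of class c (taken
as equal for all classes unless otherwise specified).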
mean
1. The statistical average; the sum of a set of values divided by
the number of values in the set. 2. A neighborhood analysis
technique that displays the mean value of the data file values in
a user-specified window.
median
1. The central value in a set of data such that an equal number
of values are greater than and less than the median. 2. A
neighborhood analysis technique that displays the median
value of the data file values in a user-specified window.
minimum
A neighborhood analysis technique that displays the least
value of the data file values in a user-specified window.
minimum distance
A classification decision rule that calculates the spectral
distance between the measurement vector for each candidate
pixel and the mean vector for each signature. Also called
spectral distance.
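By way of illustration only (a sketch of the rule, not the
product's implementation), with hypothetical two-band data:

    import numpy as np

    pixels = np.array([[10.0, 20.0], [200.0, 180.0]])  # (num_pixels, bands)
    means = np.array([[12.0, 22.0], [190.0, 185.0]])   # (num_classes, bands)

    # Euclidean spectral distance from each pixel to each class mean vector.
    dist = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)

    # Assign each pixel to the class whose mean is spectrally closest.
    classes = dist.argmin(axis=1)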
minority
A neighborhood analysis technique that displays the least
common value of the data file values in a user-specified
window.
modeling
The process of creating new layers from combining or
operating upon existing layers. Modeling allows the creation of
new classes from existing classes and the creation of a small
set of images, or a single image, that at a glance contains
many types of information about a scene.
mosaicking
The process of piecing together images side-by-side to create
a larger image.
MSS
See multispectral scanner.
multispectral classification
The process of sorting pixels into a finite number of individual
classes, or categories of data, based on data file values in
multiple bands.
multispectral imagery
Satellite imagery with data recorded in two or more bands.
multispectral scanner
Landsat satellite data acquired in four bands with a spatial
resolution of 57 × 79 meters.
nadir
The area on the ground directly beneath a scanner's detectors.
NDVI
See normalized difference vegetation index.
nearest neighbor
A resampling method in which the output data file value is
equal to the input pixel that has coordinates closest to the
retransformed coordinates of the output pixel.
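A minimal NumPy sketch of the idea, using hypothetical
coordinates (not the product's implementation):

    import numpy as np

    src = np.array([[5.0, 6.0], [7.0, 8.0]])   # tiny 2 x 2 input image
    rows = np.array([[0.2, 0.9], [1.4, 1.1]])  # retransformed row coordinates
    cols = np.array([[0.3, 1.2], [0.1, 1.8]])  # retransformed column coordinates

    # Round to the closest input pixel and clip to the image bounds.
    r = np.clip(np.rint(rows).astype(int), 0, src.shape[0] - 1)
    c = np.clip(np.rint(cols).astype(int), 0, src.shape[1] - 1)
    out = src[r, c]  # each output value equals its nearest input value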
neighborhood analysis
Any image processing technique that takes surrounding pixels
into consideration, such as convolution filtering and scanning.
NoData
The value you assign to pixels that you do not want to include
in a classification or function. Pixels assigned NoData are
treated as having no value; any value they hold is understood
to be only a placeholder. Images that georeference to
nonrectangular extents need a NoData concept for display even
if they are not classified.
non-directional
An edge-detection process that uses the Sobel and Prewitt
filters. These filters use orthogonal kernels convolved
separately with the original image and then combined.
nonlinear
Describing a function that cannot be expressed as the graph of
a line or in the form of the equation of a line or plane. Nonlinear
equations usually contain expressions with exponents. Second
order (2nd order) or higher-order equations and
transformations are nonlinear.
nonlinear transformation
A 2nd order or higher rectification.
nonparametric signature
A signature for classification that is based on polygons or
rectangles that are defined in the feature space image for the
image file. There is no statistical basis for a nonparametric
signature; it is an area in a feature space image.
normalized difference vegetation index
The formula for NDVI is (IR - R) / (IR + R), where IR stands for the
infrared portion of the electromagnetic spectrum, and R stands
for the red portion of the electromagnetic spectrum. NDVI finds
areas of vegetation in imagery.
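For illustration, a minimal NumPy sketch with hypothetical band
values; the bands are kept as floats and the denominator is
guarded so that pixels with IR + R = 0 do not cause a division
by zero:

    import numpy as np

    ir = np.array([[120.0, 200.0]])  # infrared band
    red = np.array([[80.0, 60.0]])   # red band

    ndvi = (ir - red) / np.maximum(ir + red, 1e-9)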
observation
In photogrammetric triangulation, a grouping of the image
coordinates for a GCP.
off-nadir
Any point that is not directly beneath a scanner's detectors,
but off at an angle. The SPOT scanner allows off-nadir viewing.
orthorectification
A form of rectification that corrects for terrain displacement and
is used if a DEM of the study area is available.
overlay
1. A function that creates a composite file containing either the
minimum or the maximum class values of the input files.
Overlay sometimes refers generically to a combination of
layers. 2. The process of displaying a classified file over the
original image to inspect the classification.
panchromatic imagery
Single-band or monochrome satellite imagery.
parallelepiped
1. A classification decision rule in which the data file values of
the candidate pixel are compared to upper and lower limits. 2.
The limits of a parallelepiped classification, especially when
graphed as rectangles.
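In rule form (a general sketch of the technique), a candidate
pixel with measurement vector X belongs to class c if, for
every band i, low_c,i <= X_i <= high_c,i, where the low and
high limits define the parallelepiped for that class.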
parameter
1. Any variable that determines the outcome of a function or
operation. 2. The mean and standard deviation of data, which
are sufficient to describe a normal curve.
parametric signature
A signature that is based on statistical parameters (such as
mean and covariance matrix) of the pixels that are in the
training sample or cluster.
pattern recognition
The science and art of finding meaningful patterns in data,
which can be extracted through classification.
PCA
See principal components analysis.
piecewise linear contrast stretch
An enhancement technique used to enhance a specific portion
of data by dividing the lookup table into three sections: low,
middle, and high.
pixel
Abbreviated from picture element; the smallest part of a picture
(image).
pixel depth
The number of bits required to store all of the data file values in
a file. For example, data with a pixel depth of 8, or 8-bit data,
have 256 values ranging from 0-255.
pixel size
The physical dimension of a single light-sensitive element
(13 × 13 microns).
polygon
A set of closed line segments defining an area.
polynomial
A mathematical expression consisting of variables and
coefficients. A coefficient is a constant that is multiplied by a
variable in the expression.
principal components analysis
1. A method of data compression that allows redundant data to
be compressed into fewer bands (Jensen 1996; Faust 1989). 2.
The process of calculating principal components and producing
principal component bands. It allows redundant data to be
compacted into fewer bands (that is, the dimensionality of the
data is reduced).
principal point
The point in the image plane onto which the perspective center
is projected, lying directly beneath the perspective center.
profile
A row of data file values from a DEM or DTED file. The profiles
of DEM and DTED run south to north (the first pixel of the
record is the southernmost pixel).
pushbroom
A scanner in which all scanning parts are fixed, and scanning is
accomplished by the forward motion of the scanner, such as
the SPOT scanner.
QuickBird
The QuickBird model requires the use of rational polynomial
coefficients (RPCs) to describe the relationship between the
image and the Earth's surface at the time of image capture. By
using QuickBird properties, you can perform orthorectification
on images gathered with the QuickBird satellite.
radar data
The remotely sensed data produced when a radar transmitter
emits a beam of microwaves or millimeter waves. The waves
reflect from the surfaces they strike, and the backscattered
radiation is detected by the radar system's receiving antenna,
which is tuned to the frequency of the transmitted waves.
radiometric correction
The correction of variations in data that are not caused by the
object or scene being scanned, such as scanner malfunction
and atmospheric interference.
radiometric enhancement
An enhancement technique that deals with the individual
values of pixels in an image.
radiometric resolution
The dynamic range, or number of possible data file values, in
each band. This is referred to by the number of bits into which
the recorded energy is divided. See also pixel depth.
rank
A neighborhood analysis technique that displays the number of
values in a user-specified window that are less than the
analyzed value.
raster data
Data organized in a grid of rows and columns, in which each
cell (pixel) stores a value representing an area on the Earth's
surface. Both continuous and thematic image layers are raster
data.
rational polynomial coefficients
See RPC properties.
recoding
The assignment of new values to one or more classes.
rectification
The process of making image data conform to a map projection
system. In many cases, the image must also be oriented so
that the north direction corresponds to the top of the image.
rectified coordinates
The coordinates of a pixel in a file that have been rectified,
which are extrapolated from the GCPs. Ideally, the rectified
coordinates for the GCPs are exactly equal to the reference
coordinates. Because there is often some error tolerated in the
rectification, this is not always the case.
red, green, blue
The primary additive colors that are used on most display
hardware to display imagery.
reference coordinates
The coordinates of the map or reference image to which a
source (input) image is being registered. GCPs consist of both
input coordinates and reference coordinates for each point.
reference pixels
In classification accuracy assessment, pixels for which the
correct GIS class is known from ground truth or other data. The
reference pixels can be selected by you, or randomly selected.
reference plane
In a topocentric coordinate system, the tangential plane at the
center of the image on the Earth ellipsoid, on which the three
perpendicular coordinate axes are defined.
reproject
The process of transforming raster image data from one map
projection to another.
resampling
The process of extrapolating data file values for the pixels in a
new grid when data has been rectified or registered to another
image.
resolution
A level of precision in data.
resolution merge
The process of sharpening a lower-resolution multiband image
by merging it with a higher-resolution monochrome image.
RGB
See red, green, blue.
RGB clustering
A clustering method for 24-bit data (three 8-bit bands) that plots
pixels in three-dimensional spectral space and divides that
space into sections that are used to define clusters. The output
color scheme of an RGB-clustered image resembles that of the
input file.
RMSE
See root mean square error.
root mean square error
The distance between the input (source) location of the GCP
and the retransformed location for the same GCP. RMS error is
calculated with a distance equation.
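For a single GCP, the distance equation is

    RMS error = sqrt( (x_r - x_i)^2 + (y_r - y_i)^2 )

where (x_i, y_i) is the input (source) location of the GCP and
(x_r, y_r) is its retransformed location.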
RPC
See rational polynomial coefficients.
RPC properties
The RPC Properties dialog uses rational polynomial coefficients to
describe the relationship between the image and the Earth's
surface at the time of image capture. You can specify the
associated RPC file to use in your geocorrection.
Rubber Sheeting
The application of nonlinear rectification (2nd order or higher).
saturation
A component of IHS that represents the purity of color and also
varies linearly from 0 to 1.
scale
1. The ratio of distance on a map as related to the true distance
on the ground. 2. Cell size. 3. The processing of values through
a lookup table.
scanner
The entire data acquisition system such as the Landsat
scanner or the SPOT panchromatic scanner.
seed tool
An Image Analysis for ArcGIS feature that automatically
generates feature layer polygons of similar spectral value.
shapefile
A vector format that contains spatial data. Shapefiles have the
.shp extension.
signature
A set of statistics that defines a training sample or cluster. The
signature is used in a classification process. Each signature
corresponds to a GIS class that is created from the signatures
with a classification decision rule.
source coordinates
In the rectification process, the input coordinates.
spatial enhancement
The process of modifying the values of pixels in an image
relative to the pixels that surround them.
spatial frequency
The difference between the highest and lowest values of a
contiguous set of pixels.
spatial resolution
A measure of the smallest object that can be resolved by the
sensor, or the area on the ground represented by each pixel.
speckle noise
The light and dark pixel noise that appears in radar data.
spectral distance
The distance in spectral space, computed as Euclidean distance
in n dimensions, where n is the number of bands.
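In equation form:

    D = sqrt( (d_1 - e_1)^2 + (d_2 - e_2)^2 + ... + (d_n - e_n)^2 )

where n is the number of bands and d_i and e_i are the values
of pixels d and e in band i.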
spectral enhancement
The process of modifying the pixels of an image based on the
original values of each pixel, independent of the values of
surrounding pixels.
spectral resolution
A measure of the specific wavelength intervals that a sensor
can record.
spectral space
An abstract space that is defined by spectral units (such as an
amount of electromagnetic radiation). The notion of spectral
space is used to describe enhancement and classification
techniques that compute the spectral distance between n-
dimensional vectors, where n is the number of bands in the
data.
SPOT
SPOT satellite sensors operate in two modes: multispectral
and panchromatic. SPOT is often referred to as the pushbroom
scanner, meaning that all scanning parts are fixed, and
scanning is accomplished by the forward motion of the
scanner. SPOT pushes 3000/6000 sensors along its orbit. This
differs from Landsat, which scans with 16 detectors
perpendicular to its orbit.
standard deviation
1. The square root of the variance of a set of values used as a
measurement of the spread of the values. 2. A neighborhood
analysis technique that displays the standard deviation of the
data file values of a user-specified window.
striping
A data error that occurs if a detector on a scanning system
goes out of adjustment; that is, it provides readings consistently
greater than or less than the other detectors for the same band
over the same ground cover.
subsetting
The process of breaking out a portion of a large image file into
one or more smaller files.
sum
A neighborhood analysis technique that displays the total of the
data file values in a user-specified window.
supervised training
Any method of generating signatures for classification in which
the analyst is directly involved in the pattern recognition
process. Usually, supervised training requires the analyst to
select training samples from the data that represent patterns to
be classified.
summarize areas
A workflow that uses a feature theme corresponding to an area
of interest to summarize change just within that area.
swath width
In a satellite system, the total width of the area on the ground
covered by the scanner.
temporal resolution
The frequency with which a sensor obtains imagery of a
particular area.
terrain analysis
The processing and graphic simulation of elevation data.
terrain data
Elevation data expressed as a series of x, y, and z values that
are either regularly or irregularly spaced.
thematic change
A feature in Image Analysis for ArcGIS that lets you compare
two thematic images of the same area captured at different
times to notice changes in vegetation, urban areas, and so on.
thematic data
Raster data that is qualitative and categorical. Thematic layers
often contain classes of related information, such as land cover,
soil type, slope, and so on.
thematic map
A map illustrating the class characterizations of a particular
spatial variable such as soils, land cover, hydrology, and so on.
thematic mapper
Landsat data acquired in seven bands with a spatial resolution
of 30 × 30 meters.
theme
A particular type of information, such as soil type or land use,
that is represented in a layer.
threshold
A limit, or cutoff point, usually a maximum allowable amount of
error in an analysis. In classification, thresholding is the
process of identifying a maximum distance between a pixel and
the mean of the signature to which it was classified.
TM
See thematic mapper.
training
The process of defining the criteria by which patterns in image
data are recognized for the purpose of classification.
training sample
A set of pixels selected to represent a potential class. Also
called sample.
transformation matrix
A set of coefficients that is computed from GCPs, and used in
polynomial equations to convert coordinates from one system
to another. The size of the matrix depends upon the order of
the transformation.
triangulation
Establishes the geometry of the camera or sensor relative to
objects on the Earth's surface.
true color
A method of displaying an image (usually from a continuous
raster layer) that retains the relationships between data file
values and represents multiple bands with separate color guns.
The image memory values from each displayed band are
translated through the function memory of the corresponding
color gun.
unsupervised training
A computer-automated method of pattern recognition in which
some parameters are specified by the user and are used to
uncover statistical patterns that are inherent in the data.
variable
1. A numeric value that is changeable, usually represented with
a letter. 2. A thematic layer. 3. One band of a multiband image.
4. In models, objects that have been associated with a name
using a declaration statement.
vector data
Data that represents physical forms (elements) such as points,
lines, and polygons. Only the vertices of vector data are stored,
instead of every point that makes up the element.
vegetative indices
Indices computed from spectral bands that produce a grayscale
image clearly highlighting vegetation.
zoom
The process of expanding displayed pixels on an image so they
can be more closely studied. Zooming is similar to
magnification, except that it changes the display only
temporarily, leaving image memory the same.
References
The following references were used in the creation of this book.
Akima, H. 1978. A Method for Bivariate Interpolation and Smooth
Surface Fitting for Irregularly Distributed Data Points. ACM
Transactions on Mathematical Software, Vol. 4, No. 2: 148-159.
Buchanan, M.D. 1979. Effective Utilization of Color in
Multidimensional Data Presentation. Proceedings of the Society
of Photo-Optical Instrumentation Engineers, Vol. 199: 9-19.
Chavez, Pat S., Jr., et al. 1991. Comparison of Three Different
Methods to Merge Multiresolution and Multispectral Data:
Landsat TM and SPOT Panchromatic. Photogrammetric
Engineering & Remote Sensing, Vol. 57, No. 3: 295-303.
Conrac Corp., Conrac Division. 1980. Raster Graphics Handbook.
Covina, California: Conrac Corp.
Daily, Mike. 1983. Hue-Saturation-Intensity Split-Spectrum
Processing of Seasat Radar Imagery. Photogrammetric
Engineering & Remote Sensing, Vol. 49, No. 3: 349-355.
ERDAS 2000. ArcView Image Analysis. Atlanta, Georgia: ERDAS,
Inc.
ERDAS 1999. Field Guide. 5th ed. Atlanta: ERDAS, Inc.
ESRI 1992. Map Projections & Coordinate Management: Concepts
and Procedures. Redlands, California: ESRI, Inc.
Faust, Nickolas L. 1989. Image Enhancement. Volume 20,
Supplement 5 of Encyclopedia of Computer Science and
Technology, edited by Allen Kent and James G. Williams. New
York: Marcel Dekker, Inc.
Gonzalez, Rafael C., and Paul Wintz. 1977. Digital Image
Processing. Reading, Massachusetts: Addison-Wesley
Publishing Company.
Holcomb, Derrold W. 1993. Merging Radar and VIS/IR Imagery.
Paper submitted to the 1993 ERIM Conference, Pasadena,
California.
Hord, R. Michael. 1982. Digital Image Processing of Remotely
Sensed Data. New York: Academic Press.
Jensen, John R., et al. 1983. Urban/Suburban Land Use Analysis.
Chapter 30 in Manual of Remote Sensing, edited by Robert N.
Colwell. Falls Church, Virginia: American Society of
Photogrammetry.
Jensen, John R. 1996. Introductory Digital Image Processing: A
Remote Sensing Perspective. Englewood Cliffs, New Jersey:
Prentice-Hall.
Kloer, Brian R. 1994. Hybrid Parametric/Non-parametric Image
Classification. Paper presented at the ACSM-ASPRS Annual
Convention, April 1994, Reno, Nevada.
Lillesand, Thomas M., and Ralph W. Kiefer. 1987. Remote Sensing
and Image Interpretation. New York: John Wiley & Sons, Inc.
Marble, Duane F. 1990. Geographic Information Systems: An
Overview. Introductory Readings in Geographic Information
Systems, edited by Donna J. Peuquet and Duane F. Marble.
Bristol, Pennsylvania: Taylor & Francis, Inc.
McCoy, Jill, and Kevin Johnston. Using ArcGIS Spatial Analyst.
Redlands, California: ESRI, Inc.
Sabins, Floyd F., Jr. 1987. Remote Sensing Principles and
Interpretation. New York: W. H. Freeman and Co.
Schowengerdt, Robert A. 1983. Techniques for Image Processing
and Classification in Remote Sensing. New York: Academic
Press.
Schowengerdt, Robert A. 1980. Reconstruction of Multispatial,
Multispectral Image Data Using Spatial Frequency Content.
Photogrammetric Engineering & Remote Sensing, Vol. 46, No.
10: 1325-1334.
Star, Jeffrey, and John Estes. 1990. Geographic Information
Systems: An Introduction. Englewood Cliffs, New Jersey:
Prentice-Hall.
Swain, Philip H. 1973. Pattern Recognition: A Basis for Remote
Sensing Data Analysis (LARS Information Note 111572). West
Lafayette, Indiana: The Laboratory for Applications of Remote
Sensing, Purdue University.
Swain, Philip H., and Shirley M. Davis. 1978. Remote Sensing: The
Quantitative Approach. New York: McGraw Hill Book Company.
Tou, Julius T., and Rafael C. Gonzalez. 1974. Pattern Recognition
Principles. Reading, Massachusetts: Addison-Wesley
Publishing Company.
Tucker, Compton J. 1979. Red and Photographic Infrared Linear
Combinations for Monitoring Vegetation. Remote Sensing of
Environment, Vol. 8: 127-150.
Walker, Terri C., and Richard K. Miller. 1990. Geographic
Information Systems: An Assessment of Technology,
Applications, and Products. Madison, Georgia: SEAI Technical
Publications.
Watson, David, 1994, Contouring: A Guide to the Analysis and
Display of Spatial Data, Elsevier Science, New York.
Welch, R., and W. Ehlers. 1987. Merging Multiresolution SPOT
HRV and Landsat TM Data. Photogrammetric Engineering &
Remote Sensing, Vol. 53, No. 3: 301-303.
Index
A
A priori
defined 207
Abstract symbol
defined 207
Accuracy assessment
defined 207
Advantages
bilinear interpolation 63
cubic convolution 65
nearest neighbor 64
American Standard Code for Information Interchange
defined 207
Analysis mask
defined 207
Ancillary data
defined 207
Annotation
defined 207
AOI
defined 207
Applying
data tools 51
GeoCorrection tools 167
Spectral enhancement 111
Arbitrary Affine
formulas 200
Area
defined 207
Area of interest
defined 207
ASCII
defined 208
Aspect
defined 208
Attribute
defined 208
Average
defined 208
B
Band
defined 208
Bilinear interpolation
advantages and disadvantages 63
defined 208
Options dialog preferences 63
Bin function
defined 208
Bins
defined 208
Border
defined 208
Boundary
defined 208
Brightness inversion
overview 107
Brightness value
defined 208
Brovey Transform 94
Buffer zone
defined 209
C
Camera
overview 191
properties dialog
Camera tab 192
Fiducials tab 193
Orientation tab 191
overview 191
Camera imagery
orthorectification 43
Camera Model
tutorial 43
Camera properties
defined 209
Cartesian
defined 209
Categorize
defined 209
Cell
defined 209
Cell size
defined 209
Options dialog 61
workflow 68
Checkpoint analysis
defined 209
overview 190
Chipping parameters
offset 199
scale 199
Circumcircle
defined 209
Class
defined 209
Class value
defined 209
Classification
decision rules 157
Mahalanobis distance 159
maximum likelihood 158
minimum distance 157
nonparametric 149, 157
overview 149
Parallelepiped 160
parametric 149, 157
defined 210
enhanced data 151
limiting dimensions 151
nonparametric decision rule 149
overview 147
parametric decision rule 149
process 148
rectification 170
scheme 150
signatures 149
supervised training 148
supervised vs. unsupervised 151
tips 150
training 148
unsupervised training 148
Classification accuracy table
defined 210
Classification scheme
defined 210
Clustering
defined 210
Clusters
defined 210
initial cluster means 153
ISODATA 152
overview 152
Coefficient
defined 210
Collinearity
defined 210
Color IR to Natural Color
overview 120
Contacting
ERDAS 10
ESRI 10
Contiguity analysis
defined 210
Continuous
defined 210
Continuous data
defined 210
Contrast stretch
defined 211
for display 99
linear 98
nonlinear 98
overview 98
piecewise linear 98
varying it 99
Conversion
overview 162
using 161
Converting
features to raster 165
raster to features 162
Convolution
example 84
filtering 84
formula 85
overview 84
using 89
workflow 88
Convolution filtering
applying 84
defined 211
Convolution kernel
defined 211
Coordinate system
defined 211
Correlation threshold
defined 211
Correlation windows
defined 211
Corresponding GCPs
defined 211
Covariance
defined 211
Covariance matrix
defined 211
Create new image
overview 72
workflow 73
Creating a shapefile
tutorial 19, 20
Cubic convolution
advantages and disadvantages 65
defined 212
Options dialog preferences 63
D
Data
defined 212
Data file
defined 212
Data file value
defined 212
Data preparation
using 71
Data tools
applying 51
Data versus information 124
Database
defined 212
updating 4
Decision rule
classification 157
defined 212
Mahalanobis distance 159
maximum likelihood 158
minimum distance 157
nonparametric 149
overview 149
Parallelepiped 160
parametric 149, 157
Decision rules
nonparametric 157
DEM
defined 212
Density
defined 212
Digital elevation model
defined 212
Digital terrain model
defined 212
Dimensionality
defined 213
Dimensions
limiting for classification 151
Disadvantages
bilinear interpolation 63
cubic convolution 65
nearest neighbor 64
Divergence
defined 213
Diversity
defined 213
DTM
defined 213
E
Edge detector
defined 213
Edge enhancer
defined 213
Education solutions
ERDAS 10
ESRI 10
Effects of order
polynomial equation 182
Enhanced data
classifying 151
Enhancement
defined 213
linear 98
nonlinear 98
radiometric 97
Environmental hazards
identifying 8
mapping 8
ERDAS
contacting 10
education solutions 10
ESRI
contacting 10
education solutions 10
Extension
defined 213
Extent 59
defined 213
workflow 67
Extent tab
Options dialog 60
F
Feature collection
defined 214
Feature extraction
defined 214
Feature space
defined 214
Features to raster
converting 165
workflow 166
Fiducial center
defined 214
Fiducials
defined 214
File coordinates
defined 214
Filtering
defined 214
Focal
defined 214
Focal analysis
overview 92
using 94
workflow 93
G
GCP
defined 214
GCP matching
defined 214
GCPs
entering coordinates 170
minimum number 187
overview 169
General tab
Options dialog 59
workflow 66
GeoCorrection
defined 214
overview 171
properties dialog
Elevation tab 174
General tab 171
Links tab 172
overview 171
Geocorrection
tutorial 43
GeoCorrection tools
applying 167
Geographic database
updating 4
Geographic information system
defined 215
overview 123
Geoprocessing
specifying options 69
tools 69
Geoprocessing models
updating 69
Georeferencing
defined 215
only 169
overview 169
GIS
defined 123, 215
GIS analysis
performing 123
Ground control point
defined 215
Ground control points
overview 169
H
Help
accessing for Image Analysis 10
High-frequency kernel
defined 215
overview 87
High-order polynomials 182
nonlinear 182
Histogram
defined 215
Histogram equalization
defined 215
effect on contrast 104
formula 103
overview 101
tutorial 14
using 105
workflow 104
Histogram matching
defined 215
overview 105
using 107
workflow 106
Hue
defined 112, 215
I
IHS to RGB
overview 115
using 117
workflow 116
IKONOS
overview 195
properties dialog
Chipping tab 198
Parameters tab 197
IKONOS properties
defined 216
overview 195
Image algebra 118
Image Analysis for ArcGIS
getting help 10
Getting Started 12
performing tasks 4
quick-start tutorial 11
Image Analysis options
changes for ArcGIS 69
Image data
defined 216
Image Difference
tutorial 24
workflow 142
Image difference
overview 140
Image file
defined 216
Image info
overview 57
Image Info dialog
workflow 58
Image matching
defined 216
Image processing
defined 216
Indices
defined 216
Information versus data 124
Infrared
defined 216
Initial cluster means 153
Intensity
defined 112
IR
defined 216
Island polygons
defined 216
overview 53
ISODATA
clustering 152
defined 216
Iterative Self-Organizing Data Analysis Technique
defined 216
K
Kernels
high-frequency 87
zero sum 86
L
Land cover
categorizing 5
Landsat
bands and wavelengths 201
defined 217
MSS 201
number of satellites 200
overview 200
properties dialog
Parameters tab 205
Landsat 7
data types 204
satellite 204
specifications 204
Layer
defined 217
Layer Stack
overview 143
workflow 144
Linear
defined 217
Linear contrast stretch 98
defined 217
Linear transformation
defined 217
Polynomial transformation 180
Rubber Sheeting 189
Lookup table
defined 217
Low-frequency kernel
defined 217
example 87
LUT
defined 217
LUT Stretch
overview 98
using 101
workflow 100
M
Mahalanobis distance rules
classification 159
Majority
defined 217
Map projection
defined 217
Maximum
defined 218
Maximum likelihood
defined 218
Maximum likelihood rules
classification 158
Mean
defined 218
Median
defined 218
Minimum
defined 218
GCPs 187
Minimum distance
defined 218
Minimum distance rules
classification 157
Minority
defined 218
Modeling
defined 218
Mosaicking
defined 219
overview 78
tutorial 38
workflow 80
MSS
defined 219
Landsat 201
Multispectral classification
defined 219
Multispectral imagery
defined 219
Multispectral scanner
defined 219
N
Nadir
defined 219
National Imagery Transmission Format Standard 196
Natural hazard damage
identifying 6
summarizing 6
NDVI
defined 219
Nearest neighbor
advantages and disadvantages 64
defined 219
Options dialog preferences 63
Neighborhood analysis
defined 219
density 125
diversity 125
majority 125
maximum 125
minimum 125
minority 125
overview 125
rank 125
sum 126
workflow 126
NITF 196
NoData
defined 219
overview 57
Non-directional
defined 219
Non-directional edge
overview 89
using 91
workflow 91
Nonlinear
defined 220
Nonlinear contrast stretch 98
Nonlinear transformation
defined 220
Polynomial transformation 181
Rubber Sheeting 190
Nonparametric decision rule 149
Parallelepiped 157
Nonparametric signature
defined 220
Normalized difference vegetation index
defined 220
O
Observation
defined 220
Off-nadir
defined 220
Offset
chipping parameters 199
Options dialog
Cell Size tab workflow 68
Extent tab workflow 67
General tab workflow 66
overview 59
Preferences tab 68
Orthorectification
defined 220
overview 167
rectification 171
tutorial 43
Overlay
defined 220
P
Panchromatic
SPOT 176
Panchromatic imagery
defined 220
Parallelepiped
defined 220
rules classification 160
Parameter
defined 221
Parametric decision rules
Mahalanobis distance 157
Maximum likelihood 157
minimum distance 157
Parametric rule
overview 149
Parametric signature
defined 221
Pattern recognition
defined 221
PCA
defined 221
Performing GIS analysis 123
Piecewise linear contrast stretch 98
defined 221
Pixel
defined 221
Pixel depth
defined 221
Pixel size
defined 221
Placing links
tutorial 48
Polygon
defined 221
Polynomial
defined 221
Polynomial equation
effects of order 182
Polynomial transformation
effects of order 182
linear 180
nonlinear 181
overview 179
transformation matrix 180
Preferences
Options dialog 62, 68
Principal components analysis
defined 222
Principal point
defined 222
Profile
defined 222
Pushbroom
defined 222
scanner 176
Q
Questions about Image Analysis
finding answers 9
QuickBird
defined 222
overview 195, 196
properties dialog
Chipping tab 198
Parameters tab 197
Quick-start tutorial
Image Analysis for ArcGIS 11
R
Radar data
defined 222
Radiometric correction
defined 222
Radiometric Enhancement
about 97
Radiometric enhancement
defined 222
Radiometric resolution
defined 223
Rank
defined 223
Raster data
defined 223
Raster tab 65
Raster to feature
converting 162
workflow 164
Rational polynomial coefficients
defined 223
Recode
by class name workflow 133
by symbology workflow 135
previously grouped image workflow 137
Recoding
defined 223
Rectification
classification 170
disadvantages 169
georeferencing 169
georeferencing only 169
orthorectification 171
overview 168
RMS error 170
thematic files 171
triangle-based 189
Red, green, blue
defined 223
Reference coordinates
defined 223
Reference pixels
defined 223
Reference plane
defined 224
Reproject
defined 224
Reproject image
overview 81
workflow 81
Resampling
bilinear interpolation 63
cubic convolution 63
defined 224
nearest neighbor 63
Rescale Image
overview 145
workflow 146
Resolution
defined 224
Resolution merge
Brovey Transform 94
defined 224
overview 94
using 96
workflow 95
RGB
defined 224
RGB clustering
defined 224
RGB to IHS
overview 112
using 114
workflow 114
RMS error
overview 170
tolerance 170
Root mean square error
defined 224
RPC
defined 224
overview 196
properties dialog
Chipping tab 198
Parameters tab 197
RPC properties
defined 224
overview 195
RMSE
defined 224
tolerance of 170
Rubber Sheeting
checkpoint analysis 190
defined 225
Linear transformation 189
nonlinear transformation 190
overview 189
triangle-based rectification 189
triangulation 189
S
Satellites
IKONOS 195
Landsat 1-5 200
Landsat 7 204
QuickBird 196
SPOT 176
SPOT 4 178
SPOT Panchromatic 176
SPOT XS 177
Saturation
defined 112, 225
Scale
chipping parameters 199
defined 225
Scanner
defined 225
panchromatic 177
pushbroom 176
SPOT 176
Scheme
classification 150
Seed Radius
overview 53
workflow 56
Seed tool
controlling 52
defined 225
properties overview 52
workflow 53
Shadow
enhancing 98
Shapefile
defined 225
Signature
defined 225
overview 149
Sites
characterizing 5
Source coordinates
defined 225
Spatial enhancement
defined 225
overview 83
Spatial frequency
defined 225
Spatial resolution
defined 225
Speckle noise
defined 225
Spectral distance
defined 226
Spectral enhancement
applying 111
defined 226
Spectral resolution
defined 226
Spectral space
defined 226
SPOT
defined 226
Panchromatic 176
pushbroom scanner 176
satellite overview 176
workflow 178
XS 177
SPOT 4 satellite 178
Standard deviation
defined 226
Stereoscopic
imagery 177
pairs 177
Stretch
linear 98
nonlinear 98
Subset Image
overview 74
Subset image spectrally
workflow 77
Subsetting
defined 226
Subsetting an image spatially
workflow 77
Sum
defined 227
Summarize areas
overview 129
workflow 130
Supervised classification
overview 155
workflow 156
Supervised training
defined 227
overview 148
Supervised vs. unsupervised classification 151
Swath width
defined 227
T
Tasks
performing in Image Analysis 4
Temporal resolution
defined 227
Terrain analysis
defined 227
Terrain data
defined 227
Thematic change
defined 227
overview 127
tutorial 27
workflow 128
Thematic data
defined 227
Thematic files
orthorectification 171
rectification 171
Thematic map
defined 227
Thematic mapper
defined 227
Theme
defined 228
Threshold
defined 228
Tips
classification 150
TM
defined 228
displaying data in bands 203
overview 201
Tools
applying GeoCorrection 167
Training
classification 148
defined 228
signatures 149
supervised 148
unsupervised 148
Training sample
defined 228
Transformation matrix
defined 228
Polynomial transformation 180
Transformations
high-order polynomials 182
linear 180, 189
nonlinear 181, 190
Triangle-based rectification 189
Triangulation
defined 228
Rubber Sheeting 189
True color
defined 228
Tutorial
exercises
adding images 14
applying histogram stretch 14
finding areas of change 24
getting started 12
identifying similar areas 19
mosaicking images 38
orthorectification of camera imagery 43
quick-start 11
U
Unsupervised Classification
tutorial 29
Unsupervised classification
clusters 152
ISODATA clustering 152
overview 152
percentage unchanged 154
pixel analysis 153
workflow 155
Unsupervised training
defined 228
overview 148
Unsupervised vs. supervised classification 151
Urban growth
identifying changes 7
Using
conversion 161
convolution 89
data preparation 71
focal analysis 94
non-directional edge 91
Resolution merge 96
utilities 139
Utilities
Image Difference 140
Layer Stack 143
Rescale Image 145
using 139
V
Variable
defined 228
Vector data
defined 229
Vegetative indices
applications 117
defined 229
examples 117
overview 117
Vegetative stress
identifying 9
X
XS
SPOT 177
Z
Zero sum kernels 86
Zoom
defined 229