GeoInformatics
Magazine for Surveying, Mapping & GIS Professionals
July/Aug. 2010 - Volume 13, Issue 5
Cover stories: Intergraph Acquired by Hexagon | Managing the Port of Rotterdam | RADARSAT-2 data | WeoGeo
"If I were young now and had to choose a field of study, I'd know what to do," someone told me two years ago. He would choose radar technology. He added that there are no universities teaching this field right now. Knowing he had suggested an interesting theme for this magazine to explore further, I'm very happy to announce the article on radar in this issue. This highly specialized market will surely grow in the coming years, and the possibilities are quite extraordinary, as becomes clear from the contribution in this issue.
The same can be said about the possibilities of cloud computing. Two articles on this topic can be found in this issue, written from two different angles to make things more interesting.
The first is an extensive and informative article on how ESRI is dealing with the cloud. It takes away a lot of the misconceptions that may exist about what cloud computing is and is not. The second contribution is about WeoGeo, a US-based company that counts as an interesting example of offering Software as a Service; its approach to providing services for data management and sharing is indeed interesting. Getting data out of the systems where they are locked in is surely needed. It is the same philosophy behind something like INSPIRE, but seen from a different perspective.
As you can read in the interview with WeoGeo's CEO Paul Bissett, he defends the geospatial profession by saying that data created by professionals has a price because it is created by professionals. And guess what happens if the data comes out and is recognized as software. Yes, this sounds bizarre, but it is the truth, as a recent ruling in the US proved (read more about it on James Fee's Spatially Adjusted blog).
Enjoy your reading!
Eric van Rees
evanrees@geoinformatics.com
July/August 2010
GeoInformatics is the leading publication for Geospatial
Professionals worldwide. Published in both hardcopy
and digital, GeoInformatics provides coverage, analysis
and commentary with respect to the international
surveying, mapping and GIS industry.
GeoInformatics is published 8 times a year.
Editor-in-chief
Eric van Rees
evanrees@geoinformatics.com
Copy Editor
Frank Arts
fartes@geoinformatics.com
Editors
Florian Fischer
ffischer@geoinformatics.com
Huibert-Jan Lekkerkerk
hlekkerkerk@geoinformatics.com
Remco Takken
rtakken@geoinformatics.com
Joc Triglav
jtriglav@geoinformatics.com
Financial Director
Yvonne Groenhof
ygroenhof@geoinformatics.com
Advertising
Ruud Groothuis
rgroothuis@geoinformatics.com
Subscriptions
GeoInformatics is available against a yearly
subscription rate (8 issues) of €89.00.
To subscribe, fill in and return the electronic reply
card on our website www.geoinformatics.com or
contact the subscription department at
services@geoinformatics.com
Website
www.geoinformatics.com
Graphic Design
Sander van der Kolk
svanderkolk@geoinformatics.com
ISSN 1387-0858
Copyright © 2010 GeoInformatics. No material may
be reproduced without written permission.
P.O. Box 231
8300 AE
Emmeloord
The Netherlands
Tel.: +31 (0) 527 619 000
Fax: +31 (0) 527 620 989
E-mail: mailbox@geoinformatics.com
Corporate
Member
Sustaining
Member
RADARSAT-2 data
Imagine a fully automated system used to produce accurate high
resolution radar ortho-images and -mosaics all over the world with excel-
lent reliability! Emergency management teams could then have access
to highly-accurate radar data as soon as it becomes available for their
time-sensitive needs. This was a difficult task in the past,
mainly due to the arduous process of collecting ground control points
(GCPs). It is now possible with the successful operation of the
RADARSAT-2 satellite and a new 3D hybrid satellite model that processes
this data without user-collected GCPs.
Content
July/August 2010
Articles
Intergraph to be Acquired by Hexagon AB 6
New Large-Format Airborne Digital Frame Cameras
The Intergraph DMC II Camera Range 8
Using Radar Coherent Change Detection
Monitoring Oil Production Facilities 14
Assessment of Historic Sites
In Post-Katrina New Orleans 18
Automated High Accuracy Geometric Correction and
Mosaicking without Ground Control Points
RADARSAT-2 Data 22
Easily share vast amounts of data internally
and externally 28
iTunes for Maps
WeoGeo 36
Building Open Source Software
Geomajas 38
Open Source Solutions
Norwegian Mapping Authority 40
Spatial Information Management
Managing the Port of Rotterdam 44
Column
The Climate Change Challenge 12
Interviews
An Interview with Ken Spratlin 32
Event
Are We There Yet?
The Location Business Summit 30
Page 22
WeoGeo
As an early adopter of the cloud, WeoGeo offers storing, sharing, buying and
selling of GIS data, maps and CAD files for users worldwide. The company
has been cited as a prime example of applying cloud computing in a
Software as a Service model. Paul Bissett, CEO & Co-Founder of WeoGeo,
explains the concept behind the company, how it works, and why
sharing geospatial data is a good thing.
Page 36
On the Cover:
During April 2010, FEMA and NPS collaborated on a field assessment
methodology in post-Katrina New Orleans to meet the requirements of
the National Historic Preservation Act (NHPA). More than 40,000
structures were assessed in a fraction of the time required by
traditional data collection methods. Trimble GPS hardware and
software integrated with ESRI ArcGIS were used for the collection and
management of cultural resource data. See the article on page 18.
Norwegian Mapping Authority
The Norwegian Mapping Authority (Statens Kartverk) is the central
organisation for the provision of mapping images to most public bodies
and organisations in Norway. After experiencing a vast increase in requests
for their services in 2006 and 2007, the Mapping Authority also had to
deal with an increasingly overstrained IT infrastructure. The Mapping
Authority chose to employ an IT infrastructure based on open source
software solutions, which were free of licensing costs and proved
to perform much better.
Managing the Port of Rotterdam
Directly situated on the North Sea and stretching forty kilometers in length,
the Port of Rotterdam, NL (PoR) is the largest seaport in Europe and one
of the busiest ports in the world. A 24/7 global gateway and massive
transshipment point, it serves to swiftly and efficiently distribute goods to
hundreds of millions of European consumers. The port's massive industrial
complex provides an intermediate destination for storage, cargo handling,
processing and distribution via various other forms of transport,
including road, rail, ship, river barge and pipeline.
Page 44
Calendar 50
Advertisers Index 50
Page 40
Intergraph to be Acquired
by Hexagon AB
Intergraph announced that it will be acquired by Hexagon AB for an enterprise value of $2.125 billion.
Hexagon is a leading global provider of precision measurement technology systems. Hexagon was founded in 1992,
is headquartered in Stockholm, Sweden and is publicly traded on the Nordic exchange with a
secondary listing on the Swiss Exchange.
By the editors
The transaction combines a leading global measurement hardware company with a leading global process engineering and geospatial software company to create a unique and differentiated technology business in the marketplace.
Intergraph provides expertise and leadership for Hexagon's growing software portfolio. Upon the closing of the transaction, Intergraph will become a wholly-owned subsidiary of Hexagon. We expect our business will continue to operate under the Intergraph name/branding and will become the core software growth platform for the Hexagon business.
What is the nature of Hexagon's business?
Hexagon is a leading global provider of precision measurement technology systems for objects in one, two or three dimensions. The measurement systems measure with great precision and rapidly provide access to large amounts of measurement data. For the customer, this means greater efficiency and productivity, improved quality and significant material and cost savings in the production process. From global mapping to precision measurements with nanometer accuracy, measurement technologies are used in application areas ranging from infrastructure and agriculture to raw material extraction, manufacturing industries and medical technologies.
How will Intergraph fit within the Hexagon portfolio?
Hexagon is a leading global provider of precision measurement technology systems with two primary core businesses (Geosystems and Metrology). The transaction combines a leading global measurement hardware company with a leading global process engineering and geospatial software company to create a unique and differentiated technology business in the marketplace. Hexagon operates through a number of strong brand portfolios that are well known within their respective industries. Each brand represents a strong tradition in its geographical region and/or industry, which is why Hexagon uses different brands for different customer groups or in different markets. Intergraph has brand awareness and brand equity around the world, and Hexagon plans to continue to invest in Intergraph as a marquee brand and unit.
Intergraph provides leading technical expertise and domain leadership for Hexagon's growing software portfolio. Upon the closing of the transaction, Intergraph will become a wholly-owned subsidiary of Hexagon. We believe our business will continue to operate under the Intergraph name/branding and will become the core software growth platform for the Hexagon business.

"We are very pleased that Hexagon has selected Intergraph to play a key role in their software expansion strategy," says R. Halsey Wise, Chairman, President, and CEO of Intergraph. "Hexagon's commitment to being number one in the market is very much in line with our existing goals. We believe Hexagon's significant global resources and technologies will allow further investments in our customers, software solutions, people and future."
Hexagon provides leading technology measurement systems that produce a tremendous amount of precise data (data inputs) in the form of digital sensors, etc. Intergraph's unique and differentiated software can act as the presentation layer to visualize this immense amount of critical and complex data to help create actionable intelligence for both organizations' customers.

When will the deal be finalized?
We anticipate that the transaction will close before the end of 2010. This timing is subject to certain regulatory approvals and satisfaction of other customary conditions to closing. Prior to closing, there will be no material changes to our daily operations/business. It will be business as usual.
Over the next several months, we will be working through certain regulatory reviews and other customary conditions to closing. Since Hexagon is headquartered outside of the United States, Hexagon plans to comply with US regulations and establish an independent subsidiary for Intergraph's federal and classified business, controlled by a U.S.-approved special proxy board of outside directors controlling all operations of the business. The appointed directors are required to be independent of Intergraph and Hexagon, with no prior affiliation to either party, and must be approved by the Defense Security Service (DSS). These directors are all well-known, very experienced people who have deep experience and relationships with the US Defense Department. We expect these relationships will be helpful as we grow our business in the years ahead.

For more information, have a look at www.intergraph.com
New Large-Format Airborne Digital Frame Cameras
The Intergraph DMC II
Camera Range
At the recent ASPRS Annual Conference held in San Diego at the end of April, Intergraph announced a major new
development with the introduction of three new large-format airborne digital cameras under its Z/I Imaging brand.
The incorporation of a single large-format monolithic pan imaging sensor in each of these new cameras represents
a major advance in digital airborne imaging technology.
By Gordon Petrie
Background
The DMC (Digital Mapping Camera) [Fig. 1(a)] was the
first large-format airborne digital frame
camera to appear on the market, having
been introduced in its original prototype
form at the ISPRS Congress held in
Amsterdam in 2000. The first produc-
tion versions of the DMC were delivered
in 2003. Since then, it has proven to be
a very successful product, with over 100
units having been sold world-wide since
its introduction. The basic design com-
prised four oblique-pointing medium-
format cameras [Fig. 1(b)] arranged in a
block configuration that produced
slightly overlapping pan images. The resulting photos were then
rectified and stitched together to pro-
duce a single near-vertical composite
image in a rectangular format that
could be delivered to users [Fig. 1(c)]. This final
composite black-and-white pan image gave
the required coverage of the ground from a
single exposure station in a large format size
13.5k x 8k = 108 Megapixels - as required
for photogrammetric mapping purposes.
The final DMC composite panchromatic images
could also be colourized to form colour images using the
image data from four additional small-format
(2k x 3k = 6 Megapixels) multi-spectral cam-
eras that formed part of the overall DMC cam-
era system [Fig. 1(d)]. These four additional
cameras were all pointing in parallel in the
near-vertical (nadir) direction and did not need
to be rectified in the manner of the larger for-
mat pan images. With their large format and
perspective geometry, the final composite pan
or colour photos could readily be utilized in
the existing digital photogrammetric worksta-
tions (DPWs) and software packages
such as Intergraphs own ImageStation
products that are designed for use with
any type of aerial frame photography.
In 2008, Intergraph introduced a new
airborne multi-spectral digital camera,
called the RMK D, which started to be
delivered to customers during the sec-
ond half of 2009. This unit [Fig. 2(a)]
comprises four individual medium-for-
mat nadir-pointing cameras that gen-
erate simultaneous images in the blue,
green, red and near infra-red (NIR)
parts of the spectrum respectively.
Each camera produces an image that
is 6k x 6.8k = 42 Megapixels in size
using a DALSA CCD array having a pixel
size of 7.2 µm [Fig. 2(b)]. The RMK D
camera also features electronic FMC
(forward motion compensation) and TDI (time
delay & integration) technologies. The result-
ing framing rate is one image per second. The
acquired images can either be utilized sepa-
rately as individual multi-spectral images or
they can be used in combination (merged) to
form full-resolution colour or false-colour
images. A further feature of the RMK D camera
is its use of an f = 45 mm lens for each of the
four channels. This provides the large
base:height ratio of 0.42 for good stereo-view-
ing and accurate measurement. The medium-
format RMK D camera costs approximately 50%
of the price of the larger-format DMC camera.
Thus it is intended for use by those mapping
companies and government agencies that have
not yet adopted airborne digital imaging tech-
nology because of the very high level of invest-
ment that is required to purchase a large-for-
mat airborne digital imager. The RMK D
camera's multi-spectral capabilities are also
Figure 1 (a) This DMC large-format digital camera is being operated
on a T-AS gyro-controlled mount in conjunction with a Z/I InFlight FMS
(flight management system), which is located to the left of the camera
and its mount. A Solid State Disk (SSD) is shown being inserted into the
front of the camera to record and store the exposed images.
Figure 1 (b) Showing the four oblique pointing cam-
eras built by Carl Zeiss that are used to acquire the
four overlapping medium-format pan images of the
DMC camera simultaneously in a single synchronized
exposure. Each set of four images is rectified and
stitched together post-flight to form the final DMC
large-format panchromatic image.
attractive to those agencies that are concerned with the imaging and
mapping of limited areas for forestry and agricultural applications or
for environmental monitoring and disaster response.
The new range of DMC II cameras that have just been introduced by
Intergraph combine many of the features of the previous DMC and RMK
D series, but they now offer much larger formats which eliminate the
need for the rectification and stitching of the panchromatic images dur-
ing their initial post-flight processing.
Imaging Sensors
The CCD imaging sensors that have been utilized in both the older and
the new series of Intergraph DMC and RMK cameras have all been sup-
plied by the Canadian company DALSA, which has its headquarters in
Bromont, Quebec. Its subsidiary, DALSA Semiconductor, is located in
Waterloo, Ontario, while its main Image Sensor Solutions facility and
offices are located within the High Tech Campus in Eindhoven in the
Netherlands. The DALSA company has been a pioneer in the develop-
ment of large-format imaging sensors. In 2006, it produced the first
imaging sensor with a format of over 100 Megapixels. This CCD array
[Fig. 3] was developed for an
astronomical application on
behalf of the Astrometry
Department of the U.S. Naval
Observatory (USNO) and had
a format size of 10.5k x 10.5k pixels = 111 Megapixels, with each pixel
being 9 µm in size over an active area of 4 x 4 inches (10 x 10 cm).
The new family of CCD imaging sensors that have been developed by
DALSA on an exclusive basis for Intergraph have still larger formats and
smaller pixel sizes. They also exhibit a number of quite different char-
acteristics such as fast framing rates and forward motion compensation
(FMC) that are designed to meet the specific requirements of airborne
imaging rather than astronomical applications.
The new DALSA CCD imaging sensors are still larger in size in terms of
the number of pixels that they feature. In the case of the pan sensors
that are being fitted to the new DMC II140 cameras, their format size is
11.2k x 12k = 140 Megapixels, with each pixel being 7.2 µm in size.
The physical size of the new sensors is 3.5 x 3.2 inches (8.8 x 8.2 cm).
Their customized packaging [Fig. 4] is designed specifically for use in
the aerial imaging role in that they are hermetically sealed with a spe-
cial cover glass to ensure that their geometric accuracy is maintained
irrespective of the environmental conditions under which they are being
used. Special holders mounted within the housing ensure the long-term
thermal and mechanical stability of the sensor.
The DALSA CCD imaging
sensors that will be used in
the new DMC II230 and DMC II250
cameras will feature a
still smaller pixel size (of 5.6
µm) and a still greater num-
ber of pixels in the area
array. In the case of the
DMC II230 model, the arrays
will have 15k x 14.4k pixels
= 230 Megapixels; while, in
Figure 1 (c) Showing the coverage
and overlaps of the four medium-
format pan images that are
acquired by the DMC camera (in
red and yellow) and the ground
coverage of the rectified and
stitched final large-format
near-vertical image (in blue).
Figure 1 (d) The
original DMC camera
as seen from below,
showing its eight
lenses four used for
its large-format pan
imaging channel
and four for its
small-format multi-
spectral imaging
channels.
Figure 2 - (a) The Intergraph RMK
D medium-format airborne
digital camera showing the
handles of two of the solid-state
disk (SSD) units on its left side
and one of its carrying handles
on its right side.
(b) The DALSA FT53 CCD frame-type image sensor that
is used to record the 6k x 6.8k (= 42 Megapixels) images
on each of the cameras four multi-spectral channels.
Figure 3 The first 100+ Megapixel CCD imaging
sensor that was built by DALSA SemiConductor in
2006 for use in an astronomical application by the
U.S. Naval Observatory (USNO).
the case of the DMC II250 model, the number of pixels will be 17.2k x
14.7k = 250 Megapixels. DALSA claims that the imaging arrays exhibit
a high sensitivity and a high dynamic range (of around 70 dB), which
allows them to capture detail in shadow areas - while, at the same
time, they possess anti-blooming characteristics that enable them to
deal with bright highlight objects and areas.
DMC II140 Camera
The DMC II140 camera is derived directly from the previous RMK D model,
the main change being the addition of the new large-format pan camera
to the existing four channel medium-format multi-spectral arrange-
ment of the RMK D [Fig. 5(a)]. Indeed it is possible for existing exam-
ples of the RMK-D camera to be upgraded to the DMC II140 specification
[Fig. 5(b)]. For this upgrade, the DMC II140 model utilizes a new single
lens for the additional panchromatic channel that has been designed
and built by Carl Zeiss specifically for photogrammetric applications
and is exclusive to Intergraph for use in the DMC II cameras. The lens
has been designed by Zeiss to produce a very high level of image qual-
ity and temperature stability. The focal length (f ) of the new lens is
92mm which, in combination with the larger size of the CCD area array,
gives an angular coverage of the terrain that approaches 50 degrees
and provides a base:height ratio of 0.35. The new panchromatic cam-
era lens has an infra-red cut-off filter that is designed to block radia-
tion beyond 710 nm wavelength. Each camera head uses a piezo-elec-
tric driven shutter that ensures the maximum degree of synchronization
of the five camera heads during the simultaneous exposure of their
images over the terrain.
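As a back-of-the-envelope check on these figures: the angular coverage follows from the sensor format and focal length, and the base:height ratio from the along-track format and forward overlap. The short Python sketch below reproduces the quoted values of roughly 50 degrees and 0.35, assuming a conventional 60% forward overlap and the 11.2k-pixel side of the CCD oriented along-track; neither assumption is stated in the article.

# Hedged sketch: checking the DMC II140 geometry quoted in the text.
# Assumptions (not from the article): 60% forward overlap and the
# 11.2k-pixel side of the CCD oriented along-track.
import math

pixel_um = 7.2        # pan pixel size, micrometres
npix_along = 11200    # pixels along-track (assumed orientation)
npix_across = 12000   # pixels across-track
f_mm = 92.0           # focal length of the pan lens, mm

s_along = npix_along * pixel_um / 1000.0    # sensor size along-track, mm
s_across = npix_across * pixel_um / 1000.0  # sensor size across-track, mm

# Across-track field of view: 2 * atan((format / 2) / f)
fov_deg = 2 * math.degrees(math.atan(s_across / (2 * f_mm)))

# Base:height ratio at 60% forward overlap: B/H = (1 - overlap) * s_along / f
bh = (1 - 0.60) * s_along / f_mm

print(f"across-track FOV ~ {fov_deg:.1f} deg")  # ~50.3 deg, as quoted
print(f"base:height     ~ {bh:.2f}")            # ~0.35, as quoted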
The DMC II140 camera also features the solid-state disk (SSD) image
storage technology that has been used in the existing DMC and RMK
D cameras [Fig. 6]. This provides an on-board storage capacity of 1.5
Terabytes, allowing 2,000 separate images to be stored in-flight. The
post-flight image processing is carried out using the basic software
that has already been developed for the processing of the existing
DMC and RMK D digital image data and has been upgraded to accom-
modate the new camera models.
The new DMC II models have also been designed to be compatible
with all the peripheral equipment from Intergraph that is being utilized
with the existing RMK TOP (film), DMC and RMK D (digital) airborne
cameras. These include the Z/I Mission planning software; the Z/I
InFlight flight management system; the Readout Station; and the T-AS
and Z/I gyro-stabilized camera mounts. The wide range of GNSS/IMU
systems from third-party suppliers such as Applanix and IGI - which
are used for the measurement of the camera position and orientation
during flight operations - can all be employed with the DMC II
cameras.
Figure 4 The new DALSA 140 Megapixel CCD
imaging sensor as packaged for use in the Intergraph
DMC II140 large-format airborne digital camera.
Figure 5 (a) The four
lens cones of the
Intergraph RMK D medi-
um-format camera
surround the lens of a
small-format video cam-
era at the centre of the
supporting face plate.
These four lens cones
with their respective Red,
Green, Blue (RGB) & NIR
filters generate the indi-
vidual images that pro-
vide the multi-spectral
capability of the RMK D
camera. The vacant space
at the foot of the face plate will be occupied by the additional lens cone of the
large-format panchromatic channel if the camera is to be upgraded to the DMC
II140 standard.
Figure 5 - (b) The five
lens cones of an upgrad-
ed Intergraph RMK D
camera (to the DMC II
specification) surround
that of the video camera
at the central position.
At the left side is the
lens of the additional
large-format imaging
panchromatic channel;
the remaining four lenses are those required to generate the four
multi-spectral images as before.
Figure 6 A DMC II140 camera
with its two solid state disk
(SSD) storage devices placed in
front of it.
Figure 5 - (c) A CAD draw-
ing showing the relation-
ship of the single large-
format panchromatic
lens cone (at top); the
four medium-format
multi-spectral lens cones;
and the video camera (at
the centre) that are
utilized in the DMC II
cameras.
DMC II230 and DMC II250 Cameras
As noted above, the new DMC II230 and DMC II250 cameras [Fig. 7] fea-
ture the still smaller pixel size of 5.6 µm and a substantially larger num-
ber of pixels in their CCD frame imaging arrays to generate pan images
of 230 and 250 Megapixels respectively. In the case of the DMC II250
model, it also features a new longer focal length lens with f =112 mm
instead of the f = 92 mm lens that is used in the other two models.
This produces images having an improved ground resolution (GSD
value) from a given flying height, while the base: height ratio with the
images acquired by the DMC II250 model (with this longer focal length
lens) is reduced slightly to 0.29.
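The figure quoted here follows from the standard ground sample distance relation, GSD = flying height x pixel size / focal length. A minimal Python sketch using the DMC II250 numbers from the text and an assumed, purely illustrative flying height of 1,000 m:

# Hedged sketch: GSD = H * pixel_size / focal_length.
# The 1000 m flying height is an illustrative assumption, not a figure
# from the article.
pixel_m = 5.6e-6   # DMC II250 pixel size, metres
f_m = 0.112        # DMC II250 focal length, metres
H = 1000.0         # assumed flying height above ground, metres

gsd = H * pixel_m / f_m
print(f"GSD at {H:.0f} m flying height: ~{gsd * 100:.0f} cm per pixel")  # ~5 cm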
The detailed performance characteristics and parameters of each of the
three new camera models are summarized in Table I given below.
Initial Customers
In parallel with its announcement of the new camera models, Intergraph
also released details of four companies that have already ordered DMC
II cameras for their airborne imaging operations. In the case of the first customer, whose Web site is www.bjgdjw.com/EngLish/News/gsjj.asp, the company has purchased two of the new cameras for use in its aerial mapping operations. Another order for a DMC II camera from the Far East has been placed by a company based in Nagano, Japan, whose Web site is www.kyo-soku.co.jp/index.php. A third order has come from Midwest Aerial Photo, which is based in Galloway, Ohio in the United States, located just to the west of the city of Columbus. The company, whose Web site is www.midwestaerialphoto.com, is already an operator of Z/I RMK TOP film cameras and plans to utilize its new DMC II camera to acquire data for use in the USDA's National Resources Inventory (NRI) programme and other government and commercial imaging and mapping projects. Finally, the German company Geoplana, which is based in Marbach, near Stuttgart, has also acquired a DMC II camera [Fig. 8]. The company's Web site - www.geoplana.de - gives details of its photogrammetric, GIS and cartographic activities.
Summary & Conclusion
Undoubtedly the introduction of the new single-chip large-for-
mat imaging sensor to generate panchromatic images in the
new Intergraph DMC II cameras represents a major advance
in the design of airborne digital frame cameras. At a stroke,
the new sensor allows a much simplified system design and
it removes the previous requirement with the original DMC
camera to provide multiple lenses, synchronized shutters and
CCD arrays in order to generate the large-format panchromat-
ic images. Besides which, the introduction of the new imaging
sensor removes the necessity to calibrate each panchro-
matic channel individually and collectively in favour of a much
simplified single calibration procedure. Furthermore it also
removes the need to carry out a preliminary rectification and
stitching of multiple medium-format pan images that was a
feature of the original DMC design. Finally with its modular
design and construction, the new DMC II camera offers a fair-
ly simple path for users to upgrade their system through the purchase
of an alternative lens or a denser CCD array to generate pan images.
Gordon Petrie is Emeritus Professor of Topographic Science in the Dept. of
Geographical & Earth Sciences of the University of Glasgow, Scotland, U.K. E-
mail - Gordon.Petrie@ges.gla.ac.uk; Web Site - web2.ges.gla.ac.uk/~gpetrie
Figure 7 (a) An Intergraph DMC II camera placed on a gyro-stabilized Z/I
Mount that can be operated either in stand-alone mode or controlled using
signals from an external IMU.
(b) A complete DMC II camera with its five imaging lenses in the foreground
and the two handles of its SSD storage units protruding from the upper (top)
part of the camera.
Table 1. Performance characteristics and parameters of the three new DMC II camera models (table not reproduced here).
Figure 8 A multi-spectral image of the Mercedes Benz Arena football stadium
in Stuttgart, Germany that has been acquired by a DMC II camera being operat-
ed by the Geoplana company.
"Surveyors are the custodians of an enabling technology that is critically important to our future.
Surveyors should take a leading role, not only in monitoring climate change,
but in explaining it to the broader public."
I welcome and agree with this appeal by Tim Flannery, global expert
on climate change, made in his keynote address at the recent FIG
Congress in Sydney, 11-16 April 2010.
Surveyors are experts in measuring and mapping systems for monitoring
environmental change. They should use this expertise to explain
the purpose of and need for monitoring even minor climate-related changes,
and thereby take a leading role in explaining to the wider public what
climate change is all about.
Surveyors are also experts in land administration and management - they
are Land Professionals. So, next to explaining climate change, surveyors
should also take a leading role in addressing the climate change challenge
in the wider context of sustainable land governance.
The key challenges of the new millennium have already been clearly identified. They
relate to climate change; food shortage; urban growth; environmental
degradation; and natural disasters. Importantly, these issues all relate to
governance and management of land.
The challenges of food shortage, environmental degradation and natural
disasters are to a large extent caused by the overarching challenge of
climate change, while the rapid urbanisation is a general trend that in
itself has a significant impact on climate change. Measures for adaptation
to climate change must be integrated into strategies for poverty reduction,
to ensure sustainable development and to meet the Millennium
Development Goals (MDGs).
Adaptation to climate change can be achieved to a large extent through
building sustainable and spatially enabled land administration systems.
This should enable control of access to land and of land use. The systems
should identify all areas prone to sea-level rise, drought, flooding,
fires, etc., as well as measures and regulations to prevent the impact of
predicted climate change.
Key policy issues to be addressed should relate to protecting the citizens
by avoiding concentration of population in vulnerable areas and improv-
ing resilience of existing ecosystems to cope with the impact of future
climate change. Measures such as building codes may be essential in some
areas to avoid damage, e.g. in relation to flooding and earthquakes. Issues
may also relate to plans for the replacement of existing settlements as an
answer to climate change impacts.
Urbanisation is another major change that is taking place globally. The
urban global tipping point was reached in 2007, when over half of the
world's population - around 3.3 billion people - was living in urban areas.
Urbanisation is also having a very significant impact on climate change.
Cities are where climate change measures will either succeed or fail. Rapid
urbanisation is setting the greatest test for Land Professionals in the
application of land governance to support and achieve the MDGs.
The linkage between climate change adaptation and sustainable develop-
ment should be self-evident, but it is not well understood by the general
public. My key message therefore is that Land Professionals should take
a leading role in explaining this linkage to the wider public. This should
also ensure that the land management perspective attracts high-level
political support and recognition.
Column
Prof. Stig Enemark (enemark@land.aau.dk) is President of FIG
and Professor in Land Management at Aalborg University, Denmark
Using Radar Coherent Change Detection
Monitoring Oil Production
Facilities
The use of radar imagery is transitioning from research to operational. An example of this is a pilot project for monitoring
oil production areas in the Middle East, using Radar Coherent Change Detection. Its technical basis and background are
discussed here. The underlying phenomenology and technology of this particular project are applicable to a wide range
of infrastructure or vehicle traffic monitoring scenarios.
By Derrold W. Holcomb
As evidenced by the number and variety of radar satellites recently
launched or currently planned, the use of radar imagery is transitioning
from research to operational. One field of particular interest and appli-
cability is monitoring human activity and
infrastructure for Security and Surveillance.
In this realm, radar image processing offers
some unique and powerful capabilities.
Radar imagery is particularly suited to
infrastructure monitoring for several rea-
sons. For one, the strength of the return
radar signal is greatly determined by the
dielectric constant of the target material;
and steel gives a very strong return. So
strong, that even a small percentage of
steel pixel-fill can result in a bright pixel.
For example, an analyst can frequently
locate rail tracks or pipelines a few tens
of centimeters wide in moderate reso-
lution (10-30 meter) radar imagery. This is
a clear example of the resolution vs. detec-
tion dichotomy that is so important in
understanding radar imagery.
Secondly, the strength of the radar return is greatly impacted by the
geometry of the target-signal interaction. These interactions are charac-
terized by models such as single-bounce, double-bounce or corner
reflector. Needless to say, human constructions are rich in structure
offering these geometries. This is readily observed by looking at a radar
image containing a city. The radar signal will bounce off a paved sur-
face, then off a building wall and back to the sensor; a classic double-
bounce yielding a strong return signal.
A third, less appreciated, factor is that the interaction of electromag-
netic waves with target materials tends to be most sensitive to materi-
als on the scale of the observation wavelength. Radar waves, ranging
from 3cm (X-band) to 70 cm (P-Band), are near the scale of objects
humans make.
All of the above considerations determine the strength, or Magnitude,
of the return radar signal, analogous to the intensity of a visible image.
But the basic radar return contains a second component not available
with conventional EO imaging. This is the Phase of the return signal as
it reaches the receive antenna. As an active sensor, the radar emits a coherent, in-phase pulse. But each discrete wave travels a different path length on its roundtrip journey and arrives back at the sensor at a slightly different time and position on the wave. We cannot determine the exact number of wavelengths in the journey, but we can record the final fraction of a wavelength that completes the roundtrip. The amount of information that can be extracted from this fractional-wavelength component is extraordinary.

Figure 1. By controlling advanced radar image processing with a user-friendly software interface, IMAGINE SAR Interferometry makes sophisticated analysis available to the non-expert.

Figure 2. (a) The CCD output shapefile of a full 110 sq km InSAR-pair. An analyst would quickly identify the series of lines in the NW quadrant as suggestive of human activity. (b) A quick zoom-in leaves no doubt that this is a man-made feature.
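In symbols, the measured quantity is the two-way path length modulo one wavelength: phi = (4*pi*R/lambda) mod 2*pi. A minimal numeric sketch of the idea in Python, with an assumed C-band wavelength and slant range (both illustrative values, not figures from any project described here):

# Hedged sketch: the radar measures only the fractional part of the
# two-way path expressed in wavelengths. Values are illustrative.
import math

wavelength = 0.056        # C-band wavelength, metres (e.g. RADARSAT-2)
slant_range = 850000.0    # assumed sensor-to-target distance, metres

total_cycles = 2 * slant_range / wavelength  # roundtrip length in wavelengths
phase = (4 * math.pi * slant_range / wavelength) % (2 * math.pi)

print(f"wavelengths in roundtrip: ~{total_cycles:.3e} (integer part unknowable)")
print(f"measured fractional phase: {phase:.3f} rad")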
The amount, complexity and sophistication of the image processing required
to extract useful information from this fraction-of-a-wavelength signal
are significant. Developing the phenomenological understanding of
the radar return and then designing algorithms to create useful prod-
ucts has taken decades, utilizing the dramatic increase in computing
power. To create radar image processing tools that produce the desired
information products, and shield the analyst from phenomenological
theory, ERDAS has partnered with the Remote Sensing Technology
Institute (RSTI) of the German Space Agency (DLR). The advanced pro-
cessing algorithms of the RSTI team were incorporated into ERDAS's user-
friendly, operational software paradigm to create IMAGINE SAR
Interferometry. This suite includes DEM creation (InSAR), Coherent
Change Detection (CCD) and Surface Displacement (Subsidence)
Mapping (D-InSAR). One of these components, CCD will be highlighted
below. Figure 1 shows the intuitive Wizard Workflow interface that con-
trols this advanced image processing functionality.
Requirements and Background
Below is an outline of a pilot project for monitoring oil production areas
in the Middle East. We sought to develop estimates of medium-term
production increases or decreases and long-term stability of the fields.
To develop estimates of production changes, we wanted routine esti-
mates of human activity in the oil production areas. This needed to be
done remotely, without in-country support or ancillary information. To
do this, we found that Coherence Change Detection (CCD) could be
used to map vehicle traffic on the sand roads in the area. As this tech-
nology is sensitive to change at the cm level (i.e., at the sub-wave-
length level) and the maintenance vehicle tires were displacing the sand
several centimeters, we found that we could detect, map and quantify
vehicular activity in the oil production fields. Also, as the infrastructure
is metal, radar assisted in mapping the pipelines and correlating vehi-
cle traffic with specific infrastructure.
The long-term stability of the fields can be monitored by mapping sub-
sidence. If the oil-bearing layers are allowed to collapse, oil production
will decrease. At the extreme, the oil field can be permanently dam-
aged. Such centimeter-scale subsidence can be mapped using
Differential Interferometry (D-InSAR).
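For reference, the standard D-InSAR relation converts the differential phase into line-of-sight displacement as d = lambda * delta_phi / (4*pi). A hedged numeric sketch in Python, with an assumed wavelength and phase value (illustrative, not figures from this project):

# Hedged sketch: line-of-sight displacement from differential phase.
import math

wavelength_cm = 5.6      # C-band wavelength (e.g. RADARSAT-2)
delta_phi = math.pi / 2  # assumed differential phase after flattening

d_cm = wavelength_cm * delta_phi / (4 * math.pi)
print(f"line-of-sight displacement: ~{d_cm:.2f} cm")  # 0.70 cm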
Technical Basis
Radar interferometry requires that the analyst have an InSAR-pair of
images. The collection requirements of this image-pair are quite strin-
gent. In general, the two scenes must be from the same sensor and
processed identically. They must be in complex format, that is, have
both magnitude and phase layers. Critically, the orbital location of the
two image collects must be separated by a few tens of meters, and
this value must be known to meter-level precision or better. Modern GPS
and sensor control capabilities can, amazingly, achieve this level of
accuracy.
For CCD, the two images are then processed to produce a coherence
image. Functionally, coherence is a moving window estimate of phase
similarity between the two images. For example, if there was no change
in a particular pixel, the roundtrip distance for the radar wave would be the same in both images and the phase difference between the two images would be 0. This would be perfect coherence, mathematically assigned a value of 1. Path length differences of some fraction of a wavelength would result in lower coherence. In practice, sensor noise and atmospheric variation limit the accuracy of this phase difference measurement to roughly 1/6 of a wavelength under ideal conditions.

Figure 3. Detailed look at co-registered Magnitude (L) and Coherence (R) images. A system of oil pipelines is mapped by the radar magnitude. The loss of coherence alongside the pipelines suggests vehicle traffic in the time period between the two images.
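A common way to realize such a moving-window estimate is the normalized complex cross-correlation of the two co-registered images. The Python sketch below implements that textbook estimator; it is a generic illustration, not the algorithm shipped in IMAGINE SAR Interferometry, and the 5 x 5 window is an arbitrary choice.

# Hedged sketch of a moving-window coherence estimate:
# gamma = |<s1 * conj(s2)>| / sqrt(<|s1|^2> <|s2|^2>), with <.> a local mean.
import numpy as np
from scipy.ndimage import uniform_filter

def coherence(s1, s2, win=5):
    """Estimate coherence of two co-registered complex (SLC) images."""
    def local_mean(a):
        # uniform_filter expects real input; average real/imag parts separately
        return uniform_filter(a.real, win) + 1j * uniform_filter(a.imag, win)
    num = np.abs(local_mean(s1 * np.conj(s2)))
    den = np.sqrt(uniform_filter(np.abs(s1) ** 2, win) *
                  uniform_filter(np.abs(s2) ** 2, win))
    return num / np.maximum(den, 1e-12)  # 1 = unchanged, 0 = fully decorrelated

# Toy check: an image compared with itself yields coherence ~1 everywhere.
rng = np.random.default_rng(0)
s = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))
print(coherence(s, s).mean())  # ~1.0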
In IMAGINE SAR Interferometry, the
coherence image can then be
filtered to suppress noise, con-
verted to a change detection
layer and geocoded. This
geocoded product can then be
refined via a sequence of GIS
operations, reduced to a
shapefile of detected changes and an associated listing of the attributes
of each change feature. In addition, a magnitude change detection layer
is simultaneously computed to produce a multi-color change map. These
products are then available to the Analyst for interpretation and evalu-
ation. The results of such a change detection processing regimen are
seen below.
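As a rough illustration of that filter, threshold and vectorize chain (a generic Python sketch, not IMAGINE's actual processing; the 3x3 median filter and 0.5 threshold are assumptions):

# Hedged sketch: coherence image -> filtered -> thresholded change layer ->
# labeled regions ready to be written out as shapefile features.
import numpy as np
from scipy.ndimage import median_filter, label

def detect_changes(coh, threshold=0.5):
    """Return labeled change regions from a coherence image."""
    smoothed = median_filter(coh, size=3)  # suppress speckle-like noise
    change_mask = smoothed < threshold     # low coherence = change
    labeled, n_features = label(change_mask)
    return labeled, n_features             # each label -> one change feature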
Project
In Figure 2, an example product shapefile is shown. An analyst's atten-
tion is quickly drawn to the series of lines in the NW quadrant. The
two scenes analyzed here were taken 174 days apart. We can conclude
that between the two dates there was significant activity at this loca-
tion, causing a loss of inter-scene coherence.
In Figure 2, there are also changes detected that are probably not man-
made. Because this technique is sensitive to centimeter-level change,
natural phenomena are also detected, particularly in the six month
time-span monitored here. Wind, rain and vegetation changes are all
contributing to the features detected by the software.
For routine monitoring, once the project is in place, this may be suffi-
cient information to develop a timeline and estimate of human activity
at this facility. A more detailed understanding of the feature in Figure
2b can be developed by looking at the image-maps created as inter-
mediate layers during the processing sequence. Close-ups of the
Average Magnitude and (Phase) Coherence are seen in Figure 3.
As discussed earlier, steel pipes will give a strong radar return signal
(magnitude). Thus, it is suggested that the left image in Figure 3 is
mapping the oil transmission network. The interpretation is that a set
of parallel pipes is carrying oil from the wellheads, off the bottom of
the image, to a collection tank-farm near the center-right of the image.
From there, a single pipe carries the oil to the NW where it intersects
the E-W road. The pipeline then turns W and parallels the road.
A little-appreciated phenomenon that also contributes to the opera-
tional monitoring discussed here is the sub-surface imaging capability
of radar sensors. Depending on a variety of factors including the radar
wavelength, soil material, moisture, etc., it is possible to detect strong
reflectors buried by a couple of meters of dry sand. Thus, the infras-
tructure seen in Figure 3a could
be underground. This is not
uncommon as shallow burial is
safer for oil pipes.
The right image in Figure 3
shows a loss of coherence
(dark pixels) mimicking the
pipelines mapped by the mag-
nitude image. This suggests
human activity, presumably
vehicles, alongside the pipe
network. The area of the
proposed tank farm shows sig-
nificant activity (loss of coher-
ence), which would be reason-
able if the infrastructure is
being maintained or upgraded.
In Figure 4, these intermediate
images are combined into a
color composite that allows an
analyst to interpret the sce-
nario based on this under-
standing of the various layers.
Note that while the area
around the tank farm shows
low coherence (activity), the tanks show high coherence and high mag-
nitude. This is consistent with steel tanks that have not been altered
between the two image acquisition dates. There are indications of a
similar metal structure in the center of the pipeline system.
Conclusion and Outlook
As analysis of this InSAR-pair has shown, this technique can be used
to monitor and map activity and infrastructure at remote oilfields.
Operationally, a regional map of existing oil production infrastructure
would be developed. The level of activity, or lack thereof, at all sites
could be routinely determined. Note that the later image of one InSAR-
pair would become the earlier image in the next InSAR-pair of that
scene. In addition, installation of new infrastructure would be mapped
to allow an understanding of site development. Obviously, this infor-
mation would be one set of the contributions required to understand
the evolution of the field. Local subsidence, from D-InSAR, has been
mentioned as another potential input. In addition, the knowledge of
the Analyst would be required to interpret the whole picture.
While this brief discussion has focused on oil production in an arid
region, the underlying phenomenology and technology is applicable to
a wide range of infrastructure or vehicle traffic monitoring scenarios.
Expect to see this application blossom over the next few years.
A part of the Engineering team at ERDAS for over two decades, Derrold
Holcomb is currently the Technical Director for Business Development
at ERDAS Inc. Since joining ERDAS in 1991, he initiated the development
of radar processing software and has been instrumental in
developing the hyperspectral software capabilities of ERDAS IMAGINE. A
current focus is defining and developing operation applications for radar
imagery. Mr. Holcomb has degrees in Chemistry (B.S.) and Geophysical
Sciences (M.S.) from Georgia Institute of Technology.
For more information, have a look at www.erdas.com.
Figure 4. Color composite showing the oil pipeline system, tank farm and roads that have
been used to maintain this infrastructure. This entire scenario is deduced only from the
imagery and an understanding of the radar imaging phenomenology.
Assessment of Historic Sites in Post-Katrina New Orleans
During April 2010, FEMA and NPS collaborated on a field assessment methodology in post-Katrina New Orleans to meet
the requirements of the National Historic Preservation Act (NHPA). More than 40,000 structures were assessed in a
fraction of the time required by traditional data collection methods. Trimble GPS hardware and software integrated
with ESRI ArcGIS were used for the collection and management of cultural resource data.
By Felicity Boag
Introduction
The people and culture of New Orleans, Louisiana have made the city
unique among, and distinct from, other cities in the United States.
Approximately one-fifth of New Orleans' urban area is in a historic
district listed on the National Register of Historic Places.
ed on the National Register of Historic Places.
In 2005, Hurricane Katrina damaged tens of
thousands of historic homes throughout the
city, resulting in the single largest disaster for
cultural resources in the United States since
the National Historic Preservation Act of 1966
(NHPA) was enacted.
In response to the enormity of the disaster
and the need to resolve immediate threats to
human health and safety, one of the many
programs the Federal Emergency Management
Agency (FEMA) funded was the removal of
damaged homes in support of the City of New
Orleans. However, FEMA's Historic Preservation
department immediately recognized that
this effort could have potentially affected
many historic properties, and under the NHPA
FEMA was required to consider the effects of the
removal on historic resources, despite the
potential health and safety issues. As a result,
FEMA faced the difficult challenge of assisting
in rebuilding New Orleans as quickly as pos-
sible while fulfilling its obligations to consid-
er the effects of its projects on the country's
historic resources.
Meeting NHPA Requirements
The NHPA established a national historic
preservation program and is the major law
defining historic preservation policy, establish-
ing State/Tribal Historic Preservation offices
and determining the independent roles of all
parties involved in historic preservation
efforts. Section 106 of the Act stipulates that
a federal agency must consider the effects of
projects on historic properties when taxpayer
dollars are spent on activities such as build-
ing a new highway or rebuilding a neighbor-
hood following a disaster. It mandates a
review process ensuring that the federal agen-
cy, in consultation with applicable state, trib-
al, and local parties, is aware of any adverse
effects on historic resources and mitigates
against such effects. Meeting Section 106
requirements entails identifying and review-
ing all cultural resources eligible for inclusion
on the National Register of Historic Places,
which is often an extremely time-consuming
process.
"A formal survey or cultural resource assessment is a requirement in so many construction projects because Section 106 applies to any place, site or structure, whether it is actually listed on the Register or simply qualifies for listing," said Deidre McCarthy, Historian for the National Park Service (NPS) Heritage Documentation Programs, Cultural Resource Geographic Information System (GIS) Facility.
The site assessments are often called Section
106 surveys. A city rich in history and culture,
New Orleans had thousands of houses, mon-
uments, and neighborhoods listed on or eli-
gible for the National Register of Historic
Places. In the aftermath of Hurricane Katrina,
there simply was not time for standard
Section 106 surveys. FEMA had to develop a
methodology for assessing all of the damaged
resources quickly in order to improve the safe-
ty of New Orleans citizens.
Developing an Assessment
Methodology
To accelerate the Section 106 assessment pro-
cess in support of their disaster response,
FEMA turned to the NPS Cultural Resource GIS
Facility for help in taking advantage of the
capabilities of GIS and Global Positioning
Systems (GPS) to speed up the Section 106
process. The NPS was tasked with designing
a digital Section 106 process, and they took
the opportunity to field test their draft cultur-
al resource data transfer standards in storing,
managing, and sharing GIS data for the New
Orleans project.
The NPS had developed the cultural resource
data transfer standards to help facilitate data
sharing between organizations. The NPS-
designed methodology assisted FEMA in tak-
ing advantage of GPS for data collection to
speed the survey and evaluation process. In
addition, it allowed FEMA to use GIS to
consolidate and share data and speed the
concurrence process between FEMA and
State/Tri bal Historic Preservation offices.
With time considered a scarce commodity,
post-Katrina New Orleans was an ideal test
bed for applying the data transfer standards.
Working together, the NPS and FEMA devel-
oped a methodology that implemented the
draft standards for the first time in New
Orleans. The result was the successful utiliza-
tion of GPS and GIS technologies to assess
more than 40,000 structures in a fraction of
the time that might have been required had
traditional data collection methods been
used.
McCarthy explained that the need for devel-
oping data transfer standards for cultural
resource assessments arose from the desire
to utilize geospatial technology to a greater
extent as part of the overall preservation pro-
cess. "GIS holds the key to integrating our cultural resource data sources and allowing cultural resource managers to explore new approaches to using the data, resulting in better resource protection," she said.
The methodology developed by NPS focused on gathering locational data to establish a baseline of inventory information, and ultimately to utilize that data within a GIS to speed the assessment and concurrence processes involved in Section 106 compliance. The NPS methodology called for each resource that might be eligible to receive FEMA funding (and thus part of the FEMA Section 106 requirements) to be mapped as a point, line, or polygon. This form of data collection enabled various expert historians to record features and attribute data in a GIS with geospatial location as the common element, making it possible to share the data among many different systems.
"These are data transfer standards that relate to documenting through feature-level metadata how the information was collected and what is known about it," said McCarthy. "When that cultural data is shared, [the user] knows exactly what they can do with it, whether it can be used in a legal context or whether more information is needed."
Preparing for Field Data Collection
While the City of New Orleans created a list of condemned structures that had to be demolished to remove safety hazards, the FEMA Historic Preservation group met with representatives of the NPS to develop a systematic methodology for conducting the required assessments.
At the time Katrina struck, New Orleans had a significant database of local surveys, but there was little detail on individual buildings, and most of the data were paper based. Moreover, the neighborhoods hardest hit by Katrina/Rita tended to be those with the least existing documentation. "[Historic preservation] data tended to be fragmented, paper-based and not in a format that was easily shareable," said Gail Lazaras, FEMA Historic Preservation Specialist.
An important step in developing the methodology was creating the data dictionary that would be used in the field data collection devices as well as serve as the basis of the structure of the GIS repository. The NPS and FEMA, in consultation with the Louisiana State Historic Preservation Office (SHPO), which plays a key role in administering the national historic preservation program at the state level, developed the data dictionary to reflect the standard paper survey forms already in use by the Louisiana SHPO.
The NPS assisted the local New Orleans FEMA office in obtaining 20 Trimble GeoExplorer 2005 series GeoXM GPS handhelds with built-in GIS data collection capabilities. The data dictionary was created in the Trimble GPS Pathfinder Office software on a desktop computer, and then downloaded onto the Trimble GeoXM handhelds to serve as the form-driven application within the Trimble TerraSync software. An Environmental Systems Research Institute (ESRI) ArcGIS Geodatabase was also built around this structure, giving every organization access to the data required to make an accurate assessment via the GIS of a resource's historic integrity and potential National Register eligibility.
Development of the data dictionary was the key factor in streamlining the cultural resource data collection process, according to Lazaras, who stressed that considerable time had to be dedicated up front to ensure the dictionary was precisely tailored to the task at hand. In the case of post-Katrina New Orleans, that meant creating a menu that allowed data collectors to describe a structure's architectural style using local terms while also assessing its structural integrity relating to flood damage.
"We worked with all consulting parties, including SHPO and the City's Historic District Landmarks Commission, to ensure the correct architectural terms specific to New Orleans were used in the data dictionary," said Lazaras. "If FEMA had been surveying fire-damaged buildings in another part of the country, the data dictionary would have been different."
After working through the process of creating
a data dictionary structure Lazaras believes
that a similar process built around a field
tested data dictionary and corresponding
geodatabase can be rescaled to fit a wide
variety of disaster situations.
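To make the idea of a menu-driven data dictionary concrete, the following minimal Python sketch shows the general technique; the attribute names and pick-list values are invented for illustration and are not the actual FEMA/SHPO dictionary.

# Hypothetical, simplified data dictionary: each attribute is a controlled
# pick list, mirroring the menu-driven entry described above.
DATA_DICTIONARY = {
    "architectural_style": ["Shotgun", "Creole Cottage", "Double Gallery", "Other"],
    "roof_condition":      ["Intact", "Partially collapsed", "Collapsed"],
    "flood_damage":        ["None", "Below sill", "Above sill", "Submerged"],
}

def validate(record):
    """Reject entries whose values are not in the controlled vocabulary."""
    for field, value in record.items():
        if value not in DATA_DICTIONARY.get(field, []):
            raise ValueError(f"{field}: '{value}' is not a permitted value")
    return record

validate({"architectural_style": "Shotgun", "roof_condition": "Intact",
          "flood_damage": "Below sill"})

Constraining collectors to pick-list values is what lets non-specialist crews produce consistent, immediately shareable records.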
Creating an Efficient Workflow
Within months after Katrina hit the Gulf Coast,
the City of New Orleans and other applicants
had submitted to FEMA the first of many lists
of buildings slated for demolition, all needing
Section 106 surveys and subsequent evalua-
tion. By the time the FEMA-funded demolition
was completed in spring 2009, FEMA crews
had visited more than 40,000 properties in
six parishes in and around the city.
In order to achieve this level of efficiency,
FEMA managed up to 20 crews at a time, and
each crew included a Secretary of the Interior-qualified architectural historian and a photographer in order to meet the requirements of NHPA's Section 106. Each team typically
required less than a days training to learn
how to use the Trimble GeoXM handhelds for
GPS location and feature attribute collection.
In a normal day of data collection, each team
set out on foot with a list of specific proper-
ties to assess in a given neighborhood.
Because FEMA personnel did not have the right to enter private properties without permission (and because of the potentially dangerous conditions in most of the buildings), each crew performed its assessment standing on
the sidewalk outside the front door. As the
integrated Trimble GPS handheld device col-
lected a location point, the onscreen menu
guided the assessor through a predominantly
point-and-click process of describing more
than 40 architectural details of the roof, exte-
rior, windows and foundation. In addition, the
crews assessed the building on five aspects
of structural integrity.
As the survey menu was filled out, the data
dictionary script attached relevant metadata
to the data fields as required by the NPS data
transfer standards. Relating to the location
point, the metadata recorded the type of GPS
equipment used, accuracy range, and user
name so that others using that data in the
future would know precisely how accurate and
trustworthy it is.
While the surveyor filled out the data collec-
tion menu, the photographer used a digital
camera to snap photos looking head-on at
the building and obliquely at each side from
the front sidewalk. One other picture was usu-
ally taken from a perspective chosen by the
photographer. Photo identification numbers
were assigned to each image and entered into
the data collection menu as attributes perma-
nently attached to that property.
At the end of each day, the field crews either
hand delivered or emailed their data and
photo files to the New Orleans FEMA office.
The data was first transferred into the Trimble
GPS Pathfinder Office software where it was
quality checked and converted into shapefiles
before being uploaded directly into the pro-
ject GIS. The data collection script created
paths to the digital photos so they could be
easily linked to the property in the GIS, which
was integral to both the actual assessment
and the development of the inventory of
historic places.
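As an illustration of how feature-level metadata and photo links can travel with each collected point, here is a minimal Python sketch; all field names are hypothetical and do not reproduce the NPS data transfer standards themselves.

# Illustrative only: a surveyed point carrying feature-level metadata and
# photo links like those described above (field names are invented).
def make_feature(lat, lon, attributes, photo_ids, gps_model, accuracy_m, user):
    return {
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "attributes": attributes,                              # the 40+ survey fields
        "photos": [f"photos/{pid}.jpg" for pid in photo_ids],  # paths linked in the GIS
        "metadata": {                                          # how the point was collected
            "gps_equipment": gps_model,
            "accuracy_m": accuracy_m,
            "collected_by": user,
        },
    }

feature = make_feature(29.9511, -90.0715,
                       {"flood_damage": "Above sill"},
                       ["NO-000123"], "Trimble GeoXM", 3.0, "crew07")

Carrying the collection metadata with every feature is what lets a later user judge how accurate and trustworthy each point is.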
Speaking to the efficiency provided by the
methodology implemented by FEMA and NPS,
FEMA's Lazaras said, "We definitely had economies of scale. FEMA could conduct on-screen assessment with SHPO for hundreds of individual properties in a day, which put less stress on the agencies involved and allowed our personnel resources to be used more effectively."
A Methodology for Future NHPA
Assessments
Looking back on the New Orleans experience,
Lazaras believes that FEMA and NPS have put
in place a methodology that can easily be
used as a framework for other organizations
to efficiently assess cultural resources and
meet Section 106 requirements. At the heart
of this methodology is a focus on field data
collection of geospatial data that is easily
shared through GIS technology, resulting in
an asset for future response and recovery
efforts at the local, state, and federal levels.
The development of this methodology also
provides a template for other organizations
to proactively develop their inventory of his-
toric places. The NPS is already thinking along
those lines, encouraging state and local
preservation offices to accurately map their
cultural resource inventory sites with GPS
technology and capture the information in a
GIS as soon as possible, using cultural
resource data transfer standards as a guide.
For more information, have a look at
www.trimble.com.
GPS and GIS Technologies Speed Workflow Chart
Automated High-Accuracy Geometric Correction and Mosaicking without Ground Control Points
RADARSAT-2 data
Imagine a fully automated system used to produce accurate high resolution radar ortho-images and -mosaics all over the
world with excellent reliability! Emergency management teams could then have access to highly-accurate radar data as
soon as it becomes available for their time-sensitive needs. This was a difficult task in the past, mainly due to the arduous
process of collecting ground control points (GCPs). It is now possible with the successful operation of the RADARSAT-2
satellite and a new 3D hybrid satellite model that processes the data without user-collected GCPs.
Philip Cheng and Thierry Toutin
RADARSAT-2 satellite
RADARSAT-2 is Canada's second-generation com-
mercial Synthetic Aperture Radar (SAR) satel-
lite and was designed with powerful technical
advancements that provide enhanced informa-
tion for applications such as environmental
monitoring, ice mapping, resource mapping,
disaster management, and marine surveillance.
Following the highly successful predecessor
RADARSAT-1 program (satellite launched in 1995),
RADARSAT-2 was launched on December 14, 2007.
Figure 1a: Orthorectified RADARSAT-2 U2 data using RFM/RPC without post-processing, overlaid with Google Earth

RADARSAT-2 is the world's most advanced
commercial C-band SAR satellite and heralds a
new era in satellite performance, imaging flexi-
bility and product selection and service offer-
ings. In addition to the RADARSAT-1 heritage
modes (Fine, Standard, Wide, ScanSAR Narrow,
ScanSAR Wide, Extended Low and Extended
High), RADARSAT-2 also offers Ultra-Fine, Multi-
Look Fine, Fine Quad-Pol, and Standard Quad-
Pol modes.
RADARSAT-2 has been designed with significant
and powerful technical advancements: (1) three
to one hundred meters resolution to accom-
modate a wide range of applications. The ultra-
fine mode improves 3D object detection and
classification. (2) Flexibility in polarization
selection (HH, HV, VV, and VH) to better dis-
criminate various surface types and improve
object detection and recognition. (3) Left and
right-looking imaging options to decrease
revisit time for greater monitoring efficiencies.
(4) Solid-state recorders to guarantee image
acquisition anywhere in the world for subse-
quent downlinking with high-capacity (300 Gb)
random access storage. (5) GPS receivers on
board the satellite provide real-time posi-
tion information to obtain GPS-derived geo-
metric accuracy and greater positional control.
Since the RADARSAT-2 satellite has multiple GPS
receivers on board with accurate real-time
positioning, this information could potentially
be used in the accurate geometric processing
and reprojection of RADARSAT-2 data, replacing
the need for users to collect GCPs. This would
be a big benefit to a lot of applications where
accurate geometrically-corrected SAR images
are needed as soon as possible for time sen-
sitive applications. In this article, we will
explore the geometric correction accuracy of
different RADARSAT-2 data without user-collect-
ed GCPs using two geometric models: the empirical Rational Function Model (RFM) with its rational polynomial coefficients (RPCs), and the deterministic 3D Toutin's models (original and new hybrid).
Geometric Correction of RADARSAT-2 Data
For most SAR applications, the data must be corrected to a map projection before they become useful. Orthorectification is a common geometric correction process that requires a 3D rigorous geometric model, computed from GCPs collected by the user, and a digital elevation model (DEM) to correct for elevation distortions. However, the collection of GCPs can be a significant problem in various situations, such as study regions with no available cartographic data, no site accessibility, remote areas, featureless terrain (glaciers, deserts), timing problems, etc. In these situations it would be too expensive to collect new cartographic data and GCPs. In addition, the collection of GCPs is almost impossible for time-sensitive applications, such as flood, fire, volcanic eruption or earthquake, and oil spill monitoring. Furthermore, the GCP identification and collection process on SAR images can be much more difficult than on optical images, a problem exacerbated in mountainous areas due to SAR-specific geometric effects (foreshortening, layover and shadow).

Figure 1b: Orthorectified RADARSAT-2 U2 data using Toutin's hybrid model, overlaid with Google Earth

RFM/RPC
RADARSAT-2 data are provided with a third-order RFM and the numerical values of 80 RPCs. The RFM/RPC, using an empirical/statistical algorithm developed by MacDonald, Dettwiler and Associates (MDA), approximates their 3D SAR model of RADARSAT-2. Even if MDA mentioned that the RADARSAT-2 RFM is extremely accurate in its ability to match a rigorous zero-Doppler SAR model because of its stability (no issues with attitude variations), RFM accuracy is still limited by orbit and calibration timing uncertainties, which thus requires these RFM issues to be addressed for high-resolution SAR data. Occasionally used in the eighties, the RFM/RPC method received a great deal of renewed attention with the launch of Space Imaging's IKONOS satellite, because its sensor and orbit parameters are not included in the image metadata. The RFM/RPC method could thus be an alternative to 3D physical models, and it theoretically enables users having little familiarity with satellite data to perform the geometric correction without GCPs; only a DEM is required to correct for elevation distortions in the orthorectification. However, systematic and random errors still exist after applying the RPCs, and the results have to be post-processed with 2D polynomial functions (zero to second order) and several (3-9) accurate GCPs. The order of the 2D polynomial functions to be used in RPC post-processing depends on the type of data, the viewing angle, the study site and its relief. Alternatively, the original RPCs can be refined with linear equations and accurate GCPs. Articles in the 2000s addressing IKONOS, QuickBird and WorldView data showed good results using post-processed RPCs together with one or more GCPs. More details about RFM/RPC can be found in the paper by Grodecki and Dial (PE&RS, January 2003).
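For readers unfamiliar with the RFM, it expresses image coordinates as ratios of polynomials of the ground coordinates. The following is the standard textbook formulation (consistent with Grodecki and Dial), not MDA's exact parameterization:

\[
r = \frac{P_1(X,Y,Z)}{P_2(X,Y,Z)}, \qquad
c = \frac{P_3(X,Y,Z)}{P_4(X,Y,Z)}, \qquad
P_k(X,Y,Z) = \sum_{i+j+l \le 3} a^{(k)}_{ijl}\, X^{i} Y^{j} Z^{l},
\]

where r and c are the normalized image line and sample, (X, Y, Z) are normalized latitude, longitude and height, and each third-order polynomial carries 20 coefficients, so the four polynomials account for the 80 RPCs delivered with the data. The post-processing mentioned above then estimates a low-order 2D correction from the GCPs, for example the first-order (affine) case

\[
\Delta r = b_0 + b_1 c + b_2 r, \qquad \Delta c = b_3 + b_4 c + b_5 r .
\]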
Original Toutin's 3D physical model
The original Toutin's model is a 3D rigorous model developed by Dr.-Ing. Thierry Toutin at
Canada Centre for Remote Sensing (CCRS),
Natural Resources Canada, based on princi-
ples related to orbitography, photogramme-
try, geodesy and cartography. It further reflects
the physical reality of the complete viewing
geometry and corrects all geometric distor-
tions due to platform, sensor and Earth that
occur during the imaging process, as well as
the geometric deformations of the carto-
graphic projection. This model has been
recently adapted to the specificity of RADARSAT-
2 with a decimeter precision. The model is
user-friendly and robust and has been suc-
cessfully applied with few (3-8) GCPs to visi-
ble infrared (VIR) and SAR data, all around
the world for the past 20 years. Based upon
good-quality GCPs, the accuracy of the results
was proven to be within one-third of a pixel
for medium-resolution VIR images, one to two
pixels for high-resolution VIR images, and
within one resolution cell for SAR images. The
only constraint of Toutin's model is that a minimum of eight pixel-accurate GCPs is required for processing SAR data. More details about the original Toutin's model for RADARSAT-2 can be found in the IEEE-GRSL papers of April and July 2009.
Table 1: Systematic (bias) and random errors (Std) over 58 DGPS ICPs of RFM/RPC without post-processing
and the new Toutin's hybrid model
Figure 2a: Orthorectified RADARSAT-2 F6 data using RFM/RPC without post-processing overlaid with Google Earth
New Toutin's hybrid model
The new Toutin's hybrid model, the most recent (2010) improvement of the original Toutin's model for RADARSAT-2, uses the synergy of both Toutin's model and the RFM. The metadata, including the RFM and RPCs, are used to provide information on the satellite and the sensor as well as on the ground. Since this information is accurate enough, it is the only input required by the original Toutin's model to accurately compute all the parameters of the model. In addition to achieving an accuracy equivalent to the existing Toutin's model, an additional advantage of the new hybrid model is its capacity to be applied without collecting GCPs, which increases the applicability of RADARSAT-2 data in the previously-mentioned situations. The user, who is no longer required to collect any GCPs when using this new hybrid model, will now be able to generate accurate RADARSAT-2 ortho-images anywhere in the world with an accurate DEM.
RADARSAT-2 Test Data and Software
To confirm the previous scientific tests performed at CCRS on the new Toutin's hybrid model, additional tests in PCI Geomatics' operational environment were performed with different modes, beams, geometries and processing parameters of RADARSAT-2, acquired over four study sites with various types of terrain, such as urban/rural areas with flat-to-mountainous relief: Beauport, Quebec, and Toronto, Ontario, in Canada; Morrison, Colorado, in the USA; and Yunnan in China. The authors would like to thank the Canadian Space Agency and MDA for providing the data and support for this research. Results and accuracy of these tests were validated on accurate differential GPS (DGPS) independent check points (ICPs). These results and the ortho-images are now presented. While the RFM/RPC needs to be post-processed with several GCPs, the article compares the new hybrid model with the empirical RFM/RPC on the same level, i.e., without GCPs. PCI Geomatics OrthoEngine (OE) V10.3.2 software was used for performing these tests. This software supports reading of different satellite data; manual or automatic GCP/tie point (TP) collection; geometric modeling of different satellites using the original Toutin's rigorous model, the new Toutin's hybrid model and the RFM/RPC; automatic DEM generation and editing; orthorectification; and either manual or automatic mosaicking (www.pcigeomatics.com).

Table 3: Systematic (bias) and random errors (Std) over 4 DGPS ICPs of RFM/RPC without post-processing and the new Toutin's hybrid model

Figure 2b: Orthorectified RADARSAT-2 F6 data using Toutin's hybrid model, overlaid with Google Earth

Beauport, Canada
Beauport is located north of Quebec City, Quebec, Canada. The elevation ranges from almost 10m in the city at the southeast to around 1000m in the Canadian Shield to the north. Two RADARSAT-2 ultra-fine mode, single look complex (SLC) scenes (1 by 1 look; 1.64-2.4 by 3m resolution; 1.3 by 2.1m spacing) in VV polarization from descending orbits, with incidence angles of 30.8°-32° (U2) and 47.5°-48.3° (U25) at the near-far edges, were acquired on September 10 and 14, 2008, respectively. Fifty-eight DGPS survey points with 3-D ground accuracy of 10-20 cm were collected on both images and used as ICPs for validation.
Table 1 shows the statistical results of RFM/RPC without post-processing and the new Toutin's hybrid model for U2 and U25. It shows that RPC without post-processing generated large 2D biases (systematic errors) of tens of meters and 1-2 m standard deviations (random errors), both predominant in the X-axis, corresponding roughly to the range direction, where the largest elevation error occurred. In addition, the errors are dependent on the beam (incidence angle): the steeper the beam, the larger the error. The new Toutin's hybrid model corrected most of the RPC biases, leaving half-resolution biases (1-2 m), which are acceptable for most cartographic applications. The advantages of the new hybrid model are more obvious with large geometric distortions, such as U2.
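For reference, the bias and Std reported in the tables are the usual statistics of the ICP residuals e_i along each axis (a standard formulation, not specific to this study):

\[
\mathrm{bias} = \frac{1}{n}\sum_{i=1}^{n} e_i, \qquad
\mathrm{Std} = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\bigl(e_i - \mathrm{bias}\bigr)^{2}},
\]

with n = 58 for Beauport. The bias captures the systematic shift that the hybrid model removes, while the Std captures the remaining random error.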
Figures 1a and 1b show the orthorectified U2 data using RFM/RPC and Toutin's hybrid model, respectively, overlaid with Google Earth. It can be observed from figure 1a that the roads in adjacent images are misaligned; this is clearly visible in the upper right portion of the image when using the RFM/RPC methodology. In figure 1b, however, this misalignment has been eliminated by utilizing the Toutin hybrid model.
Toronto, Canada
Toronto is an urban area with elevations ranging from 80m to 200m. Three data sets acquired in August and September of 2008 were tested: (1) standard mode (S1), HH and VV polarization, ground range at 12.5m image spacing with near-far incidence angles of 20.0° to 27.2°; (2) fine mode (F6), HH polarization, ground range at 6.25m image spacing with near-far incidence angles of 47.0° to 49.3°; and (3) ultra-fine mode (U7), HH polarization, ground range at 1.56m image spacing with near-far incidence angles of 34.8° to 36.1°. Nine DGPS survey points with 3-D ground accuracy within 1m were collected from the images and used as ICPs for validation. Table 2 shows the statistical results of RFM/RPC without post-processing and the new Toutin's hybrid model for the three modes and beams. The biases were largely improved using the new Toutin's hybrid model, while the standard deviation differences are non-significant, except that a better standard deviation is obtained with the new hybrid model for U7, certainly because of larger geometric distortions from the combination of steep incidence angle, smaller SAR resolution and image spacing. The same is apparent with U2 in Beauport (Table 1). While 9 ICPs are not statistically enough to ensure an accurate comparison of the random errors, they confirm the Beauport results and the advantages of a rigorous model with large-distortion images. Figures 2a and 2b show the orthorectified RADARSAT-2 F6 data overlaid with Google Earth using RFM/RPC and the new hybrid model, respectively. It can be seen from figure 2a that the roads are misaligned when using RFM/RPC.
Morrison, USA
Morrison is mainly a mountainous area with elevations ranging from 1600m to 2800m. A RADARSAT-2 multi-look fine beam (MF3) scene with HH polarization, ground range at 6.25m image spacing, was acquired on April 10, 2009. The incidence angles vary from 42.0° at the near range to 44.7° at the far range. Due to the mountainous terrain, only 4 DGPS survey points with 3-D accuracy within 1m could be accurately collected from the image. Table 3 shows the statistical results of RFM/RPC without post-processing and the new Toutin's hybrid model: again, bias improvement and a non-significant standard deviation difference with the new hybrid model. The shallow incidence angles do not generate too many distortions, which reduces the advantages of a rigorous model. Figure 3 shows the orthorectified data using Toutin's hybrid model overlaid with Google Earth.
Yunnan, China
Yunnan, located in the west of China, consists mainly of mountains with elevations ranging from 2000m to 7000m. A RADARSAT-2 multi-look fine (MF1) scene with HH polarization, ground range at 6.25m resolution, was acquired on May 7, 2009. The incidence angles vary from 37.6° at the near range to 40.7° at the far range. Survey points were not available to validate this data. Figure 4 shows the orthorectified image overlaid with Google Earth.
Figure 3: Orthorectified RADARSAT-2 MF3 data using the new Toutin's hybrid model, overlaid with Google Earth.
Figure 4: Orthorectified RADARSAT-2 MF1 data using Toutin's hybrid model, overlaid with Google Earth.
Automatic Mosaicking of RADARSAT-2 images
The successful generation of high-accuracy RADARSAT-2 ortho SAR images means that it is now possible to create seamless mosaics of RADARSAT-2 data over a large area or a country, without GCPs, using an accurate DEM. However, mosaicking and color balancing are usually extremely time-consuming processes. The PCI automatic cutline searching, mosaicking and color balancing tools can be used to perform the entire process automatically; no human intervention is required during the process.
The automatic process should only be used
with the new Toutin's hybrid model for the
best accuracy. The RFM/RPC will generate dif-
ferent biases (10-50 m) for each image of the
future mosaic. Consequently, these differen-
tial RFM biases will not only cause misalign-
ments (local bias due to the absence of block
adjustment) between the ortho-images but
will generate supplemental random errors in
the entire mosaic, which will be combined
with the random errors of each image.
Four RADARSAT-2 multi-look fine beam scenes with HH polarization at 6.25m spacing over Yunnan, China, were also used to test the mosaicking using the new Toutin's hybrid model and the SRTM 90m DEM. The data were: MF1F, with a near-range incidence angle of 37.6° and a far-range incidence angle of 40.6°, acquired on May 7, 2009; MF22, with a near-range incidence angle of 32.3° and a far-range incidence angle of 35.6°, acquired on May 13, 2009; MF6F, with a near-range incidence angle of 47.5° and a far-range incidence angle of 49.9°, acquired on May 21, 2009; and MF4, with a near-range incidence angle of 43.3° and a far-range incidence angle of 45.9°, acquired on May 23, 2009.
Figure 5 shows the mosaicked images of the four scenes, separated by red lines.
Conclusions
This article has demonstrated the superiority of the new hybrid Toutin's model without user-collected GCPs on various issues critical for operational applications: robustness; consistency, independent of modes and beams; a block adjustment process that reduces relative errors; smaller systematic errors; and smaller random errors for images with large geometric distortions due to the combination of incidence angle, terrain relief, sensor resolution and image spacing. On the other hand, the RFM/RPC from the RADARSAT-2 data without post-processing could not generate accurate ortho-images and mosaics, due mainly to the systematic/random errors dependent on modes and beams, but also to the larger random errors in large-distortion images (steep incidence angles, high mode resolution, small image spacing and high relief). Post-processing the RFM/RPC with several (3-9) accurate GCPs is thus necessary to achieve the higher cartographic standard and the same results as the new hybrid Toutin's model. The new Toutin's hybrid model presented in this article will enable automatic mosaicking of accurate ortho-images over a large area or an entire country without any user-collected GCPs. Its main advantage in operational environments is its capacity to be applied without collecting GCPs, which increases the applicability of RADARSAT-2 data in remote and inaccessible areas, such as northern/southern glaciers and ice-covered sites, deserts, mountains, and more.

Dr. Philip Cheng (cheng@pcigeomatics.com) is a senior scientist at PCI Geomatics.
Dr.-Ing. Thierry Toutin (toutin@nrcan-rncan.gc.ca) is a principal research scientist at the Canada Centre for Remote Sensing, Natural Resources Canada.

Figure 5: Mosaicked RADARSAT-2 image of four multi-look fine beams over Yunnan, China.
Table 2: Systematic (bias) and random errors (Std) over 9 DGPS ICPs of RFM/RPC without post-processing and the new Toutin's hybrid model
Easily Share Vast Amounts of Data Internally and Externally
Governments at the local, regional, and national levels require current, accurate geographic information to make better
decisions in multiple areas. Geospatial technology can help government organizations better share
vast amounts of data internally and externally.
By Robert Widz
Intergraph, a leading, global provider of
geospatially powered solutions to the
defense and intelligence, public safety and
security, government, transportation, photo-
grammetry, utilities, and communications
industries, offers governments an effective
solution for data sharing called the Intergraph
GeoMedia ResPublica Intranet. It is a geospa-
tial solution based on Intergraph's GeoMedia
WebMap application.
GeoMedia WebMap maximizes the value of
an organization's geographic information by publishing it on the Web, providing employees,
customers, and the public fast and easy access
to geospatial data and functionality. The
GeoMedia ResPublica Intranet makes high-
quality GIS data available within an organiza-
tion to a large number of users, via an Intranet
and/or the Internet, with an unlimited number
of workstations. It offers a comprehensive range of uses, from a simple address search to integrating cadastral and survey data, working with land plans, aerial photographs and main network plans, and carrying out complex spatial analyses.
Innovative Processes for Better
Efficiency
Using an intelligent caching process for geospatial data offers the opportunity to cache selected graphical data, including aerial photos and land use plans, either on the server, on the LAN or on the client. These data no longer have to be produced by the map server and can be used directly from the cache, resulting in higher performance in terms of real-time access to data and a reduction in the volume of data to be transferred from the server. Updating the cached data on the client takes place completely automatically via a timestamp, and there is the option to run the web client in offline mode, without any contact with the server. This makes it possible, for example, for mobile staff to operate the application, and it also assures continuity in case of system failures or other network problems.
Depending on the requirements or the format of the primary data, vector data can be transferred to the client in Computer Graphics Metafile (CGM) or Geography Markup Language (GML) format, and raster data in Joint Photographic Experts Group (JPEG) or Portable Network Graphics (PNG) format, or published as a server cache.
The general dataset is split according to object class into appropriate squares. This tiling process offers additional benefits for geo-caching:
- Only the data that can be properly viewed at the current scale is transferred via the network and displayed on the client.
- Rapid image refresh rates and low RAM requirements.
- The setup can be configured according to the available environment (client equipment, server performance, bandwidth, etc.).
- Only modified tiles need to be updated in the cache.
- Tile size is defined by the administrator according to the visible scale range and the data density.
Of course, GeoMedia ResPublica Intranet can
also be used without tiling and with permanent
live access.
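To make the tiling-plus-timestamp idea concrete, here is a minimal Python sketch; the class and callable names are invented for illustration and do not reflect GeoMedia's actual API.

class TileCache:
    """Client-side tile cache refreshed via server timestamps (illustrative only)."""
    def __init__(self, fetch_tile, fetch_timestamp):
        self.fetch_tile = fetch_tile              # callable(layer, x, y, scale) -> bytes
        self.fetch_timestamp = fetch_timestamp    # callable(layer, x, y, scale) -> float
        self.tiles = {}                           # key -> (timestamp, data)

    def get(self, layer, x, y, scale):
        key = (layer, x, y, scale)
        server_ts = self.fetch_timestamp(layer, x, y, scale)
        cached = self.tiles.get(key)
        if cached and cached[0] >= server_ts:
            return cached[1]                      # tile unchanged: serve from cache
        data = self.fetch_tile(layer, x, y, scale)  # only modified tiles are re-fetched
        self.tiles[key] = (server_ts, data)
        return data

The point of the design is that the cheap timestamp check, not the tile itself, crosses the network on every request, so unchanged tiles cost almost nothing.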
Smart Client
GeoMedia ResPublica Intranet can run with-
out a browser as a standalone Java applica-
tion. In addition, GIS data can be offered to
workstations that do not have a web brows-
er installed. To start the solution, a hyperlink
from the browser or standard Java WebStart
can be used. Furthermore, the communication
between the client and application servers is
completely based on state-of-the-art web ser-
vices via Simple Object Access Protocol
(SOAP).
Modules for More Security and
Optimized Processes
The ResPublica Administrator administration
tool is fully web based and can be used in a
uniform GUI to manage all the rights needed
to operate the system securely. User identifi-
cation is based on a login and password, and
organized via groups. There is also the oppor-
tunity to use already available user directo-
ries like Active Directory or LDAP to integrate
access control for the GIS into the available
IT rights concept. Some restrictions can be
made by assigning corresponding rights to the various users and user groups (according to spatial criteria, functional criteria, topical criteria, or pre-defined analysis options).
Also, any chosen number of functionalities can be integrated into the user interface by defining function groups in the administration tool of ResPublica Intranet. The administrator can define the groups according to his own precepts, apart from standard function groups like Querying and Measurement. Correspondingly, the administrator uses this to define and shape the user-specific desktop for each user. Attribute queries of any chosen complexity can be created, tested and made accessible to all users, or to just one, using standard SQL in the administrator user interface. For monitoring purposes, the administrator can access on the fly the login statistics and other statistics on the queries and analyses performed, via the administration tool of ResPublica Intranet. Quick and easy access to this information empowers the administrator to assist users at all times.
ResPublica Workflow Manager helps to predefine and control complex operations. Users can also document and log the procedures performed. ResPublica Workflow Manager is XML-based and guides users through an operation sequence automatically. Previously, the sequence of procedures in, say, a planning application depended on the person responsible calling up the correct functions, enabling/disabling the requisite feature classes, calling up the queries, and so on. With Workflow Manager, these steps are predefined and users are simply offered the function required for the current working step. Also, the map is controlled automatically in the background. Users can define additional tests and conditions with each step (node in the workflow tree). Taking the processing of planning permissions as an example, users need to complete or perform pre-defined steps correctly, such as automatically verifying the land parcel number for the building plot, before being able to move on to the next step. After making each entry, users are only presented with those continuing steps in the operation that are feasible and logical. This eliminates the possibility of making incorrect entries almost completely.
With the Form Generator, users can define which user-specific attributes they can compile and edit, and in which form. The control elements to be used (e.g. text, check, combo, and list boxes) and mandatory fields are also defined for each edit control. While making each entry, the map at the source is configured automatically with each working step. This guarantees a logical view in each case (scale, extract, visible layers, etc.). Additional functions, such as forwarding map extracts by email during the course of a working step, or displaying hierarchical levels in a form, can be implemented depending on the requirement.
ResPublica Automate provides numerous options (text files, DDE, Java applet or XML) for interfacing external desktop and/or web applications. It lets users control the ResPublica client in a variety of ways, and it forms the basis for all interfaces and specialist applications. A particular detail of the XML Automate interface is that the communication is based on the exchange of XML files and on the use of so-called File-Watchers, which are defined in the administrator user interface and inform ResPublica Intranet about newly created files. An export is triggered by the ResPublica Intranet user via a button defined specifically for the external application; the export XML is then transferred to the third-party software by storing it in the export folder.
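As an illustration of the File-Watcher pattern described above, here is a minimal polling sketch in Python; the folder layout and XML handling are assumptions for illustration, not the ResPublica implementation.

import time
import xml.etree.ElementTree as ET
from pathlib import Path

WATCH_DIR = Path("exchange/in")   # hypothetical exchange folder
SEEN = set()

def watch(poll_seconds=2.0):
    """Poll the exchange folder and hand newly created XML files to a handler."""
    while True:                                      # runs until interrupted
        for xml_file in WATCH_DIR.glob("*.xml"):
            if xml_file in SEEN:
                continue
            SEEN.add(xml_file)
            root = ET.parse(xml_file).getroot()      # parse the exchanged file
            print("new file:", xml_file.name, "root element:", root.tag)
        time.sleep(poll_seconds)

Exchanging files through a watched folder keeps the two applications loosely coupled: neither needs to call the other's API directly.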
Robert Widz is Country Manager Poland and
Managing Director EMEA Government at
Intergraph. He may be reached at
robert.widz@intergraph.com.
The Location Business Summit: Are We There Yet?
"Location-based services: are we there yet?" and "How is there (more) money to be made with location business systems?" were central questions at the Location Business Summit in Amsterdam. With an expected further growth of smartphones equipped with GPS, there is certainly room for more location-based systems, and thus money to be made. The question is by whom, and how. Also, what lessons can be learned from the geospatial web for driving profits? Over two days, more than 50 speakers from the industry gathered to share their thoughts on these matters.
By Eric van Rees
Location-based services have come a long way. Part of their success has to do with technology, part with data providers and companies that use location as a way of displaying data, and also with the fact that the data is free for everyone to use wherever they want. But what is the next step? Who will lead the way in location-based systems and decide what others will do? What are the challenges ahead, and how should they be tackled? What lessons can be learned from geospatial parties that deal with location every day? These questions and more were addressed during the Location Business Summit in Amsterdam.
As was to be expected, this was not a technological conference, but one where different groups of people met, discussed their thoughts and learned from each other. Familiar parties such as Google, Yahoo, Layar, OpenStreetMap and TeleAtlas were present, but also marketing agencies and telecom companies, as well as major hardware and software companies such as Microsoft and Dell.
Where is the Money?
The main questions of the conference were addressed by David Gordon, Director of Strategic Planning at Intel. One of the main questions was "Where is the money?", meaning: how is money to be made with location-based services (read: advertising)? This question came up during almost every presentation. It is easy to see why: with Google and Nokia offering free map services on mobile devices, mobile system providers are asking themselves how to respond to this move and how to make money with mobile, location-enabled advertising. Considering the diversity of players in this market and the fact that sales of GPS smartphones are still increasing, all parties are eager to take their share of the cake.
Google's Geospatial Technologist Ed Parsons followed Gordon's short opening presentation with a talk that focused on data rather than the services around the data. Parsons argued that without context, data itself is irrelevant, because place equals points of interest and people. He made this clear with an example showing that the location where information is displayed is just as important as the information itself: context determines whether a message comes through. This message was repeated in other presentations: everybody seemed to agree that there is a need to personalize location-based information for the user. The question is how, and by what means.
Of course, there are plenty of barriers that may slow the spread of location-based services, such as the privacy of users, their location and their behavior, but also the lack of a killer application that everyone uses, and technological barriers such as the screen size of mobile devices and the lack of indoor positioning. Some talks addressed the legal aspects of sharing location information, or even of browser cookies that reveal information about consumer behavior.
On personalizing location-based information, Parsons argued that to be able to tailor content to the individual user, the service needs information about that user so it can give better search results. Google is already doing this, and some speakers agreed that Google is in the driver's seat in the location business market. Everyone was eager to hear Google's presentation on mobile local advertising during the second conference day. One of Google's new initiatives in this field is Google Local Shopping, where shop inventories are searchable for mobile users through Google. The other way around is also possible: take, for instance, geofencing, where mobile users receive text messages about discounts at the shop they are near at that moment. Although research has shown that geofencing can be quite effective as a marketing tool, it remains to be seen whether people are in favor of it, as such messages may not be personalized and could be considered intrusive.
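As a back-of-the-envelope illustration of how a geofencing trigger works, here is a minimal Python sketch; the radius and coordinates are invented, and real services use far more sophisticated positioning.

from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 points."""
    R = 6_371_000  # mean Earth radius in meters
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def in_geofence(user, shop, radius_m=150):
    """True if the user's position falls inside the shop's circular geofence."""
    return haversine_m(user[0], user[1], shop[0], shop[1]) <= radius_m

# Example: trigger a discount message when a user passes a shop in Amsterdam.
if in_geofence((52.3731, 4.8926), (52.3740, 4.8920)):
    print("Send discount SMS")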
Verdict
The target audience of the conference was notably different from that of a normal geospatial event. This was not a technical conference, which had its strengths and weaknesses. I, for one, learned a lot more about how businesses use location-based services and make a profit with them, but honestly there was not much new to be learned. There were no big announcements or exciting new products. Augmented reality was mentioned in only one presentation, but this topic certainly deserved wider attention, also from Layar themselves, as they wanted to keep their new product announcements to themselves and announced a Layar event in June.
From a geospatial perspective, I was surprised how non-geospatial people, like the majority of this conference, take maps for granted, or mapping, for that matter, or data quality. The big discussion between crowdsourcing (OSM) and a blend of traditional mapping and crowdsourcing (used by Navteq) seemed to go over the heads of most attendees. Ed Parsons remarked that people have problems with maps, that mapping is not that easy, and gave the example that perfect circles on a map in a Mercator projection should be read with suspicion, as a sign that something is wrong.
The attendees did notice other barriers for location-based systems to take off fully: roaming costs, but also battery power, are still big obstacles for mobile users of location-based systems. To answer the question "are we there yet?", I think the answer should be: no, not yet.
For more information, have a look at
www.thewherebusiness.com/locationsummit
An Interview with Ken Spratlin
The full range of mobile mapping systems, airborne systems, and photogrammetry and digital surface modeling solutions can be found in Trimble's GeoSpatial product line. Trimble's GeoSpatial product line has its roots in the industry's leading spatial imaging companies:
- Applanix, an innovator in GNSS+Inertial technology and its applications to mapping;
- Geo-3D, an innovator of georeferenced mobile mapping technologies;
- RolleiMetric, an innovator of aerial metric cameras;
- TopoSys, an integrator of multiple technologies into complete aerial data capture systems;
- INPHO, a developer of end-to-end photogrammetric and terrain-modeling software;
- and, recently, Definiens, a Germany-based company specializing in image analysis solutions.

Joc Triglav, Editor

Q: Trimble's GeoSpatial Division was created in the middle of 2009 by merging the previously acquired and well-known INPHO, Geo-3D, RolleiMetric and TopoSys companies. How does the mixture of German and Canadian business backgrounds work together? What were your business and technological priorities in this first year of operation, and how successful were you in meeting them?

Background information: it is difficult to point to an actual creation date, but we think of it as being at Intergeo 2008, just after we announced the acquisition of RolleiMetric and TopoSys, adding to our prior acquisitions of INPHO and Geo-3D. We met in a Starbucks (very American sounding, isn't it) in Bremen, Germany, for introductions of the respective management teams.
Since 2000, Trimble has grown to be a truly international company in both development sites and sales regions. So working shoulder-to-shoulder with a variety of cultures is part of our daily reality, and a skill that we must continually strive to improve. Trimble's GeoSpatial Division is largely staffed by Germans, French Canadians and Americans, but we have staff in other regions as well. The groups have come together well and are actively involved in cross-site product development. Certainly there is a language barrier, but more challenging are time zone differences and not being able to meet face-to-face often enough, hence why I referred to it as a skill.
Each group was focused on different but related technologies and had successful products, with little overlap. So our initial priorities were simple: (1) don't break it, and (2) leverage Trimble's global distribution strength to offer the complete product line worldwide. We are now operating an integrated sales team. An engineering council coordinates the four development sites, and they are actively working on joint development projects. The first major fruits of these joint development projects will be public by the time this interview is published.
Q: Your company's product portfolio covers solutions for various airborne and ground-based data acquisition systems, as well as the subsequent data processing modules and solutions. Please outline the main products of your portfolio, especially in the four main areas: mobile data capture, aerial mapping, aerial photogrammetry and laser scanning software, and applications.
Most broadly, the products are focused on mobile mapping from aerial and land vehicles. The sensor technologies include metric cameras and laser scanning, as well as the integration of other sensors as appropriate for the specific application. Georeferencing is based on Trimble's GNSS technologies as well as GNSS+Inertial from Trimble's Applanix subsidiary. The products range from individual sensors, to fully integrated, turnkey data capture systems, to the processing software to turn data into answers.
The Land Mobile Data Capture systems are used for a variety of applications, including roadway planning and monitoring, roadway right-of-way asset management, and mobile survey; the applications are endless. Our Trimble Trident-3D software provides high levels of automation for the detection and identification of some asset types, like road signs. Automation is key to making these types of systems productive and profitable; feature extraction is therefore one of our highest priorities.
The Aerial Mapping systems include the Trimble Aerial Camera, the
Trimble DSS (developed by Applanix), and the Trimble Harrier. The
Trimble Aerial Camera is a ruggedized, metric, medium format camera
for aerial mapping. The Trimble DSS is a fully integrated, turnkey aerial
mapping system including camera, direct georeferencing, flight
management system, azimuth mount, data storage, computer system,
and power distribution. The Trimble DSS RapidOrtho combines this
system with a rapid orthophoto generation workflow for applications
such as emergency response.
The Trimble INPHO software provides a complete solution for aerial
digital photogrammetric and laser scanning processing.
Q: What is the relationship between Trimble's GeoSpatial product line and Applanix?

Applanix operates as a separate division of Trimble. Its industry-leading GNSS+Inertial technology is applied to many different applications in many different industry segments, hence the reason to operate as a separate division. We then work together specifically on mobile mapping, and we incorporate the Applanix POS systems into the Trimble Mobile Data Capture and Aerial Mapping products. Applanix also developed the Trimble DSS system. Our sales forces work together as one for the Aerial Mapping products.
Q: Trimble's GeoSpatial Division offers data collection and information processing productivity solutions for several key areas, like roads, highways and rail, or utilities and energy transmission and distribution. What are the crucial advantages for customers using your solutions?
First and foremost, Trimble has been involved in these industries,
successfully providing location-based solutions and geomatics
solutions for an extended period of time. As a result, Trimble and its
business partners have gained deep domain expertise. This allows us
to better support our customers, and also hopefully be in a better
position to gain key insights about problems and translate that know -
ledge into compelling new solutions. To understand where we are going
together with our customers, it is worthwhile to review our website to
appreciate the depth and breadth of the technologies and solutions
we now have across all of Trimble.
For the industries you specifically asked about, the customers all have
a common need: systems to collect very dense datasets with high accu-
racy and high precision, combined with workflows that turn data into
answers quickly and accurately. By integrating best-of-class technolo-
gies, we are able to provide turnkey data capture systems that are high-
ly accurate and productive, while being relatively easy-to-use. I say
relatively easy-to-use because we have to remember that mobile map-
ping is still an early adopter technology. Much work remains across the
industry, including within Trimble, before these solutions become as easy-to-use as, say, a GNSS rover.
On the processing side, we have some of the best engineers in the
world focused on applying GNSS, GNSS+Inertial, photogrammetry, laser
scanning, and other related technologies to create 3D, 4D, and
ultimately 5D models. Our Trimble INPHO software is recognized as an industry-leading solution for orthophoto production and digital surface modelling. It is used by many of the world's leading geospatial companies involved in mapping at all scales, from small projects to national mapping.
Our Trimble Trident-3D Analyst software provides high levels of automa-
tion for extracting roadway assets and geometry from data produced
by our Land Mobile Data Capture systems. Companies that have
purchased systems from our competitors often tell us they use our soft-
ware to make these systems productive.
Q: In this decade an accelerated convergence and integration of geospatial market segment technologies based on geospatial imaging, like aerial mapping, land survey and GIS, seems inevitable, and Trimble as a whole is expected to be one of the motors of these processes. How, and with which activities, is your Division addressing these challenges?
This is certainly a trend Trimble expects will continue, and Trimble is
actively working to drive it. Each of these industry segments is large
and complex, with significant barriers to change. The required list of
activities is too large for any one division or any one company to exe-
cute. The activities range from product development, to interoperability
standards, to, at the most fundamental level, business models, both for suppliers like Trimble as well as for service providers and end users. Trimble's
GeoSpatial Division spends a significant amount of time listening to
customers and exchanging ideas, particularly around which new
business models could accelerate this convergence. And we spend sig-
nificant time working with other Trimble divisions to understand how
our technology can be applied to tough problems in engineering,
construction, and other activities related to infrastructure.
Q: Please describe in detail the idea of Trimble's Connected Site solution and its functional contribution in creating a seamless workflow environment among the Trimble products and technologies.
Organizations face major challenges to increase labor and machine
productivity, reduce rework, optimize processes, increase quality, and
reduce input costs (materials, fuel, etc.). The larger the scale of a
project, the more difficult it becomes for all the stakeholders to work
as a team to plan, execute, monitor, and modify those activities as
needed. With projects involving multiple locations and organizations
Latest News? Visit www.geoinformatics.com
I nt er vi ew
33
July/August 2010
(e.g. architect, engineering, construction, operator; office, field), gaining
access to current, accurate information about the project status is diffi-
cult. The key word is "Connected". Trimble is developing connected products and connected communities to speed the dissemination of timely information, accessible to all stakeholders. Visit www.myconnectedsite.com to see what this looks like today. Within Trimble's GeoSpatial Division, we are focused on making timely and
accurate geospatial data available to these connected communities. We
then focus on converting this data into answers within specific areas of
our expertise. The data has value to multiple project stakeholders across
multiple lifecycle phases.
Q: What is your opinion on the existing complexity and variety
of geospatial data standards and metadata? Do you see this
variety as an obstacle or as a necessity? How is Trimble's
GeoSpatial Division addressing this matter in its daily practice?
Geospatial or geomatics are such broad terms. For example, we often
refer to Trimble's Mapping & GIS solutions as addressing industry seg-
ments from archaeology to zoology. Across such wide and disparate
fields of expertise and activity, variety is required. Survival of the
fittest will take care of the standards that become obstacles to sol -
ving problems. Our customers continuously provide feedback about the
standards they need for interoperability. We address those within our
product roadmaps.
Q: How are geospatial data quality issues addressed in the
Trimble GeoSpatial product line? Which options do your
customers have in selecting and combining various kinds of
geospatial data acquisition methods for certain spatio-temporal data quality and accuracy levels?
That is an excellent question. Data quality (and related attributes of
how the data was collected), or perhaps more specifically the lack of knowledge of data quality, is a barrier to the convergence of geospatial market
segments. It is also a barrier to the use of data portals and other forms
of data aggregation. Differences in terminology, processes, regulations
or lack of regulations, and resulting data quality are significant between
different market segments. Within Trimble's GeoSpatial Division, systems cover two quite different segments: aerial and land mobile. These
two are also quite different from land survey. For example, terrestrial
laser scanning data (those famous point clouds) typically does not
provide the accuracy of each point. However, the land survey segment
operating with GNSS and total stations uses instruments and work-
flows designed to provide accuracy of each individual measurement.
Increasingly our customers, both service providers and end users, want
to integrate data from aerial, land mobile, and survey into 3D and 4D
models. It is successfully done today, but certainly there is much room
for improvement. It will be at least academically interesting to see
whether this is ultimately solved through standardization processes, or
through technical solutions (more software!).
Q: What is your opinion on geospatial data privacy issues
raised by the public and media regarding the practical
implementation of possibilities of modern geospatial imaging,
measurement and positioning technologies? Where, if at all, are
the limits between public and private for geospatial data
acquisition and presentation? How does this problem affect
your business?
This is a legitimate issue and one that we all have to better address.
Ignoring it and then asking for forgiveness, as some have done,
creates additional barriers to adoption for the entire industry, delaying
the benefits of this technology to society. Trimble and Trimble's GeoSpatial Division supply products to service providers and end
users, and therefore this issue most directly impacts them as they
decide what data to collect and how to use it. However, as products
become more and more connected, the lines are blurred. With this in
mind, privacy issues will become an area that manufacturers, service
providers, and end users have to address.
Q: Finally, with your excellent knowledge and technology education background from Georgia Tech and MIT, what do you think about modern geospatial/geolocation technology in general? Are we already close to the situation where accurate geolocation information becomes as ubiquitous a utility as time has been for centuries, or is there still a long road to go? What lies ahead, and where will the future development of geospatial technology take us?
Geospatial technology is ubiquitous in many industries today, regard-
less of region of the world. But it is always amazing to visit with some
other industries and see how little the technology is used. Paper and
pencil is often the biggest competitor to these technologies.
In my prior career in spacecraft guidance and navigation, large scale
simulation of complex systems that could not be flight tested under all
conditions was the primary tool for experimenting and asking "What if?" So personally, I'm interested in the opportunities created by the
existence of large geospatial databases, aggregating data from many
different sensors, many different disciplines, and over time. Trimble has
begun to address this opportunity with activities in Road and Rail
Alignment, and Transmission Line Design & Optimization. The potential
for systems like these to provide better answers, at lower cost, and
with less project impact on the environment is compelling.
So the opportunities are still limitless. That's what makes the geospatial industry both challenging and fun.
Joc Triglav is editor of GeoInformatics.
www.trimble.com
WeoGeo: iTunes for Maps
As an early adopter of the cloud, WeoGeo offers storing, sharing, buying and selling of GIS data, maps and CAD files for users
worldwide. The company has been cited as a prime example of applying cloud computing in a Software as a Service
model. Paul Bissett, CEO & Co-Founder of WeoGeo, explains the concept behind the company, how it works, and
why sharing geospatial data is a good thing.
By Eric van Rees
WeoGeo is an American company that offers management and marketplace services for geospatial and CAD content through a globally accessible platform. Data management and sharing occur through distributed and shared computing services. This is a different approach from that of a software vendor, who builds a system for an organization to work with. WeoGeo focuses on data and what happens with data. Data producers can get more value out of their data once it is produced by making that data available for others and sharing it, rather than producing it again; or, as the company describes it, "to do more with less."
Paul Bissett, CEO and Co-Founder of WeoGeo, explains what WeoGeo is all about: "Our goal is to provide organizations with the ability to manage and serve their mapping products as easily as one manages their iTunes song library. We do this by providing content management and monetization services that increase their users' efficiency and revenues in organizing their mapping library."
WeoGeo Library and Market
The company offers two basic services to its customers: WeoGeo Library and WeoGeo Market. First, there's the WeoGeo Library, the company's content management service. Bissett: "Think of a cloud-based iTunes library service rather than iTunes on your desktop computer." The Library is a browser-interfaced cataloging system for indexing, sharing, and delivering customized geospatial content. Administrators control user access to the Library's content and may optionally list datasets for sale on WeoGeo Market. Powerful server-side Spatial ETL (Extract, Transform, Load) provides spatial and spectral clipping, projection to alternate coordinate systems, and file-format translation.
"This ETL requirement is one of the primary reasons we partnered with Safe Software," says Bissett. "They have created the best tool that we know of to provide ETL functions, and it is embedded in both CAD and GIS software solutions."
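To make the ETL idea concrete: a typical server-side step such as projecting a dataset to an alternate coordinate system boils down to a few library calls. The Java sketch below does this with the open-source GeoTools library; it is only an illustration, since WeoGeo's actual pipeline runs on Safe Software's FME, and the EPSG codes and coordinates are arbitrary examples.

import org.geotools.geometry.jts.JTS;
import org.geotools.referencing.CRS;
import org.locationtech.jts.geom.Coordinate;
import org.locationtech.jts.geom.Geometry;
import org.locationtech.jts.geom.GeometryFactory;
import org.opengis.referencing.crs.CoordinateReferenceSystem;
import org.opengis.referencing.operation.MathTransform;

public class ReprojectSketch {
    public static void main(String[] args) throws Exception {
        // Sample codes: WGS84 lon/lat in, Web Mercator out.
        CoordinateReferenceSystem source = CRS.decode("EPSG:4326", true); // true = lon/lat axis order
        CoordinateReferenceSystem target = CRS.decode("EPSG:3857");
        MathTransform transform = CRS.findMathTransform(source, target, true); // lenient datum handling

        Geometry point = new GeometryFactory().createPoint(new Coordinate(4.0, 51.9));
        Geometry reprojected = JTS.transform(point, transform);
        System.out.println(reprojected); // same location, now expressed in meters
    }
}

Clipping and file-format translation follow the same pattern: the heavy lifting happens on the server, and only the customized product is delivered to the buyer.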
Secondly, there's WeoGeo Market, an iTunes store for customers. This hosted e-commerce site gives any WeoGeo Library user the ability to add a price to their mapping products. With a simple click of a button, users can expose these products to the world to derive mapping sales.
Additionally, the Library is available as an appliance that delivers the same features as the SaaS Library product, but allows those
WeoGeo Library Example
customers with sensitive or secure data to put
it behind their organizational firewall. "This means it can be integrated into existing infrastructures with little effort, but still offers a plug-in-and-go ease of use," says Bissett.
Client Base
WeoGeo's customers are large and small enterprises focused on the business side of the mapping industry. These include utilities, GIS engineers, government services, and private data vendors, who work with web-based tools delivered either over the Internet or through behind-the-firewall implementations.
Bissett: "Our original client base was mostly US-based imagery providers of raster-based mapping products. This resulted from our experience as DoD (Department of Defense) imaging contractors. We have recently partnered with Safe Software to provide more support for users of vector-based mapping products. We see the vector market becoming increasingly important in our customer base."
To support this segment of the client base, the company released integration tools at the ESRI FedUC earlier this year that allow ESRI ArcGIS Desktop users to use WeoGeo Library and Market services from within their ArcMap environment.
While WeoGeo's origins are in the US, Bissett notices an increasing demand for the company's products from Europe and Australia. He explains why: "I think the governments of all western countries have become increasingly interested in sharing their data stores. Businesses, too, are looking for new ways to market and distribute their content and geospatial capabilities. These enterprise organizations are less interested in sharing per se, but instead are more interested in increasing productivity, margins and revenues."
Licensing Issues
When asked how WeoGeo handles licensing issues, Bissett refers to the WeoGeo Data License. This license basically states that the seller (the Content Provider) gives the purchaser a single-use commercial license with the product. Bissett: "This license includes a derivative works license, which gives the seller full royalty protection on any future sales of the derivative work. We track all revenue flows of derivative products within the WeoGeo Market, and make royalty payments as derivative sales are made. The original copyright holder maintains all copyrights with respect to their original content."
This model allows WeoGeo's Content Providers to establish a network of re-sellers of their valuable mapping content; while at the same time,
it provides content buyers a consistent license model to consume, create, and, hopefully, resell their value-added contributions.
WeoGeo also allows for custom licensing of data products through its Commercial Library products. "This feature will soon be implemented in the WeoGeo Market; however, the company will not be responsible for tracking use or derivative royalties for these custom licensed data products," states Bissett. "While we understand the desire, and in some cases the need, for a custom license agreement, we think that custom licensing is part of the problem within the mapping arena. We believe the use of custom licensing agreements is holding our industry back from achieving its full potential."
Data Sharing
Even though people may agree that sharing data is a good thing to do, it is not happening as much as it could. Bissett explains why: "I think sharing geospatial content is a good thing, whether it is internal enterprise sharing, free public sharing, or for-pay public sharing. The real issue is to get the content indexed and discoverable in a manner that increases our ability to acquire and use the valuable content locked within the silos of organizations. A tremendous amount of geospatial content is recreated every year because consumers of that content cannot find it."
Here's where WeoGeo comes in: "Our primary goal is to first create the archive and indexes that allow for easy search and consumption, which includes licensing. From this point, the market will decide the value of the content."
Bissett believes that an active marketplace for content raises the value of all content in the marketplace, which creates an environment for people to be paid for their expertise. "Our field is a professional field, with people who have spent many years in training to develop their skills to provide valuable services to their customers. I believe that we should work to find a way to support these people in their effort to create a livelihood from an endeavor such as geospatial analysis that ultimately benefits the sustainable use of our planet. I think that 'free' and 'advertising-supported' are not consistent with professional mapping services."
The Future
As for the future and possibilities of the cloud, Bissett is critical of the high expectations people have of these tools and services, as well as of the speed at which things will take place: "I believe that cloud computing, or rather distributed and shared computing services, will make it possible to create better, faster, and cheaper computing tools for our industry. Yet, I also suspect that these tools and services are still in their infancy, and it may take longer than people expect to see a tremendous surge in use and a dramatic increase in ROI. The roll-out of cloud services is likely to be more an evolutionary than the currently hyped revolutionary movement."
As for WeoGeo itself, Bissett expects a lot from the partnership with Safe Software. "Currently, Safe Software is providing the ETL functions behind the scenes at WeoGeo. These functions are currently limited to ETL on static files, things like file transformations and conversions, re-projections, etcetera. In the future, I would expect that we would be exposing more dynamic ETL functions to our customers. FME Server 2010 is a powerful platform, and we have just begun to scratch the surface of its capabilities."
Eric van Rees is editor-in-chief of GeoInformatics Magazine. For more information on WeoGeo, please have a look at www.weogeo.com and http://blogs.weogeo.com
Paul Bissett, CEO and Co-Founder of WeoGeo
Building Open Source Software
Geomajas
Businesses are adopting GIS applications at a fast pace. But to keep up with business needs and budget requirements,
the applications need to be easily deployable, scalable, high-performing and, on top of all that, budget-friendly. Geomajas
offers an open-source GIS framework for the development of thin-client GIS web applications that meets all these needs.
By Jan Pot
GIS applications integrate with other ICT domains, such as ERP and Business Intelligence, adding to the growth of the GIS market. At the same time, a growing number of existing solutions have started embedding GIS through OEM partnerships. As the adoption rate of GIS functionality increases, end users of traditional fat-client desktop GIS applications are confronted with the limits of the applications' technological approach. More and more professional applications migrate to the cloud, offering the provider of the
web application a lot more flexibility in terms of deployment, availability,
scalability, and security. The end user benefits as well, as the use of a
web service allows working with a true thin client. It comes as no surprise
that organizations expect to move the GIS component of their applica-
tions to the cloud as well. This is the case, for example, with a growing
number of e-government web services.
The Flemish government took the initiative to develop a GIS platform that
would be ready to support the GIS needs of the future. This led to
Geomajas, a GIS software platform for the development of rich internet
applications based on open-source technology. While the GIS market is
expected to grow, the share for open-source technology is expected to
grow even faster, thanks to an increasing number of policies and regula-
tions that promote the use of open-source technology in both the
government and private sectors. The open-source license of Geomajas
allows integrators that are active in the GIS project business to get start-
ed at no cost. The choice for open-source also holds a lot of potential for
OEM companies, as integrating GIS technology will generate added value
for other application domains, such as ERP, CRM, and more.
The Benefits of Geomajas as Open-source Software
Geomajas is developed under the GNU Affero General Public License (AGPL) v3. The platform is based on an integrated client/server architecture, the key point being that business logic, security and data integration are handled completely on the server side. This offers considerable advantages in terms of scalability, manageability and re-use. Other open-source GIS architectures that are capable of editing spatial data in the browser require a direct connection between the back-end data store and the browser or desktop, to allow the processing of business logic and spatial operations on the client side. Geomajas runs all of this on the server, sending only the results to the web client. Even when spatial data is being edited in the browser, the amount of client/server traffic is kept to a minimum.
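To make that division of labor concrete, the sketch below shows the command-dispatch pattern such an integrated client/server architecture implies: the thin client sends a named command, the server runs the business logic and security checks, and only the result travels back. All names here are invented for illustration and do not reflect the actual Geomajas API.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Invented names for illustration; the real Geomajas API differs.
public class CommandDispatcherSketch {

    interface Command<R, A> { A execute(R request) throws Exception; }

    static class GetFeaturesRequest {
        final String layerId; final String filter;
        GetFeaturesRequest(String layerId, String filter) { this.layerId = layerId; this.filter = filter; }
    }

    private final Map<String, Command<?, ?>> commands = new ConcurrentHashMap<>();

    public void register(String name, Command<?, ?> cmd) { commands.put(name, cmd); }

    @SuppressWarnings("unchecked")
    public <R, A> A dispatch(String name, R request) throws Exception {
        // Security checks, data access and spatial operations all stay on the
        // server; only the (already filtered) answer crosses the wire.
        Command<R, A> cmd = (Command<R, A>) commands.get(name);
        if (cmd == null) throw new IllegalArgumentException("Unknown command: " + name);
        return cmd.execute(request);
    }

    public static void main(String[] args) throws Exception {
        CommandDispatcherSketch dispatcher = new CommandDispatcherSketch();
        // The handler runs server-side; this stub returns a trimmed GeoJSON string.
        dispatcher.register("layer.GetFeatures",
                (Command<GetFeaturesRequest, String>) req ->
                        "{\"type\":\"FeatureCollection\",\"features\":[]}");
        String geoJson = dispatcher.dispatch("layer.GetFeatures",
                new GetFeaturesRequest("parcels", "area > 1000"));
        System.out.println(geoJson);
    }
}

Because a dispatcher like this keeps no client state, it can be replicated freely behind a load balancer, which is also what underpins the scalability claims made for this style of architecture.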
On the front-end Geomajas has a thin web client that deals with the pre-
sentation, event handling and limited spatial operations. The client interface runs in standard web browsers without the need for any plug-ins. On
the back-end Geomajas features a spatial application server. Comparable
integrated client/server architectures are only found with proprietary solu-
tions. Open-source, however, offers extra benefits. As security is one of
the main concerns in technology today, it is safe to say that open-source
technology offers more security guarantees than the proprietary world.
Open-source solutions are tested and tried by hackers just as much as
proprietary technology is, but at the end of the day the open-source com-
munity implements the feedback it gained to improve security levels.
The use of open standards and open-source technology combined with
a scalable and open architecture improves the interoperability between
open-source and proprietary solutions. Choosing a proprietary system
holds the risk of getting locked-in. High availability for proprietary sys-
tems generally needs tailored software solutions. The use of open stan-
dards avoids a solution becoming dependent on one single middleware
or hardware infrastructure. Open-source technology also facilitates the
easy deployment of GIS-based web applications. At the client side, no upfront
Technology architecture of Geomajas
versus other GIS technologies
investment is necessary, as the service is simply accessed using a web
browser. At server side, there is no upfront licence investment either.
Geomajas in Practice: A Web Application with a GIS
Component.
The Agriculture and Fisheries agency of the Flemish government is one of
the organizations that contributed to the development of Geomajas. The
agency hosts an application that allows Flemish farmers to file their yearly
reports on the use and division of farm lands over the internet. The web
application is made available through a virtual counter. The agency needed information about the actual use and division of the agricultural lands: how the lands were used, what crops were grown, and more. In 2009, the Agriculture and Fisheries agency received about 17 percent of the farmers' reports through its virtual counter. The application was built using
.NET and Javascript. The GIS component is based on Geomajas and Oracle
database technology.
The Geomajas component of the web application adds practical GIS information to the farmer's report. The application offers an actual map of the farmer's lands, allowing him to indicate on the map which parts of the lands were used for which types of activity. The virtual counter is connected to the farmer's history, offering the farmer information about crop rotation and subsidies. Using the virtual counter, the farmer can also indicate which subsidies he wishes to apply for, with the application immediately running a check on the criteria. When the government started thinking about the development of the virtual counter, it was clear that a web application was needed; traditional client-server software was not really an option, as every installation on the client side, however small it might be, would be an obstacle for the farmer to actually get started with the application. The Agriculture and Fisheries agency expects the use of the virtual counter to increase from 17 percent last year to 40 percent this year. This rapid growth won't cause great concern, as the Geomajas client/server architecture, based on server-side integration on a stateless server, is designed to scale almost without limit.
Geosparc: Supporting Geomajas
Geosparc is the company that commercially supports Geomajas. Geosparc's goal is to complement the open-source offering with commercial services, provided by a network of certified partners. "With the growing interest in Geomajas, we realized there was an increasing demand for Geomajas support services," says Jan Pot, co-founder of Geosparc. "With Geosparc, we combine the innovative nature of the open-source technology with a professional support organization."
Geosparc's offer includes proof-of-concept development, negotiating Service Level Agreements, project support, consulting, training, and development services. As the owner of the software's intellectual property rights, Geosparc also offers OEM licenses and internal-use software licenses.
Jan Pot, Marketing & Communication Manager.
For more information, have a look at www.geomajas.org and
www.geosparc.com
Open Source Solutions
Norwegian
Mapping Authority
The Norwegian Mapping Authority (Statens Kartverk) is the central organisation for the provision of mapping images to
most public bodies and organisations in Norway. After experiencing a vast increase in requests for their services in 2006
and 2007, the Mapping Authority also had to deal with an increasingly overstrained IT infrastructure. The Mapping
Authority chose to employ an IT infrastructure based on open source software solutions, which were free of licensing
costs and which proved to perform much better.
By Gregor Bierhals
Organisation and Background
The Norwegian National Mapping Authority is
Norway's main organisation when it comes to
the collection and distribution of geographic
information and mapping material. About 50
percent of the work at the Mapping Authority
focuses on the operational and distributional
services and mechanisms, serving the Fishing
department and other official departments in
Norway. The other 50 percent of their work
relate mostly to standards, such as ISO, in
order to assure that the Mapping Authoritys
output complies with other organisations and
agreed standards.
In January 2005, about 600 organisations and
partners came together to form the Norway
Digital initiative. Norway Digital is a nation-
wide program for co-operation on establish-
ment, maintenance and distribution of digital
geographic data. The aim is to enhance the
availability and use of quality geographic
information among a broad range of users,
primarily in the public sector. Erland Røed, department manager at the Mapping Authority, further elaborates: "[...] all the municipal authorities, directorates, ministries, the police, or the armed forces are collaborating in the Norway Digital collaboration. The principle there is that one signs an agreement stating: I will take part and offer all my data to the collaboration. And thus one gets access to all the other partners' data." By
sharing all the information collected by the
various partners, the allocation of data has
become much more efficient and the data
range much more extensive. This also explains
the need for standards compliance, as all the
partners have to be able to access and use
the information that is being provided
amongst the partnership.
Through the participation in Norway Digital,
the amount of WMS (Web Map Services)
The Norwegian Mapping Authority chose to employ an IT infrastructure based on open source software
solutions.
requests has increased dramatically. While in 2007 about 50,000 map images were requested on an average day, this increased to roughly 300,000 in 2009, and the trend is still rising.
Budget and Funding
The Norwegian Mapping Authority is funded
by the national government of Norway.
Although there is no dedicated budget for the
IT infrastructure, as the main priority is to have
an efficient and functioning system, the nation-
al government encourages publicly funded
bodies to reduce IT costs by using free and
open software, where this is possible.
In late 2007 the team started to implement an infrastructure based on open source software in parallel to the proprietary software-based one already in place. At first, this was not public and just for internal testing purposes, but after three months of testing the solution went live and replaced the proprietary solution. After a year of use, the team was more than happy to see that they had a stable solution that was not only much better performance-wise, but also much more economical.
Of course, the Mapping Authority also had to make some investments for the new infrastructure. Building up in-house expertise, in particular, was essential for this project, as there was no longer an external service provider the team could contact if there was a problem. The Mapping Authority therefore hired a new member of staff to fill this skill gap. In addition, the team involved in the project was sent to conferences and learning workshops, in order to strengthen the knowledge within the whole team. Even though this involved some financial investment as well, the amount of money spent on this was considerably lower than the cost of the software licenses that would have had to be purchased had the team decided to stay with the proprietary solution.
Technical Issues
The main task of the Mapping Authority is to
provide maps whenever a partner organisa-
tion needs one. This process is largely auto-
mated, so all requests happen online through
the database system. The Mapping Authority
not only has to provide the map, but also complementary information requested by the respective partners. This complementary information might include the location of ships, weather conditions, or national preservation areas, for example. At this point, the Mapping Authority serves around 300,000 map images per day to different users and different applications run by the partners. "For a small country like Norway, that is pretty much," indicated Røed, with a hint of
pride in his voice. On top of the WMS (Web Map Service) interfaces that are used most frequently by the partners, the Mapping Authority also enabled its partners to access maps based on tiles, as in Google Maps, which speeds up the process of accessing information significantly.
To do all this, the Mapping Authority clearly has to have an IT infrastructure that is efficient with resources and reliable, as some of the information may be of crucial importance to some of the partners. The team chose to employ Red Hat Linux and several other open source products, such as PostgreSQL, PostGIS, and MapServer. The BAAT in the following chart stands for user (B), authorisation (A), authentication (A), and counting (T). The system allows the Mapping Authority to give the right information to the right partner, and to control system resources efficiently.
The system can only be accessed by the member organisations of the Norway Digital co-operation. To make sure that no one else has access to the system, the gatekeepers, which were developed on Tomcat, enable user access control. In case of an emergency, they also allow the Mapping Authority to give certain partners priority over others, e.g. when a ship is missing and the port authorities have to make full use of the system. The proxies, which all run on Apache, can be understood as the frontier between the internet and the local network at the Mapping Authority.
On the right side of the chart in the image below, the cache produces the tiles, which is a fast way of presenting maps, as explained earlier. On the left side of the chart, the interceptors "check if you have your ticket," explains Røed. In other words, they control the user rights one has for accessing data. Once user rights have been established, the interceptors pass one on to the balancers, which make sure that the MapServer is not overburdened. The MapServer then lets one access the maps requested from the database (DB).
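As a rough illustration of what such a gatekeeper does, the sketch below implements access control as a Java servlet filter of the kind that could run on Tomcat. It is invented for this article, not the Mapping Authority's actual code; the header name and partner list are assumptions.

import java.io.IOException;
import java.util.Set;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Simplified, invented gatekeeper: only registered Norway Digital partners
// get through to the map services behind it.
public class GatekeeperFilter implements Filter {
    // Hypothetical partner registry; in reality this would be backed by the
    // BAAT user/authorisation/authentication/counting system.
    private static final Set<String> PARTNERS = Set.of("police", "port-authority", "ministry");

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        String partner = ((HttpServletRequest) req).getHeader("X-Partner-Id"); // assumed header
        if (partner == null || !PARTNERS.contains(partner)) {
            ((HttpServletResponse) res).sendError(HttpServletResponse.SC_FORBIDDEN);
            return; // no "ticket", no map
        }
        chain.doFilter(req, res); // hand over to the balancers and MapServer
    }

    @Override public void init(FilterConfig config) { }
    @Override public void destroy() { }
}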
With a few exceptions, the system runs almost entirely on open source software. Contrary to many fears, the Mapping Authority has hardly encountered any problems since the infrastructure went live. Considering the breakdowns that were occurring almost on a daily basis with the previous system, this has been a great success for the Mapping Authority. The open source environment copes with the constantly increasing number of requests seemingly without problems.
The new environment also has an effect on standards compliance. "The open source software gives the possibility to fulfil the standard 100%," says Røed. By using Web Map Service, which complies with all these standards, the Mapping Authority makes sure that all the other partners in the Norway Digital co-operation can access the mapping images without difficulties. With regard to functionality, it also brings several advantages, as it entails many useful ways to display mapping images on the Internet. "You can put more or less real-time information on top of [the maps], like the AIS [Automatic Identification System] real-time ship traffic, weather information, and other information."
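A WMS call is nothing more exotic than a parameterised HTTP request, which is exactly why any standards-compliant partner client can consume these services. The Java sketch below fetches one map image; the endpoint, layer name and bounding box are invented placeholders rather than the Mapping Authority's real service.

import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;

public class WmsGetMapSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint, layer and bounding box; any OGC WMS 1.1.1
        // server accepts a request of this shape.
        String url = "https://wms.example.no/wms"
                + "?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap"
                + "&LAYERS=topo&STYLES="
                + "&SRS=EPSG:32633"              // UTM zone 33N, widely used in Norway
                + "&BBOX=232000,6620000,282000,6670000"
                + "&WIDTH=800&HEIGHT=800&FORMAT=image/png";
        try (InputStream in = new URL(url).openStream()) {
            Files.copy(in, Paths.get("map.png")); // save the rendered map image
        }
    }
}

Overlays such as AIS ship traffic are then simply additional layers requested the same way and composited in the viewing client.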
Change Management
At the start of the project, the Mapping
Authority had little knowledge of open
source software environments. Therefore they
had to find ways to get acquainted with the
new system, while they were still using the
old infrastructure. As Røed remembers this process: "We didn't have any competence or skills connected to open source software. But we built it up quite fast, and then we changed the service and ran a double operation, with the official delivery coming from the existing software while we tested out the open source software on the side." Starting in the fall of 2007, the team ran the systems in parallel for about three to four months, until they felt that the system was fit for the job and they had gained the necessary understanding of it.
At the Mapping Authority usually three people
dedicate their time to the evaluation and dis-
tribution of geographic information and anoth-
er three people to the technical aspects of the
work. For the introduction of the open source
environment, both groups joined forces and
together with a new member of staff who had
thorough knowledge of the operating system
and general open source working methods,
they tweaked the system according to their
needs. Røed explains that acquiring knowledge on open source software was in the end rather easy and fast: "We found a lot of material on the internet. There are a large number of communities that can help you a lot and which have already implemented the respective solutions successfully."
The three most important improvements for the Mapping Authority are: performance improvement, cost savings, and the freedom to change and adapt the software according to their needs, independent from a software vendor.
With the introduction of the open source solu-
tions, the team was free to adapt the soft-
ware to their needs, and they had to find
other ways to solve problems. By not relying on a support partner, one has to take responsibility for the system oneself. Although this last aspect may seem daunting to some, for the team at the Mapping Authority rather the opposite was the case, as Røed explains: "This sparks the technician's interest. It is a challenge to him; a possibility to have the total responsibility. You can't point to a company and say: I can't do anything about it, I need support." The new system has been an interesting challenge for the people at the Mapping Authority, giving them the responsibility and the freedom to do what they want. "Now we really have the possibility to master the whole thing. And that has been a trigger for our people to do things, to make things work," Røed further adds.
Cooperation
The Mapping Authority essentially co-operates
on two different levels: with regard to the
content (i.e. the information for maps) they
stand in close collaboration with the partners
from the Norway Digital initiative. This how-
ever had no impact on the development of
the new software environment, as the coop-
eration mainly aims at establishing a two-way
exchange of geographic information. Besides
the provision of WMS and other information,
the Mapping Authority has shared some of its
in-house developments with other partners
within the Norway Digital cooperation.
Although most of these in-house develop-
ments are rather specialized on the needs of
the Mapping Authority, other organisations
may find themselves in the need of similar
solutions. By employing open source solu-
tions, the Mapping Authority had the freedom
to share any solution they developed without
breaching any license agreements. The gov-
ernment even established the platform
www.Friprog.no for the exchange of informa-
tion, experience, and code amongst organisa-
tions and public bodies in Norway.
With regard to the development of the open
source system infrastructure, the Mapping
Authority sought cooperation with the online communities behind the software solutions they employed. In order to gain expertise and a clear understanding of the open source software environments involved, the team realized that the best way of doing so was to turn to the online communities, such as OpenGeo. Those collaborations were extremely helpful, and eventually became the most important knowledge resource for the team.
Compared to the software vendor that in earlier times would provide guaranteed support, even if delayed, the team at the Mapping Authority initially feared that it might be much harder to rely on the volunteer support provided through the open source software communities on the web. However, contrary to this assumption, the Mapping Authority's experience so far has been rather the opposite. With the software vendor that they had contracted, "we had weeks of waiting, in the worst case even a month," remembers Røed. Now, with the open source solutions, nobody will give you a guarantee that you get an answer, but their experience so far has shown that "there's always someone to ask, and there has always been an answer from somebody." And, even better, this usually happens within minutes. Consequently, the Mapping Authority advises other institutions to take the risk, as their worst fears of standing alone with a problem have simply not come true.
Achievements and Lessons learned
"We have had only positive experiences with this. It might seem a bit boastful, but we haven't experienced a single setback," states Røed proudly. The project therefore has been
a great success for the Mapping Authority.
As stated before, the three main improvements that the undertaking brought along were: cost savings, improved stability, and the freedom to adapt the system to their needs. Considering that the services the Mapping Authority provides are still increasingly requested, these three points continuously gain in importance. Stability plays an equally important role, as more and more partners rely on the services. By relying on open source solutions, the Mapping Authority can ensure that system breakdowns do not hinder the work of others.
One more positive aspect of open source
solutions is the ability to share developments
and expertise. Any developments that the Mapping Authority has made itself can be shared with others, where this appears
useful. The Norwegian government is also
trying to promote the use and the sharing of
open source software through the portal
www.Friprog.no. Through this portal, the government has released a kind of "cookbook", as Røed describes it, in which organisations are guided on their way to implementing open source software.
Gregor Bierhals, UNU-MERIT.
This case study is brought to you by the Open Source Observatory and Repository (OSOR), www.osor.eu, a project of the European Commission's IDABC programme, http://ec.europa.eu/idabc.
This study is based on an interview with Erland Røed, department manager at the Norwegian Mapping and Cadastre Authority, as well as email exchange with Francky Callewaert from the European Commission. Additional information has been taken from the websites listed in the Links section, as well as further information provided by the Norwegian Mapping and Cadastre Authority.
PostgreSQL screenshot
Spatial Information Management
Directly situated on the North Sea and stretching forty kilometers in length, the Port of Rotterdam, NL (PoR) is the largest
seaport in Europe and one of the busiest ports in the world. A 24/7 global gateway and massive transshipment point, it
serves to swiftly and efficiently distribute goods to hundreds of millions of European consumers. The port's massive
industrial complex provides an intermediate destination for storage, cargo handling, processing and distribution
via various other forms of transport, including road, rail, ship, river barge and pipeline.
By the editors
The Port of Rotterdam Authority strives to
develop and advance Europe's leading sea-
port. The Authority facilitates and supports
businesses in the port area, and acts as the
manager of the port. Focusing on space and
infrastructure planning and logistics, the
Authority is responsible for creating optimum
conditions for onsite business locations and
accompanying residential environments.
In the past decade, the shipping industry
entered the digital age, and information man-
agement has progressed immensely. The digi-
tization of data and ability to transfer infor-
mation more freely has led to the unification
of formerly independent systems. Systems integration and centralization have swept across port operations, and even encouraged cooperation beyond corporate borders.
Spatial Information Management
Port of Rotterdam's Spatial Information Management handles the internal processes at the port, including guidance of ship movements, commercial processes, infrastructure management and strategic planning. More than a decade ago, PoR made the strategic decision to implement one single, organization-wide database, providing the entire operation with a comprehensive information package. This centralized approach seeks to make newly published data and information immediately available to all relevant departments.
Spatial Information Management also provides PoR with correct and appropriate geospatial information for its commercial processes. "As part of the port's centralized information solution, Spatial Information Management delivers spatial information systems for harbor traffic, leased harbor parcels, asset management and current projects in progress," said Albert Mulder, Spatial Information Manager at Port of Rotterdam.
To date, the Spatial Information department manages over two million spatial objects, totaling over a hundred gigabytes. Much of the data is collected by the Port itself, including soundings of the harbor floor, parcel boundaries for lease contracts, environmental data and radar data. Other data is derived from outside sources, including a high-detail general Netherlands basemap, cadastral data, aerial photography (at seven cm resolution for the entire harbor area) and general topographic maps.
Data Management and Delivery Challenges
The centralized information solution has been very successful. However, to maintain standards of performance and efficiency, PoR continues to investigate ways to improve the current system. Because geo-information became so easily accessible via the centralized solution, the demand grew tremendously. For Information Management, this was a trigger to begin using web services. "Web services are no longer deemed a specialized area of information," said Mulder. "The end users' ability to interact with geospatial web services has increased significantly over the years." Even though requested information continues to reside in dedicated systems across the organization, there is a significant demand for a more integrated view of this information. "Everybody needs access to these sources, which calls for a service-oriented information architecture and policy," adds Mulder.
PoR identified four must-have improvements to the existing solution:
1. Web services for connecting to HAMIS (Harbor Master Information System)
2. Multiple user interfaces for different applications
3. Ability to access externally hosted datasets in office applications (without the need for import)
4. A more modular framework for carrying out modifications, to minimize expense and system downtime
ERDAS Solution
After assessing the Port of Rotterdam's requirements for updating the existing system, Imagem, the authorized ERDAS reseller in the Netherlands and Benelux, presented the ERDAS APOLLO solution to PoR. "The overall aim of this implementation is to provide a general geographic information architecture for all spatial assets and all other geographically significant items at the PoR," said Patrick de Groot, Sales Manager, Imagem.
This massive development site at the Port of Rotterdam is called "tweede Maasvlakte". The meaning of "tweede" is "second", and this project is an extension of the original Maasvlakte, a main part of the harbor that can accommodate the world's largest ships. It is being developed in stages, with final completion expected in 2013, and will expand the harbor's territory by a whopping 20%.
Officers at the Port of Rotterdam monitor all ship movements on wall-size video screens.
PoR's Spatial Information Management recognized the power of ERDAS APOLLO Essentials-SDI to fully meet PoR's main requirements. ERDAS APOLLO Essentials-SDI is an entry-level APOLLO product for cataloging and delivering geospatial data over the web, via a user-friendly interface.
"One of the strong elements of the ERDAS APOLLO framework is its ability to cover the whole stack of preparing data, creating web services and visualizing those in a client," said Mulder, "without extra effort to integrate those stages; it's all in the package."
Geospatial Web Services
Its adherence to OGC standards makes it easy for PoR staff to bring geospatial web services into a variety of office applications. "These include applications for maintenance of infrastructure, the leasing of land parcels and nautical applications, to name a few," said Mulder.
PoR also intends to use the ERDAS APOLLO Solution Toolkit to build custom client front-ends for its various customers. This includes adding OGC service discovery and visualization to custom GIS applications. "Contrary to our present client-server architecture, ERDAS APOLLO makes it possible to use different viewers and update tools for each of the user groups," said Mulder. Plus, the modular framework of ERDAS APOLLO enables modifications to be carried out with minimal, if any, system downtime. "The front-end and back-end connections are very flexible," adds de Groot. "If they change something on the back end, it does not mean they have to also change the front end immediately, because of the service-oriented architecture."
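The decoupling de Groot describes is largely a property of the OGC contracts themselves: a front-end depends only on the published service interface, so the back end behind it can change freely. As a hedged illustration, the Java sketch below issues a WFS GetFeature request; the endpoint and feature type name are placeholders, not PoR's actual services.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class WfsClientSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint and feature type; any OGC WFS 1.1.0 server
        // understands this request shape, which is what keeps clients and
        // servers independently replaceable.
        String url = "https://geo.example-port.example/wfs"
                + "?SERVICE=WFS&VERSION=1.1.0&REQUEST=GetFeature"
                + "&TYPENAME=harbor:leased_parcels"   // assumed layer name
                + "&MAXFEATURES=10";
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(new URL(url).openStream(), StandardCharsets.UTF_8))) {
            in.lines().forEach(System.out::println); // prints the returned GML features
        }
    }
}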
Looking towards the future, Mulder adds: "There is no doubt that this implementation will yield benefits to the everyday operation at the Port of Rotterdam, both in terms of insight and speed of delivery, which will become more apparent over the next months as things develop."
For more information, have a look at
www.erdas.com
The biggest challenge was combining the information from these different sources in a clear and easy way, so that both technical and non-technical (i.e. commercial) staff could have access to this information without specialized applications.
Suzhou FOIF Co., Ltd. Since 1958
For more information, please visit www.foif.com.cn or email internationalsales@foif.com.cn
FOIF TS810 READY!!!
The TS810, combined with the most powerful surveying data collection software, FOIF FieldGenius, makes your surveying work more productive and efficient.
FOIF TS810:
- Built-in Temperature and Pressure Sensors
- Large Full Colour Graphic Display
- Guide Light System
- Multiple Interface Options
- Dual-Speed Drives
- Touch Screen
- Windows CE 5.0 Operating System
- 300 m Reflectorless Range (OTS)
FOIF FieldGenius 2010 - Perfect onboard software:
- Same user interface for TS810 & GNSS A20
- All the staking tools you will ever need
- Rich programs: azimuth/distance, area, offsetting, intersection, poly-line, curve
- Data import/export: DXF, SHP, Rw5, LandXML
- Import and stake directly from a DXF file
- Volume calculations
- Map 3D view with colored lines
- Powerful road module (3D)
ITC develops and transfers knowledge on geo-information science and earth observation
ITC is the largest institute for international higher education in the Netherlands, providing international education, research and project services. The aim of ITC's activities is the international exchange of knowledge, focusing on capacity building and institutional development in developing countries and countries in transition.
Programmes in Geo-information Science
and Earth Observation
Master of Science (MSc) degree (18 months)
Master degree (12 months)
Postgraduate diploma (9 months)
Diploma (9 months)
Certificate course (3 weeks-3 months)
Distance course (6 weeks)
Courses in the degree programmes
Applied Earth Sciences
Geoinformatics
Governance and Spatial Information Management
Land Administration
Natural Resources Management
Urban Planning and Management
Water Resources and Environmental Management
INTERNATIONAL INSTITUTE FOR GEO-INFORMATION SCIENCE AND EARTH OBSERVATION
www.itc.nl
For more information:
ITC Student Registration office
P.O. Box 6, 7500 AA Enschede
The Netherlands
E: education@itc.nl
I: www.itc.nl
Cologne, October 5th to 7th, 2010
Calendar 2010/2011
Advertisers Index
Advertiser Page
ERDAS www.erdas.com 43
ESRI Inc. www.esri.com 21
FOIF www.foif.com.cn 47
GEODIS www.geodis.cz 7
INTERGEO www.intergeo.de 49
ITC www.itc.nl 48
Leica Geosystems www.leica-geosystems.com 17
Optech Inc. www.optech.ca 13
RACURS www.racurs.ru 39
Sokkia www.sokkia.net 52
Spectra Precision www.spectraprecision.com 31
Topcon Europe BV www.topcon.eu 2
Trimble www.trimble.com/geospatial 51
VEXCEL Imaging www.microsoft.com/ultracam 35
August
01-05 August +
San Diego, CA, San Diego Convention Center, U.S.A.
Internet: http://spie.org/x30491.xml
01-05 August Devices +
San Diego, CA, San Diego Convention Center, U.S.A.
Internet: http://spie.org/x13192.xml
01-05 August +
San Diego, CA, San Diego Convention Center, U.S.A.
Internet: http://spie.org/x13188.xml
03-06 August A -
Arequipa, Peru
E-mail: conference@applied-geoinformatics.org
Internet: http://applied-geoinformatics.org
07-12 August
Ponta Delgada, Azores Islands, Portugal
E-mail: gislands2010@uac.pt
Internet: www.gislands.org
09-12 August
Kyoto, ICC Kyoto, Japan
Internet: www.isprscom8.org/index.html
16-18 August 2010 A
Charlotte, NC, U.S.A.
Internet: www.urisa.org/conferences/addressing/info
30 August-02 September
Las Vegas, NV, ARIA Resort at CityCenter, U.S.A.
Internet: www.intergraph2010.com
September
01-03 September
Cork, Ireland
E-mail: rspsoc2010@ucc.ie
Internet: www.rspsoc2010.org
01-03 September A
A NA -
Paris, France
Internet: http://pcv2010.ign.fr
02-03 September -
Paris, France
E-mail: p.chynoweth@salford.ac.uk or r.w.craig@lboro.ac.uk
Internet: www.cobra2010.com
06-09 September
Barcelona, Spain
Internet: http://2010.foss4g.org/index.php
13-17 September
Alice Springs, Australia
Tel: +61 (414) 971 349
Internet: www.15.arpc.com
14-17 September AR
A
Freiburg, Germany
E-mail: silvilaser2010@felis.uni-freiburg.de
Internet: www.silvilaser.de
15-17 September
Skopje, Republic of Macedonia
E-mail: sdiconf2010@gmail.com
Internet: http://sdi2010.agisee.org
19-21 September
Yokohama, Japan
E-mail: g-expo@gsi.go.jp
Internet: www.g-expo.jp
20-23 September
Toulouse, France
Internet: http://spie.org/x6262.xml
20-23 September
Gaeta, Italy
E-mail: conference@racurs.ru
Internet: www.racurs.ru
21-24 September GNSS
Portland, OR, Oregon Convention Center, U.S.A.
Tel: +1 (703) 383-9688
E-mail: membership@ion.org
Internet: www.ion.org/meetings
22-24 September
New Delhi Expo XXI, India
Internet: www.oesallworld.com
23-24 September
Skudai, Johor, Universiti Teknologi, Malaysia
Tel: +607 553 0806
Fax: +607 556 6163
E-mail: alias@utm.my
Internet: www.fksg.utm.my/Research_Group/3dgis/activities/
3DGIS%20Brochure.pdf
28-30 September A
Stratford-upon-Avon, Holiday Inn, U.K.
Internet: www.agigeocommunity.com
28 September-01 October A
A
Orlando, FL, U.S.A.
E-mail: wnelson@urisa.org
Internet: www.urisa.org
October
04-08 October Week
Santiago, Chile
E-mail: lars@saf.cl
Internet: www.lars.cl
05-07 October
Cologne, Cologne Exhibition Centre, Germany
Internet: www.intergeo2010.de
17-20 October &
Dearborn, MI, USA
Tel: +1 909-793-2853, ext. 4347
E-mail: egug@esri.com
Internet: www.esri.com/egugconference
18-20 October
Denver, CO, USA
Tel: +1 909-793-2853, ext. 3743
E-mail: healthgis@esri.com
Internet: www.esri.com/healthgis
19-21 October GNSS
Braunschweig, Germany
Tel: +49 (228) 20197 0
Fax: +49 (228) 20197 19
E-mail: dgon.bonn@t-online.de
Internet: www.enc-gnss2010.org
25-27 October HDS
San Ramon, CA, U.S.A.
Internet: http://hds.leica-geosys-
tems.com/en/Events_6441.htm?id=6896
26-28 October
Rome, Italy
E-mail: info@esriitalia.it
Internet: www.esriitalia.it
November
03-04 November
Berlin, Germany
Internet: www.igg.tu-berlin.de/3dgeoinfo
08-10 November
The Mirage, Las Vegas
Internet: www.trimblesurveyevents.com
15-18 November A
Orlando, FL, Doubletree Hotel,U.S.A.
Internet: www.asprs.org/orlando2010
29 November-03 December
Tunisia
Internet: www.geotunis.org
30 November-01 December AR
The Hague, The Netherlands
Tel: +44 (0)1453 836363
E-mail: info@lidarmap.net
Internet: www.lidarmap.org
February 2011
07-09 February AR
Astor Crowne Plaza, New Orleans,LA,U.S.A.
Internet: www.lidarmap.org
13-19 February
Obergurgl, Tirol, Austria
Info: Dr. Thomas Weinold
Tel.: +43 (0)512 507 6755 or 6757
Fax: +43 (0)512 507 2910
E-mail: geodaetischewoche@uibk.ac.at
Internet: http://geodaesie.uibk.ac.at/obergurg.html
April
05-07 April Ocean Business
Southampton, U.K.
Internet: www.oceanbusiness.com
06-07 April -
Southampton, U.K.
Internet: www.offshoresurvey.co.uk
Please feel free to e-mail your calendar notices to: calendar@geoinformatics.com
WideAngle
Extend Your Reach.
Big mapping project, tight deadlines, tough margins. When the weather clears, you need
to grab every pixel you can. Which is why you have the Trimble Digital Sensor System
WideAngle on board. Your new 60MP medium format sensor, combined with a precision
35mm lens, delivers cross-track coverage approaching that of a traditional film camera,
and more:
wide swaths of crisp, well-balanced ortho and stereo imagery
reduced in-air time with Applanix IN-Fusion
high accuracy mapping over extra-long baselines
using Trimble VRS technology
All this in a turn-key, directly georeferenced, medium
format imaging solution.
For more about Trimble's Digital Sensor Systems,
go to trimble.com/geospatial.
www.trimble.com/geospatial
2010 Trimble Navigation Limited. All rights reserved. PC-013 (06/10)
Trimble
Trimble
GNSS Receiver
The entirely new Sokkia GNSS system provides unsurpassed versatility and usability for RTK, network RTK and static survey, enhancing efficiency in all types of field work.
www.sokkia.net
Scalable - Affordable - Triple Wireless Technologies
ULTIMATE
VERSATILITY