
Supporting Image Document Triage and Metadata-Based Recommendations for Disaster Response

Francesca Picarazzi, Joshua Sheehy, Frank Shipman, Robin Murphy


Department of Computer Science and Engineering
Texas A&M University
College Station, TX 77843-3112, USA
Abstract: When responding to a disaster, sorting through imagery collected by unmanned aerial vehicles (UAVs) is a tedious process that can be made easier and more efficient through the use of information technology. The goal of this research is to provide a user-friendly and efficient way for people who are responding to a disaster to organize and analyze UAV imagery. To accomplish this, extensions were made to PerCon, including support for categorization and automatic generation of related imagery based on image metadata. PerCon, the Personalized and Contextual Data environment, is an analytic workspace designed to enable effective management of large, heterogeneous data sets. PerCon provides a visual workspace in which the user is able to manipulate and form relationships between data objects through spatial organization, data visualizations, and annotations. The tools added allow the user to manually classify images using user-defined categories, automatically retrieve images taken at a similar location to a particular image, and generate a map indicating the location at which a particular image was taken. Because this project extends PerCon, responders have access to all of the tools that the analytic workspace offers along with the new categorization and related-imagery functions, enhancing the user experience. The extensions added to PerCon were tested by heuristic evaluation.
Keywords: disaster response, heterogeneous data management, data integration, analytic workspaces

I. INTRODUCTION
After a disaster, imagery of the disaster area is collected by unmanned aerial vehicles to aid in finding survivors. There is an increasing need for an efficient way to organize and analyze such imagery in order to find survivors more quickly and speed up the disaster response and recovery stages, and information technology can be used for this purpose. This project implements extensions to PerCon, including support for categorization and automatic generation of related imagery based on metadata. The goal of this research is to provide a user-friendly and efficient way for people to organize and analyze imagery collected by unmanned aerial vehicles when responding to a disaster. The following additions to PerCon were the initial goals set for the project:
1.) User-defined categories to classify images
2.) Determine the part of a map at which a particular image was taken
3.) Determine if an image is part of a mosaic
4.) Determine the images which were taken near a particular image
5.) Determine if an image has satellite data
6.) Determine relevant river and flood stage data for an image
7.) Visualize a particular area over time using available imagery

Three of these goals were met and the rest appear in future work. The goals which were met are user-defined categories (1), determining the part of a map where an image was taken (2), and determining the images which were taken near a particular image (4).
PerCon, the Personalized and Contextual Data environment, is
an analytic workspace designed to enable effective
management of large heterogeneous data sets. PerCon provides
a visual workspace (VKB3) in which the user is able to
manipulate and form relationships between data objects
through spatial organization, data visualizations and
annotations.
II. EXTENSIONS TO PERCON
The tools produced by this project are user-defined categories, automatic generation of nearby images, and display of the area of a map in which a particular image was taken.
A. User-Defined Categories
The user is able to define categories which are used to classify
imagery; this is shown in Figure 1. Each category is given a
meaningful name and description by the user at creation.
Images can then be associated with individual categories by
dragging image files from the browser panel to the target
category. When an item is associated with a category, a colored
icon is added to the left of the image file in the browser panel
indicating the category it has been associated with. An item
may be added to multiple categories. An attribute is then added
to the image file indicating the category, or categories, it is
associated with, which allows PerCon to recognize that a
particular image belongs to a category.
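The tagging mechanism described above can be sketched as a small data model: each image file carries an attribute listing the categories it belongs to. This is an illustrative sketch only; the class and function names below are hypothetical and do not reflect PerCon's actual implementation.

```python
# Sketch of category tagging: an image file carries a set-valued attribute
# naming the categories it belongs to, so the workspace can recognize
# membership. All names here are illustrative, not PerCon's API.

class Category:
    def __init__(self, name, description):
        self.name = name                # meaningful name given at creation
        self.description = description  # user-input description

class ImageFile:
    def __init__(self, path):
        self.path = path
        self.categories = set()         # the category attribute on the file

def categorize(image, category):
    """Associate an image with a category; an image may be in several."""
    image.categories.add(category.name)

def decategorize(image, category):
    """Remove an image from a category without deleting the category."""
    image.categories.discard(category.name)

def delete_category(images, category):
    """Deleting a category removes its tag from all items in it."""
    for img in images:
        decategorize(img, category)
```

Because membership is stored on the image itself, multiple category associations coexist naturally, matching the behavior where an item added to several categories shows several colored icons.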

Figure 1. UAV imagery is input to PerCon and the user categorizes the imagery using the PerCon workspace.

The user is able to add, close, and delete categories. Once a category is created, a panel will appear in the Categories tab at the bottom of PerCon, and items which belong to this category may be dropped onto this panel. Also, the name of the newly created category will appear in a section of the top toolbar named Categories. The name will remain there until the category is deleted. This Categories toolbar may be removed from the toolbar by the user, and individual category names may be clicked to display the user-input description for the category. Deleting a category will remove the category tags from all items in the category, while closing a category will simply hide the category panel from the Categories tab.

While a category is hidden, the category name will still appear in the toolbar at the top of PerCon. To open a hidden category, right-click in the Categories tab and select Open category. There are also options to load category items after you open a category and to decategorize, or remove from the category, selected items. Figure 2 shows an example of the categorization function in which the user has defined three categories and has added items to each category. The individual categories appear at the bottom of PerCon along with their items, while the category list appears at the top right of the workspace. Notice the colored icons next to image files in the browser panel on the left.
B. Generating Related Imagery


Support for automatic generation of related imagery is also
added to PerCon. This includes generating images within a
user-defined distance of a particular image and generating a
map indicating the location of a particular image. These tools
require that the images be tagged with latitude and longitude
GPS coordinates before use.
When analyzing imagery, the user may want to see other
pictures that were taken near a particular image or get an idea
of where an image was taken. For this purpose, features
enabling automatic generation of nearby images and area in a
map of an individual image were added to PerCon.

Figure 2. An example of the categorization function

Figure 3. An example of the automatic generation of nearby images

Figure 4. An example of viewing an image in a map

Simply right-click on an image, select nearby images, and enter the maximum distance desired; the suggestion tab at the bottom of the screen will populate with images within the input distance of the image. This process traverses all available images in the background and notifies the user of the number of images found within the input distance once all images have been checked. Figure 3 shows an example of this function. In the figure, three photos have been added to the VKB panel. The bottom panel is populated with images after right-clicking on the leftmost of the three images, selecting view nearby images, and entering a search radius of 50 meters.
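The search over GPS metadata can be sketched as a haversine-distance filter: compute the great-circle distance from the target image to every other image and keep those within the user's radius. This is an illustrative sketch under the assumption that each image exposes its latitude and longitude; the tuple layout and function names are hypothetical, not PerCon's code.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearby_images(target, images, max_distance_m):
    """Return images whose GPS position lies within max_distance_m of target.

    Each image is a (name, lat, lon) tuple; the linear scan mirrors the
    background traversal over all available images described in the text.
    """
    _, tlat, tlon = target
    return [img for img in images
            if img is not target
            and haversine_m(tlat, tlon, img[1], img[2]) <= max_distance_m]
```

A linear scan is adequate at the scale of a single UAV flight; a spatial index would only matter for much larger collections.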
Similarly, the user may select area in a map after right-clicking on an image, and a small map will appear in the workspace indicating with a red marker where the image location exists on the map. Figure 4 shows an example of this function. The map on the right was generated by right-clicking on the smaller photo to the left of the map and selecting view area in a map.
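Placing such a marker requires projecting the image's latitude and longitude into pixel coordinates on the map. One common approach is the Web Mercator projection used by standard slippy-map tiles; the paper does not describe PerCon's map implementation, so the following is an illustrative sketch only.

```python
import math

def latlon_to_pixel(lat, lon, zoom, tile_size=256):
    """Project a GPS coordinate to global pixel coordinates (Web Mercator).

    At zoom level z the world is tile_size * 2**z pixels wide; the marker
    is drawn at the returned (x, y) offset into that pixel space.
    """
    n = tile_size * (2 ** zoom)          # world width in pixels at this zoom
    x = (lon + 180.0) / 360.0 * n
    lat_r = math.radians(lat)
    y = (1.0 - math.log(math.tan(lat_r) + 1.0 / math.cos(lat_r)) / math.pi) / 2.0 * n
    return x, y
```

Subtracting the pixel coordinates of the map view's top-left corner then gives the on-screen position for the red marker.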

III. TESTING PROCEDURES

The tools developed were tested by heuristic evaluation. The project was presented to responders and researchers at a Summer Institute on July 28, 2015, where it received feedback regarding utility. The feedback received indicated the following:
1.) The automatic generation of nearby images is a useful tool.
2.) Drag and drop for categories is not as quick or easy as other similar tools that exist, so a better interface, such as a touch interface, would benefit the project.
IV. CONCLUSION AND FUTURE WORK

The tools developed in this project will allow people to respond to disasters quickly and intuitively. Along with the categorization and automatic generation of related imagery functions, the user has access to all of the tools that the visual workspace PerCon has to offer, enhancing the user experience.
Future work includes automatic generation of relevant river and flood stage data, improved categories, determining if an image is part of a mosaic, determining if an image has satellite data, visualizing an area over time, and implementing touch functionality. Improved categories involves turning the project into groupware so that multiple users may create categories and classify images simultaneously. Providing an effective way to visualize many images over time will also enable a responder to analyze imagery more quickly. Lastly, touch functionality will allow the user to categorize items more quickly and intuitively.
ACKNOWLEDGMENT
This material is based upon work supported by the National
Science Foundation.
