
Int. J. Radiation Oncology Biol. Phys., Vol. 70, No. 3, pp. 944–952, 2008
Copyright © 2008 Elsevier Inc.
Printed in the USA. All rights reserved
0360-3016/08/$–see front matter

doi:10.1016/j.ijrobp.2007.10.048

PHYSICS CONTRIBUTION

PROJECTOR-BASED AUGMENTED REALITY FOR INTUITIVE INTRAOPERATIVE
GUIDANCE IN IMAGE-GUIDED 3D INTERSTITIAL BRACHYTHERAPY

ROBERT KREMPIEN, M.D.,* HARALD HOPPE, PH.D.,† LÜDER KAHRS, PH.D.,† SASCHA DÄUBER, PH.D.,†
OLIVER SCHORR, PH.D.,† GEORG EGGERS, M.D., D.D.S.,‡ MARC BISCHOF, M.D.,*
MARC W. MÜNTER, M.D.,* JÜRGEN DEBUS, M.D., PH.D.,* AND WOLFGANG HARMS, M.D.*

* Department of Radiation Oncology, University of Heidelberg, Heidelberg, Germany; † Institute for Process Control and Robotics, Department of Computer Science, University of Karlsruhe, Karlsruhe, Germany; and ‡ Department of Cranio-Maxillo-Facial Surgery, University of Heidelberg, Heidelberg, Germany
Purpose: The aim of this study is to implement augmented reality in real-time image-guided interstitial brachytherapy to allow intuitive real-time intraoperative orientation.
Methods and Materials: The developed system consists of a common video projector, two high-resolution charge coupled device cameras, and an off-the-shelf notebook. The projector was used as a scanning device by projecting coded-light patterns to register the patient and to superimpose the operating field with planning data and additional information in arbitrary colors. Subsequent movements of the nonfixed patient were detected by means of stereoscopically tracking passive markers attached to the patient.
Results: In a first clinical study, we evaluated the whole process chain from image acquisition to data projection and determined overall accuracy with 10 patients undergoing implantation. The described method enabled the surgeon to visualize planning data on top of any preoperatively segmented and triangulated surface (skin) with direct line of sight during the operation. Furthermore, the tracking system allowed dynamic adjustment of the data to the patient's current position and therefore eliminated the need for rigid fixation. Because of soft-part displacement, we obtained an average deviation of 1.1 mm by moving the patient, whereas changing the projector's position resulted in an average deviation of 0.9 mm. Mean deviation of all needles of an implant was 1.4 mm (range, 0.3–2.7 mm).
Conclusions: The developed low-cost augmented-reality system proved to be accurate and feasible in interstitial brachytherapy. The system meets clinical demands and enables intuitive real-time intraoperative orientation and monitoring of needle implantation. © 2008 Elsevier Inc.

Interstitial brachytherapy, Image guidance, Frameless navigation, Augmented reality.

Received Dec 5, 2006, and in revised form Oct 25, 2007. Accepted for publication Oct 25, 2007.
Reprint requests to: Robert Krempien, M.D., Department of Clinical Radiology, University of Heidelberg, INF 400, 69120 Heidelberg, Germany. Tel: (+49) 6221-56-8201; Fax: (+49) 6221-56-5353; E-mail: robert_krempien@med.uni-heidelberg.de
Conflict of interest: none.

INTRODUCTION

Systems for image-guided surgery allow for interactive visualization of preoperative or intraoperative imaging data from a patient in correlation with the patient's anatomy during an operation (1–4). Because precise placement of needles is mandatory in interstitial brachytherapy, many brachytherapy procedures would benefit from image-guided needle implantation (5–8). This approach would considerably help optimize the extent and geometry of an implant and decrease the risk of injury to critical organs (5–9). Ultrasound image guidance in prostate brachytherapy is a good example of how image guidance enables optimized needle distributions (6, 7). One of the most important steps is providing the planning data in the operating room in a reasonable way without losing the improved accuracy (10–12).

In recent years, different methods for the intraoperative visualization of preoperatively defined surgical planning data have become a field of considerable interest (1–4). To avoid constraining the surgeon to persistently reorient his view from the patient to the monitor and vice versa, great efforts have been made to visualize the surgical planning data directly within the operation field (13–16).

Augmented reality (AR) is a technology in which a computer-generated image is superimposed onto the user's view of the real world, giving the user additional information generated from the computer model (13, 17). This technology differs from virtual reality, in which the user is immersed in a virtual world generated by the computer. The AR system brings the computer into the real world of the user by augmenting the real environment with virtual objects. Thus, the user's view of the real world is enhanced by additional information in the form of labels and three-dimensional (3D) rendered models, such as tumor localization or planning data (13).

Recently, techniques were developed to use surface features to evaluate the patient's position, adapt planning data to the changed patient position, project planning data on the surface, and correct them when necessary (18–22). The aim of this study is to use AR for intuitive intraoperative guidance in interstitial needle implantation to: (1) enable registration of the patient's position, (2) enable frameless image guidance without the need for screw markers for navigation, (3) allow interactive visualization of planning data in correlation with the patient's anatomy during the intervention, (4) provide data for image-guided brachytherapy intraoperatively on the patient's surface, and (5) adapt planning data to changing patient positions.
Here, we report on our experiences with a system for projector-based AR in interstitial brachytherapy. In a first clinical study, the prototype system was tested with 10 patients. In
the clinical testing, we evaluated the entire process chain
from image acquisition to data projection and determined
the overall accuracy.
METHODS AND MATERIALS
Patients
The present study was performed in accordance with ethical standards issued by the University of Heidelberg (Heidelberg, Germany)
and the Declaration of Helsinki of 1975 as revised in 2000. Institutional and governmental approval was obtained. After informed
consent was obtained, 10 patients with biopsy-proven recurrent head-and-neck cancer were treated by using image-guided interstitial brachytherapy. Brachytherapy needle implantation was virtually
planned, and needles were implanted by using an adapted frameless
navigation system (8, 9).

Image-guided brachytherapy
A surgical navigation system adapted to interstitial brachytherapy
procedures (8) was used for virtual planning of interstitial brachytherapy needle implantation. Initially, a planning computed tomography (CT) scan was acquired. After segmentation of tumor extent
and adjacent risk structures, a target volume was defined. The surgical navigation system then was used to calculate an optimized virtual needle distribution, taking into account risk structures along each needle trajectory during surgical needle implantation (Fig. 1).

System description
The developed navigation system using AR for the intraoperative visualization of surgical planning data was composed of a common video projector, two high-resolution charge coupled device (CCD) cameras, and an off-the-shelf notebook (2-GHz dual-core central processing unit [CPU], 1 GByte random access memory [RAM]; Fig. 2a). The projector was used as a scanning device by projecting coded-light patterns to register the patient. Use of a video projector allowed online projection of planning data and additional information in arbitrary colors in the operating field. Subsequent movements of the nonfixed patient were detected by stereoscopically tracking passive markers attached to the patient (23–25).


Image acquisition
The first task consisted of calibrating both the video projector and the CCD cameras within the same coordinate system. The system thereby formed a surface scanner that was used to generate a 3D point cloud of the patient's skin surface immediately before the intervention started. This was accomplished by projecting a sequence of stripe patterns (coded light) on top of the region of interest with the aid of the integrated video projector (Fig. 2b). The cameras gathered the deformation of the lines on the surface, and the corresponding images were analyzed in consideration of shifting gray values. Because the stripe widths corresponded approximately to the lattice parameter of the CCD matrix, moiré patterns emerged. The software was able to analyze the patterns and decode the body's surface, represented by a 3D point cloud (Fig. 3a).
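
As an illustration of the stripe decoding, the following sketch recovers, for each camera pixel, the projector column that illuminated it. It assumes a binary Gray-code variant of coded light; the actual pattern codec, the calibration, and the moiré analysis of the published system are not reproduced here, and all names are illustrative.

    import numpy as np

    def decode_stripes(patterns, inverses, threshold=5):
        # patterns / inverses: (n, H, W) image stacks of each stripe pattern
        # and its inverse; comparing the pair is robust to surface albedo.
        bits = (patterns.astype(np.int16) - inverses.astype(np.int16)) > threshold
        binary = bits[0].astype(np.int32)   # most significant bit: Gray = binary
        column = binary.copy()
        for gray_bit in bits[1:]:
            binary = binary ^ gray_bit      # Gray-to-binary: running XOR
            column = (column << 1) | binary
        # 'column' now holds the projector column index seen by each pixel;
        # with a calibrated camera/projector pair, every (pixel, column)
        # correspondence is triangulated into one point of the 3D point cloud.
        return column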

Intraoperative registration by coded light
The projector was used as a scanning device to yield a surface scan of the patient's current position. After the initial scanning of the patient's position, the 3D point cloud (Fig. 3a) was registered with the planning CT data. The CT images were transformed to a distance tomogram (Fig. 3b). A matching algorithm served to minimize the sum over all distances from the CT surface to the point cloud (23–25). The result of this process was information on the patient's present position compared with the CT-based planning data, i.e., a 4 × 4 transformation matrix describing translation and rotation (initial transformation [Tini]) (23, 24). This enabled matching of the preoperatively segmented surface of the diagnostic image data (CT, magnetic resonance imaging) on which the surgical plan was defined with the intraoperative patient position (Fig. 3c).
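
A minimal sketch of this registration step, assuming the CT skin surface is given as a binary voxel mask, the point cloud is expressed in CT voxel coordinates, and a generic six-degree-of-freedom optimizer is acceptable; the function names and the optimizer choice are assumptions for illustration, not the authors' implementation.

    import numpy as np
    from scipy.ndimage import distance_transform_edt, map_coordinates
    from scipy.optimize import minimize

    def rigid_matrix(params):
        # 6-DOF rigid transform: three rotation angles (rad), three translations.
        rx, ry, rz, tx, ty, tz = params
        cx, sx = np.cos(rx), np.sin(rx)
        cy, sy = np.cos(ry), np.sin(ry)
        cz, sz = np.cos(rz), np.sin(rz)
        R = (np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]]) @
             np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]]) @
             np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]]))
        T = np.eye(4)
        T[:3, :3], T[:3, 3] = R, (tx, ty, tz)
        return T

    def mean_surface_distance(params, dist_map, cloud):
        # Transform the scanned cloud and read its distance to the CT skin
        # surface from the precomputed distance tomogram (trilinear lookup).
        T = rigid_matrix(params)
        pts = T[:3, :3] @ cloud.T + T[:3, 3:4]           # 3 x N voxel coords
        return map_coordinates(dist_map, pts, order=1, mode='nearest').mean()

    def register(skin_mask, cloud):
        # "Distance tomogram": distance of every voxel to the nearest skin voxel.
        dist_map = distance_transform_edt(~skin_mask)
        res = minimize(mean_surface_distance, np.zeros(6),
                       args=(dist_map, cloud), method='Powell')
        return rigid_matrix(res.x)   # T_ini: scanner space -> CT planning space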

Intraoperative tracking
Because intraoperative conditions impede continuous scanning, and to avoid attaching rigid fixations to the patient, subsequent movements of the nonfixed patient were detected by stereoscopically tracking passive markers attached to the patient (Fig. 4c). These were tracked and registered by analyzing corresponding images from the integrated cameras by using Hough transformations. The first marker registration was performed simultaneously with the described initial scan of the patient. This step yielded a second transformation, Tcur(t), mapping the patient's initial position to the current position. The global transformation Tglobal(t) = Tcur(t) · Tini allowed us to continuously transfer the surgical planning data to the patient's coordinate system, taking into account its actual position. The corresponding 2D projector bitmap, which was projected onto the patient's 3D surface, was edited to avoid or minimize distortion in areas of steep and bent surfaces. However, maintaining lengths and angles required a triangulated surface (e.g., Delaunay triangulation) to predict and correct deformations by using normal vectors and other features (23, 24).
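
In terms of homogeneous 4 × 4 matrices, the chaining of the two transformations can be sketched as follows; the toy example assumes the markers report a pure 5-mm shift of the nonfixed patient, and the matrix conventions are illustrative (in the real system, Tcur(t) is refreshed from the tracked markers at every update).

    import numpy as np

    def transfer_plan(plan_points, T_ini, T_cur):
        # T_ini: planning (CT) space -> patient position at the initial scan.
        # T_cur: initial patient position -> current tracked position.
        T_global = T_cur @ T_ini                  # T_global(t) = T_cur(t) . T_ini
        homog = np.hstack([plan_points, np.ones((len(plan_points), 1))])
        return (homog @ T_global.T)[:, :3]        # points in the current pose

    T_ini = np.eye(4)                             # identity for brevity
    T_cur = np.eye(4); T_cur[0, 3] = 5.0          # patient moved 5 mm along x
    entry_points = np.array([[10.0, 20.0, 30.0]])
    print(transfer_plan(entry_points, T_ini, T_cur))   # -> [[15. 20. 30.]]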

Projection of planning data and intuitive intraoperative guidance
After registration, the system calculated the projection of the virtual planning data (Fig. 4a) on the surface of the patient (Fig. 4b). This information was augmented (projected) onto the real situation during needle implantation. Tumor extent and the entrance point of each needle were projected on the skin (Fig. 4d). Needles then were positioned on the skin. Furthermore, the trajectories of the needles were visualized, and the brachytherapist was guided by arrows to find the right implantation angle. The corresponding trajectory was then visualized, color coded, on the skin, and any deviation of the needle and the depth of the needle tip were monitored by intraoperative tracking of the needle and visualized on the skin (Fig. 5).

Fig. 1. Real-time navigated needle implantation. Initially, a planning computed tomography (CT) scan of the patient is acquired. For virtual planning of needle implantation, the CT data are sent to the planning system. After segmentation of tumor and risk structures, the virtual distribution of the brachytherapy needles is planned, taking relevant surrounding risk structures into account. An entrance point and needle pathway are determined for each needle. Intraoperatively, the projector is used as a scanning device to yield a surface scan of the patient's current position to register the patient. After the initial scanning of the patient's position, the three-dimensional (3D) point cloud is registered with the planning CT data. This enables matching of the preoperatively segmented surface of the diagnostic image data (CT, magnetic resonance imaging) on which the surgical plan was defined with the patient's intraoperative position. Subsequent movements of the nonfixed patient are detected by stereoscopically tracking passive markers attached to the patient. After registration, the system calculates the projection of the virtual planning data on the surface of the patient. During needle implantation, this information is augmented (projected) onto the real situation. Tumor extent and the entrance point of each needle are projected onto the skin, and needle implantation is monitored.
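
The concentric-circle feedback can be pictured with a small sketch that turns the planned trajectory and the tracked needle pose into an angular deviation and a depth-to-go reading; the circle-radius mapping, the 0.4-mm pixel pitch, and all thresholds are illustrative assumptions, not the published implementation.

    import numpy as np

    def guidance_feedback(entry, target, tip, tail, mm_per_px=0.4):
        planned = (target - entry) / np.linalg.norm(target - entry)
        actual = (tip - tail) / np.linalg.norm(tip - tail)
        angle_deg = np.degrees(np.arccos(np.clip(planned @ actual, -1.0, 1.0)))
        depth_left = np.linalg.norm(target - tip)     # mm to the planned depth
        r_fixed = 20.0                                # reference circle (pixels)
        r_moving = r_fixed + depth_left / mm_per_px   # shrinks toward congruence
        color = 'green' if depth_left < 1.0 else 'yellow'
        return angle_deg, r_fixed, r_moving, color

    # Needle 15 mm short of the planned tip depth, almost perfectly aligned:
    print(guidance_feedback(np.zeros(3), np.array([0, 0, 40.0]),
                            np.array([0.5, 0, 25.0]), np.array([0.5, 0, -35.0])))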

Accuracy measurements
Guidance systems can use rigid-body transformations to accomplish registration of an image volume and the physical space of the operating setup itself. Accuracy is important to these systems, as is knowledge of the level of accuracy. Three useful measures of error are suggested for analyzing the accuracy of point-based registration methods: (1) fiducial localization error (FLE), which is the error in locating the fiducial points; (2) fiducial registration error (FRE), which is the root-mean-square distance between corresponding fiducial points after registration; and (3) target registration error (TRE), which is the distance between corresponding points other than the fiducial points after registration (10, 11).
The TRE describes the real distance between a point in the target region after registration and its real localization. This error incorporates all errors in localization and registration (FLE and FRE) and is the measure of the true surgical application accuracy (10, 11). Because the TRE represents the relevant error for surgical navigation, we investigated the TRE of the system.
Before CT scanning, five markers detectable in CT imaging and surface scanning were pasted to the area of interest. These were used to calculate the accuracy of registration and projection. Positions of the projected points were optically tracked and compared with the positions of the pasted skin markers. After the implantation procedure, a second CT scan was obtained and registered with the planning CT to assess the exactly achieved needle position and its deviation from the virtually planned position. Accuracy was measured as the TRE between achieved and projected marker and needle positions.
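
For illustration, the sketch below pairs a standard least-squares point registration (a Kabsch/SVD solution, which is a common choice and an assumption here) with the FRE and TRE measures defined above; the simulated fiducial noise plays the role of the FLE.

    import numpy as np

    def kabsch(P, Q):
        # Least-squares rigid transform mapping point set P onto Q (N x 3 each).
        Pc, Qc = P - P.mean(0), Q - Q.mean(0)
        U, _, Vt = np.linalg.svd(Pc.T @ Qc)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        return R, Q.mean(0) - R @ P.mean(0)

    def fre(P, Q, R, t):
        # Fiducial registration error: RMS distance of the registered fiducials.
        return np.sqrt(np.mean(np.sum((P @ R.T + t - Q) ** 2, axis=1)))

    def tre(p, q, R, t):
        # Target registration error at a point not used as a fiducial.
        return np.linalg.norm(p @ R.T + t - q)

    rng = np.random.default_rng(0)
    fiducials = rng.uniform(0, 100, (5, 3))      # five pasted skin markers (mm)
    fle = rng.normal(0, 0.3, (5, 3))             # simulated localization error
    R, t = kabsch(fiducials + fle, fiducials)    # register noisy -> true space
    target = np.array([50.0, 50.0, 50.0])        # e.g., a needle entry point
    print(fre(fiducials + fle, fiducials, R, t), tre(target, target, R, t))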


Fig. 2. (a, b) System description. (a) The developed navigation system using augmented reality for intraoperative visualization of surgical planning data consists of an off-the-shelf video projector, two charge coupled device (CCD) cameras, and a state-of-the-art personal computer (PC; 2-GHz dual-core central processing unit, 1 GByte random access memory). Use of a video projector allows visualization of planning data and additional information (numerics, distances, and so on) in arbitrary colors. Furthermore, it is used to register the patient's position. (b) Surface scanning is performed with structured light. During scanning, the projector projects a sequence of stripe patterns (structured light) on top of the region of interest. The corresponding images are acquired by the cameras, analyzed in consideration of shifting gray values, and yield a three-dimensional (3D) point cloud of the scanned area. The cameras gather the deformation of the lines on the surface. Because the stripe widths correspond approximately to the lattice parameter of the CCD matrix, moiré patterns emerge. The software is able to analyze the patterns and decode the body's surface, represented by a 3D point cloud. The error of a single point is less than 1 mm. The density of the point cloud is approximately 4/mm². VGA = video graphics array.

RESULTS

The described method enables the surgeon to visualize planning data on top of any preoperatively segmented and triangulated surface (skin) with direct line of sight during the operation (Fig. 4a–4d). The system allowed us to deal with occlusion by simply moving the video projector to an appropriate position, because changes in the position of the video projector are equivalent to those of the patient. Furthermore, the tracking system allowed dynamic adjustment of the data to the patient's current position and therefore eliminated the need for rigid fixation with stereotactic frames or similar devices (Fig. 4c). The system was particularly attractive because both the video projector and the cameras were multiply used (registration/visualization and registration/tracking, respectively; Fig. 2a and 2b). The presented system enabled intuitive intraoperative guidance of needle implantations (Fig. 5).
The surface scanner was used to generate a 3D point cloud of the patient's skin surface immediately before the intervention started (Fig. 3a). The error of a single point was less than 1 mm. The density of the point cloud was approximately 4/mm². The 3D point cloud of the patient's surface was used to register the planning data (Fig. 3c) with the patient's current position and to calculate the projection of the planning data onto the patient's surface (Fig. 4b).
The resolution of the currently used video projector was 800 × 600 pixels, whereby the field of projection covered an area of approximately 24 × 32 cm. Therefore, the achieved resolution was one third to one half of a millimeter and presently is superior to any head-mounted display. The achieved accuracy of the projected surgical planning data was about 1 mm immediately after the initial scan of the patient (without tracking) and deteriorated to 3 mm while tracking the patient's position.
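
The quoted pixel pitch follows directly from the projector resolution and the size of the field of projection; a short check with the values from the text (the axis pairing is an assumption):

    # Projected pixel size = field of projection / projector resolution.
    print(240.0 / 600, 320.0 / 800)   # 24 cm on 600 rows, 32 cm on 800 cols -> 0.4, 0.4 mm
    print(240.0 / 800, 320.0 / 600)   # opposite pairing -> 0.3, 0.53 mm
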
In a first clinical study, the prototype system was tested
with 10 patients. In clinical testing, we evaluated the entire
process chain (Fig. 1) from image acquisition to data projection and determined the overall accuracy. In addition to measuring absolute accuracy, arbitrary landmarks on top of the patient's skin surface were defined within the planning system and used to measure relative projection accuracy. We could show that the registration process was performed accurately by matching the scanned skin surface of the patient to its CT-generated counterpart (Fig. 3a–3c). Because of soft-part displacement, we obtained an average deviation of 1.1 mm by moving the patient, whereas changing the projector's position resulted in an average deviation of 0.9 mm. The average TRE of the projected target points to the pasted CT-visible markers was 0.45 mm.
Mean deviation of all needles of an implant was 1.4 mm (range, 0.3–2.7 mm). Mean deviation per implant depended on the number of implanted needles and increased with the number of needles (0.3 mm with three implanted needles, 2.7 mm with 12 implanted needles). Optimization to account for these deviations was done by means of geometric optimization to achieve a uniform dose throughout the target. Dose was prescribed to the 90% isodose enclosing the target volume. The preplanned volume was supposed to be encompassed by the 90% isodose. Mean preplanned volume was 81 cm³ (range, 20–165 cm³), whereas the achieved mean 90% volume was 76 cm³ (range, 18–150 cm³). Comparing the target coverage of the planned and achieved 90% volumes, a mean of 92% of the planned volume was covered by the achieved 90% volume (range, 79–97%). The proportion of the volume receiving 150% of the prescribed dose was 12.8% (range, 7.1–21%). The mean homogeneity index was 0.81 (range, 0.71–0.92; SD, 0.05). In all cases, image guidance enabled avoidance of risk structures (vessels and bone) during surgical needle placement. The proportion of the mandibular bone receiving more than 100% of the dose was a mean of 4.6% (range, 0–12%) (Table 1).
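
The coverage and homogeneity figures can be reproduced from dose samples over the target volume; a minimal sketch, assuming the common homogeneity-index definition HI = (V100 − V150)/V100 (the paper does not spell out its formula) and toy dose values:

    import numpy as np

    def implant_metrics(dose, prescription):
        v100 = np.mean(dose >= prescription)         # fraction receiving >= 100%
        v150 = np.mean(dose >= 1.5 * prescription)   # fraction receiving >= 150%
        hi = (v100 - v150) / v100 if v100 > 0 else 0.0
        return v100, v150, hi

    rng = np.random.default_rng(1)
    dose = rng.gamma(shape=9.0, scale=2.0, size=100_000)   # toy dose samples (Gy)
    print(implant_metrics(dose, prescription=15.0))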


Fig. 3. (a–c) Registration. (a) Three-dimensional (3D) point cloud of the patient's surface. The cameras gather the deformation of the lines on the surface. Because the stripe widths correspond approximately to the lattice parameter of the charge coupled device (CCD) matrix, moiré patterns emerge. The software is able to analyze the patterns and decode the body's surface, represented by a 3D point cloud. The error of a single point is less than 1 mm. The density of the point cloud is approximately 4/mm². (b) Virtual shell of the patient obtained from the planning computed tomography (CT). The surface of the patient is reconstructed from the planning CT. (c) Registration; after the initial scanning of the patient's present position, the data must be registered with the planning CT data. The CT images are transformed to a distance tomogram. A matching algorithm serves to minimize the sum over all distances from the CT surface to the point cloud. The result of this process is information on the patient's present position compared with the CT-based planning data, i.e., a 4 × 4 transformation matrix describing translation and rotation. After removing inevitable outliers, the resulting point cloud can be matched to the preoperatively segmented surface of the diagnostic image data on which the surgical plan was defined.


Fig. 4. (a–d) Projection of planning data. (a) Virtual planning of brachytherapy needle implantation. (b) After registration, the system calculates the projection of the planning data on the surface of the patient during needle implantation. (c) Subsequent movements of the nonfixed patient are detected by stereoscopically tracking passive markers attached to the patient's upper jaw. (d) In a first clinical study, the entrance point of each needle is projected on the skin, and the needles then are positioned on the skin. Because intraoperative conditions impede continuous scanning, and to avoid attaching rigid fixations to the patient, succeeding registrations are performed by tracking markers pasted to the patient and to the brachytherapy needles.

The first clinical testing showed that the developed low-cost AR system is very accurate and meets clinical demands. Furthermore, the surgeon is not affected by head-mounted devices and is able to perform the intervention without interference with his usual practice.
DISCUSSION
Complex surgical interventions are increasingly realized with the aid of computer-based operation planning systems. The most important step from preoperative planning to intraoperative realization consists of providing the planning data in a reasonable and easy-to-handle way without forfeiting the preoperatively achieved accuracy (1–3, 10). This can be realized by using AR techniques (13, 17). We developed a spatial AR system that directly projects the planning data onto the patient.
Conventional interstitial brachytherapy is performed according to preoperative images and the clinical presentation by placing the needles before treatment planning; the therapy plan is calculated based on the achieved location of the needles (5, 7). The quality of an implant depends on coverage of the planning target volume and on an implant geometry enabling a homogeneous dose distribution. Integration of navigation systems into interstitial brachytherapy procedures enables virtual planning of brachytherapy needle implantation to optimize the extent and geometry of the application, to cover the target, and to avoid the risk of injury to critical organs. Few reports have been published about the utility of navigation systems in interstitial brachytherapy.

Fig. 5. Intuitive intraoperative guidance. (Left to right) 1. After registration, the target point is projected on the patient's skin. To find the planned trajectory of the needles, an arrow is visualized on the patient's skin to guide the brachytherapist to the right implantation angle. 2. When the correct target point and needle trajectory are found, the needle angle and depth are visualized, color coded, on the skin by concentric circles. 3. Any deviation of the needle and the depth of the needle tip are monitored during the implantation process by intraoperative tracking of the needle. Deviations from the trajectory are visualized by concentric circles. 4. The planned depth of the needle is reached when both concentric circles are congruent with each other, which is visualized by changing colors. 5. Any further advancement of the needle is also indicated by a change of color.

They use either infrared or electromagnetic digitizers and depend on patient immobilization. Reported mean implant deviations were 5–15 mm (8, 9, 26, 27). However, the actual placement and final position of the needles are the critical ones. A perfect virtual plan on a predefined target does not necessarily imply execution of a perfect plan (3, 10, 11). Therefore, one of the most important steps between planning and intraoperative performance is providing the planning data in the operating room in a reasonable way without losing the improved accuracy. The setup of patients in CT-based image-guided brachytherapy is encoded in the 3D information of the planning CT and is available digitally. In effect, the CT plan surface is a virtual shell and should be used as such. This requires 3D body surface information to be captured live in the operating room and then manipulated for comparison with the CT virtual shell (21, 22).
The achieved accuracy of the projected surgical planning data was about 1 mm immediately after the initial scan of the patient (without tracking) and deteriorated to 3 mm while tracking the patient's position. The inaccuracy traces back to the minimum number of three tracked markers and the currently used camera resolution (744 × 576 pixels). Current efforts concentrate on improving the accuracy and on increasing the update frequency from 1.65 Hz. From the point of view of the surgeons involved in the project, a vision frequency of 10 Hz and an accuracy of approximately 1 mm should meet their clinical demands.

Table 1. Treatment data: localization, dose, number of needles, irradiated volume, and accuracy of navigated needle placement

                                                                        Deviation of needles (mm)
                                                                          from planned position
No.  Localization      Dose (PDR) (0.5 Gy/pulse/h)  No. of needles  Volume (cm³)   Mean   Maximum
 1   Tongue                       10                       8              71         2.3     2.9
 2   Base of tongue               13                       9              48         2.7     4.3
 3   Floor of mouth               18                       9              48         0.9     2.6
 4   Tongue                       10                      12             150         1.1     2.3
 5   Tongue                       12                       5              33         1.0     2.9
 6   Base of tongue               15                      12              79         2.2     5.4
 7   Floor of mouth               18                       3              18         0.3     2.3
 8   Base of tongue               20                       7              43         0.8     3.7
 9   Floor of mouth               18                      11              76         1.3     3.1
10   Floor of mouth               15                       6              40         1.8     0.9

Abbreviation: PDR = pulsed dose rate brachytherapy.
Mean indicates the mean deviation of all needles of each implant; maximum indicates the maximum deviation of a single needle in each implant.


These first clinical data show that projector-based AR for intuitive intraoperative guidance in image-guided needle implantation is feasible for interstitial brachytherapy. In all cases, good coverage of the target volume was possible by using image-guided brachytherapy. In clinical practice with this approach, patient movements and/or tissue deformations caused by the implantation procedures can lead to soft-tissue displacement. Both problems have to be taken into account in image-guided needle implantation. The presented system did not allow real-time monitoring of soft-tissue movements. Comparing the distance between the planned (1.2-cm equidistant) and achieved positions of all needles of an implant, the mean deviation was 1.4 mm. A mean of 92% of the planned volume was covered by the achieved 90% volume (range, 79–97%), indicating that the reported mean deviation of needle implantation was acceptable for the individual cases, with no relevant compromise of target coverage. However, it has to be noted that coverage was less ideal in larger implant volumes than in smaller implant volumes.
In cases in which soft-tissue displacement occurs, real-time feedback using an imaging modality may be needed to adapt the virtual planning according to the changed anatomy (3, 6). Intraoperative CT imaging enables real-time monitoring of needle implantation and therefore the possibility to account for variations in image-guided needle implantation. In addition, preplanning of brachytherapy needle implantations has a number of potential disadvantages: alterations in organ volume and shape between the time of the preplan and the implantation procedure, and the necessity of registering the preimplantation imaging study with the actual patient position and setup, may introduce inaccuracies into the implantation process. Intraoperative planning eliminates these disadvantages (3).
Because the presented data proved the feasibility and, in principle, the accuracy of image-guided interstitial brachytherapy, our current approach is to use intraoperative CT (Siemens Emotion Dual Slice CT with Sliding Gantry; Siemens, Erlangen, Germany) for intraoperative treatment planning. To account for changes in tissue geometry or needle deviations, a CT image can be acquired and the virtual planning can be adapted for eventual needle misplacements or tissue movements, if necessary. A good example of the benefits of image-guided needle implantation is real-time ultrasound guidance in prostate brachytherapy (6, 7).
The power of modern computers has rekindled interest in clinical stereophotogrammetry, the spatial analysis of objects from stereo image pairs (18–22). Identifying large numbers of corresponding points is an inherent problem for smooth objects, such as the patient's skin, where there are few distinctive features in the images being interrogated. Structured-light projection is designed to measure the continuous topology of a surface and follow any topologic changes, i.e., by generating large numbers of position coordinates, often in the form of height maps (23, 24). In comparison to marker techniques, which are ideally suited to monitor a small number of critical points on a host surface, structured-light projection is a global measurement technique, particularly useful when the behavior of an entire surface is of interest. Moving from identifiable points and lines to the projection of a continuous area pattern of structured light for surface measurement is attractive because it offers the opportunity to measure tens of thousands of closely spaced surface points simultaneously in one subsecond video image frame. This technique reduces the error of a single point to less than 1 mm with a point-cloud density of approximately 4/mm². The optical sensor height map of the patient and the CT virtual shell are simultaneously rendered as 3D surfaces to enable interactive control and perspective viewing of the two surfaces as their relative orientation is altered.
After registration, the virtual planning of the needle implantation can be adapted to the actual patient position in the operating room. To avoid constraining the surgeon to persistently reorient his view from the patient to the monitor and vice versa, great efforts are being made to visualize the surgical planning data directly in the operation area. Augmented reality is a technology in which a computer-generated image is superimposed onto the user's view of the real world, giving the user additional information generated from the computer model (28–31). Above all, head-mounted or see-through displays enjoy great popularity (32). These are mounted on the surgeon's head and used to superimpose data directly in the visual path without obscuring it. However, with regard to precision, resolution, vision frequency, and sterility, head-mounted displays still show great disadvantages and often lead to queasiness of the wearer (31).
The described system distinguishes itself from alternative technologies (head-mounted displays, see-through glasses) because it renounces both the attachment of unpleasant screw markers for navigation and rigid fixation of the patient during the surgical intervention. Furthermore, the costs of the presented system are much lower than those of techniques based on expensive navigation systems. Another advantage of using a video projector to visualize surgical planning data is that all surgically involved persons are able to share the same AR view of the planning data and additional information without supplying them all with expensive head-mounted displays. In addition, the surgeon is not obliged to wear encumbering devices on his head and performs the surgical intervention without interference with his general practice.
The use of projector-based AR proved to be feasible in interstitial brachytherapy and enabled: (1) registration of the planning data with the actual patient position, (2) intraoperative visualization of the planning data on the patient with high accuracy (<1 mm), (3) intraoperative tracking of the patient and needles, (4) adaptation of the planning data to changes in the patient's position, and (5) intuitive intraoperative guidance. It is a noninvasive, easy, quick, inexpensive, and reliable solution for the future real-time interactive adaptation of virtually planned interstitial implantation in combination with intraoperative imaging to account for intraoperative changes in tissue geometry or needle deviations after each needle implantation.

CONCLUSION

This report describes the complete work flow from data acquisition to intraoperative image guidance for computer-based interstitial brachytherapy. Augmented reality is a noninvasive technique that proved to be feasible and accurate in image-guided interstitial brachytherapy. It enabled intraoperative visualization and monitoring of planning data on the patient's surface. Thus, intuitive real-time intraoperative orientation, guidance, and monitoring of the implantation process are implemented in interstitial brachytherapy.

REFERENCES

1. Peters TM. Image-guidance for surgical procedures. Phys Med Biol 2006;51:R505–R540.
2. Satava RM. Emerging technologies for surgery in the 21st century. Arch Surg 1999;134:1197–1202.
3. Jolesz FA. Image-guided procedures and the operating room of the future. Radiology 1997;204:601–612.
4. Bucholz RD, Smith KR, Laycock KA, et al. Three-dimensional localization: From image-guided surgery to information-guided therapy. Methods 2001;25:186–200.
5. Kolotas C, Baltas D, Zamboglou N. CT-based interstitial HDR brachytherapy. Strahlenther Onkol 1999;175:419–427.
6. Zelefsky MJ, Yamada Y, Marion C, et al. Improved conformality and decreased toxicity with intraoperative computer-optimized transperineal ultrasound-guided prostate brachytherapy. Int J Radiat Oncol Biol Phys 2003;55:956–963.
7. Nag S, Ciezki JP, Cormack R, et al. Intraoperative planning and evaluation of permanent prostate brachytherapy: Report of the American Brachytherapy Society. Int J Radiat Oncol Biol Phys 2001;51:1422–1430.
8. Krempien R, Hassfeld S, Kozak J, et al. Frameless image guidance improves accuracy in three-dimensional interstitial brachytherapy needle placement. Int J Radiat Oncol Biol Phys 2004;60:1645–1651.
9. Krempien R, Grehn C, Haag C, et al. Feasibility report for retreatment of locally recurrent head and neck cancer by combined brachy-chemotherapy using frameless image guided 3D interstitial brachytherapy. Brachytherapy 2005;4:159–167.
10. Fitzpatrick JM, West JB. The distribution of target registration error in rigid-body point-based registration. IEEE Trans Med Imaging 2001;20:917–927.
11. West JB, Fitzpatrick JM, Toms SA, et al. Fiducial point placement and the accuracy of point-based, rigid body registration. Neurosurgery 2001;4:810–816.
12. Yuan ML, Ong SK, Nee AY. Registration based on projective reconstruction technique for augmented reality systems. IEEE Trans Vis Comput Graph 2005;11:254–264.
13. Shuhaiber JH. Augmented reality in surgery. Arch Surg 2004;139:170–174.
14. Wacker FK, Vogt S, Khamene A, et al. An augmented reality system for MR image-guided needle biopsy: Initial results in a swine model. Radiology 2006;238:497–504.
15. Khan MF, Dogan S, Maataoui A, et al. Accuracy of biopsy needle navigation using the Medarpa system: Computed tomography reality superimposed on the site of intervention. Eur Radiol 2005;15:2366–2374.
16. De Buck S, Maes F, Ector J, et al. An augmented reality system for patient-specific guidance of cardiac catheter ablation procedures. IEEE Trans Med Imaging 2005;24:1512–1524.
17. Regenbrecht H, Baratoff G, Wilke W. Augmented reality projects in the automotive and aerospace industries. IEEE Comput Graph Appl 2005;25:48–56.
18. Bert C, Metheany KG, Doppke KP, et al. Clinical experience with a 3D surface patient setup system for alignment of partial-breast irradiation patients. Int J Radiat Oncol Biol Phys 2006;64:1265–1274.
19. Bert C, Metheany KG, Doppke K, et al. A phantom evaluation of a stereo-vision surface imaging system for radiotherapy patient setup. Med Phys 2005;32:2753–2762.
20. MacKay RI, Graham PA, Logue JP, et al. Patient positioning using detailed three-dimensional surface data for patients undergoing conformal radiation therapy for carcinoma of the prostate: A feasibility study. Int J Radiat Oncol Biol Phys 2001;49:225–230.
21. Moore CJ, Graham PA. 3D dynamic body surface sensing and CT-body matching: A tool for patient set-up and monitoring in radiotherapy. Comput Aided Surg 2000;5:234–245.
22. Moore C, Lilley F, Sauret V, et al. Opto-electronic sensing of body surface topology changes during radiotherapy for rectal cancer. Int J Radiat Oncol Biol Phys 2003;56:248–258.
23. Dauber S, Hoppe H, Krempien R, et al. Intraoperative guidance of pre-planned bone deformations with a surface scanning system. Stud Health Technol Inform 2002;85:110–115.
24. Hoppe H, Dauber S, Kubler C, et al. A new, accurate and easy to implement camera and video projector model. Stud Health Technol Inform 2002;85:204–206.
25. Kahrs LA, Hoppe H, Eggers G, et al. Visualization of surgical 3D information with projector-based augmented reality. Stud Health Technol Inform 2005;111:243–246.
26. Strassmann G, Heyd R, Cabillic-Engenhard R, et al. Accuracy of 3D needle navigation in interstitial brachytherapy in various body regions. Strahlenther Onkol 2002;178:644–647.
27. Bale RJ, Freysinger W, Gunkel AR, et al. Head and neck tumors: Fractionated frameless stereotactic interstitial brachytherapy: Initial experience. Radiology 2000;214:591–595.
28. Rosenthal M, State A, Lee J, et al. Augmented reality guidance for needle biopsies: An initial randomized, controlled trial in phantoms. Med Image Anal 2002;6:313–320.
29. Paul P, Fleig O, Jannin P. Augmented virtuality based on stereoscopic reconstruction in multimodal image-guided neurosurgery: Methods and performance evaluation. IEEE Trans Med Imaging 2005;24:1500–1511.
30. Bockholt U, Bisler A, Becker M, et al. Intra-operative support via augmented reality techniques in endoscopic surgery. Comput Aided Surg 2003;8:310–315.
31. Marmulla R, Hoppe H, Muhling J, et al. An augmented reality system for image-guided surgery. Int J Oral Maxillofac Surg 2005;34:594–596.
32. Figl M, Ede C, Hummel J, et al. A fully automated calibration method for an optical see-through head-mounted operating microscope with variable zoom and focus. IEEE Trans Med Imaging 2005;24:1492–1499.
