944–952, 2008
Copyright © 2008 Elsevier Inc.
Printed in the USA. All rights reserved
0360-3016/08/$-see front matter
doi:10.1016/j.ijrobp.2007.10.048
PHYSICS CONTRIBUTION
INTRODUCTION
Systems for image-guided surgery allow for interactive visualization of preoperative or intraoperative imaging data from a patient in correlation with the patient's anatomy during an operation (1–4). Because precise placement of needles is mandatory in interstitial brachytherapy, many brachytherapy procedures would benefit from image-guided needle implantation (5–8). This approach would considerably help optimize the extent and geometry of an implant and decrease the risk of injury to critical organs (5, 9). Ultrasound image guidance in prostate brachytherapy is a good example of how image guidance enables optimized needle distributions (6, 7). One of the most important steps is providing the planning data in the operating room in a reasonable way without losing the improved accuracy (10–12).

In augmented reality (AR), the user's view of the real world is enhanced by additional information in the form of labels, three-dimensional (3D) rendered models such as tumor localization, or planning data (13).
Recently, techniques were developed to use surface features to evaluate the patient's position, adapt planning data to the changed patient position, project planning data onto the surface, and correct it when necessary (18–22). The aim of this study was to use AR for intuitive intraoperative guidance in interstitial needle implantation in order to: (1) enable registration of the patient's position, (2) enable frameless image guidance without the need for screw markers for navigation, (3) allow for interactive visualization of planning data in correlation with the patient's anatomy during the intervention, (4) provide data for image-guided brachytherapy intraoperatively on the patient's surface, and (5) adapt planning data to changing patient positions.
Here, we report on our experiences with a system for projector-based AR in interstitial brachytherapy. In a first clinical study, the prototype system was tested with 10 patients. In
the clinical testing, we evaluated the entire process chain
from image acquisition to data projection and determined
the overall accuracy.
METHODS AND MATERIALS
Patients
The present study was performed in accordance with ethical standards issued by the University of Heidelberg (Heidelberg, Germany)
and the Declaration of Helsinki of 1975 as revised in 2000. Institutional and governmental approval was obtained. After informed
consent was obtained, 10 patients with biopsy-proven recurrent
head-and-neck cancer were treated by using image-guided interstitial brachytherapy. Brachytherapy needle implantation was virtually
planned, and needles were implanted by using an adapted frameless
navigation system (8, 9).
Image-guided brachytherapy
A surgical navigation system adapted to interstitial brachytherapy procedures (8) was used for virtual planning of interstitial brachytherapy needle implantation. Initially, a planning computed tomography (CT) scan was acquired. After segmentation of the tumor extent and adjacent risk structures, a target volume was defined. The surgical navigation system was then used to calculate an optimized virtual needle distribution, taking into account the surgical approach and the risk structures along each needle trajectory (Fig. 1).
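To illustrate the geometric core of such an optimization, the following minimal sketch (illustrative only, not the system's actual planning code; all names and the 5-mm margin are hypothetical) checks the clearance between a straight needle trajectory and a risk structure represented by sampled surface points:

```python
import numpy as np

def min_clearance(entry, tip, risk_points):
    """Minimum distance (mm) from a straight needle path to a risk
    structure represented by sampled surface points.

    entry, tip: (3,) arrays, needle entrance point and planned tip.
    risk_points: (N, 3) array of points on the risk-structure surface.
    """
    d = tip - entry                                    # needle direction
    # Parameter of the closest point on the segment for every risk point,
    # clamped to [0, 1] so we stay on the physical needle path.
    t = np.clip((risk_points - entry) @ d / (d @ d), 0.0, 1.0)
    closest = entry + t[:, None] * d                   # per-point foot points
    return np.linalg.norm(risk_points - closest, axis=1).min()

# A candidate trajectory would be rejected (or penalized during the
# optimization) when min_clearance falls below a safety margin, e.g. 5 mm.
```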
System description
The developed navigation system, which uses AR for the intraoperative visualization of surgical planning data, was composed of a common video projector, two high-resolution charge-coupled device (CCD) cameras, and an off-the-shelf notebook (2-GHz dual-core central processing unit (CPU), 1 GB of random access memory (RAM); Fig. 2a). The projector was used as a scanning device by projecting coded light patterns to register the patient. Use of a video projector allowed online projection of planning data and additional information in arbitrary colors into the operating field. Subsequent movements of the nonfixed patient were detected by stereoscopically tracking passive markers attached to the patient (23–25).
Image acquisition
The first task consisted of calibrating both the video projector and the CCD cameras within the same coordinate system. Together they formed a surface scanner that was used to generate a 3D point cloud of the patient's skin surface immediately before the intervention started. This was accomplished by projecting a sequence of stripe patterns (coded light) onto the region of interest with the aid of the integrated video projector (Fig. 2b). The cameras captured the deformation of the lines on the surface, and the corresponding images were analyzed with respect to shifting grey values. Because the stripe widths corresponded approximately to the lattice parameter of the CCD matrix, moiré patterns emerged. The software was able to analyze the patterns and decode the body's surface, represented as a 3D point cloud (Fig. 3a).
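To make the coded-light step concrete, the sketch below shows one standard way to decode a Gray-code stripe sequence into per-pixel projector coordinates, from which 3D points are then triangulated using the calibrated camera-projector geometry. This is an illustrative reconstruction under assumptions (Gray-code patterns with inverse frames, a fixed contrast threshold), not the system's actual software:

```python
import numpy as np

def decode_gray_code(images, inverse_images, threshold=10):
    """Decode a Gray-code stripe sequence into projector column indices.

    images, inverse_images: lists of (H, W) grey-value frames captured
    with each stripe pattern and its inverse, coarsest bit first.
    Returns an (H, W) int map of projector columns; -1 marks pixels
    where the stripe contrast was too low to decode reliably.
    """
    h, w = images[0].shape
    valid = np.ones((h, w), dtype=bool)
    gray = np.zeros((h, w), dtype=np.int64)
    for on, off in zip(images, inverse_images):
        diff = on.astype(np.int64) - off.astype(np.int64)
        valid &= np.abs(diff) > threshold            # reject low contrast
        gray = (gray << 1) | (diff > 0).astype(np.int64)  # one Gray bit
    # Gray -> binary: XOR the value with successively shifted copies.
    binary = gray.copy()
    shift = gray >> 1
    while shift.any():
        binary ^= shift
        shift >>= 1
    binary[~valid] = -1
    return binary
```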
Intraoperative tracking
Because intraoperative conditions impede continuous scanning, and to avoid attaching rigid fixations to the patient, subsequent movements of the nonfixed patient were detected by stereoscopically tracking passive markers attached to the patient (Fig. 4c). These were tracked and registered by analyzing corresponding images from the integrated cameras by using Hough transformations. The first marker registration was performed simultaneously with the described initial scan of the patient. This step yielded a second transformation T_cur(t), mapping the patient's initial position (registered by T_ini) to the current one. The global transformation T_global(t) = T_cur(t) · T_ini allowed us to continuously transfer the surgical planning data into the patient's coordinate system, taking into account the patient's actual position. The corresponding 2D projector bitmap, which was projected onto the 3D patient surface, was edited to avoid or minimize distortion in areas of steep or curved surfaces. However, maintaining lengths and angles required a triangulated surface (e.g., a Delaunay triangulation) to predict and correct deformations by using normal vectors and other features (23, 24).
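A minimal sketch of this transformation chain, assuming 4 × 4 homogeneous matrices (the function names are illustrative, not the system's actual code):

```python
import numpy as np

def homogeneous(R, t):
    """Build a 4 x 4 homogeneous matrix from rotation R (3 x 3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def map_to_current(points_plan, T_ini, T_cur):
    """Map planning-CT points into the patient's current position.

    T_ini: surface-scan registration of the planning CT to the initial
           patient position.
    T_cur: marker-based transform from the initial to the current
           position, updated over time.
    """
    T_global = T_cur @ T_ini                # T_global(t) = T_cur(t) . T_ini
    homog = np.hstack([points_plan, np.ones((len(points_plan), 1))])
    return (homog @ T_global.T)[:, :3]      # back to 3D coordinates
```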
Fig. 1. Real-time navigated needle implantation. Initially, a planning computed tomography (CT) scan of the patient is acquired. For virtual planning of needle implantation, the CT data are sent to the planning system. After segmentation of tumor and risk structures, the virtual distribution of the brachytherapy needles is planned, taking relevant surrounding risk structures into account. An entrance point and needle pathway are determined for each needle. Intraoperatively, the projector is used as a scanning device to yield a surface scan of the patient's current position to register the patient. After the initial scanning of the patient's position, the three-dimensional (3D) point cloud is registered with the planning CT data. This enables matching of the preoperatively segmented surface of the diagnostic image data (CT, magnetic resonance imaging), on which the surgical plan was defined, with the patient's intraoperative position. Subsequent movements of the nonfixed patient are detected by stereoscopically tracking passive markers attached to the patient. After registration, the system calculates the projection of the virtual planning data onto the surface of the patient. During needle implantation, this information is augmented (projected) onto the real scene. The tumor extent and the entrance point of each needle are projected onto the skin, and needle implantation is monitored.
Accuracy measurements
Guidance systems can use rigid-body transformations to accomplish registration of an image volume with the physical space of the operating setup. Accuracy is important to these systems, as is knowledge of the level of accuracy. Three useful measures of error are suggested for analyzing the accuracy of point-based registration methods: (1) the fiducial localization error, which is the error in locating the fiducial points; (2) the fiducial registration error (FRE), which is the root-mean-square distance between corresponding fiducial points after registration; and (3) the target registration error (TRE), which is the distance after registration between corresponding points other than the fiducial points used to compute the registration (10, 11).
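As a compact illustration of these definitions, the sketch below computes FRE after a standard least-squares rigid (Kabsch/Horn-type) point registration; it illustrates the textbook method rather than the authors' specific implementation:

```python
import numpy as np

def register_rigid(fixed, moving):
    """Least-squares rigid registration of paired fiducials (Kabsch/Horn).

    fixed, moving: (N, 3) arrays of corresponding fiducial positions.
    Returns rotation R and translation t with fixed ~ moving @ R.T + t.
    """
    cf, cm = fixed.mean(axis=0), moving.mean(axis=0)
    H = (moving - cm).T @ (fixed - cf)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                          # guard against reflections
    t = cf - R @ cm
    return R, t

def fiducial_registration_error(fixed, moving, R, t):
    """FRE: root-mean-square fiducial distance after registration."""
    residual = fixed - (moving @ R.T + t)
    return np.sqrt((residual ** 2).sum(axis=1).mean())
```

TRE would be estimated with the same residual formula, evaluated at target points that were withheld from the registration.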
RESULTS
The described method enables the surgeon to visualize planning data on top of any preoperatively segmented and triangulated surface (skin) with a direct line of sight during the operation (Fig. 4a–4d). The system allowed us to cope with occlusion by simply moving the video projector to an appropriate position, because changes in the position of the video projector are equivalent to those of the patient. Furthermore, the tracking system allowed dynamic adjustment of the data to the patient's current position and therefore eliminated the need for rigid fixation with stereotactic frames or similar devices (Fig. 4c). The system was particularly attractive because both the video projector and the cameras served multiple purposes (registration/visualization and registration/tracking, respectively).
Fig. 3. (a–c) Registration. (a) Three-dimensional (3D) point cloud of the patient's surface. The cameras gather the deformation of the lines on the surface. Because the stripe widths correspond approximately to the lattice parameter of the charge-coupled device (CCD) matrix, moiré patterns emerge. The software is able to analyze the patterns and decode the body's surface, represented as a 3D point cloud. The error of a single point is less than 1 mm. The density of the point cloud is approximately 4 points/mm². (b) Virtual shell of the patient obtained from the planning computed tomography (CT). The surface of the patient is reconstructed from the planning CT. (c) Registration; after the initial scanning of the patient's present position, these data must be registered with the planning CT data. The CT images are transformed into a distance tomogram. A surface-matching algorithm serves to minimize the sum over all distances from the CT surface to the point cloud. The result of this process is the information of the patient's present position relative to the CT-based planning data, i.e., a 4 × 4 transformation matrix describing translation and rotation. After removal of inevitable outliers, the resulting point cloud can be matched to the preoperatively segmented surface of the diagnostic image data on which the surgical plan was defined.
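The distance-tomogram idea can be sketched as follows, under stated assumptions (SciPy's general-purpose tools stand in for the system's own matcher; voxel indexing and the six-parameter parameterization are simplifications):

```python
import numpy as np
from scipy.ndimage import map_coordinates
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def surface_distance_cost(params, cloud, distance_field, voxel_size):
    """Mean distance of the transformed scan points to the CT surface,
    sampled by interpolation from a precomputed distance tomogram.

    params: six rigid-body parameters (three Euler angles, three translations).
    cloud: (N, 3) surface-scan point cloud in millimetres.
    distance_field: 3D array; each voxel holds its distance to the CT surface.
    """
    R = Rotation.from_euler("xyz", params[:3]).as_matrix()
    moved = cloud @ R.T + params[3:]
    coords = (moved / voxel_size).T          # (3, N) voxel coordinates
    return map_coordinates(distance_field, coords, order=1, mode="nearest").mean()

# Minimizing over the six parameters yields the 4 x 4 rigid transform
# (rotation + translation) aligning the point cloud with the planning CT:
# result = minimize(surface_distance_cost, np.zeros(6),
#                   args=(cloud, distance_field, 1.0), method="Powell")
```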
Fig. 4. (a–d) Projection of planning data. (a) Virtual planning of brachytherapy needle implantation. (b) After registration, the system calculates the projection of planning data onto the surface of the patient during needle implantation. (c) Subsequent movements of the nonfixed patient are detected by stereoscopically tracking passive markers attached to the patient's upper jaw. (d) In a first clinical study, the entrance point of each needle is projected onto the skin, and the needles are then positioned on the skin. Because intraoperative conditions impede continuous scanning, and to avoid attaching rigid fixations to the patient, succeeding registrations are performed by tracking markers affixed to the patient and to the brachytherapy needles.
The first clinical testing showed that the developed low-cost AR system is very accurate and meets clinical demands. Furthermore, the surgeon is not encumbered by head-mounted devices and is able to perform the intervention without interference with his usual practice.
DISCUSSION
Complex surgical interventions are increasingly realized with the aid of computer-based operation planning systems. The most important step from preoperative planning to intraoperative realization consists of providing the planning data in a reasonable and easy-to-handle way without forfeiting the preoperatively achieved accuracy (1–3, 10). This can be realized by using AR techniques (13–17). We developed a spatial AR system in which the planning data are projected directly onto the patient.
Fig. 5. Intuitive intraoperative guidance. (Left to right) 1. After registration, the target point is projected onto the patient's skin. To find the planned trajectory of the needles, an arrow is visualized on the patient's skin to guide the brachytherapist to the right implantation angle. 2. When the correct target point and needle trajectory are found, the needle angle and depth are visualized, color coded, on the skin by concentric circles. 3. Any deviation of the needle and the depth of the needle tip are monitored during the implantation process by intraoperative tracking of the needle. Deviations from the trajectory are visualized by concentric circles. 4. The planned depth of the needle is reached when the two concentric circles are congruent with each other; this is visualized by changing colors. 5. Any further advancement of the needle is also indicated by a change of color.
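The geometry behind this feedback can be sketched as follows (illustrative names only; the real system additionally renders the guidance arrow and the color-coded circles with the projector):

```python
import numpy as np

def needle_feedback(tip, direction, entry, target):
    """Deviation feedback for a tracked needle against its planned trajectory.

    tip: (3,) current needle-tip position; direction: (3,) needle axis.
    entry, target: (3,) planned entrance point and target point.
    Returns the angular deviation (degrees) and remaining depth (mm),
    the two quantities that drive the color-coded circles described above.
    """
    planned = target - entry
    planned = planned / np.linalg.norm(planned)
    axis = direction / np.linalg.norm(direction)
    angle = np.degrees(np.arccos(np.clip(axis @ planned, -1.0, 1.0)))
    remaining = (target - tip) @ planned      # signed depth along the plan
    return angle, remaining                   # remaining <= 0: depth reached
```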
Table 1. Treatment data: localization, dose, number of needles, irradiated volume, and accuracy of navigated needle placement

No.  Localization    Dose (Gy)  No. of needles  Volume (cm³)  Mean deviation (mm)  Maximum deviation (mm)
 1   Tongue             10            8              71               2.3                  2.9
 2   Base of tongue     13            9              48               2.7                  4.3
 3   Floor of mouth     18            9              48               0.9                  2.6
 4   Tongue             10           12             150               1.1                  2.3
 5   Tongue             12            5              33               1.0                  2.9
 6   Base of tongue     15           12              79               2.2                  5.4
 7   Floor of mouth     18            3              18               0.3                  2.3
 8   Base of tongue     20            7              43               0.8                  3.7
 9   Floor of mouth     18           11              76               1.3                  3.1
10   Floor of mouth     15            6              40               1.8                  0.9
Structured light projection is a global measurement technique, particularly useful when the behavior of an entire surface is of interest. Moving from identifiable points and lines to the projection of a continuous area pattern of structured light for surface measurement is attractive because it offers the opportunity to measure tens of thousands of closely spaced surface points simultaneously in one subsecond video image frame. This technique reduces the error of a single point to less than 1 mm, with a density of the point cloud of approximately 4 points/mm². The optical sensor height map of the patient and the CT virtual shell are simultaneously rendered as 3D surfaces to enable interactive control and perspective viewing of the two surfaces as their relative orientation is altered.
After registration, the virtual planning of the needle implantation can be adapted to the actual patient position in the operating room. To avoid constraining the surgeon to persistently reorient his view from the patient to the monitor and vice versa, great efforts are being made to visualize the surgical planning data directly in the operation area. Augmented reality is a technology in which a computer-generated image is superimposed onto the user's vision of the real world, giving the user additional information generated from the computer model (28–31). Above all, head-mounted or see-through displays enjoy great popularity (32). These are mounted on the surgeon's head and used to superimpose data directly in the visual path without obscuring it. However, with regard to precision, resolution, vision frequency, and sterility, head-mounted displays still show great disadvantages and often lead to queasiness of the wearer (31).
The described system distinguishes itself from alternative technologies (head-mounted displays, see-through glasses) because it dispenses with both the attachment of unpleasant screw markers for navigation and the rigid fixation of the patient during the surgical intervention. Furthermore, the costs of the presented system are much lower than those of techniques based on expensive navigation systems. Another advantage of using a video projector to visualize surgical planning data is that all surgically involved persons are able to share the same AR view of the planning data and additional information, without supplying them all with expensive head-mounted displays. In addition, the surgeon is not obliged to wear encumbering devices on his head and can perform the surgical intervention without interference with his usual practice.
The use of projector-based AR proved to be feasible in interstitial brachytherapy and enabled: (1) registration of planning data with the actual patient position, (2) intraoperative visualization of planning data on the patient with high accuracy (<1 mm), (3) intraoperative tracking of the patient and the needles, (4) adaptation of planning data to changes in the patient's position, and (5) intuitive intraoperative guidance. It is a noninvasive, easy, quick, inexpensive, and reliable solution for the future real-time interactive adaptation of virtually planned interstitial implantation in combination with intraoperative imaging, accounting for intraoperative changes in tissue geometry or needle deviations after each needle implantation.
CONCLUSION
This report describes the complete workflow from data acquisition to intraoperative image guidance for computer-based interstitial brachytherapy. Augmented reality is a noninvasive technique that proved to be feasible and accurate in image-guided interstitial brachytherapy. It enabled intraoperative visualization and monitoring of planning data on the patient's surface. Thus, intuitive real-time intraoperative orientation, guidance, and monitoring of the implantation process are implemented in interstitial brachytherapy.
REFERENCES
1. Peters TM. Image-guidance for surgical procedures. Phys Med Biol 2006;51:R505–R540.
2. Satava RM. Emerging technologies for surgery in the 21st century. Arch Surg 1999;134:1197–1202.
3. Jolesz FA. Image-guided procedures and the operating room of the future. Radiology 1997;204:601–612.
4. Bucholz RD, Smith KR, Laycock KA, et al. Three-dimensional localization: From image-guided surgery to information-guided therapy. Methods 2001;25:186–200.
5. Kolotas C, Baltas D, Zamboglou N. CT-based interstitial HDR brachytherapy. Strahlenther Onkol 1999;175:419–427.
6. Zelefsky MJ, Yamada Y, Marion C, et al. Improved conformality and decreased toxicity with intraoperative computer-optimized transperineal ultrasound-guided prostate brachytherapy. Int J Radiat Oncol Biol Phys 2003;55:956–963.
7. Nag S, Ciezki JP, Cormack R, et al. Intraoperative planning and evaluation of permanent prostate brachytherapy: Report of the American Brachytherapy Society. Int J Radiat Oncol Biol Phys 2001;51:1422–1430.
8. Krempien R, Hassfeld S, Kozak J, et al. Frameless image guidance improves accuracy in three-dimensional interstitial brachytherapy needle placement. Int J Radiat Oncol Biol Phys 2004;60:1645–1651.
9. Krempien R, Grehn C, Haag C, et al. Feasibility report for retreatment of locally recurrent head and neck cancer by combined brachy-chemotherapy using frameless image-guided 3D interstitial brachytherapy. Brachytherapy 2005;4:159–167.
10. Fitzpatrick JM, West JB. The distribution of target registration error in rigid-body point-based registration. IEEE Trans Med Imaging 2001;20:917–927.
11. West JB, Fitzpatrick JM, Toms SA, et al. Fiducial point placement and the accuracy of point-based, rigid body registration. Neurosurgery 2001;48:810–816.
12. Yuan ML, Ong SK, Nee AY. Registration based on projective reconstruction technique for augmented reality systems. IEEE Trans Vis Comput Graph 2005;11:254–264.
13. Shuhaiber JH. Augmented reality in surgery. Arch Surg 2004;139:170–174.
14. Wacker FK, Vogt S, Khamene A, et al. An augmented reality system for MR image-guided needle biopsy: Initial results in a swine model. Radiology 2006;238:497–504.
15. Khan MF, Dogan S, Maataoui A, et al. Accuracy of biopsy needle navigation using the Medarpa system: Computed tomography reality superimposed on the site of intervention. Eur Radiol 2005;15:2366–2374.
16. De Buck S, Maes F, Ector J, et al. An augmented reality system for patient-specific guidance of cardiac catheter ablation procedures. IEEE Trans Med Imaging 2005;24:1512–1524.
17. Regenbrecht H, Baratoff G, Wilke W. Augmented reality projects in the automotive and aerospace industries. IEEE Comput Graph Appl 2005;25:48–56.