
This article has been accepted for inclusion in a future issue of this journal.

Content is final as presented, with the exception of pagination.


IEEE TRANSACTIONS ON CYBERNETICS

Linear Tracking for 3-D Medical Ultrasound Imaging


Qing-Hua Huang, Zhao Yang, Wei Hu, Lian-Wen Jin, Gang Wei, and Xuelong Li, Fellow, IEEE

Abstract—As the clinical application grows, there is a rapid technical development of 3-D ultrasound imaging. Compared with 2-D ultrasound imaging, 3-D ultrasound imaging can provide improved qualitative and quantitative information for various clinical applications. In this paper, we propose a novel tracking method for a freehand 3-D ultrasound imaging system with improved portability, reduced degrees of freedom, and reduced cost. We designed a sliding track with a linear position sensor attached, which transmitted positional data via a wireless communication module based on Bluetooth, resulting in a wireless spatial tracking modality. A traditional 2-D ultrasound probe fixed to the position sensor on the sliding track was used to obtain real-time B-scans, and the positions of the B-scans were simultaneously acquired when moving the probe along the track in a freehand manner. In the experiments, the proposed method was applied to ultrasound phantoms and real human tissues. The results demonstrated that the new system outperformed a previously developed freehand system based on a traditional six-degree-of-freedom spatial sensor in phantom and in vivo studies, indicating its merit in clinical applications for human tissues and organs.

Index Terms—Application system, one degree of freedom, volume reconstruction, wireless spatial tracking, 3-D ultrasound.

I. INTRODUCTION
THREE-dimensional ultrasound imaging technology has attracted growing attention and has been well developed because of its significant advantages in illustrating entire tissues and providing quantitative analysis. In comparison with conventional 2-D ultrasound images, a 3-D ultrasound image allows viewing of an arbitrarily oriented image plane within the patient and provides volume measurement of organs or lesions. In comparison with computerized tomography and magnetic resonance imaging, 3-D ultrasound is a low-cost solution for obtaining 3-D medical images [3], [4].

Manuscript received August 15, 2012; revised November 8, 2012; accepted November 16, 2012. This work was supported by the National Basic Research Program of China (973 Program) under Grant 2012CB316400, the National Natural Science Funds of China under Grant 61125106, Grant 61001181, Grant 61075021, and Grant 91120302, and the Guangdong Natural Science Foundation (S2012010009885). This paper was recommended by Associate Editor S. H. Rubin of the former IEEE Transactions on Systems, Man and Cybernetics, Part C.
Q.-H. Huang, W. Hu, and G. Wei are with the Guangdong Provincial Key Laboratory of Short-Range Wireless Detection and Communication and the School of Electronic and Information Engineering, South China University of Technology, Guangzhou 510006, China (e-mail: qhhuang@scut.edu.cn; 710072848@qq.com; ecgwei@scut.edu.cn).
Z. Yang and L.-W. Jin are with the School of Electronic and Information Engineering, South China University of Technology, Guangzhou 510006, China (e-mail: yangdxng100@126.com; eelwjin@scut.edu.cn).
X. Li is with the Center for Optical Imagery Analysis and Learning, State Key Laboratory of Transient Optics and Photonics, Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi'an 710119, China (e-mail: xuelong_li@opt.ac.cn).
Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.
Digital Object Identifier 10.1109/TSMCC.2012.2229270
As far as the scanning methods are concerned, 3-D ultrasound
imaging systems can be divided into three categories: mechanical scanners [2], [3], freehand techniques [14], and 2-D arrays [13]. Nowadays, for convenience, some ultrasound
machines are equipped with 3-D volumetric probes which make
use of mechanical scanning or 2-D arrays embedded within a
dedicated housing. The 3-D probes can provide fast imaging of
human tissues. However, they are highly expensive, and their
fields of view and resolutions are relatively limited.
In contrast, 3-D freehand ultrasound imaging systems based on conventional 2-D ultrasound machines acquire a series of 2-D ultrasound images (B-scans) and reconstruct them into a 3-D volume dataset using positional information corresponding to the B-scans. Because the scanning is freely controlled by users, the data can be captured anywhere in the human body and the field of view can be extended without limit. Therefore, the freehand
system is regarded as the most flexible and the cheapest solution
to 3-D ultrasound imaging [14].
In tracked freehand approaches, a position sensor is attached
to the probe to identify its gestures and motions during the
scanning. According to a review [1], there have been various
sensors designed to capture gestures and motions of an object. In previously reported 3-D ultrasound systems, electromagnetic and optical sensors are most often used to locate the
probe [6]–[9], [12]. These sensors offer six degrees of freedom
(DoFs) and can measure the rotations and translations of the
probe. Hence, the freehand system allows users to arbitrarily
manipulate the probe during the raw data acquisition.
However, there is a tradeoff between the flexibility of data
acquisition and the accuracy of volume reconstruction for a
freehand 3-D ultrasound system. It is obvious that the readings
of a position sensor are not absolutely accurate. Measurement
error is inevitable. The position sensor (e.g., an electromagnetic
sensor) is very sensitive to small flutters: A small jitter in the
position sensor reading would result in visible artifacts in images
[10]. For a 6-DoF system, there are two kinds of error, i.e., translational error and rotational error. Compared with the translational error, the rotational error leads to much larger inaccuracies in the reconstruction process. If the probe is significantly rotated during a freehand scan, large reconstruction errors might occur. For this reason, moving the probe with a uniform speed and little rotation was strongly recommended [11], and hence a single sweep of the probe in a uniform manner was preferred to one in an arbitrary manner [15].
Actually, a linear mechanical scanning device can make a
single sweep using a motor which drives the probe to move in a
fixed and linear path. The B-scans can be regularly collected and
are visually parallel to each other. Because the scanning path


Fig. 1. Illustration of the proposed 1-DoF 3-D ultrasound imaging system.

is limited to a straight line, the rotational error can be avoided,


leading to a more accurate location of each B-scan. Nevertheless,
the fixed path and probe motion decrease the flexibility of the
scanning, and the additional driving device increases the cost
and inconvenience in real applications.
In order to achieve a good tradeoff between freehand scanning and mechanical scanning, we proposed a 1-DoF freehand system for 3-D ultrasound imaging in this paper. The 1-DoF system limited the motion of the probe to a single direction and avoided probe rotations. It resembled a mechanical scanning system in which the probe was driven by a hand instead of a motor. Therefore, the flexibility of tracked freehand 3-D imaging techniques was partially retained. In addition, without using a traditional spatial sensor (e.g., an electromagnetic or optical sensor), its cost was further reduced, and the whole system was more portable because a wireless spatial tracking technique was adopted.
In Section II, the methods for development of the proposed
system are introduced. Section III addresses the experiments
and illustrates the experimental results. Finally, conclusions are
drawn in Section IV.
II. METHODS
A. System Design
As illustrated in Fig. 1, our 3-D ultrasound imaging system consisted of three parts: a conventional ultrasound machine
(Sonix RP, Ultrasonix Medical Corporation, Richmond, BC,
Canada) with a linear probe (L14-5/38) and a convex probe
(C3-7/50) to generate B-scans, a common computer with a
video capture card (NI-IMAQ PCI-1405, National Instruments
Corporation, Austin, TX) installed to collect images from the
ultrasound machine, and a linear sliding track with a position
sensor that provided positional readings.
As illustrated in Fig. 1, the sliding track was actually a digital
caliper (Digital Scale Units, Model 812103, Anyi Instrument
Co. Ltd., Shanghai, China) and the readings for distance measurement could be electronically measured and displayed in real
time. We designed a positioning module in which a microcontroller unit (MCU, Model PIC16F877A, Microchip Technology Inc., Chandler, AZ) was used to read and process the digital
distance measures from the caliper and a Bluetooth module was
embedded to wirelessly transmit the digital readings to a remote Bluetooth adaptor on a computer. The positioning module
was fixedly attached to the measuring module of the caliper and
could be freely moved together with it. With a well-designed
clip, the probe was fixed to the positioning module which could
be regarded as a position sensor responsible for recording current positions of the caliper and transmitting the measures via
the Bluetooth module embedded.
To make sure that the position sensor could accurately record the real positions, we conducted 20 experiments. In each experiment, the measuring module of the caliper was randomly placed and the distance measurement result was transferred to the positioning module. The positioning module then wirelessly transmitted the current readings to the computer. The readings received by the computer were compared with those displayed on the caliper itself. In none of the experiments did we find any difference between the two types of readings, indicating that there was no measurement distortion caused by the positioning module.
The positioning module could be freely moved along the sliding track in the system. Hence, the DoF was 1. In our current design, the outer dimensions of the sliding track were about 350 × 50 × H mm, where H was the height of the track and could be adjusted within a range of 50–200 mm in real applications.
When the probe was moved by an operator, the B-scans and
the positional measures could be simultaneously collected by the
computer. The video capture card installed in the PC was used
to digitize and acquire real-time 2-D B-scans. The frame rate for
raw image acquisition was 25 Hz. In our system, we acquired 8-bit gray images to reduce the amount of storage. A Bluetooth link was established in the system for communication between the positioning module and the computer. A Bluetooth adapter was inserted into one of the USB ports on the computer to receive the positional information wirelessly transmitted from the positioning module. Therefore, our system was more portable than those previously developed using 6-DoF position sensors.
A software system was developed using Visual C++ (Microsoft Corp., Redmond, WA) on the computer and was responsible for data acquisition, image processing, volume reconstruction, and visualization. The Visualization Toolkit (VTK, Kitware Inc., Clifton Park, NY) was integrated into the software
system for image display and volume rendering. The computer
was equipped with a Pentium dual-core CPU at 2.5 GHz and
2-GB RAM. Fig. 2 demonstrates the work flow of the system.
B. Spatial Calibration
A conventional freehand 3-D ultrasound system should be
calibrated to establish the correspondence between the B-scans
and the spatial data. There are two types of calibration, i.e., the
spatial calibration and temporal calibration. The spatial calibration is required to determine the spatial relationship between
the B-scan image plane and the position sensor attached to the
probe. The temporal calibration is necessary for determining

Fig. 2. Diagram of work flow for the 3-D ultrasound imaging system.

Fig. 3. Illustration of the spatial calibration experiment. P, S, T, and V denote the coordinate systems of the B-scan image plane, the position sensor, the sliding track, and the volume dataset, respectively.

the temporal offset between the timestamps of the positional information and the B-scans.

Because the proposed system was different from the freehand 3-D system with 6 DoFs in terms of the scanning mode, it could not be calibrated using conventional methods. In this paper, we designed a new spatial calibration method with a newly designed phantom. As illustrated in Fig. 3, there were 5 × 5 small silicon balls, each of which was supported by a stick on a flat table. Given that the width, height, and length of the table form a volume coordinate system $C_v$, the 3-D locations of the balls could be precisely measured. Using the proposed scanning protocol, the balls were fully scanned by the ultrasound probe. In the volume reconstruction, every pixel from the B-scans should be relocated in the volume coordinate system. Thus, each pixel's scan-plane location was first transformed into the coordinate system of the position sensor attached to the sliding track and then into the volume coordinate system. The overall mathematical expression is as follows:

$$C_v(\vec{p}_x) = {}^{v}T_t \; {}^{t}T_s \; {}^{s}T_p \; \vec{p}_x \tag{1}$$

where $\vec{p}_x$ is the location of a pixel, ${}^{s}T_p$ is the transformation matrix from the coordinate system of the B-scan image plane $P$ to that of the position sensor $S$, ${}^{t}T_s$ is the transformation matrix from the coordinate system of the position sensor $S$ to that of the sliding track $T$, ${}^{v}T_t$ is the transformation matrix from the coordinate system of the sliding track $T$ to that of the volume $V$, and $C_v(\vec{p}_x)$ is the location of the pixel in the volume coordinate system. The spatial calibration was employed to discover ${}^{s}T_p$, which is defined as follows:

$$
{}^{s}T_p =
\begin{bmatrix}
\cos\alpha\cos\beta & \cos\alpha\sin\beta\sin\gamma-\sin\alpha\cos\gamma & \cos\alpha\sin\beta\cos\gamma+\sin\alpha\sin\gamma & X_o\\
\sin\alpha\cos\beta & \sin\alpha\sin\beta\sin\gamma+\cos\alpha\cos\gamma & \sin\alpha\sin\beta\cos\gamma-\cos\alpha\sin\gamma & Y_o\\
-\sin\beta & \cos\beta\sin\gamma & \cos\beta\cos\gamma & Z_o\\
0 & 0 & 0 & 1
\end{bmatrix}. \tag{2}
$$

From Fig. 3, we assumed that the position sensor was fixedly attached to the sliding track and moved along the Z-axis of $T$ without any rotations or translations along the other directions. Thus, ${}^{t}T_s$ could be simplified as

$$
{}^{t}T_s =
\begin{bmatrix}
1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 1 & Z(t)\\
0 & 0 & 0 & 1
\end{bmatrix} \tag{3}
$$

where $Z(t)$ is the distance between the origin of $T$ and the location of the spatial sensor at time $t$.

It is observed that there are six unknown parameters ($\alpha$, $\beta$, $\gamma$, $X_o$, $Y_o$, and $Z_o$) in ${}^{s}T_p$. Similarly, there are six unknown parameters in ${}^{v}T_t$. Considering the two scaling factors (i.e., $S_x$ and $S_y$) for the image resolution of the B-scan, there are in total 12 parameters to be identified. Because the locations of the balls had been measured and were known, we could build a group of nonlinear homogeneous equations with respect to the manually marked locations of the balls on the B-scan image planes. To solve these equations, the robust Levenberg–Marquardt algorithm [19], an efficient method for solving nonlinear problems, was used to estimate $\alpha$, $\beta$, $\gamma$, $X_o$, $Y_o$, and $Z_o$ in ${}^{s}T_p$.
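For illustration, this kind of parameter fit can be sketched with SciPy's Levenberg–Marquardt solver. This is a simplified sketch, not the paper's implementation: the track-to-volume transform is taken as identity, the pixel scale factors are assumed known, the ball observations are synthetic, and all names are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

def rigid_transform(alpha, beta, gamma, xo, yo, zo):
    """Homogeneous transform from Z-Y-X Euler angles plus a translation,
    matching the form of the probe-to-sensor matrix in (2)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    return np.array([
        [ca * cb, ca * sb * sg - sa * cg, ca * sb * cg + sa * sg, xo],
        [sa * cb, sa * sb * sg + ca * cg, sa * sb * cg - ca * sg, yo],
        [-sb,     cb * sg,                cb * cg,                zo],
        [0.0,     0.0,                    0.0,                    1.0],
    ])

def slide_transform(z):
    """The matrix in (3): pure translation along the Z-axis of the track."""
    T = np.eye(4)
    T[2, 3] = z
    return T

# Synthetic ground truth (angles in radians, offsets in cm) to be recovered.
true_params = np.array([0.01, 0.02, -0.015, 4.3, 8.8, 0.5])

# Hypothetical observations: homogeneous pixel locations of calibration
# balls, each paired with a known slide reading z.
rng = np.random.default_rng(0)
pix = np.hstack([rng.uniform(0, 5, (20, 3)), np.ones((20, 1))])
zs = rng.uniform(0, 30, 20)
targets = np.array([slide_transform(z) @ rigid_transform(*true_params) @ p
                    for z, p in zip(zs, pix)])

def residuals(params):
    sTp = rigid_transform(*params)
    pred = np.array([slide_transform(z) @ sTp @ p for z, p in zip(zs, pix)])
    return (pred - targets)[:, :3].ravel()

# Levenberg-Marquardt refinement from a rough initial guess.
fit = least_squares(residuals, x0=np.zeros(6), method="lm")
print(np.round(fit.x, 4))
```

With noise-free synthetic observations the solver recovers the six parameters essentially exactly; with real, manually marked ball positions the residual would reflect the marking and sensor errors instead.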
C. Temporal Calibration
In the experiments for temporal calibration, we placed eight
silicon balls each of which was supported by a stick fixed at the
bottom of a water tank, as illustrated in Fig. 4. The diameter of
each ball was 1.5 mm. The positions corresponding to balls 1
and 8 were recorded by the positioning module and denoted as
PA and PB, respectively. We moved the probe to sweep all of the
balls back and forth for several runs. In the collected images,
those showing the centers of the balls were regarded as the
image markers and their timestamps were picked out as the time
markers. At the same time, the positional data corresponding
to the time markers were acquired and regarded as positional
markers.
The positional markers recorded during the temporal calibration experiment were normalized to [0, 1] according to the range

This article has been accepted for inclusion in a future issue of this journal. Content is final as presented, with the exception of pagination.
4

Fig. 4.

IEEE TRANSACTIONS ON CYBERNETICS

Illustration of the temporal calibration experiment.

Fig. 6.

Construction of the volume coordinates system.

shown in Fig. 5(a). Obviously, the temporal shift with the


minimum root-mean-square error was the time delay between
the positional readings and the B-scans.
D. Volume Reconstruction and Analysis

Fig. 5. Temporal calibration results. (a) Real position and the position recorded
in the software. (b) Root-mean-square errors between the two data streams.

of [PA, PB]. Similarly, the image markers were normalized to


[0, 1] according to the range between balls 1 and 8.
The two normalized positions extracted from the image markers and the positional markers, respectively, were then interpolated using a spline interpolation, as shown in Fig. 5(a). Time
delays between the two data streams could be observed. The
time offset could be obtained by registering the two types of
normalized positional measure. A search range of (500 ms,
500 ms) was used to find the optimal temporal shift corresponding to the lowest registration error. Fig. 5(b) shows the curve of
root-mean-square error in registration of the two data streams

Once the image sequence and corresponding positional data


were transferred into the computer simultaneously, the procedure of volume reconstruction was required to form 3-D ultrasound images. With the reconstructed 3-D images, the functions
for data visualization and analysis could be realized.
At the beginning of the volume reconstruction, a volume
coordinate system should be established. In our previously developed freehand 3-D ultrasound system [12], we made use of
a 6-DoF spatial sensor, and the volume coordinate system was
determined by two predefined scanning positions. In contrast,
the scanning method was simplified and the DoF was reduced to
1 in this paper. As a result, the definition of volume coordinates
became straightforward. As illustrated in Fig. 6, the Z-axis was parallel to the moving direction of the probe, and the Y-axis was vertical to the ground. The origin of the volume coordinate system was set to the top-left point of the first collected B-scan. The width (in the X-axis) and height (in the Y-axis) of the voxel array were set to be the same as those of the B-scan, and the length along the Z-axis was determined by the B-scan farthest from the origin.
With the volume coordinate system, the pixels on B-scans
should be transformed from the coordinates of 2-D image plane
to the coordinates of volume. The process is described by (1).
Having transformed all pixels to the volume coordinate system,
a data interpolation method should be performed to compute
the values of the voxel array in the 3-D image based on the
pixel values. In the past decade, various interpolation methods have been proposed for volume reconstruction of 3-D ultrasound [16]–[18]. Barry et al. [6] proposed to use a spherical region centered about each voxel and compute the weighted average of all pixels falling into the region. Their method is called distance-weighted interpolation. We previously proposed an improved method called squared-distance-weighted (SDW) interpolation [12] to preserve more image details, defined as follows:

$$I(V_C) = \frac{\displaystyle\sum_{k=0}^{n} W_k\, I(V_P^k)}{\displaystyle\sum_{k=0}^{n} W_k}, \qquad W_k = \frac{1}{(d_k + \delta)^2} \tag{4}$$

where $I(V_C)$ is the intensity of the voxel at the volume coordinate $V_C$, $n$ is the number of pixels falling within the predefined spherical region centered about voxel $V_C$, $I(V_P^k)$ is the intensity of the $k$th pixel at the image coordinate $V_P^k$, $W_k$ is the relative weight for the $k$th pixel, $d_k$ is the distance from the $k$th vector transformed from $V_P^k$ to the center of the voxel $V_C$, and $\delta$ is a positive parameter for adjusting the effect of the interpolation.

Fig. 7. Display of three orthogonal slices of the fetus volume.

Fig. 8. Function for distance measurement based on three orthogonal views.

Fig. 9. Display of a reconstructed cylinder in the resolution phantom.

Fig. 10. Illustrations for measurement of distances between smaller targets in the resolution phantom.
In this paper, we applied the SDW method to compute voxel values in the reconstruction of volume data. The parameter δ was empirically set to 0.3. The reconstructed volume was rendered using VTK. In order to observe the internal structures of the volume, we developed functions for reslicing the volume, clipping the volume, and generating orthogonal slices. Three orthogonal slices of a fetus volume are shown in Fig. 7.
In addition, the functions for measurement of distance were
developed. From the orthogonal views, the distance between
any two points could be easily measured, as illustrated in Fig. 8.
The measured results were used for evaluation of the imaging
accuracy of the proposed system in this paper.

Fig. 11. New freehand scanning protocol for an in vivo test.

E. Experimental Methods
To assess the accuracy of the proposed 3-D ultrasound imaging system, we first conducted phantom experiments. An ultrasound resolution phantom (Model 044, CIRS, Inc., Norfolk,
VA) was employed. The phantom contained a number of coplanar anechoic tubby cylindrical lesions with different lengths
and diameters. One of the longest cylinders was reconstructed, as shown in Fig. 9. We scanned the phantom for ten runs and obtained ten sets of volumetric data. With reference to the documented dimensions, the imaging errors could be quantitatively measured. As illustrated in Fig. 9, the diameters in the X- and Y-directions and the length in the Z-direction could be measured
within the reconstructed volume. The mean and standard deviation (SD) of the measurement errors are reported in the next section.

TABLE I
QUANTITATIVE MEASUREMENT RESULTS OF THE DIMENSIONS ILLUSTRATED IN FIG. 9 AND THE DISTANCES ILLUSTRATED IN FIG. 10
In addition to the cylindrical targets, we also made use of the smaller targets to evaluate the accuracy of distance measurement based on the volumes produced by our system. As shown in Fig. 10, the distance between any two adjacent targets could be measured and compared with the documented value. The measurement errors indicated the imaging accuracy of the system.
Moreover, a fetus phantom (Model 068, CIRS, Inc., Norfolk,
VA) was employed in the phantom study. Because the imaging depth of the linear probe was too short to observe the entire fetus, it was scanned with the convex probe, which had a lower center frequency and a larger field of view. Its 3-D image could be used to qualitatively validate our system.
In our in vivo experiment, a young male subject (25 years old) was recruited. The subject's forearm was immersed in a water tank and scanned with the linear probe, as illustrated in Fig. 11.
The reconstructed volume could demonstrate the performance
of the proposed system in real applications. In order to better
illustrate the usefulness of the proposed system, it was also
compared with a previously reported freehand system which
was equipped with a 6-DoF spatial sensor (miniBird, Model
500, Ascension, VT) [12] in the experiments.
III. RESULTS
Fig. 11 illustrates the process of data collection using the proposed system. The subject's forearm immersed in a water tank was scanned. In the experiments, the sampling rate for image acquisition was 21 Hz, and that for collecting positional data was 35 Hz. The region of interest of the B-scans was 480 × 450 pixels for the linear probe and 280 × 240 pixels for the convex probe. The spatial calibration experiments were conducted for ten runs. α, β, and γ were 0.03 ± 0.01°, 0.06 ± 0.03°, and 0.04 ± 0.01°, respectively, and Xo, Yo, and Zo were 4.32 ± 0.13 cm, 8.85 ± 0.09 cm, and 0.46 ± 0.06 cm, respectively. From the temporal calibration experiments, the time delay of the positional data stream relative to the image data stream was 188.0 ± 2.2 ms.
Table I presents the quantitative measurement results using two freehand 3-D ultrasound systems with 1-DoF and 6-DoF position sensors, respectively. It can be clearly observed that the 1-DoF position sensor resulted in more accurate measurement results, indicating significantly improved imaging performance in comparison with traditional 6-DoF sensors. In other words, the imaging accuracy of the proposed system was improved by 0.46–2.14%. For the proposed system, the average error for the volume measurement of the cylindrical lesion embedded in the resolution phantom was 1.06 ± 1.46% (mean ± SD).

Fig. 12. Reconstruction of a fetus phantom using (a) the proposed system and (b) a 6-DoF freehand 3-D ultrasound system.
Fig. 12 shows two typical 3-D images of the fetus phantom using the proposed system and the 6-DoF system, respectively. In Fig. 12(a), the volume data consisting of 280 × 240 × 720 voxels were reconstructed using 793 B-scans collected in two sweeps, and the computation time was 251 s. The head, limbs, and trunk of the fetus can be clearly observed. Although the reconstructed surfaces were not as smooth as the real ones, the shape of each body part was almost identical to that of the phantom according to a qualitative evaluation, indicating the good performance of the proposed system in 3-D ultrasound imaging. In Fig. 12(b), the volume data were computed based on the raw B-scans and spatial data generated by the 6-DoF sensor. However, reconstruction errors, appearing as distortions, can be clearly seen in the face, head, shoulder, and arm of the fetus.
Fig. 13 illustrates two volumes reconstructed from a part of the subject's forearm based on the proposed system and the 6-DoF system, respectively. In Fig. 13(a), a total of 210 B-scans were collected, and the volume reconstruction took 49.3 s. The volume data consisted of 160 × 120 × 240 voxels. In comparison with Fig. 13(a), the volume shown in Fig. 13(b) appears less smooth, which was due to the relatively large reconstruction

Fig. 13. Reconstructed volume image from a portion of a subject's forearm using (a) the proposed system and (b) the 6-DoF freehand 3-D ultrasound system.

Fig. 14. Qualitative comparison between a raw B-scan in (a) and a reconstructed slice in (b).

errors. Fig. 14 demonstrates a qualitative comparison between


a raw B-scan and a reconstructed slice which spatially corresponded to the B-scan. From the two images, it can be observed
that the anatomical structures shown on the reconstructed slice
are almost identical to those shown on the raw B-scan, indicating
an accurate volume reconstruction.
IV. DISCUSSIONS AND CONCLUSION
In this paper, a simplified 3-D ultrasound imaging system with a novel linear tracking technique was proposed and developed. To demonstrate its accuracy and clinical practicality, results from phantom and in vivo experiments were provided. Overall, the experimental results for the phantoms and the subject demonstrated that our system reliably obtained 3-D ultrasound images with acceptable computational speed and outperformed a previously developed freehand system with a 6-DoF spatial sensor in terms of volume reconstruction accuracy, according to both qualitative and quantitative comparisons.
Nevertheless, the flexibility of our system decreased due to
the reduced DoF, which might result in some limitations in data
collection. Users could not scan the probe in an arbitrary manner. For some body parts, the skin surface was not adequately
flat and the probe might cause inhomogeneous pressure on the
skin during the scanning. Thus, the soft tissues would deform unevenly and the 3-D volume might not be properly reconstructed.
One solution was to immerse the body part to be scanned in water so that the probe did not contact the skin, as illustrated in Fig. 11. Another potential solution was to take advantage of motion tracking and interpolation algorithms [5], which have been widely applied in video data analysis, to recover the deformed tissues in the process of volume reconstruction. This will be our future research work.
Although the DoF of our system was limited, it had several advantages that should be useful for clinical applications. First, the cost of the spatial location device was much reduced in comparison with those used in conventional tracked freehand 3-D ultrasound systems, e.g., electromagnetic and optical sensors. The positioning module and the Bluetooth adaptor are easily available at very low prices in the market. Second, the spatial sensor that transmitted data in a wireless manner was relatively more portable and hence could work for on-site applications. Third, the rotational errors in registering the B-scans to the 3-D space could be avoided; hence, the reconstruction accuracy was improved. It has been well recognized that linear scanning leads to more accurate 3-D imaging because a 6-DoF position sensor (e.g., an optical or magnetic sensor) is often interfered with by obstacles and metallic/ferromagnetic materials, and the measurement error for rotations during the scanning might lead to significant reconstruction errors [14]. Hence, some authors [14], [20] suggested that the scanning should be limited to one or multiple sweeps, in each of which the motion of the probe is approximately linear along a single direction. The linear tracking technique based on a linear sliding track used in our system could well constrain the probe motion to a linear trajectory and thus avoid rotational errors in measuring the probe positions, eventually improving the performance of 3-D imaging. Finally, the new system partially retained the advantages of the tracked freehand system. For example, the user could freely move the probe back and forth along the sliding track, the moving speed and the scanning range could be fully controlled, and the height and the orientation of the sliding track could be easily adjusted according to a specific application.
In conclusion, the proposed 1-DoF probe tracking technique
for 3-D ultrasound imaging achieved a good balance between the
flexibility of data acquisition and the quality of volume reconstruction. Moreover, it became more portable than previously
reported 3-D ultrasound systems due to the wireless spatial data
transmission, and its cost was significantly reduced due to the
inexpensive positioning module. Therefore, it can be expected that the new 3-D ultrasound system will be successfully used in hospitals and will have promising market prospects.
REFERENCES
[1] S. Berman and H. Stern, "Sensors for gesture recognition systems," IEEE Trans. Syst., Man, Cybern. C, Appl. Rev., vol. 42, no. 3, pp. 277–290, May 2012.
[2] P. Toonkum, N. C. Suwanwela, and C. Chinrungrueng, "Reconstruction of 3D ultrasound images based on cyclic regularized Savitzky–Golay filters," Ultrasonics, vol. 51, pp. 136–147, 2011.
[3] T. R. Nelson and D. H. Pretorius, "Three-dimensional ultrasound imaging," Ultrasound Med. Biol., vol. 24, no. 9, pp. 1243–1270, 1998.
[4] A. Fenster, D. B. Downey, and H. N. Cardinal, "Three-dimensional ultrasound imaging," Phys. Med. Biol., vol. 46, no. 5, pp. 67–99, 2001.
[5] T. K. Shih, N. C. Tang, J. C. Tsai, and J. N. Hwang, "Video motion interpolation for special effect applications," IEEE Trans. Syst., Man, Cybern. C, Appl. Rev., vol. 41, no. 5, pp. 720–732, Sep. 2011.
[6] C. D. Barry, C. P. Allott, N. W. John, P. M. Mellor, P. A. Arundel, D. S. Thomson, and J. C. Waterton, "Three-dimensional freehand ultrasound: Image reconstruction and volume analysis," Ultrasound Med. Biol., vol. 23, no. 8, pp. 1209–1224, 1997.
[7] R. W. Prager, A. H. Gee, G. M. Treece, and L. Berman, "Freehand 3D ultrasound without voxels: Volume measurement and visualization using the Stradx system," Ultrasonics, vol. 40, no. 1–8, pp. 109–115, 2002.
[8] A. Ali and R. Logeswaran, "A visual probe localization and calibration system for cost-effective computer-aided 3D ultrasound," Comput. Biol. Med., vol. 37, pp. 1141–1147, 2007.
[9] O. V. Solberg, F. Lindseth, L. E. Bø, S. Muller, J. B. L. Bakeng, G. A. Tangen, and T. A. N. Hernes, "3D ultrasound reconstruction algorithms from analog and digital data," Ultrasonics, vol. 51, pp. 405–419, 2011.
[10] R. J. Housden, A. H. Gee, G. M. Treece, and R. W. Prager, "Sensorless reconstruction of unconstrained freehand 3D ultrasound data," Ultrasound Med. Biol., vol. 33, no. 9, pp. 408–419, 2007.
[11] G. York and Y. M. Kim, "Ultrasound processing and computing: Review and future directions," Annu. Rev. Biomed. Eng., vol. 1, pp. 559–588, 1999.
[12] Q. H. Huang, Y. P. Zheng, M. H. Lu, and Z. R. Chi, "Development of a portable 3D ultrasound imaging system for musculoskeletal tissues," Ultrasonics, vol. 43, pp. 153–163, 2005.
[13] M. P. Fronheiser, S. F. Idriss, P. D. Wolf, and S. W. Smith, "Vibrating interventional device detection using real-time 3-D color Doppler," IEEE Trans. Ultrason. Ferroelectr. Freq. Control, vol. 55, no. 6, pp. 1355–1362, Jun. 2008.
[14] A. H. Gee, R. W. Prager, G. M. Treece, and L. Berman, "Engineering a freehand 3D ultrasound system," Pattern Recogn. Lett., vol. 24, no. 4/5, pp. 757–777, 2003.
[15] A. H. Gee, G. M. Treece, R. W. Prager, C. J. C. Cash, and L. Berman, "Rapid registration for wide field of view freehand three-dimensional ultrasound," IEEE Trans. Med. Imaging, vol. 22, no. 11, pp. 1344–1357, Nov. 2003.
[16] R. N. Rohling, A. H. Gee, and L. Berman, "A comparison of freehand three-dimensional ultrasound reconstruction techniques," Med. Image Anal., vol. 3, no. 4, pp. 339–359, 1999.
[17] Q. H. Huang and Y. P. Zheng, "Volume reconstruction of freehand three-dimensional ultrasound using median filters," Ultrasonics, vol. 48, no. 3, pp. 182–192, 2008.
[18] Q. H. Huang, Y. P. Zheng, M. H. Lu, T. F. Wang, and S. P. Chen, "New adaptive interpolation algorithm for 3D ultrasound imaging with speckle reduction and edge preservation," Comput. Med. Imaging Graph., vol. 33, no. 2, pp. 100–110, 2009.
[19] R. W. Prager, R. N. Rohling, A. H. Gee, and L. Berman, "Rapid calibration for 3-D freehand ultrasound," Ultrasound Med. Biol., vol. 24, no. 6, pp. 855–869, 1998.
[20] Q. H. Huang and Y. P. Zheng, "A new scanning approach for limb extremities using a water bag in freehand 3-D ultrasound," Ultrasound Med. Biol., vol. 31, no. 4, pp. 575–583, 2005.

Qing-Hua Huang received the B.E. and M.E. degrees in automatic control and pattern recognition, both from the University of Science and Technology of China, Hefei, China, in 1999 and 2002, respectively, and the Ph.D. degree in biomedical engineering from The Hong Kong Polytechnic University, Hong Kong, in 2007.
Since 2008, he has been an Associate Professor with the School of Electronic and Information Engineering, South China University of Technology, Guangzhou, China. His research interests include ultrasonic imaging, medical image analysis, bioinformatics, intelligent computation and its applications.

Zhao Yang received the B.E. degree in communication engineering from Hubei University, Wuhan, China, in 2008. He joined the School of Electronic and Information Engineering, South China University of Technology, Guangzhou, China, as a Master's student in 2009 and is currently working toward the Ph.D. degree in machine learning and computer vision.

Wei Hu received the B.E. degree in electronics and information engineering from Jiangnan University, Wuxi, China, in 2010. He is currently working toward the Master's degree with the School of Electronic and Information Engineering, South China University of Technology, Guangzhou, China.
His research interests include 3-D medical ultrasound imaging.

Lian-Wen Jin received the B.S. degree from the University of Science and Technology of China, Hefei, China, and the Ph.D. degree from the South China University of Technology, Guangzhou, China, in 1991 and 1996, respectively.
He is currently a Professor with the School of Electronic and Information Engineering, South China University of Technology. He has contributed to more than 100 scientific papers. His research interests include character recognition, pattern analysis and recognition, image processing, machine learning, and intelligent systems.

Gang Wei was born in January 1963. He received the B.S. degree from Tsinghua University, Beijing, China, and the M.S. and Ph.D. degrees from South China University of Technology
(SCUT), Guangzhou, China, in 1984, 1987, and 1990,
respectively.
He was a visiting scholar with the University of
Southern California, Los Angeles, from June 1997
to June 1998. He is currently a Professor with the
School of Electronic and Information Engineering,
SCUT. He is a Committee Member of the National
Natural Science Foundation of China. His research interests include digital signal processing and communications.

Xuelong Li (M'02–SM'07–F'12) is currently a Researcher (Full Professor) with the Center for Optical Imagery Analysis and Learning, State Key Laboratory of Transient Optics and Photonics, Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi'an, China.
