A Hybrid Line-based Tracking System for Outdoor Environments
Bolan Jiang Ulrich Neumann Suya You
Integrated Media Systems Center
University of Southern California
bolanj@pollux.usc.edu, {uneumann|suyay}@graphics.usc.edu
Abstract
We describe a hybrid tracking system for outdoor augmented reality applications. The proposed system makes two main contributions: 1) it uses naturally occurring line features to recover 6DOF pose; 2) a high precision gyro is used to track 2D lines across frames under large inter-frame motions. The proposed system has been tested in an outdoor augmented reality application. The experiments show that the proposed hybrid 2D-2D line tracking algorithm reduces the need for 3D-2D matching, and that high accuracy is achieved for the pose recovery.
1. Introduction
Augmented reality applications can enhance a user's perception by inserting computer-generated virtual information into the user's view of the real world. They have many potential applications in the military, manufacturing, entertainment, and medicine. Ideally, the virtual information should maintain a correct spatial relationship with the real world. One of the key requirements for accomplishing this illusion is a tracking system that accurately estimates the 6DOF pose of the user (or of the camera, if the user views the world through the camera).
In this paper, we focus on a tracking system for an outdoor urban environment. Tracking in an outdoor environment is much more difficult than tracking indoors because of the lack of control over, and knowledge of, the environment. Several sensor technologies, such as GPS, inertial sensors, and compasses, can be used for tracking in outdoor environments. However, the disadvantages of these sensors, such as signal degradation and drift, make it difficult for them to achieve high accuracy.
Model-based vision methods detect features whose 3D positions are already known, and use the corresponding 2D-3D feature pairs to recover 6DOF pose. These model-based methods can achieve high accuracy and have been applied successfully in indoor augmented reality applications [11, 12]. We believe that model-based tracking is a reasonable choice for outdoor urban environments for the following reasons: 1) there are plenty of naturally occurring features, such as points and lines, in an outdoor environment; 2) building models for a large-scale urban environment is possible [14, 17]. One concern about model-based tracking systems is the requirement for 3D model acquisition. However, 3D measurements are unavoidable for an augmented reality application: the real scenes need to be modeled to facilitate the placement of the virtual information.
To use model-based tracking systems outdoors, naturally occurring features are a more practical choice than artificial markers. Unlike some existing outdoor tracking systems that use point features [2, 3], our tracking system uses naturally occurring line features. Line features are common in an outdoor urban environment. They generally remain in the view much longer than point features, and they remain useful when partially occluded. Lines can also be localized more accurately than points, since a line has more supporting pixels.
One of the challenges in using model-based vision tracking systems outdoors is matching the natural 3D model features to the detected 2D features (3D-2D matching). Our goal is to use 2D-2D feature matching to reduce the need for this difficult 3D-2D matching task. After a 2D feature is corresponded, it is tracked across the following frames. As long as the feature continues to be tracked in 2D, 3D-2D matching is not necessary. Over a short period, the captured images exhibit great similarity, and this similarity makes 2D-2D matching easier than 3D-2D matching.
Though 2D-2D matching is easier than 3D-2D matching, it becomes more difficult when the inter-frame motion is large. In our tracking system, a high precision gyroscope is used to assist 2D inter-frame feature matching. A gyro can measure the inter-frame rotation accurately over a short period, especially for a large and sudden motion. When rotation dominates the inter-frame motion, which is true when the features are relatively far away, the positions of a 2D feature can be accurately predicted from the inter-frame rotation measured by the gyroscope. By searching only a small neighborhood of the predicted features, the 2D features can be detected quickly and accurately.
The rest of this paper is organized as follows: Section 2 reviews related work. A hybrid 2D-2D line tracking method is described in Section 3. Section 4 discusses line-based 6DOF pose estimation. The results of the experiments are reported in Section 5.
2. Related Work
The Global Positioning System (GPS), electronic compasses, and high-quality gyroscopes are popular choices for tracking systems in outdoor environments [1]. However, each sensor has disadvantages that limit the precision of a tracking system. For example, gyro measurements are accurate over a short period or for a sudden motion, but accumulate drift over a long period. An electronic compass is sensitive to magnetic disturbances. GPS works well for distant objects but only in open areas.
Since each sensor has its advantages and disadvantages, much research has been devoted to hybrid tracking systems. Many hybrid tracking systems contain a vision-based tracking module to achieve high accuracy. In [2], the 2D motion of point features is used to correct the drift of a gyro. This method achieves high accuracy, but it is computationally expensive. In [3], point-like natural landmarks are also used to compensate for gyro drift. The gyro measurement is used to predict the feature locations, which speeds up the feature matching operation.
In [4], an inertial and a vision-based tracking system are integrated. Both systems can measure 6DOF pose. The vision-based system uses natural point features; the 3D and 2D points are matched based on the geometric relationships among the points. The inertial tracker is composed of accelerometers and gyros. The relative changes from the inertial tracker are used to predict the pose for the vision-based tracker.
Compared with these existing hybrid tracking systems, our system uses natural line features rather than points. Lines are prominent in an outdoor environment, especially an urban one; for example, the boundaries of buildings and the edges of windows are line features. To cover the same tracking range, fewer lines than points are necessary, since lines generally remain in the view longer.
Some vision methods [7, 8] use lines for real-time tracking. They also use 2D-2D matching to assist 3D-2D matching. However, all of these methods assume slow inter-frame motions to build 2D-2D correspondences. Our system relaxes this constraint by using a high precision gyro for prediction.
3. Hybrid 2D-2D Line Matching
This section describes how the gyro measurement is used to assist 2D-2D line matching. Suppose a point on the image $I_t$ is represented by $p_t^i = (x_t^i, y_t^i, 1)^T$. If the motion between the two images is a pure rotation $\Delta R$, the corresponding point of $p_t^i$ on the image $I_{t+1}$ is

$$p_{t+1}^i = \begin{pmatrix} x_{t+1}^i \\ y_{t+1}^i \\ 1 \end{pmatrix} = \begin{pmatrix} \dfrac{d_{11} x_t^i + d_{12} y_t^i + d_{13}}{d_{31} x_t^i + d_{32} y_t^i + d_{33}} \\[2ex] \dfrac{d_{21} x_t^i + d_{22} y_t^i + d_{23}}{d_{31} x_t^i + d_{32} y_t^i + d_{33}} \\[2ex] 1 \end{pmatrix}, \qquad (1)$$

where

$$D = \begin{pmatrix} d_{11} & d_{12} & d_{13} \\ d_{21} & d_{22} & d_{23} \\ d_{31} & d_{32} & d_{33} \end{pmatrix} = K \,\Delta R\, K^{-1}$$

and $K$ is the intrinsic matrix for the camera.
When the viewing distance is relatively far compared to the inter-frame translation of the user, the translation can be ignored, and the position of $p_{t+1}^i$ can be approximated by equation (1). This assumption is reasonable for an outdoor augmented reality application.
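To make this concrete, here is a minimal numpy sketch of the pure-rotation prediction of equation (1); the intrinsic matrix and rotation below are hypothetical example values, not parameters from the paper.

```python
import numpy as np

def predict_point(p_t, K, delta_R):
    """Predict a point's position under a pure camera rotation (equation 1).

    p_t     : (x, y) pixel coordinates on image I_t
    K       : 3x3 camera intrinsic matrix
    delta_R : 3x3 inter-frame rotation (e.g., integrated from gyro rates)
    """
    D = K @ delta_R @ np.linalg.inv(K)        # D = K * dR * K^-1
    q = D @ np.array([p_t[0], p_t[1], 1.0])   # homogeneous transform
    return q[:2] / q[2]                       # perspective division

# Hypothetical example values (not from the paper):
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
angle = np.radians(2.0)                       # small rotation about the y axis
dR = np.array([[np.cos(angle), 0.0, np.sin(angle)],
               [0.0, 1.0, 0.0],
               [-np.sin(angle), 0.0, np.cos(angle)]])
print(predict_point((100.0, 120.0), K, dR))
```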
Figure 1 Hybrid line tracking by gyro measurement

As shown in Figure 1, for each point (cyan points) on a detected line $l_t$ on the image $I_t$, the gradient in the direction normal to $l_t$ is computed. If the absolute value of the gradient of a point $p_t^i$ exceeds some threshold $\eta$, the point is predicted on the image $I_{t+1}$ by equation (1). The two predicted end points (red dots) form a predicted line $l_{t+1}$. The relative rotation $\Delta R$ in equation (1) is measured by a gyro. For each predicted point $p_{t+1}^i$ (red or magenta dot), its correspondent $q_{t+1}^i$ (black dot) is searched in a 1D interval $[-J, +J]$ in the direction normal to the predicted line $l_{t+1}$. The position of $q_{t+1}^i$ is given by

$$q_{t+1}^i = p_{t+1}^{j^*}, \qquad j^* = \arg\max_{j \in [-J,\, J]} G_{t+1}^j \quad \text{subject to} \quad \alpha < \frac{G_{t+1}^j}{G_t^i} < \beta \ \text{ and } \ G_{t+1}^j > \eta,$$

where $G_{t+1}^j$ is the gradient in the direction normal to $l_{t+1}$ for $p_{t+1}^j$ on the image $I_{t+1}$, and $G_t^i$ is the gradient in the direction normal to $l_t$ for the point $p_t^i$ on the image $I_t$. $\alpha$ is a constant smaller than 1 while $\beta$ is a constant greater than 1; together they ensure the similarity between the two gradients. The directional gradient is computed with the convolution masks defined in [10].
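A minimal sketch of this 1D normal search, assuming a grayscale numpy image; a simple central difference stands in for the convolution masks of [10], bounds checking is omitted, and the threshold values are illustrative rather than the paper's.

```python
import numpy as np

def search_normal(img, p, normal, G_ref, J=3, alpha=0.5, beta=2.0, eta=10.0):
    """Search the 1D interval [-J, J] along `normal` for the best gradient match.

    img    : 2D grayscale image (float array)
    p      : predicted point (x, y)
    normal : unit vector normal to the predicted line
    G_ref  : reference gradient G_t^i measured on the previous frame
    """
    def normal_gradient(q):
        # Central difference along the normal (stand-in for the masks of [10]).
        a, b = q + np.asarray(normal), q - np.asarray(normal)
        return (img[int(round(a[1])), int(round(a[0]))] -
                img[int(round(b[1])), int(round(b[0]))]) / 2.0

    best, best_G = None, -np.inf
    for j in range(-J, J + 1):
        q = np.asarray(p, dtype=float) + j * np.asarray(normal, dtype=float)
        G = normal_gradient(q)
        # Keep candidates whose gradient is strong and similar to G_ref.
        if abs(G) > eta and alpha < abs(G / G_ref) < beta and abs(G) > best_G:
            best, best_G = q, abs(G)
    return best  # None if no candidate passed the tests
```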
The detected points are fit to a straight line $l_{t+1}$ by the Hough transform. Suppose there are $N$ detected points and $M$ of them are used for the line fitting. If the ratio $r = M/N$ is smaller than a pre-defined threshold, the detected line is discarded, since a small $r$ indicates that there is no line in the small area around the predicted line.
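As an illustration, here is a toy rho-theta Hough accumulator implementing this fit and ratio test; the resolutions and the threshold are our example values, not the paper's.

```python
import numpy as np

def hough_fit(points, n_theta=180, rho_res=1.0, min_ratio=0.7):
    """Fit points to a line with a rho-theta Hough accumulator.

    Returns (a, b, c) with a*x + b*y + c = 0, or None when the vote ratio
    r = M/N falls below `min_ratio` (no line near the prediction).
    """
    pts = np.asarray(points, dtype=float)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    # rho = x cos(theta) + y sin(theta), quantized into bins per theta.
    rhos = pts[:, 0, None] * np.cos(thetas) + pts[:, 1, None] * np.sin(thetas)
    rho_bins = np.round(rhos / rho_res).astype(int)

    best_votes, best = 0, None
    for k, theta in enumerate(thetas):
        bins, counts = np.unique(rho_bins[:, k], return_counts=True)
        i = np.argmax(counts)
        if counts[i] > best_votes:
            best_votes = counts[i]
            best = (np.cos(theta), np.sin(theta), -bins[i] * rho_res)

    # Discard if too few of the N points support the best line (r = M/N).
    if best_votes / len(pts) < min_ratio:
        return None
    return best
```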
Due to the error of the gyro measurement and the ignored translation, the predicted points contain errors. If the predicted points fall outside the search interval around the true values, a line can be lost or wrongly detected. In order to discard outliers and recover lost lines, the detected line pairs are used to compute the global affine transformation between the two images. A point $p_t^i = (x_t^i, y_t^i)^T$ on the line $l_t^j$ is transformed onto the image $I_{t+1}$ by the affine transformation as

$$\begin{pmatrix} x_{t+1}^i \\ y_{t+1}^i \end{pmatrix} = \begin{pmatrix} e_1 & e_2 \\ e_3 & e_4 \end{pmatrix} \begin{pmatrix} x_t^i \\ y_t^i \end{pmatrix} + \begin{pmatrix} e_5 \\ e_6 \end{pmatrix},$$
where $\theta = (e_1, e_2, e_3, e_4, e_5, e_6)$ are the affine transformation parameters. The transformed point should lie on the line $l_{t+1}^j$ and therefore satisfy

$$a_{t+1}^j (e_1 x_t^i + e_2 y_t^i + e_5) + b_{t+1}^j (e_3 x_t^i + e_4 y_t^i + e_6) + c_{t+1}^j = 0,$$

where $(a^j, b^j, c^j)$ are the parameters of the line $l_{t+1}^j$. If there are $n$ lines detected on the image $I_{t+1}$, $\theta$ is estimated by a RANSAC method [16] as follows:
$$\hat{\theta} = \arg\min_{\theta'} \sum_{j=1}^{n} \sum_{i=1}^{2} \left( W^j(p_t^{j,i}) + c_{t+1}^j \right)^2,$$

where $p_t^{j,1}$ and $p_t^{j,2}$ are the two end points of the line $l_t^j$ and

$$W^j(p_t^i) = a_{t+1}^j (e_1 x_t^i + e_2 y_t^i + e_5) + b_{t+1}^j (e_3 x_t^i + e_4 y_t^i + e_6).$$

After the affine transformation $\hat{\theta}$ is estimated, the new predictions are computed with $\hat{\theta}$. For a detected line, if its distance to the affine prediction exceeds some threshold, it is discarded as an outlier. For a lost line or an outlier, the line detection is processed again around the affine prediction.
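A minimal sketch of this RANSAC-style estimation, under our own assumptions about the data layout (each tracked pair provides the two end points of $l_t^j$ plus the parameters $(a, b, c)$ of $l_{t+1}^j$); the sample counts and tolerance are illustrative, not the authors' implementation.

```python
import numpy as np

def affine_row(pt, line):
    """Constraint a(e1 x + e2 y + e5) + b(e3 x + e4 y + e6) + c = 0 as row.theta = -c."""
    (x, y), (a, b, c) = pt, line
    return np.array([a * x, a * y, b * x, b * y, a, b]), -c

def solve_theta(pairs):
    """Least-squares theta = (e1..e6) from (end point, line-parameter) pairs."""
    rows, rhs = zip(*[affine_row(pt, ln) for pt, ln in pairs])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return theta

def ransac_affine(line_pairs, iters=100, tol=1.0, seed=0):
    """line_pairs: list of ((p1, p2), (a, b, c)) per tracked line.

    Each line gives two linear constraints, so three lines determine theta.
    """
    rng = np.random.default_rng(seed)
    constraints = [(p, ln) for (p1, p2), ln in line_pairs for p in (p1, p2)]
    best = []
    for _ in range(iters):
        idx = rng.choice(len(line_pairs), size=3, replace=False)
        sample = [(p, line_pairs[i][1])
                  for i in idx for p in line_pairs[i][0]]
        theta = solve_theta(sample)
        inliers = [(pt, ln) for pt, ln in constraints
                   if abs(affine_row(pt, ln)[0] @ theta - affine_row(pt, ln)[1]) < tol]
        if len(inliers) > len(best):
            best = inliers
    return solve_theta(best)  # final fit on all inlier constraints
```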
4. Pose Estimation
An extended Kalman filter is used to track the 6DOF pose. As in [12, 13], the state vector is defined as $s = (x, y, z, \dot{x}, \dot{y}, \dot{z}, \Delta\theta, \Delta\phi, \Delta\varphi, \dot{\theta}, \dot{\phi}, \dot{\varphi})$, where $(x, y, z)$ represents position and $(\Delta\theta, \Delta\phi, \Delta\varphi)$ are Euler angles representing the relative rotation. The global rotation is stored in an external quaternion $q = (q_w, q_x, q_y, q_z)$.
The extended Kalman filter consists of a prediction and a measurement correction step as follows:

Time update:

$$\hat{s}_{t+1}^- = f(\hat{s}_t), \qquad \hat{P}_{t+1}^- = A_t \hat{P}_t A_t^T + Q$$

Measurement correction:

$$K = \hat{P}_{t+1}^- H^T \left( H \hat{P}_{t+1}^- H^T + R \right)^{-1}$$
$$\hat{u}_{t+1}^- = h(\hat{s}_{t+1}^-)$$
$$\hat{s}_{t+1} = \hat{s}_{t+1}^- + K (u_{t+1} - \hat{u}_{t+1}^-)$$
$$\hat{P}_{t+1} = (I - K H) \hat{P}_{t+1}^-$$

where

$\hat{s}_t$: the state vector estimate for time $t$
$\hat{s}_t^-$: the current prediction for the state vector at time $t$
$\hat{P}_t$: the estimate of the state covariance matrix
$f$: the dynamic model for prediction
$A_t$: the Jacobian matrix of $f$
$Q$: the process noise covariance matrix
$R$: the measurement noise covariance matrix
$u_{t+1}$: the observed measurements
$h$: the measurement function computing the predicted measurement $\hat{u}_{t+1}^-$ based on the current state estimate $\hat{s}_{t+1}^-$
$H$: the Jacobian matrix of $h$
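For concreteness, here is a generic numpy sketch of one predict-correct cycle with these quantities; the dynamic model $f$ and measurement function $h$ are supplied as callables, since the paper's specific models come from [12, 13].

```python
import numpy as np

def ekf_step(s, P, u, f, F_jac, h, H_jac, Q, R):
    """One extended Kalman filter cycle: time update, then measurement correction.

    s, P         : previous state estimate and covariance
    u            : observed measurement vector
    f, h         : dynamic model and measurement function (callables)
    F_jac, H_jac : callables returning the Jacobians of f and h
    Q, R         : process and measurement noise covariances
    """
    # Time update
    s_pred = f(s)
    A = F_jac(s)
    P_pred = A @ P @ A.T + Q

    # Measurement correction
    H = H_jac(s_pred)
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    s_new = s_pred + K @ (u - h(s_pred))
    P_new = (np.eye(len(s)) - K @ H) @ P_pred
    return s_new, P_new
```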
The time update and dynamic model for our tracking system are similar to [12, 13]. For the measurement correction, there are two kinds of measurement: one is the relative rotation measured by the gyro, and the other is the set of detected lines on the video image.

The gyro can directly measure the Euler angles of the inter-frame relative rotation, and the measurement function for the gyro is

$$h_g(\hat{s}_{t+1}^-) = \begin{bmatrix} 0_{3\times3} & 0_{3\times3} & I_{3\times3} & 0_{3\times3} \end{bmatrix} \hat{s}_{t+1}^-,$$

which selects the relative Euler angles from the state vector.
In our tracking system, a 3D model line is represented by its two 3D end points $V_1$ and $V_2$. These two end points are projected onto the image based on the current pose estimate. The two back-projected 2D points define the 2D projection of the 3D model line as

$$(m_x, m_y, m_z)^T = \left( K R_{t+1}^- \big( V_1 - (\hat{x}_{t+1}^-, \hat{y}_{t+1}^-, \hat{z}_{t+1}^-)^T \big) \right) \times \left( K R_{t+1}^- \big( V_2 - (\hat{x}_{t+1}^-, \hat{y}_{t+1}^-, \hat{z}_{t+1}^-)^T \big) \right),$$

where $K$ is the intrinsic matrix for the camera, $R_{t+1}^- = R(\hat{q}_t)\, R_z(\Delta\hat{\varphi}_{t+1}^-)\, R_y(\Delta\hat{\phi}_{t+1}^-)\, R_x(\Delta\hat{\theta}_{t+1}^-)$, and $\hat{q}_t$ is the quaternion representing the global rotation estimate from the last frame. The measurement function for the line measurement is

$$\begin{pmatrix} d_1 \\ d_2 \end{pmatrix} = \begin{pmatrix} \dfrac{m_x x_1 + m_y y_1 + m_z}{\sqrt{m_x^2 + m_y^2}} \\[2ex] \dfrac{m_x x_2 + m_y y_2 + m_z}{\sqrt{m_x^2 + m_y^2}} \end{pmatrix}, \qquad (2)$$

where $(x_1, y_1)^T$ and $(x_2, y_2)^T$ are the two end points of the detected line, and $d_1$ and $d_2$ are the distances from the end points of the detected line to the projected line. Equation (2) is an implicit measurement function, since it uses both the state vector and the measurement as variables. In the measurement correction, $(d_1, d_2)^T$ computed by equation (2) is used as the predicted measurement. A zero vector is used as the observed measurement, because the distance should be zero for the true values of the detected line and the pose. If there are $N$ detected lines, they are used for measurement correction one by one, in a similar way to [13].
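Here is a small sketch of this line measurement, assuming numpy and representing the projected line by the cross product of the two projected homogeneous end points; the function and variable names are ours, not the paper's.

```python
import numpy as np

def line_measurement(V1, V2, K, R, T, p1, p2):
    """Distances d1, d2 from detected-line end points to the projected model line.

    V1, V2  : 3D end points of the model line (world coordinates)
    K, R, T : camera intrinsics, rotation estimate, and position estimate
    p1, p2  : 2D end points (x, y) of the detected image line
    """
    # Projected homogeneous end points; their cross product is the image line m.
    m1 = K @ R @ (np.asarray(V1, dtype=float) - np.asarray(T, dtype=float))
    m2 = K @ R @ (np.asarray(V2, dtype=float) - np.asarray(T, dtype=float))
    mx, my, mz = np.cross(m1, m2)
    norm = np.hypot(mx, my)
    d = lambda p: (mx * p[0] + my * p[1] + mz) / norm  # signed point-line distance
    return d(p1), d(p2)  # the EKF drives these toward zero
```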
Figure 2 Hybrid vision and gyroscope sensor

The gyro measurement correction is a linear process, while the line measurement correction is a non-linear one. The gyro measurement correction is therefore processed first, which provides a good initial value for the non-linear line measurement correction. After the measurement correction step, the relative rotation is integrated into the global orientation, and the corresponding Euler angles in the state vector are reset to zero.
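This integration step can be sketched as composing the global quaternion with the small corrected rotation; the snippet below uses SciPy's rotation helper and assumes an 'xyz' Euler convention, which the paper does not specify.

```python
from scipy.spatial.transform import Rotation

def integrate_rotation(q_global, d_euler):
    """Fold the corrected relative Euler angles into the global quaternion.

    q_global : global rotation as an (x, y, z, w) quaternion
    d_euler  : relative Euler angles from the corrected state vector
    """
    q = Rotation.from_quat(q_global)
    dq = Rotation.from_euler('xyz', d_euler)   # convention is our assumption
    q_new = (q * dq).as_quat()                 # compose global and relative rotation
    d_euler_reset = [0.0, 0.0, 0.0]            # the state's Euler angles are zeroed
    return q_new, d_euler_reset
```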
5. Experiments
As shown in Figure 2, the hybrid sensor module is composed of a CCD video camera (Sony XC-999 with a 6mm lens) and three orthogonal rate gyroscopes (GyroChip II QRS14-500-103, from Systron Donner), which are tightly covered by a foam block to provide shock protection. The video images are captured by a Matrox frame grabber (Meteor II), and the three gyroscopes are sampled via a 16-bit A/D converter (National Instruments DAQPCI-AI-16XE-20). The whole capture and tracking system runs on a Dell Pentium IV 1GHz machine.
A sequence was captured outdoors while the camera viewed two buildings and was rotated and moved around them. The capture process lasted about 80 seconds. Whenever an image was captured, the measurement from the gyro was captured at the same time. To evaluate the performance under different inter-frame motions, the captured sequence was re-sampled at frame rates of 5Hz and 1Hz respectively. The inter-frame rotations measured by the gyro for the two sequences are shown in Figure 3.
5.1. 2D Line Tracking
First we tested the hybrid 2D line tracking on the two
sequences and compared it with a vision-only 2D line
tracking method. The search interval for the line detection
is [-3, +3] in our implementation.
Ten lines were detected on the first frame and were
tracked over the sequence. The ratios between the
correctly detected lines and the visible lines on each
frame are shown in Figure 4. For both sequences, the
vision-only method lost all the lines quickly due to the
large inter-frame motions, while the hybrid method kept
tracking most lines over the whole sequence. Based on
the test, it is clear that the hybrid method can dramatically improve the robustness of 2D line tracking. A main reason for the improvement is the use of a high precision gyro to predict the positions of the detected lines. The prediction errors of the vision-only method and the hybrid method were compared in the experiments. The error is modeled as the average of the distances from the predicted end points to the detected lines. Compared with the vision-only method, the proposed hybrid method reduces the prediction errors greatly.

Figure 3 The Euler angles (roll, yaw, pitch, in degrees) of the relative rotations measured by the gyro: (a) 5Hz sequence; (b) 1Hz sequence

Figure 4 The ratios between the correctly detected lines and the visible lines for the vision-only and hybrid methods: (a) 5Hz sequence; (b) 1Hz sequence

Figure 5 Comparisons of the average of the distances (in pixels) from the predicted end points to the detected lines: (a) 5Hz sequence; (b) 1Hz sequence. Vision-only: the detected lines on the last frame are the predicted lines. Hybrid: the predicted lines are estimated by gyro measurement
Snapshots of the hybrid 2D line tracking are shown in Figure 6. For both the 5Hz and 1Hz sequences, the lines were correctly localized thanks to the prediction from the gyro measurement. In the snapshot for the 1Hz sequence, the gyro predictions for the two vertical lines on the right side fell outside the search interval, and these two lines were not detected based on the gyro predictions. After the affine transform was computed from the other successfully detected lines, they were predicted more accurately by the affine transform and then detected correctly.
5.2. Pose Estimation
This section describes the experiments for 6DOF pose recovery. The 3D polygon model of the buildings was extracted from LiDAR data as in [14]. On the first frame, the user manually matched ten detected lines with their corresponding model lines. Thanks to the hybrid 2D tracking method described in Section 3, 3D-2D matching is not necessary for the following frames.
Figure 6 Snapshots of hybrid 2D line tracking: (a) 5Hz sequence; (b) 1Hz sequence. Blue lines are the detected lines on the last frame. Green lines are the lines predicted by the gyro. Red lines are the detected lines on the current frame
Figure 7 The average of the distances (in pixels) between the end points of the detected lines and the projected lines: (a) 5Hz sequence; (b) 1Hz sequence
To evaluate the performance of the tracking system, the average of the distances from the end points of the detected lines to the back-projected model lines was computed. From the results shown in Figure 7, the proposed tracking system accomplishes high accuracy for both the 5Hz and 1Hz sequences.

The wire-frame model of the buildings was annotated on the video using the estimated poses. As shown in Figure 8, the wire-frame model maintains an appropriate spatial relationship with the real images for both sequences.

Figure 8 Virtual object and real environment overlay for the outdoor augmented reality experiments: (a) 5Hz sequence; (b) 1Hz sequence. Red lines are the detected lines. Green lines are the back-projections of the 3D model lines. A blue wire-frame model of the buildings is annotated on the real video images.

The proposed tracking system runs at between 8Hz and 10Hz on the current hardware.
6. Conclusion

We have described a hybrid line-based tracking system for outdoor augmented reality applications. The proposed system uses a high precision gyro to assist the model-based tracking module. The tracking system makes use of naturally occurring lines to recover 6DOF pose. The experiments show that, by using the high precision gyro for 2D prediction, the 2D lines can be tracked robustly even under large inter-frame motions, and that the proposed hybrid 2D line tracking method reduces the need for 3D-2D matching, which is a difficult task for an outdoor augmented reality application.
7. Acknowledgement

This work is supported by the U.S. Office of Naval Research (ONR). We thank the Integrated Media Systems Center, a National Science Foundation Engineering Research Center, for their support and facilities. We also thank Airborn Inc for providing us with the LiDAR data.
8. References
[1] R. Azuma, J. W. Lee, B. Jiang, J. Park, S. You, and U. Neumann, "Tracking in Unprepared Environments for Augmented Reality Systems", Computers & Graphics, vol. 23, no. 6, December 1999, pp. 787-793.
[2] K. Satoh, M. Anabuki, H. Yamamoto, and H. Tamura, "A Hybrid Registration Method for Outdoor Augmented Reality", Proceedings of the IEEE International Symposium on Augmented Reality, pp. 67-76.
[3] S. You, U. Neumann, and R. Azuma, "Orientation Tracking for Outdoor Augmented Reality Registration", IEEE Computer Graphics and Applications, vol. 19, no. 6, Nov./Dec. 1999, pp. 34-42.
[4] M. Ribo, P. Lang, H. Ganster, M. Brandner, G. Stock, and A. Pinz, "Hybrid Tracking for Outdoor Augmented Reality Applications", IEEE Computer Graphics and Applications, vol. 22, Nov./Dec. 2002, pp. 54-63.
[5] D. Stricker, "Tracking with Reference Images: A Real-Time and Markerless Tracking Solution for Outdoor Augmented Reality Applications", International Symposium on Virtual Reality, Archaeology, and Cultural Heritage (VAST01), Nov. 28-30, 2001.
[6] I. Stamos and P. Allen, "Automatic registration of 2-D with 3-D imagery in urban environments".
[7] E. Marchand, P. Bouthemy, F. Chaumette, and V. Moreau, "Robust Real-Time Visual Tracking Using a 2D-3D Model-Based Approach", IEEE International Conference on Computer Vision, ICCV'99, vol. 1, pp. 262-268, Kerkyra, Greece, September 1999.
[8] D. Lowe, “Robust model-based motion tracking
through the integration of search and estimation”,
International Journal of Computer Vision, 8(2):113-122,
1992.
[9] K. Chia, A. Cheok, S. Prince, “Online 6DOF
Augmented Reality Registration from Natural Features”,
IEEE and ACM International Symposium on Mixed and
Augmented Reality 2002.
[10] P. Bouthemy, “A Maximum Likelihood Framework
for Determining Moving Edges”, IEEE Transactions on
Pattern Analysis and Machine Intelligence, Vol. 11, No.
5, May 1989.
[11] U. Neumann and Y. Cho, "A Self-Tracking Augmented Reality System", Proceedings of ACM Virtual Reality Software and Technology '96, pp. 109-115.
[12] G. Welch, G. Bishop, “SCAAT: Incremental
Tracking with Incomplete Information”, Proceedings of
SIGGRAPH 97, Computer Graphics, pp. 333-344.
[13] J. Park, B. Jiang, and U. Neumann, "Vision-based Pose Computation: Robust and Accurate Augmented Reality Tracking", Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality '99, pp. 3-12.
[14] S. You, J. Hu, U. Neumann, and P. Fox, "Urban Site Modeling From LiDAR", Second International Workshop on Computer Graphics and Geometric Modeling CGGM'2003 (to appear), Montreal, Canada, May 2003.
[15] E. Foxlin, and L. Naimark, “VIS-Tracker: A
Wearable Vision-Inertial Self-Tracker”, IEEE Virtual
Reality 2003 (VR2003), March 22-26, 2003, Los
Angeles, CA.
[16] M. A. Fischler and R. C. Bolles, “Random Sample
Consensus: A Paradigm for Model Fitting with
Applications to Image Analysis and Automated
Cartography”, Communications of the ACM, Vol. 24,
No. 6, June 1981, pp. 381-395.
[17] C. Früh and A. Zakhor, "3D Model Generation for Cities Using Aerial Photographs and Ground Level Laser Scans", IEEE Conference on Computer Vision and Pattern Recognition, 2001.