Topic 10  Basis and Dimension

LEARNING OUTCOMES

By the end of this topic, you should be able to:
1. Find the smallest span set;
2. Explain the concept of a basis, which involves the span set and linear independence; and
3. Determine the dimension of a vector space.

INTRODUCTION

In this topic, we will look at the concepts of basis and dimension for a vector space. Before that, we first need to look at the linear independence of a set of vectors in a vector space.

In the previous topics, it was shown that a vector space $V$ is spanned by the set $S = \{v_1, v_2, \ldots, v_n\}$ if every vector in $V$ can be formed as a linear combination of $v_1, v_2, \ldots, v_n$. The span set is important because some of the properties of the entire vector space can be deduced from the properties of the span set. It is therefore important to obtain the smallest span set $S$. This will be discussed further in this topic.
10.1 LINEAR INDEPENDENCE

Skill: Define linear independence and linear dependence of a set of vectors. Interpret geometrically the meaning of linear independence in $\mathbb{R}^2$ and $\mathbb{R}^3$.

In this section, we will discuss the concept of linear independence of a set of vectors in a vector space.

Example 10.1
The set $\{v_1, v_2, v_3\}$ with $v_1 = (1, -2, 5)$, $v_2 = (2, 1, 0)$ and $v_3 = (7, 1, 5)$ is linearly dependent because $v_1 + 3v_2 - v_3 = 0$.

Definition:
- The set $S = \{v_1, v_2, \ldots, v_n\}$ is said to be linearly independent if
  $$\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n = 0$$
  implies that $\alpha_1 = 0, \alpha_2 = 0, \ldots, \alpha_n = 0$.
- If there exist scalars $\alpha_1, \alpha_2, \ldots, \alpha_n$, not all zero, such that
  $$\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n = 0,$$
  then the set $S$ is said to be linearly dependent.
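We can check Example 10.1 numerically. The NumPy sketch below is an addition of ours (not part of the original module): linear dependence shows up as the matrix whose columns are the given vectors having rank smaller than the number of vectors.

```python
import numpy as np

# Columns are v1, v2, v3 of Example 10.1.
V = np.array([[ 1, 2, 7],
              [-2, 1, 1],
              [ 5, 0, 5]], dtype=float)

print(np.linalg.matrix_rank(V))   # 2 < 3, so the set is linearly dependent
print(V @ np.array([1, 3, -1]))   # v1 + 3*v2 - v3 = [0. 0. 0.]
```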
Example 10.2
Show that the set $S = \{1 + x,\ 3x + x^2,\ 2 + x - x^2\}$ is linearly independent in the vector space $P_3$.

Solution
Step 1: Equate the linear combination to zero.
Assume
$$\alpha_1(1 + x) + \alpha_2(3x + x^2) + \alpha_3(2 + x - x^2) = 0.$$
Then
$$(\alpha_1 + 2\alpha_3) + (\alpha_1 + 3\alpha_2 + \alpha_3)x + (\alpha_2 - \alpha_3)x^2 = 0,$$
giving
$$\alpha_1 + 2\alpha_3 = 0$$
$$\alpha_1 + 3\alpha_2 + \alpha_3 = 0$$
$$\alpha_2 - \alpha_3 = 0$$
This implies that $\alpha_1 = \alpha_2 = \alpha_3 = 0$. Thus, the set $S$ is linearly independent.
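The coefficient comparison in Step 1 can also be automated. The SymPy sketch below is our own addition (not part of the module): collect the combination in powers of $x$ and solve the resulting homogeneous system.

```python
import sympy as sp

x, a1, a2, a3 = sp.symbols('x a1 a2 a3')

# Linear combination of the polynomials in Example 10.2.
p = a1*(1 + x) + a2*(3*x + x**2) + a3*(2 + x - x**2)

# Every coefficient of p, viewed as a polynomial in x, must vanish.
eqs = [sp.Eq(c, 0) for c in sp.Poly(p, x).all_coeffs()]
print(sp.solve(eqs, [a1, a2, a3]))   # {a1: 0, a2: 0, a3: 0} -> linearly independent
```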
Example 10.3
Show that the set of vectors
$$i = (1, 0, 0), \quad j = (0, 1, 0) \quad \text{and} \quad k = (0, 0, 1)$$
in the vector space $\mathbb{R}^3$ is linearly independent.

Solution
Step 1: Equate the linear combination to zero.
$$\alpha_1 i + \alpha_2 j + \alpha_3 k = 0$$
becomes
$$\alpha_1(1, 0, 0) + \alpha_2(0, 1, 0) + \alpha_3(0, 0, 1) = (0, 0, 0),$$
giving
$$(\alpha_1, \alpha_2, \alpha_3) = (0, 0, 0).$$
Thus, $\alpha_1 = 0$, $\alpha_2 = 0$, $\alpha_3 = 0$.

Conclusion: The set $\{i, j, k\}$ is therefore linearly independent in $\mathbb{R}^3$.
Example 10.4
Show that the set $\{\sin x,\ x,\ \cos x\}$ is linearly independent in the vector space $C[0, 2\pi]$.

Solution
Step 1: Equate the linear combination to zero.
Assume
$$\alpha_1 \sin x + \alpha_2 x + \alpha_3 \cos x = 0. \qquad (10.1)$$
(Note that the zero vector on the right-hand side of (10.1) is the zero function in $C[0, 2\pi]$, that is, $0(x) = 0$ for all $x \in [0, 2\pi]$.)

Step 2: Assign several values to $x$ and calculate the $\alpha_i$.
Equation (10.1) is true for all $x$ in the interval $[0, 2\pi]$.
- Let $x = 0$; then $\alpha_3 = 0$. Thus $\alpha_1 \sin x + \alpha_2 x = 0$.
- Let $x = \pi$; then $\alpha_2 = 0$. Thus $\alpha_1 \sin x = 0$.
- Let $x = \pi/2$; then $\alpha_1 = 0$. Thus the set $\{\sin x,\ x,\ \cos x\}$ is linearly independent.

10.1.1 Geometrical Interpretation of Linear Dependence in $\mathbb{R}^2$

What is the geometrical interpretation of linear independence in $\mathbb{R}^2$, that is, of the case where the scalars $\alpha_1$ and $\alpha_2$ must both be zero?
Assume $x_1$ and $x_2$ are linearly dependent. Then there exist $\alpha_1$ and $\alpha_2$, not both zero, such that
$$\alpha_1 x_1 + \alpha_2 x_2 = 0.$$
- If $\alpha_1 \neq 0$, then $x_1 = -\dfrac{\alpha_2}{\alpha_1} x_2$.
- If $\alpha_2 \neq 0$, then $x_2 = -\dfrac{\alpha_1}{\alpha_2} x_1$.
Thus, one of the vectors is a multiple of the other.

On the other hand, assume $x_1 = \alpha x_2$. Then $x_1 - \alpha x_2 = 0$. Because the coefficients of $x_1$ and $x_2$ are not both zero, $x_1$ and $x_2$ are linearly dependent.

Thus, $x_1$ and $x_2$ are linearly dependent in $\mathbb{R}^2$ if and only if one of the vectors is a multiple of the other.

Geometrically: Two vectors in $\mathbb{R}^2$ are linearly dependent if and only if both lie on the same line through the origin. (See Figure 10.1(a).)

Figure 10.1: (a) Linear dependence in $\mathbb{R}^2$; (b) linear independence in $\mathbb{R}^2$
10.1.2 Geometrical Interpretation of Linear Independence in $\mathbb{R}^3$

Assume $x_1$, $x_2$ and $x_3$ are linearly dependent in $\mathbb{R}^3$. Then there exist $\alpha_1$, $\alpha_2$ and $\alpha_3$, not all zero, such that
$$\alpha_1 x_1 + \alpha_2 x_2 + \alpha_3 x_3 = 0.$$
Assume $\alpha_1 \neq 0$; then
$$x_1 = -\frac{\alpha_2}{\alpha_1} x_2 - \frac{\alpha_3}{\alpha_1} x_3.$$
This means $x_1$ is an element of the subspace $W$ spanned by $x_2$ and $x_3$, that is, $x_1 \in W = \operatorname{span}(x_2, x_3)$.

The subspace $W$ can be:
- a plane through the origin (when $x_2$ and $x_3$ are linearly independent), or
- a line through the origin (when $x_2$ and $x_3$ are linearly dependent), or
- the origin (when $x_2 = x_3 = 0$).

Because every line through the origin lies in some plane through the origin, it can be concluded that $x_1$, $x_2$ and $x_3$ must all lie on the same plane through the origin.

On the other hand, assume $x_1$, $x_2$, $x_3$ all lie on the same plane through the origin. Then it can be that:
- all three vectors are the zero vector, or
- all three lie on the same line, or
- all three lie on the same plane spanned by two of the vectors, say $x_2$ and $x_3$.
Therefore, in all cases, $x_1$ is a linear combination of $x_2$ and $x_3$, that is,
$$x_1 = \alpha_2 x_2 + \alpha_3 x_3.$$
Then
$$1 \cdot x_1 - \alpha_2 x_2 - \alpha_3 x_3 = 0.$$
This means that the set $\{x_1, x_2, x_3\}$ is linearly dependent. Refer to Figure 10.2 for an illustration.

Geometrically: Three vectors in $\mathbb{R}^3$ are linearly dependent if and only if they all lie on the same plane through the origin.

Figure 10.2: (a) Linear dependence in $\mathbb{R}^3$ ($x_1 = \alpha_2 x_2 + \alpha_3 x_3$); (b) linear independence in $\mathbb{R}^3$

10.1.3 Linear Independence Theorem

Consider the following three vectors in $\mathbb{R}^3$:
$$x = (-1, -1, 2), \quad y = (2, 3, 1) \quad \text{and} \quad z = (1, 3, 8).$$
Let $S$ be the subspace generated by $x$, $y$ and $z$, that is,
$$S = \operatorname{span}\{x, y, z\}.$$
It can be shown that
$$z = 3x + 2y, \quad \text{or} \quad 3x + 2y - z = 0.$$
We will show that
$$S = \operatorname{span}\{x, y, z\} = \operatorname{span}\{x, y\}.$$
It is obvious that
$$\operatorname{span}\{x, y\} \subseteq \operatorname{span}\{x, y, z\}. \qquad (10.2)$$
It remains to be shown that $\operatorname{span}\{x, y, z\} \subseteq \operatorname{span}\{x, y\}$.
For this, let $v \in \operatorname{span}\{x, y, z\}$. Then, for scalars $\alpha$, $\beta$ and $\gamma$,
$$v = \alpha x + \beta y + \gamma z = \alpha x + \beta y + \gamma(3x + 2y) = (\alpha + 3\gamma)x + (\beta + 2\gamma)y.$$
This means that $v \in \operatorname{span}\{x, y\}$. Therefore
$$\operatorname{span}\{x, y, z\} \subseteq \operatorname{span}\{x, y\}. \qquad (10.3)$$
From the results in (10.2) and (10.3),
$$S = \operatorname{span}\{x, y, z\} = \operatorname{span}\{x, y\}.$$
From the previous discussion, since the set $\{x, y, z\}$ is linearly dependent, the subspace $S$ can be represented as the span of just two of the given vectors.

It can be shown that any two of the vectors $x$, $y$, $z$ form a linearly independent set. Thus, it is not possible that $S = \operatorname{span}\{x\}$ or $\operatorname{span}\{y\}$ or $\operatorname{span}\{z\}$: each of $\operatorname{span}\{x\}$, $\operatorname{span}\{y\}$ and $\operatorname{span}\{z\}$ is a proper subset of $S$.

The above discussion is a special case of the following theorem.
Theorem 10.1.
(a) Assume that the set $\{x_1, x_2, \ldots, x_n\}$ spans the vector space $V$ and one of the vectors is a linear combination of the other $n - 1$ vectors. Then the set of those $n - 1$ vectors spans $V$.
(b) Let $S = \{x_1, x_2, \ldots, x_n\}$ be a set of vectors in a vector space $V$. One of the vectors is a linear combination of the other $n - 1$ vectors if and only if $S$ is linearly dependent.

Proof:
(a)
Step 1: Assume the vector $x_1$ can be expressed as a linear combination of $x_2, x_3, \ldots, x_n$. Then
$$x_1 = \alpha_2 x_2 + \alpha_3 x_3 + \cdots + \alpha_n x_n.$$
Step 2: Let $x$ be any vector in $V$.
Because $\{x_1, x_2, \ldots, x_n\}$ spans $V$,
$$x = \beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_n x_n$$
$$= \beta_1(\alpha_2 x_2 + \alpha_3 x_3 + \cdots + \alpha_n x_n) + \beta_2 x_2 + \cdots + \beta_n x_n$$
$$= (\beta_2 + \alpha_2\beta_1)x_2 + (\beta_3 + \alpha_3\beta_1)x_3 + \cdots + (\beta_n + \alpha_n\beta_1)x_n.$$
Therefore, any vector $x$ in $V$ can be written as a linear combination of the vectors $x_2, x_3, \ldots, x_n$. This means that the set of $n - 1$ vectors $\{x_2, x_3, \ldots, x_n\}$ spans $V$.

(b)
Step 1: Assume $x_1$ is a linear combination of $x_2, x_3, \ldots, x_n$. Then
$$x_1 = \alpha_2 x_2 + \alpha_3 x_3 + \cdots + \alpha_n x_n$$
or
$$x_1 - \alpha_2 x_2 - \alpha_3 x_3 - \cdots - \alpha_n x_n = 0.$$
Thus, the set $\{x_1, x_2, \ldots, x_n\}$ is linearly dependent because the coefficient of $x_1$ is non-zero.
Step 2: Assume the set is linearly dependent. That is, assume
$$\alpha_1 x_1 + \alpha_2 x_2 + \cdots + \alpha_n x_n = 0$$
and at least one of the $\alpha_i$, say $\alpha_1$, is non-zero. Then
$$x_1 = -\frac{\alpha_2}{\alpha_1} x_2 - \frac{\alpha_3}{\alpha_1} x_3 - \cdots - \frac{\alpha_n}{\alpha_1} x_n.$$

Note: It can be shown that if $S = \{x_1, x_2, \ldots, x_n\}$ is a set of non-zero vectors in $V$, then $S$ is linearly dependent if and only if some $x_j$ is a linear combination of the preceding vectors in $S$. In other words,
$$S \text{ is linearly dependent} \iff x_j = \alpha_1 x_1 + \alpha_2 x_2 + \cdots + \alpha_{j-1} x_{j-1} \text{ for some } j.$$
From the above theorem, we obtain the following results.

Assume that $\{x_1, x_2, \ldots, x_n\}$ spans the vector space $V$.
- The set $\{x_1, x_2, \ldots, x_n\}$ is linearly dependent $\iff$ some $n - 1$ vectors from $\{x_1, x_2, \ldots, x_n\}$ span $V$ $\iff$ $\{x_1, x_2, \ldots, x_n\}$ is not the minimum span set.
- If $\{x_1, x_2, \ldots, x_n\}$ is the minimum span set, then $\{x_1, x_2, \ldots, x_n\}$ is linearly independent.

The converse of the last statement is also true. This is proven in the following theorem.

Theorem 10.2. Assume that $S = \{x_1, x_2, \ldots, x_n\}$ spans $V$. If $S$ is linearly independent, then $S$ is the minimum span set.

Proof (by contradiction):
Step 1: Assume the statement is false.
This means that
- $\{x_1, x_2, \ldots, x_n\}$ is linearly independent, and
- $\{x_1, x_2, \ldots, x_n\}$ is not the minimum span set.
Because the set $\{x_1, x_2, \ldots, x_n\}$ is not the minimum span set, there must exist a proper subset, say $\{x_2, x_3, \ldots, x_n\}$, that spans $V$.
Step 2: Show that the set is linearly dependent.
We know that $x_1 \in V$. Thus
$$x_1 = \alpha_2 x_2 + \alpha_3 x_3 + \cdots + \alpha_n x_n$$
with $\alpha_2, \alpha_3, \ldots, \alpha_n \in \mathbb{R}$. Therefore
$$x_1 - \alpha_2 x_2 - \alpha_3 x_3 - \cdots - \alpha_n x_n = 0.$$
This shows that the set $\{x_1, x_2, \ldots, x_n\}$ is linearly dependent, contradicting the assumption that it is linearly independent. Therefore, the original statement must be true.

The minimum span set is called a basis. This concept will be explored further in the following section.
Example 10.5
Determine whether the set $\{(2, 1, 2),\ (3, 2, 2),\ (2, 2, 0)\}$ is linearly dependent in $\mathbb{R}^3$.

Solution
Step 1: Equate the linear combination to zero.
Assume
$$\alpha_1(2, 1, 2) + \alpha_2(3, 2, 2) + \alpha_3(2, 2, 0) = (0, 0, 0).$$
Step 2: Equate corresponding components.
By equating the corresponding components, we obtain the following homogeneous system of equations:
$$2\alpha_1 + 3\alpha_2 + 2\alpha_3 = 0$$
$$\alpha_1 + 2\alpha_2 + 2\alpha_3 = 0$$
$$2\alpha_1 + 2\alpha_2 = 0$$

The following theorem gives a general condition for linear dependence of $n$ vectors in $\mathbb{R}^n$.

Theorem 10.3. Let $x_1, x_2, \ldots, x_n$ be $n$ vectors in $\mathbb{R}^n$ with $x_i = (x_{1i}, x_{2i}, \ldots, x_{ni})^T$ for $i = 1, 2, \ldots, n$, and let $X = (x_{ij})$ be the matrix with the $x_i$ forming the columns of $X$. Then $x_1, x_2, \ldots, x_n$ are linearly dependent if and only if $X$ is singular.

Conversely, $x_1, \ldots, x_n$ are linearly independent if and only if $X$ is non-singular.

Step 3: Find the determinant of the matrix.
The determinant of the coefficient matrix is
$$\begin{vmatrix} 2 & 3 & 2 \\ 1 & 2 & 2 \\ 2 & 2 & 0 \end{vmatrix} = 2(2 - 4) - 2(4 - 6) + 0 = 0.$$

Conclusion: The matrix is singular. The system therefore possesses non-trivial solutions. Thus, the set above is linearly dependent.
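The test in Theorem 10.3 is easy to carry out numerically. The NumPy sketch below is our own addition (not part of the module) and applies it to Example 10.5.

```python
import numpy as np

# The vectors of Example 10.5 as the columns of X (Theorem 10.3).
X = np.array([[2, 3, 2],
              [1, 2, 2],
              [2, 2, 0]], dtype=float)

print(np.linalg.det(X))           # approximately 0 -> X is singular
print(np.linalg.matrix_rank(X))   # 2 -> the three vectors are linearly dependent
```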
The determinant can also be used to show that a set $\{f_1, f_2, \ldots, f_n\}$ is linearly independent in the vector space $C^{(n-1)}[a, b]$. This is demonstrated in the following theorem.

Theorem 10.4. Let $f_1, f_2, \ldots, f_n$ be $n$ vectors in $C^{(n-1)}[a, b]$. If there exists one point $x_0$ in $[a, b]$ such that
$$W[f_1, f_2, \ldots, f_n](x_0) \neq 0,$$
then the set $\{f_1, f_2, \ldots, f_n\}$ is linearly independent.

In the above theorem, the function $W[f_1, f_2, \ldots, f_n]$ is called the Wronskian of $f_1, f_2, \ldots, f_n$ and its value at the point $x$ is
$$W[f_1, f_2, \ldots, f_n](x) =
\begin{vmatrix}
f_1(x) & f_2(x) & \cdots & f_n(x) \\
f_1'(x) & f_2'(x) & \cdots & f_n'(x) \\
\vdots & \vdots & & \vdots \\
f_1^{(n-1)}(x) & f_2^{(n-1)}(x) & \cdots & f_n^{(n-1)}(x)
\end{vmatrix}.$$

Example 10.6
Consider
$$i = (1, 0, 0), \quad j = (0, 1, 0) \quad \text{and} \quad k = (0, 0, 1).$$
In Example 10.3, we showed that the set $S = \{i, j, k\}$ is linearly independent in $\mathbb{R}^3$. The linear independence of the set can also be shown by using the determinant (Theorem 10.3), that is,
$$\begin{vmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{vmatrix} = 1 \neq 0.$$
Proof:
Step 1: Use the contrapositive.
Assume that $f_1, f_2, \ldots, f_n$ are linearly dependent. Then there exist $\alpha_1, \alpha_2, \ldots, \alpha_n$, not all zero, such that
$$\alpha_1 f_1(x) + \alpha_2 f_2(x) + \cdots + \alpha_n f_n(x) = 0, \quad x \in [a, b].$$
Step 2: Take derivatives up to order $n - 1$. We obtain
$$\alpha_1 f_1'(x) + \alpha_2 f_2'(x) + \cdots + \alpha_n f_n'(x) = 0$$
$$\vdots$$
$$\alpha_1 f_1^{(n-1)}(x) + \alpha_2 f_2^{(n-1)}(x) + \cdots + \alpha_n f_n^{(n-1)}(x) = 0$$
Step 3: Write the equations in Wronskian form.
This means that for each $x \in [a, b]$, the system of equations
$$\begin{pmatrix}
f_1(x) & f_2(x) & \cdots & f_n(x) \\
f_1'(x) & f_2'(x) & \cdots & f_n'(x) \\
\vdots & \vdots & & \vdots \\
f_1^{(n-1)}(x) & f_2^{(n-1)}(x) & \cdots & f_n^{(n-1)}(x)
\end{pmatrix}
\begin{pmatrix} \alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_n \end{pmatrix}
=
\begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix}$$
possesses a non-trivial solution $(\alpha_1, \alpha_2, \ldots, \alpha_n)^T$.

Conclusion: Thus, if the set $\{f_1, f_2, \ldots, f_n\}$ is linearly dependent in $C^{(n-1)}[a, b]$, then for each $x \in [a, b]$ the coefficient matrix above is singular, that is,
$$W[f_1, f_2, \ldots, f_n](x) = 0 \quad \text{for each } x \in [a, b].$$
The converse of the above theorem, that is, if $f_1, f_2, \ldots, f_n$ is linearly independent then $W[f_1, f_2, \ldots, f_n](x_0) \neq 0$ for some $x_0 \in [a, b]$, is not true.

Example 10.7
Show that the set $\{e^x,\ e^{-x},\ e^{2x}\}$ is linearly independent in $C[-1, 1]$.

Solution
Step 1: Find the Wronskian of this set.
$$W[e^x, e^{-x}, e^{2x}] =
\begin{vmatrix}
e^x & e^{-x} & e^{2x} \\
e^x & -e^{-x} & 2e^{2x} \\
e^x & e^{-x} & 4e^{2x}
\end{vmatrix}
= e^x e^{-x} e^{2x}
\begin{vmatrix}
1 & 1 & 1 \\
1 & -1 & 2 \\
1 & 1 & 4
\end{vmatrix}
= -6e^{2x} \neq 0.$$
Since the Wronskian is non-zero, the set is linearly independent.
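The Wronskian in Example 10.7 can be reproduced symbolically; the SymPy call below is our own addition and not part of the module.

```python
import sympy as sp

x = sp.symbols('x')
W = sp.wronskian([sp.exp(x), sp.exp(-x), sp.exp(2*x)], x)
print(sp.simplify(W))   # -6*exp(2*x), which is never zero -> linearly independent
```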
That the converse fails can be seen in the following example.
Example 10.8
Show that the set $\{x^2,\ x\lvert x\rvert\}$ is linearly independent in the vector space $C[-1, 1]$, but its Wronskian satisfies $W[x^2, x\lvert x\rvert](x) = 0$.

Solution
Step 1: Equate the linear combination to zero.
Assume
$$\alpha_1 x^2 + \alpha_2 x\lvert x\rvert = 0, \quad x \in [-1, 1].$$
Step 2: Find the values of the scalars.
For $x = 1$ and $x = -1$ we obtain
$$\alpha_1 + \alpha_2 = 0 \quad \text{and} \quad \alpha_1 - \alpha_2 = 0.$$
The solution is $\alpha_1 = \alpha_2 = 0$ and therefore the set $\{x^2,\ x\lvert x\rvert\}$ is linearly independent. However,
$$W[x^2, x\lvert x\rvert](x) =
\begin{vmatrix}
x^2 & x\lvert x\rvert \\
2x & 2\lvert x\rvert
\end{vmatrix}
= 2x^2\lvert x\rvert - 2x^2\lvert x\rvert = 0.$$
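A numeric sketch of both halves of Example 10.8 (the sample points and helper functions below are our own choices, not the module's): the Wronskian vanishes at every sample point, yet evaluating the combination at $x = 1$ and $x = -1$ leaves only the trivial solution.

```python
import numpy as np

f1 = lambda t: t**2
f2 = lambda t: t * np.abs(t)

# Wronskian W(t) = f1*f2' - f2*f1', with f1'(t) = 2t and f2'(t) = 2|t|.
w = lambda t: f1(t) * 2*np.abs(t) - f2(t) * 2*t
print([w(t) for t in (-0.7, -0.2, 0.5, 1.0)])   # all zero

# Rows: the equation alpha1*f1(t) + alpha2*f2(t) = 0 evaluated at t = 1 and t = -1.
M = np.array([[f1(1.0),  f2(1.0)],
              [f1(-1.0), f2(-1.0)]])
print(np.linalg.matrix_rank(M))                 # 2 -> only the trivial solution
```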
EXERCISE 10.1
1. Determine whether the following sets are linearly dependent or otherwise in the given vector space.
   (a) $A = \{(1, 2, 5),\ (2, 5, 1),\ (1, 5, 2)\}$ in $\mathbb{R}^3$.
   (b) $C = \{f, g, h\}$ in $C(\mathbb{R})$ (that is, the vector space of all continuous functions $f: \mathbb{R} \to \mathbb{R}$) with $f(t) = e^t$, $g(t) = \sin t$, $h(t) = t^2$.
   (c) $D = \{\sin^2 x,\ \cos^2 x\}$ in $C[0, 2\pi]$.
   (d) $F = \{1,\ 1 + x,\ 1 + x + x^2\}$ in $P_3$.
2. Let $V$ be a vector space and assume that $S = \{v_1, v_2, \ldots, v_n\}$ is linearly independent. Show that $S' = \{v_1, v_2, \ldots, v_{n-1}\}$ does not span $V$.

10.2 BASIS

Skill: Understand basis and dimension. Prove theorems about basis and dimension.

It is usual to think of a line as having one dimension, a plane two dimensions and the space around us three dimensions. This concept of the dimension of $n$-space will be discussed in this section.

Definition: Let $V$ be any vector space. A finite subset $S$ of $V$ is called a basis of $V$ if
(a) $S$ is linearly independent; and
(b) $S$ spans $V$.
Note: If $S$ is a basis, then all vectors in $S$ are distinct and non-zero, as explained below.
- If two vectors in $S$ were equal, say $v_1 = v_2$, then
  $$1 \cdot v_1 - 1 \cdot v_2 + 0 \cdot v_3 + \cdots + 0 \cdot v_n = 0.$$
  This means that $S = \{v_1, v_2, \ldots, v_n\}$ is linearly dependent, a contradiction because $S$ is a basis.
- If one of the vectors in $S$ were zero, say $v_1 = 0$, then
  $$1 \cdot v_1 + 0 \cdot v_2 + \cdots + 0 \cdot v_n = 0.$$
  This means that $S$ is linearly dependent, a contradiction because $S$ is a basis.

Example 10.9
Assume $i = (1, 0, 0)$, $j = (0, 1, 0)$ and $k = (0, 0, 1)$. In Example 10.3, it was shown that the set $S = \{i, j, k\}$ is linearly independent in $\mathbb{R}^3$. Since any vector $v = (a, b, c)$ in $\mathbb{R}^3$ can be written as $v = ai + bj + ck$, $S$ spans $\mathbb{R}^3$. Hence, $S$ is a basis of $\mathbb{R}^3$. This basis is called the standard basis of $\mathbb{R}^3$.

Similarly, the set $\{e_1, e_2, \ldots, e_n\}$, with
$$e_1 = (1, 0, \ldots, 0), \quad e_2 = (0, 1, 0, \ldots, 0), \quad \ldots, \quad e_n = (0, \ldots, 0, 1),$$
is called the standard basis of $\mathbb{R}^n$.

It can be shown that the set $\{(1, 0, 0),\ (1, 1, 0),\ (1, 1, 1)\}$ is also a basis of $\mathbb{R}^3$. This means that the basis of a vector space is not unique.
Example 10.10
Show that $\{1, x, \ldots, x^{n-1}\}$ is a basis of $P_n$.

Solution
Step 1: Determine whether the set spans $P_n$.
Let
$$p(x) = a_0 + a_1 x + \cdots + a_{n-1} x^{n-1}$$
be any polynomial in $P_n$. It is clear that $p(x)$ is a linear combination of the elements of the set $\{1, x, \ldots, x^{n-1}\}$. Thus, $\operatorname{span}\{1, x, \ldots, x^{n-1}\} = P_n$.
Step 2: Determine whether the set is linearly independent.
Assume
$$c_0 + c_1 x + \cdots + c_{n-1} x^{n-1} = 0.$$
Then $c_0 = c_1 = \cdots = c_{n-1} = 0$. Thus, $\{1, x, \ldots, x^{n-1}\}$ is linearly independent and therefore $\{1, x, \ldots, x^{n-1}\}$ is a basis of $P_n$.

ACTIVITY 10.1
Show that the set $\{(1, 0, 0),\ (1, 1, 0),\ (1, 1, 1)\}$ is also a basis of $\mathbb{R}^3$.
Example 10.11
Assume
$$M_{11} = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \quad
M_{12} = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \quad
M_{21} = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix} \quad \text{and} \quad
M_{22} = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}.$$
Show that the set $S = \{M_{11}, M_{12}, M_{21}, M_{22}\}$ is a basis of $M_{2,2}(\mathbb{R})$.

Solution
Step 1: Show that $S$ is linearly independent.
Assume
$$aM_{11} + bM_{12} + cM_{21} + dM_{22} = 0,$$
that is,
$$a\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} + b\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} + c\begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix} + d\begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}.$$
Then
$$\begin{pmatrix} a & b \\ c & d \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix},$$
implying that $a = b = c = d = 0$. Thus $S$ is linearly independent.
Step 2: Show that $S$ spans $M_{2,2}(\mathbb{R})$.
Assume
$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$$
is any vector in $M_{2,2}(\mathbb{R})$. Note that
$$A = aM_{11} + bM_{12} + cM_{21} + dM_{22}.$$
Thus, $S$ spans $M_{2,2}(\mathbb{R})$ and hence $S$ is a basis of $M_{2,2}(\mathbb{R})$.
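If each $2 \times 2$ matrix is identified with its row-wise flattening in $\mathbb{R}^4$ (an identification we are assuming here; the module does not make it explicitly), the two steps of Example 10.11 become a rank check and a linear solve.

```python
import numpy as np

# Rows are M11, M12, M21, M22 flattened row-wise; together they are the 4x4 identity.
M11, M12, M21, M22 = np.eye(4)
B = np.vstack([M11, M12, M21, M22])

print(np.linalg.matrix_rank(B))            # 4 -> S is linearly independent

A = np.array([[5., -2.], [3., 7.]])        # an arbitrary 2x2 matrix (our example)
print(np.linalg.solve(B.T, A.flatten()))   # [ 5. -2.  3.  7.] -> A = 5*M11 - 2*M12 + 3*M21 + 7*M22
```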
Theorem 10.5. Let $S = \{v_1, v_2, \ldots, v_n\}$ be a basis of a vector space $V$. Then any set of $m$ vectors in $V$ with $m > n$ is linearly dependent.

Proof:
Assume $S' = \{u_1, u_2, \ldots, u_m\}$ is a set of $m$ vectors in $V$ with $m > n$. It will be shown that $S'$ is linearly dependent.
Step 1: Express the vectors of $S'$ as linear combinations of vectors in $S$.
Since $S = \{v_1, v_2, \ldots, v_n\}$ is a basis of $V$, each vector in $V$ can be expressed as a linear combination of vectors in $S$. Thus
$$u_i = a_{i1} v_1 + a_{i2} v_2 + \cdots + a_{in} v_n$$
for $i = 1, 2, \ldots, m$.
Step 2: Consider scalars $c_1, c_2, \ldots, c_m$.
A linear combination of vectors in $S'$ can be written as
$$c_1 u_1 + c_2 u_2 + \cdots + c_m u_m
= c_1 \sum_{j=1}^{n} a_{1j} v_j + \cdots + c_m \sum_{j=1}^{n} a_{mj} v_j
= \sum_{j=1}^{n} \left( \sum_{i=1}^{m} a_{ij} c_i \right) v_j.$$
Now, assume
$$c_1 u_1 + c_2 u_2 + \cdots + c_m u_m = 0.$$
Then
$$\sum_{j=1}^{n} \left( \sum_{i=1}^{m} a_{ij} c_i \right) v_j = 0.$$
Conclusion: Because the set $\{v_1, v_2, \ldots, v_n\}$ is linearly independent,
$$\sum_{i=1}^{m} a_{ij} c_i = 0 \quad \text{for } j = 1, 2, \ldots, n.$$
This is a homogeneous system of equations with more unknowns than equations. The system therefore possesses a non-trivial solution $(c_1', c_2', \ldots, c_m')$ such that
$$c_1' u_1 + c_2' u_2 + \cdots + c_m' u_m = 0.$$
Hence, $S' = \{u_1, u_2, \ldots, u_m\}$ is linearly dependent.
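A small SymPy sketch of the counting argument (the four vectors below are our own illustration, not from the module): with $m = 4$ vectors in $\mathbb{R}^3$, the homogeneous system has more unknowns than equations, so a non-trivial combination always exists.

```python
import sympy as sp

# Four vectors in R^3 (m = 4 > n = 3) as the columns of a matrix.
U = sp.Matrix([[1, 0, 2, 1],
               [0, 1, 1, 3],
               [1, 1, 0, 2]])

# Any non-zero vector c in the nullspace gives c1*u1 + c2*u2 + c3*u3 + c4*u4 = 0.
for c in U.nullspace():
    print(c.T)   # a non-trivial solution -> the four vectors are linearly dependent
```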
Corollary: If $S = \{u_1, u_2, \ldots, u_m\}$ and $T = \{v_1, v_2, \ldots, v_n\}$ are both bases of a vector space $V$, then $m = n$.

Proof:
Because $S$ is a basis and $T$ is linearly independent, it follows from Theorem 10.5 (its contrapositive) that $n \le m$. By a similar argument, since $T$ is a basis and $S$ is linearly independent, $m \le n$. Therefore, $m = n$.

Definition: Let $V$ be a non-empty vector space. $V$ is said to be a finite dimensional space if there exists a finite set $\{v_1, v_2, \ldots, v_n\}$ that is a basis for $V$. If no such set exists, then $V$ is said to be infinite dimensional. The number of vectors in a basis is called the dimension. If $\{v_1, v_2, \ldots, v_n\}$ is a basis of $V$, then the dimension of $V$ is $n$ and we write $\dim(V) = n$. The improper subspace $\{0\}$ of any vector space $V$ is said to have dimension zero.
In general, to show that a set $\{v_1, v_2, \ldots, v_n\}$ is a basis for the vector space $V$, we must show that the set is linearly independent and spans $V$. However, if it is known that $V$ has dimension $n$ (the same as the number of vectors in the set), then it is sufficient to show either that the set is linearly independent or that it spans $V$. This is stated in the following theorem.

Theorem 10.6. Let $V$ be a vector space with dimension $n$ and let $S = \{v_1, v_2, \ldots, v_n\} \subseteq V$.
(a) If $S$ is linearly independent, then $S$ is a basis for $V$.
(b) If $S$ spans $V$, then $S$ is a basis for $V$.
(c) No set containing fewer than $n$ vectors can span $V$.
Example 10.12
Show that $P$, the vector space of all polynomials, is infinite dimensional.

Solution (by contradiction)
Step 1: Use Theorem 10.5.
Assume $P$ has finite dimension, say $\dim(P) = n$. From Theorem 10.5, any set of $n + 1$ vectors is linearly dependent. However, the set $\{1, x, x^2, \ldots, x^n\}$ of $n + 1$ vectors is linearly independent, since
$$W[1, x, x^2, \ldots, x^n] \neq 0.$$
This is a contradiction.
Conclusion: Thus, $P$ must be infinite dimensional.
Proof:
(a)
Step 1: Assume that $S = \{v_1, v_2, \ldots, v_n\}$ is linearly independent.
To show that $S$ is a basis, it must be shown that it spans $V$.
Step 2: Use Theorem 10.5.
Let $v$ be any vector in $V$. From Theorem 10.5, the set of $n + 1$ vectors $\{v_1, v_2, \ldots, v_n, v\}$ is linearly dependent. Thus, there exist scalars $c_1, c_2, \ldots, c_n, c_{n+1}$, not all zero, such that
$$c_1 v_1 + c_2 v_2 + \cdots + c_n v_n + c_{n+1} v = 0.$$
The scalar $c_{n+1} \neq 0$, otherwise $\{v_1, v_2, \ldots, v_n\}$ would be linearly dependent. Thus,
$$v = -\frac{c_1}{c_{n+1}} v_1 - \frac{c_2}{c_{n+1}} v_2 - \cdots - \frac{c_n}{c_{n+1}} v_n.$$
Conclusion: Because any vector $v$ in $V$ can be expressed as a linear combination of elements of $S$, the set $S$ spans $V$. Therefore, $S$ is a basis for $V$.
(b)
Step 1: Assume that $S$ spans $V$.
We will show that $S$ is linearly independent, using the method of contradiction.
Step 2: Assume that $S$ is linearly dependent.
Then one vector in $S$, say $v_n$, is a linear combination of the other vectors. This means that $\{v_1, v_2, \ldots, v_{n-1}\}$ spans $V$. If $\{v_1, v_2, \ldots, v_{n-1}\}$ is linearly dependent, we can eliminate one more vector and still have a span set. If this process of elimination is continued, we will finally obtain a span set that is linearly independent with $k < n$ elements, that is, a basis with fewer than $n$ vectors. This is clearly a contradiction since $\dim(V) = n$. Thus, $S$ must be linearly independent.
(c) Use a similar argument as in (b).
Example 10.13
Show that the set $S = \{(1, 2, 1),\ (2, 9, 0),\ (3, 3, 4)\}$ is a basis for $\mathbb{R}^3$.

Solution
Step 1: Use Theorem 10.6.
We know that $\dim(\mathbb{R}^3) = 3$ and the number of vectors in $S$ is also 3. Using Theorem 10.6, it is sufficient to show that $S$ is linearly independent.
Step 2: Prove that $S$ is linearly independent using the determinant.
Note that
$$\begin{vmatrix} 1 & 2 & 3 \\ 2 & 9 & 3 \\ 1 & 0 & 4 \end{vmatrix}
= 1(36 - 0) - 2(8 - 3) + 3(0 - 9) = -1 \neq 0.$$
Conclusion: Thus, $S$ is linearly independent and hence $S$ is a basis for $\mathbb{R}^3$.
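A NumPy check of Example 10.13 (our own addition): three independent vectors in a 3-dimensional space form a basis by Theorem 10.6(a).

```python
import numpy as np

# The vectors of Example 10.13 as columns.
S = np.array([[1, 2, 3],
              [2, 9, 3],
              [1, 0, 4]], dtype=float)

print(np.linalg.det(S))   # approximately -1 (non-zero), so S is linearly independent
```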
ACTIVITY 10.2
Why do you think a set with fewer than $n$ vectors cannot span $V$?
Theorem 10.7. Assume $S$ spans the vector space $V$.
(a) Any maximal linearly independent subset of $S$ is a basis for $V$.
(b) If each vector in $S$ that is a linear combination of the previous vectors is eliminated from $S$, then the vectors left in $S$ form a basis for $V$.

Proof:
(a)
Step 1: Assume that $\{v_1, v_2, \ldots, v_m\}$ is a maximal linearly independent subset of $S$ and let $w \in S$.
Then $\{v_1, v_2, \ldots, v_m, w\}$ is linearly dependent.
Step 2: Prove that $w$ is a linear combination of $\{v_1, v_2, \ldots, v_m\}$.
Because $\{v_1, v_2, \ldots, v_m\}$ is linearly independent, none of the $v_k$ is a linear combination of the previous vectors. Thus, $w$ is a linear combination of $v_1, v_2, \ldots, v_m$. This means that
$$w \in \operatorname{span}\{v_1, v_2, \ldots, v_m\}.$$
Therefore
$$S \subseteq \operatorname{span}\{v_1, v_2, \ldots, v_m\}.$$
Step 3: Prove that $\{v_1, v_2, \ldots, v_m\}$ is a basis for $V$.
Because
$$V = \operatorname{span} S \subseteq \operatorname{span}\{v_1, v_2, \ldots, v_m\} \subseteq V,$$
therefore
$$\operatorname{span}\{v_1, v_2, \ldots, v_m\} = V.$$
It is also known that $\{v_1, v_2, \ldots, v_m\}$ is linearly independent, hence $\{v_1, v_2, \ldots, v_m\}$ is a basis for $V$.
(b)
The vectors left after the eliminations constitute a maximal linearly independent subset of $S$. Thus, from (a), they form a basis for $V$.
Example 10.14
Consider the set $S = \{i, j, k, l\} \subseteq \mathbb{R}^3$ with
$$i = (1, 0, 0), \quad j = (0, 1, 0), \quad k = (0, 0, 1), \quad l = (1, 1, 1).$$
Step 1: To find a maximal linearly independent subset, first determine which subsets are linearly independent:
$\{i\}$ is linearly independent;
$\{i, j\}$ is linearly independent;
$\{i, j, k\}$ is linearly independent;
$\{i, j, k, l\}$ is linearly dependent.
Then $\{i, j, k\}$ is a maximal linearly independent subset. Thus, $\{i, j, k\}$ is a basis for $\mathbb{R}^3$.
Step 2: Determine which subsets span $\mathbb{R}^3$ in order to find the minimal span set for $\mathbb{R}^3$:
$\{i\}$ does not span $\mathbb{R}^3$;
$\{i, j\}$ does not span $\mathbb{R}^3$;
$\{i, j, k\}$ spans $\mathbb{R}^3$;
$\{i, j, k, l\}$ spans $\mathbb{R}^3$.
Conclusion: The set $\{i, j, k\}$ is the minimum span set for $\mathbb{R}^3$. Thus, $\{i, j, k\}$ is a basis for $\mathbb{R}^3$.

Theorem 10.8. Assume that $S = \{v_1, v_2, \ldots, v_r\}$ is a linearly independent set in a vector space $V$ with dimension $n$, and $r < n$. Then $S$ can be expanded to a basis for $V$.

Proof:
Step 1: Assume $S = \{v_1, v_2, \ldots, v_r\}$ is linearly independent and $r < n$.
This means that $\operatorname{span}\{v_1, v_2, \ldots, v_r\}$ is a proper subset of $V$. Then there exists a vector $v_{r+1} \in V$ with $v_{r+1} \notin \operatorname{span}\{v_1, v_2, \ldots, v_r\}$. Thus, the set $\{v_1, v_2, \ldots, v_r, v_{r+1}\}$ is linearly independent.
Step 2: Expand the set.
If $r + 1 < n$, then by the same argument the set $\{v_1, v_2, \ldots, v_r, v_{r+1}\}$ can be expanded to a set containing $r + 2$ linearly independent vectors. This process can be continued until we obtain a linearly independent set of $n$ vectors
$$\{v_1, v_2, \ldots, v_r, v_{r+1}, \ldots, v_n\},$$
which, by Theorem 10.6(a), is a basis for $V$.
The following example shows a method of obtaining a basis that contains certain given vectors in $\mathbb{R}^4$.
Example 10.15
Obtain a basis for $\mathbb{R}^4$ that contains the vectors $v = (1, 0, 1, 0)$ and $w = (1, 1, 1, 0)$.

Solution
Step 1: Find a spanning set that contains $v$ and $w$.
Consider the standard basis $\{e_1, e_2, e_3, e_4\}$ for the space $\mathbb{R}^4$ with
$$e_1 = (1, 0, 0, 0), \quad e_2 = (0, 1, 0, 0), \quad e_3 = (0, 0, 1, 0), \quad e_4 = (0, 0, 0, 1).$$
Assume $S = \{v, w, e_1, e_2, e_3, e_4\}$. Since $\{e_1, e_2, e_3, e_4\}$ spans $\mathbb{R}^4$, $S$ also spans $\mathbb{R}^4$.
Step 2: Obtain the basis by eliminating each vector that is a linear combination of the previous vectors.
Because:
- $e_1$ is not a linear combination of $v$ and $w$, $e_1$ is retained in $S$.
- $e_2$ is a linear combination of $v$, $w$ and $e_1$, we eliminate $e_2$.
- The vector $e_3$ can also be eliminated, because $e_3$ is a linear combination of $v$, $w$ and $e_1$.
Finally, we find that the required basis is $\{v, w, e_1, e_4\}$.
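A NumPy sketch of the procedure in Example 10.15 (the greedy rank test is our own formulation of the elimination step, not the module's): walk through the standard basis vectors and keep one only when it enlarges the span.

```python
import numpy as np

v = np.array([1., 0., 1., 0.])
w = np.array([1., 1., 1., 0.])

basis = [v, w]
for e in np.eye(4):                     # e1, e2, e3, e4 in turn
    if np.linalg.matrix_rank(np.vstack(basis + [e])) > len(basis):
        basis.append(e)                 # keep e only if it is not in the current span

print(np.vstack(basis))                 # rows: v, w, e1, e4 -- the basis found above
```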
EXERCISE 10.2
1. Determine whether each of the following sets is a basis for the given vector space $V$.
   (a) $S = \{(1, 1, 0),\ (1, 0, 1),\ (0, 1, 1)\}$, $V = \mathbb{R}^3$.
   (b) $T = \left\{ \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} \right\}$, $V = M_{2,2}(\mathbb{R})$.
   (c) $U = \{1 + x,\ x + x^2,\ x^2\}$, $V = P_3$.
2. Obtain a basis for $\mathbb{R}^3$ containing the vectors $(1, 0, 2)$ and $(0, 1, 3)$.
3. Determine a basis and the dimension for each of the following subspaces:
   (a) $S = \{(a + b,\ a - b,\ a,\ b) : a, b \in \mathbb{R}\} \subseteq \mathbb{R}^4$.
   (b) $T = \{a + b(x + x^2) : a, b \in \mathbb{R}\} \subseteq P_3$.
10.3 COORDINATES; CHANGE OF BASIS

Skill: State the relationship between a basis and a coordinate system. Relate the coordinate matrix of a vector with respect to an old basis to its coordinate matrix with respect to a new basis.

Think of how a coordinate can be assigned to a point without reference to any axes.

There is a close relationship between bases and coordinate systems. In this section, we will look at this relationship and the various results arising from a change of basis for a vector space.

In plane geometry, a coordinate pair $(a, b)$ is attached to a point $P$ on the plane by referring to a pair of orthogonal axes.

For example, referring to the axes in Figure 10.3(a), consider the two perpendicular vectors $v_1$ and $v_2$, each of length 1, both starting from the same point $O$. These vectors form a basis for $\mathbb{R}^2$. By drawing two lines from $P$, each perpendicular to an axis (Figure 10.3(b)), we obtain the vectors $av_1$ and $bv_2$ such that
$$\overrightarrow{OP} = av_1 + bv_2.$$
It is clear that the numbers $a$ and $b$ obtained in this way are the same as the coordinates of $P$ relative to the coordinate system in Figure 10.3(a). Hence, it can be seen that the coordinates of $P$ are the numbers required to express the vector $\overrightarrow{OP}$ in terms of the basis vectors $v_1$ and $v_2$.

To assign coordinates to a point on the plane, it is not necessary that the basis vectors $v_1$ and $v_2$ have length 1; any basis for $\mathbb{R}^2$ can be used.
Figure 10.3: Coordinate system: (a) without vectors; (b) with vectors formed

For example, by using the basis vectors $v_1$ and $v_2$ in Figure 10.4, we can assign a unique coordinate pair to the point $P$ by drawing lines through $P$ parallel to the basis vectors, so that $\overrightarrow{OP}$ is the diagonal of the parallelogram defined by the vectors $av_1$ and $bv_2$; hence
$$\overrightarrow{OP} = av_1 + bv_2.$$

Figure 10.4: Diagonal of a parallelogram
We can regard $(a, b)$ as the coordinates of $P$ relative to the basis $\{v_1, v_2\}$. This notion of coordinates is important because it can be extended to more general vector spaces. To do this, we need the following results.

Assume $S = \{v_1, v_2, \ldots, v_n\}$ is a basis of a finite-dimensional vector space $V$. Because $S$ spans $V$, each vector in $V$ can be expressed as a linear combination of vectors in $S$. Further, the linear independence of $S$ means that there is only one way to express a vector as a linear combination of vectors in $S$, as stated in the following theorem.

Theorem 10.9. If $S = \{v_1, v_2, \ldots, v_n\}$ is a basis for the vector space $V$, then each vector $v$ in $V$ can be uniquely expressed in the form
$$v = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n.$$

Proof:
Step 1: Assume the vector $v$ in $V$ can be written as two linear combinations:
$$v = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n$$
and also
$$v = d_1 v_1 + d_2 v_2 + \cdots + d_n v_n.$$
Step 2: Subtracting the second equation from the first, we obtain
$$(c_1 - d_1) v_1 + (c_2 - d_2) v_2 + \cdots + (c_n - d_n) v_n = 0.$$
Step 3: Find the scalar values.
The linear independence of the basis $S$ implies
$$c_1 - d_1 = 0, \quad c_2 - d_2 = 0, \quad \ldots, \quad c_n - d_n = 0,$$
that is, $c_1 = d_1$, $c_2 = d_2$, $\ldots$, $c_n = d_n$.
If $S = \{v_1, v_2, \ldots, v_n\}$ is a basis for a finite-dimensional vector space $V$, and
$$v = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n$$
is the expression for $v$ in terms of the basis $S$, then the scalars $c_1, c_2, \ldots, c_n$ are called the coordinates of $v$ relative to the basis $S$.
- The coordinate vector of $v$ relative to $S$, denoted $(v)_S$, is the vector in $\mathbb{R}^n$ defined as
  $$(v)_S = (c_1, c_2, \ldots, c_n).$$
- The coordinate matrix of $v$ relative to $S$, denoted $[v]_S$, is the $n \times 1$ matrix defined as
  $$[v]_S = \begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{pmatrix}.$$
Example 10.16
In Example 10.13, it was shown that the set $S = \{(1, 2, 1),\ (2, 9, 0),\ (3, 3, 4)\}$ is a basis for $\mathbb{R}^3$.
(a) Obtain the coordinate vector and coordinate matrix of $v = (1, 3, 4)$ with respect to $S$.
(b) Obtain the vector $v$ in $\mathbb{R}^3$ whose coordinate vector with respect to $S$ is $(1, -1, 2)$.

Solution
(a)
Step 1: Obtain scalars $a$, $b$, $c$ such that
$$(1, 3, 4) = a(1, 2, 1) + b(2, 9, 0) + c(3, 3, 4).$$
By equating corresponding components, we obtain
$$a + 2b + 3c = 1$$
$$2a + 9b + 3c = 3$$
$$a + 4c = 4$$
Solving this system of equations gives $a = 72$, $b = -10$, $c = -17$.
Step 2: Obtain the coordinate vector and coordinate matrix. Then
$$((1, 3, 4))_S = (72, -10, -17) \quad \text{and} \quad [(1, 3, 4)]_S = \begin{pmatrix} 72 \\ -10 \\ -17 \end{pmatrix}.$$
(b)
Step 1: Use the given coordinate vector to obtain the vector $v$.
Using the definition of the coordinate vector, $(v)_S = (1, -1, 2)$, we obtain
$$v = 1(1, 2, 1) - 1(2, 9, 0) + 2(3, 3, 4) = (1, 2, 1) - (2, 9, 0) + (6, 6, 8) = (5, -1, 9).$$

Example 10.17
Consider the vector space $\mathbb{R}^3$ with the standard basis $S = \{i, j, k\}$, where $i = (1, 0, 0)$, $j = (0, 1, 0)$ and $k = (0, 0, 1)$. If $v = (a, b, c)$ is any vector in $\mathbb{R}^3$, then
$$v = (a, b, c) = a(1, 0, 0) + b(0, 1, 0) + c(0, 0, 1) = ai + bj + ck.$$
This means that
$$v = (a, b, c) = (v)_S.$$
In other words, the components of a vector $v$ relative to the $xyz$-coordinate system are the same as the coordinates of $v$ relative to the standard basis $\{i, j, k\}$.
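Finding coordinates is just solving a linear system whose columns are the basis vectors. The NumPy sketch below is our own check of both parts of Example 10.16 (not part of the module).

```python
import numpy as np

# Basis vectors of S as columns; solving S_mat @ c = v gives the coordinates c of v.
S_mat = np.array([[1, 2, 3],
                  [2, 9, 3],
                  [1, 0, 4]], dtype=float)

print(np.linalg.solve(S_mat, [1, 3, 4]))   # part (a): [ 72. -10. -17.]
print(S_mat @ np.array([1, -1, 2]))        # part (b): [ 5. -1.  9.]
```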
10.3.1 Changing Bases

If we change the basis of a vector space from $A$ to a new one, $A'$, how is the coordinate matrix $[v]_A$ of a vector $v$ related to the new coordinate matrix $[v]_{A'}$?

Consider the case of a 2-dimensional space for simplicity. Assume
- $A = \{u_1, u_2\}$ is the old basis, and
- $A' = \{v_1, v_2\}$ is the new basis.

Step 1: Write the coordinate matrices of the old basis vectors relative to the new basis. Say
$$[u_1]_{A'} = \begin{pmatrix} a \\ b \end{pmatrix} \quad \text{and} \quad [u_2]_{A'} = \begin{pmatrix} c \\ d \end{pmatrix}, \qquad (10.4)$$
that is,
$$u_1 = a v_1 + b v_2$$
$$u_2 = c v_1 + d v_2 \qquad (10.5)$$
Step 2: Assume $w$ is any vector in $V$ with old coordinate matrix
$$[w]_A = \begin{pmatrix} \alpha \\ \beta \end{pmatrix}, \qquad (10.6)$$
such that
$$w = \alpha u_1 + \beta u_2. \qquad (10.7)$$
Step 3: Obtain the new coordinates of $w$ by expressing $w$ in terms of the new basis $A'$.
For this, substitute (10.5) into (10.7). This gives
$$w = \alpha(a v_1 + b v_2) + \beta(c v_1 + d v_2)$$
or
$$w = (a\alpha + c\beta) v_1 + (b\alpha + d\beta) v_2.$$
Thus, the new coordinate matrix of $w$ is
$$[w]_{A'} = \begin{pmatrix} a\alpha + c\beta \\ b\alpha + d\beta \end{pmatrix},$$
which can be rewritten as
$$[w]_{A'} = \begin{pmatrix} a & c \\ b & d \end{pmatrix} \begin{pmatrix} \alpha \\ \beta \end{pmatrix}$$
or, from (10.6),
$$[w]_{A'} = \begin{pmatrix} a & c \\ b & d \end{pmatrix} [w]_A.$$
This equation states that the new coordinate matrix $[w]_{A'}$ can be obtained by multiplying the old coordinate matrix $[w]_A$ on the left by the matrix
$$P = \begin{pmatrix} a & c \\ b & d \end{pmatrix}.$$
The columns of this matrix are the coordinates of the old basis vectors relative to the new basis. Hence, we can state the following general procedure for changing bases.
If we change the basis for a vector space $V$ from $A = \{u_1, u_2, \ldots, u_n\}$, say, to a new basis $B = \{v_1, v_2, \ldots, v_n\}$, say, then the old coordinate matrix $[v]_A$ of a vector $v$ is related to the new coordinate matrix $[v]_B$ by the equation
$$[v]_B = P[v]_A \qquad (10.8)$$
where the columns of the matrix $P$ are the coordinate matrices of the old basis vectors relative to the new basis, that is, the columns of $P$ are $[u_1]_B, [u_2]_B, \ldots, [u_n]_B$. Symbolically, the matrix $P$ can be written as
$$P = \big( [u_1]_B \;\; [u_2]_B \;\; \cdots \;\; [u_n]_B \big)$$
and is called the transition matrix from $A$ to $B$.
Example 10.18
Consider the bases
$$A = \{u_1, u_2\} \quad \text{and} \quad B = \{v_1, v_2\}$$
for $\mathbb{R}^2$ with
$$u_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad u_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}; \quad v_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \quad v_2 = \begin{pmatrix} 2 \\ 1 \end{pmatrix}.$$
(a) Obtain the transition matrix from $A$ to $B$.
(b) By using Equation (10.8), obtain $[v]_B$ if $v = \begin{pmatrix} 2 \\ 4 \end{pmatrix}$.

Solution
(a)
Step 1: Obtain the coordinate matrices of the old basis vectors $u_1$ and $u_2$ relative to the new basis $B$.
Following the procedure used in Example 10.16(a), it can be shown that
$$u_1 = -v_1 + v_2$$
$$u_2 = 2v_1 - v_2$$
so that
$$[u_1]_B = \begin{pmatrix} -1 \\ 1 \end{pmatrix} \quad \text{and} \quad [u_2]_B = \begin{pmatrix} 2 \\ -1 \end{pmatrix}.$$
Step 2: Obtain the transition matrix.
Thus, the transition matrix from $A$ to $B$ is
$$P = \begin{pmatrix} -1 & 2 \\ 1 & -1 \end{pmatrix}.$$
(b)
Step 1: Express the old coordinate matrix.
By inspection we obtain
$$[v]_A = \begin{pmatrix} 2 \\ 4 \end{pmatrix}.$$
Step 2: Use Equation (10.8) and the transition matrix from part (a) to find the new coordinate matrix.
We obtain
$$[v]_B = \begin{pmatrix} -1 & 2 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} 2 \\ 4 \end{pmatrix} = \begin{pmatrix} -2 + 8 \\ 2 - 4 \end{pmatrix} = \begin{pmatrix} 6 \\ -2 \end{pmatrix}.$$
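A NumPy sketch of the construction just carried out (the helper function is our own, not part of the module): solving the new-basis matrix against the old basis vectors produces the columns of the transition matrix.

```python
import numpy as np

def transition_matrix(old_basis, new_basis):
    """Columns of P are the coordinates of the old basis vectors relative to the new basis."""
    return np.linalg.solve(np.column_stack(new_basis), np.column_stack(old_basis))

# Bases of Example 10.18: A (old) and B (new).
A = [np.array([1., 0.]), np.array([0., 1.])]
B = [np.array([1., 1.]), np.array([2., 1.])]

P = transition_matrix(A, B)
print(P)                        # [[-1.  2.], [ 1. -1.]]
print(P @ np.array([2., 4.]))   # [ 6. -2.] = [v]_B for [v]_A = (2, 4)
```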
Example 10.19
In the above example, we obtained the transition matrix from basis $A$ to basis $B$. The transition matrix from $B$ to $A$ can also be obtained. Assume that $B$ is the old basis and $A$ the new basis. As before, the columns of the transition matrix are the coordinates of the old basis vectors relative to the new basis.
Step 1: Obtain the coordinate matrices of the old basis vectors relative to the new basis.
By inspection we obtain
$$v_1 = u_1 + u_2$$
$$v_2 = 2u_1 + u_2,$$
implying that
$$[v_1]_A = \begin{pmatrix} 1 \\ 1 \end{pmatrix} \quad \text{and} \quad [v_2]_A = \begin{pmatrix} 2 \\ 1 \end{pmatrix}.$$
Step 2: Obtain the transition matrix.
Thus, the transition matrix from $B$ to $A$ is
$$Q = \begin{pmatrix} 1 & 2 \\ 1 & 1 \end{pmatrix}.$$
Step 3: Multiply the transition matrix from $A$ to $B$ by the transition matrix from $B$ to $A$.
We obtain
$$PQ = \begin{pmatrix} -1 & 2 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = I,$$
which shows that $Q = P^{-1}$.

ACTIVITY 10.3
Verify this result by taking $v = 6v_1 - 2v_2$.
The above results are generalised in the following theorem (given without proof).

Theorem 10.10. If $P$ is the transition matrix from basis $A$ to basis $B$, then
(a) $P$ is invertible;
(b) $P^{-1}$ is the transition matrix from basis $B$ to basis $A$.

In other words, if $P$ is the transition matrix from basis $A$ to basis $B$, then for each vector $v$:
$$[v]_B = P[v]_A,$$
$$[v]_A = P^{-1}[v]_B.$$
Before proceeding to the next topic, test your understanding by attempting the following exercises.
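A quick NumPy confirmation of Theorem 10.10 for the matrices of Examples 10.18 and 10.19 (the check itself is our own addition).

```python
import numpy as np

P = np.array([[-1., 2.], [1., -1.]])   # transition matrix from A to B (Example 10.18)
Q = np.array([[1., 2.], [1., 1.]])     # transition matrix from B to A (Example 10.19)

print(P @ Q)               # identity matrix
print(np.linalg.inv(P))    # equals Q, so Q = P^{-1} as Theorem 10.10 states
```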
EXERCISE 10.3
1. Obtain the coordinate vector and coordinate matrix of the vector $v = (4, 9)$ relative to the basis $S = \{(1, 0),\ (0, 1)\}$.
2. Obtain the coordinate vector and coordinate matrix of the vector $v = (2, -1, 3)$ relative to the basis $S = \{(1, 0, 0),\ (2, 2, 0),\ (3, 3, 3)\}$.
3. Obtain $v$ if $(v)_S = (6, 1, 4)$ and $S$ is the basis in Question 2.
4. Consider the bases $A = \{u_1, u_2\}$ and $B = \{v_1, v_2\}$ for the vector space $\mathbb{R}^2$, with
   $$u_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad u_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}, \quad v_1 = \begin{pmatrix} 2 \\ 1 \end{pmatrix} \quad \text{and} \quad v_2 = \begin{pmatrix} -3 \\ 4 \end{pmatrix}.$$
   (a) Obtain the transition matrix from $A$ to $B$.
   (b) Obtain the coordinate matrix $[v]_A$ for the vector $v = \begin{pmatrix} 3 \\ -5 \end{pmatrix}$ and use Equation (10.8) to calculate $[v]_B$.
   (c) Obtain the transition matrix from $B$ to $A$.
5. Determine whether each of the following sets is linearly dependent or linearly independent in the given vector space:
   (a) $A = \{(1, 2, 1),\ (1, 2, 1),\ (3, 2, 1),\ (2, 0, 0)\}$ in $\mathbb{R}^3$.
   (b) $B = \{1,\ \sin^2 x,\ \cos^2 x\}$ in $C[0, 2\pi]$.
   (c) $C = \{1,\ x^2,\ x^2 - 2\}$ in $P_3$.
6. Assume $x = (2, 1, 3)$, $y = (3, -1, 4)$ and $z = (2, 6, 4)$.
   (a) Show that $\{x, y, z\}$ is linearly dependent.
   (b) Show that $\{x, y\}$ is linearly independent.
   (c) Determine the dimension of $\operatorname{span}\{x, y, z\}$.
   (d) Describe the geometrical interpretation of $\operatorname{span}\{x, y, z\}$.
7. Obtain a basis for $\mathbb{R}^3$ that contains the vectors $(1, 0, 2)$ and $(0, 1, 3)$.
8. Obtain the dimension of the subspace spanned by $\{x^2,\ x^2 - x - 1,\ x + 1\}$.
9. Assume $A = \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix}$ and $U = \{M \in M_{2,2}(\mathbb{R}) \mid AM = M\}$.
   (a) Obtain a basis for $U$ that does not contain $A$.
   (b) Obtain a basis for $U$ that contains $A$.
10. Obtain the coordinate vector and coordinate matrix of the vector $v = (1, 1)$ relative to the basis $S = \{(2, -4),\ (3, 8)\}$.
11. Obtain the coordinate vector and coordinate matrix of the vector $v = (5, -12, 3)$ relative to the basis $S = \{(1, 2, 3),\ (-4, 5, 6),\ (7, -8, 9)\}$.
12. Consider the bases $A = \{u_1, u_2\}$ and $B = \{v_1, v_2\}$ for the vector space $\mathbb{R}^2$, with
    $$u_1 = \begin{pmatrix} 2 \\ 2 \end{pmatrix}, \quad u_2 = \begin{pmatrix} 4 \\ -1 \end{pmatrix}, \quad v_1 = \begin{pmatrix} 1 \\ 3 \end{pmatrix} \quad \text{and} \quad v_2 = \begin{pmatrix} -1 \\ -1 \end{pmatrix}.$$
    (a) Obtain the transition matrix from $A$ to $B$.
    (b) Obtain the coordinate matrix $[v]_A$ for the vector $v = \begin{pmatrix} 3 \\ -5 \end{pmatrix}$ and use Equation (10.8) to calculate $[v]_B$.
    (c) Obtain the transition matrix from $B$ to $A$.
13. Consider the bases $A = \{u_1, u_2, u_3\}$ and $B = \{v_1, v_2, v_3\}$ for the vector space $\mathbb{R}^3$, with
    $$u_1 = \begin{pmatrix} -3 \\ 0 \\ -3 \end{pmatrix}, \quad u_2 = \begin{pmatrix} -3 \\ 2 \\ -1 \end{pmatrix}, \quad u_3 = \begin{pmatrix} 1 \\ 6 \\ -1 \end{pmatrix},$$
    $$v_1 = \begin{pmatrix} -6 \\ -6 \\ 0 \end{pmatrix}, \quad v_2 = \begin{pmatrix} -2 \\ -6 \\ 4 \end{pmatrix}, \quad v_3 = \begin{pmatrix} -2 \\ -3 \\ 7 \end{pmatrix}.$$
    (a) Obtain the transition matrix from $A$ to $B$.
    (b) Obtain the coordinate matrix $[v]_A$ for the vector $v = \begin{pmatrix} -5 \\ 8 \\ -5 \end{pmatrix}$ and use Equation (10.8) to calculate $[v]_B$.
SUMMARY

- The basis of a vector space is a set of vectors that is linearly independent and spans the vector space.
- The number of vectors in a basis is called the dimension of the vector space.
- A given basis can be converted to a different basis through the use of the transition matrix.

KEY TERMS

Basis
Coordinate
Coordinate vector
Infinite dimensional
Linear dependence
Linear independence
Proper subset
Transition matrix
Wronskian
Zero dimension