MATH 4A - Linear Algebra With Applications: Lecture 17: Linear Independence, Bases, and Coordinate Systems



MATH 4A - Linear Algebra with Applications


Lecture 17: Linear independence, bases, and coordinate systems

10 May 2019

Reading: §4.3-4.6
Recommended problems from §4.3: 1, 3, 5, 7, 9, 11, 13, 15, 21,
22, 23, 33
Recommended problems from §4.4: 1, 3, 5, 9, 11, 13, 15, 16, 17,
19, 21, 27, 31
Announcement: there were some issues with some homework
problems from HW11 and HW12, so I removed them from the
assignments. Also, HW12 includes a few questions from today.

Lecture plan

1 Linear independence and bases

2 Whittling down spanning sets

3 Coordinate systems

Lingering issue

At the moment, our concrete understanding of images and kernels (among other things!) in terms of matrices only works for linear transformations Rn → Rm.

We would like to find a similarly concrete description for linear transformations between abstract vector spaces V → W. To this end, in the next few days we will figure out how to associate matrices to abstract linear transformations. First, we need to discuss bases of vector spaces, which generalize the standard basis vectors of Rn.

Yet another not so new definition

A set of vectors {v1, v2, . . . , vp} in a vector space V is linearly independent if the vector equation

c1 v1 + c2 v2 + · · · + cp vp = 0

has only the trivial solution.

In the case V = Rn, this is exactly the same definition as before.

We can similarly generalize the definitions of linearly dependent and linear dependence relation from Rn to an abstract vector space V.
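
For vectors in Rn this condition can be checked by a computer. Below is a minimal sketch (my own illustration, not from the lecture; the three vectors are made up) using NumPy: the columns of a matrix are linearly independent exactly when the rank of the matrix equals the number of columns.

```python
import numpy as np

# Made-up vectors in R^3, placed as the columns of A.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0],
              [0.0, 0.0, 0.0]])

# The columns are linearly independent exactly when the only solution of
# A c = 0 is c = 0, i.e. when rank(A) equals the number of columns.
independent = np.linalg.matrix_rank(A) == A.shape[1]
print(independent)  # False here: column 3 = 2*(column 1) + 3*(column 2)
```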

Recall

The standard basis of Rn is the set of n vectors {e1, e2, . . . , en}, where

e1 = (1, 0, 0, . . . , 0),  e2 = (0, 1, 0, . . . , 0),  . . . ,  en = (0, 0, . . . , 0, 1),

written here as column vectors.
This set of vectors has two properties that work especially nicely
together:
1 The vectors span Rn .
2 The vectors are linearly independent.
The first condition says we can generate any vector v in Rn as a
linear combination of the ei , and the second condition implies that
there is exactly one way to write v as such a linear combination.

Key Idea: any set of vectors in a vector space V with these two properties (called a basis for V) is “as good as” the standard basis of Rn.

In the next few lectures, we will use such sets of vectors to put
“coordinates” on V , allowing us to identify V with Rn . Then we
can use all of the matrix algebra we know and love to solve
problems in V .

Definition

Let H be a subspace of a vector space V. A basis of H is a set of vectors B = {b1, b2, . . . , bp} in V such that
(i) B is a linearly independent set, and
(ii) H = Span B = Span{b1 , b2 , . . . , bp }.
(Note: condition (ii) forces B to be a subset of H, not just a
subset of V .)

Examples

Note that if V is a vector space, H = V is always a subspace of V .


So it makes sense to talk about a basis of V .

The motivating example is the standard basis of Rn. It is clearly a basis.

But Rn has lots of other bases. If we choose any n vectors “randomly” (so that we avoid “probability 0 events” like three vectors lying in a common plane in R3), then they almost surely span Rn.

E.g. consider the set of vectors

B = {(8, −1, −1), (1, −4, 0), (−2, 1, 0)}

(written as column vectors in R3).

Examples

There are many (basically equivalent) ways to determine whether B is a basis. Here’s one way: form the matrix whose columns are the vectors of B,

A =
  8    1   −2
 −1   −4    1
 −1    0    0

and compute its determinant:

det A = 7.

Thus A is invertible, which tells us its column vectors are linearly independent (because A is one-to-one) and span R3 (because A is onto).

We conclude B is a basis for R3.
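
If you want to double-check the arithmetic, here is a rough NumPy sketch of the same test (the software check is my own addition, not part of the slide):

```python
import numpy as np

# Columns of A are the vectors of B from the slide.
A = np.array([[ 8,  1, -2],
              [-1, -4,  1],
              [-1,  0,  0]], dtype=float)

print(np.linalg.det(A))               # approximately 7.0, so A is invertible
print(np.linalg.matrix_rank(A) == 3)  # True: the columns are independent and span R^3
```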



Examples

Let S = {1, t, t^2, . . . , t^n}. I claim S is a basis for Pn. Why?

If p(t) = c0 + c1 t + c2 t^2 + · · · + cn t^n, then clearly

p(t) = c0 · 1 + c1 · t + c2 · t^2 + · · · + cn · t^n,

which shows S spans Pn.

To see S is linearly independent, suppose some linear combination of the vectors in S satisfies

c0 · 1 + c1 · t + c2 · t^2 + · · · + cn · t^n = 0.

From algebra, we know that the only way this is possible is if c0 = c1 = · · · = cn = 0, which means the only linear dependence relation on S is the trivial one. Thus S is linearly independent.
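
As a quick sanity check (my own illustration; the particular polynomial is made up), SymPy returns exactly the weights c0, c1, c2 appearing in the expansion above:

```python
import sympy as sp

t = sp.symbols('t')
p = 4 - 3*t + 7*t**2  # a made-up polynomial

# all_coeffs() lists coefficients from highest degree down, so reverse it
# to get the weights relative to {1, t, t^2}.
print(sp.Poly(p, t).all_coeffs()[::-1])  # [4, -3, 7]
```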

From a generating set to a basis

Let v1 , . . . , vp be some vectors in the vector space V . We know

H = Span{v1 , . . . , vp }

is a subspace of V . How can we find a basis for H?



Example

Consider the three vectors

v1 = (19, 13, 1),   v2 = (1, −1, 4),   v3 = (−14, −18, 19)

(written as column vectors), and let H = Span{v1, v2, v3}. Let’s find a basis for H.

Note that v3 = 5 · v2 − v1 (of course, the way we discover this is by row reducing [v1 v2 v3 0]). Suppose x is some vector in H. Then there exist scalars c1, c2, c3 such that

x = c1 v1 + c2 v2 + c3 v3.

Since v3 = 5 · v2 − v1, we can substitute out the v3:

x = c1 v1 + c2 v2 + c3 v3
  = c1 v1 + c2 v2 + c3 (5 · v2 − v1)
  = (c1 − c3) v1 + (c2 + 5c3) v2.

What does this show? The vector x in H = Span{v1, v2, v3} we started with is actually in Span{v1, v2}!

Of course, Span{v1, v2} is a subspace of H, since v1 and v2 are in H. What we just showed is the reverse inclusion: H is a subspace of Span{v1, v2}. We conclude that H = Span{v1, v2}.

Since v1 and v2 are not scalar multiples of one another, they are
linearly independent.

We conclude that {v1 , v2 } is a basis for H.
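
Here is a sketch of the row reduction mentioned above, carried out in SymPy (the vectors are the ones from the slide; using SymPy for the computation is my own addition):

```python
import sympy as sp

v1 = sp.Matrix([19, 13, 1])
v2 = sp.Matrix([1, -1, 4])
v3 = sp.Matrix([-14, -18, 19])

A = sp.Matrix.hstack(v1, v2, v3)   # columns are v1, v2, v3
R, pivots = A.rref()

print(pivots)  # (0, 1): the first two columns contain pivots, so {v1, v2} is a basis of H
print(R)       # last column of the rref is (-1, 5, 0), i.e. v3 = -v1 + 5*v2
```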



More generally
When we consider H = Span{v1 , . . . , vp }, any nontrivial linear
dependence relation among the v1 , . . . , vp allows us to remove a
vector. More precisely:
Theorem
Let S = {v1 , . . . , vp } be a set in V , and let H = Span{v1 , . . . , vp }.

(a) If one of the vectors in S—say, vk—is a linear combination of the remaining vectors in S, then the set formed from S by removing vk still spans H.

(b) If H ≠ {0}, some subset of S is a basis of H.

Intuitively: if S is a spanning set of H, then we can toss out some of the vectors in S to get a basis of H.

(The proof of part (a) is a simple generalization of the example above. To prove part (b), apply part (a) repeatedly, removing vectors until the remaining set is linearly independent.)

Application: finding a basis for column spaces

Let A be an m × n matrix. We can apply the previous theorem to find a basis for the column space Col A.

By definition, Col A is the subspace of Rm spanned by the columns of A. So the previous theorem tells us some subset of the columns of A is a basis of Col A.

Which subset?

Application: finding a basis for column spaces

Theorem
The pivot columns of a matrix A form a basis for Col A.

NOTE: don’t forget that the pivot columns of A are the columns
of A that contain pivots after reducing to echelon form E . They
are NOT columns of E .

(Proof: Let E be an echelon form of A. It is easy to see the pivot columns of E are linearly independent. The set of solutions to Ex = 0 is the same as the set of solutions to Ax = 0, and solutions to Ax = 0 are the same thing as linear dependence relations among the columns of A. Thus, the pivot columns of A are linearly independent. By similar reasoning, the nonpivot columns of A are linear combinations of the pivot columns, so we don’t need them when we compute Col A.)
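
As an illustration of the theorem (the matrix below is made up, not from the lecture), SymPy’s rref() reports which columns contain pivots, and we then take those columns from A itself:

```python
import sympy as sp

A = sp.Matrix([[1, 2, 0,  3],
               [2, 4, 1,  8],
               [3, 6, 1, 11]])

_, pivots = A.rref()                # pivots == (0, 2)
basis = [A.col(j) for j in pivots]  # columns of A itself, NOT of the echelon form
print(basis)                        # a basis for Col A
```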

A little bit of a tangent: finding a basis for a null space

In the previous lecture, we discussed how to find a spanning set of the null space of a matrix A: row reduce the augmented matrix (A 0) to determine the parametric vector form of the solution set. In fact, this procedure determines a basis of Nul A. I’ll let you think about that.
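
For the same made-up matrix as in the previous sketch, SymPy’s nullspace() returns exactly the vectors that appear in the parametric vector form, i.e. a basis of Nul A (again, this is my own illustration):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 0,  3],
               [2, 4, 1,  8],
               [3, 6, 1, 11]])

for v in A.nullspace():  # one vector per free variable; together they form a basis of Nul A
    print(v.T)
```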

Motivation

Bases are the most important way to make abstract vector spaces
V concrete.

The basic idea (pun intended) is to “identify” each of the vectors bi in a basis {b1, . . . , bn} of V with the standard basis vector ei in Rn. Let’s make this more precise.

A preliminary uniqueness result

Theorem
Let B = {b1 , . . . , bn } be a basis for the vector space V . Then for
each vector x in V , there exists a unique set of scalars c1 , . . . , cn
such that
x = c1 b1 + · · · + cn bn .

Definition

Let B = {b1, . . . , bn} be a basis for the vector space V. The coordinates of a vector x in V relative to B are the weights c1, . . . , cn such that x = c1 b1 + · · · + cn bn.

The coordinate mapping determined by B is the linear transformation (we’ll prove it’s linear shortly)

V → Rn,   x ↦ [x]B = (c1, c2, . . . , cn),

where the vector on the right is written as a column vector.

We call [x]B the coordinate vector of x determined by B.



Example

Consider the basis B = {1, t, t^2, t^3} of P3. If

p(t) = a0 + a1 t + a2 t^2 + a3 t^3

is a vector in P3, its coordinate vector with respect to B is

[p(t)]B = (a0, a1, a2, a3)

(written as a column vector).

Example

Consider the nonstandard basis of R2

B = {(−3, 13), (−1, 1)}

(written as column vectors), and the vector

x = (1, 2).

To determine [x]B, we must figure out how to write x as a linear combination of the vectors in B.

In other words, we must solve the linear system with augmented matrix

 −3  −1 |  1
 13   1 |  2

The reduced echelon form is

  1   0 |   3/10
  0   1 | −19/10

Thus

x = (3/10) · (−3, 13) − (19/10) · (−1, 1),

so

[x]B = (3/10, −19/10).
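
The same coordinates can be found by solving the linear system directly; here is a NumPy sketch of the computation from this slide (using NumPy is my own addition):

```python
import numpy as np

B = np.array([[-3.0, -1.0],
              [13.0,  1.0]])  # columns are the basis vectors b1, b2
x = np.array([1.0, 2.0])

coords = np.linalg.solve(B, x)  # solves B @ coords = x
print(coords)                   # [ 0.3 -1.9 ], i.e. [x]_B = (3/10, -19/10)
```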

iClicker

Consider the basis

B = {(−1, 1), (1, 1)}

of R2 (written as column vectors), and the vector

x = (0, 1).
What is the first entry of [x]B ?
(a) 1
(b) −1
(c) 1/2

(d) 2/2
(e) −1/2

The coordinate mapping

Theorem
Let B be a basis for the vector space V . Then the coordinate
mapping is a one-to-one and onto linear transformation.
