Lecture 4 LinAlg
Linear maps
Definition
Let k, m, n ∈ N, let A be an m × n matrix and let B be an n × k matrix
with column vectors b_1, ..., b_k. We then define the matrix product AB as
the matrix with Ab_j as its j-th column vector, where 1 ≤ j ≤ k.
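The column-by-column definition above can be spot-checked numerically; a minimal numpy sketch (the matrices are chosen arbitrarily for illustration):

```python
import numpy as np

# The j-th column of AB is A times the j-th column of B.
A = np.array([[1, 2, 0],
              [3, 1, 4]])        # 2 x 3
B = np.array([[1, 0],
              [2, 1],
              [0, 3]])           # 3 x 2

AB = A @ B                       # the 2 x 2 product

# Check the definition column by column.
for j in range(B.shape[1]):
    assert np.array_equal(AB[:, j], A @ B[:, j])
```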
[Diagram: an arrow from a set A (the domain) to a set B (the codomain); the range is the subset of the codomain actually reached.]
The zero map T(x) = 0 is linear. Reason:
T(u + v) = 0 = 0 + 0 = T(u) + T(v)
T(cu) = 0 = c·0 = cT(u)
The identity map T(x) = x is linear. Reason:
T(u + v) = u + v = T(u) + T(v)
T(cu) = cu = cT(u)
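The two linearity axioms can also be spot-checked numerically; a sketch for the zero map and the identity map (the vectors u, v and scalar c are arbitrary test values):

```python
import numpy as np

def zero_map(x):
    return np.zeros_like(x)      # sends every vector to 0

def identity_map(x):
    return x                     # sends every vector to itself

u = np.array([1.0, -2.0, 3.0])
v = np.array([0.5, 4.0, -1.0])
c = 7.0

for T in (zero_map, identity_map):
    assert np.allclose(T(u + v), T(u) + T(v))   # additivity
    assert np.allclose(T(c * u), c * T(u))      # homogeneity
```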
Linear maps Lecture 4 9 / 40
Zero goes to zero
Fact
If T : Rn → Rm is a linear transformation then T (0) = 0.
Reason: T (0) = T (0 + 0) = T (0) + T (0), and subtracting T (0) from both sides gives T (0) = 0.
[Figure: rotation by angle θ. The unit vector e1 is sent to T(e1) = (cos θ, sin θ) and e2 is sent to T(e2) = (−sin θ, cos θ), so the matrix of the rotation is [ cos θ  −sin θ ; sin θ  cos θ ].]
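Reading the two image vectors off the figure as columns gives the rotation matrix; a numpy sketch checking this for one arbitrary angle:

```python
import numpy as np

# Columns are T(e1) = (cos t, sin t) and T(e2) = (-sin t, cos t).
t = np.pi / 6                      # any angle works; 30 degrees here
R = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
assert np.allclose(R @ e1, [np.cos(t), np.sin(t)])
assert np.allclose(R @ e2, [-np.sin(t), np.cos(t)])

# Rotations preserve length: |R v| = |v|.
assert np.isclose(np.linalg.norm(R @ np.array([3.0, 4.0])), 5.0)
```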
Answer: A = [ 2 0 ; 0 1 ]. What is the matrix B of the transformation from
the right image to the left one? Answer: B = [ 1/2 0 ; 0 1 ].
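B undoes what A does, so B should be the inverse of A; a quick numpy check (using the two matrices from the answer above):

```python
import numpy as np

# A stretches the x-direction by 2; B shrinks it by 1/2.
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])
B = np.array([[0.5, 0.0],
              [0.0, 1.0]])

# Applying A and then B returns every vector to where it started.
assert np.allclose(B @ A, np.eye(2))
assert np.allclose(B, np.linalg.inv(A))
```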
[Figure: two panels with axes from 0.5 to 2.0 showing a shape before (left) and after (right) a vertical stretch.]
Answer: A = [ 1 0 ; 0 2 ]. What is the matrix B of the transformation from
the right image to the left? Answer: B = [ 1 0 ; 0 1/2 ].
[Figure: two panels with axes from 0.2 to 1.2 showing a shape before (left) and after (right) a horizontal shear.]
Answer: A = [ 1 3/4 ; 0 1 ].
[Figure: two panels with axes from −1.0 to 1.0 showing a shape before (left) and after (right) a clockwise quarter-turn rotation.]
Answer: A = [ 0 1 ; −1 0 ].
[Figure: two panels with axes from −1.0 to 1.0 showing a shape before (left) and after (right) reflection through the x-axis.]
Answer: A = [ 1 0 ; 0 −1 ].
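The matrix A = [ 1 0 ; 0 −1 ] flips the sign of the y-coordinate, i.e. reflects through the x-axis. Reflecting twice returns every vector to where it started, so A is its own inverse; a numpy sketch:

```python
import numpy as np

# Reflection through the x-axis: an involution, A @ A = I.
A = np.array([[1.0, 0.0],
              [0.0, -1.0]])

assert np.allclose(A @ A, np.eye(2))

v = np.array([3.0, 5.0])
assert np.allclose(A @ v, [3.0, -5.0])   # only the y-coordinate flips
```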
Reflection through x = y
Example
What is the matrix A of the linear transformation depicted?
[Figure: two panels with axes from −0.2 to 1.2 showing a shape before (left) and after (right) reflection through the line x = y.]
Answer: A = [ 0 1 ; 1 0 ].
Projection onto the x-axis
Example
What is the matrix A of the linear transformation depicted?
[Figure: two panels with axes from −0.2 to 1.2 showing a shape before (left) and after (right) projection onto the x-axis.]
Answer: A = [ 1 0 ; 0 0 ].
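Unlike the stretches and reflections above, a projection cannot be undone: it collapses the y-coordinate, so there is no matrix B back. A numpy sketch showing that projecting twice changes nothing (P is idempotent) and that P has no inverse:

```python
import numpy as np

# Projection onto the x-axis.
P = np.array([[1.0, 0.0],
              [0.0, 0.0]])

# Idempotent: projecting an already-projected vector does nothing.
assert np.allclose(P @ P, P)

# Singular: determinant 0, so no inverse exists.
assert np.isclose(np.linalg.det(P), 0.0)
```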
Projection onto the y-axis
Example
What is the matrix A of the linear transformation depicted?
[Figure: two panels with axes from −0.2 to 1.2 showing a shape before (left) and after (right) projection onto the y-axis.]
Answer: A = [ 0 0 ; 0 1 ].
Injectivity and Surjectivity
Definition
A transformation T : Rn → Rm is
one-to-one if each b in Rm is the image of at most one x in Rn ;
onto if each b in Rm is the image of at least one x in Rn .
Theorem
Let T : Rn → Rm be a linear transformation. Then T is one-to-one if and
only if the equation T (x) = 0 has only the trivial solution.
Corollary
Let T : Rn → Rm be given by T (x) = Ax. Then:
a) T is one-to-one if and only if the columns of A are linearly
independent;
b) T is onto if and only if the columns of A span Rm .
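The corollary can be spot-checked numerically via matrix rank: the columns of an m × n matrix A are linearly independent iff rank(A) = n, and they span Rm iff rank(A) = m. A sketch (the example matrices are chosen for illustration):

```python
import numpy as np

def is_one_to_one(A):
    # Columns linearly independent  <=>  rank equals number of columns.
    return np.linalg.matrix_rank(A) == A.shape[1]

def is_onto(A):
    # Columns span R^m  <=>  rank equals number of rows.
    return np.linalg.matrix_rank(A) == A.shape[0]

A = np.array([[1, 0],
              [0, 1],
              [2, 3]])     # 3 x 2: independent columns, cannot span R^3
assert is_one_to_one(A) and not is_onto(A)

B = A.T                    # 2 x 3: spans R^2, but columns are dependent
assert is_onto(B) and not is_one_to_one(B)
```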
Remark
The theorem on the previous slide shows that it is enough to check
injectivity at 0. Every other failure of injectivity will just be a translation
of this one, so it is the solution set of T (x) = 0 that tells us how much
“loss of information” there is when T is applied to Rn .
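The translation argument in the remark can be seen concretely: if A x0 = 0 with x0 nonzero, then u and u + x0 always have the same image, for every u. A numpy sketch with a rank-one matrix of my own choosing:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])       # rank 1, so T(x) = Ax is not one-to-one

x0 = np.array([2.0, -1.0])       # a nontrivial solution of A x = 0
assert np.allclose(A @ x0, 0.0)

# The failure of injectivity at 0 translates to every other point:
u = np.array([3.0, 7.0])
assert np.allclose(A @ u, A @ (u + x0))   # same image, different inputs
```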