Tommy MacWilliam '13
tmacwilliam@college.harvard.edu
May 5, 2010
Contents

1 Linear Equations
    1.1 Standard Representation of a Vector
    1.2 Reduced Row Echelon Form
    1.3 Elementary Row Operations
    1.4 Rank of a Matrix
    1.5 Dot Product of Vectors
    1.6 The Product $A\vec{x}$
    1.7 Algebraic Rules for $A\vec{x}$

2 Linear Transformations
    2.1 Linear Transformations
    2.2 Scaling Matrix
    2.3 Orthogonal Projection onto a Line
    2.4 Reflection Matrix
    2.5 Rotation Matrix
    2.6 Shear Matrix
    2.7 Matrix Multiplication
    2.8 Invertibility
    2.9 Finding the Inverse
    2.10 Properties of Invertible Matrices

3 Subspaces of $\mathbb{R}^n$
    3.1 Image of a Function
    3.2 Span
    3.3 Kernel
    3.4 Subspaces of $\mathbb{R}^n$
    3.5 Linear Independence
    3.6 Dimension
    3.7 Rank-Nullity Theorem
    3.8 Coordinates
    3.9 Linearity of Coordinates
    3.10 The B-Matrix of a Linear Transformation
    3.11 Similar Matrices

4 Linear Spaces
    4.1 Linear Spaces
    4.2 Finding a Basis of a Linear Space V
    4.3 Isomorphisms

5 Orthogonality
    5.1
    5.2 Orthonormal Vectors
    5.3
    5.4 Orthogonal Complement
    5.5 Pythagorean Theorem
    5.6 Cauchy-Schwarz Inequality
    5.7 Angle Between Two Vectors
    5.8 Gram-Schmidt Process
    5.9 QR Decomposition
    5.10 Orthogonal Transformation
    5.11 Transpose
    5.12 Symmetric Matrices
    5.13 Orthogonal Projection onto a Subspace
    5.14 Least-Squares Solution
    5.15 Inner Products
    5.16
    5.17 Trace of a Matrix
    5.18
    5.19 Fourier Analysis

6 Determinants
    6.1 Sarrus's Rule
    6.2 Patterns
    6.3
    6.4 Laplace Expansion
    6.5 Properties of Determinants

7 Eigenvalues and Eigenvectors
    7.1
    7.2 Characteristic Polynomial
    7.3 Algebraic Multiplicity
    7.4 Determinant, Trace, and Eigenvalues
    7.5 Eigenspace
    7.6 Eigenbasis
    7.7 Geometric Multiplicity
    7.8 Diagonalization
    7.9 Powers of a Matrix
    7.10 Stable Equilibrium

9 Differential Equations
    9.1
    9.2
    9.3
    9.4
    9.5
    9.6 Eigenfunctions
    9.7
    9.8
    9.9
    9.10
    9.11
    9.12
    9.13
    9.14
1 Linear Equations

1.1 Standard Representation of a Vector

A vector in $\mathbb{R}^2$ is represented as a column:
\[ \vec{v} = \begin{bmatrix} x \\ y \end{bmatrix} \]

1.2 Reduced Row Echelon Form

A matrix is in reduced row echelon form if:
If a column contains a leading 1, then all the other entries in that column are 0.
If a row contains a leading 1, then each row above it contains a leading 1 further to the left.
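For example, the matrix
\[ \begin{bmatrix} 1 & 2 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix} \]
is in reduced row echelon form: each leading 1 is the only nonzero entry in its column, and the leading 1s move to the right as you move down the rows.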
1.3 Elementary Row Operations

The elementary row operations are: swapping two rows, multiplying a row by a nonzero constant, and adding a multiple of one row to another row. None of these change the solution set of the corresponding system.
1.4 Rank of a Matrix

The rank of a matrix $A$ is the number of leading 1s in $\mathrm{rref}(A)$.
1.5 Dot Product of Vectors

\[ \vec{x} \cdot \vec{y} = x_1 y_1 + x_2 y_2 + \cdots + x_n y_n \]
1.6 The Product $A\vec{x}$

\[ A\vec{x} = \begin{bmatrix} \vec{w}_1 \\ \vdots \\ \vec{w}_n \end{bmatrix} \vec{x} = \begin{bmatrix} \vec{w}_1 \cdot \vec{x} \\ \vdots \\ \vec{w}_n \cdot \vec{x} \end{bmatrix} \]
where $\vec{w}_1, \dots, \vec{w}_n$ are the rows of $A$.
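For example,
\[ \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \begin{bmatrix} 5 \\ 6 \end{bmatrix} = \begin{bmatrix} 1 \cdot 5 + 2 \cdot 6 \\ 3 \cdot 5 + 4 \cdot 6 \end{bmatrix} = \begin{bmatrix} 17 \\ 39 \end{bmatrix} \]
each entry is the dot product of a row of $A$ with $\vec{x}$.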
1.7 Algebraic Rules for $A\vec{x}$

\[ A(\vec{x} + \vec{y}) = A\vec{x} + A\vec{y}, \qquad A(k\vec{x}) = k(A\vec{x}) \]
2 Linear Transformations

2.1 Linear Transformations

A transformation $T$ from $\mathbb{R}^m$ to $\mathbb{R}^n$ is linear if $T(\vec{x}) = A\vec{x}$ for some $n \times m$ matrix $A$; equivalently, $T(\vec{x} + \vec{y}) = T(\vec{x}) + T(\vec{y})$ and $T(k\vec{x}) = kT(\vec{x})$.
2.2 Scaling Matrix

\[ \begin{bmatrix} k & 0 \\ 0 & k \end{bmatrix} \]

2.3 Orthogonal Projection onto a Line

\[ \mathrm{proj}_L(\vec{x}) = \left( \frac{\vec{x} \cdot \vec{w}}{\vec{w} \cdot \vec{w}} \right) \vec{w} \]
where $\vec{w}$ is a vector parallel to the line $L$.
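For example, projecting onto the line $L$ spanned by $\vec{w} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$:
\[ \mathrm{proj}_L \begin{bmatrix} 3 \\ 4 \end{bmatrix} = \frac{3 \cdot 1 + 4 \cdot 0}{1 \cdot 1 + 0 \cdot 0} \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 3 \\ 0 \end{bmatrix} \]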
2.4 Reflection Matrix

\[ \begin{bmatrix} a & b \\ b & -a \end{bmatrix} \]
where $a^2 + b^2 = 1$.

2.5 Rotation Matrix

\[ \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \]

2.6 Shear Matrix

Horizontal shear:
\[ \begin{bmatrix} 1 & k \\ 0 & 1 \end{bmatrix} \]
Vertical shear:
\[ \begin{bmatrix} 1 & 0 \\ k & 1 \end{bmatrix} \]
2.7 Matrix Multiplication

The product $BA$ is defined as the matrix of the linear transformation $T(\vec{x}) = B(A\vec{x}) = (BA)\vec{x}$.
\[ (AB)C = A(BC) \]
\[ A(B + C) = AB + AC \]
\[ (kA)B = A(kB) = k(AB) \]
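Note that matrix multiplication is not commutative in general; for example,
\[ \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}\begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \qquad \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix}\begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix} \]
so $AB \neq BA$.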
2.8 Invertibility

An $n \times n$ matrix $A$ is invertible if there exists a matrix $B$ such that $AB = BA = I_n$; then $B = A^{-1}$.

2.9 Finding the Inverse

To find $A^{-1}$, row-reduce the augmented matrix $[\,A \mid I_n\,]$; if $A$ is invertible, this yields $[\,I_n \mid A^{-1}\,]$.
2.10 Properties of Invertible Matrices

If an $n \times n$ matrix $A$ is invertible:
The linear system $A\vec{x} = \vec{b}$ has a unique solution $\vec{x}$ for all $\vec{b}$ in $\mathbb{R}^n$.
$\mathrm{rref}(A) = I_n$
$\mathrm{rank}(A) = n$
$\mathrm{im}(A) = \mathbb{R}^n$
$\ker(A) = \{\vec{0}\}$
The column vectors of $A$ are linearly independent and form a basis of $\mathbb{R}^n$.
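In the $2 \times 2$ case the inverse has a closed form: $\begin{bmatrix} a & b \\ c & d \end{bmatrix}$ is invertible exactly when $ad - bc \neq 0$, with
\[ \begin{bmatrix} a & b \\ c & d \end{bmatrix}^{-1} = \frac{1}{ad - bc}\begin{bmatrix} d & -b \\ -c & a \end{bmatrix} \]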
3 Subspaces of $\mathbb{R}^n$
3.1 Image of a Function

\[ \mathrm{image}(f) = \{ f(x) : x \in X \} = \{ b \in Y : b = f(x) \text{ for some } x \in X \} \]
3.2 Span

\[ \mathrm{span}(\vec{v}_1, \dots, \vec{v}_m) = \{ c_1\vec{v}_1 + \cdots + c_m\vec{v}_m : c_1, \dots, c_m \in \mathbb{R} \} \]
3.3 Kernel

The kernel of a linear transformation $T(\vec{x}) = A\vec{x}$ is the solution set of the linear system
\[ A\vec{x} = \vec{0} \]
The zero vector is in the kernel of $T$.
If $\vec{v}_1$ and $\vec{v}_2$ are in the kernel of $T$, then so is $\vec{v}_1 + \vec{v}_2$.
If $\vec{v}$ is in the kernel of $T$, then so is $k\vec{v}$.
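For example, for
\[ A = \begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}, \qquad \ker(A) = \left\{ t \begin{bmatrix} -2 \\ 1 \end{bmatrix} : t \in \mathbb{R} \right\} \]
since $A\vec{x} = \vec{0}$ reduces to the single equation $x_1 + 2x_2 = 0$.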
3.4 Subspaces of $\mathbb{R}^n$

A subset $W$ of $\mathbb{R}^n$ is a subspace if it contains the zero vector and is closed under addition and scalar multiplication.
3.5 Linear Independence

1. If a vector $\vec{v}_i$ in $\vec{v}_1, \dots, \vec{v}_n$ can be expressed as a linear combination of the vectors $\vec{v}_1, \dots, \vec{v}_{i-1}$, then $\vec{v}_i$ is redundant.
2. The vectors $\vec{v}_1, \dots, \vec{v}_n$ are linearly independent if no vector is redundant.
3. The vectors $\vec{v}_1, \dots, \vec{v}_n$ form a basis of a subspace $V$ if they span $V$ and are linearly independent.
3.6 Dimension

The dimension of a subspace $V$ is the number of vectors in any basis of $V$.
3.7 Rank-Nullity Theorem

For an $n \times m$ matrix $A$:
\[ \dim(\ker A) + \dim(\mathrm{im}\, A) = m \]
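For example, a $3 \times 5$ matrix $A$ with $\mathrm{rank}(A) = \dim(\mathrm{im}\, A) = 2$ has
\[ \dim(\ker A) = 5 - 2 = 3 \]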
3.8 Coordinates

For a basis $B = (\vec{v}_1, \dots, \vec{v}_n)$ of a subspace $V$, any vector $\vec{x}$ in $V$ can be written as
\[ \vec{x} = c_1\vec{v}_1 + \cdots + c_n\vec{v}_n \]
$c_1, \dots, c_n$ are called the $B$-coordinates of $\vec{x}$, with
\[ [\vec{x}]_B = \begin{bmatrix} c_1 \\ \vdots \\ c_n \end{bmatrix} \]
\[ \vec{x} = S[\vec{x}]_B, \quad \text{where } S = \begin{bmatrix} \vec{v}_1 & \cdots & \vec{v}_n \end{bmatrix} \]
3.9 Linearity of Coordinates

\[ [\vec{x} + \vec{y}]_B = [\vec{x}]_B + [\vec{y}]_B, \qquad [k\vec{x}]_B = k[\vec{x}]_B \]
3.10 The B-Matrix of a Linear Transformation

The matrix $B$ that transforms $[\vec{x}]_B$ into $[T(\vec{x})]_B$ is called the $B$-matrix of $T$:
\[ [T(\vec{x})]_B = B[\vec{x}]_B, \quad \text{or} \quad B = S^{-1}AS \]

3.11 Similar Matrices

Two $n \times n$ matrices $A$ and $B$ are similar if $B = S^{-1}AS$ for some invertible matrix $S$.
4 Linear Spaces

4.1 Linear Spaces
A linear space V is a set with rules for addition and scalar multiplication that satisfies the
following properties:
1. (f + g) + h = f + (g + h)
2. f + g = g + f
3. There exists a neutral element n in V such that f + n = f
4. For each f in V there exists a g in V such that f + g = n
5. k(f + g) = kf + kg
6. (c + k)f = cf + kf
7. c(kf ) = (ck)f
8. 1f = f
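For example, the set of all polynomials, the set of continuous functions on $\mathbb{R}$, and $\mathbb{R}^n$ itself all satisfy these properties under the usual addition and scalar multiplication.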
4.2 Finding a Basis of a Linear Space V
4.3 Isomorphisms

An isomorphism is an invertible linear transformation $T$ from $V$ to $W$; if such a $T$ exists, $V$ and $W$ are isomorphic.
5 Orthogonality

5.1
5.2 Orthonormal Vectors

The vectors $\vec{u}_1, \dots, \vec{u}_m$ are orthonormal if they are unit vectors orthogonal to each other:
\[ \vec{u}_i \cdot \vec{u}_j = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \neq j \end{cases} \]
5.3

5.4 Orthogonal Complement

The orthogonal complement $V^{\perp}$ of a subspace $V$ of $\mathbb{R}^n$ consists of the vectors orthogonal to every vector in $V$:
\[ V^{\perp} = \{ \vec{x} \in \mathbb{R}^n : \vec{x} \cdot \vec{v} = 0 \text{ for all } \vec{v} \in V \} \]
5.5 Pythagorean Theorem

If $\vec{x} \cdot \vec{y} = 0$, then
\[ \|\vec{x} + \vec{y}\|^2 = \|\vec{x}\|^2 + \|\vec{y}\|^2 \]
5.6 Cauchy-Schwarz Inequality

\[ |\vec{x} \cdot \vec{y}| \leq \|\vec{x}\|\, \|\vec{y}\| \]
5.7 Angle Between Two Vectors

\[ \cos\theta = \frac{\vec{x} \cdot \vec{y}}{\|\vec{x}\|\, \|\vec{y}\|} \]

5.8 Gram-Schmidt Process
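In outline (one standard formulation of the process): a basis $\vec{v}_1, \dots, \vec{v}_m$ of $V$ is converted into an orthonormal basis $\vec{u}_1, \dots, \vec{u}_m$ by
\[ \vec{u}_1 = \frac{\vec{v}_1}{\|\vec{v}_1\|}, \qquad \vec{u}_j = \frac{\vec{v}_j - (\vec{u}_1 \cdot \vec{v}_j)\vec{u}_1 - \cdots - (\vec{u}_{j-1} \cdot \vec{v}_j)\vec{u}_{j-1}}{\left\| \vec{v}_j - (\vec{u}_1 \cdot \vec{v}_j)\vec{u}_1 - \cdots - (\vec{u}_{j-1} \cdot \vec{v}_j)\vec{u}_{j-1} \right\|} \]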
5.9 QR Decomposition

An $n \times m$ matrix $A$ with linearly independent columns can be written as $A = QR$, where the columns of $Q$ are the orthonormal vectors produced by Gram-Schmidt and $R$ is upper triangular.

5.10 Orthogonal Transformation

A linear transformation $T$ is orthogonal if it preserves lengths: $\|T(\vec{x})\| = \|\vec{x}\|$ for all $\vec{x}$. An $n \times n$ matrix $A$ is orthogonal if and only if $A^T A = I_n$.
5.11 Transpose

The transpose $A^T$ of an $m \times n$ matrix $A$ is the $n \times m$ matrix whose $ij$th entry is the $ji$th entry of $A$.
\[ (AB)^T = B^T A^T \]
\[ (A^T)^{-1} = (A^{-1})^T \]
\[ \mathrm{rank}(A) = \mathrm{rank}(A^T) \]
5.12 Symmetric Matrices

An $n \times n$ matrix $A$ is symmetric if $A^T = A$.
An $n \times n$ matrix $A$ is skew-symmetric if $A^T = -A$.
5.13 Orthogonal Projection onto a Subspace

The orthogonal projection onto a subspace $V$ with an orthonormal basis $\vec{u}_1, \dots, \vec{u}_m$ is given by the matrix
\[ QQ^T, \quad \text{where } Q = \begin{bmatrix} \vec{u}_1 & \cdots & \vec{u}_m \end{bmatrix} \]
or equivalently, for any matrix $A$ whose columns form a basis of $V$,
\[ A(A^T A)^{-1} A^T \]
5.14 Least-Squares Solution

When the columns of $A$ are linearly independent, the least-squares solution of $A\vec{x} = \vec{b}$ minimizes $\|\vec{b} - A\vec{x}\|$ and is given by
\[ \vec{x}^* = (A^T A)^{-1} A^T \vec{b} \]
5.15 Inner Products

An inner product on a linear space $V$, denoted $\langle f, g \rangle$, has the following properties:
\[ \langle f, g \rangle = \langle g, f \rangle \]
\[ \langle f + h, g \rangle = \langle f, g \rangle + \langle h, g \rangle \]
\[ \langle cf, g \rangle = c\langle f, g \rangle \]
\[ \langle f, f \rangle > 0 \text{ for all nonzero } f \]
5.16

5.17 Trace of a Matrix

The trace of an $n \times n$ matrix $A$ is the sum of its diagonal entries:
\[ \mathrm{tr}(A) = a_{11} + a_{22} + \cdots + a_{nn} \]

5.18
5.19 Fourier Analysis

With respect to the inner product $\langle f, g \rangle = \frac{1}{\pi} \int_{-\pi}^{\pi} f(t)g(t)\, dt$:
\[ f_n(t) = \frac{a_0}{\sqrt{2}} + b_1 \sin(t) + c_1 \cos(t) + \cdots + b_n \sin(nt) + c_n \cos(nt) \]
where
\[ a_0 = \left\langle f, \tfrac{1}{\sqrt{2}} \right\rangle = \frac{1}{\sqrt{2}\,\pi} \int_{-\pi}^{\pi} f(t)\, dt \]
\[ b_k = \langle f, \sin(kt) \rangle = \frac{1}{\pi} \int_{-\pi}^{\pi} f(t) \sin(kt)\, dt \]
\[ c_k = \langle f, \cos(kt) \rangle = \frac{1}{\pi} \int_{-\pi}^{\pi} f(t) \cos(kt)\, dt \]
6 Determinants

6.1 Sarrus's Rule

For a $3 \times 3$ matrix $A$, write the first two columns again to the right of $A$, then multiply along the diagonals to get six products. Add the three downward products and subtract the three upward products to get the determinant.
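Written out:
\[ \det \begin{bmatrix} a & b & c \\ d & e & f \\ g & h & i \end{bmatrix} = aei + bfg + cdh - ceg - afh - bdi \]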
6.2 Patterns

A pattern in an $n \times n$ matrix is a choice of $n$ entries, one from each row and one from each column. The determinant is the sum, over all patterns, of the product of the entries in the pattern multiplied by the sign of the pattern.
6.3

6.4 Laplace Expansion

Expanding along the $i$th row:
\[ \det(A) = \sum_{j=1}^{n} (-1)^{i+j} a_{ij} \det(A_{ij}) \]
Expanding down the $j$th column:
\[ \det(A) = \sum_{i=1}^{n} (-1)^{i+j} a_{ij} \det(A_{ij}) \]
where $A_{ij}$ is the matrix obtained by deleting the $i$th row and $j$th column of $A$.
6.5 Properties of Determinants

\[ \det(A^T) = \det(A) \]
\[ \det(AB) = \det(A)\det(B) \]
If $A$ and $B$ are similar, then $\det(A) = \det(B)$.
\[ \det(A^{-1}) = \frac{1}{\det(A)} = \det(A)^{-1} \]

7 Eigenvalues and Eigenvectors

7.1
7.2 Characteristic Polynomial

The eigenvalues of $A$ are the solutions $\lambda$ of $\det(A - \lambda I_n) = 0$, where
\[ \det(A - \lambda I_n) = (-1)^n \lambda^n + (-1)^{n-1}\, \mathrm{tr}(A)\, \lambda^{n-1} + \cdots + \det(A) \]
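In the $2 \times 2$ case this reduces to
\[ \det(A - \lambda I_2) = \lambda^2 - \mathrm{tr}(A)\,\lambda + \det(A) \]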
7.3 Algebraic Multiplicity

The algebraic multiplicity of an eigenvalue $\lambda_0$ is its multiplicity as a root of the characteristic polynomial.
7.4 Determinant, Trace, and Eigenvalues

For an $n \times n$ matrix $A$ with eigenvalues $\lambda_1, \dots, \lambda_n$ (listed with their algebraic multiplicities):
\[ \det(A) = \lambda_1 \cdots \lambda_n = \prod_{k=1}^{n} \lambda_k \]
\[ \mathrm{tr}(A) = \lambda_1 + \cdots + \lambda_n = \sum_{k=1}^{n} \lambda_k \]
7.5 Eigenspace

\[ E_\lambda = \ker(A - \lambda I_n) = \{ \vec{v} \in \mathbb{R}^n : A\vec{v} = \lambda\vec{v} \} \]
7.6 Eigenbasis

An eigenbasis for an $n \times n$ matrix $A$ consists of eigenvectors of $A$ that form a basis of $\mathbb{R}^n$.
7.7 Geometric Multiplicity

The geometric multiplicity of an eigenvalue $\lambda$ is $\dim(E_\lambda)$.

7.8 Diagonalization

An $n \times n$ matrix $A$ is diagonalizable if and only if it has an eigenbasis $\vec{v}_1, \dots, \vec{v}_n$; then $S^{-1}AS = D$, where $S = [\vec{v}_1 \cdots \vec{v}_n]$ and $D$ is the diagonal matrix of the corresponding eigenvalues.
7.9 Powers of a Matrix

If $A = SDS^{-1}$, then
\[ A^t = SD^tS^{-1} \]
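This makes powers easy to compute, since $D^t$ is obtained by raising each diagonal entry to the $t$th power:
\[ D = \begin{bmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{bmatrix} \implies D^t = \begin{bmatrix} \lambda_1^t & & \\ & \ddots & \\ & & \lambda_n^t \end{bmatrix} \]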
7.10 Stable Equilibrium

The zero state is a stable equilibrium of the discrete dynamical system $\vec{x}(t+1) = A\vec{x}(t)$ if all eigenvalues $\lambda$ of $A$ satisfy $|\lambda| < 1$.
9 Differential Equations

9.1

9.2

The differential equation $\frac{dx}{dt} = kx$ with initial value $x_0$ has the solution
\[ x(t) = e^{kt} x_0 \]
9.3

The linear system
\[ \frac{d\vec{x}}{dt} = A\vec{x} \]
has the solution
\[ \vec{x}(t) = c_1 e^{\lambda_1 t}\vec{v}_1 + \cdots + c_n e^{\lambda_n t}\vec{v}_n \]
where $\vec{v}_1, \dots, \vec{v}_n$ form a real eigenbasis of $A$ with eigenvalues $\lambda_1, \dots, \lambda_n$.
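For example, if $A$ is already diagonal, the standard basis vectors form an eigenbasis:
\[ A = \begin{bmatrix} 2 & 0 \\ 0 & -3 \end{bmatrix} \implies \vec{x}(t) = c_1 e^{2t} \begin{bmatrix} 1 \\ 0 \end{bmatrix} + c_2 e^{-3t} \begin{bmatrix} 0 \\ 1 \end{bmatrix} \]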
9.4

If $A$ is a $2 \times 2$ matrix with complex eigenvalues $p \pm iq$, then
\[ \vec{x}(t) = e^{pt}\, S \begin{bmatrix} \cos(qt) & -\sin(qt) \\ \sin(qt) & \cos(qt) \end{bmatrix} S^{-1} \vec{x}_0 \]
where the columns of $S$ are formed from the real and imaginary parts of an eigenvector of $A$.
9.5
9.6 Eigenfunctions

A nonzero function $f$ is an eigenfunction of a linear transformation $T$ (such as $D(f) = f'$) if $T(f) = \lambda f$ for some scalar $\lambda$.
9.7
9.8
9.9
If the zeros of $p_T(\lambda)$ are $p \pm iq$, then the solutions of the corresponding differential equation are of the form
\[ f(t) = e^{pt}\left( c_1 \cos(qt) + c_2 \sin(qt) \right) \]
9.10

A differential equation of the form $f'(t) - af(t) = g(t)$ has a solution of the form
\[ f(t) = e^{at} \int e^{-at} g(t)\, dt \]
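For example, with $a \neq 0$ and $g(t) = 1$:
\[ f(t) = e^{at} \int e^{-at}\, dt = e^{at}\left( -\tfrac{1}{a} e^{-at} + C \right) = -\tfrac{1}{a} + Ce^{at} \]
and indeed $f'(t) - af(t) = aCe^{at} - \left( -1 + aCe^{at} \right) = 1$.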
9.11
9.12
If $(a, b)$ is an equilibrium point of the system
\[ \frac{dx}{dt} = f(x, y), \qquad \frac{dy}{dt} = g(x, y) \]
such that $f(a, b) = 0$ and $g(a, b) = 0$, then the system is approximated near $(a, b)$ by the Jacobian matrix:
\[ \begin{bmatrix} \dfrac{du}{dt} \\[4pt] \dfrac{dv}{dt} \end{bmatrix} = \begin{bmatrix} \dfrac{\partial f}{\partial x}(a, b) & \dfrac{\partial f}{\partial y}(a, b) \\[4pt] \dfrac{\partial g}{\partial x}(a, b) & \dfrac{\partial g}{\partial y}(a, b) \end{bmatrix} \begin{bmatrix} u \\ v \end{bmatrix} \]
where $u = x - a$ and $v = y - b$.
9.13

The heat equation has solutions of the form
\[ f(x, t) = \sum_{n=1}^{\infty} b_n \sin(nx)\, e^{-n^2 t} \]
where
\[ b_n = \frac{2}{\pi} \int_0^{\pi} f(x, 0) \sin(nx)\, dx \]

9.14

The wave equation has solutions of the form
\[ f(x, t) = \sum_{n=1}^{\infty} \left( a_n \sin(nx) \cos(nct) + \frac{b_n}{nc} \sin(nx) \sin(nct) \right) \]
where
\[ a_n = \frac{2}{\pi} \int_0^{\pi} f(x, 0) \sin(nx)\, dx \]
and the $b_n$ are determined by the initial velocity $f_t(x, 0)$.