John Norstad
j.norstad@mac.com
http://homepage.mac.com/j.norstad
Abstract
$$V \times V^{-1} = V^{-1} \times V = I$$
Proof:
Suppose $Vw = 0$ for some column vector $w \neq 0$:
$$\begin{pmatrix} v_{1,1} & \cdots & v_{1,n} \\ \vdots & \ddots & \vdots \\ v_{n,1} & \cdots & v_{n,n} \end{pmatrix} \begin{pmatrix} w_1 \\ \vdots \\ w_n \end{pmatrix} = \begin{pmatrix} 0 \\ \vdots \\ 0 \end{pmatrix}$$
We first show that we must have $w_i \neq 0$ for some $i \neq b$. Suppose this is not true, and $w_i = 0$ for all $i \neq b$. Row $a$ of the equation becomes:
$$0 = \sum_{j=1}^{n} v_{a,j} w_j = v_{a,b} w_b$$
Since $v_{a,b} \neq 0$, this forces $w_b = 0$ and hence $w = 0$, a contradiction.
Define $\tilde{w}_j = w_j$ for $j \neq b$; by the above, $\tilde{w} \neq 0$. Row $a$ of the equation $Vw = 0$ gives $0 = \sum_{j=1}^{n} v_{a,j} w_j$, so $w_b = -\sum_{j \neq b} \frac{v_{a,j}}{v_{a,b}} \tilde{w}_j$. Row $i$ of $Vw$ is:
$$\sum_{j=1}^{n} v_{i,j} w_j = v_{i,b} w_b + \sum_{\substack{j=1 \\ j \neq b}}^{n} v_{i,j} w_j$$
$$= v_{i,b} \left( -\sum_{\substack{j=1 \\ j \neq b}}^{n} \frac{v_{a,j}}{v_{a,b}} \tilde{w}_j \right) + \sum_{\substack{j=1 \\ j \neq b}}^{n} v_{i,j} \tilde{w}_j$$
$$= \sum_{\substack{j=1 \\ j \neq b}}^{n} \left( v_{i,j} - \frac{v_{i,b} v_{a,j}}{v_{a,b}} \right) \tilde{w}_j$$
$$= \sum_{\substack{j=1 \\ j \neq b}}^{n} \tilde{v}_{i,j} \tilde{w}_j = 0 \qquad \text{for } i \neq a$$
$$= \sum_{\substack{j=1 \\ j \neq b}}^{n} \left( v_{a,j} - \frac{v_{a,b} v_{a,j}}{v_{a,b}} \right) \tilde{w}_j = 0 \qquad \text{for } i = a$$
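As a concrete check of this reduction, here is a small Python sketch; the matrix `V`, its kernel vector `w`, and the pivot `(a, b)` are illustrative choices of mine, not data from the text.

```python
# Numerical check of Lemma 1: if V w = 0 with w != 0 and v[a][b] != 0,
# then the reduced matrix V~ kills the truncated vector w~.

def reduce_matrix(V, a, b):
    """Build the (n-1)x(n-1) matrix V~ with entries
    v~[i][j] = v[i][j] - v[i][b] * v[a][j] / v[a][b], for i != a, j != b."""
    n = len(V)
    rows = [i for i in range(n) if i != a]
    cols = [j for j in range(n) if j != b]
    return [[V[i][j] - V[i][b] * V[a][j] / V[a][b] for j in cols] for i in rows]

def matvec(M, x):
    return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]

# A singular example: rows in arithmetic progression, kernel spanned by w.
V = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
w = [1, -2, 1]                             # V w = 0 and w != 0
assert matvec(V, w) == [0, 0, 0]

a, b = 0, 0                                # pivot with v[a][b] = 1 != 0
Vt = reduce_matrix(V, a, b)
wt = [w[j] for j in range(3) if j != b]    # w~ drops component b
assert any(e != 0 for e in wt)             # w~ != 0, as the lemma requires
assert all(abs(e) < 1e-12 for e in matvec(Vt, wt))   # V~ w~ = 0
```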
Proof:
First suppose $Vw = c$ has a solution for all column vectors $c$ and we are given a vector $\tilde{c}$. Define the column vector $c$ as:
$$c_i = \begin{cases} \tilde{c}_i & \text{if } i \neq a \\ 0 & \text{if } i = a \end{cases}$$
Let $w$ be a solution of $Vw = c$ and define $\tilde{w}_i = w_i$ for $i \neq b$.
2 PROOFS OF THE PROPOSITIONS 4
Row $i$ of $\tilde{V}\tilde{w}$ is:
$$\sum_{\substack{j=1 \\ j \neq b}}^{n} \tilde{v}_{i,j} \tilde{w}_j = \sum_{\substack{j=1 \\ j \neq b}}^{n} \left( v_{i,j} - \frac{v_{i,b} v_{a,j}}{v_{a,b}} \right) \tilde{w}_j$$
$$= \sum_{j=1}^{n} \left( v_{i,j} - \frac{v_{i,b} v_{a,j}}{v_{a,b}} \right) w_j - \left( v_{i,b} - \frac{v_{i,b} v_{a,b}}{v_{a,b}} \right) w_b$$
$$= \sum_{j=1}^{n} \left( v_{i,j} - \frac{v_{i,b} v_{a,j}}{v_{a,b}} \right) w_j - 0$$
$$= \sum_{j=1}^{n} v_{i,j} w_j - \frac{v_{i,b}}{v_{a,b}} \sum_{j=1}^{n} v_{a,j} w_j$$
$$= c_i - \frac{v_{i,b}}{v_{a,b}} c_a$$
$$= c_i$$
$$= \tilde{c}_i$$
So Ṽ w̃ = c̃ has a solution for all column vectors c̃ and the first half of our proof
is complete.
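As a numerical illustration of this first half, the following Python sketch checks that when $Vw = c$ with $c_a = 0$, the truncated vector solves the reduced system; the matrix, vector, and pivot are example data of mine.

```python
# Sketch of the first half of Lemma 2: if V w = c with c_a = 0, then
# w~ (w with component b dropped) solves V~ w~ = c~ (c with row a dropped).

def reduce_matrix(V, a, b):
    n = len(V)
    return [[V[i][j] - V[i][b] * V[a][j] / V[a][b]
             for j in range(n) if j != b]
            for i in range(n) if i != a]

def matvec(M, x):
    return [sum(m * xj for m, xj in zip(row, x)) for row in M]

V = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]
a, b = 0, 0
w = [1, -2, 3]                     # chosen so that row a of V w is 0
c = matvec(V, w)
assert c[a] == 0                   # c_a = 0, as in the construction

ct = [c[i] for i in range(3) if i != a]     # c~ keeps rows i != a
wt = [w[j] for j in range(3) if j != b]     # w~ drops component b
Vt = reduce_matrix(V, a, b)
res = matvec(Vt, wt)
assert all(abs(r - e) < 1e-12 for r, e in zip(res, ct))   # V~ w~ = c~
```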
For the other direction suppose $\tilde{V}\tilde{w} = \tilde{c}$ has a solution for all column vectors $\tilde{c}$ and we are given a column vector $c$. Define:
$$\tilde{c}_i = c_i - \frac{c_a v_{i,b}}{v_{a,b}} \qquad \text{for } i \neq a$$
Let $\tilde{w}$ be a solution of $\tilde{V}\tilde{w} = \tilde{c}$, and define $w_j = \tilde{w}_j$ for $j \neq b$ and:
$$w_b = \frac{c_a}{v_{a,b}} - \sum_{\substack{j=1 \\ j \neq b}}^{n} \frac{v_{a,j}}{v_{a,b}} \tilde{w}_j$$
Row $i$ of $Vw$ is:
$$\sum_{j=1}^{n} v_{i,j} w_j = v_{i,b} w_b + \sum_{\substack{j=1 \\ j \neq b}}^{n} v_{i,j} w_j$$
$$= v_{i,b} \left( \frac{c_a}{v_{a,b}} - \sum_{\substack{j=1 \\ j \neq b}}^{n} \frac{v_{a,j}}{v_{a,b}} \tilde{w}_j \right) + \sum_{\substack{j=1 \\ j \neq b}}^{n} v_{i,j} w_j$$
$$= \frac{c_a v_{i,b}}{v_{a,b}} + \sum_{\substack{j=1 \\ j \neq b}}^{n} \left( v_{i,j} - \frac{v_{i,b} v_{a,j}}{v_{a,b}} \right) w_j$$
$$= \frac{c_a v_{i,b}}{v_{a,b}} + \sum_{\substack{j=1 \\ j \neq b}}^{n} \tilde{v}_{i,j} w_j = \frac{c_a v_{i,b}}{v_{a,b}} + \tilde{c}_i = c_i \qquad \text{for } i \neq a$$
$$= \frac{c_a v_{a,b}}{v_{a,b}} + \sum_{\substack{j=1 \\ j \neq b}}^{n} \left( v_{a,j} - \frac{v_{a,b} v_{a,j}}{v_{a,b}} \right) w_j = c_a \qquad \text{for } i = a$$
So V w = c has a solution for all column vectors c and our proof is complete.
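This direction of the proof is constructive: given a solution of the reduced system, it assembles a solution of the full one. A Python sketch of that assembly follows; the matrix `V`, right-hand side `c`, and pivot are example data, and the 2×2 reduced system is solved directly by Cramer's rule for simplicity.

```python
# Sketch of the second half of Lemma 2: from a solution w~ of V~ w~ = c~,
# assemble a solution w of V w = c.

def matvec(M, x):
    return [sum(m * xj for m, xj in zip(row, x)) for row in M]

V = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]
n, a, b = 3, 0, 0                    # pivot v[a][b] = 2 != 0
c = [4.0, 1.0, 2.0]                  # target right-hand side

rows = [i for i in range(n) if i != a]
cols = [j for j in range(n) if j != b]
Vt = [[V[i][j] - V[i][b] * V[a][j] / V[a][b] for j in cols] for i in rows]
ct = [c[i] - c[a] * V[i][b] / V[a][b] for i in rows]

# Solve the 2x2 reduced system V~ w~ = c~ by Cramer's rule.
det = Vt[0][0] * Vt[1][1] - Vt[0][1] * Vt[1][0]
wt = [(ct[0] * Vt[1][1] - Vt[0][1] * ct[1]) / det,
      (Vt[0][0] * ct[1] - ct[0] * Vt[1][0]) / det]

# Assemble w: w_j = w~_j for j != b, and w_b from row a as in the proof.
w = dict(zip(cols, wt))
w[b] = (c[a] - sum(V[a][j] * w[j] for j in cols)) / V[a][b]
w = [w[j] for j in range(n)]
assert all(abs(r - e) < 1e-9 for r, e in zip(matvec(V, w), c))   # V w = c
```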
Proof:
Our proof is by induction on the matrix dimension n.
The case $n = 1$ is trivial: $V$ is the matrix $(v_{1,1})$, and the two conditions hold iff $v_{1,1} = 0$.
Suppose the theorem is true for all square matrices with dimension n − 1 and
let V be a square matrix of dimension n.
If all elements of $V$ are 0 the theorem is trivial because both conditions are clearly true. So we may assume that $V$ has an element $v_{a,b} \neq 0$. Define the $(n-1) \times (n-1)$ matrix $\tilde{V}$ as above. Then:
$Vw = 0$ for some $w \neq 0$
iff $\tilde{V}\tilde{w} = 0$ for some $\tilde{w} \neq 0$ (by Lemma 1)
iff $\tilde{V}\tilde{w} = \tilde{c}$ has no solution for some $\tilde{c}$ (by the induction hypothesis)
iff $Vw = c$ has no solution for some $c$ (by Lemma 2)
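The induction also yields a (deliberately naive) recursive singularity test: reduce $V$ to $\tilde{V}$ and recurse down to the $1 \times 1$ case. A Python sketch, with example matrices of my own choosing; the exact zero comparisons are only safe for small integer-derived examples like these, and real floating-point use would need a tolerance.

```python
def is_singular(V):
    """Recursive singularity test following the induction in the proof
    (a sketch of the argument, not an efficient or stable algorithm)."""
    n = len(V)
    if n == 1:
        return V[0][0] == 0
    # Find a pivot v[a][b] != 0; if none, V is the zero matrix, hence singular.
    pivot = next(((i, j) for i in range(n) for j in range(n) if V[i][j] != 0),
                 None)
    if pivot is None:
        return True
    a, b = pivot
    Vt = [[V[i][j] - V[i][b] * V[a][j] / V[a][b]
           for j in range(n) if j != b]
          for i in range(n) if i != a]
    return is_singular(Vt)       # V is singular iff V~ is (Lemmas 1 and 2)

assert is_singular([[1, 2, 3], [4, 5, 6], [7, 8, 9]])      # rank 2
assert not is_singular([[2, 1, 0], [1, 3, 1], [0, 1, 2]])
assert is_singular([[0, 0], [0, 0]])                       # zero matrix
```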
Proof:
Suppose $V$ has an inverse $V^{-1}$ and let $c$ be any column vector. Define $w = V^{-1}c$. Then $Vw = V(V^{-1}c) = (VV^{-1})c = Ic = c$. Thus $Vw = c$ has a solution for all $c$, and $V$ is non-singular by definition.
Now suppose $V$ is non-singular. For each $i$ let $c^i$ be the column vector with values:
$$c^i_a = \begin{cases} 1 & \text{if } a = i \\ 0 & \text{if } a \neq i \end{cases}$$
Because $V$ is non-singular, $Vw^i = c^i$ has a solution $w^i$ for each $i$. Let $V^{-1}$ be the matrix defined by setting column $i$ of $V^{-1}$ to $w^i$. Then:
$$V \times V^{-1} = V \times (w^1 \cdots w^n) = (c^1 \cdots c^n) = I$$
It remains to show that $V^{-1} \times V = I$. Multiplying both sides of $V \times V^{-1} = I$ on the left by $V^{-1}$ and on the right by $V$:
$$(V^{-1} \times V) \times (V^{-1} \times V) = V^{-1} \times (V \times V^{-1}) \times V = V^{-1} \times I \times V = V^{-1} \times V$$
Hence $(V^{-1} \times V) \times (V^{-1} \times V - I) = 0$. Since $V \times (V^{-1} \times V) = (V \times V^{-1}) \times V = V$, multiplying on the left by $V$ gives $V \times (V^{-1} \times V - I) = 0$. Each column $x$ of $V^{-1} \times V - I$ therefore satisfies $Vx = 0$, and because $V$ is non-singular this forces $x = 0$. Thus:
$$I = V^{-1} \times V$$
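Taken together, the proofs are constructive: Lemma 2's reduction solves $Vw = c$ recursively, and the columns of $V^{-1}$ come from solving $Vw^i = c^i$. The following Python sketch follows that recipe; the example matrix is my own, and no pivoting for numerical stability is attempted.

```python
def solve(V, c):
    """Solve V w = c by the reduction of Lemma 2 (a sketch; assumes V is
    non-singular, so a nonzero pivot always exists)."""
    n = len(V)
    if n == 1:
        return [c[0] / V[0][0]]
    a, b = next((i, j) for i in range(n) for j in range(n) if V[i][j] != 0)
    rows = [i for i in range(n) if i != a]
    cols = [j for j in range(n) if j != b]
    Vt = [[V[i][j] - V[i][b] * V[a][j] / V[a][b] for j in cols] for i in rows]
    ct = [c[i] - c[a] * V[i][b] / V[a][b] for i in rows]
    wt = solve(Vt, ct)                       # recurse on the reduced system
    w = dict(zip(cols, wt))
    w[b] = (c[a] - sum(V[a][j] * w[j] for j in cols)) / V[a][b]
    return [w[j] for j in range(n)]

def inverse(V):
    """Column i of V^-1 solves V w = c^i, as in the proof above."""
    n = len(V)
    cols = [solve(V, [1.0 if k == i else 0.0 for k in range(n)])
            for i in range(n)]
    return [[cols[j][i] for j in range(n)] for i in range(n)]

V = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]
Vinv = inverse(V)
n = len(V)
prod1 = [[sum(V[i][k] * Vinv[k][j] for k in range(n)) for j in range(n)]
         for i in range(n)]
prod2 = [[sum(Vinv[i][k] * V[k][j] for k in range(n)) for j in range(n)]
         for i in range(n)]
ident = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
assert all(abs(prod1[i][j] - ident[i][j]) < 1e-9 for i in range(n) for j in range(n))
assert all(abs(prod2[i][j] - ident[i][j]) < 1e-9 for i in range(n) for j in range(n))
```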