
Matrices

Matrix (matrices)

DEFINITION

A general matrix with m rows and n columns:

            Column 1  Column 2  Column 3  Column 4  ...
  Row 1   [   a11       a12       a13       a14     ... ]
  Row 2   [   a21       a22       a23       a24     ... ]
  Row 3   [   a31       a32       a33       a34     ... ]
  ...
  Row m   [   am1       am2       am3       am4     ... ]

A matrix of m rows and n columns is called a matrix with dimensions m x n.
Example: Find the dimensions.

1.)  [ 3 8 9 ]
     [ 2 3 4 ]        2 x 3

2.)  [ 2 5 … ]
     [ 1 1 … ]
     [ 6 7 8 ]        3 x 3

3.)  [ 1 ]
     [ 0 ]            2 x 1

4.)  [ 3 4 ]          1 x 2

PRACTICE: Find the dimensions.

1.)  [ 3 5 ]
     [ 1 4 ]
     [ 4 0 ]          3 x 2

2.)  [ 3 0 ]
     [ 0 3 ]          2 x 2

3.)  [ 1 2 3 ]
     [ 0 1 8 ]
     [ 0 0 1 ]        3 x 3

4.)  [ … … ]          1 x 2

5.)  [ 5 ]
     [ … ]            2 x 1

6.)  [ 3 ]            1 x 1
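The dimension rule above is easy to check in code. A minimal pure-Python sketch, storing a matrix as a list of rows (the helper name `dimensions` is mine, not from the notes):

```python
# Minimal sketch: a matrix stored as a list of rows.
def dimensions(matrix):
    # dimensions = (number of rows, number of columns)
    return (len(matrix), len(matrix[0]))

print(dimensions([[3, 8, 9], [2, 3, 4]]))  # (2, 3), i.e. 2 x 3
print(dimensions([[3, 4]]))                # (1, 2), i.e. 1 x 2
print(dimensions([[3]]))                   # (1, 1), i.e. 1 x 1
```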

ADDITION and SUBTRACTION of MATRICES

To add matrices, we add the corresponding elements. The matrices must have the same dimensions.

1.)  A = [ 5 0 ]      B = [ 1 3 ]
         [ 4 1 ]          [ 2 3 ]

     A + B = [ 5+1  0+3 ]   [ 6 3 ]
             [ 4+2  1+3 ] = [ 6 4 ]

2.)  [ 2 1 3 ]   [ 0 0 0 ]   [ 2 1 3 ]
     [ 1 0 1 ] + [ 0 0 0 ] = [ 1 0 1 ]

When a zero matrix is added to another matrix of the same dimensions, that same matrix is obtained.

To subtract matrices, we subtract the corresponding elements. The matrices must have the same dimensions.

3.)  [ 2 ]   [ -1 ]   [ 2-(-1) ]   [  3 ]
     [ 0 ] - [  3 ] = [ 0-3    ] = [ -3 ]

PRACTICE PROBLEMS:

1.)  [ 4 1 ]   [ -6  5 ]   [ -2 6 ]
     [ 6 3 ] + [  7 -3 ] = [ 13 0 ]

2.)  [ 2 1 5 ]   [ 1 4 7 ]   [ 1 -3 -2 ]
     [ 6 4 3 ] - [ 2 4 8 ] = [ 4  0 -5 ]
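Elementwise addition and subtraction can be sketched directly in pure Python; the dimension check mirrors the rule stated above (helper names `mat_add`/`mat_sub` are mine, and the operand values follow the practice problems as reconstructed here):

```python
def mat_add(A, B):
    # Addition is elementwise and requires equal dimensions.
    if len(A) != len(B) or len(A[0]) != len(B[0]):
        raise ValueError("matrices must have the same dimensions")
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_sub(A, B):
    if len(A) != len(B) or len(A[0]) != len(B[0]):
        raise ValueError("matrices must have the same dimensions")
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

print(mat_add([[4, 1], [6, 3]], [[-6, 5], [7, -3]]))            # [[-2, 6], [13, 0]]
print(mat_sub([[2, 1, 5], [6, 4, 3]], [[1, 4, 7], [2, 4, 8]]))  # [[1, -3, -2], [4, 0, -5]]
```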

ADDITIVE INVERSE OF A MATRIX:

A = [ 1 -1  5 ]        -A = [ -1  1 -5 ]
    [ 3  0 -2 ]             [ -3  0  2 ]

Find the additive inverse:

[ 2 1 5 ]        -A = [ -2 -1 -5 ]
[ 6 4 3 ]             [ -6 -4 -3 ]

Scalar Multiplication:

k [ 1 2 3 ]   [ 1k 2k 3k ]
  [ 4 5 6 ] = [ 4k 5k 6k ]

We multiply each element of the matrix by the scalar k.

Examples:

1.)  3 [ 3 0 ]   [  9  0 ]
       [ 4 5 ] = [ 12 15 ]

2.)  5 [ 1 2 x ]   [  5 10 5x ]
       [ 4 y 1 ] = [ 20 5y  5 ]
       [ 0 5 x ]   [  0 25 5x ]
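Scalar multiplication is a one-liner in pure Python; note that k = -1 reproduces the additive inverse from the previous section (the helper name `scalar_mul` is mine, not from the notes):

```python
def scalar_mul(k, A):
    # Multiply every element of A by the scalar k.
    return [[k * a for a in row] for row in A]

print(scalar_mul(3, [[3, 0], [4, 5]]))          # [[9, 0], [12, 15]]
# k = -1 gives the additive inverse -A:
print(scalar_mul(-1, [[2, 1, 5], [6, 4, 3]]))   # [[-2, -1, -5], [-6, -4, -3]]
```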

Properties of Matrix Operations

Let A, B, and C be matrices with the same dimensions.

Associative Property of Addition
(A+B)+C = A+(B+C)
Commutative Property of Addition
A+B = B+A
Distributive Property over Addition and Subtraction (s a scalar)
s(A+B) = sA+sB
s(A-B) = sA-sB
NOTE: Matrix multiplication is not included!!!
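The three properties can be spot-checked on sample matrices; a minimal sketch with illustrative values of my own choosing:

```python
def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def scalar_mul(s, A):
    return [[s * a for a in row] for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 0], [1, 1]]
C = [[2, 2], [2, 2]]
s = 3

# Associative: (A+B)+C == A+(B+C)
assert mat_add(mat_add(A, B), C) == mat_add(A, mat_add(B, C))
# Commutative: A+B == B+A
assert mat_add(A, B) == mat_add(B, A)
# Distributive: s(A+B) == sA + sB
assert scalar_mul(s, mat_add(A, B)) == mat_add(scalar_mul(s, A), scalar_mul(s, B))
print("all three properties hold on the sample matrices")
```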

Elementary row and column operations

The following operations, applied to the augmented matrix [A|b], yield an equivalent linear system:
Interchanges: the order of two rows/columns can be changed.
Scaling: a row/column can be multiplied by a nonzero constant.
Sum: a row can be replaced by the sum of that row and a nonzero multiple of any other row.
One can use ERO and ECO to find the Rank as follows:
ERO: minimum number of rows with at least one nonzero entry, or
ECO: minimum number of columns with at least one nonzero entry.
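The ERO recipe for rank can be sketched in pure Python with exact `Fraction` arithmetic: eliminate below each pivot, then count the rows that keep a nonzero entry (the helper name `rank` is mine, not from the notes):

```python
from fractions import Fraction

def rank(M):
    # Gaussian elimination via EROs; the rank is the number of rows
    # left with at least one nonzero entry.
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0  # current pivot row
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]          # interchange
        for i in range(r + 1, rows):             # sum: eliminate below the pivot
            f = M[i][c] / M[r][c]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

print(rank([[1, 2], [2, 4]]))        # 1 (second row is a multiple of the first)
print(rank([[1, 2, 4], [2, 4, 5]]))  # 2
```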

Linear Systems in Matrix Form

(Math for CS, Lecture 2)

a11 x1 + a12 x2 + ... + a1n xn = b1
a21 x1 + a22 x2 + ... + a2n xn = b2
...
an1 x1 + an2 x2 + ... + ann xn = bn

In matrix form:

[ a11 a12 ... a1n ] [ x1 ]   [ b1 ]
[ a21 a22 ... a2n ] [ x2 ] = [ b2 ]     (1)
[ ... ... ... ... ] [ .. ]   [ .. ]
[ an1 an2 ... ann ] [ xn ]   [ bn ]

Solution of Linear Systems

Each side of the equation

A x = b     (2)

can be multiplied by A^-1:

A^-1 A x = A^-1 b

Due to the definition of A^-1:

A^-1 A x = I x = x

Therefore the solution of (2) is:

x = A^-1 b
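For a 2 x 2 system, x = A^-1 b can be carried out with the closed-form inverse; a sketch with exact `Fraction` arithmetic (helper names `inverse_2x2`/`mat_vec` are mine, not from the notes):

```python
from fractions import Fraction

def inverse_2x2(A):
    # For A = [[a, b], [c, d]]: A^-1 = (1/det) [[d, -b], [-c, a]],
    # which exists only when det != 0.
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("A is singular; A^-1 does not exist")
    f = Fraction(1, det)
    return [[d * f, -b * f], [-c * f, a * f]]

def mat_vec(A, x):
    return [sum(a * v for a, v in zip(row, x)) for row in A]

A = [[1, 2], [1, 1]]
b = [4, 2]
x = mat_vec(inverse_2x2(A), b)   # x = A^-1 b
assert x == [0, 2] and mat_vec(A, x) == b
print("x1 =", x[0], " x2 =", x[1])
```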

Consistency (Solvability)
A^-1 does not exist for every A.
The linear system of equations Ax = b has a solution, or is said to be consistent, if
Rank{A} = Rank{A|b}
A system is inconsistent when
Rank{A} < Rank{A|b}
Rank{A} is the maximum number of linearly independent columns or rows of A. Rank can be found by using ERO (Elementary Row Operations) or ECO (Elementary Column Operations).

An inconsistent example:
Geometric interpretation: the two equations describe parallel lines, which never intersect.

[ 1 2 ] [ x1 ]   [ 4 ]
[ 2 4 ] [ x2 ] = [ 5 ]

ERO: Multiply the first row with -2 and add to the second row:

A -> [ 1 2 ]        [A|b] -> [ 1 2  4 ]
     [ 0 0 ]                 [ 0 0 -3 ]

Rank{A} = 1
Rank{A|b} = 2 > Rank{A}

Uniqueness of solutions
The system has a unique solution if
Rank{A} = Rank{A|b} = n,
where n is the order of the system.

If Rank{A} = n:
Det{A} ≠ 0  ->  A^-1 exists  ->  unique solution

[ 1 2 ] [ x1 ]   [ 4 ]
[ 1 1 ] [ x2 ] = [ 2 ]

If Rank{A} = m < n:
Det{A} = 0  ->  A is singular, so not invertible
->  infinite number of solutions (n-m free variables)
->  under-determined system

[ 1 2 ] [ x1 ]   [ 4 ]
[ 2 4 ] [ x2 ] = [ 8 ]

Rank{A} = Rank{A|b} = 1
Consistent, so solvable.
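The three cases (inconsistent, unique, infinitely many) can be told apart by comparing Rank{A}, Rank{A|b}, and n, exactly as above; a self-contained sketch (helper names `rank`/`classify` are mine, not from the notes):

```python
from fractions import Fraction

def rank(M):
    # Row reduction; rank = number of nonzero rows afterwards.
    M = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        p = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if p is None:
            continue
        M[r], M[p] = M[p], M[r]
        for i in range(r + 1, len(M)):
            f = M[i][c] / M[r][c]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def classify(A, b):
    n = len(A[0])                                           # number of unknowns
    rA = rank(A)
    rAb = rank([row + [bi] for row, bi in zip(A, b)])       # rank of [A|b]
    if rA < rAb:
        return "inconsistent"
    return "unique solution" if rA == n else "infinitely many solutions"

print(classify([[1, 2], [2, 4]], [4, 5]))  # inconsistent
print(classify([[1, 2], [1, 1]], [4, 2]))  # unique solution
print(classify([[1, 2], [2, 4]], [4, 8]))  # infinitely many solutions
```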

Eigenvalues and Eigenvectors

A nonzero vector x is an eigenvector (or characteristic vector) of a square matrix A if there exists a scalar λ such that Ax = λx. Then λ is an eigenvalue (or characteristic value) of A.
Note: The zero vector cannot be an eigenvector, even though A0 = 0. But λ = 0 can be an eigenvalue.

Example:

Show that x = [  2 ]  is an eigenvector for  A = [ 2 4 ]
              [ -1 ]                             [ 3 6 ]

Solution:  Ax = [ 2 4 ] [  2 ]   [ 0 ]
                [ 3 6 ] [ -1 ] = [ 0 ]

But for λ = 0,  λx = 0 [  2 ]   [ 0 ]
                       [ -1 ] = [ 0 ]

Thus, x is an eigenvector of A, and λ = 0 is an eigenvalue.
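The verification Ax = λx is just a matrix-vector product; a minimal sketch for the example above (the helper name `mat_vec` is mine, not from the notes):

```python
def mat_vec(A, x):
    return [sum(a * v for a, v in zip(row, x)) for row in A]

A = [[2, 4], [3, 6]]
x = [2, -1]
lam = 0

# Ax should equal lambda * x:
assert mat_vec(A, x) == [lam * v for v in x] == [0, 0]
print("Ax =", mat_vec(A, x), "= 0*x, so x is an eigenvector with eigenvalue 0")
```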

Eigenvalues

Let x be an eigenvector of the matrix A. Then there must exist an eigenvalue λ such that
Ax = λx or, equivalently,
Ax - λx = 0
or
(A - λI)x = 0
If we define a new matrix B = A - λI, then
Bx = 0
If B has an inverse, then x = B^-1 0 = 0. But an eigenvector cannot be zero.
Thus, it follows that x will be an eigenvector of A if and only if B does not have an inverse, or equivalently det(B) = 0, or
det(A - λI) = 0
This is called the characteristic equation of A. Its roots determine the eigenvalues of A.

Eigenvalues: examples

Example 1: Find the eigenvalues of

A = [  2  12 ]
    [ -1  -5 ]

|λI - A| = | λ-2   -12 |
           |  1    λ+5 | = (λ-2)(λ+5) + 12 = λ² + 3λ + 2 = (λ+1)(λ+2)

two eigenvalues: λ = -1, λ = -2

Note: The roots of the characteristic equation can be repeated. That is, λ1 = λ2 = ... = λk. If that happens, the eigenvalue is said to be of multiplicity k.

Example 2: Find the eigenvalues of

A = [ 2 1 0 ]
    [ 0 2 0 ]
    [ 0 0 2 ]

|λI - A| = (λ-2)³ = 0  ->  λ = 2 is an eigenvalue of multiplicity 3.
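For a 2 x 2 matrix the characteristic equation is the quadratic λ² - tr(A)λ + det(A) = 0, so the eigenvalues follow from the quadratic formula; a sketch for Example 1 (the helper name `eigenvalues_2x2` is mine, and it assumes real roots):

```python
import math

def eigenvalues_2x2(A):
    # det(lam*I - A) = lam^2 - (a+d)*lam + (a*d - b*c) = 0,
    # solved with the quadratic formula (assumes real eigenvalues).
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)
    return sorted([(tr - disc) / 2, (tr + disc) / 2])

print(eigenvalues_2x2([[2, 12], [-1, -5]]))  # [-2.0, -1.0]
```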

Eigenvectors

To each distinct eigenvalue of a matrix A there will correspond at least one eigenvector, which can be found by solving the appropriate set of homogeneous equations. If λi is an eigenvalue, then the corresponding eigenvector xi is the solution of

(A - λi I) xi = 0

Example 1 (cont.):

λ1 = -1:   (-1)I - A = [ -3 -12 ]    [ 1 4 ]
                       [  1   4 ] ~  [ 0 0 ]

x1 + 4x2 = 0  ->  x1 = -4t, x2 = t

[ x1 ] = t [ -4 ],  t ≠ 0
[ x2 ]     [  1 ]

λ2 = -2:   (-2)I - A = [ -4 -12 ]    [ 1 3 ]
                       [  1   3 ] ~  [ 0 0 ]

[ x1 ] = s [ -3 ],  s ≠ 0
[ x2 ]     [  1 ]
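In the 2 x 2 case, (λI - A) is singular at an eigenvalue, so any nonzero row [p, q] of it determines the eigenvector up to scale; a sketch (the helper name `eigenvector_2x2` is mine, not from the notes):

```python
def eigenvector_2x2(A, lam):
    # At an eigenvalue, (lam*I - A) is singular: its rows are proportional,
    # and any nonzero row [p, q] forces p*x + q*y = 0, i.e. (x, y) = (-q, p).
    (a, b), (c, d) = A
    row = [lam - a, -b]
    if row == [0, 0]:
        row = [-c, lam - d]
    return [-row[1], row[0]]

A = [[2, 12], [-1, -5]]
v1 = eigenvector_2x2(A, -1)
v2 = eigenvector_2x2(A, -2)
print(v1)  # [12, -3], proportional to (-4, 1)
print(v2)  # [12, -4], proportional to (-3, 1)
```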

Eigenvectors

Example 2 (cont.): Find the eigenvectors of

A = [ 2 1 0 ]
    [ 0 2 0 ]
    [ 0 0 2 ]

Recall that λ = 2 is an eigenvalue of multiplicity 3.
Solve the homogeneous linear system represented by (2I - A)x = 0:

[ 0 -1 0 ] [ x1 ]   [ 0 ]
[ 0  0 0 ] [ x2 ] = [ 0 ]
[ 0  0 0 ] [ x3 ]   [ 0 ]

This forces x2 = 0. Let x1 = s, x3 = t. The eigenvectors of λ = 2 are of the form

[ x1 ]   [ s ]     [ 1 ]     [ 0 ]
[ x2 ] = [ 0 ] = s [ 0 ] + t [ 0 ],   s and t not both zero.
[ x3 ]   [ t ]     [ 0 ]     [ 1 ]

Properties of Eigenvalues and Eigenvectors

Definition: The trace of a matrix A, designated by tr(A), is the sum of the elements on the main diagonal.
Property 1: The sum of the eigenvalues of a matrix equals the trace of the matrix.
Property 2: A matrix is singular if and only if it has a zero eigenvalue.
Property 3: The eigenvalues of an upper (or lower) triangular matrix are the elements on the main diagonal.
Property 4: If λ is an eigenvalue of A and A is invertible, then 1/λ is an eigenvalue of the matrix A^-1.

Properties of Eigenvalues and Eigenvectors

Property 5: If λ is an eigenvalue of A, then kλ is an eigenvalue of kA, where k is any arbitrary scalar.
Property 6: If λ is an eigenvalue of A, then λ^k is an eigenvalue of A^k for any positive integer k.
Property 8: If λ is an eigenvalue of A, then λ is an eigenvalue of A^T.
Property 9: The product of the eigenvalues (counting multiplicity) of a matrix equals the determinant of the matrix.
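Properties 1 and 9 are easy to confirm on Example 1's matrix, whose eigenvalues -1 and -2 were computed earlier in these notes:

```python
A = [[2, 12], [-1, -5]]
eigs = [-1, -2]                       # from Example 1 above

trace = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]

assert sum(eigs) == trace             # Property 1: sum of eigenvalues = trace
assert eigs[0] * eigs[1] == det       # Property 9: product of eigenvalues = det
print("trace =", trace, " det =", det)
```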

Linearly independent eigenvectors

Theorem: Eigenvectors corresponding to distinct (that is, different) eigenvalues are linearly independent.
Theorem: If λ is an eigenvalue of multiplicity k of an n x n matrix A, then the number of linearly independent eigenvectors of A associated with λ is given by m = n - r(A - λI). Furthermore, 1 ≤ m ≤ k.

Example 2 (cont.): The eigenvectors of λ = 2 are of the form

[ x1 ]   [ s ]     [ 1 ]     [ 0 ]
[ x2 ] = [ 0 ] = s [ 0 ] + t [ 0 ],   s and t not both zero.
[ x3 ]   [ t ]     [ 0 ]     [ 1 ]

λ = 2 has two linearly independent eigenvectors.

LINEAR INDEPENDENCE
Definition: A set of vectors {v1, ..., vp} in Rⁿ is said to be linearly independent if the vector equation
x1 v1 + x2 v2 + ... + xp vp = 0
has only the trivial solution. The set {v1, ..., vp} is said to be linearly dependent if there exist weights c1, ..., cp, not all zero, such that

c1 v1 + c2 v2 + ... + cp vp = 0     (1)

Equation (1) is called a linear dependence relation among v1, ..., vp when the weights are not all zero. A set is linearly dependent if and only if it is not linearly independent.
Example 1: Let

v1 = [ 1 ],  v2 = [ 4 ],  v3 = [ 2 ].
     [ 2 ]        [ 5 ]        [ 1 ]
     [ 3 ]        [ 6 ]        [ 0 ]

a. Determine if the set {v1, v2, v3} is linearly independent.
b. If possible, find a linear dependence relation among v1, v2, and v3.

Solution: We must determine if there is a nontrivial solution of the following equation.

x1 [ 1 ] + x2 [ 4 ] + x3 [ 2 ]   [ 0 ]
   [ 2 ]      [ 5 ]      [ 1 ] = [ 0 ]
   [ 3 ]      [ 6 ]      [ 0 ]   [ 0 ]

Row operations on the associated augmented matrix show that

[ 1 4 2 0 ]    [ 1  4  2 0 ]
[ 2 5 1 0 ] ~  [ 0 -3 -3 0 ].
[ 3 6 0 0 ]    [ 0  0  0 0 ]

x1 and x2 are basic variables, and x3 is free.
Each nonzero value of x3 determines a nontrivial solution of (1).
Hence, v1, v2, v3 are linearly dependent.

b. To find a linear dependence relation among v1, v2, and v3, row reduce the augmented matrix and write the new system:

[ 1 0 -2 0 ]      x1 - 2x3 = 0
[ 0 1  1 0 ]      x2 + x3  = 0
[ 0 0  0 0 ]      0 = 0

Thus, x1 = 2x3, x2 = -x3, and x3 is free.
Choose any nonzero value for x3, say x3 = 5. Then x1 = 10 and x2 = -5.
This gives the linear dependence relation 10 v1 - 5 v2 + 5 v3 = 0.

CAYLEY-HAMILTON THEOREM

Every square matrix satisfies its own characteristic equation.
Let A = [aij] (n x n) be a square matrix:

A = [ a11 a12 ... a1n ]
    [ a21 a22 ... a2n ]
    [ ... ... ... ... ]
    [ an1 an2 ... ann ]

Let the characteristic polynomial of A be φ(λ). Then

φ(λ) = |A - λI| = | a11-λ   a12   ...   a1n  |
                  |  a21   a22-λ  ...   a2n  |
                  |  ...    ...   ...   ...  |
                  |  an1    an2   ...  ann-λ |

The characteristic equation is

|A - λI| = 0

p0 λ^n + p1 λ^(n-1) + p2 λ^(n-2) + ... + pn = 0

We are to prove that

p0 A^n + p1 A^(n-1) + p2 A^(n-2) + ... + pn I = 0     ...(1)

Note 1:- Premultiplying equation (1) by A^-1, we have

0 = p0 A^(n-1) + p1 A^(n-2) + p2 A^(n-3) + ... + p(n-1) I + pn A^-1

A^-1 = -(1/pn) [ p0 A^(n-1) + p1 A^(n-2) + p2 A^(n-3) + ... + p(n-1) I ]

This result gives the inverse of A in terms of (n-1) powers of A and is considered a practical method for the computation of the inverse of large matrices.
Note 2:- If m is a positive integer such that m > n, then any positive integral power A^m of A is linearly expressible in terms of powers of lower degree.

Example 1: Verify the Cayley-Hamilton theorem for the matrix

A = [  2 -1  1 ]
    [ -1  2 -1 ] .   Hence compute A^-1.
    [  1 -1  2 ]

Solution:- The characteristic equation of A is

           | 2-λ  -1    1  |
|A - λI| = | -1   2-λ  -1  | = 0
           |  1   -1   2-λ |

or  λ³ - 6λ² + 9λ - 4 = 0   (on simplification)

To verify the Cayley-Hamilton theorem, we have to show that A³ - 6A² + 9A - 4I = 0   ...(1)

Now,

A² = A·A = [  6 -5  5 ]        A³ = A²·A = [  22 -21  21 ]
           [ -5  6 -5 ]                    [ -21  22 -21 ]
           [  5 -5  6 ]                    [  21 -21  22 ]

A³ - 6A² + 9A - 4I

  [  22 -21  21 ]     [  6 -5  5 ]     [  2 -1  1 ]     [ 1 0 0 ]
= [ -21  22 -21 ] - 6 [ -5  6 -5 ] + 9 [ -1  2 -1 ] - 4 [ 0 1 0 ]
  [  21 -21  22 ]     [  5 -5  6 ]     [  1 -1  2 ]     [ 0 0 1 ]

  [ 0 0 0 ]
= [ 0 0 0 ] = 0
  [ 0 0 0 ]

This verifies the Cayley-Hamilton theorem.

Now, premultiplying both sides of (1) by A^-1, we have

A² - 6A + 9I - 4A^-1 = 0
=>  4A^-1 = A² - 6A + 9I

        [  6 -5  5 ]     [  2 -1  1 ]     [ 1 0 0 ]   [  3  1 -1 ]
4A^-1 = [ -5  6 -5 ] - 6 [ -1  2 -1 ] + 9 [ 0 1 0 ] = [  1  3  1 ]
        [  5 -5  6 ]     [  1 -1  2 ]     [ 0 0 1 ]   [ -1  1  3 ]

             [  3  1 -1 ]
A^-1 = (1/4) [  1  3  1 ]
             [ -1  1  3 ]
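Both the verification and the inverse formula can be replayed in pure Python; a sketch for Example 1 (helper names `mat_mul`/`mat_comb` are mine, not from the notes):

```python
def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def mat_comb(*terms):
    # Linear combination sum of k*M over (k, M) pairs of equal-sized square matrices.
    n = len(terms[0][1])
    return [[sum(k * M[i][j] for k, M in terms) for j in range(n)] for i in range(n)]

A = [[2, -1, 1], [-1, 2, -1], [1, -1, 2]]
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
A2 = mat_mul(A, A)
A3 = mat_mul(A2, A)

# Cayley-Hamilton: A^3 - 6A^2 + 9A - 4I should be the zero matrix.
Z = mat_comb((1, A3), (-6, A2), (9, A), (-4, I))
assert Z == [[0, 0, 0], [0, 0, 0], [0, 0, 0]]

# Inverse from the theorem: A^-1 = (A^2 - 6A + 9I) / 4
Ainv = [[x / 4 for x in row] for row in mat_comb((1, A2), (-6, A), (9, I))]
assert mat_mul(A, Ainv) == I
print("Cayley-Hamilton verified; A^-1 =", Ainv)
```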

Example 2: Given

A = [  1  2  1 ]
    [  0  1  1 ]
    [ -3  1  1 ]

find Adj A by using the Cayley-Hamilton theorem.

Solution:- The characteristic equation of the given matrix A is

|A - λI| = 0,  i.e.  λ³ - 3λ² + 5λ + 3 = 0   (on simplification)

By the Cayley-Hamilton theorem, A should satisfy

A³ - 3A² + 5A + 3I = 0

Premultiplying by A^-1, we get

A² - 3A + 5I + 3A^-1 = 0
A^-1 = -(1/3)(A² - 3A + 5I)     ...(1)

Now,

A² = A·A = [  1  2  1 ] [  1  2  1 ]   [ -2  5  4 ]
           [  0  1  1 ] [  0  1  1 ] = [ -3  2  2 ]
           [ -3  1  1 ] [ -3  1  1 ]   [ -6 -4 -1 ]

3A = [  3  6  3 ]
     [  0  3  3 ]
     [ -9  3  3 ]

From (1),

               [ -2  5  4 ]   [  3  6  3 ]   [ 5 0 0 ]
A^-1 = -(1/3) ([ -3  2  2 ] - [  0  3  3 ] + [ 0 5 0 ])
               [ -6 -4 -1 ]   [ -9  3  3 ]   [ 0 0 5 ]

              [  0 -1  1 ]
     = -(1/3) [ -3  4 -1 ]
              [  3 -7  1 ]

We know that A^-1 = Adj A / |A|, so Adj A = A^-1 |A|.

Now,

      |  1  2  1 |
|A| = |  0  1  1 | = -3
      | -3  1  1 |

                      [  0 -1  1 ]   [  0 -1  1 ]
Adj A = (-3) · -(1/3) [ -3  4 -1 ] = [ -3  4 -1 ]
                      [  3 -7  1 ]   [  3 -7  1 ]
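The adjugate found above can be checked against its defining identity A · Adj A = |A| I; a minimal sketch (the helper name `mat_mul` is mine, not from the notes):

```python
def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

A = [[1, 2, 1], [0, 1, 1], [-3, 1, 1]]
adjA = [[0, -1, 1], [-3, 4, -1], [3, -7, 1]]   # from the Cayley-Hamilton computation
detA = -3

# The defining identity of the adjugate: A * Adj A = |A| * I
assert mat_mul(A, adjA) == [[detA, 0, 0], [0, detA, 0], [0, 0, detA]]
print("A * Adj A = |A| I holds")
```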

DIAGONALISATION OF A MATRIX

Diagonalisation of a matrix A is the process of reducing A to a diagonal form.
If A is related to D by a similarity transformation such that D = M^-1 A M, then A is reduced to the diagonal matrix D through the modal matrix M. D is also called the spectral matrix of A.

REDUCTION OF A MATRIX TO DIAGONAL FORM

If a square matrix A of order n has n linearly independent eigenvectors, then a matrix B can be found such that B^-1 A B is a diagonal matrix.
Note:- The matrix B which diagonalises A is called the modal matrix of A and is obtained by grouping the eigenvectors of A into a square matrix.

Similarity of matrices:-
A square matrix B of order n is said to be similar to a square matrix A of order n if B = M^-1 A M for some non-singular matrix M.
This transformation of a matrix A by a non-singular matrix M to B is called a similarity transformation.
Note:- If the matrix B is similar to matrix A, then B has the same eigenvalues as A.

Example:- Reduce the matrix

A = [ 1 1 2 ]
    [ 0 2 1 ]
    [ 0 0 3 ]

to diagonal form by similarity transformation. Hence find A³.

Solution:- The characteristic equation is

| 1-λ   1    2  |
|  0   2-λ   1  | = 0   =>   λ = 1, 2, 3
|  0    0   3-λ |

Hence the eigenvalues of A are 1, 2, 3.

Corresponding to λ = 1, let X1 = [ x1; x2; x3 ] be the eigenvector; then

(A - I) X1 = 0

[ 0 1 2 ] [ x1 ]   [ 0 ]
[ 0 1 1 ] [ x2 ] = [ 0 ]
[ 0 0 2 ] [ x3 ]   [ 0 ]

x2 + 2x3 = 0
x2 + x3 = 0
2x3 = 0

x1 = k1,  x2 = 0 = x3

X1 = k1 [ 1 ]
        [ 0 ]
        [ 0 ]

Corresponding to λ = 2, let X2 = [ x1; x2; x3 ] be the eigenvector; then

(A - 2I) X2 = 0

[ -1 1 2 ] [ x1 ]   [ 0 ]
[  0 0 1 ] [ x2 ] = [ 0 ]
[  0 0 1 ] [ x3 ]   [ 0 ]

-x1 + x2 + 2x3 = 0
x3 = 0

x1 = k2,  x2 = k2,  x3 = 0

X2 = k2 [ 1 ]
        [ 1 ]
        [ 0 ]

Corresponding to λ = 3, let X3 = [ x1; x2; x3 ] be the eigenvector; then

(A - 3I) X3 = 0

[ -2  1 2 ] [ x1 ]   [ 0 ]
[  0 -1 1 ] [ x2 ] = [ 0 ]
[  0  0 0 ] [ x3 ]   [ 0 ]

-2x1 + x2 + 2x3 = 0
-x2 + x3 = 0

x2 = k3,  x3 = k3,  x1 = (3/2) k3

X3 = k3 [ 3/2 ]              [ 3 ]
        [  1  ] ,  i.e.      [ 2 ]   (taking k3 = 2)
        [  1  ]              [ 2 ]

Hence the modal matrix is

M = [ 1 1 3 ]
    [ 0 1 2 ]
    [ 0 0 2 ]

|M| = 2,   Adj M = [ 2 -2 -1 ]
                   [ 0  2 -2 ]
                   [ 0  0  1 ]

M^-1 = Adj M / |M| = [ 1 -1 -1/2 ]
                     [ 0  1  -1  ]
                     [ 0  0  1/2 ]

M^-1 A M = [ 1 -1 -1/2 ] [ 1 1 2 ] [ 1 1 3 ]   [ 1 0 0 ]
           [ 0  1  -1  ] [ 0 2 1 ] [ 0 1 2 ] = [ 0 2 0 ] = D
           [ 0  0  1/2 ] [ 0 0 3 ] [ 0 0 2 ]   [ 0 0 3 ]

Now, since D = M^-1 A M  =>  A = M D M^-1, so

A² = (M D M^-1)(M D M^-1) = M D² M^-1    [since M^-1 M = I]

Similarly,

A³ = M D³ M^-1

   = [ 1 1 3 ] [ 1 0 0  ] [ 1 -1 -1/2 ]
     [ 0 1 2 ] [ 0 8 0  ] [ 0  1  -1  ]
     [ 0 0 2 ] [ 0 0 27 ] [ 0  0  1/2 ]

   = [ 1 7 32 ]
     [ 0 8 19 ]
     [ 0 0 27 ]
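The diagonalisation shortcut A³ = M D³ M^-1 can be cross-checked against plain repeated multiplication; a sketch with exact `Fraction` arithmetic (the helper name `mat_mul` is mine, not from the notes):

```python
from fractions import Fraction as F

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

A = [[1, 1, 2], [0, 2, 1], [0, 0, 3]]
M = [[1, 1, 3], [0, 1, 2], [0, 0, 2]]                      # modal matrix
Minv = [[1, -1, F(-1, 2)], [0, 1, -1], [0, 0, F(1, 2)]]    # M^-1
D3 = [[1, 0, 0], [0, 8, 0], [0, 0, 27]]                    # D^3 = diag(1^3, 2^3, 3^3)

A3_diag = mat_mul(mat_mul(M, D3), Minv)    # A^3 = M D^3 M^-1
A3_direct = mat_mul(mat_mul(A, A), A)      # plain repeated multiplication
assert A3_diag == A3_direct == [[1, 7, 32], [0, 8, 19], [0, 0, 27]]
print(A3_direct)
```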

ORTHOGONAL TRANSFORMATION OF A SYMMETRIC MATRIX TO DIAGONAL FORM

A square matrix A with real elements is said to be orthogonal if AA' = I = A'A.
Since AA^-1 = I = A^-1 A, it follows that A is orthogonal if A' = A^-1.
Diagonalisation by orthogonal transformation is possible only for a real symmetric matrix.

If A is a real symmetric matrix, then the eigenvectors of A will be not only linearly independent but also pairwise orthogonal.
If we normalise each eigenvector and use them to form the normalised modal matrix N, then it can be proved that N is an orthogonal matrix.

The similarity transformation M^-1 A M = D takes the form N'AN = D, since N^-1 = N' by a property of orthogonal matrices.
Transforming A into D by means of the transformation N'AN = D is called orthogonal reduction or orthogonal transformation.
Note:- To normalise an eigenvector Xr, divide each element of Xr by the square root of the sum of the squares of all the elements of Xr.

Example:- Diagonalise the matrix

A = [ 2 0 4 ]
    [ 0 6 0 ]
    [ 4 0 2 ]

by means of an orthogonal transformation.

Solution:- The characteristic equation of A is

| 2-λ   0    4  |
|  0   6-λ   0  | = 0
|  4    0   2-λ |

(2-λ)(6-λ)(2-λ) - 16(6-λ) = 0   =>   λ = -2, 6, 6

When λ = -2, let X1 = [ x1; x2; x3 ] be the eigenvector; then

(A + 2I) X1 = 0

[ 4 0 4 ] [ x1 ]   [ 0 ]
[ 0 8 0 ] [ x2 ] = [ 0 ]
[ 4 0 4 ] [ x3 ]   [ 0 ]

4x1 + 4x3 = 0   ...(1)
8x2 = 0         ...(2)
4x1 + 4x3 = 0   ...(3)

x1 = k1,  x2 = 0,  x3 = -k1

X1 = k1 [  1 ]
        [  0 ]
        [ -1 ]

When λ = 6, let X2 = [ x1; x2; x3 ] be the eigenvector; then

(A - 6I) X2 = 0

[ -4 0  4 ] [ x1 ]   [ 0 ]
[  0 0  0 ] [ x2 ] = [ 0 ]
[  4 0 -4 ] [ x3 ]   [ 0 ]

-4x1 + 4x3 = 0
4x1 - 4x3 = 0

x1 = x3 and x2 is arbitrary.
x2 must be so chosen that X2 and X3 are orthogonal to each other, and each is orthogonal to X1.

Let X2 = [ 1 ]  and let  X3 = [ α ].
         [ 0 ]                [ β ]
         [ 1 ]                [ γ ]

Since X3 is orthogonal to X1:   α - γ = 0   ...(4)
X3 is orthogonal to X2:         α + γ = 0   ...(5)

Solving (4) and (5), we get α = γ = 0, and β is arbitrary.

Taking β = 1,  X3 = [ 0 ]
                    [ 1 ]
                    [ 0 ]

The modal matrix is M = [  1 1 0 ]
                        [  0 0 1 ]
                        [ -1 1 0 ]

The normalised modal matrix is

N = [  1/√2  1/√2  0 ]
    [   0     0    1 ]
    [ -1/√2  1/√2  0 ]

D = N'AN = [ 1/√2  0  -1/√2 ] [ 2 0 4 ] [  1/√2  1/√2  0 ]
           [ 1/√2  0   1/√2 ] [ 0 6 0 ] [   0     0    1 ]
           [  0    1    0   ] [ 4 0 2 ] [ -1/√2  1/√2  0 ]

D = [ -2 0 0 ]
    [  0 6 0 ]   which is the required diagonal matrix.
    [  0 0 6 ]
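Both claims of the example (N is orthogonal, and N'AN is diagonal) can be verified numerically up to floating-point rounding; a sketch (helper names `mat_mul`/`transpose` are mine, not from the notes):

```python
import math

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

A = [[2, 0, 4], [0, 6, 0], [4, 0, 2]]
r = 1 / math.sqrt(2)
N = [[r, r, 0], [0, 0, 1], [-r, r, 0]]   # normalised modal matrix

NtN = mat_mul(transpose(N), N)           # should be I: N is orthogonal
D = mat_mul(mat_mul(transpose(N), A), N) # should be diag(-2, 6, 6)

for i in range(3):
    for j in range(3):
        assert abs(NtN[i][j] - (1 if i == j else 0)) < 1e-12
        expected = [-2, 6, 6][i] if i == j else 0
        assert abs(D[i][j] - expected) < 1e-12
print("N'N = I and N'AN = diag(-2, 6, 6)")
```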
