
Data Communications and Networking

Source Code Design

Dr Salem Aljareh

Source Code Design


Decision Tree
[Figure: binary decision tree assigning a fixed-length 3-bit codeword to each of eight symbols]

x1 = 000   x2 = 001   x3 = 010   x4 = 011
x5 = 100   x6 = 101   x7 = 110   x8 = 111
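With a fixed-length code every codeword has the same length, so decoding is simply a table lookup on successive 3-bit groups. A minimal sketch in Python (the input bit string is invented for illustration):

# Fixed-length (3-bit) decoding: chop the stream into 3-bit groups
# and look each group up in the codebook from the tree above.
CODEBOOK = {format(i, "03b"): f"x{i + 1}" for i in range(8)}  # "000" -> x1, ..., "111" -> x8

def decode_fixed(bits):
    return [CODEBOOK[bits[i:i + 3]] for i in range(0, len(bits), 3)]

print(decode_fixed("000011111"))  # ['x1', 'x4', 'x8']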

Weather Info System


Visibility state   Codeword
Moderate           0
Poor               10
Good               110
Very Poor          111

[Figure: code tree for the four visibility states]
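Because no codeword here is a prefix of another, the receiver can decode a bit stream unambiguously from left to right. A minimal sketch (the input bit string is invented for illustration):

# Prefix-free decoding: accumulate bits until they match a codeword,
# emit the state, then start again.
WEATHER_CODE = {"0": "Moderate", "10": "Poor", "110": "Good", "111": "Very Poor"}

def decode_prefix(bits):
    states, current = [], ""
    for bit in bits:
        current += bit
        if current in WEATHER_CODE:
            states.append(WEATHER_CODE[current])
            current = ""
    return states

print(decode_prefix("010110111"))  # ['Moderate', 'Poor', 'Good', 'Very Poor']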

Shannon Fano Encoding


1. List the symbols in descending order of probability.
2. Divide the table into two halves of equal (or nearly equal) total probability.
3. Allocate binary 0 to the top half and binary 1 to the bottom half.
4. Divide both the upper and lower sections in two.
5. Allocate binary 0 to the top half and binary 1 to the bottom half.
6. Repeat steps (4) and (5) until no further division is possible.
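A minimal recursive sketch of this procedure in Python; at each stage it picks the split point that makes the two halves' totals as nearly equal as possible. Applied to the example on the next slide, it reproduces the codewords derived there.

def shannon_fano(symbols):
    # symbols: list of (name, probability) pairs, sorted by descending probability.
    # Returns a dict mapping each name to its binary codeword.
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    # Step 2: find the split where the two halves' totals are most nearly equal.
    running, best_i, best_diff = 0.0, 1, float("inf")
    for i in range(1, len(symbols)):
        running += symbols[i - 1][1]
        diff = abs(running - (total - running))
        if diff < best_diff:
            best_i, best_diff = i, diff
    codes = {}
    for name, code in shannon_fano(symbols[:best_i]).items():
        codes[name] = "0" + code   # steps 3/5: 0 for the top half
    for name, code in shannon_fano(symbols[best_i:]).items():
        codes[name] = "1" + code   # steps 3/5: 1 for the bottom half
    return codes

probs = [("x1", 0.25), ("x2", 0.25), ("x3", 0.15), ("x4", 0.10),
         ("x5", 0.10), ("x6", 0.08), ("x7", 0.06), ("x8", 0.01)]
print(shannon_fano(probs))
# {'x1': '00', 'x2': '01', 'x3': '100', 'x4': '101',
#  'x5': '110', 'x6': '1110', 'x7': '11110', 'x8': '11111'}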

Shannon Fano Code Example


x1: P = 0.25

x2: P = 0.25

x3: P = 0.15

x4: P = 0.1

x5: P = 0.1

x6: P = 0.08

x7: P = 0.06

x8: P = 0.01

Shannon Fano Solution


[Figure: the probabilities 0.25, 0.25, 0.15, 0.1, 0.1, 0.08, 0.06, 0.01 listed in descending order; the first division splits them into a top half (x1, x2, total 0.5), allocated 0, and a bottom half (x3 to x8, total 0.5), allocated 1]

Shannon Fano Encoding (cont.)


Symbol   Probability   Codeword
X1       0.25          00
X2       0.25          01
X3       0.15          100
X4       0.10          101
X5       0.10          110
X6       0.08          1110
X7       0.06          11110
X8       0.01          11111
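Note that this is a prefix code: no codeword is the start of any other, so a received bit stream decodes unambiguously. A quick check of the table:

codes = ["00", "01", "100", "101", "110", "1110", "11110", "11111"]

# Prefix condition: no codeword may be a proper prefix of another.
assert not any(a != b and b.startswith(a) for a in codes for b in codes)
print("prefix-free: OK")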

For the above code, calculate the following:
The code length
The code entropy
The code efficiency

Shannon Fano Encoding (solution)


The code length is given by:

L = \sum_{i=1}^{8} P(x_i)\,l_i
  = (0.25 + 0.25) \times 2 + (0.15 + 0.10 + 0.10) \times 3 + 0.08 \times 4 + (0.06 + 0.01) \times 5
  = 2.72 \text{ digits/symbol}
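The same sum, checked numerically (probabilities and codeword lengths taken from the table above):

probs   = [0.25, 0.25, 0.15, 0.10, 0.10, 0.08, 0.06, 0.01]
lengths = [2, 2, 3, 3, 3, 4, 5, 5]   # lengths of 00, 01, 100, 101, 110, 1110, 11110, 11111

L = sum(p * l for p, l in zip(probs, lengths))
print(f"{L:.2f}")  # 2.72 digits/symbol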

Shannon Fano Encoding (solution)


The entropy H(X) is given by:

H(X) = \sum_{i=1}^{8} P(x_i) \log_2 \frac{1}{P(x_i)}

     = 2 \times 0.25 \log_2 \frac{1}{0.25} + 0.15 \log_2 \frac{1}{0.15}
       + 2 \times 0.10 \log_2 \frac{1}{0.10} + 0.08 \log_2 \frac{1}{0.08}
       + 0.06 \log_2 \frac{1}{0.06} + 0.01 \log_2 \frac{1}{0.01}

     = 2.67 \text{ bits/symbol}
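And numerically:

from math import log2

probs = [0.25, 0.25, 0.15, 0.10, 0.10, 0.08, 0.06, 0.01]
H = sum(p * log2(1 / p) for p in probs)
print(f"{H:.2f}")  # 2.67 bits/symbol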

Shannon Fano Encoding (solution)


The efficiency of the code is:

\eta = \frac{H(X)}{L} \times 100 = \frac{2.67}{2.72} \times 100 = 98.2\%
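Combining the two results above:

L, H = 2.72, 2.67             # from the two calculations above
print(f"{H / L * 100:.1f}%")  # 98.2%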

Huffman Encoding (The Algorithm)


List the symbols in descending order of probability.
Combine the two smallest probabilities to make a new probability.
Repeat the process until all the symbols have been combined.

Note: you should always end up with a total probability of 1. (A code sketch of this procedure follows the worked example below.)

Huffman Encoding (Example)


x1: P = 0.25
x2: P = 0.25
x3: P = 0.15
x4: P = 0.1
x5: P = 0.1
x6: P = 0.08
x7: P = 0.06
x8: P = 0.01

[Figure: Huffman tree construction. The two smallest probabilities are merged at each stage:
0.01 + 0.06 = 0.07, 0.07 + 0.08 = 0.15, 0.1 + 0.1 = 0.2, 0.15 + 0.15 = 0.30,
0.2 + 0.25 = 0.45, 0.25 + 0.30 = 0.55, 0.45 + 0.55 = 1.0.
Branches are labelled 0 and 1 to give the codewords on the next slide.]
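A minimal sketch of this procedure in Python, using a heap to pop the two smallest probabilities at each stage. Tie-breaking can differ from the hand-drawn tree, so individual codewords may differ from the solution on the next slide, but the codeword lengths, and hence the average length, come out the same.

import heapq
from itertools import count

def huffman(symbols):
    # symbols: dict of name -> probability. Returns dict of name -> codeword.
    tiebreak = count()  # keeps heap comparisons away from the dicts on probability ties
    heap = [(p, next(tiebreak), {name: ""}) for name, p in symbols.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, codes0 = heapq.heappop(heap)   # smallest probability
        p1, _, codes1 = heapq.heappop(heap)   # second smallest
        merged = {name: "0" + c for name, c in codes0.items()}
        merged.update({name: "1" + c for name, c in codes1.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

probs = {"x1": 0.25, "x2": 0.25, "x3": 0.15, "x4": 0.10,
         "x5": 0.10, "x6": 0.08, "x7": 0.06, "x8": 0.01}
codes = huffman(probs)
avg = sum(probs[s] * len(codes[s]) for s in probs)
print(f"average length = {avg:.2f} digits/symbol")  # 2.72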

Huffman Encoding (Solution)


Symbol   Codeword
x1       00
x2       10
x3       010
x4       110
x5       111
x6       0110
x7       01110
x8       01111

Huffman Encoding (Evaluation)


For the above code, calculate the following:
The code length
The code entropy
The code efficiency

Huffman Encoding (Evaluation: Result)


The code length = 2.72 digits/symbol
The code entropy = 2.67 bits/symbol
The code efficiency = 98.2%
Compare these with the results for the Shannon-Fano coding.
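A quick numeric check of all three figures, taken directly from the Huffman solution table:

from math import log2

huffman_codes = {"x1": "00", "x2": "10", "x3": "010", "x4": "110",
                 "x5": "111", "x6": "0110", "x7": "01110", "x8": "01111"}
probs = {"x1": 0.25, "x2": 0.25, "x3": 0.15, "x4": 0.10,
         "x5": 0.10, "x6": 0.08, "x7": 0.06, "x8": 0.01}

L = sum(probs[s] * len(c) for s, c in huffman_codes.items())
H = sum(p * log2(1 / p) for p in probs.values())
print(f"L = {L:.2f}, H = {H:.2f}, efficiency = {H / L * 100:.1f}%")  # 2.72, 2.67, 98.2%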

Practical Example
Suppose we have to encode 8 symbols to put on a 2400 bit/sec
circuit.
At a raw 3 bits/symbol this is equivalent to 800 symbols/sec.
At an encoded 2.72 bits/symbol this is equivalent to 882 symbols/sec.
Improvement = 82/800 = 10.25%
YOU GET MORE FOR YOUR MONEY BY SOURCE CODING
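The arithmetic, for checking (the throughput is rounded to whole symbols, as on the slide):

raw   = 2400 // 3            # 800 symbols/sec at 3 bits/symbol
coded = round(2400 / 2.72)   # 882 symbols/sec at 2.72 bits/symbol
print(f"improvement = {(coded - raw) / raw * 100:.2f}%")  # 10.25%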

Implementation of Huffman Code


Fax machine

An A4 page is scanned from the top left-hand corner to the bottom right-hand corner.
Each line is subdivided into 1728 picture elements (pels).
Each pel is quantized as either a black pel or a white pel.
The total number of pels per page is just under 2 x 10^6.
If a binary 1 is used to represent a white pel and a binary 0 a black pel, and the
signalling rate is 4800 bps, the time taken to transmit a page using this scheme is:
2 x 10^6 / 4800 = 417 seconds (approx. 7 minutes)
The scanned data contains considerable redundancy, so the transmission time can be
reduced by a factor of about 10. This is achieved by allocating binary codes to runs
of black or white pels, based on the probability of occurrence of each run length.
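A minimal sketch of the run-length idea (this is not the actual ITU fax code table; it only shows how a scan line turns into run-length symbols, which are then Huffman-coded according to their probabilities):

from itertools import groupby

def run_lengths(line):
    # Convert a scan line of pels (1 = white, 0 = black) into
    # (pel, run length) pairs -- the symbols that the fax code compresses.
    return [(pel, sum(1 for _ in run)) for pel, run in groupby(line)]

# A mostly white line collapses to just three runs.
line = [1] * 500 + [0] * 3 + [1] * 1225   # 1728 pels in total
print(run_lengths(line))  # [(1, 500), (0, 3), (1, 1225)]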
