ict12-lec5
7/31/2019 ict12-lec5
Information and Coding Theory
Lecture 5
Dr M Shamim Baig
-
Source Coding
Algorithms
-
Compact Code (Compression)
Shannon's First Theorem (on Source Coding): Given a DMS of entropy H, the average codeword length of any distortionless (lossless) source code is lower bounded by H.
Example Source Codes:
Shannon-Fano Code
Huffman Code
Lempel-Ziv Code
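The entropy bound in Shannon's theorem is easy to check numerically. A minimal sketch (the example distribution below is illustrative, not from the slides):

```python
from math import log2

def entropy(probs):
    # Entropy of a discrete memoryless source: H = -sum p*log2(p).
    # Shannon's first theorem: no lossless code for this source can have
    # an average codeword length below H bits/symbol.
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical distribution, chosen so H comes out exactly: dyadic probs.
print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits/symbol
```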
-
Shannon-Fano Code Algorithm
The length of the code for a symbol is inversely proportional to the symbol's probability.
1. Rearrange the symbols in order of descending probabilities.
2. Partition the set into two (nearly) equiprobable parts, and assign 0 & 1 to the upper & lower parts respectively.
3. Repeat step 2 for each part until all parts contain a single element.
4. Read off the assigned labels from left to right to get each codeword.
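The steps above can be sketched as follows; the tie-breaking at each split (choose the partition point that makes the two halves closest to equiprobable) is an assumption, since the slide does not spell it out:

```python
def shannon_fano(probs):
    """Return {symbol index: codeword} via recursive Shannon-Fano splits."""
    # Step 1: sort symbol indices by descending probability.
    order = sorted(range(len(probs)), key=lambda i: -probs[i])
    codes = {i: "" for i in order}

    def split(group):
        if len(group) <= 1:          # step 3: stop at single-element parts
            return
        # Step 2: find the split point making the parts most nearly equiprobable.
        total = sum(probs[i] for i in group)
        running, best_k, best_diff = 0.0, 1, float("inf")
        for k in range(1, len(group)):
            running += probs[group[k - 1]]
            diff = abs(running - (total - running))
            if diff < best_diff:
                best_diff, best_k = diff, k
        upper, lower = group[:best_k], group[best_k:]
        for i in upper:
            codes[i] += "0"          # upper part gets 0
        for i in lower:
            codes[i] += "1"          # lower part gets 1
        split(upper)
        split(lower)

    split(order)
    return codes
```

Because each split only extends codewords within one part, the result is automatically prefix-free.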
-
Example: Shannon-Fano Coding
DMS
A = {a0, a1, …, a5}
PA = {0.2, 0.12, 0.25, 0.08, 0.3, 0.05}
Find the Shannon-Fano code,
its entropy, efficiency & redundancy.
Does it pass Kraft's inequality test?
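The requested quantities can be computed once the code is in hand. The code lengths below are an assumption: they come from one possible Shannon-Fano assignment (splitting at the most nearly equiprobable point each time), so treat them as a worked sketch rather than the unique answer:

```python
from math import log2

# Probabilities from the slide, sorted descending, paired with code lengths
# from one possible Shannon-Fano assignment (assumed, see lead-in).
probs   = [0.3, 0.25, 0.2, 0.12, 0.08, 0.05]
lengths = [2,   2,    2,   3,    4,    4]

H     = -sum(p * log2(p) for p in probs)            # entropy, bits/symbol
L_avg = sum(p * l for p, l in zip(probs, lengths))  # average codeword length
eff   = H / L_avg                                   # efficiency
red   = 1 - eff                                     # redundancy
kraft = sum(2 ** -l for l in lengths)               # Kraft sum, must be <= 1

print(round(H, 3), round(L_avg, 2), round(eff, 3), round(red, 3), kraft)
```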
-
Huffman Code Algorithm
The length of the code for a symbol is inversely proportional to the symbol's probability.
1. Rearrange the symbols in order of descending probabilities.
2. Combine the last two symbols into one symbol.
3. Repeat steps 1 & 2 until only two symbols remain.
4. Starting from the last pair and tracing backward, assign 0 & 1 to each pair of combined symbols.
5. Repeat step 4 until the original symbol set is reached.
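A minimal sketch of the merge procedure above, using a heap to find the two least probable entries; it returns only the codeword lengths (each merge adds one bit to every symbol under it, which is what the backward 0/1 assignment in steps 4-5 amounts to):

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Codeword length per symbol via repeated merging of the two least
    probable entries (steps 2-3 above)."""
    tiebreak = count()  # keeps heap comparisons away from the lists
    # Heap entries: (probability, tiebreaker, symbol indices in this subtree).
    heap = [(p, next(tiebreak), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1   # one more bit for every symbol under this merge
        heapq.heappush(heap, (p1 + p2, next(tiebreak), s1 + s2))
    return lengths
```

Running it on the distribution from the next slide, {0.15, 0.12, 0.25, 0.10, 0.3, 0.08}, gives lengths {2, 2, 3, 3, 3, 3} (in some symbol order), for an average of 2.45 bits/symbol.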
-
Example: Huffman Code
DMS
A = {a0, a1, …, a5}
PA = {0.15, 0.12, 0.25, 0.10, 0.3, 0.08}
Find the Huffman code,
its entropy, efficiency & redundancy.
Does it pass Kraft's inequality test?
Find the 2nd-order and 3rd-order extensions of the Huffman code.
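The point of the extension exercise is that Huffman coding blocks of n symbols drives the cost per original symbol down toward H. A sketch under the DMS assumption (block probabilities are products of symbol probabilities); the per-symbol averages printed are computed, not taken from the slides:

```python
import heapq
from itertools import count, product
from math import log2, prod

def huffman_avg_length(probs):
    """Average Huffman codeword length via repeated merging of the two
    least probable entries."""
    tb = count()
    heap = [(p, next(tb), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(tb), s1 + s2))
    return sum(p * l for p, l in zip(probs, lengths))

base = [0.15, 0.12, 0.25, 0.10, 0.3, 0.08]   # distribution from this slide
H = -sum(p * log2(p) for p in base)          # entropy, about 2.42 bits/symbol

results = {}
for n in (1, 2, 3):
    # n-th order extension: one super-symbol per length-n block, with
    # product probabilities (valid because the source is memoryless).
    ext = [prod(c) for c in product(base, repeat=n)]
    results[n] = huffman_avg_length(ext) / n  # bits per original symbol
    print(n, round(results[n], 4))            # squeezed into [H, H + 1/n]
```

Source-coding theory guarantees H <= L_n/n < H + 1/n, so the printed values home in on the entropy as n grows.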
-
Lempel-Ziv Algorithm (Dynamic Dictionary Code: Universal Source Code)
Parse the source sequence into variable-length blocks called phrases, each of which has not occurred earlier and differs in its last letter from every previous phrase.
List the phrases serially, in order of occurrence, in a dictionary (table), and give each a serial number whose representation is one bit shorter than the fixed codeword size.
The codeword for a new phrase is the serial value of its prefix string appended with the innovation bit. For initialization, the prefix-string serial number used to code the first phrase is 0.
The decoder constructs an identical table (dictionary) at the receiver and decodes the received sequence accordingly.
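The parsing step above can be sketched as follows; each emitted pair is (serial number of the prefix phrase, innovation bit), with serial 0 reserved for the empty prefix. A trailing incomplete phrase is simply dropped here, a simplification the slides do not address:

```python
def lz_parse(bits):
    """Parse a binary string into Lempel-Ziv phrases: each new phrase is a
    previously seen phrase (its prefix) plus one innovation bit."""
    phrases = {"": 0}              # dictionary: phrase -> serial number
    out = []                       # (prefix serial, innovation bit) pairs
    cur = ""
    for b in bits:
        if cur + b in phrases:     # still matches an earlier phrase
            cur += b
        else:                      # new phrase: emit prefix serial + new bit
            out.append((phrases[cur], b))
            phrases[cur + b] = len(phrases)
            cur = ""
    return out                     # note: trailing partial phrase is ignored
```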
-
Example: Lempel-Ziv Code
Using the Lempel-Ziv algorithm, encode the following binary digit string:
Binary string = 101101000000010000
Find the compression ratio & comment.
Decode the encoded data to verify.
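A sketch of the full round trip on this string, assuming a fixed codeword size of 4 bits (3 bits of prefix serial plus the innovation bit, which covers the serials that arise here):

```python
def lz_encode(bits, index_bits):
    """Encode: each codeword = prefix serial (index_bits bits) + innovation bit."""
    phrases = {"": 0}
    codewords, cur = [], ""
    for b in bits:
        if cur + b in phrases:
            cur += b
        else:
            codewords.append(format(phrases[cur], f"0{index_bits}b") + b)
            phrases[cur + b] = len(phrases)
            cur = ""
    return codewords               # trailing partial phrase ignored (simplification)

def lz_decode(codewords):
    """Rebuild the dictionary at the receiver and reassemble the phrases."""
    table = [""]                   # serial 0 = empty prefix
    out = []
    for cw in codewords:
        phrase = table[int(cw[:-1], 2)] + cw[-1]
        table.append(phrase)
        out.append(phrase)
    return "".join(out)

src = "101101000000010000"         # string from the slide
enc = lz_encode(src, index_bits=3)
assert lz_decode(enc) == src       # round trip verifies the decoder
print(len(enc), "codewords of 4 bits each")
print("compression ratio:", len(src) / (4 * len(enc)))
```

The ratio comes out below 1 (the encoding is longer than the input): Lempel-Ziv expands very short strings, and its gains only appear on long sequences as the dictionary fills with useful phrases.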
-
Huffman vs Lempel-Ziv Code
A priori symbol probabilities vs estimated on the fly
Stationary probability distribution vs dynamic
Good for DMS only vs good for both source types
Fixed-to-variable vs variable-to-fixed length coding
Usage