
III. Turbo Codes

© Tallal Elshabrawy

Applications of Turbo Codes

Worldwide Applications & Standards

Data Storage Systems

DSL Modem / Optical Communications

3G (Third Generation) Mobile Communications

Digital Video Broadcast (DVB)

Satellite Communications

IEEE 802.16 WiMAX


Basic Concept of Turbo Codes

Invented in 1993 by Alain Glavieux, Claude Berrou and Punya Thitimajshima

Basic Structure:

a) Recursive Systematic Convolutional (RSC) Codes

b) Parallel concatenation and Interleaving

c) Iterative decoding


Recursive Systematic Convolutional Codes

[Figure: two rate-1/2 encoders built from the same generator polynomials g1 = [1 1 1], g2 = [1 0 1]]

Non-systematic convolutional code: both outputs c1 and c2 are parity streams; generator matrix G = [g1 g2]

Recursive systematic convolutional code: output c1 is the information bit itself (systematic) and c2 is the parity produced by the feedback structure; generator matrix G = [1 g2/g1]
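As a concrete illustration, the RSC encoder with feedback polynomial g1 = [1 1 1] and feedforward polynomial g2 = [1 0 1] can be sketched in a few lines. This is a minimal sketch; the function name and bit-list interface are ours, not part of any standard.

```python
def rsc_encode(bits):
    """Rate-1/2 RSC encoder with feedback g1 = [1 1 1] and feedforward
    g2 = [1 0 1], i.e. G = [1, g2/g1].  Returns (systematic, parity)."""
    s1 = s2 = 0                      # two memory elements, start at state s0
    sys_out, par_out = [], []
    for u in bits:
        a = u ^ s1 ^ s2              # feedback sum defined by g1 = 1 + D + D^2
        p = a ^ s2                   # parity taps of g2 = 1 + D^2
        sys_out.append(u)            # systematic output is the input bit
        par_out.append(p)
        s1, s2 = a, s1               # shift-register update
    return sys_out, par_out
```

Note how the feedback bit a, not the raw input u, is pushed into the shift register; that recursion is what distinguishes the RSC encoder from its non-systematic counterpart.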


Non-Recursive Encoder Trellis Diagram

[Trellis diagram: states s0 = (0 0), s1 = (1 0), s2 = (0 1), s3 = (1 1) over successive stages; each branch is labeled with the output pair c1c2 produced by that transition]


Systematic Recursive Encoder Trellis Diagram

[Trellis diagram: the same four states s0 = (0 0), s1 = (1 0), s2 = (0 1), s3 = (1 1) and the same branch structure as the non-recursive encoder; the branch labels c1c2 now carry the systematic bit and the parity bit]


Non-Recursive & Recursive Encoders

Non-recursive and recursive encoders both have the same trellis structure (same states and state transitions)

The difference between them is in the mapping of information bits to codewords

Generally, recursive encoders provide a better weight distribution for the code
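The claim that both encoders share the same trellis can be checked mechanically: stepping each encoder from every state with both input bits yields the same set of state-transition edges, while the input bit attached to each edge differs. A small sketch (function names are ours):

```python
def nsc_step(state, u):
    """One step of the non-recursive encoder G = [g1 g2], g1=[1 1 1], g2=[1 0 1]."""
    s1, s2 = state
    return (u, s1), (u ^ s1 ^ s2, u ^ s2)     # next state, outputs (c1, c2)

def rsc_step(state, u):
    """One step of the recursive systematic encoder G = [1, g2/g1]."""
    s1, s2 = state
    a = u ^ s1 ^ s2                            # feedback bit
    return (a, s1), (u, a ^ s2)                # next state, outputs (c1, c2)

states = [(0, 0), (1, 0), (0, 1), (1, 1)]
# Collect the (state -> next state) edges traversed by each encoder.
nsc_edges = {(s, nsc_step(s, u)[0]) for s in states for u in (0, 1)}
rsc_edges = {(s, rsc_step(s, u)[0]) for s in states for u in (0, 1)}
# Both sets coincide: the trellis structure is identical, only the
# input-bit labelling of the branches differs.
```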


Parallel Concatenation with Interleaving

Two component RSC encoders in parallel, separated by an interleaver

The job of the interleaver is to de-correlate the encoding processes of the two encoders

Usually the two component encoders are identical

xk : systematic information bit
yk(1) : parity bit from the first RSC encoder
yk(2) : parity bit from the second RSC encoder

[Figure: rate-1/3 turbo encoder — xk feeds RSC Encoder 1 directly and RSC Encoder 2 through the interleaver π; the transmitted stream is (xk, yk(1), yk(2))]


Interleaving

Permutation of data to de-correlate the encoding processes of both encoders

If encoder 1 generates a low-weight codeword, then hopefully the interleaved input information generates a higher-weight codeword

By avoiding low-weight codewords, the BER performance of turbo codes can improve significantly
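The parallel concatenation described above can be sketched end to end: one RSC parity stream for the original bits, one for the interleaved bits, multiplexed with the systematic bits into a rate-1/3 codeword. The permutation used here is an illustrative pseudo-random interleaver, not a standardized one.

```python
import random

def rsc_parity(bits):
    """Parity stream of the component RSC code (g1 = [1 1 1], g2 = [1 0 1])."""
    s1 = s2 = 0
    out = []
    for u in bits:
        a = u ^ s1 ^ s2              # feedback bit
        out.append(a ^ s2)           # parity from g2 = 1 + D^2
        s1, s2 = a, s1
    return out

def turbo_encode(u, perm):
    """Rate-1/3 parallel concatenation: for each k emit the systematic bit
    xk, the parity yk(1) of encoder 1, and the parity yk(2) of encoder 2,
    which sees the input bits permuted by the interleaver perm."""
    y1 = rsc_parity(u)
    y2 = rsc_parity([u[i] for i in perm])
    return [b for trio in zip(u, y1, y2) for b in trio]

u = [1, 0, 1, 1, 0, 0, 1, 0]
perm = list(range(len(u)))
random.Random(0).shuffle(perm)       # a fixed pseudo-random interleaver
codeword = turbo_encode(u, perm)     # 3N coded bits for N information bits
```

Every third coded bit is the systematic bit itself, which is what lets the decoder recover soft information about u directly from the channel output.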


Turbo Decoder

Iteratively exchange information between decoder 1 and decoder 2 before making a final decision on the transmitted bits

[Figure: received symbols rk feed RSC Decoder 1 and RSC Decoder 2, coupled through the interleaver π and de-interleaver π^-1; the output is the decision xk*]


Turbo Codes Performance

Principles of Turbo Codes: Keattisak Sripiman, http://www.kmitl.ac.th/dslabs/TurboCodes

[Figure: BER performance curves of turbo codes]


Turbo Decoding

Turbo decoders rely on probabilistic decoding by the component RSC decoders

Soft-output information is exchanged iteratively between decoder 1 and decoder 2 before making a final decision on the transmitted bits

As the number of iterations grows, the decoding performance improves


Turbo Coded System Model

u = [u_1, …, u_k, …, u_N] : vector of N information bits

x^s = [x_1^s, …, x_k^s, …, x_N^s] : vector of N systematic bits after turbo-coding of u

x^1p = [x_1^1p, …, x_k^1p, …, x_N^1p] : vector of N parity bits from the first encoder after turbo-coding of u

x^2p = [x_1^2p, …, x_k^2p, …, x_N^2p] : vector of N parity bits from the second encoder after turbo-coding of u

x = [x_1^s, x_1^1p, x_1^2p, …, x_N^s, x_N^1p, x_N^2p] : vector of length 3N of turbo-coded bits for u

y = [y_1^s, y_1^1p, y_1^2p, …, y_N^s, y_N^1p, y_N^2p] : vector of 3N received symbols corresponding to the turbo-coded bits of u

y^s = [y_1^s, …, y_k^s, …, y_N^s] : vector of N received symbols corresponding to the systematic bits in x^s

y^1p = [y_1^1p, …, y_k^1p, …, y_N^1p] : vector of N received symbols corresponding to the first encoder parity bits in x^1p

y^2p = [y_1^2p, …, y_k^2p, …, y_N^2p] : vector of N received symbols corresponding to the second encoder parity bits in x^2p

u* = [u_1*, …, u_k*, …, u_N*] : vector of N turbo decoder decision bits corresponding to u

Decision rule:

u_k* = +1 if Pr[u_k = +1 | y] > Pr[u_k = -1 | y], and u_k* = -1 otherwise

Pr[u_k | y] is known as the a posteriori probability of the kth information bit

[Figure: u enters RSC Encoder 1 directly and RSC Encoder 2 through the N-bit interleaver π; the MUX forms x from (x^s, x^1p, x^2p); after the channel, the DEMUX splits y into (y^s, y^1p, y^2p) for turbo decoding, which outputs u*]


Log-Likelihood Ratio (LLR)

The LLR of an information bit uk is given by:

L(u_k) = ln( Pr[u_k = +1] / Pr[u_k = -1] )

Given that Pr[u_k = +1] + Pr[u_k = -1] = 1:

e^{L(u_k)} = Pr[u_k = +1] / (1 - Pr[u_k = +1])

Solving for the probabilities:

Pr[u_k = +1] = e^{L(u_k)} / (1 + e^{L(u_k)})

Pr[u_k = -1] = 1 / (1 + e^{L(u_k)})
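The LLR-to-probability mapping above is easy to exercise numerically; the round trip is exact up to floating-point precision. A minimal sketch (function names are ours):

```python
import math

def probs_from_llr(L):
    """Invert the LLR: Pr[u=+1] = e^L / (1 + e^L), Pr[u=-1] = 1 / (1 + e^L)."""
    p_plus = math.exp(L) / (1.0 + math.exp(L))
    return p_plus, 1.0 - p_plus

def llr_from_prob(p_plus):
    """L(u) = ln(Pr[u=+1] / Pr[u=-1]) with Pr[u=-1] = 1 - Pr[u=+1]."""
    return math.log(p_plus / (1.0 - p_plus))
```

An LLR of 0 corresponds to complete uncertainty (both probabilities 0.5); large positive or negative LLRs correspond to high confidence in +1 or -1 respectively.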


Maximum A Posteriori Algorithm

L(u_k) = ln( Pr[u_k = +1 | y] / Pr[u_k = -1 | y] )

u_k* = +1 if L(u_k) > 0 and u_k* = -1 otherwise, i.e. u_k* = sign(L(u_k))

[Figure: same system model as before — RSC Encoders 1 and 2 with the N-bit interleaver π, MUX, channel, DEMUX into (y^s, y^1p, y^2p), and turbo decoding producing u*]


Some Definitions & Conventions

y_a^b = [y_a^s, y_a^1p, y_a^2p, …, y_b^s, y_b^1p, y_b^2p] : the received symbols from time a to time b; in particular y = y_1^N

S_{k-1}, S_k : encoder states at times k-1 and k; the transition S_{k-1} = s' → S_k = s is driven by the input u_k and produces the output y_k

Since each transition (s', s) corresponds to a unique input bit:

Pr[S_k = s | S_{k-1} = s'] = Pr[u_k]

[Trellis section: states s0, s1, s2, s3 at times k-1 and k; branches drawn separately for u_k = +1 and u_k = -1]


Derivation of LLR

Define S(i) as the set of pairs of states (s', s) such that the transition from S_{k-1} = s' to S_k = s is caused by the input u_k = i, i = 0, 1, i.e.,

S(0) = {(s0, s0), (s1, s3), (s2, s1), (s3, s2)}

S(1) = {(s0, s1), (s1, s2), (s2, s0), (s3, s3)}

Then

Pr[u_k = i | y_1^N] = Σ_{(s',s) ∈ S(i)} Pr[S_{k-1} = s', S_k = s | y_1^N]

Remember Bayes' rule: Pr[B | A] = Pr[A, B] / Pr[A]

Pr[u_k = i | y_1^N] = Σ_{(s',s) ∈ S(i)} Pr[S_{k-1} = s', S_k = s, y_1^N] / Pr[y_1^N]


Derivation of LLR

Define

σ_k(s', s) = Pr[S_{k-1} = s', S_k = s, y_1^N]

Then

L(u_k) = ln( Pr[u_k = +1 | y_1^N] / Pr[u_k = -1 | y_1^N] )

= ln( [Σ_{S(1)} σ_k(s', s) / Pr[y_1^N]] / [Σ_{S(0)} σ_k(s', s) / Pr[y_1^N]] )

= ln( Σ_{S(1)} σ_k(s', s) / Σ_{S(0)} σ_k(s', s) )

The common factor Pr[y_1^N] cancels.


Derivation of LLR

σ_k(s', s) = Pr[S_{k-1} = s', S_k = s, y_1^N]

= Pr[S_{k-1} = s', S_k = s, y_1^{k-1}, y_k, y_{k+1}^N]

= Pr[S_{k-1} = s', S_k = s, y_1^{k-1}, y_k] · Pr[y_{k+1}^N | S_k = s]

since y_{k+1}^N depends only on S_k and is independent of S_{k-1}, y_k and y_1^{k-1}

= Pr[S_{k-1} = s', y_1^{k-1}] · Pr[S_k = s, y_k | S_{k-1} = s'] · Pr[y_{k+1}^N | S_k = s]

since (S_k, y_k) depends only on S_{k-1} and is independent of y_1^{k-1}


Derivation of LLR

σ_k(s', s) = Pr[S_{k-1} = s', y_1^{k-1}] · Pr[S_k = s, y_k | S_{k-1} = s'] · Pr[y_{k+1}^N | S_k = s]

Define

α_k(s) = Pr[S_k = s, y_1^k]

β_k(s) = Pr[y_{k+1}^N | S_k = s]

γ_k(s', s) = Pr[S_k = s, y_k | S_{k-1} = s']

Then

σ_k(s', s) = α_{k-1}(s') γ_k(s', s) β_k(s)

L(u_k) = ln( Σ_{S(1)} α_{k-1}(s') γ_k(s', s) β_k(s) / Σ_{S(0)} α_{k-1}(s') γ_k(s', s) β_k(s) )
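Once α, β and γ are available, the LLR of a bit is a ratio of two branch sums. A minimal sketch of that combination step (the function name and data layout — lists for α, β and a dict of branch metrics for γ — are ours):

```python
import math

def llr_bit(alpha_prev, gamma_k, beta_k, S1, S0):
    """L(u_k) from sigma_k(s', s) = alpha_{k-1}(s') gamma_k(s', s) beta_k(s):
    the numerator sums over the branches in S(1) (caused by u_k = +1),
    the denominator over the branches in S(0)."""
    num = sum(alpha_prev[sp] * gamma_k[(sp, s)] * beta_k[s] for sp, s in S1)
    den = sum(alpha_prev[sp] * gamma_k[(sp, s)] * beta_k[s] for sp, s in S0)
    return math.log(num / den)
```

With symmetric α, β and uniform γ, the two sums are equal and the LLR is zero, i.e. the decoder has no preference for either bit value.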


Derivation of LLR

[Figure: trellis from S_0 to S_N aligned with the received sequence y = y_1 … y_{k-1}, y_k, y_{k+1} … y_N; α_{k-1}(s') summarizes the past, γ_k(s', s) the current branch, β_k(s) the future]


Computation of αk(s)

α_k(s) = Pr[S_k = s, y_1^k] = Σ_{all s'} Pr[S_{k-1} = s', S_k = s, y_1^k]

Example:

α_k(s0) = Pr[S_k = s0, y_1^k] = Pr[S_{k-1} = s0, S_k = s0, y_1^k] + Pr[S_{k-1} = s2, S_k = s0, y_1^k]

α_k(s) = Σ_{all s'} Pr[S_{k-1} = s', y_1^{k-1}] · Pr[S_k = s, y_k | S_{k-1} = s', y_1^{k-1}]

= Σ_{all s'} Pr[S_{k-1} = s', y_1^{k-1}] · Pr[S_k = s, y_k | S_{k-1} = s']

α_k(s) = Σ_{all s'} α_{k-1}(s') γ_k(s', s)


Computation of αk(s)

Forward Recursive Equation:

α_k(s) = Σ_{all s'} α_{k-1}(s') γ_k(s', s)

Given the values of γ_k(s', s) for all indices k, the probabilities α_k(s) can be computed by forward recursion. The initial condition α_0(s) depends on the initial state of the convolutional encoder:

α_0(s) = Pr[S_0 = s]

The encoder usually starts at state s0:

α_0(s0) = 1, α_0(s) = 0 for s ≠ s0
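The forward recursion can be sketched directly from this equation. Here the branch metrics are passed as a list of dicts mapping (s', s) to γ_k(s', s); that data layout, and the function name, are our assumptions for illustration.

```python
def forward_alphas(gammas, n_states):
    """Forward recursion alpha_k(s) = sum_{s'} alpha_{k-1}(s') gamma_k(s', s).
    gammas[k-1] maps each branch (s', s) to gamma_k(s', s); the encoder is
    assumed to start in state s0 = 0."""
    alpha = [[0.0] * n_states]
    alpha[0][0] = 1.0                        # alpha_0(s0) = 1, 0 elsewhere
    for g in gammas:
        nxt = [0.0] * n_states
        for (sp, s), val in g.items():
            nxt[s] += alpha[-1][sp] * val    # accumulate over predecessors s'
        alpha.append(nxt)
    return alpha
```

In practice the α values shrink quickly and are normalized at each step (or computed in the log domain); that normalization cancels in the LLR ratio, so it is omitted from this sketch.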


Computation of βk(s)

Backward Recursive Equation:

β_k(s) = Pr[y_{k+1}^N | S_k = s]

= Σ_{all s''} Pr[S_{k+1} = s'', y_{k+1}, y_{k+2}^N | S_k = s]

= Σ_{all s''} Pr[y_{k+2}^N | S_{k+1} = s''] · Pr[S_{k+1} = s'', y_{k+1} | S_k = s]

β_k(s) = Σ_{all s''} β_{k+1}(s'') γ_{k+1}(s, s'')

Given the values of γ_k(s', s) for all indices k, the probabilities β_k(s) can be computed by backward recursion. The initial condition β_N(s) depends on the final state of the trellis:

The first encoder is usually terminated at state s0: β_N(s0) = 1, β_N(s) = 0 for s ≠ s0

The second encoder usually has an open trellis: β_N(s) = 1 for all s
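The backward recursion mirrors the forward one, with the two boundary conditions above selected by a flag. Same assumed data layout as before (a list of dicts of branch metrics); the function name is ours.

```python
def backward_betas(gammas, n_states, terminated=True):
    """Backward recursion beta_k(s) = sum_{s''} beta_{k+1}(s'') gamma_{k+1}(s, s'').
    terminated=True models a trellis driven back to state s0 (first encoder);
    terminated=False models an open trellis (second encoder)."""
    if terminated:
        beta_N = [0.0] * n_states
        beta_N[0] = 1.0                      # beta_N(s0) = 1, 0 elsewhere
    else:
        beta_N = [1.0] * n_states            # open trellis: all end states allowed
    betas = [beta_N]
    for g in reversed(gammas):
        prev = [0.0] * n_states
        for (sp, s), val in g.items():
            prev[sp] += betas[0][s] * val    # accumulate over successors s''
        betas.insert(0, prev)
    return betas
```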


Computation of γk(s’,s)

γ_k(s', s) = Pr[S_k = s, y_k | S_{k-1} = s']

= Pr[y_k | S_{k-1} = s', S_k = s] · Pr[S_k = s | S_{k-1} = s']

= Pr[y_k | S_{k-1} = s', S_k = s] · Pr^a[u_k]

= Pr[y_k | x_k] · Pr^a[u_k]

Note that y_k = (y_k^s, y_k^p) and x_k = (x_k^s, x_k^p), so

Pr[y_k | x_k] = Pr[y_k^s | x_k^s] · Pr[y_k^p | x_k^p]


Computation of γk(s’,s)

For an AWGN channel with noise variance σ^2:

Pr[y_k | x_k] = (1/√(2πσ^2)) e^{-(y_k^s - x_k^s)^2 / 2σ^2} · (1/√(2πσ^2)) e^{-(y_k^p - x_k^p)^2 / 2σ^2}

= (1/(2πσ^2)) e^{-(y_k^s - x_k^s)^2 / 2σ^2} e^{-(y_k^p - x_k^p)^2 / 2σ^2}

The a priori LLR is

L^a(u_k) = ln( Pr^a[u_k = +1] / Pr^a[u_k = -1] )

so that

Pr^a[u_k = +1] = e^{L^a(u_k)} / (1 + e^{L^a(u_k)}), Pr^a[u_k = -1] = 1 / (1 + e^{L^a(u_k)})

Both cases can be written compactly as

Pr^a[u_k] = ( e^{L^a(u_k)/2} / (1 + e^{L^a(u_k)}) ) · e^{u_k L^a(u_k)/2}

Therefore

γ_k(s', s) = (1/(2πσ^2)) e^{-(y_k^s - x_k^s)^2 / 2σ^2} e^{-(y_k^p - x_k^p)^2 / 2σ^2} · ( e^{L^a(u_k)/2} / (1 + e^{L^a(u_k)}) ) · e^{u_k L^a(u_k)/2}
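The branch metric for the AWGN channel is a direct transcription of this last expression. A minimal sketch (function name and argument order are ours; symbols are ±1 coded values):

```python
import math

def gamma_branch(ys, yp, xs, xp, La, u, sigma):
    """Branch metric gamma_k(s', s) = Pr[y_k | x_k] * Pr^a[u_k] for an AWGN
    channel with noise variance sigma^2; (xs, xp) are the +/-1 systematic and
    parity symbols on the branch, u the +/-1 input bit driving it, and La the
    a priori LLR of that bit."""
    channel = math.exp(-((ys - xs) ** 2 + (yp - xp) ** 2) / (2 * sigma ** 2)) \
              / (2 * math.pi * sigma ** 2)
    apriori = math.exp(La / 2) / (1 + math.exp(La)) * math.exp(u * La / 2)
    return channel * apriori
```

With La = 0 the a priori factor reduces to 1/2 for both bit values, so the metric is driven purely by the channel observations; as |La| grows, branches consistent with the a priori information are weighted up. Feeding these metrics into the forward and backward recursions completes one component MAP decoder.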