ETSF15 Physical layer communication

Stefan Höst

Physical layer

• Analog vs digital (previous lecture)

• Transmission media

• Modulation

• Represent digital data in a continuous world

• Disturbances, noise and distortion

• Information


Transmission media

Guided media

• Fibre optic cable

• Twisted pair copper cables

• Coax cable

Unguided media

• Radio

• Microwave

• Infrared

Fibre optic

• Transmission is done by light in a (very thin) glass core

• Total reflection from core to cladding

• Several fibres can be bundled in one cable

• Not disturbed by radio signals


Optical network architecture

Point to point

• Two nodes are connected by one dedicated fibre

Point to multi-point

• One point is connected to several end nodes

• PON (Passive Optical Network)


Twisted pair copper cables

Two copper lines twisted around each other

• Twisting decreases disturbances (and emission)

• Used for

• Telephony grid (CAT3)

• Ethernet (CAT5, CAT6 and sometimes CAT7)

Coax cable

One conductor surrounded by a shield

• Used for

• Antenna signals

• Measurement instrumentation

Radio structures

§ Single antenna system

§ MIMO (Multiple In, Multiple Out)

From bits to signals

§ Principles of digital communications

[Figure: digital data transported over the Internet]

On-off keying

• Send one bit during Tb seconds and use two signal levels, "on" and "off", for 1 and 0.

Ex. [Figure: waveform s(t) for x = 10010010101111100, switching between A and 0 in each interval of length T]

a(t) = A · x, 0 ≤ t ≤ Tb


Non-return to zero (NRZ)

• Send one bit during Tb seconds and use two signal levels, +A and −A, for 0 and 1.

Ex. [Figure: waveform s(t) for x = 10010010101111100, switching between A and −A in each interval of length T]

a(t) = A · (−1)^x, 0 ≤ t ≤ Tb


Description of general signal

With the pulse form g(t) = A, 0 ≤ t < Tb, the signals can be described as

s(t) = Σn an · g(t − nTb)


Two signal alternatives

On-off keying

an = xn ⇒ s0(t) = 0 and s1(t) = g(t)

NRZ

an = (−1)^xn ⇒ s0(t) = g(t) and s1(t) = −g(t)
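As an illustration of the general description s(t) = Σn an · g(t − nTb), here is a minimal Python sketch (not from the slides; sample rate and bit time are arbitrary choices) that builds both signal alternatives with the rectangular pulse g(t) = A on [0, Tb):

```python
# Sketch: rectangular-pulse PAM, s(t) = sum_n a_n * g(t - n*Tb).
import numpy as np

A, Tb, fs = 1.0, 1e-3, 100e3   # amplitude, bit time [s], sample rate [Hz] (assumed)
bits = np.array([1, 0, 0, 1, 0, 0, 1, 0, 1, 0, 1, 1, 1, 1, 1, 0, 0])  # x from the slides

def pam_waveform(amplitudes, Tb, fs):
    """Hold each amplitude a_n constant for Tb seconds (rectangular g)."""
    return np.repeat(amplitudes, int(Tb * fs))

s_ook = pam_waveform(A * bits, Tb, fs)            # on-off keying: a_n = x_n
s_nrz = pam_waveform(A * (-1.0) ** bits, Tb, fs)  # NRZ: a_n = (-1)^x_n
```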

Manchester coding

• To get a zero passing in each signal time, split the pulse shape g(t) in two parts and use +/− as amplitude.

Ex. [Figure: Manchester pulse g(t) = +A for 0 ≤ t < T/2 and −A for T/2 ≤ t < T]


Differential Manchester coding

§ Use a zero transition at the start to indicate the data.

§ For a transmitted 0 the same pulse as in the previous slot is used, while for a transmitted 1 the inverted pulse is used, i.e. an = an−1 · (−1)^xn
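A minimal sketch of both Manchester variants, assuming the split pulse from the slide (g(t) = +A on [0, T/2), −A on [T/2, T)); the helper names are mine:

```python
# Sketch: Manchester and differential Manchester pulse shaping.
import numpy as np

def split_pulse(spb):
    """g(t): +1 on the first half of the symbol, -1 on the second half."""
    half = spb // 2
    return np.concatenate([np.ones(half), -np.ones(spb - half)])

def manchester(bits, spb=10):
    g = split_pulse(spb)
    return np.concatenate([(-1.0) ** b * g for b in bits])

def diff_manchester(bits, spb=10):
    g, a, out = split_pulse(spb), 1.0, []
    for b in bits:
        a *= (-1.0) ** b          # a_n = a_{n-1} * (-1)^x_n: 0 keeps, 1 inverts
        out.append(a * g)
    return np.concatenate(out)

s = diff_manchester([1, 0, 0, 1, 0])
```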


PAM (Pulse Amplitude Modulation)

§ NRZ and Manchester are forms of binary PAM

§ The data is stored in the amplitude and transmitted with a pulse shape g(t)

§ Graphical representation


a(t) = an · g(t), an = (−1)^xn

M-PAM

§ Use M amplitude levels to represent k = log2(M) bits

§ Ex. Two bits per signal (4-PAM)

[Figure: 4-PAM waveform s(t) for x = 10 01 00 10 10 11 11 10 00, with each bit pair (00, 01, 10, 11) mapped to one of the levels 3A, A, −A, −3A]

M-PAM

§ Ex: 4-PAM [Figure: 4-PAM constellation]

§ Ex: 8-PAM [Figure: 8-PAM constellation]
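The exact bit-pattern-to-level labels in the waveform figure are not fully recoverable, so this sketch assumes a common Gray mapping for 4-PAM; treat the table as illustrative:

```python
# Sketch: 4-PAM mapping, k = log2(4) = 2 bits per amplitude level.
import numpy as np

A = 1.0
LEVELS = {(0, 0): -3 * A, (0, 1): -1 * A, (1, 1): 1 * A, (1, 0): 3 * A}  # assumed Gray map

def map_4pam(bits):
    """Group the bit stream in pairs and map each pair to one of four levels."""
    return np.array([LEVELS[p] for p in zip(bits[0::2], bits[1::2])])

x = [1, 0, 0, 1, 0, 0, 1, 0, 1, 0, 1, 1, 1, 1, 1, 0, 0, 0]  # x from the slide
amplitudes = map_4pam(x)   # nine levels in {-3A, -A, A, 3A}
```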

Bandwidth of signal

§ The bandwidth, W, is the (positive side) frequency band occupied by the signal

§ So far only base-band signals (centered around f = 0)


Pass-band signal

§ Frequency modulate the signal to a carrier frequency f0


§ The following multiplication centers the signal around the carrier frequency f0:

s(t) = a(t) · cos(2πf0t), where a(t) is a base-band signal

Modulation in frequency

[Figure: in the time domain, the 4-PAM base-band signal s(t) (levels ±A, ±3A) is multiplied by cos(2πf0t), giving sf0(t). In the frequency domain this convolves S(f) with ½(δ(f + f0) + δ(f − f0)), giving ½(S(f + f0) + S(f − f0)), i.e. the spectrum shifted to ±f0]
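A small numerical sketch of this shift (all values are illustration choices): multiply a base-band PAM signal by a carrier and compare where the spectral energy sits before and after.

```python
# Sketch: moving a base-band signal to a carrier frequency f0.
import numpy as np

fs, f0 = 100e3, 10e3                      # sample rate and carrier [Hz] (assumed)
a = np.repeat(np.random.choice([-3.0, -1.0, 1.0, 3.0], 100), 100)  # base-band 4-PAM
t = np.arange(len(a)) / fs
s = a * np.cos(2 * np.pi * f0 * t)        # s(t) = a(t) cos(2*pi*f0*t)

f = np.fft.rfftfreq(len(a), 1 / fs)
print(f[np.argmax(np.abs(np.fft.rfft(a)))])  # energy near 0 Hz (base-band)
print(f[np.argmax(np.abs(np.fft.rfft(s)))])  # energy near f0 = 10 kHz (pass-band)
```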

ASK (Amplitude Shift Keying)

§ Use on-off keying at frequency f0.

§ Ex.

s(t) = Σn xn g(t − nT) cos(2πf0t)


BPSK (Binary Phase Shift Keying)

§ Use NRZ at frequency f0, but view the information in the phase

s(t) = Σn (−1)^xn g(t − nT) cos(2πf0t) = Σn g(t − nT) cos(2πf0t + xnπ)

[Figure: BPSK waveform s(t) for x = 10010010101111100, amplitude between A and −A]
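A minimal sketch contrasting the two carrier modulations; same rectangular pulse and bit string as before, carrier and sample-rate values assumed:

```python
# Sketch: ASK vs BPSK on a carrier f0.
import numpy as np

fs, f0, T = 200e3, 20e3, 1e-3
bits = np.array([1, 0, 0, 1, 0, 0, 1, 0, 1, 0, 1, 1, 1, 1, 1, 0, 0])
n = int(T * fs)                            # samples per symbol
t = np.arange(len(bits) * n) / fs
carrier = np.cos(2 * np.pi * f0 * t)

s_ask  = np.repeat(bits, n) * carrier             # amplitude carries the bit
s_bpsk = np.repeat((-1.0) ** bits, n) * carrier   # phase 0 or pi carries the bit
```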

M-QAM (Quadrature Amplitude Modulation)

Use that cos(2πf0t) and sin(2πf0t) are orthogonal (for high f0) to combine two orthogonal M-PAM constellations

[Figure: QAM modulator/demodulator block diagram with branches g(t)cos(2πf0t) and g(t)sin(2πf0t)]
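A minimal sketch of the I/Q idea (constellation values and carrier are illustration choices): two 4-PAM streams on cos and sin give 16-QAM, and the orthogonality lets a correlating receiver separate them again.

```python
# Sketch: 16-QAM as two 4-PAM streams on orthogonal carriers.
import numpy as np

fs, f0, T, N = 200e3, 20e3, 1e-3, 50
n = int(T * fs)                           # f0*T = 20 carrier cycles per symbol
aI = np.random.choice([-3, -1, 1, 3], N)  # in-phase amplitudes
aQ = np.random.choice([-3, -1, 1, 3], N)  # quadrature amplitudes
t = np.arange(N * n) / fs
s = (np.repeat(aI, n) * np.cos(2 * np.pi * f0 * t)
     - np.repeat(aQ, n) * np.sin(2 * np.pi * f0 * t))

# Correlate one symbol with each carrier: the other branch integrates to ~0.
aI_hat = 2 / n * np.sum(s[:n] * np.cos(2 * np.pi * f0 * t[:n]))
aQ_hat = -2 / n * np.sum(s[:n] * np.sin(2 * np.pi * f0 * t[:n]))
print(aI[0], aI_hat, aQ[0], aQ_hat)
```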


OFDM (Orthogonal Frequency Division Multiplexing)

§ N QAM signals combined in an orthogonal manner

§ Used in e.g. ADSL, VDSL, WiFi, DVB-C/T/H, LTE, etc.

[Figure: N overlapping subcarrier spectra along f, together occupying the bandwidth W]


Idea of OFDM implementation

[Figure: OFDM transmitter. Bits (a1, …, aN) ∈ Z16^N go through QAM mapping to (x1, …, xN) ∈ C^N in the frequency domain; the IFFT gives the time-domain samples (y1, …, yN) = IFFT(x)]
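A minimal numerical sketch of the block diagram (subcarrier count and constellation are assumed): QAM symbols placed in the frequency domain, an IFFT producing the time-domain OFDM symbol, and the FFT at the receiver undoing it.

```python
# Sketch: OFDM symbol generation via the IFFT.
import numpy as np

N = 64                                        # number of subcarriers (assumed)
levels = np.array([-3, -1, 1, 3])
x = (np.random.choice(levels, N)              # 16-QAM symbols, frequency domain
     + 1j * np.random.choice(levels, N))
y = np.fft.ifft(x)                            # time-domain samples (y_1,...,y_N)

x_hat = np.fft.fft(y)                         # receiver: FFT recovers the symbols
assert np.allclose(x, x_hat)
```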

Some important parameters

§ Ts time per symbol

§ Rs symbols per second

§ Es energy per symbol

§ k = bits per symbol

§ Tb = Ts/k time per bit

§ Rb = k·Rs bits per second [bps]

§ Eb = Es/k energy per bit

§ SNR, Signal to noise ratio

♦ average signal power relative to noise power

§ W Bandwidth, frequency band occupied by the signal

§ Bandwidth utilisation: bits per second per Hz [bps/Hz]

ρ = Rb / W
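A small worked example (illustration values) tying the parameters together:

```python
# Sketch: relations between symbol-level and bit-level parameters.
k, Ts, W = 4, 1e-6, 1e6   # bits/symbol, symbol time [s], bandwidth [Hz] (assumed)
Rs = 1 / Ts               # symbol rate: 1e6 symbols per second
Rb = k * Rs               # bit rate: 4e6 bps
Tb = Ts / k               # time per bit: 0.25 us
rho = Rb / W              # bandwidth utilisation: 4 bps/Hz
print(Rs, Rb, Tb, rho)
```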

Impairments on the communication channel (link)

§ Attenuation

§ Multipath propagation (fading)

§ Noise


y(t) = x(t) ∗ h(t) + n(t)
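A minimal sketch of this model with an assumed two-tap impulse response (direct path plus one echo) and Gaussian noise:

```python
# Sketch: y = x * h + n, convolution channel with additive noise.
import numpy as np

x = np.repeat(np.random.choice([-1.0, 1.0], 50), 10)  # NRZ input signal
h = np.array([1.0, 0.0, 0.5])    # assumed channel: direct path + delayed echo
sigma = 0.1                      # assumed noise level

y = np.convolve(x, h) + sigma * np.random.randn(len(x) + len(h) - 1)
```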

Noise disturbances

• Thermal noise (Johnson–Nyquist)

• Generated by current in a conductor

• −174 dBm/Hz (= 3.98·10⁻¹⁸ mW/Hz)

• Impulse noise (often user generated, e.g. electrical switches)

• Intermodulation noise (from other systems)

• Cross-talk (users in the same system)

• Background noise (misc. disturbances)

https://en.wikipedia.org/wiki/Johnson-Nyquist_noise


Some Information Theory

Entropy

§ Discrete case: X discrete random variable

Entropy is the uncertainty of the outcome (for the discrete case)

§ Continuous case: X continuous random variable


H(X) = E[−log2 p(X)] = −Σx p(x) log2 p(x)

H(X) = E[−log2 f(X)] = −∫R f(x) log2 f(x) dx

Example Entropy

Let X be a binary random variable with

P(X=0) = p

P(X=1) = 1 − p.

The binary entropy function is


h(p) = −p log2 p − (1 − p) log2(1 − p)
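A direct evaluation of h(p), reused in the examples further on:

```python
# Sketch: the binary entropy function.
import math

def h(p):
    """h(p) = -p log2 p - (1 - p) log2(1 - p), with h(0) = h(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(h(0.5))     # 1.0: maximal uncertainty at p = 1/2
print(h(0.25))    # ~0.8113, used in the mutual-information example below
```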

Figure 3.1: The binary entropy function h(p); maximum 1 at p = 1/2.

The definition of the entropy is also valid for vectorised random variables, such as (X, Y) with the joint probability function p(x, y).

DEFINITION 3.5 The joint entropy for a pair of random variables with the joint distribution p(x, y) is

H(X, Y) = E_XY[−log p(X, Y)] = −Σx,y p(x, y) log p(x, y)   (3.23)

Similarly, in the general case with an n-dimensional vector X = (X1, …, Xn), the joint entropy function is

H(X1, …, Xn) = E_X[−log p(X)] = −Σx p(x) log p(x)   (3.24)

EXAMPLE 3.6 Let X and Y be the outcomes from two independent fair dice. Then the joint probability is P(X, Y = x, y) = 1/36 and the joint entropy

H(X, Y) = −Σx,y (1/36) log(1/36) = log 36 = 2 log 6 ≈ 5.1699   (3.25)

Clearly, the uncertainty of the outcome of two dice is twice the uncertainty of one die.

Let Z be the sum of the dice, Z = X + Y. The probabilities are shown in the following table

Z     2     3     4     5     6     7     8     9     10    11    12
P(Z)  1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36
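Checking the example numerically, including the entropy of the sum Z (which is smaller than H(X,Y), since the sum reveals less than the pair):

```python
# Sketch: joint entropy of two fair dice and entropy of their sum.
import math
from collections import Counter

outcomes = [(x, y) for x in range(1, 7) for y in range(1, 7)]
H_XY = -sum((1 / 36) * math.log2(1 / 36) for _ in outcomes)
print(H_XY)      # log2(36) ~ 5.1699, matching (3.25)

counts = Counter(x + y for x, y in outcomes)   # P(Z) = counts/36, as in the table
H_Z = -sum((c / 36) * math.log2(c / 36) for c in counts.values())
print(H_Z)       # ~3.27 bits < H(X,Y)
```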

Compression

The entropy sets a limit on the compression ratio

§ Consider a source for X with N symbols and the distribution P. On average a symbol must be represented by at least H(P) bits.

§ Well known compression algorithms are zip, gz, png, Huffman

§ Lossy compression e.g. jpeg and mpeg


Some more Information Theory

Mutual information

§ Let X and Y be two random variables

§ The information about X by observing Y is given by


I(X;Y) = E[ log2( P(X,Y) / (P(X)·P(Y)) ) ]

§ This gives

I(X;Y) = H(X) + H(Y) − H(X,Y)

Example Mutual Information

The random variables X and Y have the joint distribution

P(X,Y)  Y=0  Y=1
X=0     0    3/4
X=1     1/8  1/8

That gives

P(X=0) = 3/4, P(X=1) = 1/4, P(Y=0) = 1/8 and P(Y=1) = 7/8

Entropies:

H(X) = h(1/4) = 0.8113
H(Y) = h(1/8) = 0.5436
H(X,Y) = −(3/4) log2(3/4) − (1/8) log2(1/8) − (1/8) log2(1/8) = 1.0613

Information:

I(X;Y) = H(X) + H(Y) − H(X,Y) = 0.2936
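Recomputing the example from the joint table (a straightforward check, no assumptions beyond the table):

```python
# Sketch: I(X;Y) = H(X) + H(Y) - H(X,Y) from the joint distribution.
import math

joint = {(0, 0): 0.0, (0, 1): 3 / 4, (1, 0): 1 / 8, (1, 1): 1 / 8}

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

px = [3 / 4, 1 / 4]                 # marginals of X
py = [1 / 8, 7 / 8]                 # marginals of Y
I = H(px) + H(py) - H(joint.values())
print(I)                            # ~0.2936
```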

Some Information Theory

Channel capacity

§ The channel is a model of the transmission link.

§ Transmit X and receive Y. How much information can the receiver get from the transmitter?

§ The channel capacity is defined as


C = max over p(x) of I(X;Y)

AWGN (Additive White Gaussian Noise) channel

§ Let X be band limited in bandwidth W

§ Y = X + N, where N ∼ N(0, N0/2)

§ The capacity is

C = W log2(1 + P/(N0·W))  [bps]

§ where P is the power of X, i.e. E[X²] = P.

§ It is not possible to get a higher data rate on this channel!
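The capacity formula as a one-line function (a sketch; units follow the slide, so P in power units and N0 in power per Hz):

```python
# Sketch: AWGN channel capacity C = W log2(1 + P/(N0 W)).
import math

def awgn_capacity(W, P, N0):
    """Capacity in bit/s for bandwidth W [Hz], signal power P, noise PSD N0."""
    return W * math.log2(1 + P / (N0 * W))
```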

AWGN Example (VDSL)

§ Consider a channel with

W = 17 MHz, signal PSD = −60 dBm/Hz, N0 = −145 dBm/Hz

§ Power: P = 10^(−60/10) · 17·10^6 mW

§ Noise: N0 = 10^(−145/10) mW/Hz

§ Capacity: C = W log2(1 + P/(N0·W)) = W log2(1 + 10^(−60/10)/10^(−145/10)) ≈ 480 Mbps
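Reproducing the numbers (dBm values converted to mW, hence the 10^(x/10) factors):

```python
# Sketch: the VDSL capacity example.
import math

W = 17e6                      # bandwidth [Hz]
P = 10 ** (-60 / 10) * W      # power [mW]: PSD (-60 dBm/Hz) times bandwidth
N0 = 10 ** (-145 / 10)        # noise PSD [mW/Hz]
C = W * math.log2(1 + P / (N0 * W))
print(C / 1e6)                # ~480 Mbps
```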

Shannon’s fundamental limit

§ Plot capacity vs W

§ Is there a limit?


Figure 9.7: Capacity as a function of the bandwidth W for a fixed total signal power P.

where Es is the average energy per transmitted symbol, Eb the average energy per information bit and Tb = Ts/k the average transmission time for each information bit. The variable Eb is very important since it can be compared between different systems, without having the same number of bits per symbol or even the same coding rate. Hence, a system independent signal to noise ratio can be derived as SNR = Eb/N0. From (9.53) the energy per information bit can be written as

Eb = P·Ts / k   (9.54)

Since the achieved bit rate can be expressed as Rb = k/Ts, the ratio between C∞ and the bit rate Rb can be written

C∞/Rb = (P/N0)/ln 2 · Ts/k = (Eb/N0)/ln 2 > 1   (9.55)

where it is used that reliable communication requires C∞ > Rb. Rewriting the above concludes that for reliable communication

Eb/N0 > ln 2 = 0.69 = −1.59 dB   (9.56)

The value −1.6 dB is a well known bound in communication theory, and is often referred to as the Shannon limit or the fundamental limit. It constitutes a hard limit for when it is possible to achieve reliable communication. If the SNR is less than this limit it is not possible to reach an error probability that tends to zero, independent of what system is used. To see the demands this limit puts on the system, the capacity formula will be rewritten to show a limit on the bandwidth efficiency Rb/W in the used system. Often the bandwidth is a limited resource and then to reach a high data rate the system should have a high bandwidth efficiency.

§ Let W → ∞

§ With Eb = P·Tb and Ts = k·Tb

C∞ = lim(W→∞) W log2(1 + P/(N0·W)) = log2(e) · P/N0 = P/(N0 ln 2)

C∞/Rb = (Eb/N0)/ln 2 > 1

Eb/N0 > ln 2 = −1.59 dB

§ Which gives the fundamental limit
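A numerical check of both statements (P and N0 are illustration values): C grows with W but saturates at C∞ = P/(N0 ln 2), and ln 2 expressed in dB is the −1.59 dB limit.

```python
# Sketch: capacity saturation and the Shannon limit.
import math

P, N0 = 1.0, 1e-9
for W in (1e6, 1e9, 1e12):
    print(W, W * math.log2(1 + P / (N0 * W)))   # approaches C_inf
print(P / (N0 * math.log(2)))                   # C_inf ~ 1.44e9 bit/s
print(10 * math.log10(math.log(2)))             # ln 2 in dB: ~ -1.59 dB
```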

AWGN with attenuation

§ Let X be band limited in bandwidth W

§ Let G be the attenuation on the channel, G < 1

§ The capacity is

C = W log2(1 + |G|²·P/(N0·W))  [in bit/s]
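The same capacity function as before, with the power scaled by |G|²; a short sketch reusing the VDSL numbers with an assumed attenuation:

```python
# Sketch: AWGN capacity with attenuation |G|^2 on the signal power.
import math

def capacity(W, P, N0, G=1.0):
    return W * math.log2(1 + abs(G) ** 2 * P / (N0 * W))

W, P, N0 = 17e6, 10 ** (-60 / 10) * 17e6, 10 ** (-145 / 10)
print(capacity(W, P, N0) / 1e6)         # ~480 Mbps, no attenuation
print(capacity(W, P, N0, G=0.1) / 1e6)  # lower: |G|^2 = 0.01 costs 20 dB of SNR
```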