Transcript of Chapter 1 Introduction to Digital and Analog Communication Systems (Sections 1 – 9)

Page 1

Chapter 1

Introduction to Digital and Analog Communication Systems

(Sections 1 – 9)

Page 2

Communication Model

Source → Transmitter → Channel → Receiver → Destination

The transmitter consists of signal processing followed by communication (transmitting) circuits; the receiver consists of communication (receiving) circuits followed by signal processing.

Examples of each block:

• Source: computer, telephone, human, camera, server

• Transmitter: modem, laser, microwave transmitter, microphone

• Channel: wire, air, optical fiber, water

• Receiver: modem, photodetector, microwave dish, microphone, radio

• Destination: computer, human, speaker, video screen, server

Definition: A communication system is a system which transmits information from a source to a destination.

Page 3

Goals of ESE 471

• Understand how communication systems work.

• Develop mathematical models for methods and components of communication systems.

• Analyze performance of communication systems and methods.

• Understand practical systems in use.

Page 4

Page 5

Page 6

Definitions

• A digital information source produces a finite set of possible messages.

• An analog information source produces messages that are defined on a continuum.

• A digital communication system transfers information from a digital source to the intended receiver.

• An analog communication system transfers information from an analog source to the intended receiver.

• A signal is a measurable quantity (e.g., a voltage) that bears information.

• Noise is a measurable quantity that carries undesired interference.

Page 7

Why Digital?

• Less expensive circuits

• Privacy and security

• Small signals (less power)

• Converged multimedia

• Error correction and reduction

Page 8

Why Not Digital?

• More bandwidth

• Synchronization in electrical circuits

• Approximated information

Page 9

Couch, Digital and Analog Communication Systems, Seventh Edition ©2007 Pearson Education, Inc. All rights reserved. 0-13-142492-0

Realistic Communication Model

Noise and errors make communication inaccurate. The goal is to achieve $\tilde{m}(t) = m(t)$, or make them as close as possible. Here $m(t)$ is the original signal at baseband, and $s(t)$ is the transmitted signal at carrier frequency.

Page 10

Wireless Frequency Allocations in the U.S.A.

Page 11

Page 12

Modes of Wireless Wave Propagation

Page 13

Information

• Electronic communication systems are inherently analog.

• The process of communication is inherently discrete.

• Limitations:
– Noise or uncertainty
– Changes in system characteristics

Page 14

Measure of Information

• The amount of information sent from a digital source when the jth message is transmitted is given by

$I_j = \log_2 \left( \frac{1}{P_j} \right)$ bits

where $P_j$ is the probability of transmitting the jth message ($0 \le P_j \le 1$, $I_j \ge 0$).

• The average information measure of a digital source is given by

$H = \sum_{j=1}^{m} P_j I_j = \sum_{j=1}^{m} P_j \log_2 \left( \frac{1}{P_j} \right)$ bits

where m is the number of possible different source messages. H is called the entropy. H is maximum when all messages are equally likely.
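As a quick illustration, here is a minimal Python sketch of both formulas; the example probabilities below are illustrative, not from the slides:

```python
import math

def information_bits(p: float) -> float:
    """Information carried by a message of probability p: I = log2(1/p) bits."""
    return math.log2(1.0 / p)

def entropy_bits(probs: list[float]) -> float:
    """Average information (entropy): H = sum_j P_j * log2(1/P_j) bits."""
    return sum(p * information_bits(p) for p in probs if p > 0)

# Entropy is maximized when all messages are equally likely.
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits (maximum for m = 4)
print(entropy_bits([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits (< log2(4))
```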

Page 15

• The redundancy of a source is H / Hmax; e.g., a parity bit adds redundancy to a source with an 8-bit output.

• Deterministic signal processing never creates information.

• If the source generates independent messages once every T seconds, then the source rate R is

$R = \frac{H}{T}$ bits/second

R can be viewed as the average information per second.
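A one-line illustration with assumed numbers (the 4-message equally likely source and T = 1 ms are mine, not from the slides):

```python
# Source rate R = H / T: a source emitting one of 4 equally likely
# messages every T = 1 ms carries H = log2(4) = 2 bits per message.
H = 2.0    # bits per message
T = 1e-3   # seconds between independent messages (illustrative)
R = H / T
print(R)   # 2000.0 bits/second
```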

Page 16

Example 1-1

Digital words consist of 12 digits, and each digit can be one of 4 equally likely possibilities.

Each word, j, has probability of occurrence

$P_j = \left( \frac{1}{4} \right)^{12} = \frac{1}{4^{12}}$

Then:

$I_j = \log_2 \left( \frac{1}{(1/4)^{12}} \right) = 12 \log_2 4 = 24$ bits

$H = \sum_{j=1}^{4^{12}} P_j I_j = 4^{12} \cdot \frac{1}{4^{12}} \cdot 24 = 24$ bits
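A quick Python check of this result:

```python
import math

m = 4 ** 12             # number of distinct 12-digit words
P_j = 1.0 / m           # each word is equally likely
I_j = math.log2(1 / P_j)
print(I_j)              # 24.0 bits per word

# With all words equally likely, H = sum_j P_j * I_j = I_j.
H = m * P_j * I_j
print(H)                # 24.0 bits
```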

Page 17

Another Example: 2 Dice

Suppose we are throwing 2 dice simultaneously and sum the two numbers. The outcome, x, can be 2 ≤ x ≤ 12.

The possible scenarios for these outcomes are:

x = 2: (1, 1)   Pr(x = 2) = 1/36
x = 3: (1, 2), (2, 1)   Pr(x = 3) = 2/36
x = 4: (1, 3), (2, 2), (3, 1)   Pr(x = 4) = 3/36
x = 5: (1, 4), (2, 3), (3, 2), (4, 1)   Pr(x = 5) = 4/36
x = 6: (1, 5), (2, 4), (3, 3), (4, 2), (5, 1)   Pr(x = 6) = 5/36
x = 7: (1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)   Pr(x = 7) = 6/36
x = 8: (2, 6), (3, 5), (4, 4), (5, 3), (6, 2)   Pr(x = 8) = 5/36
x = 9: (3, 6), (4, 5), (5, 4), (6, 3)   Pr(x = 9) = 4/36
x = 10: (4, 6), (5, 5), (6, 4)   Pr(x = 10) = 3/36
x = 11: (5, 6), (6, 5)   Pr(x = 11) = 2/36
x = 12: (6, 6)   Pr(x = 12) = 1/36

Page 18

The information in each outcome, using $I_x = \log_2 \left( 1 / \Pr(x) \right)$:

$I_2 = I_{12} = \log_2 36 = 5.17$ bits
$I_3 = I_{11} = \log_2 18 = 4.17$ bits
$I_4 = I_{10} = \log_2 12 = 3.58$ bits
$I_5 = I_9 = \log_2 9 = 3.17$ bits
$I_6 = I_8 = \log_2 7.2 = 2.85$ bits
$I_7 = \log_2 6 = 2.58$ bits

The entropy of the sum is then

$H = \sum_x \Pr(x) I_x = \frac{2}{36}(5.17) + \frac{4}{36}(4.17) + \frac{6}{36}(3.58) + \frac{8}{36}(3.17) + \frac{10}{36}(2.85) + \frac{6}{36}(2.58) \approx 3.27$ bits

(Each coefficient combines the two symmetric outcomes; e.g., 2/36 covers x = 2 and x = 12.)
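A short Python check that reproduces the probability table on the previous page and this entropy, by enumerating all 36 equally likely pairs:

```python
from collections import Counter
from math import log2

# Tally the sums of all 36 equally likely (die1, die2) pairs.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
probs = {x: n / 36 for x, n in counts.items()}

I = {x: log2(1 / p) for x, p in probs.items()}   # information per outcome
H = sum(p * I[x] for x, p in probs.items())      # entropy of the sum
print(I[2], I[7])  # 5.17 and 2.58 bits (rounded)
print(H)           # ~3.27 bits
```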

Page 19

Dice Example Continued

• H ≈ 3.27 bits implies that communicating the outcome of throwing two dice requires a minimum of 3.27 bits per outcome on average. How can we achieve this?

• For example, when we get a certain x, send the binary sequence corresponding to it as shown below.
x = 2: 11010, 5 bits, with Pr(x = 2) = 1/36
x = 3: 11011, 5 bits, with Pr(x = 3) = 2/36
x = 4: 1100, 4 bits, with Pr(x = 4) = 3/36
x = 5: 000, 3 bits, with Pr(x = 5) = 4/36
x = 6: 001, 3 bits, with Pr(x = 6) = 5/36
x = 7: 10, 2 bits, with Pr(x = 7) = 6/36
x = 8: 010, 3 bits, with Pr(x = 8) = 5/36
x = 9: 011, 3 bits, with Pr(x = 9) = 4/36
x = 10: 1110, 4 bits, with Pr(x = 10) = 3/36
x = 11: 11110, 5 bits, with Pr(x = 11) = 2/36
x = 12: 11111, 5 bits, with Pr(x = 12) = 1/36

Average bits needed per message = 3.33 bits. Can you think of any other scheme?
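A quick tally of the code lengths above, weighted by outcome probability, confirms the claimed average (a minimal sketch in Python):

```python
# Codeword lengths of the scheme above, and outcome counts out of 36.
lengths = {2: 5, 3: 5, 4: 4, 5: 3, 6: 3, 7: 2, 8: 3, 9: 3, 10: 4, 11: 5, 12: 5}
counts  = {2: 1, 3: 2, 4: 3, 5: 4, 6: 5, 7: 6, 8: 5, 9: 4, 10: 3, 11: 2, 12: 1}
avg_len = sum(counts[x] * lengths[x] for x in lengths) / 36
print(avg_len)  # 3.33... bits per message, close to H ~= 3.27
```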

Page 20

Channel Capacity

In 1948, Claude Shannon developed a mathematical model for a channel such that, insofar as the model is realistic, there exists an upper limit on the rate at which information can be transmitted from source to user error-free. This upper limit is called the channel capacity, C. It is a function of bandwidth (B) and signal-to-noise ratio (S/N or SNR):

$C = B \log_2 \left( 1 + \frac{S}{N} \right)$ bits/second
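A small worked example; the 3.1 kHz bandwidth and 30 dB SNR are illustrative telephone-grade numbers, not from the slides:

```python
from math import log2

def channel_capacity(B_hz: float, snr_linear: float) -> float:
    """Shannon capacity C = B * log2(1 + S/N) in bits/second."""
    return B_hz * log2(1 + snr_linear)

snr = 10 ** (30 / 10)               # 30 dB SNR converted to a linear ratio
print(channel_capacity(3100, snr))  # ~30,900 bits/second
```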

Page 21

Coding

Coding theory is the study of preprocessing the source information so that the channel capacity can be achieved (or approached as closely as possible).
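Huffman coding is one classical example of such preprocessing, though the slide does not name it. A minimal sketch that computes Huffman codeword lengths for the two-dice source of the earlier example:

```python
import heapq

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given probabilities."""
    # Heap entries: (probability, unique tiebreaker, symbol indices in subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tie = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for sym in s1 + s2:          # each merge adds one bit to every symbol below it
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, tie, s1 + s2))
        tie += 1
    return lengths

# Two-dice example: probabilities of the sums 2..12.
probs = [n / 36 for n in [1, 2, 3, 4, 5, 6, 5, 4, 3, 2, 1]]
avg = sum(p * L for p, L in zip(probs, huffman_lengths(probs)))
print(avg)  # ~3.31 bits/message, within a fraction of a bit of H ~= 3.27
```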
