Signal Coding Estimation Theory 2011


    [3963] 286

    T.E. (Electronics and Telecommunication) (Semester II) Examination, 2011

    SIGNAL CODING AND ESTIMATION THEORY (New)

    (2008 Pattern)

    Time : 3 Hours Max. Marks : 100

    Instructions : i) Answer three questions from Section I and three questions

    from Section II.

    ii) Answers to the two Sections should be written in separate

    answer books.

    iii) Neat diagrams must be drawn wherever necessary.

    iv) Assume suitable data if necessary.

    v) Use of Electronic Pocket Calculator is allowed.

    vi) Figures to the right indicate full marks.

    SECTION I

    1. a) Describe LZW (Lempel-Ziv-Welch) algorithm to encode byte streams. 4

    b) A zero memory source emits six messages (m1, m2, m3, m4, m5, m6) with

    probabilities (0.30, 0.25, 0.15, 0.12, 0.10, 0.08) respectively. Find :

    i) Huffman code. 3

    ii) Determine its average word length. 3

    iii) Find entropy of the source. 3

    iv) Determine its efficiency and redundancy. 3
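    A minimal Python sketch of part 1 b), assuming the standard greedy Huffman construction (repeatedly merging the two least probable subtrees); the bit-assignment convention and variable names are illustrative, not part of the paper.

    import heapq
    from math import log2

    # Source symbols and probabilities from question 1 b)
    probs = {"m1": 0.30, "m2": 0.25, "m3": 0.15, "m4": 0.12, "m5": 0.10, "m6": 0.08}

    # Build the Huffman tree: repeatedly merge the two least probable nodes.
    heap = [(p, [sym]) for sym, p in probs.items()]
    codes = {sym: "" for sym in probs}
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, syms1 = heapq.heappop(heap)
        p2, syms2 = heapq.heappop(heap)
        for s in syms1:                 # symbols in the lighter subtree get a '0' prefix
            codes[s] = "0" + codes[s]
        for s in syms2:                 # symbols in the heavier subtree get a '1' prefix
            codes[s] = "1" + codes[s]
        heapq.heappush(heap, (p1 + p2, syms1 + syms2))

    L = sum(probs[s] * len(codes[s]) for s in probs)      # average codeword length (bits/symbol)
    H = -sum(p * log2(p) for p in probs.values())         # source entropy (bits/symbol)
    print(codes)
    print(f"L = {L:.2f}, H = {H:.3f}, efficiency = {H / L:.3f}, redundancy = {1 - H / L:.3f}")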

    OR


    2. a) Explain the Mutual Information. 4

    b) A zero memory source emits six messages (N, I, R, K, A, T) with probabilities

    (0.30, 0.10, 0.02, 0.15, 0.40, 0.03) respectively. Given that A is coded as

    0. Find :

    i) Entropy of source. 4

    ii) Determine the Shannon-Fano code and tabulate it. 4

    iii) What is the original symbol sequence of the Shannon-Fano coded signal

    (110011110111111110100). 4
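    A Python sketch of the Shannon-Fano construction in 2 b), assuming the usual rule of splitting the probability-sorted list where the two halves have sums as close to equal as possible; with this rule the most probable symbol A does come out as the single bit 0, matching the hint in the question. Part iii) can then be checked by walking the received bit stream through this prefix-free table.

    from math import log2

    # Symbols and probabilities from question 2 b)
    probs = {"N": 0.30, "I": 0.10, "R": 0.02, "K": 0.15, "A": 0.40, "T": 0.03}

    H = -sum(p * log2(p) for p in probs.values())   # entropy of the source
    print(f"H = {H:.3f} bits/symbol")

    def shannon_fano(items, prefix=""):
        """Recursively split the probability-sorted list where the two halves
        have probability sums as close to equal as possible."""
        if len(items) == 1:
            return {items[0][0]: prefix or "0"}
        total = sum(p for _, p in items)
        acc, split, best = 0.0, 1, float("inf")
        for i in range(1, len(items)):
            acc += items[i - 1][1]
            if abs(total - 2 * acc) < best:
                best, split = abs(total - 2 * acc), i
        codes = shannon_fano(items[:split], prefix + "0")
        codes.update(shannon_fano(items[split:], prefix + "1"))
        return codes

    codes = shannon_fano(sorted(probs.items(), key=lambda kv: -kv[1]))
    print(codes)   # with this split rule, A (p = 0.40) receives the single-bit code '0'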

    3. a) Explain, with the help of a neat diagram, the JPEG algorithm. 6

    b) For a Gaussian Channel

    C = B log2(1 + (Eb/N0)(C/B))

    i) Find Shannon limit.

    ii) Draw the bandwidth efficiency diagram with (Eb/N0) dB on horizontal axis

    and (Rb/B) on vertical axis. Mark different regions and Shannon limit on

    the graph. 10
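    A worked sketch of 3 b) i): writing the spectral efficiency as C/B and letting it tend to zero in the stated relation gives the Shannon limit.

    \[
      \frac{C}{B}=\log_2\!\Bigl(1+\frac{E_b}{N_0}\cdot\frac{C}{B}\Bigr)
      \quad\Longrightarrow\quad
      \frac{E_b}{N_0}=\frac{2^{C/B}-1}{C/B},
    \]
    \[
      \lim_{C/B\to 0}\frac{E_b}{N_0}=\ln 2\approx 0.693
      \;\Longrightarrow\;
      \Bigl(\frac{E_b}{N_0}\Bigr)_{\min}\approx -1.59\ \text{dB (Shannon limit)}.
    \]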

    OR

    4. a) A voice grade channel of the telephone network has a bandwidth of 3.4 kHz. Calculate the information capacity of the telephone channel for an SNR of

    30 dB. 6

    b) Find a generator polynomial g(x) for a systematic (7, 4) cyclic code and find

    the code vectors for the following data vectors : 1010, 1111, 0001 and 1000.

    Given that x^7 + 1 = (x + 1)(x^3 + x + 1)(x^3 + x^2 + 1). 10
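    A Python sketch for question 4, assuming g(x) = x^3 + x + 1 is the degree-3 factor chosen from the given factorization of x^7 + 1 (x^3 + x^2 + 1 would serve equally well): part a) is the Shannon-Hartley formula, and part b) forms systematic codewords by appending the remainder of x^3·d(x) divided by g(x).

    from math import log2

    # Part a): Shannon capacity of a 3.4 kHz channel at 30 dB SNR
    B, snr_db = 3400.0, 30.0
    C = B * log2(1 + 10 ** (snr_db / 10))
    print(f"C = {C / 1e3:.1f} kbit/s")          # roughly 33.9 kbit/s

    # Part b): systematic (7,4) cyclic code, generator g(x) = x^3 + x + 1 -> bits 1011
    G = 0b1011                                   # coefficients of g(x), MSB = x^3

    def parity(data_bits):
        """Remainder of x^3 * d(x) divided by g(x), computed by shift/XOR division over GF(2)."""
        reg = int(data_bits, 2) << 3             # append three zero parity positions
        for shift in range(len(data_bits) - 1, -1, -1):
            if reg & (1 << (shift + 3)):         # reduce whenever the leading bit is set
                reg ^= G << shift
        return format(reg, "03b")

    for d in ["1010", "1111", "0001", "1000"]:
        print(d, "->", d + parity(d))            # codeword = data bits followed by parity bits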


    5. A systematic rate-1/2 convolutional code has n = 2, k = 1 and constraint length K = 2;

    the parity bit is generated by the mod-2 sum of the shift-register outputs, with parity

    generator polynomial P(X) = X + 1, that is g = (1 1).

    1) Draw the figure of convolutional encoder and decoder. 4

    2) Find out the output for message string {10110...}. 4

    3) Draw state diagram. 4

    4) Explain Viterbi algorithm for decoding. 6
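    A minimal Python sketch for parts 2) and 3) of question 5, assuming the parity generator 1 + D described above, i.e. p_n = x_n XOR x_(n-1) with the register initially cleared; the variable names are illustrative.

    def encode_systematic(bits):
        """Rate-1/2 systematic convolutional encoder, K = 2: for each input x_n it
        emits the pair (x_n, p_n) with parity p_n = x_n XOR x_{n-1}; register starts at 0."""
        prev, out = 0, []
        for x in bits:
            out += [x, x ^ prev]   # systematic bit first, then the parity bit
            prev = x               # the single memory element stores the previous message bit
        return out

    print(encode_systematic([1, 0, 1, 1, 0]))   # interleaved (message, parity) output for {10110...}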

    OR

    6. A convolutional encoder has rate 1/2 and constraint length K = 3; it uses two paths to

    generate the multiplexed output. It consists of two mod-2 adders and two shift-register stages.

    Path 1 has g1(D) = 1 + D^2 and path 2 has g2(D) = 1 + D + D^2.

    1) Draw encoder diagram. 6

    2) Draw the State diagram. 6

    3) Find out the output for message input of (1 0 0 1 1). 6
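    A Python sketch for parts 2) and 3) of question 6, with the two taps taken exactly as stated (path 1: 1 + D^2, path 2: 1 + D + D^2); the loop at the end prints the state-transition table from which the state diagram can be drawn.

    def encode(bits):
        """Rate-1/2 convolutional encoder, K = 3, g1(D) = 1 + D^2, g2(D) = 1 + D + D^2.
        State = (previous bit s1, bit before that s2); emits v1 then v2 per input bit."""
        s1 = s2 = 0                    # two shift-register stages, initially zero
        out = []
        for x in bits:
            v1 = x ^ s2                # path 1: 1 + D^2
            v2 = x ^ s1 ^ s2           # path 2: 1 + D + D^2
            out += [v1, v2]
            s1, s2 = x, s1             # shift the register
        return out

    print(encode([1, 0, 0, 1, 1]))     # multiplexed output for the message (1 0 0 1 1)

    # State-transition table (state = s1 s2), useful for drawing the state diagram
    for s1 in (0, 1):
        for s2 in (0, 1):
            for x in (0, 1):
                v1, v2 = x ^ s2, x ^ s1 ^ s2
                print(f"state {s1}{s2}, input {x} -> output {v1}{v2}, next state {x}{s1}")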

    SECTION II

    7. Consider the decoding of the (15, 5) error correcting BCH code with generator

    polynomial g(x) having α, α^2, α^3, α^4, α^5, α^6 as roots. The roots α, α^2, α^4 have

    the same minimum polynomial

    φ1(X) = φ2(X) = φ4(X) = 1 + X + X^4.

    The roots α^3 and α^6 have the same minimum polynomial

    φ3(X) = φ6(X) = 1 + X + X^2 + X^3 + X^4.


    The minimum polynomial of α^5 is

    φ5(X) = 1 + X + X^2.

    i) Find g(x) as LCM{φ1(X), φ3(X), φ5(X)}. 6

    ii) Let the received word be (0 0 0 1 0 1 0 0 0 0 0 0 1 0 0), that is r(x) = x^3 + x^5 + x^12;

    find the syndrome components, given that 1 + α^6 + α^9 = α^10 and 1 + α^12 + α^18 = α^5. 6

    iii) Through the iterative procedure the error location polynomial is found to be

    σ(x) = σ^(6)(x) = 1 + x + α^5 x^3, having roots α^3, α^10, α^12. What are the error

    location numbers and the error pattern e(x)? 6
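    A Python sketch for part ii) of question 7: it builds GF(2^4) from the primitive polynomial x^4 + x + 1 (so that α^4 = α + 1, consistent with φ1(X) = 1 + X + X^4) and evaluates r(x) at α, α^3, α^5; the even-indexed syndromes of a binary BCH code then follow from S_2i = S_i^2.

    # GF(2^4) built from the primitive polynomial x^4 + x + 1 (alpha is a root).
    # exp_t[i] = alpha^i as a 4-bit integer; log_t is the inverse map.
    exp_t, log_t, val = [], {}, 1
    for i in range(15):
        exp_t.append(val)
        log_t[val] = i
        val <<= 1
        if val & 0x10:          # reduce by alpha^4 = alpha + 1
            val ^= 0x13

    def r_of(alpha_pow):
        """Evaluate r(x) = x^3 + x^5 + x^12 at x = alpha^alpha_pow."""
        acc = 0
        for e in (3, 5, 12):
            acc ^= exp_t[(e * alpha_pow) % 15]
        return acc

    # Syndrome components S1, S3, S5 needed for the (15,5), t = 3 BCH decoder
    for k in (1, 3, 5):
        s = r_of(k)
        print(f"S{k} = alpha^{log_t[s]}" if s else f"S{k} = 0")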

    OR

    8. a) Write RSA algorithm for generating public key and private key for encryption

    and decryption of plain text. 6

    b) Plain text was encrypted using the RSA public key Kp = (33, 3). The English alphabet

    letters (A, B, ..., Z) are numbered (1, 2, ..., 26) respectively. The encrypted

    ciphertext (C) transmitted is (28, 21, 20, 1, 5, 5, 26). The received values are

    decrypted using the private key Ks = (33, 7). Find the symbols, i.e. the letters, after

    decryption. The following algorithm is given to avoid the exponentiation operation. 12

    begin
        C := 1;
        for i := 1 to E do
            C := MOD(C * P, N);
    end;
    where E is the exponent.
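    A Python sketch of part b), using the repeated-multiplication loop given in the question (Python's built-in pow(c, d, n) would do the same job with square-and-multiply); it maps each received block back through M = C^7 mod 33 and then to a letter.

    def mod_exp(p, e, n):
        """Repeated modular multiplication, as in the algorithm given in the question:
        C := 1; for i := 1 to E do C := MOD(C * P, N)."""
        c = 1
        for _ in range(e):
            c = (c * p) % n
        return c

    n, d = 33, 7                                  # private key Ks = (33, 7)
    cipher = [28, 21, 20, 1, 5, 5, 26]
    plain = [mod_exp(c, d, n) for c in cipher]    # decrypt each block: M = C^7 mod 33
    print(plain)
    print("".join(chr(ord("A") + m - 1) for m in plain))   # map 1..26 back to A..Z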

    9. a) Let Y1, Y2, ..., YK be the observed random variables, such that

    Yk = a + bxk + Zk, k = 1, 2, ..., K


    The constants xk, k = 1, 2, ..., K, are known, while the constants a and b are

    not known. The random variables Zk, k = 1, 2, ..., K, are statistically independent,

    each with zero mean and known variance σ^2. Obtain the ML estimate of (a, b). 10

    Given that the likelihood function is

    L(a, b) = ∏_{k=1}^{K} [1/√(2πσ^2)] exp{ −[yk − (a + bxk)]^2 / (2σ^2) }

    b) Let Y1 and Y2 be two statistically independent Gaussian random variables,

    such that E[Y1] = m, E[Y2] = 3m, and var[Y1] = var[Y2] = 1; m is unknown.

    Obtain the ML estimate of m. 6
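    A sketch of the maximizations in question 9: for a), maximizing ln L(a, b) is the same as minimizing the sum of squared residuals, which gives the usual least-squares normal-equation solution; for b), the same differentiation gives a weighted combination of the two observations.

    % a) maximizing ln L(a,b) is equivalent to minimizing sum_k [y_k - (a + b x_k)]^2
    \[
      \frac{\partial}{\partial a}\sum_{k=1}^{K}\bigl[y_k-(a+bx_k)\bigr]^2=0,\qquad
      \frac{\partial}{\partial b}\sum_{k=1}^{K}\bigl[y_k-(a+bx_k)\bigr]^2=0
    \]
    \[
      \Longrightarrow\quad
      \hat b_{\mathrm{ML}}
      =\frac{\sum_k (x_k-\bar x)(y_k-\bar y)}{\sum_k (x_k-\bar x)^2},\qquad
      \hat a_{\mathrm{ML}}=\bar y-\hat b_{\mathrm{ML}}\,\bar x .
    \]
    % b) Y1 ~ N(m,1), Y2 ~ N(3m,1): maximize the log-likelihood over m
    \[
      \frac{d}{dm}\Bigl[(y_1-m)^2+(y_2-3m)^2\Bigr]=0
      \;\Longrightarrow\;
      \hat m_{\mathrm{ML}}=\frac{y_1+3\,y_2}{10}.
    \]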

    OR

    10. a) Consider the problem where the observation is given by Y = ln X + N, where

    X is the parameter to be estimated. X is uniformly distributed over the interval

    [0, 1] and N has an exponential distribution given by

    fN(n) = e^(−n),   n ≥ 0

          = 0,        otherwise

    Obtain the mean-square estimate, x^ms. 10

    b) Let Y1, Y2, ..., YK be K independent random variables with P(Yk = 1) = p and

    P(Yk = 0) = 1 − p, where p, 0 < p < 1, is unknown.

    Determine the lower bound on the variance of the estimator, assuming that the

    estimator is unbiased. Given that : 6

    L(p) = f(y1, y2, ..., yK | p) = ∏_{k=1}^{K} f(yk | p)

         = ∏_{k=1}^{K} p^(yk) (1 − p)^(1 − yk),   yk = 0, 1 and k = 1, 2, ..., K (0 otherwise)

         = p^(Kȳ) (1 − p)^(K − Kȳ), since the Yk's are i.i.d. (ȳ denotes the sample mean of the yk).
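    A sketch of the bound in 10 b): taking the log of the stated likelihood, differentiating twice, and using E[ȳ] = p gives the Fisher information and hence the Cramer-Rao lower bound.

    \[
      \ln L(p)=K\bar y\,\ln p+(K-K\bar y)\ln(1-p),\qquad
      \frac{\partial^{2}\ln L}{\partial p^{2}}
      =-\frac{K\bar y}{p^{2}}-\frac{K-K\bar y}{(1-p)^{2}} .
    \]
    \[
      I(p)=-E\!\left[\frac{\partial^{2}\ln L}{\partial p^{2}}\right]
      =\frac{K}{p(1-p)}
      \quad\Longrightarrow\quad
      \operatorname{var}(\hat p)\;\ge\;\frac{p(1-p)}{K}.
    \]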


    11. a) A rectangular pulse of known amplitude A is transmitted starting at time instant

    t0 with probability 1/2. The duration T of the pulse is a random variable

    uniformly distributed over the interval [T1, T2]. The additive noise to the

    pulse is white Gaussian with mean zero and power spectral density N0/2. Determine the likelihood

    ratio. 10

    b) In a binary detection problem, the transmitted signal under hypothesis H1 is

    either s1(t) or s2(t), with respective probabilities P1 and P2. Assume P1 = P2 = 1/2,

    and s1(t) and s2(t) orthogonal over the observation interval [0, T]. No signal

    is transmitted under hypothesis H0. The additive noise is white Gaussian with

    mean zero and power spectral density N0/2. Obtain the optimum decision

    rule, assuming minimum probability of error criterion and P(H0) = P(H1) = 1/2. 6
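    A sketch for 11 a), under the assumption that, conditioned on the pulse duration T, this is detection of a known rectangular pulse of amplitude A in white Gaussian noise; the unconditional likelihood ratio is then the conditional one averaged over the uniform prior of T.

    \[
      \Lambda\bigl[x(t)\bigr]
      =\frac{1}{T_2-T_1}\int_{T_1}^{T_2}
      \exp\!\left\{\frac{2A}{N_0}\int_{t_0}^{t_0+T}x(t)\,dt
      -\frac{A^{2}T}{N_0}\right\}dT .
    \]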

    OR

    12. In a simple binary communication system, during every T-second signaling interval, one of two

    possible signals s0(t) and s1(t) is transmitted. Our two hypotheses are

    H0: s0(t) was transmitted

    H1 : s1(t) was transmitted

    We assume that s0(t) = 0 and s1(t) = 1, 0 < t < T.

    The communication channel adds noise n(t), which is a zero-mean normal random

    process with variance 1. Let x(t) represent the received signal :

    x(t) = si(t) + n(t) i = 0, 1


    We observe the received signal x(t) at some instant during each signaling interval. Suppose that we received an observation X = 0.6.

    a) Using the maximum likelihood test, determine which signal is transmitted. The

    pdf of x under each hypothesis is given by 10

    f(x | H0) = (1/√(2π)) e^(−x^2/2)

    f(x | H1) = (1/√(2π)) e^(−(x − 1)^2/2)

    b) Derive the Neyman-Pearson test. 6
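    A sketch for question 12: with the two unit-variance Gaussian densities given above, the likelihood ratio reduces to a threshold test on x itself; the Neyman-Pearson test has the same form, with the threshold set by the allowed false-alarm probability α rather than by the priors.

    % a) ML test: compare the two conditional densities
    \[
      \frac{f(x\mid H_1)}{f(x\mid H_0)}
      =\exp\!\Bigl(x-\tfrac12\Bigr)\;\underset{H_0}{\overset{H_1}{\gtrless}}\;1
      \quad\Longleftrightarrow\quad
      x\;\underset{H_0}{\overset{H_1}{\gtrless}}\;\tfrac12 ,
    \]
    so the observation $X = 0.6 > 0.5$ is decided as $H_1$, i.e. $s_1(t)$ was transmitted.
    % b) Neyman-Pearson: the same ratio against a threshold fixed by the false-alarm rate
    \[
      x\;\underset{H_0}{\overset{H_1}{\gtrless}}\;\eta,\qquad
      P_F=\Pr\{X>\eta\mid H_0\}=Q(\eta)=\alpha
      \;\Longrightarrow\;\eta=Q^{-1}(\alpha).
    \]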
