
Basic Principles (continuation)


A Quantitative Measure of Information

• As we have already realized, when a statistical experiment has n equiprobable outcomes, the average amount of information associated with an outcome is $\log n$.


A Quantitative Measure of Information

• Let us consider a source with a finite number of messages $x_1, x_2, \ldots, x_k$ and their corresponding transmission probabilities $P\{x_1\}, P\{x_2\}, \ldots, P\{x_k\}$.

• The source selects each of these messages at random. Successive selections are assumed to be statistically independent.

• $P\{x_k\}$ is the probability associated with the selection of message $x_k$.


A Quantitative Measure of Information

• The amount of information associated with the transmission of message $x_k$ is defined as
$$I_k = -\log P\{x_k\}$$

• $I_k$ is called the amount of self-information of the message $x_k$.

• The average information per message for the source is the statistical average of $I_k$:
$$I = \sum_k I_k P\{x_k\} = -\sum_k P\{x_k\} \log P\{x_k\}$$
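These two definitions can be sketched in a few lines of Python; the function names here are my own, chosen for illustration:

```python
import math

def self_information(p, base=2):
    """Self-information I_k = -log P{x_k} of a message selected with probability p."""
    return -math.log(p, base)

def average_information(probs, base=2):
    """Average information per message: the statistical average of I_k,
    i.e. -sum over k of P{x_k} log P{x_k}."""
    return sum(p * self_information(p, base) for p in probs if p > 0)

# Four equiprobable messages: each selection carries log2(4) = 2 bits,
# and the average per message is also 2 bits.
print(self_information(1 / 4))           # 2.0
print(average_information([1 / 4] * 4))  # 2.0
```

With base 2 the result is in bits; using the natural logarithm instead would give the answer in nats.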


A Quantitative Measure of Information

• If a source transmits two symbols 0 and 1 with equal probability, then the average amount of information per symbol is
$$I = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = 1 \text{ bit}$$

• If the two symbols are transmitted with probabilities $\alpha$ and $1-\alpha$, then the average amount of information per symbol is
$$I = -\alpha \log \alpha - (1-\alpha)\log(1-\alpha)$$
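The binary case is easy to check numerically; a minimal sketch (the helper name `binary_entropy` is my own):

```python
import math

def binary_entropy(alpha):
    """Average information per symbol (in bits) of a binary source that
    emits 1 with probability alpha and 0 with probability 1 - alpha."""
    if alpha in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -alpha * math.log2(alpha) - (1 - alpha) * math.log2(1 - alpha)

print(binary_entropy(0.5))   # 1.0 -- equiprobable symbols give 1 bit per symbol
print(binary_entropy(0.9))   # about 0.469 -- a biased source is more predictable
```

The maximum, 1 bit per symbol, occurs exactly at $\alpha = 1/2$; any bias lowers the average information.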


ENTROPY

• The average information per message I is also referred to as the entropy (or the communication entropy) of the source. It is usually denoted by the letter H.

• The entropy of the simple source just considered is
$$H(p_1, p_2, \ldots, p_n) = -p_1 \log p_1 - p_2 \log p_2 - \cdots - p_n \log p_n$$

• $(p_1, p_2, \ldots, p_n)$ refers to a discrete complete probability scheme.


Basic Concepts of Discrete Probability

Elements of the Theory of Sets



Background

• Up to the 1930s, a common approach to probability theory was to set up an experiment or a game to test some intuitive notions.

• This approach was prone to contradictions because it was not objective, being based on subjective views.


Background

• Suppose two persons, A and B, play a game of tossing a coin. The coin is thrown twice. If a head appears in at least one of the two throws, A wins; otherwise B wins.

• Solution?



Background

• The simplest intuitive approach leads to 4 possible outcomes: (HH), (HT), (TH), (TT). It follows that A's chances of winning are 3/4, since a head occurs in 3 out of 4 cases.

• However, a different line of reasoning can also be applied. If the outcome of the first throw is H, A wins, and there is no need to continue. Then only 3 possibilities need be considered: (H), (TH), (TT), and therefore the probability that A wins is 2/3.
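A quick enumeration in Python makes the discrepancy concrete: the four two-throw outcomes are equiprobable, while (H), (TH), (TT) are not. This snippet is only an illustration of the argument above:

```python
from itertools import product

# The four equiprobable outcomes of two throws of a fair coin.
outcomes = list(product("HT", repeat=2))      # HH, HT, TH, TT
a_wins = [o for o in outcomes if "H" in o]    # A wins if any head appears
print(len(a_wins), "out of", len(outcomes))   # 3 out of 4

# The "(H), (TH), (TT)" reasoning fails because those outcomes are not
# equiprobable: P{(H)} = 1/2, while P{(TH)} = P{(TT)} = 1/4, so the
# probability that A wins is 1/2 + 1/4 = 3/4, not 2/3.
```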


Background

• This example shows that a good theory must be based on an axiomatic approach, which must be free of contradictions.

• The axiomatic approach to probability theory was developed in the 1930s-1940s. The initial formulation is due to A. Kolmogorov.

• To introduce the fundamental definitions of the theory of probability, the basic elements of the theory of sets must first be introduced.


Sets

• A set, in mathematics, is any collection of objects of any nature specified according to a well-defined rule.

• Each object in a set is called an element (a member, a point). If x is an element of the set X (x belongs to X), this is expressed by $x \in X$.

• $x \notin X$ means that x does not belong to X.


Sets

• Sets can be finite (the set of students in the class), infinite (the set of real numbers), or empty (null, a set with no elements).

• A set can be specified either by giving all its elements in braces (for a small finite set) or by stating the requirements for the elements belonging to the set.

• X = {a, b, c, d}

• X = {x | x is a student taking the "Information theory" class}


Sets

• $X = \mathbb{Z}$ is the set of integer numbers
• $X = \mathbb{Q}$ is the set of rational numbers
• $X = \mathbb{R}$ is the set of real numbers
• $X = \mathbb{C}$ is the set of complex numbers
• $X = \emptyset$ is an empty set
• $X = \{\emptyset\}$ is a set whose single element is an empty set


Sets

• What about the set of the roots of the equation $2x^2 + 1 = 0$?

• The set of the real roots is empty: $X = \emptyset$.

• The set of the complex roots is $\{i/\sqrt{2},\ -i/\sqrt{2}\}$, where $i$ is the imaginary unit.


Subsets

• When every element of a set A is at the same time an element of a set B, then A is a subset of B (A is contained in B): $A \subseteq B$ ($B \supseteq A$).

• For example, $\mathbb{Z} \subset \mathbb{Q}$, $\mathbb{Z} \subset \mathbb{R}$, $\mathbb{Q} \subset \mathbb{R}$, $\mathbb{R} \subset \mathbb{C}$.


Subsets

• The sets A and B are said to be equal if they consist of exactly the same elements.

• That is, $A = B \iff A \subseteq B \text{ and } B \subseteq A$.

• For instance, let the set A consist of the roots of the equation $x(x+1)(x^2-4)(x-3) = 0$, let $B = \{-2, -1, 0, 2, 3\}$, and let $C = \{x \mid x \in \mathbb{Z},\ |x| \le 4\}$.

• What about the relationships among A, B, C?


Subsets

• $A \subset C$ and $B \subset C$

• $A \subseteq B$ and $B \subseteq A$, hence $A = B$.


Universal Set

• A large set that includes the smaller sets useful in dealing with a specific problem is called the universal set (universe). It is usually denoted by U.

• For instance, in the previous example, the set of integer numbers $\mathbb{Z}$ can naturally be considered as the universal set.


Operations on Sets: Union

• Let U be a universal set of arbitrary elements that contains all possible elements under consideration. The universal set may contain a number of subsets A, B, C, D, which individually are well-defined.

• The union (sum) of two sets A and B is the set of all those elements that belong to A or B or both: $A \cup B$.


Operations on Sets: Union

• $A = \{a, b, c, d\};\ B = \{e, f\};\ A \cup B = \{a, b, c, d, e, f\}$

• $A = \{a, b, c, d\};\ B = \{c, d, e, f\};\ A \cup B = \{a, b, c, d, e, f\}$

• $A = \{a, b, c, d\};\ B = \{c, d\};\ A \cup B = \{a, b, c, d\} = A$

• Important property: $B \subseteq A \Rightarrow A \cup B = A$
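Python's built-in `set` type mirrors these operations directly; a small sketch reproducing the examples above:

```python
A = {"a", "b", "c", "d"}

print(A | {"e", "f"})            # union with a disjoint set
print(A | {"c", "d", "e", "f"})  # shared elements appear only once
print(A | {"c", "d"})            # equals A itself

# Important property: if B is a subset of A, then A | B == A.
B = {"c", "d"}
assert B <= A and A | B == A
```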


Operations on Sets: Intersection

• The intersection (product) of two sets A and B is the set of all those elements that belong to both A and B (that are common to these sets): $A \cap B$.

• When $A \cap B = \emptyset$, the sets A and B are said to be mutually exclusive.


Operations on Sets: Intersection

• $A = \{a, b, c, d\};\ B = \{e, f\};\ A \cap B = \emptyset$

• $A = \{a, b, c, d\};\ B = \{c, d, f\};\ A \cap B = \{c, d\}$

• $A = \{a, b, c, d\};\ B = \{c, d\};\ A \cap B = \{c, d\} = B$

• Important property: $B \subseteq A \Rightarrow A \cap B = B$


Operations on Sets: Difference

• The difference of two sets B and A (the difference of the set A relative to the set B) is the set of all those elements that belong to the set B but do not belong to the set A: $B - A$ or $B / A$.


Operations on Sets: Complement

• The complement (negation) of any set A is the set A′ (also written $\bar{A}$) containing all elements of the universe that are not elements of A.
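Intersection, difference, and complement can be sketched the same way; the universe U here is an arbitrary choice for illustration:

```python
U = {"a", "b", "c", "d", "e", "f"}   # an illustrative universal set
A = {"a", "b", "c", "d"}
B = {"c", "d", "f"}

print(A & B)   # intersection: {'c', 'd'}
print(B - A)   # difference B - A: {'f'}
print(U - A)   # complement of A within U: {'e', 'f'}

# Mutually exclusive sets have an empty intersection.
assert A & {"e", "f"} == set()
# If B is a subset of A, then A & B == B.
assert A & {"c", "d"} == {"c", "d"}
```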


Algebra of Sets

• Let A, B, and C be subsets of a universal set U. Then the following laws hold.

• Commutative Laws: $A \cup B = B \cup A;\quad A \cap B = B \cap A$

• Associative Laws: $(A \cup B) \cup C = A \cup (B \cup C);\quad (A \cap B) \cap C = A \cap (B \cap C)$

• Distributive Laws: $A \cup (B \cap C) = (A \cup B) \cap (A \cup C);\quad A \cap (B \cup C) = (A \cap B) \cup (A \cap C)$


Algebra of Sets

• Complementary: $A \cup \bar{A} = U;\quad A \cap \bar{A} = \emptyset;\quad A \cup U = U;\quad A \cap U = A;\quad A \cup \emptyset = A;\quad A \cap \emptyset = \emptyset$

• Difference Laws: $A - B = A \cap \bar{B};\quad (A - B) \cup (A \cap B) = A;\quad A - B = A - (A \cap B)$


Algebra of Sets

• De Morgan's Law (Dualization): $\overline{A \cup B} = \bar{A} \cap \bar{B};\quad \overline{A \cap B} = \bar{A} \cup \bar{B}$

• Involution Law: $\bar{\bar{A}} = A$

• Idempotent Law: For any set A: $A \cup A = A;\quad A \cap A = A$
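The laws on the last few slides are easy to verify mechanically; the sketch below checks De Morgan's, involution, and idempotent laws for every pair of subsets of a small illustrative universe:

```python
from itertools import combinations

U = set(range(5))                # a small illustrative universe
complement = lambda S: U - S     # complement relative to U

def all_subsets(s):
    s = sorted(s)
    return [set(c) for r in range(len(s) + 1) for c in combinations(s, r)]

# Exhaustively check the laws on all 32 x 32 pairs of subsets of U.
for A in all_subsets(U):
    for B in all_subsets(U):
        assert complement(A | B) == complement(A) & complement(B)  # De Morgan
        assert complement(A & B) == complement(A) | complement(B)  # dual form
    assert complement(complement(A)) == A                          # involution
    assert A | A == A and A & A == A                               # idempotent
print("all laws hold on every subset pair")
```

An exhaustive check over a small finite universe is not a proof for arbitrary sets, but it is a convenient sanity check when first meeting these identities.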