04 ch ken black solution



Chapter 4 Probability

LEARNING OBJECTIVES

The main objective of Chapter 4 is to help you understand the basic principles of probability, specifically enabling you to

1. Comprehend the different ways of assigning probability.
2. Understand and apply marginal, union, joint, and conditional probabilities.
3. Select the appropriate law of probability to use in solving problems.
4. Solve problems using the laws of probability, including the law of addition, the law of multiplication, and the law of conditional probability.
5. Revise probabilities using Bayes' rule.

CHAPTER TEACHING STRATEGY

Students can be motivated to study probability by realizing that the field of probability has some stand-alone application in their lives in such applied areas as human resource analysis, actuarial science, and gaming. In addition, students should understand that much of the rest of the course is based on probabilities even though they will not be directly applying many of these formulas in other chapters.

This chapter is frustrating for the learner because probability problems can be approached by several different techniques. Whereas in many chapters of this text students will approach problems using one standard technique, in Chapter 4 different students will often use different approaches to the same problem. The text attempts to emphasize this point and underscores it by presenting several different ways to solve probability problems. The probability rules and laws presented in the chapter can virtually always be used in solving probability problems. However, it is sometimes easier to construct a probability matrix or a tree diagram, or to use the sample space, to solve the problem. If students are aware that what they have at their disposal is an array of tools and techniques, they will be less overwhelmed in approaching a probability problem.


An attempt has been made to differentiate the several types of probabilities so that students can sort out the various types of problems.

In teaching students how to construct a probability matrix, emphasize that it is usually best to place only one variable along each of the two dimensions of the matrix. (That is, place MasterCard with yes/no on one axis and Visa with yes/no on the other, instead of trying to place MasterCard and Visa along the same axis.)
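A minimal Python sketch of how such a matrix can be filled in from three given figures may help here; the card-ownership numbers below are hypothetical, not from the text:

    # Build a 2x2 probability matrix: one variable (MasterCard yes/no) on the rows,
    # the other (Visa yes/no) on the columns.  All figures are hypothetical.
    p_mc = 0.30          # P(MasterCard)
    p_visa = 0.25        # P(Visa)
    p_both = 0.12        # P(MasterCard and Visa)

    matrix = {
        ("MC yes", "Visa yes"): p_both,
        ("MC yes", "Visa no"):  p_mc - p_both,
        ("MC no",  "Visa yes"): p_visa - p_both,
        ("MC no",  "Visa no"):  1 - p_mc - p_visa + p_both,
    }

    for (row, col), p in matrix.items():
        print(f"{row:7s} {col:9s} {p:.2f}")
    # The four cells sum to 1.00, and the row/column totals reproduce the marginals.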

This particular chapter is very amenable to the use of visual aids. Students enjoy rolling dice, tossing coins, and drawing cards as a part of the class experience.

Of all the chapters in the book, it is most imperative that students work a lot of problems in this chapter. Probability problems are so varied and individualized that a significant portion of the learning comes in the doing. Experience is an important factor in working probability problems.

Section 4.8 on Bayes' theorem can be skipped in a one-semester course without losing any continuity. This section is a prerequisite to the Chapter 18 presentation of "revising probabilities in light of sample information" (Section 18.4).

CHAPTER OUTLINE

4.1 Introduction to Probability

4.2 Methods of Assigning Probabilities
    Classical Method of Assigning Probabilities
    Relative Frequency of Occurrence
    Subjective Probability

4.3 Structure of Probability
    Experiment
    Event
    Elementary Events
    Sample Space
    Unions and Intersections
    Mutually Exclusive Events
    Independent Events
    Collectively Exhaustive Events
    Complementary Events
    Counting the Possibilities
    The mn Counting Rule
    Sampling from a Population with Replacement
    Combinations: Sampling from a Population Without Replacement

4.4 Marginal, Union, Joint, and Conditional Probabilities


4.5 Addition Laws
    Probability Matrices
    Complement of a Union
    Special Law of Addition

4.6 Multiplication Laws
    General Law of Multiplication
    Special Law of Multiplication

4.7 Conditional Probability
    Independent Events

4.8 Revision of Probabilities: Bayes' Rule

KEY TERMS

A Priori
Bayes' Rule
Classical Method of Assigning Probabilities
Collectively Exhaustive Events
Combinations
Complement of a Union
Complementary Events
Conditional Probability
Elementary Events
Event
Experiment
Independent Events
Intersection
Joint Probability
Marginal Probability
mn Counting Rule
Mutually Exclusive Events
Probability Matrix
Relative Frequency of Occurrence
Sample Space
Set Notation
Subjective Probability
Union
Union Probability

SOLUTIONS TO PROBLEMS IN CHAPTER 4

4.1 Enumeration of the six parts: D1, D2, D3, A4, A5, A6

 D = Defective part
 A = Acceptable part

 Sample Space:
 D1 D2,   D2 D3,   D3 A5
 D1 D3,   D2 A4,   D3 A6
 D1 A4,   D2 A5,   A4 A5
 D1 A5,   D2 A6,   A4 A6
 D1 A6,   D3 A4,   A5 A6

 There are 15 members of the sample space.

 The probability of selecting exactly one defect out of two is:

 9/15 = .60

4.2 X = {1, 3, 5, 7, 8, 9}, Y = {2, 4, 7, 9}, and Z = {1, 2, 3, 4, 7}

 a) X ∪ Z = {1, 2, 3, 4, 5, 7, 8, 9}

 b) X ∩ Y = {7, 9}

 c) X ∩ Z = {1, 3, 7}

 d) X ∪ Y ∪ Z = {1, 2, 3, 4, 5, 7, 8, 9}

 e) X ∩ Y ∩ Z = {7}

 f) (X ∪ Y) ∩ Z = {1, 2, 3, 4, 5, 7, 8, 9} ∩ {1, 2, 3, 4, 7} = {1, 2, 3, 4, 7}

 g) (Y ∩ Z) ∪ (X ∩ Y) = {2, 4, 7} ∪ {7, 9} = {2, 4, 7, 9}

 h) X or Y = X ∪ Y = {1, 2, 3, 4, 5, 7, 8, 9}

 i) Y and Z = Y ∩ Z = {2, 4, 7}

4.3 If A = {2, 6, 12, 24} and the population is the positive even numbers through 30,

 A' = {4, 8, 10, 14, 16, 18, 20, 22, 26, 28, 30}

4.4 6(4)(3)(3) = 216

4.5 Enumeration of the six parts: D1, D2, A1, A2, A3, A4

 D = Defective part
 A = Acceptable part

 Sample Space:
 D1 D2 A1,   D1 D2 A2,   D1 D2 A3,   D1 D2 A4,
 D1 A1 A2,   D1 A1 A3,   D1 A1 A4,   D1 A2 A3,
 D1 A2 A4,   D1 A3 A4,   D2 A1 A2,   D2 A1 A3,
 D2 A1 A4,   D2 A2 A3,   D2 A2 A4,   D2 A3 A4,
 A1 A2 A3,   A1 A2 A4,   A1 A3 A4,   A2 A3 A4

 Combinations are used to count the sample space because sampling is done without replacement.


 6C3 = 6!/(3!·3!) = 20

 There are 20 members of the sample space and 12 of them have exactly 1 defective part.

 The probability that exactly one of the three parts is defective is:

 12/20 = 3/5 = .60
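For instructors who want a quick check, this count can be reproduced in Python with the standard library (a sketch, not part of the printed solution):

    from itertools import combinations
    from math import comb

    parts = ["D1", "D2", "A1", "A2", "A3", "A4"]   # two defective, four acceptable
    sample_space = list(combinations(parts, 3))     # all samples of 3 without replacement

    print(len(sample_space), comb(6, 3))            # 20 20

    # Count the samples that contain exactly one defective part
    one_defect = [s for s in sample_space
                  if sum(p.startswith("D") for p in s) == 1]
    print(len(one_defect), len(one_defect) / len(sample_space))   # 12 0.6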

4.6 10^7 = 10,000,000 different numbers

4.7 20C6 = 20!/(6!·14!) = 38,760

 It is assumed here that 6 different employees are to be selected (sampling without replacement).
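The three counting rules used in 4.4, 4.6, and 4.7 can be verified in one short Python sketch:

    from math import comb

    print(6 * 4 * 3 * 3)   # mn counting rule (4.4): 216
    print(10 ** 7)         # sampling with replacement (4.6): 10,000,000
    print(comb(20, 6))     # combinations, sampling without replacement (4.7): 38,760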

4.8 P(A) = .10, P(B) = .12, P(C) = .21

 P(A ∩ C) = .05   P(B ∩ C) = .03

 a) P(A ∪ C) = P(A) + P(C) - P(A ∩ C) = .10 + .21 - .05 = .26

 b) P(B ∪ C) = P(B) + P(C) - P(B ∩ C) = .12 + .21 - .03 = .30

 c) If A and B are mutually exclusive, P(A ∪ B) = P(A) + P(B) = .10 + .12 = .22

4.9

D E F

A 5 8 12 25

B 10 6 4 20

C 8 2 5 15

23 16 21 60

a) P(A ∪ D) = P(A) + P(D) - P(A ∩ D) = 25/60 + 23/60 - 5/60 = 43/60 = .7167

 b) P(E ∪ B) = P(E) + P(B) - P(E ∩ B) = 16/60 + 20/60 - 6/60 = 30/60 = .5000

 c) P(D ∪ E) = P(D) + P(E) = 23/60 + 16/60 = 39/60 = .6500

 d) P(C ∪ F) = P(C) + P(F) - P(C ∩ F) = 15/60 + 21/60 - 5/60 = 31/60 = .5167
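A small Python sketch showing how these union probabilities follow from the raw-count matrix (the totals are computed, not assumed):

    # Raw counts from the 4.9 matrix: rows A, B, C and columns D, E, F.
    counts = {
        "A": {"D": 5,  "E": 8, "F": 12},
        "B": {"D": 10, "E": 6, "F": 4},
        "C": {"D": 8,  "E": 2, "F": 5},
    }
    total = sum(sum(row.values()) for row in counts.values())          # 60

    def p_row(r):            # marginal probability of a row event
        return sum(counts[r].values()) / total

    def p_col(c):            # marginal probability of a column event
        return sum(counts[r][c] for r in counts) / total

    def p_union(r, c):       # general law of addition
        return p_row(r) + p_col(c) - counts[r][c] / total

    print(round(p_union("A", "D"), 4))   # 0.7167
    print(round(p_union("B", "E"), 4))   # 0.5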


4.10

E F

A .10 .03 .13

B .04 .12 .16

C .27 .06 .33

D .31 .07 .38

.72 .28 1.00

a) P(A ∪ F) = P(A) + P(F) - P(A ∩ F) = .13 + .28 - .03 = .38

 b) P(E ∪ B) = P(E) + P(B) - P(E ∩ B) = .72 + .16 - .04 = .84

 c) P(B ∪ C) = P(B) + P(C) = .16 + .33 = .49

 d) P(E ∪ F) = P(E) + P(F) = .72 + .28 = 1.000

4.11 A = event - flown in an airplane at least once
 T = event - ridden in a train at least once

 P(A) = .47   P(T) = .28

 P(ridden either a train or an airplane) = P(A ∪ T) = P(A) + P(T) - P(A ∩ T) = .47 + .28 - P(A ∩ T)

 This problem cannot be solved without knowing the probability of the intersection, P(A ∩ T), the proportion who have ridden both.

4.12 P(L) = .75   P(M) = .78   P(M ∩ L) = .61

 a) P(M ∪ L) = P(M) + P(L) - P(M ∩ L) = .78 + .75 - .61 = .92

 b) P(M or L but not both) = P(M ∪ L) - P(M ∩ L) = .92 - .61 = .31

 c) P(NM ∩ NL) = 1 - P(M ∪ L) = 1 - .92 = .08

4.13 Let C = have cable TV
 Let T = have 2 or more TV sets

 P(C) = .67,  P(T) = .74,  P(C ∩ T) = .55

 a) P(C ∪ T) = P(C) + P(T) - P(C ∩ T) = .67 + .74 - .55 = .86

 b) P(C or T but not both) = P(C ∪ T) - P(C ∩ T) = .86 - .55 = .31


c) P(NC ∩ NT) = 1 - P(C ∪ T) = 1 - .86 = .14

 d) The special law of addition does not apply because P(C ∩ T) is not .0000. Possession of cable TV and possession of 2 or more TV sets are not mutually exclusive.

4.14 Let T = review transcript
 F = consider faculty references

 P(T) = .54   P(F) = .44   P(T ∩ F) = .35

 a) P(F ∪ T) = P(F) + P(T) - P(F ∩ T) = .44 + .54 - .35 = .63

 b) P(F ∪ T) - P(F ∩ T) = .63 - .35 = .28

 c) 1 - P(F ∪ T) = 1 - .63 = .37

 d)

                 F
            Y        N
 T    Y    .35      .19      .54
      N    .09      .37      .46
           .44      .56     1.00

4.15

C D E F

A 5 11 16 8 40

B 2 3 5 7 17

7 14 21 15 57

a) P(A ∩ E) = 16/57 = .2807

 b) P(D ∩ B) = 3/57 = .0526

 c) P(D ∩ E) = .0000

 d) P(A ∩ B) = .0000

4.16

D E F

A .12 .13 .08 .33

B .18 .09 .04 .31

C .06 .24 .06 .36


.36 .46 .18 1.00

a) P(E ∩ B) = .09

 b) P(C ∩ F) = .06

 c) P(E ∩ D) = .00

4.17 Let D = Defective part

 a) (without replacement)

 P(D1 ∩ D2) = P(D1) ⋅ P(D2 | D1) = (6/50)(5/49) = 30/2450 = .0122

 b) (with replacement)

 P(D1 ∩ D2) = P(D1) ⋅ P(D2) = (6/50)(6/50) = 36/2500 = .0144
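Both cases can be checked numerically; a short Python sketch using exact fractions:

    from fractions import Fraction

    # Without replacement: the second draw is conditioned on the first being defective.
    without = Fraction(6, 50) * Fraction(5, 49)
    # With replacement: the two draws are independent.
    with_repl = Fraction(6, 50) * Fraction(6, 50)

    print(without, float(without))      # 3/245  0.01224...
    print(with_repl, float(with_repl))  # 9/625  0.0144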

4.18 Let U = Urban
 I = care for Ill relatives

 P(U) = .78   P(I) = .15   P(I | U) = .11

 a) P(U ∩ I) = P(U) ⋅ P(I | U) = (.78)(.11) = .0858

 b) P(U ∩ NI) = P(U) ⋅ P(NI | U)

 but P(I | U) = .11, so P(NI | U) = 1 - .11 = .89

 and P(U ∩ NI) = P(U) ⋅ P(NI | U) = (.78)(.89) = .6942

 c)

                  U
            Yes       No
 I   Yes                        .15
     No                         .85
            .78       .22      1.00

The answer to a) is found in the Yes-Yes cell. To compute this cell, take 11% (.11) of the total (.78) of people in urban areas: (.11)(.78) = .0858, which belongs in the Yes-Yes cell. The answer to b) is found in the cell that is Yes for U and No for I. It can be determined by taking the marginal, .78, minus the answer to a), .0858.

 d) P(NU ∩ I) is found in the cell that is No for U and Yes for I (first row, second column). Take the marginal, .15, minus the Yes-Yes cell, .0858, to get .0642.

4.19 Let S = stockholder
 Let C = college

 P(S) = .43   P(C) = .37   P(C | S) = .75

 a) P(NS) = 1 - .43 = .57

 b) P(S ∩ C) = P(S) ⋅ P(C | S) = (.43)(.75) = .3225

 c) P(S ∪ C) = P(S) + P(C) - P(S ∩ C) = .43 + .37 - .3225 = .4775

 d) P(NS ∩ NC) = 1 - P(S ∪ C) = 1 - .4775 = .5225

 e) P(NS ∪ NC) = P(NS) + P(NC) - P(NS ∩ NC) = .57 + .63 - .5225 = .6775

 f) P(C ∩ NS) = P(C) - P(C ∩ S) = .37 - .3225 = .0475

4.20 Let F = fax machine
 Let P = personal computer

 Given: P(F) = .10   P(P) = .52   P(P | F) = .91

 a) P(F ∩ P) = P(F) ⋅ P(P | F) = (.10)(.91) = .091

 b) P(F ∪ P) = P(F) + P(P) - P(F ∩ P) = .10 + .52 - .091 = .529

 c) P(F ∩ NP) = P(F) ⋅ P(NP | F)

 Since P(P | F) = .91, P(NP | F) = 1 - P(P | F) = 1 - .91 = .09

 P(F ∩ NP) = (.10)(.09) = .009

 d) P(NF ∩ NP) = 1 - P(F ∪ P) = 1 - .529 = .471

 e) P(NF ∩ P) = P(P) - P(F ∩ P) = .52 - .091 = .429

P NP

F .091 .009 .10

NF .429 .471 .90

.520 .480 1.00
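The completed matrix above can be generated from the three given values; a minimal Python sketch (the cell names are illustrative):

    # Given values for problem 4.20
    p_f = 0.10          # P(F)
    p_p = 0.52          # P(P)
    p_p_given_f = 0.91  # P(P | F)

    joint_fp = p_f * p_p_given_f                  # P(F and P) = .091
    matrix = {
        ("F", "P"):   joint_fp,
        ("F", "NP"):  p_f - joint_fp,             # .009
        ("NF", "P"):  p_p - joint_fp,             # .429
        ("NF", "NP"): 1 - p_f - p_p + joint_fp,   # .471
    }
    for cell, prob in matrix.items():
        print(cell, round(prob, 3))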

4.21 Let S = safety


 Let A = age

 P(S) = .30   P(A) = .39   P(A | S) = .87

 a) P(S ∩ NA) = P(S) ⋅ P(NA | S)

 but P(NA | S) = 1 - P(A | S) = 1 - .87 = .13

 P(S ∩ NA) = (.30)(.13) = .039

 b) P(NS ∩ NA) = 1 - P(S ∪ A) = 1 - [P(S) + P(A) - P(S ∩ A)]

 but P(S ∩ A) = P(S) ⋅ P(A | S) = (.30)(.87) = .261

 P(NS ∩ NA) = 1 - (.30 + .39 - .261) = .571

 c) P(NS ∩ A) = P(NS) - P(NS ∩ NA)

 but P(NS) = 1 - P(S) = 1 - .30 = .70

 P(NS ∩ A) = .70 - .571 = .129

4.22 Let C = ceiling fans
 Let O = outdoor grill

 P(C) = .60   P(O) = .29   P(C ∩ O) = .13

 a) P(C ∪ O) = P(C) + P(O) - P(C ∩ O) = .60 + .29 - .13 = .76

 b) P(NC ∩ NO) = 1 - P(C ∪ O) = 1 - .76 = .24

 c) P(NC ∩ O) = P(O) - P(C ∩ O) = .29 - .13 = .16

 d) P(C ∩ NO) = P(C) - P(C ∩ O) = .60 - .13 = .47

4.23

E F G

A 15 12 8 35

B 11 17 19 47

C 21 32 27 80

D 18 13 12 43

65 74 66 205

a) P(G | A) = 8/35 = .2286

 b) P(B | F) = 17/74 = .2297


c) P(C | E) = 21/65 = .3231

 d) P(E | G) = .0000

4.24

C D

A .36 .44 .80

B .11 .09 .20

.47 .53 1.00

a) P(C | A) = .36/.80 = .4500

 b) P(B | D) = .09/.53 = .1698

 c) P(A | B) = .0000

4.25

Calculator

Yes No

Computer

Yes 46 3 49

No 11 15 26

57 18 75

Select a category from each variable and test whether P(V1 | V2) = P(V1). For example, is P(Yes Computer | Yes Calculator) = P(Yes Computer)?

 46/57 =? 49/75

 .8070 ≠ .6533

 Since these are not equal, the variable Computer is not independent of the variable Calculator.
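The same comparison, done in Python from the raw counts (a sketch only):

    # Counts from the computer/calculator contingency table: (computer, calculator).
    counts = {("yes", "yes"): 46, ("yes", "no"): 3,
              ("no", "yes"): 11,  ("no", "no"): 15}
    n = sum(counts.values())                                            # 75

    p_comp_yes = (counts[("yes", "yes")] + counts[("yes", "no")]) / n   # 49/75
    p_comp_yes_given_calc_yes = counts[("yes", "yes")] / (
        counts[("yes", "yes")] + counts[("no", "yes")])                 # 46/57

    print(round(p_comp_yes_given_calc_yes, 4), round(p_comp_yes, 4))    # 0.807 0.6533
    print("independent" if abs(p_comp_yes_given_calc_yes - p_comp_yes) < 1e-9
          else "not independent")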


4.26 Let C = construction
 Let S = South Atlantic

 83,384 total failures
 10,867 failures in construction
 8,010 failures in South Atlantic
 1,258 failures in construction and South Atlantic

 a) P(S) = 8,010/83,384 = .09606

 b) P(C ∪ S) = P(C) + P(S) - P(C ∩ S)
             = 10,867/83,384 + 8,010/83,384 - 1,258/83,384 = 17,619/83,384 = .2113

 c) P(C | S) = P(C ∩ S)/P(S) = (1,258/83,384)/(8,010/83,384) = 1,258/8,010 = .15705

 d) P(S | C) = P(C ∩ S)/P(C) = (1,258/83,384)/(10,867/83,384) = 1,258/10,867 = .11576

 e) P(NS | NC) = P(NS ∩ NC)/P(NC) = [1 - P(C ∪ S)]/P(NC)

 but NC = 83,384 - 10,867 = 72,517 and P(NC) = 72,517/83,384 = .869675

 Therefore, P(NS | NC) = (1 - .2113)/(.869675) = .9069

 f) P(NS | C) = P(NS ∩ C)/P(C) = [P(C) - P(C ∩ S)]/P(C)

 but P(C) = 10,867/83,384 = .1303 and P(C ∩ S) = 1,258/83,384 = .0151

 Therefore, P(NS | C) = (.1303 - .0151)/.1303 = .8842
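Because each of these conditional probabilities reduces to a ratio of the given counts, they can also be computed directly; a short Python sketch:

    total = 83_384          # all business failures
    c = 10_867              # failures in construction
    s = 8_010               # failures in the South Atlantic
    c_and_s = 1_258         # construction failures in the South Atlantic

    p_c_given_s = c_and_s / s                                  # .15705
    p_s_given_c = c_and_s / c                                  # .11576
    p_ns_given_nc = (total - c - s + c_and_s) / (total - c)    # .9069
    p_ns_given_c = (c - c_and_s) / c                           # .8842

    for name, value in [("P(C|S)", p_c_given_s), ("P(S|C)", p_s_given_c),
                        ("P(NS|NC)", p_ns_given_nc), ("P(NS|C)", p_ns_given_c)]:
        print(name, round(value, 5))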


4.27 Let E = Economy
 Let Q = Qualified

 P(E) = .46   P(Q) = .37   P(E ∩ Q) = .15

 a) P(E | Q) = P(E ∩ Q)/P(Q) = .15/.37 = .4054

 b) P(Q | E) = P(E ∩ Q)/P(E) = .15/.46 = .3261

 c) P(Q | NE) = P(Q ∩ NE)/P(NE)

 but P(Q ∩ NE) = P(Q) - P(Q ∩ E) = .37 - .15 = .22
 and P(NE) = 1 - P(E) = 1 - .46 = .54

 P(Q | NE) = .22/.54 = .4074

 d) P(NE ∩ NQ) = 1 - P(E ∪ Q) = 1 - [P(E) + P(Q) - P(E ∩ Q)]
              = 1 - [.46 + .37 - .15] = 1 - .68 = .32

4.28 Let A = airline tickets
 Let T = transacting loans

 P(A) = .47   P(T | A) = .81

 a) P(A ∩ T) = P(A) ⋅ P(T | A) = (.47)(.81) = .3807

 b) P(NT | A) = 1 - P(T | A) = 1 - .81 = .19

 c) P(NT ∩ A) = P(A) - P(A ∩ T) = .47 - .3807 = .0893

4.29 Let H = hardware
 Let S = software

 P(H) = .37   P(S) = .54   P(S | H) = .97

 a) P(NS | H) = 1 - P(S | H) = 1 - .97 = .03

 b) P(S | NH) = P(S ∩ NH)/P(NH)

 but P(H ∩ S) = P(H) ⋅ P(S | H) = (.37)(.97) = .3589
 so P(NH ∩ S) = P(S) - P(H ∩ S) = .54 - .3589 = .1811
 and P(NH) = 1 - P(H) = 1 - .37 = .63

 P(S | NH) = .1811/.63 = .2875

 c) P(NH | S) = P(NH ∩ S)/P(S) = .1811/.54 = .3354


 d) P(NH | NS) = P(NH ∩ NS)/P(NS)

 but P(NH ∩ NS) = P(NH) - P(NH ∩ S) = .63 - .1811 = .4489
 and P(NS) = 1 - P(S) = 1 - .54 = .46

 P(NH | NS) = .4489/.46 = .9759

4.30 Let A = product produced on Machine A
 B = product produced on Machine B
 C = product produced on Machine C
 D = defective product

 P(A) = .10   P(B) = .40   P(C) = .50

 P(D | A) = .05   P(D | B) = .12   P(D | C) = .08

 Event   Prior    Conditional   Joint        Revised
         P(Ei)    P(D | Ei)     P(D ∩ Ei)    P(Ei | D)
   A      .10        .05          .005       .005/.093 = .0538
   B      .40        .12          .048       .048/.093 = .5161
   C      .50        .08          .040       .040/.093 = .4301
                                P(D) = .093

 Revised: P(A | D) = .005/.093 = .0538
          P(B | D) = .048/.093 = .5161
          P(C | D) = .040/.093 = .4301

4.31 Let A = Alex fills the order
 B = Alicia fills the order
 C = Juan fills the order
 I = order filled incorrectly
 K = order filled correctly

 P(A) = .30   P(B) = .45   P(C) = .25

 P(I | A) = .20   P(I | B) = .12   P(I | C) = .05
 P(K | A) = .80   P(K | B) = .88   P(K | C) = .95

 a) P(B) = .45

 b) P(K | C) = 1 - P(I | C) = 1 - .05 = .95


c)

 Event   Prior    Conditional   Joint        Revised
         P(Ei)    P(I | Ei)     P(I ∩ Ei)    P(Ei | I)
   A      .30        .20         .0600       .0600/.1265 = .4743
   B      .45        .12         .0540       .0540/.1265 = .4269
   C      .25        .05         .0125       .0125/.1265 = .0988
                               P(I) = .1265

 Revised: P(A | I) = .0600/.1265 = .4743
          P(B | I) = .0540/.1265 = .4269
          P(C | I) = .0125/.1265 = .0988

 d)

 Event   Prior    Conditional   Joint        Revised
         P(Ei)    P(K | Ei)     P(K ∩ Ei)    P(Ei | K)
   A      .30        .80         .2400       .2400/.8735 = .2748
   B      .45        .88         .3960       .3960/.8735 = .4533
   C      .25        .95         .2375       .2375/.8735 = .2719
                               P(K) = .8735

4.32 Let T = lawn treated by Tri-state
 G = lawn treated by Green Chem
 V = very healthy lawn
 N = not very healthy lawn

 P(T) = .72   P(G) = .28   P(V | T) = .30   P(V | G) = .20

 Event   Prior    Conditional   Joint        Revised
         P(Ei)    P(V | Ei)     P(V ∩ Ei)    P(Ei | V)
   T      .72        .30          .216       .216/.272 = .7941
   G      .28        .20          .056       .056/.272 = .2059
                                P(V) = .272

 Revised: P(T | V) = .216/.272 = .7941
          P(G | V) = .056/.272 = .2059
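The tabular revision used in 4.30 through 4.32 can be written as one small reusable function; a Python sketch (the function is illustrative, not from the text), applied here to the 4.32 figures:

    def bayes_revise(priors, likelihoods):
        """Return revised (posterior) probabilities and the marginal P(V).

        priors      -- dict of P(Ei)
        likelihoods -- dict of P(V | Ei) for the observed outcome V
        """
        joints = {e: priors[e] * likelihoods[e] for e in priors}   # P(V and Ei)
        p_v = sum(joints.values())                                 # P(V)
        return {e: joints[e] / p_v for e in joints}, p_v

    revised, p_v = bayes_revise({"T": 0.72, "G": 0.28},
                                {"T": 0.30, "G": 0.20})
    print(round(p_v, 3))                                  # 0.272
    print({e: round(p, 4) for e, p in revised.items()})   # {'T': 0.7941, 'G': 0.2059}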


4.33 Let T = training
 Let S = small

 P(T) = .65   P(S | T) = .18   P(NS | T) = .82
 P(S | NT) = .75   P(NS | NT) = .25

 Event   Prior    Conditional   Joint         Revised
         P(Ei)    P(NS | Ei)    P(NS ∩ Ei)    P(Ei | NS)
   T      .65        .82          .5330       .5330/.6205 = .8590
   NT     .35        .25          .0875       .0875/.6205 = .1410
                                P(NS) = .6205

4.34

                 Variable 1
                  D      E
              A   10     20     30
 Variable 2   B   15      5     20
              C   30     15     45
                  55     40     95

a) P(E) = 40/95 = .42105

 b) P(B ∪ D) = P(B) + P(D) - P(B ∩ D) = 20/95 + 55/95 - 15/95 = 60/95 = .63158

 c) P(A ∩ E) = 20/95 = .21053

 d) P(B | E) = 5/40 = .1250

 e) P(A ∪ B) = P(A) + P(B) = 30/95 + 20/95 = 50/95 = .52632

 f) P(B ∩ C) = .0000 (mutually exclusive)

 g) P(D | C) = 30/45 = .66667

 h) P(A | B) = P(A ∩ B)/P(B) = .0000/(20/95) = .0000 (A and B are mutually exclusive)

 i) Is P(A) = P(A | D)?

    30/95 =? 10/55

    .31579 ≠ .18182

    No, Variables 1 and 2 are not independent.

4.35

D E F G

A 3 9 7 12 31

B 8 4 6 4 22

C 10 5 3 7 25

21 18 16 23 78

a) P(F ∩ A) = 7/78 = .08974

 b) P(A | B) = P(A ∩ B)/P(B) = .0000/(22/78) = .0000

 c) P(B) = 22/78 = .28205

 d) P(E ∩ F) = .0000 (mutually exclusive)

 e) P(D | B) = 8/22 = .36364

 f) P(B | D) = 8/21 = .38095

 g) P(D ∪ C) = 21/78 + 25/78 - 10/78 = 36/78 = .4615

 h) P(F) = 16/78 = .20513

4.36

Age(years)

<35 35-44 45-54 55-64 >65

Gender Male .11 .20 .19 .12 .16 .78

Female .07 .08 .04 .02 .01 .22

.18 .28 .23 .14 .17 1.00

a) P(35-44) = .28

 b) P(Woman ∩ 45-54) = .04


c) P(Man ∪ 35-44) = P(Man) + P(35-44) - P(Man ∩ 35-44) = .78 + .28 - .20 = .86

 d) P(<35 ∪ 55-64) = P(<35) + P(55-64) = .18 + .14 = .32

 e) P(Woman | 45-54) = P(Woman ∩ 45-54)/P(45-54) = .04/.23 = .1739

 f) P(not Woman ∩ not 55-64) = .11 + .20 + .19 + .16 = .66

4.37 Let T = thoroughness
 Let K = knowledge

 P(T) = .78   P(K) = .40   P(T ∩ K) = .27

 a) P(T ∪ K) = P(T) + P(K) - P(T ∩ K) = .78 + .40 - .27 = .91

 b) P(NT ∩ NK) = 1 - P(T ∪ K) = 1 - .91 = .09

 c) P(K | T) = P(K ∩ T)/P(T) = .27/.78 = .3462

 d) P(NT ∩ K) = P(NT) - P(NT ∩ NK)

 but P(NT) = 1 - P(T) = .22

 P(NT ∩ K) = .22 - .09 = .13

4.38 Let R = retirement
 Let L = life insurance

 P(R) = .42   P(L) = .61   P(R ∩ L) = .33

 a) P(R | L) = P(R ∩ L)/P(L) = .33/.61 = .5410

 b) P(L | R) = P(R ∩ L)/P(R) = .33/.42 = .7857

 c) P(L ∪ R) = P(L) + P(R) - P(L ∩ R) = .61 + .42 - .33 = .70

 d) P(R ∩ NL) = P(R) - P(R ∩ L) = .42 - .33 = .09

 e) P(NL | R) = P(NL ∩ R)/P(R) = .09/.42 = .2143

4.39 P(T) = .16   P(T | W) = .20   P(T | NE) = .17
 P(W) = .21   P(NE) = .20

 a) P(W ∩ T) = P(W) ⋅ P(T | W) = (.21)(.20) = .042

 b) P(NE ∩ T) = P(NE) ⋅ P(T | NE) = (.20)(.17) = .034

 c) P(W | T) = P(W ∩ T)/P(T) = (.042)/(.16) = .2625

 d) P(NE | NT) = P(NE ∩ NT)/P(NT) = {P(NE) ⋅ P(NT | NE)}/P(NT)

 but P(NT | NE) = 1 - P(T | NE) = 1 - .17 = .83
 and P(NT) = 1 - P(T) = 1 - .16 = .84

 Therefore, P(NE | NT) = {(.20)(.83)}/(.84) = .1976

 e) P(not W ∩ not NE | T) = P(not W ∩ not NE ∩ T)/P(T)

 but P(not W ∩ not NE ∩ T) = P(T) - P(W ∩ T) - P(NE ∩ T) = .16 - .042 - .034 = .084

 P(not W ∩ not NE ∩ T)/P(T) = (.084)/(.16) = .525

4.40 Let M = Mastercard
 A = American Express
 V = Visa

 P(M) = .30   P(A) = .20   P(V) = .25
 P(M ∩ A) = .08   P(V ∩ M) = .12   P(A ∩ V) = .06

 a) P(V ∪ A) = P(V) + P(A) - P(V ∩ A) = .25 + .20 - .06 = .39

 b) P(V | M) = P(V ∩ M)/P(M) = .12/.30 = .40

 c) P(M | V) = P(V ∩ M)/P(V) = .12/.25 = .48

 d) Is P(V) = P(V | M)?

.25 ≠ .40

Possession of Visa is not independent of possession of Mastercard

e) American Express is not mutually exclusive of Visa

 because P(A ∩ V) ≠ .0000

4.41 Let S = believe SS will be secure
 N = don't believe SS will be secure
 <45 = under 45 years old
 >45 = 45 or more years old

 P(N) = .51   Therefore, P(S) = 1 - .51 = .49

 P(S | >45) = .70   Therefore, P(N | >45) = 1 - P(S | >45) = 1 - .70 = .30

 P(<45) = .57


a) P(>45) = 1 - P(<45) = 1 - .57 = .43

 b) P(>45 ∩ S) = P(>45) ⋅ P(S | >45) = (.43)(.70) = .301

 P(<45 ∩ S) = P(S) - P(>45 ∩ S) = .49 - .301 = .189

 c) P(>45 | S) = P(>45 ∩ S)/P(S) = P(>45) ⋅ P(S | >45)/P(S) = (.43)(.70)/.49 = .6143

 d) P(<45 ∪ N) = P(<45) + P(N) - P(<45 ∩ N)

 but P(<45 ∩ N) = P(<45) - P(<45 ∩ S) = .57 - .189 = .381

 so P(<45 ∪ N) = .57 + .51 - .381 = .699

 Probability Matrix Solution for Problem 4.41:

S N

<45 .189 .381 .57

>45 .301 .129 .43

.490 .510 1.00

4.42 Let M = expect to save more
 R = expect to reduce debt
 NM = don't expect to save more
 NR = don't expect to reduce debt

 P(M) = .43   P(R) = .45   P(R | M) = .81

 P(NR | M) = 1 - P(R | M) = 1 - .81 = .19
 P(NM) = 1 - P(M) = 1 - .43 = .57
 P(NR) = 1 - P(R) = 1 - .45 = .55

 a) P(M ∩ R) = P(M) ⋅ P(R | M) = (.43)(.81) = .3483

 b) P(M ∪ R) = P(M) + P(R) - P(M ∩ R) = .43 + .45 - .3483 = .5317

 c) P(neither save more nor reduce debt) = 1 - P(M ∪ R) = 1 - .5317 = .4683

 d) P(M ∩ NR) = P(M) ⋅ P(NR | M) = (.43)(.19) = .0817

 Probability matrix for problem 4.42:

Reduce

Yes No


Save

Yes .3483 .0817 .43

No .1017 .4683 .57

.4500 .5500 1.00

4.43 Let R = read
 Let B = checked in with the boss

 P(R) = .40   P(B) = .34   P(B | R) = .78

 a) P(B ∩ R) = P(R) ⋅ P(B | R) = (.40)(.78) = .312

 b) P(NR ∩ NB) = 1 - P(R ∪ B)

 but P(R ∪ B) = P(R) + P(B) - P(R ∩ B) = .40 + .34 - .312 = .428

 P(NR ∩ NB) = 1 - .428 = .572

 c) P(R | B) = P(R ∩ B)/P(B) = (.312)/(.34) = .9176

 d) P(NB | R) = 1 - P(B | R) = 1 - .78 = .22

 e) P(NB | NR) = P(NB ∩ NR)/P(NR)

 but P(NR) = 1 - P(R) = 1 - .40 = .60

 P(NB | NR) = .572/.60 = .9533

 f) Probability matrix for problem 4.43:

B NB

R .312 .088 .40

NR .028 .572 .60

.340 .660 1.00

4.44 Let: D = denial
 I = inappropriate
 C = customer
 P = payment dispute
 S = specialty
 G = delays getting care
 R = prescription drugs

 P(D) = .17   P(I) = .14   P(C) = .14   P(P) = .11
 P(S) = .10   P(G) = .08   P(R) = .07

 a) P(P ∪ S) = P(P) + P(S) = .11 + .10 = .21

 b) P(R ∩ C) = .0000 (mutually exclusive)

 c) P(I | S) = P(I ∩ S)/P(S) = .0000/.10 = .0000

 d) P(NG ∩ NP) = 1 - P(G ∪ P) = 1 - [P(G) + P(P)] = 1 - [.08 + .11] = 1 - .19 = .81

4.45 Let R = retention
 P = process

 P(R) = .56   P(P ∩ R) = .36   P(R | P) = .90

 a) P(R ∩ NP) = P(R) - P(P ∩ R) = .56 - .36 = .20

 b) P(P | R) = P(P ∩ R)/P(R) = .36/.56 = .6429

 c) P(P) = ?   P(R | P) = P(R ∩ P)/P(P), so P(P) = P(R ∩ P)/P(R | P) = .36/.90 = .40

 d) P(R ∪ P) = P(R) + P(P) - P(R ∩ P) = .56 + .40 - .36 = .60

 e) P(NR ∩ NP) = 1 - P(R ∪ P) = 1 - .60 = .40

 f) P(R | NP) = P(R ∩ NP)/P(NP)

 but P(NP) = 1 - P(P) = 1 - .40 = .60

 P(R | NP) = .20/.60 = .33

4.46 Let M = mail
 S = sales

 P(M) = .38   P(M ∩ S) = .0000   P(NM ∩ NS) = .41

 a) P(M ∩ NS) = P(M) - P(M ∩ S) = .38 - .00 = .38

 b) P(S): P(M ∪ S) = 1 - P(NM ∩ NS) = 1 - .41 = .59

 P(M ∪ S) = P(M) + P(S) since M and S are mutually exclusive

 Therefore, P(S) = P(M ∪ S) - P(M) = .59 - .38 = .21

 c) P(S | M) = P(S ∩ M)/P(M) = .00/.38 = .00


 d) P(NM | NS) = P(NM ∩ NS)/P(NS) = .41/.79 = .5190

 where P(NS) = 1 - P(S) = 1 - .21 = .79

4.47 Let S = Sarabia
 T = Tran
 J = Jackson
 B = blood test

 P(S) = .41   P(T) = .32   P(J) = .27

 P(B | S) = .05   P(B | T) = .08   P(B | J) = .06

 Event   Prior    Conditional   Joint        Revised
         P(Ei)    P(B | Ei)     P(B ∩ Ei)    P(Ei | B)
   S      .41        .05         .0205         .329
   T      .32        .08         .0256         .411
   J      .27        .06         .0162         .260
                               P(B) = .0623

4.48 Let R = regulations
 T = tax burden

 P(R) = .30   P(T) = .35   P(T | R) = .71

 a) P(R ∩ T) = P(R) ⋅ P(T | R) = (.30)(.71) = .2130

 b) P(R ∪ T) = P(R) + P(T) - P(R ∩ T) = .30 + .35 - .2130 = .4370

 c) P(R ∪ T) - P(R ∩ T) = .4370 - .2130 = .2240

 d) P(R | T) = P(R ∩ T)/P(T) = .2130/.35 = .6086

 e) P(NR | T) = 1 - P(R | T) = 1 - .6086 = .3914

 f) P(NR | NT) = P(NR ∩ NT)/P(NT) = [1 - P(R ∪ T)]/P(NT) = (1 - .4370)/.65 = .8662

4.49

 Event             Prior    Conditional    Joint         Revised
                   P(Ei)    P(NS | Ei)     P(NS ∩ Ei)    P(Ei | NS)
 Soup               .60        .73           .4380         .8456
 Breakfast Meats    .35        .17           .0595         .1149
 Hot Dogs           .05        .41           .0205         .0396
                                           P(NS) = .5180

4.50 Let GH = Good health
 HM = Happy marriage
 FG = Faith in God

 P(GH) = .29   P(HM) = .21   P(FG) = .40

 a) P(HM ∪ FG) = P(HM) + P(FG) - P(HM ∩ FG)

 but P(HM ∩ FG) = .0000, so

 P(HM ∪ FG) = P(HM) + P(FG) = .21 + .40 = .61

 b) P(HM ∪ FG ∪ GH) = P(HM) + P(FG) + P(GH) = .21 + .40 + .29 = .9000

 c) P(FG ∩ GH) = .0000

 The categories are mutually exclusive. The respondent could not select more than one answer.

 d) P(neither FG nor GH nor HM) = 1 - P(HM ∪ FG ∪ GH) = 1 - .9000 = .1000